
TARS Linker

Task-Aware Representation of Sentences (TARS) is a simple and effective method for few-shot and even zero-shot text classification. It has since been extended to perform zero-shot NERC.

TARS converts the problem into a binary classification task: for each candidate class, it predicts whether a given text belongs to that class.

TARS does not need entity descriptions, so if you cannot provide descriptions for your entities, this may be the approach you are looking for.
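The reformulation can be illustrated with a dependency-free sketch. `score_pair` below is a stand-in for the fine-tuned transformer that actually scores label/text pairs in TARS; keyword overlap is used purely for illustration:

```python
# Sketch of TARS's task-aware reformulation: instead of one multi-class
# prediction, each (label, text) pair becomes a binary yes/no question.

def score_pair(label: str, text: str) -> float:
    # Stand-in scorer: the real model encodes the label together with the
    # text and outputs P(text belongs to label). Here we fake it with
    # keyword overlap purely for illustration.
    overlap = len(set(label.lower().split()) & set(text.lower().split()))
    return overlap / (len(label.split()) or 1)

def tars_classify(text: str, labels: list, threshold: float = 0.5) -> list:
    # One binary decision per candidate label; because the label is part of
    # the input, the same model transfers to labels never seen in training.
    return [lab for lab in labels if score_pair(lab, text) >= threshold]

print(tars_classify("the weather today is sunny", ["weather", "sports"]))  # → ['weather']
```

Because each label is scored independently, adding a new label needs no retraining, which is what makes the zero-shot setting possible.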

The TARS linker uses the entities specified in the `zshot.PipelineConfig`.

Bases: Linker

TARS end2end Linker

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `default_entities` | `Optional[str]` | Default entities to use in case no custom ones are set. One of: `'conll-short'`, `'ontonotes-long'`, `'ontonotes-short'`, `'wnut_17-long'`, `'wnut_17-short'` | `'conll-short'` |
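A minimal setup sketch, assuming `zshot` and the spaCy model `en_core_web_sm` are installed; the `PipelineConfig` and `LinkerTARS` names follow zshot's public API, but the exact behavior (e.g. which extension the predicted spans land on) should be checked against your installed version:

```python
import spacy
from zshot import PipelineConfig
from zshot.linker import LinkerTARS

# Assumed setup: requires `pip install zshot` and a downloaded spaCy model.
nlp = spacy.load("en_core_web_sm")

# No custom entities are passed, so the linker falls back to the
# 'conll-short' default label set; TARS needs only label names.
config = PipelineConfig(linker=LinkerTARS(default_entities="conll-short"))
nlp.add_pipe("zshot", config=config, last=True)

doc = nlp("IBM headquarters are located in Armonk, New York.")
for ent in doc.ents:
    print(ent.text, ent.label_)
```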

flat_entities()

Since TARS uses only the labels, take just the names of the entities, not their descriptions.

load_models()

Load the TARS model if it is not already initialized.

predict(docs, batch_size=None)

Perform entity prediction on the given documents.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `docs` | `Iterator[Doc]` | A list of spaCy Documents | *required* |
| `batch_size` | `Optional[Union[int, None]]` | The batch size | `None` |

Returns:

| Type | Description |
| --- | --- |
| `List[List[Span]]` | A list of Spans for each Document in `docs` |
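When calling the linker directly rather than through the pipeline, usage might look like the following hedged sketch; `linker` and `docs` are assumed to exist already, and the `Span` attributes shown follow zshot's data model but are not verified here:

```python
# Hypothetical direct call, assuming `linker` is an initialized LinkerTARS
# and `docs` is a list of spaCy Doc objects produced elsewhere.
spans_per_doc = linker.predict(docs, batch_size=4)

# One inner list of Spans per input Document, in the same order as `docs`.
for doc, spans in zip(docs, spans_per_doc):
    for span in spans:
        # `start`/`end` are assumed to be character offsets; check your
        # installed zshot version's Span data model.
        print(doc.text[span.start:span.end], span.label)
```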

set_kg(entities)

Set new entities in the model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `entities` | `Iterator[Entity]` | New entities to use | *required* |
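Because TARS is zero-shot, the label set can be swapped at runtime without retraining. A hedged sketch, assuming zshot's `Entity` data model and an already-initialized linker:

```python
from zshot.utils.data_models import Entity

# Hypothetical runtime swap, assuming `linker` is an initialized LinkerTARS.
# Descriptions are ignored by TARS (see flat_entities), so placeholders
# are fine here.
new_entities = [
    Entity(name="drug", description=""),
    Entity(name="disease", description=""),
]
linker.set_kg(new_entities)
```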