Retrieve, Read and LinK: Fast and Accurate Entity Linking and Relation Extraction on an Academic Budget

A blazing fast and lightweight Information Extraction model for Entity Linking and Relation Extraction.

This repository contains the weights and the index for the ReLiK Relation Extraction pipeline.

πŸ› οΈ Installation

Installation from PyPI

pip install relik
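
Optionally, you can run a quick import check to confirm that the package installed correctly (this simply verifies that the package is importable in the current environment):

python -c "import relik"
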
Other installation options

Install with optional dependencies

Install with all the optional dependencies.

pip install relik[all]

Install with optional dependencies for training and evaluation.

pip install relik[train]

Install with optional dependencies for FAISS

The FAISS PyPI package is only available for CPU. For GPU, install it from source or use the conda package.

For CPU:

pip install relik[faiss]

For GPU:

conda create -n relik python=3.10
conda activate relik

# install pytorch
conda install -y pytorch=2.1.0 pytorch-cuda=12.1 -c pytorch -c nvidia

# GPU
conda install -y -c pytorch -c nvidia faiss-gpu=1.8.0
# or GPU with NVIDIA RAFT
conda install -y -c pytorch -c nvidia -c rapidsai -c conda-forge faiss-gpu-raft=1.8.0

pip install relik
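
As an optional sanity check after the GPU installation above, you can verify that the GPU build of FAISS is active (faiss.get_num_gpus() is provided by the faiss-gpu package and reports how many GPUs FAISS can see):

python -c "import faiss; print('FAISS GPUs:', faiss.get_num_gpus())"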

Install with optional dependencies for serving the models with FastAPI and Ray.

pip install relik[serve]

Installation from source

git clone https://github.com/SapienzaNLP/relik.git
cd relik
pip install -e .[all]

🚀 Quick Start

ReLiK is a lightweight and fast model for Entity Linking and Relation Extraction. It is composed of two main components: a retriever and a reader. The retriever is responsible for retrieving relevant documents from a large collection, while the reader is responsible for extracting entities and relations from the retrieved documents. ReLiK can be used with the from_pretrained method to load a pre-trained pipeline.

Here is an example of how to use ReLiK for Relation Extraction:

from relik import Relik
from relik.inference.data.objects import RelikOutput

relik = Relik.from_pretrained("sapienzanlp/relik-relation-extraction-nyt-large")
relik_out: RelikOutput = relik("Michael Jordan was one of the best players in the NBA.")
RelikOutput(
  text='Michael Jordan was one of the best players in the NBA.', 
  tokens=Michael Jordan was one of the best players in the NBA., 
  id=0, 
  spans=[
    Span(start=0, end=14, label='--NME--', text='Michael Jordan'), 
    Span(start=50, end=53, label='--NME--', text='NBA')
  ], 
  triplets=[
    Triplets(
      subject=Span(start=0, end=14, label='--NME--', text='Michael Jordan'), 
      label='company', 
      object=Span(start=50, end=53, label='--NME--', text='NBA'), 
      confidence=1.0
      )
  ], 
  candidates=Candidates(
    span=[], 
    triplet=[
              [
                [
                  {"text": "company", "id": 4, "metadata": {"definition": "company of this person"}}, 
                  {"text": "nationality", "id": 10, "metadata": {"definition": "nationality of this person or entity"}}, 
                  {"text": "child", "id": 17, "metadata": {"definition": "child of this person"}}, 
                  {"text": "founded by", "id": 0, "metadata": {"definition": "founder or co-founder of this organization, religion or place"}}, 
                  {"text": "residence", "id": 18, "metadata": {"definition": "place where this person has lived"}},
                  ...
              ]
          ]
      ]
  ),
)
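
The fields shown in the output above can also be consumed programmatically. Below is a minimal sketch based on the attribute names in the printed RelikOutput (spans and triplets, each triplet carrying subject, label, object, and confidence); it continues from the relik_out object created in the example above:

# relik_out is the RelikOutput from the Quick Start example above
for triplet in relik_out.triplets:
    # each triplet links a subject span to an object span with a relation label
    print(f"{triplet.subject.text} --[{triplet.label}]--> {triplet.object.text} (confidence: {triplet.confidence:.2f})")

for span in relik_out.spans:
    # label is '--NME--' when no entity from the index is assigned to the span
    print(span.text, span.label, span.start, span.end)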

📊 Performance

The following table shows the results (Micro F1) of ReLiK Large on the NYT dataset:

| Model | NYT | NYT (Pretr) | AIT (m:s) |
| --- | --- | --- | --- |
| REBEL | 93.1 | 93.4 | 01:45 |
| UiE | 93.5 | -- | -- |
| USM | 94.0 | 94.1 | -- |
| ➡️ ReLiK Large | 95.0 | 94.9 | 00:30 |

🤖 Models

Models can be found on 🤗 Hugging Face.

💽 Cite this work

If you use any part of this work, please consider citing the paper as follows:

@inproceedings{orlando-etal-2024-relik,
    title     = "Retrieve, Read and LinK: Fast and Accurate Entity Linking and Relation Extraction on an Academic Budget",
    author    = "Orlando, Riccardo and Huguet Cabot, Pere-Llu{\'\i}s and Barba, Edoardo and Navigli, Roberto",
    booktitle = "Findings of the Association for Computational Linguistics: ACL 2024",
    month     = aug,
    year      = "2024",
    address   = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
}