---
license: mit
datasets:
  - ai-forever/paper_persi_chat
language:
  - en
pipeline_tag: conversational
---

This repository stores the model weights for the PaperPersiChat pipeline.

The pipeline is presented in the paper [PaperPersiChat: Scientific Paper Discussion Chatbot using Transformers and Discourse Flow Management](https://aclanthology.org/2023.sigdial-1.54).

## Installation

```bash
git lfs install
git clone https://huggingface.co/ai-forever/paper_persi_chat
```
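Alternatively, the weights can be fetched programmatically with the `huggingface_hub` client (a minimal sketch; the `local_dir` value is illustrative):

```python
from huggingface_hub import snapshot_download

# Downloads the whole repository; local_dir is an illustrative choice.
snapshot_download(repo_id="ai-forever/paper_persi_chat", local_dir="paper_persi_chat")
```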

## Usage

  1. Full pipeline:

See https://github.com/ai-forever/paper_persi_chat for more details.

  2. Single models:

Inference examples are provided in the *Open In Colab* notebook.

After downloading the weights, three of the models (the summarizer, the QA module, and the response generator) can be loaded with the `transformers` library:

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Summarizer; use 'paper_persi_chat/bart_response_generator' for the response generator
model_name_or_path = 'paper_persi_chat/distilbart_summarizer'
tokenizer = BartTokenizer.from_pretrained(model_name_or_path)
model = BartForConditionalGeneration.from_pretrained(model_name_or_path).to('cuda')
```
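A summary can then be generated with the standard `generate` API (a minimal sketch: the input text and generation parameters here are illustrative, not the pipeline's tuned settings):

```python
# Illustrative sketch: the input text and generation settings are placeholders,
# not the parameters used by the full PaperPersiChat pipeline.
section_text = "We present PaperPersiChat, a chatbot pipeline for discussing scientific papers..."
inputs = tokenizer(section_text, return_tensors='pt', truncation=True, max_length=1024).to('cuda')
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```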
```python
from transformers import pipeline

# Extractive QA module
qa_model = pipeline("question-answering", model='paper_persi_chat/deberta_qa')

question = "What is the main contribution of the paper?"  # example question
context = "..."                                           # paper segment to answer from

pred = qa_model(question=question,
                context=context,
                max_seq_len=384,
                doc_stride=64,
                max_answer_len=384)
```
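The question-answering pipeline returns a dictionary containing the extracted answer span and its score (the values below are illustrative):

```python
# pred is a dict of the form (values illustrative):
# {'score': 0.91, 'start': 12, 'end': 35, 'answer': 'discourse flow management'}
print(pred['answer'], pred['score'])
```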

## Citation

If you find our models helpful, feel free to cite our publication PaperPersiChat: Scientific Paper Discussion Chatbot using Transformers and Discourse Flow Management:

```bibtex
@inproceedings{chernyavskiy-etal-2023-paperpersichat,
    title = "{P}aper{P}ersi{C}hat: Scientific Paper Discussion Chatbot using Transformers and Discourse Flow Management",
    author = "Chernyavskiy, Alexander  and
      Bregeda, Max  and
      Nikiforova, Maria",
    booktitle = "Proceedings of the 24th Meeting of the Special Interest Group on Discourse and Dialogue",
    month = sep,
    year = "2023",
    address = "Prague, Czechia",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.sigdial-1.54",
    pages = "584--587",
}
```