
Recommendor-bert is a pre-trained language model that generates embeddings for research papers. It is pre-trained on powerful signals of document-level relatedness: arXiv tags, domains, citations, conferences, and co-authors. Recommendor-bert was built primarily to generate recommendations for research papers.
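
A minimal usage sketch is shown below. The Hugging Face repo id `adit-negi/recommendor-bert` is an assumption, not confirmed by this card; substitute the actual model id. Papers are encoded with the `[CLS]` embedding and candidates are ranked by cosine similarity.

```python
# Hedged sketch: the repo id below is an assumption; replace it with the real one.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "adit-negi/recommendor-bert"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

papers = [
    "SciBERT: A Pretrained Language Model for Scientific Text",
    "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
    "ImageNet Classification with Deep Convolutional Neural Networks",
]

with torch.no_grad():
    batch = tokenizer(papers, padding=True, truncation=True,
                      max_length=512, return_tensors="pt")
    # Use the [CLS] token embedding as the paper representation.
    embeddings = model(**batch).last_hidden_state[:, 0, :]

# Rank the remaining papers against the first one by cosine similarity.
query, candidates = embeddings[0:1], embeddings[1:]
scores = torch.nn.functional.cosine_similarity(query, candidates)
print(scores)
```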

The model is fine-tuned from allenai/scibert_scivocab_uncased as the base model with a triplet loss function.
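
The sketch below illustrates one training step of that setup under stated assumptions: how the triplets are constructed from the arXiv relatedness signals, and the hyperparameters, are assumptions here rather than the card's exact recipe (the linked GitHub repo holds the actual code).

```python
# Hedged sketch of triplet-loss fine-tuning from SciBERT; triplet construction
# and hyperparameters are illustrative assumptions, not the authors' exact setup.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss_fn = torch.nn.TripletMarginLoss(margin=1.0)

def embed(texts):
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=512, return_tensors="pt")
    return model(**batch).last_hidden_state[:, 0, :]  # [CLS] embedding

# One illustrative triplet: anchor and positive share relatedness signals
# (e.g. arXiv tags or citations); the negative is an unrelated paper.
anchor = ["SciBERT: A Pretrained Language Model for Scientific Text"]
positive = ["BERT: Pre-training of Deep Bidirectional Transformers"]
negative = ["ImageNet Classification with Deep Convolutional Neural Networks"]

model.train()
loss = loss_fn(embed(anchor), embed(positive), embed(negative))
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(loss))
```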

Dataset: https://www.kaggle.com/datasets/Cornell-University/arxiv

GitHub: https://github.com/adit-negi/GameOfPapers
