tiny-bert-ranker model card

This model is a fine-tuned version of prajjwal1/bert-tiny as part of our submission to ReNeuIR 2024.

Model Details

Model Description

The model is based on the pre-trained prajjwal1/bert-tiny. It is fine-tuned on a 1 GB subset of MS MARCO's Train Triples Small.

Tiny-bert-ranker is part of our investigation into the trade-offs between efficiency and effectiveness in ranking models. This approach uses neither BM25 score injection nor distillation.
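The Train Triples Small data consists of (query, relevant passage, non-relevant passage) triples. The card does not document the exact loss used for fine-tuning; a minimal sketch of the standard pairwise softmax objective such triples support, with toy scores standing in for the ranker's forward pass:

```python
import math

def pairwise_softmax_loss(pos_score: float, neg_score: float) -> float:
    """Cross-entropy over (positive, negative) scores for one MS MARCO triple.

    pos_score/neg_score would come from the ranker's forward pass on
    (query, relevant passage) and (query, non-relevant passage).
    This is a common triples objective, not necessarily the one used here.
    """
    m = max(pos_score, neg_score)  # subtract the max for numerical stability
    log_z = m + math.log(math.exp(pos_score - m) + math.exp(neg_score - m))
    return log_z - pos_score  # -log softmax(positive)

# A ranker that orders the pair correctly incurs a smaller loss.
loss_good = pairwise_softmax_loss(2.0, -1.0)   # relevant passage scored higher
loss_bad = pairwise_softmax_loss(-1.0, 2.0)    # relevant passage scored lower
print(loss_good < loss_bad)  # True
```

Minimizing this loss pushes the model to score the relevant passage above the non-relevant one for each query.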

  • Developed by: Team FSU at ReNeuIR 2024
  • Model type: BERT-based cross-encoder for passage ranking
  • License: MIT
  • Finetuned from model: prajjwal1/bert-tiny
  • Model size: 4.39M parameters (F32, Safetensors)