
Fine-tuned BLINK cross-encoder for entity linking.

  • Base model: https://huggingface.co/UnlikelyAI/crossencoder-wiki-large (a checkpoint-loading sketch follows this list)
  • Training data:
    • 20% (stratified by source dataset) of the following entity resolution benchmarks:
      • handwritten_entity_linking
      • wikibank_entity_linking
      • kilt_entity_linking
      • jobe_entity_linking
      • qald9_entity_linking
  • Training setup (see the training-loop sketch after this list):
    • 1× NVIDIA L4 GPU (23 GB)
    • batch_size = 1
    • gradient_accumulation_steps = 8
    • type_optimization = "all_encoder_layers" (i.e. ["additional", "bert_model.encoder.layer"])
    • n_epochs = 2
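
The setup above can be summarized as a short PyTorch training loop. This is a minimal sketch, not the actual BLINK training script: the model, dataloader, loss function, and learning rate are placeholders, while the batch size, accumulation steps, epoch count, and layer-selection patterns are the values listed on this card.

```python
# Hedged sketch of the training setup above: effective batch size of 8 via
# gradient accumulation (batch_size = 1, gradient_accumulation_steps = 8),
# optimizing only the parameters selected by type_optimization =
# "all_encoder_layers". Model, dataloader, and loss function are stand-ins.
import torch

BATCH_SIZE = 1
GRAD_ACCUM_STEPS = 8
N_EPOCHS = 2
PATTERNS = ["additional", "bert_model.encoder.layer"]  # "all_encoder_layers"

def trainable_parameters(model: torch.nn.Module):
    # Yield only parameters whose names match the type_optimization patterns;
    # freeze everything else.
    for name, param in model.named_parameters():
        if any(pat in name for pat in PATTERNS):
            yield param
        else:
            param.requires_grad = False

def train(model, dataloader, loss_fn, lr=2e-5):  # lr is a stand-in, not from this card
    optimizer = torch.optim.AdamW(trainable_parameters(model), lr=lr)
    for _ in range(N_EPOCHS):
        optimizer.zero_grad()
        for step, (inputs, labels) in enumerate(dataloader, start=1):
            # Scale the loss so accumulated gradients match a batch of size 8.
            loss = loss_fn(model(inputs), labels) / GRAD_ACCUM_STEPS
            loss.backward()  # gradients accumulate across micro-batches
            if step % GRAD_ACCUM_STEPS == 0:
                optimizer.step()
                optimizer.zero_grad()
```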
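
A minimal sketch of fetching a cross-encoder checkpoint from the Hub and inspecting its weights. The repo id below is the base model listed above (swap in this fine-tuned model's repo id to get the fine-tuned weights), and the filename `pytorch_model.bin` is an assumption; check the repository's file list.

```python
# Hedged sketch: download a BLINK cross-encoder checkpoint and inspect it.
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="UnlikelyAI/crossencoder-wiki-large",  # base model; replace with this model's repo id
    filename="pytorch_model.bin",                  # assumed; use the actual weight file name
)

# BLINK cross-encoder weights are a plain PyTorch state dict whose keys follow
# BLINK's naming (e.g. "bert_model.encoder.layer...", "additional_linear..."),
# so they are expected to be loaded through BLINK's crossencoder code rather
# than directly into a vanilla transformers model.
state_dict = torch.load(ckpt_path, map_location="cpu")
print(list(state_dict)[:10])
```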