tmp_exs_faquad

This model is a fine-tuned version of neuralmind/bert-base-portuguese-cased on the FaQuAD dataset.

Model description

More information needed

Intended uses & limitations

More information needed
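Although the card leaves this section open, the model is a fine-tuned extractive question-answering checkpoint, so it can be loaded with the standard `question-answering` pipeline. A minimal usage sketch, assuming the checkpoint is published under the repo id `eraldoluis/faquad-bert-base-portuguese-cased` named later on this page; the question and context strings are illustrative only:

```python
# Minimal QA usage sketch; the repo id and the example texts are assumptions,
# not confirmed by the card itself.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="eraldoluis/faquad-bert-base-portuguese-cased",
)

# FaQuAD is a Portuguese reading-comprehension dataset, so Portuguese
# question/context pairs are the expected input.
result = qa(
    question="Onde fica a universidade?",
    context="A Universidade Federal de Mato Grosso do Sul fica em Campo Grande.",
)
print(result["answer"])
```

The pipeline returns a dict with `answer`, `score`, `start`, and `end` keys, where `start`/`end` are character offsets of the extracted span within the context.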

Training and evaluation data

The model was trained on the FaQuAD train split and evaluated on its evaluation split.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 3.0
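The hyperparameters above map directly onto Hugging Face `TrainingArguments`. A sketch of that configuration follows; `output_dir` is a placeholder taken from the card's title, and the dataset loading and `Trainer` wiring are not shown:

```python
# Sketch of the reported hyperparameters as TrainingArguments.
# output_dir is an assumption; everything else mirrors the list above.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="tmp_exs_faquad",
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```

Note that `train_batch_size`/`eval_batch_size` in the card correspond to the per-device batch sizes here; with multiple GPUs or gradient accumulation the effective batch size would be larger.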

Training results

More information needed

Framework versions

  • Transformers 4.21.3
  • Pytorch 1.12.1+cu113
  • Datasets 2.4.0
  • Tokenizers 0.12.1
