---
license: apache-2.0
base_model: bert-base-uncased
tags:
- text-classification
- generated_from_trainer
metrics:
- accuracy
- f1
widget:
- text:
  - "Yucaipa owned Dominick's before selling the chain to Safeway in 1998 for $2.5 billion."
  - "Yucaipa bought Dominick's in 1995 for $693 million and sold it to Safeway for $1.8 billion in 1998."
  example_title: Not Equivalent
- text:
  - "Revenue in the first quarter of the year dropped 15 percent from the same period a year earlier."
  - "With the scandal hanging over Stewart's company, revenue the first quarter of the year dropped 15 percent from the same period a year earlier."
  example_title: Equivalent
model-index:
- name: platzi-distilroberta-base-mrpc-glue-alexander-ferreras
  results: []
---
# platzi-distilroberta-base-mrpc-glue-alexander-ferreras
This model is a fine-tuned version of bert-base-uncased on the MRPC task of the GLUE benchmark. It achieves the following results on the evaluation set:
- Loss: 0.4556
- Accuracy: 0.8137
- F1: 0.8742
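Since MRPC is a binary paraphrase task, the checkpoint can be queried as a plain sequence-classification model. Below is a minimal inference sketch: the repo id is assumed from the model name above (adjust the namespace as needed), and the label mapping 0 = not equivalent, 1 = equivalent follows the GLUE MRPC convention and should be checked against `model.config.id2label`.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed Hub repo id, taken from the model name above; adjust if the
# checkpoint lives under a user or org namespace.
model_id = "platzi-distilroberta-base-mrpc-glue-alexander-ferreras"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

s1 = "Yucaipa owned Dominick's before selling the chain to Safeway in 1998 for $2.5 billion."
s2 = "Yucaipa bought Dominick's in 1995 for $693 million and sold it to Safeway for $1.8 billion in 1998."

# MRPC is a sentence-pair task, so both sentences are encoded as one sequence.
inputs = tokenizer(s1, s2, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print("equivalent" if pred == 1 else "not equivalent")  # assumed GLUE label order
```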
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
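The card leaves the data undocumented, but given the MRPC fine-tuning, a plausible preparation sketch with 🤗 Datasets would be the following; the `sentence1`/`sentence2` column names are the standard GLUE MRPC schema, not something documented by this card.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# GLUE MRPC: pairs of sentences labeled 0 (not equivalent) or 1 (equivalent).
raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Encode each sentence pair as a single input sequence.
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

tokenized = raw.map(tokenize, batched=True)
```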
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `Trainer` sketch reproducing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
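Here is a 🤗 `Trainer` configuration consistent with these values, as a sketch rather than the exact training script: the listed Adam betas and epsilon are the `TrainingArguments` defaults, the 500-step eval cadence is an assumption inferred from the results table below, and `tokenized`/`tokenizer` come from the dataset sketch above.

```python
import numpy as np
import evaluate
from transformers import (
    AutoModelForSequenceClassification,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
metric = evaluate.load("glue", "mrpc")  # reports both accuracy and f1

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return metric.compute(predictions=np.argmax(logits, axis=-1), references=labels)

args = TrainingArguments(
    output_dir="platzi-distilroberta-base-mrpc-glue-alexander-ferreras",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",      # linear decay, as listed above
    num_train_epochs=3,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so not set here.
    evaluation_strategy="steps",     # assumed: eval every 500 steps, matching the table below
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],        # from the dataset sketch above
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,
    data_collator=DataCollatorWithPadding(tokenizer),
    compute_metrics=compute_metrics,
)
trainer.train()
```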
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.5101        | 1.09  | 500  | 0.4556          | 0.8137   | 0.8742 |
| 0.2824        | 2.18  | 1000 | 0.6425          | 0.8480   | 0.8942 |
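The headline numbers at the top of the card (loss 0.4556, accuracy 0.8137, F1 0.8742) match the step-500 row. An evaluation pass can be re-run with the `Trainer` from the sketch above:

```python
# Re-run evaluation on the GLUE MRPC validation split.
eval_results = trainer.evaluate(tokenized["validation"])
print(eval_results)  # includes eval_loss, eval_accuracy, eval_f1
```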
### Framework versions
- Transformers: 4.37.2
- Pytorch: 2.1.0+cu121
- Datasets: 2.17.0
- Tokenizers: 0.15.1