---
license: apache-2.0
datasets:
- legacy-datasets/banking77
language:
- en
metrics:
- accuracy
base_model:
- bert-base-uncased
pipeline:
- GEM_pipeline
---
# GEM_Banking77 Model Card
This model card provides an overview of the GEM_Banking77 model, a fine-tuned implementation of the GEM architecture designed for the **Banking77** dataset.
## Purpose
The GEM_Banking77 model was developed to evaluate the performance of the **GEM architecture** on **domain-specific datasets**, particularly in the banking and financial sector. The **Banking77 dataset**, a benchmark for **intent classification**, was chosen to assess the model’s effectiveness.
## Key Details
- **License**: Apache-2.0
- **Dataset**: `legacy-datasets/banking77` (see the loading sketch after this list)
- **Language**: English
- **Metrics**: Accuracy (**92.56%**)
- **Base Model**: bert-base-uncased
- **Pipeline**: GEM_pipeline
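For convenience, the dataset and base checkpoint listed above can be loaded with the standard Hugging Face libraries. The snippet below is a minimal sketch; it does not include the GEM-specific fine-tuning code, and it assumes the public Banking77 schema (`text`, `label` columns).

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Banking77: 77 fine-grained banking intents.
dataset = load_dataset("legacy-datasets/banking77")

# Base checkpoint the GEM model was fine-tuned from.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Tokenize with the same maximum sequence length used for fine-tuning (128).
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True)
print(tokenized["train"][0].keys())
```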
## Model Details
The GEM_Banking77 model is built on the **GEM architecture** and fine-tuned from `bert-base-uncased` using the **Banking77 dataset**. The model configuration is as follows (a training-setup sketch follows the list):
- **Number of epochs**: **10**
- **Batch size**: **32 × number of GPUs** (scaled dynamically with the available GPUs)
- **Learning rate**: **2e-5**
- **Maximum sequence length**: **128**
- **Gradient accumulation steps**: **2**
- **Cluster size**: **256**
- **Number of domains**: **8**
- **Number of classes**: **77**
- **Number of attention heads**: **12**
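A sketch of how these hyperparameters could be expressed with a Hugging Face `Trainer`-style setup is shown below. The `TrainingArguments` part covers the standard values; the GEM-specific settings are collected in a plain dictionary because the actual GEM configuration class is not included in this card, so those key names are illustrative.

```python
from transformers import TrainingArguments

# Trainer reproduces the "32 × number of GPUs" batch size automatically:
# per_device_train_batch_size is applied on each visible GPU.
training_args = TrainingArguments(
    output_dir="gem-banking77",
    num_train_epochs=10,
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    gradient_accumulation_steps=2,
)

# GEM-specific hyperparameters from the list above (illustrative names).
gem_config = {
    "cluster_size": 256,
    "num_domains": 8,
    "num_classes": 77,
    "num_attention_heads": 12,
    "max_seq_length": 128,
}
```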
## Training & Evaluation
The model was trained using the **GEM_pipeline** and evaluated with **accuracy** as the metric, achieving a score of **92.56%**.
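If the released checkpoint loads with the standard text-classification pipeline (an assumption; a custom GEM head may need its own loading code, e.g. `trust_remote_code=True`), the reported accuracy can be checked with a sketch like the one below. The repository id `GEM_Banking77` is a placeholder.

```python
import numpy as np
from datasets import load_dataset
from transformers import pipeline

# Placeholder repo id; replace with the actual Hub id of this model.
clf = pipeline("text-classification", model="GEM_Banking77")

test = load_dataset("legacy-datasets/banking77", split="test")
preds = clf(test["text"], batch_size=32, truncation=True, max_length=128)

# Assumes the default "LABEL_<id>" names; adjust if the config defines id2label.
pred_ids = [int(p["label"].split("_")[-1]) for p in preds]
accuracy = float(np.mean(np.array(pred_ids) == np.array(test["label"])))
print(f"accuracy = {accuracy:.4f}")  # reported: 0.9256
```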