---
language:
  - en
thumbnail: >-
  https://avatars3.githubusercontent.com/u/32437151?s=460&u=4ec59abc8d21d5feea3dab323d23a5860e6996a4&v=4
tags:
  - text-classification
  - ag_news
  - pytorch
license: mit
datasets:
  - ag_news
metrics:
  - accuracy
model-index:
  - name: nateraw/bert-base-uncased-ag-news
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: ag_news
          type: ag_news
          config: default
          split: test
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9414473684210526
            verified: true
          - name: Precision Macro
            type: precision
            value: 0.9416084922682436
            verified: true
          - name: Precision Micro
            type: precision
            value: 0.9414473684210526
            verified: true
          - name: Precision Weighted
            type: precision
            value: 0.9416084922682435
            verified: true
          - name: Recall Macro
            type: recall
            value: 0.9414473684210527
            verified: true
          - name: Recall Micro
            type: recall
            value: 0.9414473684210526
            verified: true
          - name: Recall Weighted
            type: recall
            value: 0.9414473684210526
            verified: true
          - name: F1 Macro
            type: f1
            value: 0.9414706154045142
            verified: true
          - name: F1 Micro
            type: f1
            value: 0.9414473684210526
            verified: true
          - name: F1 Weighted
            type: f1
            value: 0.9414706154045143
            verified: true
          - name: loss
            type: loss
            value: 0.17173650860786438
            verified: true
---

# bert-base-uncased-ag-news

## Model description

`bert-base-uncased` fine-tuned on the AG News dataset using PyTorch Lightning. Training used a sequence length of 128, a learning rate of 2e-5, a batch size of 32, 4 T4 GPUs, and 4 epochs. The code can be found here.
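A minimal usage sketch with the `transformers` pipeline API (the checkpoint name comes from this card's metadata; the first call downloads the model, so network access is required):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub and classify a news snippet
# into one of AG News' four topics (World, Sports, Business, Sci/Tech).
classifier = pipeline("text-classification", model="nateraw/bert-base-uncased-ag-news")

result = classifier("Stocks rallied after the central bank cut interest rates.")
print(result)  # e.g. [{'label': ..., 'score': ...}]
```

The returned label string depends on the checkpoint's `id2label` mapping.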

## Limitations and bias

- Not the best model...

## Training data

The training data came from Hugging Face's `datasets` package and can be viewed on the nlp viewer.

## Training procedure

...

## Eval results

...