---
language: en
tags:
- Text Classification
license: apache-2.0
datasets:
- batterydata/paper-abstracts
metrics:
- glue
---

# BERT-base-uncased for Battery Abstract Classification

**Language model:** bert-base-uncased
**Language:** English
**Downstream task:** Text Classification
**Training data:** training_data.csv
**Eval data:** val_data.csv
**Code:** See example
**Infrastructure:** 8x DGX A100

## Hyperparameters

```
batch_size = 32
n_epochs = 13
base_LM_model = "bert-base-uncased"
learning_rate = 2e-5
```
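These hyperparameters fit a standard Hugging Face fine-tuning setup. The card does not include the training script, so the sketch below is only an illustration of how the listed values could be passed to `TrainingArguments`; the output directory, evaluation schedule, and `num_labels` are assumptions, not details from the card.

```python
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Hyperparameters taken from the card; everything else is an assumption.
base_model = "bert-base-uncased"
args = TrainingArguments(
    output_dir="battery-abstract-clf",  # assumed output directory
    per_device_train_batch_size=32,     # batch_size = 32
    num_train_epochs=13,                # n_epochs = 13
    learning_rate=2e-5,                 # learning_rate = 2e-5
)

tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=2 assumes a binary battery/non-battery abstract label set.
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=...,  # tokenized training_data.csv
#                   eval_dataset=...)   # tokenized val_data.csv
# trainer.train()
```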

## Performance

```
"Validation accuracy": 96.79,
"Test accuracy": 96.29,
```

## Usage

### In Transformers

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name = "batterydata/bert-base-uncased-abstract"

# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)

# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```

## Authors

Shu Huang: sh2009 [at] cam.ac.uk

Jacqueline Cole: jmc61 [at] cam.ac.uk

## Citation

BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement