---
tags:
- generated_from_keras_callback
model-index:
- name: bert-base-cased-trec-fine
  results: []
---

# bert-base-cased-trec-fine

This model was trained from scratch on an unknown dataset.
It achieves the following results on the evaluation set:

## Model description

More information needed

## Intended uses & limitations

### How to use

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

model_name = "ndavid/bert-base-cased-trec-fine"

# The checkpoint was saved with Keras, so load the TensorFlow weights
# into a PyTorch model with `from_tf=True`.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, from_tf=True)

# Fine-grained TREC question classification
nlp = pipeline("text-classification", model=model, tokenizer=tokenizer)
results = nlp(["Where did the queen go?", "Why did the Queen hire 1000 ML Engineers?"])
print(results)
```

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a Keras sketch of this optimizer configuration is included at the end of this card):
- optimizer: {'name': 'Adam', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32

### Training results

### Framework versions

- Transformers 4.18.0
- TensorFlow 2.8.0
- Datasets 2.0.0
- Tokenizers 0.12.1
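
The optimizer settings listed above correspond to a standard Keras `Adam` instance. A minimal sketch of recreating it, assuming TensorFlow 2.8 as pinned in the framework versions; this is illustrative and not taken from the original training script:

```python
import tensorflow as tf

# Recreate the Adam optimizer with the hyperparameters listed in this card.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=5e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    decay=0.0,       # legacy per-step learning-rate decay; 0.0 disables it
    amsgrad=False,
)

# model.compile(optimizer=optimizer, ...)  # loss and metrics are not documented in this card
```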