AyoubChLin committed on
Commit
8b2cb2a
1 Parent(s): a3ee01f

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -29,7 +29,7 @@ and a validation accuracy of 0.960415.
  - **Finetuned from model [optional]:** [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased)
 
 
-# Usage
+### Usage
 
 You can use this model with the Hugging Face Transformers library for a variety of natural language processing tasks, such as text classification, sentiment analysis, and more.
 
@@ -54,5 +54,5 @@ predicted_class_id = logits.argmax().item()
 ```
 In this example, we first load the tokenizer and the model using their respective from_pretrained methods. We then encode a news article using the tokenizer, pass the inputs through the model, and extract the predicted label using the argmax function. Finally, we map the predicted label to its corresponding category using a list of labels.
 
-Contributors
+### Contributors
 This model was fine-tuned by CHERGUELAINE Ayoub and BOUBEKRI Faycal.
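
The final step the README describes (taking the argmax over the logits and mapping it to a category name) can be sketched in plain Python. The logits values and label names below are illustrative placeholders, not the model's actual output or label set; substitute the label list published with the model:

```python
# Hypothetical logits for one input, as returned by the model's forward pass
logits = [0.1, 2.7, 0.3, 0.9]

# Placeholder category names; replace with the model's own label list
labels = ["label_0", "label_1", "label_2", "label_3"]

# argmax: index of the highest logit (equivalent to logits.argmax().item())
predicted_class_id = max(range(len(logits)), key=lambda i: logits[i])

# Map the predicted class id to its category name
predicted_label = labels[predicted_class_id]
print(predicted_label)  # -> label_1
```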