Commit 44fdc0c by Chhabi (1 parent: e286d0a)

Update README.md

Files changed (1): README.md (+26 −0)
The model was trained for 5 epochs. The training loss consistently decreased, indicating successful learning.
## Use Case

```python
from transformers import BartTokenizer, BartForConditionalGeneration

# Load the trained model
model = BartForConditionalGeneration.from_pretrained("NepaliAI/NFT-6.9k")

# Load the matching tokenizer
tokenizer = BartTokenizer.from_pretrained("NepaliAI/NFT-6.9k")

# Example input (Nepali: "Can I take the medicine my doctor prescribed
# for strep throat during my period?")
input_text = "के म मेरो महिनावारीको समयमा स्ट्रेप थ्रोटको लागि डाक्टरले तोकेको औषधि लिन सक्छु?"

# Tokenize the input
inputs = tokenizer(input_text, return_tensors="pt", max_length=128, truncation=True)

# Generate a response with sampling
generated_text = model.generate(
    **inputs,
    max_length=256,
    do_sample=True,
    top_p=0.95,
    top_k=50,
    temperature=0.7,
    num_return_sequences=1,
    no_repeat_ngram_size=2,
)

# Decode the generated tokens back into text
generated_response = tokenizer.batch_decode(generated_text, skip_special_tokens=True)[0]

# Print the generated response
print("Generated Response:", generated_response)
```
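The sampling arguments passed to `generate` work roughly as follows: logits are divided by `temperature` before the softmax (lower values sharpen the distribution), and `top_p` restricts sampling to the smallest set of tokens whose cumulative probability reaches the threshold. This is a simplified, self-contained sketch with hypothetical logits, not the library's actual implementation, and it needs no model download:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: divide logits by T, then normalize.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, top_p=0.95):
    # Keep the smallest set of highest-probability token indices
    # whose cumulative probability reaches top_p (nucleus sampling).
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept

# Hypothetical logits for a 5-token vocabulary
logits = [2.0, 1.0, 0.5, 0.1, -1.0]

sharp = softmax(logits, temperature=0.7)  # sharper than temperature=1.0
flat = softmax(logits, temperature=1.0)
print("nucleus at top_p=0.95:", top_p_filter(sharp, top_p=0.95))
```

With `temperature=0.7` the top token's probability rises relative to `temperature=1.0`, and `top_p=0.95` drops only the least likely tail before sampling.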
## Evaluation

### Metrics