varma007ut committed commit 334d76e (verified; parent: 5b7116a)

Update README.md

Files changed (1): README.md (+0 -22)
README.md CHANGED
@@ -33,26 +33,4 @@ This model is a fine-tuned version of the `unsloth/meta-llama-3.1-8b-bnb-4bit` d
 
  To use this model, ensure you have the necessary libraries installed. You can install them using pip:
 
- ```bash
- pip install transformers
- ```
-
- ## Usage
-
- Here’s an example of how to load and use the model for text generation:
-
- ```python
- from transformers import AutoModelForCausalLM, AutoTokenizer
-
- model_name = "your_model_name"  # Replace with your model's name
-
- # Load model and tokenizer
- tokenizer = AutoTokenizer.from_pretrained(model_name)
- model = AutoModelForCausalLM.from_pretrained(model_name)
-
- # Generate text
- input_text = "What are the symptoms of diabetes?"
- input_ids = tokenizer.encode(input_text, return_tensors='pt')
-
- output = model.generate(input_ids, max_length=150)
- generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
-
- print(generated_text)
- ```
 