---
language:
  - en
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - mistral
  - trl
widget:
  - text: >-
      काठमाडौंको बहिराव बसपार्कमा एक भयानक दुर्घटना घटेको थियो। रातको समय थियो र
      भारी वर्षा जम्मा भएको थियो।
base_model: unsloth/Phi-3-mini-4k-instruct-bnb-4bit
pipeline_tag: text2text-generation
datasets:
  - sanjeev-bhandari01/nepali-summarization-dataset
---

# Uploaded model

- Developed by: Dragneel
- License: apache-2.0
- Finetuned from model: unsloth/Phi-3-mini-4k-instruct-bnb-4bit (see the loading sketch below)
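
Because the base checkpoint is an Unsloth 4-bit model, the fine-tuned weights can also be loaded through Unsloth's `FastLanguageModel` wrapper for faster, lower-memory inference. This is a minimal sketch, assuming the `unsloth` package is installed and that the merged f16 checkpoint loads like any other Transformers model; `max_seq_length` and `load_in_4bit` below are illustrative settings, not values from the model card.

```python
from unsloth import FastLanguageModel

# Assumption: the merged f16 checkpoint can be loaded directly by Unsloth.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16",
    max_seq_length=4096,   # Phi-3-mini-4k context window
    load_in_4bit=True,     # optional: quantize at load time to save memory
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference mode
```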

## Use The Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16")
model = AutoModelForCausalLM.from_pretrained("Dragneel/Phi-3-mini-Nepali-Text-Summarization-f16")

# Example input: the task prefix "Summarize Nepali Text in Nepali:" is followed by the article.
# Rough English gloss of the article: "A terrible accident had occurred at a bus park in
# Kathmandu. It was night-time and heavy snow had piled up."
input_text = "Summarize Nepali Text in Nepali: काठमाडौंको बहिराव बसपार्कमा एक भयानक दुर्घटना घटेको थियो। रातको समय थियो र भारी बर्फ जम्मा भएको थियो।"

# Tokenize the input text
input_ids = tokenizer.encode(input_text, return_tensors='pt')

# Generate text (here only max_new_tokens is adjusted)
outputs = model.generate(input_ids, max_new_tokens=50)
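
# Other generation parameters can be adjusted as well. The values below are illustrative
# only (not taken from the model card); all are standard `generate()` kwargs:
# outputs = model.generate(
#     input_ids,
#     max_new_tokens=128,
#     do_sample=True,
#     temperature=0.7,
#     top_p=0.9,
#     repetition_penalty=1.1,
# )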

# Decode the generated tokens (the decoded string includes the prompt followed by the summary)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)

print(generated_text)
```
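
Because this is a causal language model, the decoded string above echoes the prompt before the summary. Below is a minimal sketch that reuses the `model` and `tokenizer` loaded above and decodes only the newly generated tokens; the `summarize_nepali` helper name and the default `max_new_tokens` value are illustrative, not part of the model card.

```python
def summarize_nepali(text: str, max_new_tokens: int = 100) -> str:
    """Summarize a Nepali article and return only the generated summary."""
    prompt = f"Summarize Nepali Text in Nepali: {text}"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    outputs = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Drop the prompt tokens so only the model's continuation is decoded.
    new_tokens = outputs[0][input_ids.shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

print(summarize_nepali("काठमाडौंको बहिराव बसपार्कमा एक भयानक दुर्घटना घटेको थियो। रातको समय थियो र भारी बर्फ जम्मा भएको थियो।"))
```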