# Dorami

A GPT-based pretrained model using the BERT tokenizer.
## Model description

## Training data

## Training code
## How to use
1. Download the model from the Hugging Face Hub to a local directory:

```bash
git lfs install
git clone https://huggingface.co/lucky2me/Dorami
```
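
Alternatively, the checkpoint can be fetched programmatically. The sketch below uses `snapshot_download` from the `huggingface_hub` library (an alternative not mentioned above); it downloads the repository into the local cache and returns the path.

```python
# Programmatic alternative to git clone (a sketch, not part of the
# original instructions).
from huggingface_hub import snapshot_download

# Downloads the repository files and returns the local directory path.
model_path = snapshot_download(repo_id="lucky2me/Dorami")
print("model downloaded to:", model_path)
```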
2. Run inference with the downloaded model:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "The path of the model downloaded above"

# Load the tokenizer and model from the local checkpoint.
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# Encode a prompt and predict the single most likely next token.
text = "fill in any text you like."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)
predicted_token_id = torch.argmax(output.logits[:, -1, :], dim=-1)
decoded_text = tokenizer.decode(predicted_token_id, skip_special_tokens=True)
print("decoded text:", decoded_text)
```
## Model tree for lucky2me/Dorami

Base model: openai-community/gpt2