---
language:
  - ru
---

# ruGPT-3.5-13B-8bit

An 8-bit quantized version of the ruGPT-3.5 model.

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="pe4enov/ruGPT-3.5-13B-8bit")
```
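
A minimal usage sketch, not part of the original card: the prompt and `max_new_tokens` value are arbitrary examples.

```python
# Illustrative prompt; any Russian text can be used as input.
result = pipe("Искусственный интеллект — это", max_new_tokens=50)
print(result[0]["generated_text"])
```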

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

# The tokenizer comes from the original ai-forever/ruGPT-3.5-13B repository;
# the 8-bit weights come from this repository.
tokenizer = AutoTokenizer.from_pretrained("ai-forever/ruGPT-3.5-13B")
model = AutoModelForCausalLM.from_pretrained("pe4enov/ruGPT-3.5-13B-8bit")
```
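
A minimal generation sketch built on the tokenizer and model loaded above; the prompt and sampling parameters are illustrative assumptions, not part of the original card.

```python
# Encode an example prompt and generate a continuation with sampling.
inputs = tokenizer("Гладить манула лучше всего", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```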