---
license: mit
language:
- ru
library_name: transformers
---

# llama-600M-rus

A simple, customized, amateur experimental model pretrained from scratch on approximately 300 MB of Russian fiction books.
Given the number of training tokens, it generates amateur but more or less adequate output.
The model can be used as a checkpoint for further training (a sketch is given below) or for experiments.
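If you want to continue pretraining from this checkpoint on your own corpus, a minimal sketch with the 🤗 `Trainer` might look like this. The file name `my_books.txt` and all hyperparameters are illustrative placeholders, not the settings used for the original training:

```python
from transformers import (LlamaTokenizerFast, LlamaForCausalLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import load_dataset

model = LlamaForCausalLM.from_pretrained('demetera/llama-600M-rus')
tokenizer = LlamaTokenizerFast.from_pretrained('demetera/llama-600M-rus')

# The data collator needs a pad token; fall back to EOS if none is set
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# 'my_books.txt' is a placeholder for your own plain-text corpus
dataset = load_dataset('text', data_files={'train': 'my_books.txt'})

def tokenize(batch):
    return tokenizer(batch['text'], truncation=True, max_length=512)

tokenized = dataset['train'].map(tokenize, batched=True, remove_columns=['text'])

# Causal LM objective: the collator builds labels from the input ids
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir='llama-600M-rus-continued',  # placeholder output path
    per_device_train_batch_size=4,
    num_train_epochs=1,
    learning_rate=2e-5,
    logging_steps=100,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator)
trainer.train()
```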
Simple usage example:

```python
from transformers import LlamaTokenizerFast, LlamaForCausalLM

model = LlamaForCausalLM.from_pretrained('demetera/llama-600M-rus')
tokenizer = LlamaTokenizerFast.from_pretrained('demetera/llama-600M-rus')

prompt = "Я вышел на улицу и"
inputs = tokenizer(prompt, return_tensors='pt')

outputs = model.generate(
    inputs.input_ids,
    attention_mask=inputs.attention_mask,
    max_new_tokens=250,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
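The same generation can also be done through the `pipeline` helper; a minimal sketch, reusing the sampling parameters from the example above:

```python
from transformers import pipeline

# Builds the model and tokenizer from the hub in one call
generator = pipeline('text-generation', model='demetera/llama-600M-rus')

result = generator("Я вышел на улицу и",
                   max_new_tokens=250, do_sample=True, top_k=50, top_p=0.95)
print(result[0]['generated_text'])
```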