Tags: Text Generation · Transformers · Safetensors · German · llama · text-generation-inference

LLäMmlein 120M

This is a German TinyLlama 120M language model trained from scratch, using the TinyLlama codebase, on the German portion of RedPajama V2. Find more details on our page and in our preprint!

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_120M")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")
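Once loaded, the model works with the standard transformers generation API. A minimal sketch, assuming default settings; the German prompt and the sampling parameters (max_new_tokens, top_p, temperature) are illustrative choices, not recommendations from the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the 120M checkpoint from the Hub (downloads the weights on first run)
model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_120M")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")

# Illustrative German prompt; sampling settings are assumptions, not card defaults
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)

# Decode the full sequence (prompt + continuation) back to text
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is a base model without instruction tuning, it continues text rather than answering questions, so prompts should be phrased as sentence openings to be completed.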

Performance

We evaluated our model on the SuperGLEBer benchmark.

Model details

Format: Safetensors
Model size: 125M params
Tensor type: F32
