LLäMmlein 120M

This is a German TinyLlama 120M language model, trained from scratch on the German portion of RedPajama V2 using the TinyLlama codebase. Find more details on our page and in our preprint!

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_120M")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")
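
Once loaded, the model can be used for text generation through the standard transformers generate API. The sketch below extends the snippet above; the German prompt and the generation settings are illustrative assumptions, not part of the original card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")
model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_120M")

# Encode an example German prompt and generate a greedy continuation
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Setting do_sample=False gives deterministic greedy decoding; for more varied output, pass do_sample=True together with a temperature or top_p value.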

Performance

We evaluated our model on the SuperGLEBer benchmark.

Model size: 125M parameters (safetensors, F32)
