LLäMmlein 1B

LLäMmlein 1B is a German 1B-parameter language model trained from scratch on the German portion of RedPajama V2, using the TinyLlama codebase. Find more details on our project page and in our preprint!

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_1B")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_1B")
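Once the model and tokenizer are loaded, text can be generated with the standard `generate` API. A minimal sketch follows; the German prompt and the sampling parameters (`max_new_tokens`, `top_p`) are illustrative choices, not settings recommended by the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_1B")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_1B")

# Example prompt (illustrative): German continuation task
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; parameters are assumptions, tune to taste
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```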

Evaluation

We evaluated the model on the SuperGLEBer benchmark.

Model size: 1.1B parameters · Tensor type: F32 (Safetensors)
