Model Card for AICrossSim/clm-200m

A 200M-parameter language model trained on 4.4B tokens (22 × 200M) from the FineWeb-Edu dataset.

Model Details

aixsim-200M is a transformer-based language model with approximately 200 million parameters (excluding embedding-layer parameters). It uses RMSNorm for normalization and is trained on the FineWeb-Edu dataset.
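For reference, RMSNorm rescales activations by their root-mean-square without mean-centering (unlike LayerNorm). A minimal NumPy sketch of the operation (the gain shape and epsilon value here are illustrative, not taken from this model's config):

```python
import numpy as np

def rms_norm(x, gain, eps=1e-6):
    # Root-mean-square over the feature dimension; no mean subtraction.
    rms = np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    # Scale to unit RMS, then apply the learned per-feature gain.
    return x / rms * gain

x = np.array([3.0, 4.0])
y = rms_norm(x, gain=np.ones(2))
# After normalization (with gain 1), the RMS of y is ~1.
```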

Training Details

The experiment setup and training logs can be found in the associated wandb run.

Usage

import transformers

model_name = "AICrossSim/clm-200m"
model = transformers.AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
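Once loaded, the model can be used for text generation in the standard `transformers` way. A minimal sketch (the prompt and sampling settings below are illustrative, not recommendations from the model authors):

```python
import transformers

model_name = "AICrossSim/clm-200m"
model = transformers.AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)

# Encode a prompt and generate a short continuation.
inputs = tokenizer("The Roman Empire", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```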

lm-evaluation-harness

| Tasks    | Version | Filter | n-shot | Metric            |   Value | Stderr |
|----------|---------|--------|--------|-------------------|--------:|--------|
| wikitext | 2       | none   | 0      | bits_per_byte ↓   |  1.0994 | ± N/A  |
|          |         | none   | 0      | byte_perplexity ↓ |  2.1427 | ± N/A  |
|          |         | none   | 0      | word_perplexity ↓ | 58.8531 | ± N/A  |
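The first two wikitext metrics are two views of the same quantity: byte-level perplexity is 2 raised to the bits-per-byte. A quick consistency check on the reported values:

```python
# byte_perplexity = 2 ** bits_per_byte
bits_per_byte = 1.0994
byte_perplexity = 2 ** bits_per_byte
# ~2.1427, matching the reported byte_perplexity
print(round(byte_perplexity, 4))
```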
