Model Card for AICrossSim/bitflip-clm-200m

A 200M-parameter bitflip-aware language model trained on approximately 4.4B tokens (22 × 200M) from the FineWeb-Edu dataset.

Model Details

bitflip-aixsim-200M is a transformer-based language model with approximately 200 million parameters (excluding the embedding layers). It uses RMSNorm for normalization and is trained on the FineWeb-Edu dataset.

Training Details

The experiment setup and training logs are available in the associated Weights & Biases (wandb) run.

Checkpoint: Safetensors, 262M parameters in total (including embeddings), stored as F32 tensors.
