This repository, YiDuo1999/Llama-3-Physician-8B-Base, provides the official base model weights for "Efficient Continual Pre-training by Mitigating the Stability Gap".

The model has been continually pretrained on a high-quality medical sub-corpus from the RefinedWeb dataset.
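Below is a minimal loading sketch using the Hugging Face transformers library. It assumes the standard AutoModelForCausalLM / AutoTokenizer interface for Llama-3-style checkpoints; the dtype, device placement, and example prompt are illustrative choices, not part of the original card.

```python
# Illustrative example: load the base weights and run a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YiDuo1999/Llama-3-Physician-8B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16; switch to float16/float32 if needed
    device_map="auto",           # requires the accelerate package
)

# Base (non-instruction-tuned) model, so use a completion-style prompt.
prompt = "Hypertension is defined as"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since these are base weights rather than an instruction-tuned checkpoint, completion-style prompting as above is generally more appropriate than chat-formatted input.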
