ARIA V3 has been trained on over 100,000 high-quality French-language examples, with a focus on reducing data bias and improving the grammar and overall language/writing capabilities of the model.

Training was performed on NVIDIA GPUs in the cloud using Amazon SageMaker.

Base model: Llama2-70B-Chat-HF

Dataset: private

Added value: French language / writing / content creation / data-bias reduction
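A minimal usage sketch with the Hugging Face `transformers` library, assuming the model is loaded from the `Faradaylab/ARIA-70B-V3` repository and follows the Llama-2 chat prompt convention of its base model (both the prompt template here and the generation parameters are illustrative assumptions, not an official recipe):

```python
def build_prompt(system: str, user: str) -> str:
    # Single-turn Llama-2 chat prompt format, inherited from the base model
    # (Llama2-70B-Chat-HF). This template is an assumption for illustration.
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    # Heavy: downloads ~140 GB of FP16 weights and needs multi-GPU hardware,
    # so this function is defined but not executed here.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Faradaylab/ARIA-70B-V3"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,   # weights are stored in FP16
        device_map="auto",           # shard across available GPUs
    )

    prompt = build_prompt(
        "Tu es un assistant utile qui répond en français.",  # example system prompt
        user_message,
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```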

Feel free to reach out to us! contact@faradaylab.fr

