Tags: Text Generation · Transformers · Safetensors · English · mistral · text-generation-inference · unsloth · trl · langchain · Inference Endpoints

Updated to include LangChain code and documentation, as the model previously could not generate correct LangChain code. Transformers documentation was also re-added (it was not lost) as Python code fragments, Markdown pages, and HTML pages. This data was fit to a loss of about 0.9. (The Bible was attempted again, but its loss is still stuck around 2.4; perhaps a specialized training session will be needed for that data.)
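Since the update is about LangChain support, here is a minimal usage sketch for running this model through LangChain. This is an illustration only, not code from the card: the prompt and task are hypothetical, and loading the 7B weights requires a GPU plus the `transformers` and `langchain-community` packages.

```python
# Sketch only: wraps this model in a LangChain chain via a local
# transformers pipeline. Model id is taken from this card; everything
# else (prompt, task, max_new_tokens) is a placeholder.
from transformers import pipeline
from langchain_community.llms import HuggingFacePipeline
from langchain_core.prompts import PromptTemplate

generator = pipeline(
    "text-generation",
    model="LeroyDyer/Mixtral_AI_LCARS_LC",
    max_new_tokens=256,
)
llm = HuggingFacePipeline(pipeline=generator)

prompt = PromptTemplate.from_template("Write a Python function that {task}.")
chain = prompt | llm
print(chain.invoke({"task": "reverses a string"}))
```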

In this training run only about 2 million parameters were moved: this is easy data to fit, since the model can already program (no need for deep embedding).
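As a rough sanity check on that 2 million figure, here is a back-of-the-envelope count of rank-1 LoRA parameters over the usual Mistral-7B projection matrices. The dimensions and target modules below are assumptions based on the standard Mistral-7B architecture, not values stated in this card:

```python
# Rank-r LoRA adds A (r x d_in) and B (d_out x r) per adapted matrix,
# i.e. r * (d_in + d_out) trainable parameters for each one.
# Dimensions assume the standard Mistral-7B config.
HIDDEN = 4096   # hidden size
KV = 1024       # key/value projection width (8 KV heads x 128)
FFN = 14336     # MLP intermediate size
LAYERS = 32
R = 1           # LoRA rank

# (d_in, d_out) for the seven commonly targeted modules per layer
targets = {
    "q_proj": (HIDDEN, HIDDEN),
    "k_proj": (HIDDEN, KV),
    "v_proj": (HIDDEN, KV),
    "o_proj": (HIDDEN, HIDDEN),
    "gate_proj": (HIDDEN, FFN),
    "up_proj": (HIDDEN, FFN),
    "down_proj": (FFN, HIDDEN),
}

per_layer = sum(R * (d_in + d_out) for d_in, d_out in targets.values())
total = per_layer * LAYERS
print(f"{total:,} trainable parameters")  # 2,621,440 -- on the order of 2M
```

So a rank-1 adapter over all attention and MLP projections lands in the low millions of parameters, consistent with the note above.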

A LoRA of rank 1:1 (rank and alpha both 1) is useful for data that only needs to be slightly fit, or data that the model is already close to.

Hence, for Bible training, more tensors will have to be moved, which has a more drastic effect on the model (not greatly desirable). In that case a specific merge candidate will have to be created, and its data merged into the prime model instead.
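The merge step mentioned above can be sketched in miniature: a LoRA adapter contributes a low-rank update that is folded into the base weights as W' = W + (alpha/r) * B @ A. The toy below uses tiny hypothetical 2x2 dimensions and plain Python lists purely to show the arithmetic; a real merge would use peft's merge utilities on the full tensors.

```python
# Toy illustration of merging a rank-1 LoRA update into base weights:
#   W_merged = W + (alpha / r) * (B @ A)
# Matrices are nested lists; all values here are hypothetical.

def lora_delta(B, A, alpha, r):
    """Compute (alpha / r) * B @ A for B: d_out x r, A: r x d_in."""
    scale = alpha / r
    d_out, d_in, rank = len(B), len(A[0]), len(A)
    return [
        [scale * sum(B[i][k] * A[k][j] for k in range(rank)) for j in range(d_in)]
        for i in range(d_out)
    ]

def merge(W, B, A, alpha=1, r=1):
    """Fold the LoRA update into the base weight matrix W."""
    delta = lora_delta(B, A, alpha, r)
    return [[w + d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]

W = [[1.0, 0.0], [0.0, 1.0]]  # base weight (identity, for illustration)
B = [[0.5], [0.25]]           # d_out x r, with r = 1
A = [[2.0, 4.0]]              # r x d_in

print(merge(W, B, A))  # [[2.0, 2.0], [0.5, 2.0]]
```

Because the update is rank 1, the whole delta is determined by one column (B) and one row (A), which is why so few parameters need to move during training.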

Uploaded model

  • Developed by: LeroyDyer
  • License: apache-2.0
  • Finetuned from model: LeroyDyer/Mixtral_AI_LCARS_tg_1

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
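A training setup like the one described above (rank-1 LoRA via Unsloth, trained with TRL) might be configured roughly as follows. This is a configuration sketch following the usual Unsloth + TRL pattern, not the author's actual script: the dataset and hyperparameters are placeholders, and it will not run without a GPU and a prepared dataset.

```python
# Sketch only: usual Unsloth + TRL fine-tuning pattern.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="LeroyDyer/Mixtral_AI_LCARS_tg_1",  # base model from this card
    max_seq_length=2048,
    load_in_4bit=True,
)

# Rank-1 adapter, as described above -- a very light touch on the weights.
model = FastLanguageModel.get_peft_model(
    model,
    r=1,
    lora_alpha=1,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,        # placeholder: your prepared dataset
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        num_train_epochs=1,
        output_dir="outputs",
    ),
)
trainer.train()
```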

Safetensors · Model size: 7.24B params · Tensor type: FP16

Datasets used to train LeroyDyer/Mixtral_AI_LCARS_LC