---
library_name: transformers
tags: []
---

It turns out that mistralai/Mistral-7B-Instruct-v0.3 ships only safetensors weights. This repository was created to host the PyTorch .bin files of the same model.

This repo was created with the following snippet:

```python
from transformers import AutoModelForCausalLM

model_id = "mistralai/Mistral-7B-Instruct-v0.3"
model = AutoModelForCausalLM.from_pretrained(model_id)
# safe_serialization=False pushes the weights as .bin files instead of safetensors
model.push_to_hub("ariG23498/Mistral-7B-Instruct-v0.3", safe_serialization=False)
```
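
If you want to double-check that the push produced .bin files rather than safetensors, one convenience check (not part of the original workflow, just a sketch using `huggingface_hub`) is to list the files in the repo:

```python
from huggingface_hub import list_repo_files

# List the files hosted in this repo; you should see pytorch_model-*.bin shards
# rather than model-*.safetensors.
files = list_repo_files("ariG23498/Mistral-7B-Instruct-v0.3")
print([f for f in files if f.endswith((".bin", ".safetensors"))])
```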

This is because the TensorFlow port cannot load safetensors and needs the .bin files.

You can use this model with TensorFlow like so:

```python
from transformers import TFAutoModelForCausalLM, AutoTokenizer

# from_pt=True loads the PyTorch .bin weights and converts them to TensorFlow on the fly
model_tf = TFAutoModelForCausalLM.from_pretrained("ariG23498/Mistral-7B-Instruct-v0.3", from_pt=True)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3")

prompt = "My favourite condiment is"
model_inputs = tokenizer([prompt], return_tensors="tf")
generated_ids = model_tf.generate(**model_inputs, max_new_tokens=100, do_sample=True)
print(tokenizer.batch_decode(generated_ids)[0])
```
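
Converting from the PyTorch checkpoint on every load can be slow, so you may want to save the TensorFlow weights locally once and reload them from disk afterwards. A minimal sketch, continuing from the snippet above (the directory name is just an example):

```python
from transformers import TFAutoModelForCausalLM

# Save the converted TensorFlow weights once (model_tf comes from the snippet above)...
model_tf.save_pretrained("mistral-7b-instruct-v0.3-tf")

# ...and reload them directly next time, skipping the PyTorch -> TensorFlow conversion.
model_tf = TFAutoModelForCausalLM.from_pretrained("mistral-7b-instruct-v0.3-tf")
```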

As soon as the safetensors and TensorFlow issue is sorted out, you can ditch this repository and use the official one!