
Model Card for Nexa Temp Mapping

Model Description

This model, Nexa Temp Mapping, is fine-tuned from Mistral-7B-Instruct-v0.2 for the specialized task of creating test cases for temperature mapping of storage areas. It was trained with PEFT (Parameter-Efficient Fine-Tuning) techniques to adapt the base model to this domain while keeping training cost low.

Training Data

The dataset used to train the model:

  • Source: [Specify the source of the training data]
  • Size: 50 Datapoints
  • Details: Brief description of the dataset characteristics.

Intended Use

This model is intended for creating test cases to qualify equipment such as fridges, freezers, autoclaves, and ovens. It is designed to improve on the base model by incorporating domain knowledge from Supplement 8: Temperature mapping of storage areas, a technical supplement to WHO Technical Report Series, No. 961, 2011, Annex 9: Model guidance for the storage and transport of time- and temperature-sensitive pharmaceutical products.

How to Use

from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("PeterGordon/nexa-temp-mapping")
model = AutoModelForCausalLM.from_pretrained("PeterGordon/nexa-temp-mapping")

# Tokenize the input, generate a completion, and decode it back to text
text = "Your input text here"
encoded_input = tokenizer(text, return_tensors='pt')
output = model.generate(**encoded_input)
print(tokenizer.decode(output[0], skip_special_tokens=True))
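Mistral-7B-Instruct-v0.2 expects prompts wrapped in its [INST] ... [/INST] instruction template, so inputs to this fine-tune will generally follow the same convention. A minimal helper for building such a prompt might look like the sketch below; the sample request is hypothetical and only illustrates the kind of input this model targets.

```python
# Minimal sketch of the Mistral-Instruct prompt format. The sample
# request below is hypothetical, shown only to illustrate usage.
def build_prompt(request: str) -> str:
    """Wrap a user request in the Mistral-Instruct [INST] template."""
    return f"<s>[INST] {request} [/INST]"

prompt = build_prompt(
    "Write a temperature mapping test case for a 2-8 °C pharmaceutical fridge."
)
print(prompt)
```

The resulting string can be passed to the tokenizer in place of the plain `text` input shown above.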

License

  • apache-2.0
Model Details

  • Format: Safetensors
  • Model size: 3.86B params
  • Tensor types: F32, U8