
Training procedure

The following bitsandbytes quantization config was used during training (a code sketch reproducing it follows the list):

  • quant_method: bitsandbytes
  • load_in_8bit: False
  • load_in_4bit: True
  • llm_int8_threshold: 6.0
  • llm_int8_skip_modules: None
  • llm_int8_enable_fp32_cpu_offload: False
  • llm_int8_has_fp16_weight: False
  • bnb_4bit_quant_type: nf4
  • bnb_4bit_use_double_quant: True
  • bnb_4bit_compute_dtype: bfloat16
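
A minimal sketch of how these settings map onto the transformers `BitsAndBytesConfig` API; the `llm_int8_*` entries above are the library defaults, so they are omitted here:

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with nested (double) quantization,
# computing in bfloat16, matching the config listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```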

Model Description

This is an SFT (Supervised Fine-Tuned) model intended for SQL-based text-generation tasks. It was fine-tuned with the LoRA (Low-Rank Adaptation) method.
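For context, LoRA fine-tuning is typically configured through peft's `LoraConfig`. The card does not list the LoRA hyperparameters actually used, so the values below are illustrative assumptions only:

```python
from peft import LoraConfig

# Illustrative LoRA setup; every numeric value and target module
# here is an assumption, not taken from this model card.
lora_config = LoraConfig(
    r=16,                                  # rank of the low-rank update (assumed)
    lora_alpha=32,                         # scaling factor (assumed)
    lora_dropout=0.05,                     # dropout on adapter layers (assumed)
    target_modules=["q_proj", "v_proj"],   # attention projections (assumed)
    task_type="CAUSAL_LM",
)
```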

Model Summary

  • train/loss: 0.4354
  • train/learning_rate: 0.00017567567567567568
  • train/epoch: 5.0
  • train/global_step: 10

Inference Code

After performing the necessary imports, set the model identifiers and device map:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

device_map = {"": 0}  # place the whole model on GPU 0
model_id = "mistralai/Mistral-7B-v0.1"
new_model = "Akil15/mistral_SQL_v.0.1"
```

Reload the base model in FP16 and merge it with the LoRA weights:

```python
base_model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map=device_map,
)

model = PeftModel.from_pretrained(base_model, new_model)
model = model.merge_and_unload()  # fold the adapter into the base weights
```

Reload the tokenizer so it can be saved alongside the merged model:

```python
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"
```
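
The card stops short of the save step itself; a minimal sketch, assuming a local output directory (the `merged_model` path is hypothetical):

```python
# Hypothetical output directory; choose any local path.
output_dir = "merged_model"
model.save_pretrained(output_dir)
tokenizer.save_pretrained(output_dir)
```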

Sample text (example):

```python
text = """Question: How many vegetable farms with over 100 acres of cultivated land utilize organic farming methods, and what is the average yield per acre for these farms? Context: CREATE TABLE vegetable_farm (Acres INTEGER, Organic BOOLEAN, Yield_Per_Acre DECIMAL);"""
```

```python
# To prompt interactively instead of using the sample above, uncomment:
# text = input()
inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note: adjust max_new_tokens to match the length of the question-context input, or simply set it to 100.

Framework versions

  • PEFT 0.4.0