
Training procedure

The following bitsandbytes quantization config was used during training (an equivalent BitsAndBytesConfig construction is sketched after the list):

  • load_in_8bit: False
  • load_in_4bit: True
  • llm_int8_threshold: 6.0
  • llm_int8_skip_modules: None
  • llm_int8_enable_fp32_cpu_offload: False
  • llm_int8_has_fp16_weight: False
  • bnb_4bit_quant_type: nf4
  • bnb_4bit_use_double_quant: True
  • bnb_4bit_compute_dtype: bfloat16
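For reference, the list above corresponds roughly to the following BitsAndBytesConfig. This snippet is a reconstruction from the listed values, not code shipped with the adapter; the int8-specific settings are simply left at their defaults:

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and bfloat16 compute,
# matching the values listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```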

Framework versions

  • PEFT 0.4.0

# Prompt template used to format each training example; the backslash line
# continuations keep the string identical to the original single-line prompt.
prompt = f""" You are going to determine whether the description includes the business model. \
Don't use any prior knowledge, only base your answer off of what's given. \
It might not be explicitly stated but if it says "they sell in retailers" or "they sell to customers", \
it can be reasonably assumed that a B2C model is stated. \
If it says they "create software solutions" or "support companies", it is safe to assume they are B2B. \
If it says they are "the top defense contractor" or that they "create intelligence software for the FBI", \
it is reasonable to say they are B2G. \
However, if the information is very sparse or you are unsure, "No business model" is also a category to classify into. \
You should only classify into B2C, B2B, B2G, No business model. \
The response should be in sentence form with the class and reasoning ->: : [{data_point["Description"]}] : {data_point["Answer"]} """
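The template references data_point, which suggests each training example is formatted individually. A minimal sketch of how the template might be mapped over a Hugging Face dataset is shown below; the generate_prompt wrapper, the example row, and the "text" column name are illustrative assumptions rather than code from this card, and the template body is abridged:

```python
from datasets import Dataset

# Hypothetical example row; the "Description" and "Answer" column names are
# taken from the template above and assumed to match the training data.
rows = [
    {
        "Description": "They sell eco-friendly shoes to customers through large retailers.",
        "Answer": "B2C, because the description states that they sell to customers through retailers.",
    },
]

def generate_prompt(data_point):
    # Abridged version of the template above, wrapped so it can be applied per row.
    return (
        "You are going to determine whether the description includes the business model. "
        "You should only classify into B2C, B2B, B2G, No business model. "
        "The response should be in sentence form with the class and reasoning ->: "
        f": [{data_point['Description']}] : {data_point['Answer']}"
    )

dataset = Dataset.from_list(rows).map(lambda row: {"text": generate_prompt(row)})
print(dataset[0]["text"])
```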

from peft import LoraConfig

# LoRA adapter configuration used for fine-tuning.
config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.1,
    bias="none",
    task_type="CAUSAL_LM",
)
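Putting the two configurations together, a QLoRA-style setup would typically load the 4-bit base model and attach the adapters roughly as follows. This is a sketch only: the base model name is a placeholder (the card does not state it), config refers to the LoraConfig defined above, and the training loop is omitted:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import get_peft_model, prepare_model_for_kbit_training

base_model_name = "base-model-name"  # placeholder: the card does not name the base model

# Quantization settings from the "Training procedure" section above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    quantization_config=bnb_config,
    device_map="auto",
)

model = prepare_model_for_kbit_training(model)  # cast norms, enable input gradients
model = get_peft_model(model, config)           # attach the LoRA adapters configured above
model.print_trainable_parameters()
```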
