# CodeLlama 7B 🦙 fine-tuned on a toolformer-style dataset with QLoRA
- Gorilla-16k dataset: https://huggingface.co/datasets/yashgoenka/gorilla-16k
- Fine-tuning repo (Axolotl): https://github.com/OpenAccess-AI-Collective/axolotl
- Library: `peft`
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
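
For reference, the same settings can be expressed programmatically. The sketch below reconstructs this configuration with the `transformers` `BitsAndBytesConfig` API; it simply mirrors the list above and is not copied from the original training script.

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with double quantization and bfloat16 compute,
# matching the settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```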
### Framework versions
- PEFT 0.6.0.dev0
## Inference Example
System:
Below is an instruction that describes a task. Write a response that appropriately completes the request.
Instruction:
Present an API to determine what animal species a picture from a database is in
Response:
<<<domain>>>: Image classification, <<<api_call>>>: model = hub.load('https://tfhub.dev/google/imagenet/inception_v3/classification/5'), <<<api_provider>>>: TensorFlow Hub, <<<explanation>>>: This API call uses TensorFlow's pre-trained Inception V3 image classification model from TensorFlow Hub, which can be used to classify images into multiple categories, including animal species and objects, among other things. By loading this model, you can analyze a given picture of an animal from a database and get the predictions for its class label. <<<code>>>:

```python
import tensorflow as tf
import tensorflow_hub as hub

model_id = 'https://tfhub.dev/google/imagenet/inception_v3/classification/5'
model = hub.load(model_id)
```
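
A minimal inference sketch along these lines is shown below. It assumes the base model is `codellama/CodeLlama-7b-hf` and that the QLoRA adapter from this repo is attached with PEFT; the adapter path is a placeholder, and the prompt simply follows the System/Instruction/Response layout shown above rather than a template confirmed from the training config.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_model_id = "codellama/CodeLlama-7b-hf"  # assumed base model
adapter_id = "path/to/this-qlora-adapter"    # placeholder: replace with this repo's id

# Load the base model in 4-bit using the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(
    base_model_id, quantization_config=bnb_config, device_map="auto"
)
# Attach the fine-tuned LoRA adapter.
model = PeftModel.from_pretrained(model, adapter_id)

prompt = (
    "System:\n"
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "Instruction:\n"
    "Present an API to determine what animal species a picture from a database is in\n"
    "Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```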