learning_rate: 7e-6
epoch: 3
context_length: 4096
RAG setting:
search method: bm25
not answering proportion: 0.0
max document length: 5
extra bm25 data: false
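The retrieval settings above use BM25 as the search method. A minimal sketch of Okapi BM25 scoring is shown below, assuming whitespace-pre-tokenized documents; the parameter defaults (`k1`, `b`) and function name are illustrative, not taken from this card's actual retrieval pipeline:

```python
import math
from collections import Counter

def bm25_scores(query_tokens, docs_tokens, k1=1.5, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs_tokens)
    avgdl = sum(len(d) for d in docs_tokens) / N
    # Document frequency: number of documents containing each term.
    df = Counter()
    for d in docs_tokens:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs_tokens:
        tf = Counter(d)
        score = 0.0
        for t in query_tokens:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            # Term-frequency saturation with length normalization.
            score += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl)
            )
        scores.append(score)
    return scores
```

The top-scoring documents (up to the "max document length" above) would then be placed inside the `<Document>` block of the prompt.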
prompt:
[
{"role": "system", "content": "Find the answer in <Document> and use that answer to respond to the question described in <Query>"},
{"role": "system", "content": "<Document>{Q}{A}</Document>"},
{"role": "user", "content": "<Query>{Q}</Query>"},
{"role": "assistant", "content": "{A}"},
]
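The template above can be assembled with a small helper; the function and argument names here are hypothetical, and `{Q}`/`{A}` in the template are taken to stand for the retrieved document's question/answer pair and the target query/answer:

```python
def build_training_messages(instruction, doc_q, doc_a, query, answer):
    """Assemble the chat messages matching the prompt template above."""
    return [
        # Task instruction for the model.
        {"role": "system", "content": instruction},
        # Retrieved document injected as a second system message.
        {"role": "system", "content": f"<Document>{doc_q}{doc_a}</Document>"},
        # User query wrapped in <Query> tags.
        {"role": "user", "content": f"<Query>{query}</Query>"},
        # Target answer supervised during fine-tuning.
        {"role": "assistant", "content": answer},
    ]
```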
- PEFT 0.13.2
Model tree for cool9203/Llama3.2-11B-Vision-Instruct-iii_finance-Peft
Base model
meta-llama/Llama-3.2-11B-Vision-Instruct