---
license: apache-2.0
datasets:
- vicgalle/alpaca-gpt4
language:
- en
library_name: transformers
pipeline_tag: conversational
---
# Freedom-AI-Collective/llama-13b-alpaca-wizard-vicuna
## Trained
- `vicgalle/alpaca-gpt4`: 1 epoch at learning rate 3e-5 ([W&B logs](https://wandb.ai/wing-lian/wizard-vicuna-gpt4/overview)), trained with:
- `deepspeed scripts/finetune.py configs/axolotl/wizard-vicuna-13b-step1.yml --deepspeed configs/ds_config.json --num_epochs 2 --warmup_steps 46 --logging_steps 1 --save_steps 23`
- `wizardlm` TBD
- `vicuna` TBD
<pre>Brought to you by the Freedom AI Collective</pre>
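Since the card declares `library_name: transformers`, the checkpoint can presumably be loaded with the standard `transformers` API. A minimal usage sketch (the model id is taken from the heading above; prompt format is an assumption, as the card does not document one):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id from the card heading; adjust if loading a local copy.
model_id = "Freedom-AI-Collective/llama-13b-alpaca-wizard-vicuna"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# bfloat16 matches the dtype the checkpoint was saved in, per the card.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Plain instruction prompt; the card does not specify a chat template.
prompt = "Explain what instruction tuning is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This assumes enough GPU memory for a 13B-parameter model in bfloat16 (roughly 26 GB of weights); for smaller setups, 8-bit or 4-bit loading via `bitsandbytes` is a common alternative.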