Baptiste Jamin (baptistejamin)

AI & ML interests: None yet
baptistejamin's activity
- "Fine-tuning toolkit for Mixtral 8x7B MoE model" (18) · #10 opened 5 months ago by hiyouga
- "Had to hack it to use fp16 to work on my 24GB 4090 with OOM" (7) · #4 opened 5 months ago by aifartist
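
The second discussion title touches on a real sizing question: why fp16 alone is not enough to fit Mixtral 8x7B on a 24 GB card. As a rough sketch (the ~46.7B total parameter count for Mixtral 8x7B is an assumption from Mistral's public figures, not from this page), the weight memory can be estimated from bytes per parameter:

```python
def model_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate memory for model weights alone, in GiB.

    Ignores activations, optimizer state, and KV cache, so real
    usage during fine-tuning is substantially higher.
    """
    return n_params * bytes_per_param / 1024**3


# Assumed figure: Mixtral 8x7B has roughly 46.7B total parameters.
mixtral_params = 46.7e9

fp32_gb = model_memory_gb(mixtral_params, 4)  # 4 bytes/param in fp32
fp16_gb = model_memory_gb(mixtral_params, 2)  # 2 bytes/param in fp16

print(f"fp32 weights: ~{fp32_gb:.0f} GiB")
print(f"fp16 weights: ~{fp16_gb:.0f} GiB")
```

Even in fp16 the weights alone are far beyond 24 GB, which is presumably why the thread involves further workarounds (such as quantization or offloading) rather than fp16 by itself.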