Faiz Ahmed
faizsameerahmed96
AI & ML interests
None yet
Organizations
None yet
faizsameerahmed96's activity
FP16 vs FP32 (2)
#127, opened about 2 months ago by Taylor658
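A minimal sketch of the precision gap behind the FP16 vs FP32 thread: fp16 keeps a 10-bit mantissa (roughly 3 decimal digits) versus fp32's 23 bits (roughly 7), so small weight updates can vanish entirely in fp16. The numbers below are illustrative, not taken from the thread.

```python
import numpy as np

# fp16 spacing near 1.0 is 2**-10 (~0.000977), so a 1e-4 perturbation
# rounds away entirely in fp16 while fp32 still resolves it -- one
# reason mixed-precision training keeps an fp32 master copy of weights.
x = 1.0 + 1e-4                                   # a small perturbation of 1.0
fp32_sees_it = np.float32(x) != np.float32(1.0)  # True: fp32 resolves it
fp16_sees_it = np.float16(x) != np.float16(1.0)  # False: rounds back to 1.0
print(fp32_sees_it, fp16_sees_it)
```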
RuntimeError: FlashAttention only support fp16 and bf16 data type during fine tuning. (7)
#11, opened 2 months ago by faizsameerahmed96
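The RuntimeError in that thread means the attention inputs were fp32: FlashAttention kernels only accept fp16 or bf16 tensors. A common fix, sketched below with plain PyTorch (the tensor shape is arbitrary, chosen for illustration), is to cast to half precision before fine-tuning; with transformers this is typically done by passing `torch_dtype=torch.bfloat16` to `from_pretrained`.

```python
import torch

# fp32 inputs trigger "FlashAttention only support fp16 and bf16 data type";
# casting to bf16 (which keeps fp32's exponent range, so it is generally
# the safer half-precision choice for training) resolves it.
q = torch.randn(1, 8, 16, 64)   # defaults to fp32 -> would raise the error
q = q.to(torch.bfloat16)        # FlashAttention-compatible dtype
print(q.dtype)
```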
What pad token should I use for fine tuning? (1)
#10, opened 2 months ago by faizsameerahmed96
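A common answer to the pad-token question for decoder-only models whose tokenizers ship without one is to reuse the EOS token as the pad token, i.e. `tokenizer.pad_token = tokenizer.eos_token`. The sketch below uses a `SimpleNamespace` stand-in for a real Hugging Face tokenizer so it runs without downloading anything.

```python
from types import SimpleNamespace

# Stand-in for a Hugging Face tokenizer that has no pad token
# (assumed attributes: pad_token, eos_token).
tok = SimpleNamespace(pad_token=None, eos_token="</s>")

# Reuse EOS as PAD; padded positions are zeroed out by the attention
# mask, so this does not affect training.
if tok.pad_token is None:
    tok.pad_token = tok.eos_token
print(tok.pad_token)
```

When fine-tuning with this setup, it is also conventional to set the labels at padded positions to -100 so the cross-entropy loss ignores them.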