How to fine-tune w2v-bert-2.0 with multiple GPUs?

#25
by kssmmm - opened

I implemented fine-tuning on a single GPU by following the steps in the blog. As expected, training this way is very slow. However, when I modified CUDA_VISIBLE_DEVICES to expose two GPUs, the following problem occurred:

[screenshot: training log showing two warnings followed by the final error]

The first warning doesn't seem to have any impact; it also appeared when I was training on a single GPU. The second warning appears only during multi-GPU training and seems to be the cause of the final failure. Do you have any solutions?
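One common cause of this kind of failure is that merely making two GPUs visible via CUDA_VISIBLE_DEVICES makes the Hugging Face `Trainer` fall back to `nn.DataParallel` instead of proper distributed training. A sketch of the usual workaround (not a confirmed fix for this exact error; `train_w2v_bert.py` is a placeholder for whatever script you ran from the blog) is to launch the same script with `torchrun`, so each GPU gets its own process and the `Trainer` uses DistributedDataParallel:

```shell
# Sketch: distributed launch of the fine-tuning script on 2 GPUs.
# "train_w2v_bert.py" is a placeholder name for your actual training script.
export CUDA_VISIBLE_DEVICES=0,1            # expose both GPUs
torchrun --nproc_per_node=2 train_w2v_bert.py
```

Equivalently, `accelerate launch --num_processes 2 train_w2v_bert.py` launches the same script through Hugging Face Accelerate (after running `accelerate config` once).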
