Question regarding the position of the masked token

#18
by raghavvarmani2000 - opened

For the Multimodal AI internship task, the tokenizer adds a [CLS] token. Do we mask the 6th token counting with the [CLS] token included, even though the [CLS] token shifts the mask position back by one relative to the original sentence? Or do we remove the [CLS] token first and then mask the token?
Thank you.

I have the same question

HF Internships org

Both answers will be counted as correct (with or without [CLS]).
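
To make the two conventions concrete, here is a minimal sketch assuming a BERT-style tokenizer; "bert-base-uncased" and the example sentence are illustrative, not taken from the task itself:

```python
# Minimal sketch, assuming a BERT-style tokenizer; the model name and
# sentence below are illustrative assumptions, not from the task.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
sentence = "the quick brown fox jumps over the lazy dog"

# With special tokens: [CLS] is prepended, so every original token
# shifts forward by one position.
ids = tokenizer(sentence)["input_ids"]
print(tokenizer.convert_ids_to_tokens(ids))
# ['[CLS]', 'the', 'quick', 'brown', 'fox', 'jumps', 'over', ...]

mask_id = tokenizer.mask_token_id

# Convention A: keep [CLS] and count it, so the 6th sentence token
# ("over") sits at index 6 of the input ids.
with_cls = list(ids)
with_cls[6] = mask_id

# Convention B: tokenize without special tokens, so the 6th sentence
# token sits at index 5.
without_cls = tokenizer(sentence, add_special_tokens=False)["input_ids"]
without_cls[5] = mask_id
```

Since both are accepted here, the main thing is to stay consistent about whether positions are counted with or without the special tokens.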
