
byt5-small-nc16-deen

This model is released as part of the paper Are Character-level Translations Worth the Wait? Comparing Character- and Subword-level Models for Machine Translation. It is a ByT5 model fine-tuned on German-to-English translation using 250k sentence pairs from the WMT NewsCommentary v16 dataset.

To use the model correctly, you must prepend the input with the prompt "translate X to Y: ", where X and Y are the source and target languages (e.g. German, English).

NOTE: The decoder_start_token_id is 259 for ByT5 models and 250099 for mT5 models, which differs from the default (0) used by Google's original byt5 and mt5 checkpoints.
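
Below is a minimal usage sketch with the Hugging Face transformers library, illustrating both the translation prompt prefix and the decoder_start_token_id override described above. The repository id shown is an assumption based on this card's title and namespace; substitute the checkpoint you are actually loading.

```python
# Minimal sketch, assuming the repo id below and the standard transformers API.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "leukas/byt5-small-nc16-deen"  # assumed repo id; replace with the actual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Prepend the translation prompt to the source sentence.
text = "translate German to English: Das Haus ist wunderbar."
inputs = tokenizer(text, return_tensors="pt")

# Override the decoder start token: 259 for ByT5 (250099 for the mT5 variants).
outputs = model.generate(
    **inputs,
    decoder_start_token_id=259,
    max_new_tokens=128,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```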

Model size: 1.23B params · Tensor type: F32 · Format: Safetensors
