
The aim is to compress the mT5-base model so that it retains only Ukrainian and some basic English.

This reproduces a similar result (but for a different language) from this Medium article.

Results:

  • 582M params -> 244M params (a 58% reduction)
  • 250K vocabulary tokens -> 30K tokens
  • 2.2GB model size -> 0.95GB
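
The bulk of the savings comes from trimming the vocabulary and the corresponding embedding rows: most of an mT5-base checkpoint is its 250K-token embedding matrix, so keeping only tokens that actually occur in a Ukrainian/English corpus shrinks the model dramatically. A minimal sketch of that row-trimming step, using a toy NumPy matrix in place of the real mT5 embeddings (the function name and sizes here are illustrative, not from the actual conversion script):

```python
import numpy as np

def trim_embeddings(embeddings, kept_token_ids):
    """Keep only the embedding rows for retained tokens.

    Returns the smaller matrix and an old-id -> new-id mapping,
    which is also how the tokenizer's vocabulary must be remapped.
    """
    kept = sorted(kept_token_ids)
    id_map = {old: new for new, old in enumerate(kept)}
    return embeddings[kept], id_map

# Toy stand-in: a "vocabulary" of 10 tokens with 4-dim embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(10, 4))

# Suppose only tokens 0, 3, and 7 appear in the Ukrainian/English corpus.
small_emb, id_map = trim_embeddings(emb, {0, 3, 7})
print(small_emb.shape)  # (3, 4)
```

In the real pipeline the kept-token set is computed by tokenizing a corpus with the original SentencePiece model and counting which ids occur; the trimmed matrix then replaces both the encoder/decoder shared embeddings and the output head.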

Model tree for kravchenko/uk-mt5-base
