This is a vocabulary-trimmed variant of the google/mt5-base model in which only the Ukrainian tokens and about 9% of the English vocabulary are retained. The resulting model has 252M parameters, 43% of the original size. Special thanks to cointegrated for the practical example and the inspiration.
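
Since the size reduction comes from the smaller vocabulary and embedding matrix, it is easy to check by loading the model and counting its parameters. Below is a minimal sketch using the Transformers library; `MODEL_ID` is a hypothetical placeholder and should be replaced with this repository's Hub id.

```python
# Minimal sketch: load the trimmed model and report its vocabulary and parameter count.
# MODEL_ID is a hypothetical placeholder - replace it with this repository's Hub id.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_ID = "uaritm/<this-model>"  # hypothetical placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

# Count parameters to confirm the reduction relative to google/mt5-base.
n_params = sum(p.numel() for p in model.parameters())
print(f"vocabulary size: {len(tokenizer)}")
print(f"parameters: {n_params / 1e6:.0f}M")  # expected to report roughly 252M
```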

Citing & Authors

@misc{Uaritm,
      title={SetFit: Classification of medical texts}, 
      author={Vitaliy Ostashko},
      year={2022},
      url={https://esemi.org}
}