Apply for community grant: Personal project

#1
by s3nh - opened

For the last few months I have mainly been involved in fine-tuning self-instruct models for Polish, using translated Alpaca and Dolly datasets combined with other datasets I found and properly prepared. I focus on architectures of different sizes to show that smaller architectures can add a lot of value. Not every model has to be fine-tuned using LoRA, especially if you are focusing on gathering information straight from the context.
The main purpose of this project is to make self-instruct models universally testable: to request models of different sizes and compare them using your Spaces. I built my models for Polish, but much of the approach is universal and can be used to train smaller architectures to perform really well on other languages (my code is fairly general-purpose).
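Since LoRA is mentioned above as one option among several, here is a minimal sketch of the idea behind it (this is an illustration only, not the author's training code; all dimensions, the rank `r`, and the scaling `alpha` are assumed values): instead of updating a full weight matrix, LoRA trains a small low-rank update on top of the frozen weights.

```python
import numpy as np

# Hypothetical sketch of the LoRA idea: keep the pretrained weight W frozen
# and train only a low-rank update B @ A, scaled by alpha / r.
d_out, d_in, r = 1024, 1024, 8          # assumed dimensions and rank
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))  # frozen pretrained weight
A = rng.standard_normal((r, d_in))      # trainable, shape (r, d_in)
B = np.zeros((d_out, r))                # trainable, initialized to zero
alpha = 16                              # assumed LoRA scaling factor

def adapted_forward(x):
    # Effective weight is W + (alpha / r) * B @ A; because B starts at zero,
    # the adapted model initially matches the base model exactly.
    return W @ x + (alpha / r) * (B @ (A @ x))

full_params = W.size                    # parameters touched by full fine-tuning
lora_params = A.size + B.size           # parameters trained under LoRA
print(full_params, lora_params)         # 1048576 16384
```

With these assumed shapes, LoRA trains roughly 1.5% of the parameters that full fine-tuning would update, which is why it is attractive for larger architectures but, as noted above, not always necessary for smaller ones.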

https://huggingface.co/s3nh
https://huggingface.co/Lajonbot
