Apply for community grant: Academic project

#1
by ptdataScience - opened
Copenhagen Business School org

We're two students at Copenhagen Business School who are currently writing our master's thesis on GPT models for low-resource languages. We've fine-tuned a GPT model on a translated version of Stanford's Alpaca dataset. As there is a lack of literature on evaluating such fine-tuned models, we're seeking feedback from end users and would therefore appreciate the community grant. We hope our findings will help bridge this research gap.


I'll be happy to provide more information if necessary. Please find our research questions below:

RQ1: To what extent can a GPT-J-6B model, trained on a corpus of Norwegian language data, effectively compete with larger models in performing downstream tasks, given the inherent low-resource characteristics of the Norwegian language?

RQ2: To what extent can the NB-GPT-J-6B model be effectively fine-tuned into an instruction-following language model given the inherent low-resource characteristics of the Norwegian language and constrained resources?
