Description

This is a Polish GPT-2 model with the medium architecture.

This model was released on 30.11.2023.
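The model can be loaded with the Hugging Face transformers library. A minimal sketch of text generation; the Polish prompt is only an illustration:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("radlab/polish-gpt2-medium-v2")
model = AutoModelForCausalLM.from_pretrained("radlab/polish-gpt2-medium-v2")

# Greedy-decode a short continuation of an illustrative Polish prompt.
inputs = tokenizer("Stolicą Polski jest", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```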

Datasets

Data used to train this model:

  • clarin-knext/msmarco-pl
  • clarin-knext/nq-pl
  • clarin-knext/hotpotqa-pl
  • clarin-knext/scidocs-pl
  • clarin-knext/nfcorpus-pl
  • clarin-knext/dbpedia-pl
  • clarin-knext/trec-covid-pl
  • clarin-knext/quora-pl
  • clarin-knext/arguana-pl
  • clarin-knext/fiqa-pl
  • radlab/wikipedia-pl
  • radlab/legal-mc4-pl
  • own corpora not published yet

In total this is about 30.5 GB of data, three times more than for the previous version.

Metrics from W&B


Changelog

  • 2023.11.30 - published the first version of the model
Model details

  • Model size: 357M parameters
  • Tensor type: F32 (Safetensors)
