Citation

If you use this model, please cite the following paper:

@inproceedings{yang-language-models,
    title = {Training language models with low resources: RoBERTa, BART and ELECTRA experimental models for Hungarian},
    booktitle = {Proceedings of the 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Yang, Zijian Győző and Váradi, Tamás},
    pages = {279--285}
}