---
license: mit
language:
- en
tags:
- education
- learning analytics
- educational data mining
---
# Model Card for EduBERT
This is the EduBERT model presented in the LAK20 paper [EduBERT: Pretrained Deep Language Models for Learning Analytics](https://arxiv.org/abs/1912.00690). It is a version of DistilBERT fine-tuned on educational data.
## Model Description
We originally trained this model to support Learning Analytics tasks, showing that it performs well on well-known educational text classification tasks.
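As a minimal sketch of how a fine-tuned classifier like this could be used with the `transformers` library (the model identifier `"path/to/edubert"` is a placeholder, not the real Hub id, and the label set depends on the task the checkpoint was fine-tuned for):

```python
def classify(texts, model_name="path/to/edubert"):
    """Score a batch of educational texts with a fine-tuned
    sequence-classification checkpoint.

    Imports are kept inside the function so the sketch can be read
    (and the function defined) without the heavy dependencies installed.
    """
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    import torch

    # Load the tokenizer and fine-tuned classification head.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(model_name)

    # Tokenize the batch and run a forward pass without gradients.
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    # Return per-class probabilities for each input text.
    return logits.softmax(dim=-1)
```

The actual model id, labels, and preprocessing should be taken from the paper and the hosting repository.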
## Bias, Risks, and Limitations
The model is provided as-is and was trained on the data described in the paper. Learning Analytics is a complex field, and decisions should not be made fully automatically by models; this model should be used only for analysis and to inform human judgment.
## Citation
**BibTeX:**
```
@inproceedings{clavié2019edubert,
  title={EduBERT: Pretrained Deep Language Models for Learning Analytics},
  author={Benjamin Clavié and Kobi Gal},
  year={2020},
  booktitle={Companion Proceedings of the 10th International Conference on Learning Analytics & Knowledge (LAK20)}
}
```