
# Software Benchmark SciBERT model

This model is a fine-tuned version of SciBERT on a dataset built from the SoMeSci and Softcite corpora.

The objective of this model is to extract software mentions from scientific texts in the biomedical domain.

The training code can be found on GitHub.

## Corpus

The corpus has been built by combining two existing corpora of software mentions:

  • SoMeSci [1]. We used the version of the corpus uploaded to GitHub, specifically the sentence-level corpus.
  • Softcite [2]. This project published another software-mention corpus, also available on GitHub. Note that we only use the annotations from the biomedical domain.

To build this corpus, we removed the annotations of other entity types, such as version and URL, as well as those describing the relation of the entity to the text.

To reconcile both corpora, we mapped the labels of one onto the other. We also made some annotation decisions; for example, in the case of Microsoft Excel, we annotate only Excel as the software mention, not the whole product name.
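The reconciliation step can be sketched as follows. This is an illustrative assumption, not the project's actual code: the source label names in `LABEL_MAP` are hypothetical, and the real mapping used by the project may differ.

```python
# Hypothetical sketch of label reconciliation into a single BIO tag set
# with one entity type, "software". The source label names are assumptions.
LABEL_MAP = {
    "Application": "software",  # assumed SoMeSci-style label
    "software": "software",     # assumed Softcite-style label
}

def to_bio(tokens, spans):
    """Convert token-level spans to BIO tags.

    spans: list of (start_token, end_token, source_label), end exclusive.
    Entity types absent from LABEL_MAP (e.g. version, URL) are dropped.
    """
    tags = ["O"] * len(tokens)
    for start, end, label in spans:
        mapped = LABEL_MAP.get(label)
        if mapped is None:
            continue  # removed entity types are ignored
        tags[start] = f"B-{mapped}"
        for i in range(start + 1, end):
            tags[i] = f"I-{mapped}"
    return tags

tokens = ["We", "used", "Microsoft", "Excel", "for", "the", "analysis"]
# Per the annotation decision above, only "Excel" is tagged as software.
print(to_bio(tokens, [(3, 4, "Application")]))
```

Mapping into a shared BIO scheme like this lets sentences from both corpora be mixed freely in a single training set.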

## Training

The corpus has been split in a 70/30 proportion for training and testing.
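A minimal sketch of such a split, assuming a simple shuffled holdout (the seed and helper name are illustrative, not the project's actual procedure):

```python
import random

def split_corpus(sentences, train_frac=0.7, seed=42):
    """Shuffle and split sentences into train/test (70/30 by default)."""
    rng = random.Random(seed)  # fixed seed for reproducibility (assumption)
    shuffled = sentences[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

train, test = split_corpus([f"sentence {i}" for i in range(100)])
print(len(train), len(test))  # 70 30
```

Splitting at the sentence level keeps each annotated sentence entirely in one partition, avoiding leakage between training and testing.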


The results are:

  • Precision: 0.823
  • Recall: 0.814
  • F1-score: 0.819
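For reference, F1 is the harmonic mean of precision and recall, so the reported scores can be sanity-checked against each other (small discrepancies are expected from rounding the published three-decimal values):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported precision and recall from the table above.
p, r = 0.823, 0.814
print(round(f1_score(p, r), 3))
```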

## Acknowledgements

This work has been possible thanks to the effort of other projects.

## Authors

  • Esteban González Guardia
  • Daniel Garijo Verdejo

## Contributors

Ontology Engineering Group, Universidad Politécnica de Madrid

## References

  1. Schindler, D., Bensmann, F., Dietze, S., & Krüger, F. (2021, October). SoMeSci: A 5 star open data gold standard knowledge graph of software mentions in scientific articles. In Proceedings of the 30th ACM International Conference on Information & Knowledge Management (pp. 4574-4583).
  2. Du, C., Cohoon, J., Lopez, P., & Howison, J. (2021). Softcite dataset: A dataset of software mentions in biomedical and economic research publications. Journal of the Association for Information Science and Technology, 72(7), 870-884.