---
tags:
- spacy
- token-classification
language:
- da
license: apache-2.0
model-index:
- name: da_dacy_small_DANSK_ner
results:
- task:
name: NER
type: token-classification
metrics:
- name: NER Precision
type: precision
value: 0.7718478986
- name: NER Recall
type: recall
value: 0.7728790915
- name: NER F Score
type: f_score
value: 0.7723631509
---
# DaCy_small_DANSK_ner
DaCy is a Danish language processing framework with state-of-the-art pipelines as well as functionality for analyzing Danish pipelines.
At the time of publishing this model, DaCy also includes the only models for fine-grained Danish NER, trained on the DANSK dataset - a dataset containing 18 annotation types in the same format as OntoNotes.
Moreover, DaCy's largest pipeline has achieved state-of-the-art performance on named entity recognition, part-of-speech tagging, and dependency parsing for Danish on the DaNE dataset.
Check out the [DaCy repository](https://github.com/centre-for-humanities-computing/DaCy) for material on how to use DaCy and reproduce the results.
DaCy also contains guides on usage of the package as well as behavioural tests for biases and robustness of Danish NLP pipelines.
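Once installed, the pipeline can be applied like any other spaCy model. A minimal sketch (assuming the model package from this repository has been installed so that it is loadable by name; the example sentence and the expected labels are illustrative):

```python
import spacy

# Load the installed pipeline package by name
nlp = spacy.load("da_dacy_small_DANSK_ner")

# "Mette Frederiksen visited Aarhus University in March 2023."
doc = nlp("Mette Frederiksen besøgte Aarhus Universitet i marts 2023.")

for ent in doc.ents:
    print(ent.text, ent.label_)
# Expected entity types here include PERSON, ORGANIZATION, and DATE
```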
| Feature | Description |
| --- | --- |
| **Name** | `da_dacy_small_DANSK_ner` |
| **Version** | `0.1.0` |
| **spaCy** | `>=3.5.0,<3.6.0` |
| **Default Pipeline** | `transformer`, `ner` |
| **Components** | `transformer`, `ner` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | DANSK - Danish Annotations for NLP Specific TasKs<br />[jonfd/electra-small-nordic](https://huggingface.co/jonfd/electra-small-nordic) (Jón Daðason) |
| **License** | `apache-2.0` |
| **Author** | [Centre for Humanities Computing Aarhus](https://chcaa.io/#/) |
### Label Scheme
View label scheme (18 labels for 1 component)
| Component | Labels |
| --- | --- |
| **`ner`** | `CARDINAL`, `DATE`, `EVENT`, `FACILITY`, `GPE`, `LANGUAGE`, `LAW`, `LOCATION`, `MONEY`, `NORP`, `ORDINAL`, `ORGANIZATION`, `PERCENT`, `PERSON`, `PRODUCT`, `QUANTITY`, `TIME`, `WORK OF ART` |
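Each predicted span carries one of these labels via `ent.label_`, so downstream code can filter or aggregate on the fine-grained types. A small sketch (label names taken from the scheme above; the input text is illustrative):

```python
from collections import Counter

import spacy

nlp = spacy.load("da_dacy_small_DANSK_ner")
doc = nlp("H.C. Andersen skrev 'Den lille Havfrue' i København i 1837.")

# Count predicted entities per fine-grained label
print(Counter(ent.label_ for ent in doc.ents))

# Keep only place-like entities
places = [ent for ent in doc.ents if ent.label_ in {"GPE", "LOCATION", "FACILITY"}]
```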
### Accuracy
| Type | Score |
| --- | --- |
| `ENTS_F` | 77.24 |
| `ENTS_P` | 77.18 |
| `ENTS_R` | 77.29 |
| `TRANSFORMER_LOSS` | 80975.57 |
| `NER_LOSS` | 90852.49 |
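The entity scores above can be recomputed on held-out data with spaCy's standard evaluation API. A sketch (assuming a gold-annotated test set serialized as a `DocBin` at the hypothetical path `dansk_test.spacy`):

```python
import spacy
from spacy.tokens import DocBin
from spacy.training import Example

nlp = spacy.load("da_dacy_small_DANSK_ner")

# Load gold-annotated test docs (hypothetical path)
doc_bin = DocBin().from_disk("dansk_test.spacy")

# Pair the model's prediction on each raw text with the gold doc
examples = [Example(nlp(doc.text), doc) for doc in doc_bin.get_docs(nlp.vocab)]

scores = nlp.evaluate(examples)
print(scores["ents_p"], scores["ents_r"], scores["ents_f"])
```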
### Performance tables
The two tables below show the performance of the three DaCy fine-grained models: first the F1-scores per domain, then the overall F1-score, recall, and precision.
| Domain | DaCy large | DaCy medium | DaCy small |
|:--------------------: |:----------: |:-----------: |:----------: |
| All domains combined | 0.823 | 0.806 | 0.776 |
| Conversation | 0.796 | 0.718 | 0.82 |
| Dannet | 0.75 | 0.667 | 1 |
| Legal | 0.852 | 0.854 | 0.866 |
| News | 0.841 | 0.759 | 0.86 |
| Social Media | 0.793 | 0.847 | 0.8 |
| Web | 0.826 | 0.802 | 0.756 |
| Wiki and Books | 0.778 | 0.838 | 0.709 |
| Metric | DaCy large | DaCy medium | DaCy small |
|:--------------------: |:----------: |:-----------: |:----------: |
| F1-score | 0.823 | 0.806 | 0.776 |
| Recall | 0.834 | 0.818 | 0.77 |
| Precision | 0.813 | 0.794 | 0.781 |