SBB

Tags: Token Classification · Transformers · PyTorch · German · bert · sequence-tagger-model · Inference Endpoints
cneud committed
Commit cbe839d · 1 parent: e163b1f

Update README.md
Files changed (1): README.md (+4 -5)

README.md CHANGED
```diff
@@ -11,11 +11,10 @@ license: apache-2.0
 ---
 # About `sbb_ner`
 
-This is a BERT model for named entity recognition in historical German.
-It can predict the classes `PER`, `LOC` and `ORG`.
+This is a BERT model for named entity recognition (NER) in historical German.
+It predicts the classes `PER`, `LOC` and `ORG`. The model is based on the 🤗
+[`BERT base multilingual cased`](https://huggingface.co/bert-base-multilingual-cased) model.
 
-The model is based on the 🤗 [Transformers](https://github.com/huggingface/transformers)
-`BERT Base multi-lingual cased` model.
 We applied unsupervised pre-training on 2,333,647 pages of
 unlabeled historical German text from the Berlin State Library
 digital collections, and supervised pre-training on two datasets
@@ -27,7 +26,7 @@ and [germeval_14](https://huggingface.co/models?dataset=dataset:germeval_14).
 In a 5-fold cross validation with different historical German NER corpora,
 the model obtained an F1-Score of **84.3**±1.1%.
 
-For details, see our [paper](https://corpora.linguistik.uni-erlangen.de/data/konvens/proceedings/papers/KONVENS2019_paper_4.pdf)
+For details, see our *KONVENS2019* [paper](https://corpora.linguistik.uni-erlangen.de/data/konvens/proceedings/papers/KONVENS2019_paper_4.pdf)
 or have a look at [sbb_ner](https://github.com/qurator-spk/sbb_ner) on GitHub.
 
 # Weights
```
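The README above reports an entity-level F1-Score of 84.3 ±1.1% over the classes `PER`, `LOC` and `ORG`. As a rough illustration of how such a score is computed (the actual evaluation code lives in the [sbb_ner](https://github.com/qurator-spk/sbb_ner) repository; the helper names below are hypothetical), a CoNLL-style entity-level F1 over BIO tag sequences might look like this:

```python
# Illustrative sketch only: entity-level precision/recall/F1 over BIO tags,
# the style of metric behind the reported 84.3 ±1.1%. Not the project's
# actual evaluation script.

def extract_entities(tags):
    """Collect (start, end, type) spans from a BIO tag sequence."""
    entities, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        # Close the open span on "O", on a new "B-", or on an "I-" of another type.
        if tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and tag[2:] != etype
        ):
            if start is not None:
                entities.append((start, i, etype))
                start, etype = None, None
        if tag.startswith("B-"):
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            # Treat a stray I- tag as the start of a new entity.
            start, etype = i, tag[2:]
    return entities

def f1_score(gold_tags, pred_tags):
    """Entity-level F1: a prediction counts only if span AND type match exactly."""
    gold = set(extract_entities(gold_tags))
    pred = set(extract_entities(pred_tags))
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = ["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]
pred = ["B-PER", "I-PER", "O", "B-LOC", "O", "O"]  # missed the ORG entity
print(round(f1_score(gold, pred), 3))  # prints 0.8
```

Note that this exact-span matching is stricter than token-level accuracy: a prediction that gets the entity type right but clips one token of the span scores zero for that entity.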