---
language:
- sv
widget:
- text: >-
    Den i HandelstidniDgens g&rdagsnnmmer omtalade hvalfisken, sorn fångats i
    Frölnndaviken
  example_title: 'News article #1'
- text: En Gosse fur plats nu genast ! inetallyrkc, JU 83 Drottninggatan.
  example_title: 'News article #2'
- text: AfgäiigStiden bestämmes wid fartyget» hltkomst.
  example_title: 'News article #3'
---
(Work in progress)
# Swedish OCR correction
This model corrects OCR errors in Swedish text.
## Model Description
This model is a fine-tuned version of byt5-small, a byte-level multilingual transformer. It is fine-tuned on OCR samples from Swedish newspapers and historical texts.
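Because ByT5 works directly on UTF-8 bytes rather than a learned subword vocabulary, its input encoding can be sketched in a few lines. To our understanding of the ByT5 scheme, ids 0-2 are reserved for special tokens and each byte `b` maps to id `b + 3`; the helper name below is ours, not part of the model:

```python
def byt5_ids(text: str) -> list[int]:
    """Map text to ByT5-style token ids: each UTF-8 byte b becomes b + 3,
    since ids 0-2 are reserved for <pad>, </s>, and <unk>."""
    return [b + 3 for b in text.encode("utf-8")]

# ASCII characters are one byte each; Swedish "å" needs two UTF-8 bytes,
# so it costs two ids against the model's input budget.
print(byt5_ids("hv"))  # bytes 104, 118 -> ids [107, 121]
print(byt5_ids("å"))   # bytes 0xC3, 0xA5 -> ids [198, 168]
```

This byte-level view is why the input limit below is stated in UTF-8 bytes, not characters.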
## Training Data
The base model byt5 is pre-trained on mc4. This fine-tuned version is further trained on:
- Swedish newspapers from 1818 to 2018. Parts of the dataset are available from Språkbanken Text: Swedish newspapers 1818-1870, Swedish newspapers 1871-1906.
- Swedish blackletter documents from 1626 to 1816, available from Språkbanken Text: Swedish fraktur 1626-1816
## Usage
The model accepts input sequences of at most 128 UTF-8 bytes, so longer texts should be split into smaller chunks before correction.
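A minimal usage sketch with 🤗 Transformers, assuming the standard seq2seq API; the default model id below is a placeholder and should be replaced with this model's actual Hub id. The chunking helper is ours, a simple way to respect the 128-byte input limit:

```python
def chunk_utf8(text: str, max_bytes: int = 128) -> list[str]:
    """Split text on whitespace into chunks of at most max_bytes UTF-8 bytes.
    (A single word longer than max_bytes is emitted as its own chunk.)"""
    chunks: list[str] = []
    current = ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate.encode("utf-8")) <= max_bytes:
            current = candidate
        else:
            if current:
                chunks.append(current)
            current = word
    if current:
        chunks.append(current)
    return chunks


def correct_ocr(text: str, model_id: str = "google/byt5-small") -> str:
    """Correct OCR noise chunk by chunk. model_id is a placeholder:
    substitute this model's Hub id. Requires `transformers` and `torch`."""
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    corrected = []
    for chunk in chunk_utf8(text):
        inputs = tokenizer(chunk, return_tensors="pt")
        outputs = model.generate(**inputs, max_new_tokens=128)
        corrected.append(tokenizer.decode(outputs[0], skip_special_tokens=True))
    return " ".join(corrected)
```

For example, `correct_ocr("Den i HandelstidniDgens g&rdagsnnmmer omtalade hvalfisken")` would return the model's corrected reading of the first widget example.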