pszemraj committed
Commit 043970e
1 Parent(s): 9f6da1a

Update README.md

Files changed (1)
  1. README.md +69 -11
README.md CHANGED
@@ -1,20 +1,78 @@
  ---
- license: apache-2.0
+ languages:
+ - en
+ license:
+ - cc-by-nc-sa-4.0
+ - apache-2.0
  tags:
- - generated_from_trainer
+ - grammar
+ - spelling
+ - punctuation
+ - error-correction
+ - grammar synthesis
  datasets:
- - pszemraj/jflAUG-v5
- model-index:
- - name: bart-base-jflAUG-v5-r1
-   results: []
+ - jfleg
+ widget:
+ - text: There car broke down so their hitching a ride to they're class.
+   example_title: compound-1
+ - text: i can has cheezburger
+   example_title: cheezburger
+ - text: >-
+     so em if we have an now so with fito ringina know how to estimate the tren
+     given the ereafte mylite trend we can also em an estimate is nod s i again
+     tort watfettering an we have estimated the trend an called wot to be called
+     sthat of exty right now we can and look at wy this should not hare a trend i
+     becan we just remove the trend an and we can we now estimate tesees ona
+     effect of them exty
+   example_title: Transcribed Audio Example 2
+ - text: >-
+     My coworker said he used a financial planner to help choose his stocks so he
+     wouldn't loose money.
+   example_title: incorrect word choice (context)
+ - text: >-
+     good so hve on an tadley i'm not able to make it to the exla session on
+     monday this week e which is why i am e recording pre recording an this
+     excelleision and so to day i want e to talk about two things and first of
+     all em i wont em wene give a summary er about ta ohow to remove trents in
+     these nalitives from time series
+   example_title: lowercased audio transcription output
+ - text: Frustrated, the chairs took me forever to set up.
+   example_title: dangling modifier
+ - text: I would like a peice of pie.
+   example_title: miss-spelling
+ - text: >-
+     Which part of Zurich was you going to go hiking in when we were there for
+     the first time together? ! ?
+   example_title: chatbot on Zurich
+ - text: >-
+     Most of the course is about semantic or content of language but there are
+     also interesting topics to be learned from the servicefeatures except
+     statistics in characters in documents. At this point, Elvthos introduces
+     himself as his native English speaker and goes on to say that if you
+     continue to work on social scnce,
+   example_title: social science ASR summary output
+ - text: >-
+     they are somewhat nearby right yes please i'm not sure how the innish is
+     tepen thut mayyouselect one that istatte lo variants in their property e ere
+     interested and anyone basical e may be applyind reaching the browing
+     approach were
+   example_title: medical course audio transcription
+ parameters:
+   max_length: 128
+   min_length: 4
+   num_beams: 8
+   repetition_penalty: 1.21
+   length_penalty: 1
+   early_stopping: true
+ language:
+ - en
+ pipeline_tag: text2text-generation
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->

- # bart-base-jflAUG-v5-r1
+ # bart-base-grammar-synthesis

- This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the pszemraj/jflAUG-v5 dataset.
+ This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an expanded version of the JFLEG dataset.

  ## Model description

@@ -54,4 +112,4 @@ The following hyperparameters were used during training:
  - Transformers 4.28.1
  - Pytorch 2.0.1+cu117
  - Datasets 2.12.0
- - Tokenizers 0.13.3
+ - Tokenizers 0.13.3
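
The `parameters:` block added in this commit holds the decoding settings the Hub inference widget passes when generating corrections. A minimal sketch of reproducing those settings locally with the `transformers` text2text pipeline; note the repo id `pszemraj/bart-base-grammar-synthesis` is an assumption inferred from the new card title, so verify it on the Hub before use:

```python
from typing import Any, Dict

# Decoding settings mirrored from the card's `parameters:` block.
GEN_KWARGS: Dict[str, Any] = {
    "max_length": 128,
    "min_length": 4,
    "num_beams": 8,
    "repetition_penalty": 1.21,
    "length_penalty": 1.0,
    "early_stopping": True,
}

# Hypothetical repo id inferred from the card title; verify before use.
MODEL_ID = "pszemraj/bart-base-grammar-synthesis"


def correct(text: str, corrector) -> str:
    """Run one widget-style example through a text2text-generation pipeline,
    applying the same decoding settings the inference widget uses."""
    return corrector(text, **GEN_KWARGS)[0]["generated_text"]


# Usage (downloads the checkpoint, so it is shown commented out):
# from transformers import pipeline
# corrector = pipeline("text2text-generation", MODEL_ID)
# print(correct("I would like a peice of pie.", corrector))
```

The widget `text:` entries in the front matter double as ready-made test inputs for this function, since each one contains a deliberate grammar, spelling, or transcription error.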