
flan-t5-base-spelling

This model is a fine-tuned version of google/flan-t5-base on the wiki.en dataset from oliverguhr/spelling.

Model description

This is an experimental model that should be capable of fixing typos and punctuation.

from transformers import pipeline

pipe = pipeline(
    "text2text-generation",
    model="jbochi/flan-t5-base-spelling",
)

def fix_spelling(input_sentence):
    # The model expects a "proofread: " prefix on every input.
    output = pipe("proofread: " + input_sentence, max_length=100)
    return output[0]['generated_text']

print(fix_spelling("lets do a comparsion"))
# lets do a comparison

Intended uses & limitations

Intended for research purposes.

  • It may produce artifacts.
  • It doesn't support languages other than English.
  • It was fine-tuned with a max_length of 100 tokens.
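Because of the 100-token training limit, longer inputs should be split into smaller chunks before proofreading and re-joined afterwards. A minimal sketch of such a splitter, using whitespace word count as a rough stand-in for subword token count (the helper name and the 60-word budget are illustrative, not part of the model):

```python
import re

def split_for_proofreading(text, max_words=60):
    """Greedily pack sentences into chunks of at most max_words words.

    Word count is only a rough proxy for subword tokens; a 60-word
    budget leaves headroom under the model's 100-token training limit.
    """
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    chunks, current, count = [], [], 0
    for sentence in sentences:
        words = len(sentence.split())
        # Start a new chunk when adding this sentence would overflow.
        if current and count + words > max_words:
            chunks.append(" ".join(current))
            current, count = [], 0
        current.append(sentence)
        count += words
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Each chunk can then be passed through the pipeline separately and the corrected chunks concatenated.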

Training and evaluation data

Data from oliverguhr/spelling, with a "proofread: " prefix added to every example.
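The prefixing step can be sketched as a batch mapper in the style of `datasets.Dataset.map`. Note the column names `source` and `target` here are assumptions about the dataset schema, used only for illustration:

```python
def add_proofread_prefix(batch):
    # "source" (misspelled text) and "target" (corrected text) are
    # illustrative column names, not confirmed from the dataset card.
    return {
        "input_text": ["proofread: " + s for s in batch["source"]],
        "target_text": batch["target"],
    }
```

With the Hugging Face datasets library this would be applied as `dataset.map(add_proofread_prefix, batched=True)` before tokenization.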

During training, the model was evaluated only on the first 100 test examples.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001 (probably too high)
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
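These settings roughly correspond to a `Seq2SeqTrainingArguments` configuration like the following (a hedged sketch: the output directory is illustrative, and anything not listed above falls back to transformers defaults, which already use Adam with betas=(0.9, 0.999), epsilon=1e-08, and a linear scheduler):

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above; unlisted arguments
# keep their transformers defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-spelling",  # illustrative path
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```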

Training results

Training Loss Epoch Step Validation Loss Rouge1 Rouge2 RougeL RougeLsum Gen Len
0.412 0.01 1000 0.3298 95.7852 91.7559 95.7602 95.7515 33.6
0.3593 0.01 2000 0.3301 95.8348 91.8272 95.8136 95.8144 33.55
0.3558 0.02 3000 0.3235 95.7455 91.6348 95.7048 95.7262 33.51
0.3242 0.03 4000 0.3359 95.863 91.8177 95.8296 95.8436 33.49
0.313 0.03 5000 0.3359 95.6728 91.4223 95.6613 95.6584 33.54
0.3123 0.04 6000 0.3441 95.8138 91.8344 95.7997 95.8025 33.6
0.3099 0.05 7000 0.3321 95.8138 91.8344 95.7997 95.8025 33.62
0.2958 0.05 8000 0.3269 95.8108 91.664 95.7949 95.7904 33.54
0.2749 0.06 9000 0.3237 95.758 91.7913 95.7622 95.7659 33.59
0.2841 0.07 10000 0.3114 95.8508 91.8853 95.8329 95.832 33.51
0.2772 0.07 11000 0.3206 95.8584 91.9881 95.853 95.864 33.54
0.2875 0.08 12000 0.3164 95.824 91.8821 95.8201 95.8307 33.56
0.3008 0.09 13000 0.3202 95.9349 92.075 95.9245 95.9207 33.55
0.288 0.09 14000 0.3140 95.7841 91.7283 95.7548 95.7585 33.42
0.2866 0.1 15000 0.3207 95.8259 91.8816 95.8057 95.8181 33.57
0.284 0.11 16000 0.3177 95.8209 91.8465 95.7971 95.7896 33.59
0.2574 0.11 17000 0.3146 95.8119 91.8995 95.7928 95.794 33.55
0.2807 0.12 18000 0.3214 95.7925 91.8605 95.769 95.7734 33.55
0.2742 0.13 19000 0.3185 95.8752 91.9684 95.8473 95.8513 33.49
0.2784 0.13 20000 0.3237 95.8729 92.0086 95.8636 95.8659 33.53
0.2768 0.14 21000 0.3187 95.6921 91.5779 95.6763 95.6681 33.44
0.2789 0.15 22000 0.3245 95.786 91.8861 95.7659 95.7607 33.5
0.2422 0.15 23000 0.3285 95.8421 91.9532 95.8388 95.8337 33.59
0.2838 0.16 24000 0.3186 95.5789 91.3976 95.5557 95.5694 33.56
0.2603 0.17 25000 0.3268 95.7276 91.6634 95.7156 95.7154 33.55
0.2622 0.17 26000 0.3230 95.808 91.9242 95.8018 95.7992 33.58
0.264 0.18 27000 0.3143 95.7982 91.8439 95.803 95.7941 33.6
0.26 0.19 28000 0.3245 95.7435 91.7233 95.7274 95.7203 33.6
0.2644 0.19 29000 0.3173 95.7982 91.8439 95.803 95.7941 33.56
0.2619 0.2 30000 0.3234 95.6744 91.5742 95.669 95.6593 33.57
0.2621 0.21 31000 0.3211 95.8658 91.9664 95.8593 95.8504 33.56
0.247 0.21 32000 0.3232 95.7248 91.4886 95.7127 95.7006 33.57
0.2428 0.22 33000 0.3206 95.8412 91.9314 95.8346 95.826 33.56
0.2389 0.23 34000 0.3125 95.7443 91.724 95.7435 95.7439 33.57
0.2634 0.23 35000 0.3205 95.8085 91.863 95.8091 95.805 33.56
0.2552 0.24 36000 0.3112 95.7519 91.7062 95.7286 95.744 33.54
0.2554 0.25 37000 0.3141 95.736 91.7453 95.7348 95.7258 33.56
0.2587 0.25 38000 0.3140 95.7572 91.6578 95.7428 95.7436 33.48
0.2521 0.26 39000 0.3146 95.7416 91.4858 95.7294 95.7293 33.47
0.2625 0.27 40000 0.3175 95.69 91.5155 95.6853 95.6856 33.53
0.2459 0.27 41000 0.3094 95.7464 91.7371 95.7353 95.7386 33.57
0.245 0.28 42000 0.3132 95.7602 91.7861 95.7498 95.7554 33.61
0.2403 0.29 43000 0.3169 95.6634 91.593 95.6744 95.6835 33.57
0.2516 0.29 44000 0.3146 95.6702 91.4878 95.6453 95.6495 33.53
0.2463 0.3 45000 0.3082 95.6617 91.5631 95.6687 95.6628 33.61
0.23 0.31 46000 0.3109 95.7859 91.8242 95.7675 95.7743 33.55
0.2486 0.31 47000 0.3116 95.8384 91.8854 95.8349 95.8324 33.5
0.249 0.32 48000 0.3129 95.7614 91.7998 95.7548 95.7663 33.53
0.2285 0.33 49000 0.3149 95.7684 91.7453 95.7513 95.761 33.56
0.2447 0.33 50000 0.3133 95.7226 91.7332 95.7089 95.7118 33.55
0.2374 0.34 51000 0.3096 95.7373 91.772 95.7205 95.7261 33.57
0.2361 0.35 52000 0.3156 95.8283 92.0269 95.8162 95.8259 33.6
0.2408 0.35 53000 0.3098 95.6854 91.7511 95.6702 95.6927 33.63
0.2419 0.36 54000 0.3140 95.5872 91.3338 95.5907 95.6066 33.54
0.2436 0.37 55000 0.3134 95.7498 91.8573 95.7465 95.7411 33.54
0.2396 0.37 56000 0.3138 95.7169 91.7169 95.698 95.7106 33.51
0.2315 0.38 57000 0.3122 95.809 91.9188 95.8006 95.7915 33.49
0.2298 0.39 58000 0.3181 95.6967 91.6519 95.6813 95.6906 33.49
0.2345 0.39 59000 0.3173 95.7213 91.6964 95.7225 95.7077 33.5
0.2323 0.4 60000 0.3169 95.6666 91.543 95.6482 95.6599 33.58
0.236 0.41 61000 0.3164 95.7845 91.8149 95.7699 95.7677 33.56
0.2246 0.41 62000 0.3110 95.6412 91.4598 95.6383 95.6356 33.5
0.2267 0.42 63000 0.3088 95.7137 91.6683 95.706 95.7034 33.53
0.232 0.43 64000 0.3105 95.7599 91.7777 95.7602 95.7566 33.56
0.2123 0.43 65000 0.3082 95.6892 91.6144 95.6855 95.6935 33.57
0.2195 0.44 66000 0.3053 95.7095 91.7089 95.7063 95.7029 33.54
0.2434 0.45 67000 0.3093 95.8082 91.8858 95.7912 95.7946 33.52
0.2336 0.45 68000 0.3050 95.814 91.8745 95.8045 95.7973 33.55
0.2326 0.46 69000 0.3029 95.7247 91.7338 95.7163 95.7136 33.51
0.2454 0.47 70000 0.3123 95.7778 91.7202 95.7531 95.7492 33.48
0.2402 0.47 71000 0.3090 95.7694 91.6795 95.766 95.7524 33.47
0.2233 0.48 72000 0.3100 95.7594 91.7237 95.7389 95.7391 33.53
0.2199 0.49 73000 0.3135 95.7177 91.6686 95.7014 95.7123 33.47
0.2205 0.49 74000 0.3116 95.714 91.5665 95.7022 95.7061 33.51
0.2178 0.5 75000 0.3120 95.7485 91.6867 95.7277 95.7411 33.54
0.2226 0.51 76000 0.3130 95.7285 91.6919 95.7199 95.7192 33.5
0.2199 0.51 77000 0.3123 95.7969 91.8832 95.7934 95.7782 33.48
0.2177 0.52 78000 0.3090 95.7166 91.7218 95.7148 95.7098 33.55
0.216 0.53 79000 0.3024 95.6977 91.689 95.7016 95.6875 33.53
0.2252 0.53 80000 0.3057 95.6664 91.6233 95.6616 95.6674 33.54
0.2209 0.54 81000 0.3057 95.4622 91.2615 95.4438 95.4563 33.54
0.2134 0.55 82000 0.3107 95.6903 91.6428 95.686 95.6862 33.53
0.2174 0.55 83000 0.3078 95.7232 91.6865 95.7109 95.7141 33.58
0.2217 0.56 84000 0.3062 95.6664 91.6233 95.6616 95.6674 33.55
0.2186 0.57 85000 0.3096 95.5492 91.3676 95.5382 95.5372 33.54
0.2192 0.57 86000 0.3070 95.6729 91.4616 95.6675 95.6665 33.52
0.2315 0.58 87000 0.3034 95.5492 91.3676 95.5382 95.5372 33.54
0.2248 0.59 88000 0.3023 95.7411 91.6705 95.7342 95.7318 33.55
0.2193 0.59 89000 0.3061 95.7364 91.709 95.7285 95.7354 33.57
0.2212 0.6 90000 0.3061 95.6604 91.5168 95.6399 95.6356 33.57
0.2287 0.61 91000 0.3073 95.7703 91.7829 95.7669 95.7617 33.57
0.239 0.61 92000 0.3063 95.7232 91.6865 95.7109 95.7141 33.59
0.2113 0.62 93000 0.3123 95.6757 91.4957 95.6738 95.6683 33.56
0.2259 0.63 94000 0.3110 95.6757 91.4957 95.6738 95.6683 33.56
0.2178 0.63 95000 0.3142 95.6548 91.4577 95.6529 95.6478 33.57
0.2288 0.64 96000 0.3051 95.7232 91.6865 95.7109 95.7141 33.59
0.2051 0.65 97000 0.3073 95.761 91.7916 95.7624 95.7573 33.59
0.2227 0.65 98000 0.3091 95.6803 91.526 95.6722 95.6768 33.56
0.2353 0.66 99000 0.3087 95.662 91.4535 95.6541 95.6524 33.55
0.217 0.67 100000 0.3046 95.6757 91.4957 95.6738 95.6683 33.56
0.1989 0.67 101000 0.3062 95.6757 91.4957 95.6738 95.6683 33.56
0.217 0.68 102000 0.3071 95.6757 91.4957 95.6738 95.6683 33.56
0.22 0.69 103000 0.3048 95.6757 91.4957 95.6738 95.6683 33.56
0.2202 0.69 104000 0.3081 95.6757 91.4957 95.6738 95.6683 33.56
0.2121 0.7 105000 0.3088 95.6265 91.4405 95.6194 95.6178 33.54
0.2137 0.71 106000 0.3096 95.7694 91.6795 95.766 95.7524 33.49
0.2261 0.71 107000 0.3041 95.7209 91.6199 95.7148 95.7064 33.47
0.2105 0.72 108000 0.3042 95.7209 91.6199 95.7148 95.7064 33.47
0.1974 0.73 109000 0.3045 95.6593 91.5597 95.656 95.6542 33.53
0.198 0.73 110000 0.3054 95.7694 91.6795 95.766 95.7524 33.49
0.2217 0.74 111000 0.3049 95.7102 91.6135 95.7119 95.7038 33.55
0.225 0.75 112000 0.3021 95.7102 91.6135 95.7119 95.7038 33.55
0.2222 0.75 113000 0.3045 95.7102 91.6135 95.7119 95.7038 33.55
0.2078 0.76 114000 0.3041 95.7102 91.6135 95.7119 95.7038 33.55
0.2194 0.77 115000 0.3027 95.7102 91.6135 95.7119 95.7038 33.55
0.2155 0.77 116000 0.3037 95.6593 91.5597 95.656 95.6542 33.53
0.2201 0.78 117000 0.3007 95.7102 91.6135 95.7119 95.7038 33.55
0.2061 0.79 118000 0.3017 95.7102 91.6135 95.7119 95.7038 33.55
0.2091 0.79 119000 0.3021 95.7102 91.6135 95.7119 95.7038 33.55
0.1921 0.8 120000 0.3036 95.7102 91.6135 95.7119 95.7038 33.55
0.2013 0.81 121000 0.3033 95.7102 91.6135 95.7119 95.7038 33.55
0.2105 0.81 122000 0.3003 95.6511 91.473 95.6445 95.6453 33.54
0.2044 0.82 123000 0.2997 95.6511 91.473 95.6445 95.6453 33.55
0.2128 0.83 124000 0.2997 95.6511 91.473 95.6445 95.6453 33.55
0.2113 0.83 125000 0.3016 95.6317 91.4265 95.6215 95.6277 33.55
0.2078 0.84 126000 0.3003 95.6317 91.4265 95.6215 95.6277 33.55
0.2117 0.85 127000 0.3016 95.6511 91.473 95.6445 95.6453 33.55
0.2112 0.85 128000 0.3012 95.5576 91.1912 95.5377 95.5468 33.52
0.2035 0.86 129000 0.3018 95.6286 91.3629 95.608 95.6137 33.49
0.2228 0.87 130000 0.2999 95.6511 91.473 95.6445 95.6453 33.55
0.2079 0.87 131000 0.2999 95.6511 91.473 95.6445 95.6453 33.55
0.2145 0.88 132000 0.3004 95.6409 91.4248 95.6233 95.6332 33.53
0.1987 0.89 133000 0.3028 95.6409 91.4248 95.6233 95.6332 33.53
0.2045 0.89 134000 0.3043 95.6848 91.5839 95.679 95.6777 33.54
0.1922 0.9 135000 0.3014 95.6313 91.4632 95.6214 95.6226 33.52
0.1956 0.91 136000 0.3003 95.6313 91.4632 95.6214 95.6226 33.52
0.2132 0.91 137000 0.3001 95.6848 91.5839 95.679 95.6777 33.54
0.1989 0.92 138000 0.2998 95.6409 91.4248 95.6233 95.6332 33.53
0.2179 0.93 139000 0.2997 95.6409 91.4248 95.6233 95.6332 33.53
0.1921 0.93 140000 0.2994 95.6848 91.5839 95.679 95.6777 33.54
0.2031 0.94 141000 0.3003 95.6848 91.5839 95.679 95.6777 33.54
0.1961 0.95 142000 0.3021 95.6848 91.5839 95.679 95.6777 33.54
0.2166 0.95 143000 0.3023 95.6848 91.5839 95.679 95.6777 33.54
0.2105 0.96 144000 0.3021 95.6848 91.5839 95.679 95.6777 33.54
0.2244 0.97 145000 0.3019 95.6848 91.5839 95.679 95.6777 33.54
0.1998 0.97 146000 0.3017 95.6848 91.5839 95.679 95.6777 33.54
0.2001 0.98 147000 0.3016 95.6848 91.5839 95.679 95.6777 33.54
0.2152 0.99 148000 0.3015 95.6848 91.5839 95.679 95.6777 33.54
0.1987 0.99 149000 0.3014 95.6848 91.5839 95.679 95.6777 33.54
0.2068 1.0 150000 0.3014 95.6848 91.5839 95.679 95.6777 33.54

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.0
  • Tokenizers 0.15.0
Model size: 248M parameters (F32 safetensors)
