# codet5-small-custom-functions-dataset-python
This model is a fine-tuned version of [Salesforce/codet5-small](https://huggingface.co/Salesforce/codet5-small) on a custom dataset of Python functions (the dataset itself is not further documented). It achieves the following results on the evaluation set:
- Loss: 0.2103
## Model description
More information needed
## Intended uses & limitations
More information needed
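
Although the card does not document intended uses, the checkpoint can be loaded like any other CodeT5 (T5-family) seq2seq model with the `transformers` library. The sketch below is illustrative only: the repository id and the prompt are placeholder assumptions, not values documented in this card.

```python
# Minimal usage sketch (assumed repo id and prompt; adjust to the actual checkpoint location).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/codet5-small-custom-functions-dataset-python"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example input; the exact prompt format depends on how the custom dataset was built.
prompt = "write a python function that returns the sum of a list"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```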
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
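
As a sketch only (the original training script is not included in this card), the hyperparameters above correspond roughly to a `Seq2SeqTrainingArguments` configuration like the following; the output path and the step-wise evaluation schedule are assumptions inferred from the results table.

```python
# Rough reconstruction of the reported hyperparameters (not the author's actual script).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="codet5-small-custom-functions-dataset-python",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # "Adam with betas=(0.9,0.999) and epsilon=1e-08" matches the Trainer defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="steps",  # assumption: the card logs a validation loss at every step
    eval_steps=1,
    logging_steps=1,
)

# These arguments would then be passed to a Seq2SeqTrainer together with the
# tokenized train and evaluation splits of the (undocumented) custom dataset.
```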
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
6.8821 | 0.03 | 1 | 4.9003 |
5.1641 | 0.06 | 2 | 4.1876 |
4.5747 | 0.09 | 3 | 3.5772 |
3.985 | 0.12 | 4 | 3.0527 |
4.0255 | 0.15 | 5 | 2.5962 |
3.1963 | 0.18 | 6 | 2.2589 |
3.01 | 0.21 | 7 | 1.9755 |
2.5837 | 0.24 | 8 | 1.7736 |
2.6645 | 0.27 | 9 | 1.6032 |
1.8825 | 0.3 | 10 | 1.4620 |
2.282 | 0.33 | 11 | 1.3621 |
1.9555 | 0.36 | 12 | 1.2926 |
2.0374 | 0.39 | 13 | 1.2261 |
1.6276 | 0.42 | 14 | 1.1631 |
1.937 | 0.45 | 15 | 1.1053 |
1.4738 | 0.48 | 16 | 1.0512 |
1.5335 | 0.52 | 17 | 1.0016 |
1.5224 | 0.55 | 18 | 0.9554 |
1.5048 | 0.58 | 19 | 0.9175 |
1.3983 | 0.61 | 20 | 0.8806 |
1.2506 | 0.64 | 21 | 0.8495 |
1.186 | 0.67 | 22 | 0.8243 |
1.1824 | 0.7 | 23 | 0.7988 |
1.29 | 0.73 | 24 | 0.7728 |
1.159 | 0.76 | 25 | 0.7468 |
0.9893 | 0.79 | 26 | 0.7193 |
1.2054 | 0.82 | 27 | 0.7013 |
1.0004 | 0.85 | 28 | 0.6850 |
0.7918 | 0.88 | 29 | 0.6704 |
1.0357 | 0.91 | 30 | 0.6570 |
1.0648 | 0.94 | 31 | 0.6452 |
1.0679 | 0.97 | 32 | 0.6336 |
0.9296 | 1.0 | 33 | 0.6227 |
0.8459 | 1.03 | 34 | 0.6123 |
0.8312 | 1.06 | 35 | 0.6000 |
0.9367 | 1.09 | 36 | 0.5844 |
0.8813 | 1.12 | 37 | 0.5724 |
0.9134 | 1.15 | 38 | 0.5608 |
0.6967 | 1.18 | 39 | 0.5509 |
0.8654 | 1.21 | 40 | 0.5416 |
0.784 | 1.24 | 41 | 0.5324 |
0.7623 | 1.27 | 42 | 0.5237 |
0.739 | 1.3 | 43 | 0.5145 |
0.8273 | 1.33 | 44 | 0.5064 |
0.7384 | 1.36 | 45 | 0.4968 |
0.6936 | 1.39 | 46 | 0.4882 |
0.7078 | 1.42 | 47 | 0.4807 |
0.6214 | 1.45 | 48 | 0.4740 |
0.6983 | 1.48 | 49 | 0.4662 |
0.6328 | 1.52 | 50 | 0.4588 |
0.663 | 1.55 | 51 | 0.4533 |
0.6518 | 1.58 | 52 | 0.4476 |
0.5782 | 1.61 | 53 | 0.4343 |
0.6361 | 1.64 | 54 | 0.4296 |
0.5804 | 1.67 | 55 | 0.4249 |
0.6557 | 1.7 | 56 | 0.4210 |
0.6801 | 1.73 | 57 | 0.4173 |
0.6682 | 1.76 | 58 | 0.4132 |
0.6346 | 1.79 | 59 | 0.4090 |
0.6421 | 1.82 | 60 | 0.4028 |
0.6318 | 1.85 | 61 | 0.3969 |
0.6914 | 1.88 | 62 | 0.3942 |
0.5953 | 1.91 | 63 | 0.3920 |
0.7016 | 1.94 | 64 | 0.3894 |
0.5728 | 1.97 | 65 | 0.3839 |
0.5417 | 2.0 | 66 | 0.3738 |
0.5502 | 2.03 | 67 | 0.3705 |
0.5167 | 2.06 | 68 | 0.3668 |
0.6452 | 2.09 | 69 | 0.3629 |
0.4713 | 2.12 | 70 | 0.3583 |
0.5239 | 2.15 | 71 | 0.3553 |
0.6125 | 2.18 | 72 | 0.3527 |
0.4548 | 2.21 | 73 | 0.3414 |
0.5705 | 2.24 | 74 | 0.3389 |
0.4912 | 2.27 | 75 | 0.3374 |
0.4566 | 2.3 | 76 | 0.3316 |
0.5642 | 2.33 | 77 | 0.3288 |
0.4212 | 2.36 | 78 | 0.3260 |
0.3808 | 2.39 | 79 | 0.3236 |
0.4833 | 2.42 | 80 | 0.3214 |
0.4775 | 2.45 | 81 | 0.3193 |
0.5598 | 2.48 | 82 | 0.3175 |
0.5144 | 2.52 | 83 | 0.3162 |
0.4554 | 2.55 | 84 | 0.3152 |
0.4811 | 2.58 | 85 | 0.3141 |
0.4545 | 2.61 | 86 | 0.3130 |
0.438 | 2.64 | 87 | 0.3117 |
0.4071 | 2.67 | 88 | 0.3104 |
0.4635 | 2.7 | 89 | 0.3090 |
0.5118 | 2.73 | 90 | 0.3077 |
0.4043 | 2.76 | 91 | 0.3059 |
0.4675 | 2.79 | 92 | 0.3044 |
0.4551 | 2.82 | 93 | 0.3021 |
0.497 | 2.85 | 94 | 0.2987 |
0.4334 | 2.88 | 95 | 0.2932 |
0.4087 | 2.91 | 96 | 0.2901 |
0.477 | 2.94 | 97 | 0.2888 |
0.4834 | 2.97 | 98 | 0.2871 |
0.4513 | 3.0 | 99 | 0.2856 |
0.4172 | 3.03 | 100 | 0.2845 |
0.3827 | 3.06 | 101 | 0.2837 |
0.3851 | 3.09 | 102 | 0.2830 |
0.3976 | 3.12 | 103 | 0.2823 |
0.4909 | 3.15 | 104 | 0.2833 |
0.5409 | 3.18 | 105 | 0.2830 |
0.4039 | 3.21 | 106 | 0.2808 |
0.4057 | 3.24 | 107 | 0.2789 |
0.4214 | 3.27 | 108 | 0.2779 |
0.4209 | 3.3 | 109 | 0.2768 |
0.5044 | 3.33 | 110 | 0.2759 |
0.3457 | 3.36 | 111 | 0.2750 |
0.394 | 3.39 | 112 | 0.2744 |
0.4008 | 3.42 | 113 | 0.2739 |
0.3837 | 3.45 | 114 | 0.2736 |
0.3843 | 3.48 | 115 | 0.2734 |
0.4458 | 3.52 | 116 | 0.2730 |
0.4417 | 3.55 | 117 | 0.2725 |
0.4274 | 3.58 | 118 | 0.2719 |
0.4129 | 3.61 | 119 | 0.2712 |
0.421 | 3.64 | 120 | 0.2702 |
0.3625 | 3.67 | 121 | 0.2692 |
0.3785 | 3.7 | 122 | 0.2683 |
0.4023 | 3.73 | 123 | 0.2671 |
0.416 | 3.76 | 124 | 0.2663 |
0.3661 | 3.79 | 125 | 0.2654 |
0.373 | 3.82 | 126 | 0.2647 |
0.4045 | 3.85 | 127 | 0.2640 |
0.3955 | 3.88 | 128 | 0.2633 |
0.3796 | 3.91 | 129 | 0.2627 |
0.3682 | 3.94 | 130 | 0.2621 |
0.4195 | 3.97 | 131 | 0.2614 |
0.4135 | 4.0 | 132 | 0.2609 |
0.3244 | 4.03 | 133 | 0.2601 |
0.411 | 4.06 | 134 | 0.2597 |
0.4019 | 4.09 | 135 | 0.2599 |
0.451 | 4.12 | 136 | 0.2592 |
0.3948 | 4.15 | 137 | 0.2584 |
0.3375 | 4.18 | 138 | 0.2577 |
0.3687 | 4.21 | 139 | 0.2567 |
0.3946 | 4.24 | 140 | 0.2557 |
0.4181 | 4.27 | 141 | 0.2547 |
0.2949 | 4.3 | 142 | 0.2540 |
0.3621 | 4.33 | 143 | 0.2530 |
0.4134 | 4.36 | 144 | 0.2523 |
0.3366 | 4.39 | 145 | 0.2516 |
0.3798 | 4.42 | 146 | 0.2510 |
0.3519 | 4.45 | 147 | 0.2505 |
0.2999 | 4.48 | 148 | 0.2501 |
0.4096 | 4.52 | 149 | 0.2495 |
0.4736 | 4.55 | 150 | 0.2485 |
0.3481 | 4.58 | 151 | 0.2481 |
0.3683 | 4.61 | 152 | 0.2479 |
0.325 | 4.64 | 153 | 0.2476 |
0.3746 | 4.67 | 154 | 0.2473 |
0.3394 | 4.7 | 155 | 0.2468 |
0.3653 | 4.73 | 156 | 0.2463 |
0.3222 | 4.76 | 157 | 0.2458 |
0.3496 | 4.79 | 158 | 0.2453 |
0.368 | 4.82 | 159 | 0.2450 |
0.3473 | 4.85 | 160 | 0.2447 |
0.3712 | 4.88 | 161 | 0.2445 |
0.3542 | 4.91 | 162 | 0.2443 |
0.3249 | 4.94 | 163 | 0.2436 |
0.3135 | 4.97 | 164 | 0.2431 |
0.3603 | 5.0 | 165 | 0.2427 |
0.3345 | 5.03 | 166 | 0.2424 |
0.3385 | 5.06 | 167 | 0.2428 |
0.3939 | 5.09 | 168 | 0.2422 |
0.334 | 5.12 | 169 | 0.2414 |
0.3482 | 5.15 | 170 | 0.2401 |
0.3323 | 5.18 | 171 | 0.2396 |
0.3603 | 5.21 | 172 | 0.2391 |
0.354 | 5.24 | 173 | 0.2385 |
0.3241 | 5.27 | 174 | 0.2379 |
0.4134 | 5.3 | 175 | 0.2373 |
0.3726 | 5.33 | 176 | 0.2369 |
0.2997 | 5.36 | 177 | 0.2364 |
0.3317 | 5.39 | 178 | 0.2360 |
0.3692 | 5.42 | 179 | 0.2356 |
0.3411 | 5.45 | 180 | 0.2347 |
0.274 | 5.48 | 181 | 0.2342 |
0.3714 | 5.52 | 182 | 0.2337 |
0.442 | 5.55 | 183 | 0.2332 |
0.3262 | 5.58 | 184 | 0.2327 |
0.2929 | 5.61 | 185 | 0.2323 |
0.3435 | 5.64 | 186 | 0.2315 |
0.3921 | 5.67 | 187 | 0.2311 |
0.3609 | 5.7 | 188 | 0.2306 |
0.3585 | 5.73 | 189 | 0.2302 |
0.3323 | 5.76 | 190 | 0.2298 |
0.3205 | 5.79 | 191 | 0.2295 |
0.3407 | 5.82 | 192 | 0.2293 |
0.3109 | 5.85 | 193 | 0.2290 |
0.3075 | 5.88 | 194 | 0.2287 |
0.3538 | 5.91 | 195 | 0.2285 |
0.2968 | 5.94 | 196 | 0.2283 |
0.34 | 5.97 | 197 | 0.2281 |
0.3608 | 6.0 | 198 | 0.2279 |
0.2768 | 6.03 | 199 | 0.2277 |
0.3783 | 6.06 | 200 | 0.2275 |
0.3024 | 6.09 | 201 | 0.2272 |
0.3221 | 6.12 | 202 | 0.2269 |
0.3432 | 6.15 | 203 | 0.2266 |
0.3497 | 6.18 | 204 | 0.2264 |
0.3174 | 6.21 | 205 | 0.2261 |
0.3034 | 6.24 | 206 | 0.2259 |
0.3035 | 6.27 | 207 | 0.2257 |
0.3185 | 6.3 | 208 | 0.2255 |
0.3851 | 6.33 | 209 | 0.2252 |
0.3612 | 6.36 | 210 | 0.2249 |
0.2838 | 6.39 | 211 | 0.2247 |
0.3452 | 6.42 | 212 | 0.2245 |
0.3358 | 6.45 | 213 | 0.2243 |
0.3181 | 6.48 | 214 | 0.2241 |
0.329 | 6.52 | 215 | 0.2240 |
0.2819 | 6.55 | 216 | 0.2238 |
0.3283 | 6.58 | 217 | 0.2237 |
0.2752 | 6.61 | 218 | 0.2235 |
0.3194 | 6.64 | 219 | 0.2233 |
0.2981 | 6.67 | 220 | 0.2230 |
0.2954 | 6.7 | 221 | 0.2229 |
0.2762 | 6.73 | 222 | 0.2228 |
0.3206 | 6.76 | 223 | 0.2223 |
0.3017 | 6.79 | 224 | 0.2221 |
0.3219 | 6.82 | 225 | 0.2219 |
0.2929 | 6.85 | 226 | 0.2215 |
0.3576 | 6.88 | 227 | 0.2212 |
0.2712 | 6.91 | 228 | 0.2210 |
0.2682 | 6.94 | 229 | 0.2207 |
0.3412 | 6.97 | 230 | 0.2205 |
0.3136 | 7.0 | 231 | 0.2203 |
0.3161 | 7.03 | 232 | 0.2200 |
0.2902 | 7.06 | 233 | 0.2197 |
0.3053 | 7.09 | 234 | 0.2194 |
0.3182 | 7.12 | 235 | 0.2190 |
0.2752 | 7.15 | 236 | 0.2186 |
0.262 | 7.18 | 237 | 0.2182 |
0.2783 | 7.21 | 238 | 0.2178 |
0.2795 | 7.24 | 239 | 0.2174 |
0.2964 | 7.27 | 240 | 0.2171 |
0.2737 | 7.3 | 241 | 0.2167 |
0.3377 | 7.33 | 242 | 0.2164 |
0.2579 | 7.36 | 243 | 0.2161 |
0.3015 | 7.39 | 244 | 0.2158 |
0.2525 | 7.42 | 245 | 0.2156 |
0.3187 | 7.45 | 246 | 0.2154 |
0.2628 | 7.48 | 247 | 0.2152 |
0.3267 | 7.52 | 248 | 0.2151 |
0.2718 | 7.55 | 249 | 0.2149 |
0.3153 | 7.58 | 250 | 0.2148 |
0.3555 | 7.61 | 251 | 0.2146 |
0.2921 | 7.64 | 252 | 0.2145 |
0.3538 | 7.67 | 253 | 0.2143 |
0.3197 | 7.7 | 254 | 0.2143 |
0.3745 | 7.73 | 255 | 0.2141 |
0.2762 | 7.76 | 256 | 0.2140 |
0.3053 | 7.79 | 257 | 0.2139 |
0.3357 | 7.82 | 258 | 0.2137 |
0.3105 | 7.85 | 259 | 0.2136 |
0.3287 | 7.88 | 260 | 0.2134 |
0.3194 | 7.91 | 261 | 0.2133 |
0.3151 | 7.94 | 262 | 0.2131 |
0.2784 | 7.97 | 263 | 0.2130 |
0.2946 | 8.0 | 264 | 0.2128 |
0.2804 | 8.03 | 265 | 0.2127 |
0.2549 | 8.06 | 266 | 0.2126 |
0.3115 | 8.09 | 267 | 0.2125 |
0.3675 | 8.12 | 268 | 0.2123 |
0.2582 | 8.15 | 269 | 0.2122 |
0.2974 | 8.18 | 270 | 0.2121 |
0.2885 | 8.21 | 271 | 0.2120 |
0.2962 | 8.24 | 272 | 0.2120 |
0.3726 | 8.27 | 273 | 0.2119 |
0.2631 | 8.3 | 274 | 0.2119 |
0.3114 | 8.33 | 275 | 0.2120 |
0.3445 | 8.36 | 276 | 0.2120 |
0.2782 | 8.39 | 277 | 0.2121 |
0.3429 | 8.42 | 278 | 0.2121 |
0.2533 | 8.45 | 279 | 0.2121 |
0.2858 | 8.48 | 280 | 0.2121 |
0.2815 | 8.52 | 281 | 0.2122 |
0.3285 | 8.55 | 282 | 0.2123 |
0.3484 | 8.58 | 283 | 0.2124 |
0.2468 | 8.61 | 284 | 0.2124 |
0.2686 | 8.64 | 285 | 0.2124 |
0.2784 | 8.67 | 286 | 0.2124 |
0.2645 | 8.7 | 287 | 0.2123 |
0.2882 | 8.73 | 288 | 0.2122 |
0.293 | 8.76 | 289 | 0.2121 |
0.2691 | 8.79 | 290 | 0.2120 |
0.3051 | 8.82 | 291 | 0.2120 |
0.2897 | 8.85 | 292 | 0.2119 |
0.2625 | 8.88 | 293 | 0.2119 |
0.3175 | 8.91 | 294 | 0.2119 |
0.2702 | 8.94 | 295 | 0.2118 |
0.3006 | 8.97 | 296 | 0.2118 |
0.2438 | 9.0 | 297 | 0.2118 |
0.3455 | 9.03 | 298 | 0.2118 |
0.2754 | 9.06 | 299 | 0.2117 |
0.2761 | 9.09 | 300 | 0.2117 |
0.2699 | 9.12 | 301 | 0.2116 |
0.322 | 9.15 | 302 | 0.2116 |
0.2373 | 9.18 | 303 | 0.2115 |
0.2814 | 9.21 | 304 | 0.2114 |
0.3558 | 9.24 | 305 | 0.2113 |
0.3223 | 9.27 | 306 | 0.2113 |
0.2798 | 9.3 | 307 | 0.2112 |
0.3263 | 9.33 | 308 | 0.2111 |
0.2523 | 9.36 | 309 | 0.2110 |
0.2687 | 9.39 | 310 | 0.2109 |
0.2623 | 9.42 | 311 | 0.2109 |
0.3164 | 9.45 | 312 | 0.2108 |
0.2801 | 9.48 | 313 | 0.2108 |
0.2967 | 9.52 | 314 | 0.2107 |
0.2816 | 9.55 | 315 | 0.2107 |
0.2721 | 9.58 | 316 | 0.2107 |
0.297 | 9.61 | 317 | 0.2106 |
0.2585 | 9.64 | 318 | 0.2106 |
0.2361 | 9.67 | 319 | 0.2106 |
0.2365 | 9.7 | 320 | 0.2105 |
0.3068 | 9.73 | 321 | 0.2105 |
0.2938 | 9.76 | 322 | 0.2105 |
0.3219 | 9.79 | 323 | 0.2104 |
0.2706 | 9.82 | 324 | 0.2104 |
0.2837 | 9.85 | 325 | 0.2104 |
0.3062 | 9.88 | 326 | 0.2103 |
0.3063 | 9.91 | 327 | 0.2103 |
0.3163 | 9.94 | 328 | 0.2103 |
0.2935 | 9.97 | 329 | 0.2103 |
0.2611 | 10.0 | 330 | 0.2103 |
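
Assuming the default token-level cross-entropy objective, a quick way to interpret the final validation loss is to convert it to an approximate perplexity:

```python
# Approximate perplexity from the final validation loss (assumes cross-entropy in nats per token).
import math

final_validation_loss = 0.2103
perplexity = math.exp(final_validation_loss)  # ~1.23
print(f"Approximate validation perplexity: {perplexity:.2f}")
```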
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3