
working

This model is a fine-tuned version of google-bert/bert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0013
  • Accuracy: 0.9997
  • F1: 0.9997
  • Precision: 0.9997
  • Recall: 0.9997

Model description

More information needed

Intended uses & limitations

More information needed
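
Pending a fuller description, the sketch below shows one plausible way to query the model for intent classification. It assumes the checkpoint is published on the Hub as Narkantak/Intent-classification-12k and that the intent label names were saved in the model config; the example utterance is illustrative only, since the card does not list the supported intents.

```python
# A minimal inference sketch, not an official usage example.
# Assumes the checkpoint id Narkantak/Intent-classification-12k and that
# id2label was stored in the saved config.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Narkantak/Intent-classification-12k",
)

# Hypothetical utterance; the card does not document the intent set.
print(classifier("I want to cancel my last order"))
# e.g. [{'label': '<some-intent>', 'score': 0.99}]
```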

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 10
  • num_epochs: 10
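
The list above maps directly onto the Transformers Trainer API. Below is a hedged reconstruction of the corresponding TrainingArguments; the output directory name is an assumption taken from the card title, and anything not stated above (dataset, tokenizer, metric hook) is omitted because the card does not specify it.

```python
# Hedged reconstruction of the training configuration above.
# Only values stated in the card are set; everything else stays at
# Trainer defaults. output_dir="working" is an assumption from the title.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="working",
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default
    # AdamW optimizer in Transformers 4.38, so no optimizer args are needed.
)
```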

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.7386 | 0.05 | 10 | 1.4921 | 0.5744 | 0.5473 | 0.6739 | 0.5638 |
| 1.2954 | 0.11 | 20 | 0.9534 | 0.9091 | 0.9066 | 0.9148 | 0.9082 |
| 0.7734 | 0.16 | 30 | 0.4544 | 0.9570 | 0.9567 | 0.9585 | 0.9578 |
| 0.3456 | 0.21 | 40 | 0.1662 | 0.9907 | 0.9908 | 0.9909 | 0.9907 |
| 0.1226 | 0.26 | 50 | 0.0743 | 0.9914 | 0.9914 | 0.9916 | 0.9914 |
| 0.0709 | 0.32 | 60 | 0.0446 | 0.9917 | 0.9917 | 0.9921 | 0.9914 |
| 0.0565 | 0.37 | 70 | 0.0408 | 0.9897 | 0.9898 | 0.9897 | 0.9900 |
| 0.0172 | 0.42 | 80 | 0.0214 | 0.9967 | 0.9967 | 0.9966 | 0.9968 |
| 0.0234 | 0.48 | 90 | 0.0217 | 0.9954 | 0.9954 | 0.9954 | 0.9954 |
| 0.027 | 0.53 | 100 | 0.0158 | 0.9977 | 0.9977 | 0.9976 | 0.9977 |
| 0.0346 | 0.58 | 110 | 0.0119 | 0.9980 | 0.9980 | 0.9980 | 0.9981 |
| 0.0621 | 0.63 | 120 | 0.0093 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0065 | 0.69 | 130 | 0.0156 | 0.9960 | 0.9960 | 0.9960 | 0.9960 |
| 0.006 | 0.74 | 140 | 0.0081 | 0.9980 | 0.9980 | 0.9979 | 0.9980 |
| 0.0045 | 0.79 | 150 | 0.0091 | 0.9970 | 0.9970 | 0.9969 | 0.9971 |
| 0.0108 | 0.85 | 160 | 0.0045 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0128 | 0.9 | 170 | 0.0033 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0166 | 0.95 | 180 | 0.0112 | 0.9974 | 0.9974 | 0.9973 | 0.9974 |
| 0.0031 | 1.01 | 190 | 0.0117 | 0.9974 | 0.9974 | 0.9973 | 0.9974 |
| 0.0028 | 1.06 | 200 | 0.0143 | 0.9967 | 0.9967 | 0.9967 | 0.9967 |
| 0.0031 | 1.11 | 210 | 0.0076 | 0.9987 | 0.9987 | 0.9987 | 0.9987 |
| 0.0126 | 1.16 | 220 | 0.0051 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0194 | 1.22 | 230 | 0.0048 | 0.9993 | 0.9993 | 0.9993 | 0.9993 |
| 0.0021 | 1.27 | 240 | 0.0093 | 0.9980 | 0.9980 | 0.9980 | 0.9980 |
| 0.002 | 1.32 | 250 | 0.0082 | 0.9983 | 0.9983 | 0.9983 | 0.9983 |
| 0.0187 | 1.38 | 260 | 0.0041 | 0.9993 | 0.9993 | 0.9993 | 0.9993 |
| 0.0018 | 1.43 | 270 | 0.0049 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0016 | 1.48 | 280 | 0.0050 | 0.9987 | 0.9987 | 0.9987 | 0.9986 |
| 0.0016 | 1.53 | 290 | 0.0047 | 0.9993 | 0.9993 | 0.9993 | 0.9993 |
| 0.0014 | 1.59 | 300 | 0.0049 | 0.9993 | 0.9993 | 0.9993 | 0.9993 |
| 0.0014 | 1.64 | 310 | 0.0050 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0013 | 1.69 | 320 | 0.0051 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0089 | 1.75 | 330 | 0.0360 | 0.9911 | 0.9911 | 0.9913 | 0.9911 |
| 0.0211 | 1.8 | 340 | 0.0042 | 0.9987 | 0.9987 | 0.9987 | 0.9987 |
| 0.0014 | 1.85 | 350 | 0.0223 | 0.9957 | 0.9957 | 0.9957 | 0.9957 |
| 0.003 | 1.9 | 360 | 0.0027 | 0.9993 | 0.9993 | 0.9993 | 0.9993 |
| 0.0086 | 1.96 | 370 | 0.0026 | 0.9993 | 0.9994 | 0.9994 | 0.9993 |
| 0.0013 | 2.01 | 380 | 0.0023 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.0011 | 2.06 | 390 | 0.0037 | 0.9987 | 0.9987 | 0.9987 | 0.9986 |
| 0.0047 | 2.12 | 400 | 0.0066 | 0.9980 | 0.9980 | 0.9980 | 0.9980 |
| 0.0013 | 2.17 | 410 | 0.0037 | 0.9990 | 0.9990 | 0.9990 | 0.9990 |
| 0.001 | 2.22 | 420 | 0.0020 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0009 | 2.28 | 430 | 0.0014 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0009 | 2.33 | 440 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0008 | 2.38 | 450 | 0.0011 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0008 | 2.43 | 460 | 0.0011 | 0.9997 | 0.9997 | 0.9997 | 0.9996 |
| 0.0009 | 2.49 | 470 | 0.0010 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0008 | 2.54 | 480 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0007 | 2.59 | 490 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0007 | 2.65 | 500 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0007 | 2.7 | 510 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.014 | 2.75 | 520 | 0.0019 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0009 | 2.8 | 530 | 0.0036 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0007 | 2.86 | 540 | 0.0036 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0007 | 2.91 | 550 | 0.0032 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0006 | 2.96 | 560 | 0.0028 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0006 | 3.02 | 570 | 0.0026 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0006 | 3.07 | 580 | 0.0024 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0007 | 3.12 | 590 | 0.0023 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0005 | 3.17 | 600 | 0.0021 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0006 | 3.23 | 610 | 0.0021 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0006 | 3.28 | 620 | 0.0018 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0005 | 3.33 | 630 | 0.0017 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0005 | 3.39 | 640 | 0.0016 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0005 | 3.44 | 650 | 0.0014 | 0.9993 | 0.9993 | 0.9994 | 0.9993 |
| 0.0005 | 3.49 | 660 | 0.0013 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0005 | 3.54 | 670 | 0.0012 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0005 | 3.6 | 680 | 0.0011 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.65 | 690 | 0.0011 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.7 | 700 | 0.0011 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.76 | 710 | 0.0010 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.81 | 720 | 0.0010 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.86 | 730 | 0.0010 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.92 | 740 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 3.97 | 750 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.02 | 760 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.07 | 770 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.13 | 780 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.18 | 790 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.23 | 800 | 0.0009 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.29 | 810 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.34 | 820 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.39 | 830 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.44 | 840 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.5 | 850 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.55 | 860 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0004 | 4.6 | 870 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.66 | 880 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.71 | 890 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.76 | 900 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.81 | 910 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.87 | 920 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.92 | 930 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 4.97 | 940 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.03 | 950 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.08 | 960 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.13 | 970 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.19 | 980 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.24 | 990 | 0.0007 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.29 | 1000 | 0.0006 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.34 | 1010 | 0.0006 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0003 | 5.4 | 1020 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.45 | 1030 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.5 | 1040 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.56 | 1050 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.61 | 1060 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.66 | 1070 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.71 | 1080 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.77 | 1090 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.82 | 1100 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.87 | 1110 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.93 | 1120 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 5.98 | 1130 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.03 | 1140 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.08 | 1150 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.14 | 1160 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.19 | 1170 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.24 | 1180 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.3 | 1190 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.35 | 1200 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 6.4 | 1210 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.46 | 1220 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.51 | 1230 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.56 | 1240 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.61 | 1250 | 0.0006 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.67 | 1260 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.72 | 1270 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.77 | 1280 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.83 | 1290 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.88 | 1300 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.93 | 1310 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 6.98 | 1320 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.04 | 1330 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.09 | 1340 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.14 | 1350 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.2 | 1360 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.25 | 1370 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.3 | 1380 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.35 | 1390 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.41 | 1400 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.46 | 1410 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.51 | 1420 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.57 | 1430 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.62 | 1440 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.67 | 1450 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.72 | 1460 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.78 | 1470 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.83 | 1480 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.88 | 1490 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.94 | 1500 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 7.99 | 1510 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.04 | 1520 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.1 | 1530 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.15 | 1540 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.2 | 1550 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.25 | 1560 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.31 | 1570 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.36 | 1580 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.41 | 1590 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.47 | 1600 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.52 | 1610 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.57 | 1620 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.62 | 1630 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.68 | 1640 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.73 | 1650 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.78 | 1660 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.84 | 1670 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.89 | 1680 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.94 | 1690 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 8.99 | 1700 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.05 | 1710 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.1 | 1720 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.15 | 1730 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.21 | 1740 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.26 | 1750 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.31 | 1760 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.37 | 1770 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.42 | 1780 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.47 | 1790 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.52 | 1800 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.58 | 1810 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.63 | 1820 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.68 | 1830 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.74 | 1840 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.79 | 1850 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.84 | 1860 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.89 | 1870 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 9.95 | 1880 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 10.0 | 1890 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
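
The four evaluation metrics in the table are consistent with a standard compute_metrics hook passed to the Trainer. The sketch below is one plausible implementation; the "weighted" averaging mode is an assumption, since the card does not state how F1, precision, and recall were aggregated across classes.

```python
# Plausible compute_metrics hook for the metrics reported above.
# average="weighted" is an assumption; the card does not specify it.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```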

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2
