---
library_name: transformers
base_model: lokeshk/Face-Recognition-NM
tags:
- image-classification
- faces-recognition
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: faces_clasification_alcss
  results: []
---

# faces_clasification_alcss

This model is a fine-tuned version of [lokeshk/Face-Recognition-NM](https://huggingface.co/lokeshk/Face-Recognition-NM) on the private_faces_dataset dataset. It achieves the following results on the evaluation set:
- Loss: 2.2189
- Accuracy: 0.5481

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
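
The snippet below is a minimal reproduction sketch of this configuration with the `transformers` `Trainer`, not the original training script. Because private_faces_dataset is not public, the ImageFolder path, split names, and preprocessing shown here are assumptions; the hyperparameters themselves are taken from the list above (the Adam betas/epsilon and the linear schedule are the `Trainer` defaults).

```python
import numpy as np
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base = "lokeshk/Face-Recognition-NM"

# Placeholder path: the real private_faces_dataset is not public. An
# ImageFolder layout (one sub-directory per identity) is assumed here.
ds = load_dataset("imagefolder", data_dir="path/to/private_faces_dataset")
labels = ds["train"].features["label"].names

processor = AutoImageProcessor.from_pretrained(base)
model = AutoModelForImageClassification.from_pretrained(
    base,
    num_labels=len(labels),
    id2label={i: name for i, name in enumerate(labels)},
    label2id={name: i for i, name in enumerate(labels)},
    ignore_mismatched_sizes=True,  # re-initialise the head for the new identity set
)

def transform(batch):
    # Resize/normalise images on the fly with the base model's image processor.
    inputs = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    inputs["labels"] = batch["label"]
    return inputs

ds = ds.with_transform(transform)

def collate(examples):
    return {
        "pixel_values": torch.stack([e["pixel_values"] for e in examples]),
        "labels": torch.tensor([e["labels"] for e in examples]),
    }

def compute_metrics(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=1)
    return {"accuracy": float((preds == eval_pred.label_ids).mean())}

args = TrainingArguments(
    output_dir="faces_clasification_alcss",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    eval_strategy="steps",
    eval_steps=10,                # matches the evaluation cadence in the table below
    remove_unused_columns=False,  # keep the raw "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds["train"],
    eval_dataset=ds["validation"],  # assumes a validation split in the folder layout
    data_collator=collate,
    compute_metrics=compute_metrics,
)
trainer.train()
```
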
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 4.6929 | 0.0962 | 10 | 4.5349 | 0.0913 |
| 4.4502 | 0.1923 | 20 | 4.3568 | 0.1010 |
| 4.3178 | 0.2885 | 30 | 4.2414 | 0.0865 |
| 4.2083 | 0.3846 | 40 | 4.1396 | 0.1731 |
| 4.1225 | 0.4808 | 50 | 4.0550 | 0.1538 |
| 3.908 | 0.5769 | 60 | 3.9737 | 0.1683 |
| 3.9502 | 0.6731 | 70 | 3.9053 | 0.1683 |
| 3.8725 | 0.7692 | 80 | 3.8432 | 0.1731 |
| 3.7944 | 0.8654 | 90 | 3.7840 | 0.2115 |
| 3.7648 | 0.9615 | 100 | 3.7644 | 0.1971 |
| 3.7211 | 1.0577 | 110 | 3.7172 | 0.1971 |
| 3.5759 | 1.1538 | 120 | 3.6622 | 0.25 |
| 3.6082 | 1.25 | 130 | 3.6080 | 0.2644 |
| 3.4272 | 1.3462 | 140 | 3.5738 | 0.2837 |
| 3.3349 | 1.4423 | 150 | 3.5526 | 0.2740 |
| 3.189 | 1.5385 | 160 | 3.5162 | 0.2837 |
| 3.2959 | 1.6346 | 170 | 3.4639 | 0.2933 |
| 3.1273 | 1.7308 | 180 | 3.4308 | 0.3365 |
| 3.1621 | 1.8269 | 190 | 3.3436 | 0.3221 |
| 3.1053 | 1.9231 | 200 | 3.2973 | 0.3702 |
| 3.0641 | 2.0192 | 210 | 3.2637 | 0.3462 |
| 2.6593 | 2.1154 | 220 | 3.2149 | 0.3798 |
| 2.5709 | 2.2115 | 230 | 3.1658 | 0.3846 |
| 2.5412 | 2.3077 | 240 | 3.1359 | 0.3702 |
| 2.5775 | 2.4038 | 250 | 3.1104 | 0.3654 |
| 2.6372 | 2.5 | 260 | 3.1027 | 0.3798 |
| 2.6732 | 2.5962 | 270 | 3.0252 | 0.4231 |
| 2.6384 | 2.6923 | 280 | 3.0202 | 0.3942 |
| 2.4607 | 2.7885 | 290 | 2.9785 | 0.4135 |
| 2.5519 | 2.8846 | 300 | 2.9470 | 0.4327 |
| 2.2381 | 2.9808 | 310 | 2.9402 | 0.4231 |
| 2.1999 | 3.0769 | 320 | 2.9074 | 0.4519 |
| 2.179 | 3.1731 | 330 | 2.8780 | 0.4567 |
| 2.1427 | 3.2692 | 340 | 2.8331 | 0.4423 |
| 2.1335 | 3.3654 | 350 | 2.8051 | 0.4760 |
| 1.7641 | 3.4615 | 360 | 2.7798 | 0.4712 |
| 1.9687 | 3.5577 | 370 | 2.7607 | 0.4808 |
| 1.8046 | 3.6538 | 380 | 2.7381 | 0.4760 |
| 1.944 | 3.75 | 390 | 2.7244 | 0.4856 |
| 1.7403 | 3.8462 | 400 | 2.6899 | 0.4567 |
| 1.7732 | 3.9423 | 410 | 2.6656 | 0.4808 |
| 1.4105 | 4.0385 | 420 | 2.6526 | 0.4760 |
| 1.377 | 4.1346 | 430 | 2.6448 | 0.4904 |
| 1.5767 | 4.2308 | 440 | 2.5933 | 0.4663 |
| 1.3826 | 4.3269 | 450 | 2.5832 | 0.5 |
| 1.6504 | 4.4231 | 460 | 2.5573 | 0.5192 |
| 1.5579 | 4.5192 | 470 | 2.5666 | 0.5048 |
| 1.2466 | 4.6154 | 480 | 2.5197 | 0.5144 |
| 1.32 | 4.7115 | 490 | 2.5145 | 0.5240 |
| 1.5286 | 4.8077 | 500 | 2.4909 | 0.5192 |
| 1.394 | 4.9038 | 510 | 2.4882 | 0.5192 |
| 1.3982 | 5.0 | 520 | 2.4616 | 0.5192 |
| 1.1167 | 5.0962 | 530 | 2.4569 | 0.5096 |
| 1.3562 | 5.1923 | 540 | 2.4559 | 0.5192 |
| 1.0018 | 5.2885 | 550 | 2.4438 | 0.5192 |
| 1.2367 | 5.3846 | 560 | 2.4204 | 0.5192 |
| 1.1748 | 5.4808 | 570 | 2.4112 | 0.5337 |
| 1.022 | 5.5769 | 580 | 2.4130 | 0.5385 |
| 1.0954 | 5.6731 | 590 | 2.3992 | 0.5192 |
| 0.9759 | 5.7692 | 600 | 2.3683 | 0.5240 |
| 1.0327 | 5.8654 | 610 | 2.3577 | 0.5144 |
| 1.1167 | 5.9615 | 620 | 2.3547 | 0.5144 |
| 0.8077 | 6.0577 | 630 | 2.3452 | 0.5385 |
| 0.95 | 6.1538 | 640 | 2.3486 | 0.5433 |
| 0.7993 | 6.25 | 650 | 2.3428 | 0.5529 |
| 0.923 | 6.3462 | 660 | 2.3279 | 0.5337 |
| 0.7566 | 6.4423 | 670 | 2.3176 | 0.5385 |
| 0.8834 | 6.5385 | 680 | 2.3201 | 0.5288 |
| 0.9337 | 6.6346 | 690 | 2.3064 | 0.5529 |
| 0.7596 | 6.7308 | 700 | 2.3063 | 0.5288 |
| 0.973 | 6.8269 | 710 | 2.2847 | 0.5337 |
| 1.0212 | 6.9231 | 720 | 2.3006 | 0.5433 |
| 0.8315 | 7.0192 | 730 | 2.2813 | 0.5385 |
| 0.814 | 7.1154 | 740 | 2.2751 | 0.5481 |
| 0.7658 | 7.2115 | 750 | 2.2754 | 0.5433 |
| 0.5956 | 7.3077 | 760 | 2.2781 | 0.5433 |
| 0.8864 | 7.4038 | 770 | 2.2602 | 0.5529 |
| 0.7181 | 7.5 | 780 | 2.2568 | 0.5721 |
| 0.643 | 7.5962 | 790 | 2.2586 | 0.5481 |
| 0.7107 | 7.6923 | 800 | 2.2520 | 0.5337 |
| 0.6078 | 7.7885 | 810 | 2.2423 | 0.5385 |
| 0.873 | 7.8846 | 820 | 2.2478 | 0.5337 |
| 0.7018 | 7.9808 | 830 | 2.2357 | 0.5577 |
| 0.7177 | 8.0769 | 840 | 2.2340 | 0.5481 |
| 0.6543 | 8.1731 | 850 | 2.2337 | 0.5481 |
| 0.7563 | 8.2692 | 860 | 2.2259 | 0.5529 |
| 0.549 | 8.3654 | 870 | 2.2310 | 0.5529 |
| 0.6981 | 8.4615 | 880 | 2.2381 | 0.5529 |
| 0.6172 | 8.5577 | 890 | 2.2285 | 0.5481 |
| 0.587 | 8.6538 | 900 | 2.2165 | 0.5481 |
| 0.613 | 8.75 | 910 | 2.2215 | 0.5481 |
| 0.5558 | 8.8462 | 920 | 2.2274 | 0.5529 |
| 0.6892 | 8.9423 | 930 | 2.2221 | 0.5433 |
| 0.5876 | 9.0385 | 940 | 2.2209 | 0.5481 |
| 0.5845 | 9.1346 | 950 | 2.2211 | 0.5481 |
| 0.645 | 9.2308 | 960 | 2.2208 | 0.5481 |
| 0.5619 | 9.3269 | 970 | 2.2204 | 0.5481 |
| 0.5371 | 9.4231 | 980 | 2.2219 | 0.5481 |
| 0.5048 | 9.5192 | 990 | 2.2203 | 0.5481 |
| 0.6007 | 9.6154 | 1000 | 2.2192 | 0.5481 |
| 0.5706 | 9.7115 | 1010 | 2.2187 | 0.5481 |
| 0.571 | 9.8077 | 1020 | 2.2189 | 0.5481 |
| 0.6692 | 9.9038 | 1030 | 2.2191 | 0.5481 |
| 0.5411 | 10.0 | 1040 | 2.2189 | 0.5481 |

### Framework versions

- Transformers 4.44.1
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
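
Once trained, the checkpoint can be used with the `transformers` image-classification pipeline. The model id and image path below are placeholders: point them at the published repo or the local `Trainer` output directory and at your own image.

```python
from transformers import pipeline

# Placeholder model id: use the published repo id or the local output
# directory produced by training (e.g. "faces_clasification_alcss").
classifier = pipeline("image-classification", model="faces_clasification_alcss")

# The image path is illustrative; a URL or a PIL.Image also works.
for pred in classifier("path/to/face.jpg", top_k=5):
    print(f"{pred['label']}: {pred['score']:.3f}")
```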