BERT Uncased Model Fine-Tuned for Stock Sentiment
- This model is a fine-tuned version of BERT (Bidirectional Encoder Representations from Transformers) adapted for stock sentiment analysis. It was fine-tuned on tagged comments collected from the last two pages of each stock's forum on the Investing platform, covering stocks listed in the BIST Index.
Stock List:
- ACSEL, ADEL, ARCLK, ASELS, AZTEK, BIMAS, BFREN, BMSCH,
- CCOLA, CIMSA, CMBTN, CWENE, EKGYO, ENJSA, EREGL, FROTO,
- GOODY, GUBRF, HALKB, HEKTS, ISCTR, KCHOL, KOZAL, KOPOL,
- KRDMD, ONCSM, PETKM, PKART, SAHOL, SASA, SISE, SMRTG,
- THYAO, TMSN, TCELL, TTKOM, TOASO, TTRAK, TUPRS, VESTL, YAPRK, YKSLN
This fine-tuned model predicts the sentiment expressed toward these stocks in tagged forum comments and can be used for stock sentiment analysis in financial applications.
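As a quick way to try the model, the sketch below loads it with the Transformers pipeline API. The repository id used here is a placeholder, the example comments are illustrative, and the label names returned depend on how the classifier head was configured, which this card does not state.

```python
from transformers import pipeline

# Placeholder repository id -- substitute the actual model path on the Hugging Face Hub.
MODEL_ID = "your-username/bert-uncased-stock-sentiment"

# Load the fine-tuned classifier and its tokenizer in one step.
sentiment = pipeline("text-classification", model=MODEL_ID)

# Illustrative forum-style comments about BIST-listed stocks.
comments = [
    "THYAO posted strong results, expecting further upside.",
    "ASELS keeps dropping, I am cutting my losses.",
]

for comment, prediction in zip(comments, sentiment(comments)):
    print(f"{comment!r} -> {prediction['label']} ({prediction['score']:.3f})")
```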
Training Hyperparameters
The following hyperparameters were used during training (a reconstruction sketch follows the list):
- Optimizer: SGD
- Learning Rate: 3e-2
- Number of Training Epochs: 10
- Metric for Best Model: F1 Score
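The "metric for best model" setting and the samples/s and steps/s fields in the evaluation table further below match the Hugging Face Trainer API, so this sketch reconstructs the setup with Trainer; note that the card lists TensorFlow among the framework versions, so the actual training code may have differed. The base checkpoint, label count, sequence length, metric averaging, and the train_data/val_data placeholders are assumptions; only the SGD optimizer, the 3e-2 learning rate, the 10 epochs, and F1-based model selection come from this card.

```python
import numpy as np
import torch
from datasets import Dataset
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumed base checkpoint and label count; the card does not state them explicitly.
BASE_MODEL = "bert-base-uncased"
NUM_LABELS = 3  # e.g. negative / neutral / positive

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForSequenceClassification.from_pretrained(BASE_MODEL, num_labels=NUM_LABELS)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

# train_data / val_data are placeholders for the tagged forum comments described above,
# e.g. {"text": [...], "label": [...]}.
train_ds = Dataset.from_dict(train_data).map(tokenize, batched=True)
val_ds = Dataset.from_dict(val_data).map(tokenize, batched=True)

def compute_metrics(eval_pred):
    # Macro averaging is an assumption; the card does not say how precision/recall/F1 were averaged.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }

# From the card: 10 training epochs, F1 score used to select the best checkpoint.
args = TrainingArguments(
    output_dir="bert-uncased-stock-sentiment",
    num_train_epochs=10,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="f1",
)

# From the card: SGD with learning rate 3e-2, passed explicitly because Trainer defaults to AdamW.
optimizer = torch.optim.SGD(model.parameters(), lr=3e-2)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=val_ds,
    compute_metrics=compute_metrics,
    optimizers=(optimizer, None),  # None lets Trainer attach its default learning-rate schedule
)
trainer.train()
```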
Training Results
Epoch | Training Loss | Validation Loss | Accuracy | Precision | Recall | F1 Score |
---|---|---|---|---|---|---|
1 | 1.057400 | 0.895725 | 0.621538 | 0.618631 | 0.612559 | 0.611949 |
2 | 0.908400 | 0.822652 | 0.632308 | 0.644781 | 0.629953 | 0.622661 |
3 | 0.812100 | 0.788586 | 0.656923 | 0.680735 | 0.659374 | 0.650310 |
4 | 0.747700 | 0.737312 | 0.667692 | 0.670311 | 0.668073 | 0.666547 |
5 | 0.712600 | 0.743018 | 0.692308 | 0.710226 | 0.691384 | 0.686578 |
6 | 0.659200 | 0.771312 | 0.670769 | 0.695524 | 0.669198 | 0.662246 |
7 | 0.608300 | 0.733821 | 0.680000 | 0.677778 | 0.678871 | 0.677992 |
8 | 0.575900 | 0.739905 | 0.701538 | 0.702704 | 0.700902 | 0.698514 |
9 | 0.565200 | 0.754889 | 0.692308 | 0.692446 | 0.693058 | 0.691157 |
10 | 0.541000 | 0.754683 | 0.704615 | 0.705291 | 0.704209 | 0.702093 |
Evaluation Results
Loss | Accuracy | Precision | Recall | F1 Score | Runtime (s) | Samples/s | Steps/s | Epoch |
---|---|---|---|---|---|---|---|---|
0.754683 | 0.704615 | 0.705291 | 0.704209 | 0.702093 | 3.3869 | 191.915 | 24.211 | 10.0 |
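Continuing the hypothetical Trainer setup sketched above, this row corresponds to the dictionary returned by trainer.evaluate(); the key names below are the standard ones the Trainer emits for a compute_metrics function like the one sketched earlier.

```python
# Evaluate the best checkpoint on the validation split; the returned dictionary
# carries the same quantities as the table above.
metrics = trainer.evaluate(val_ds)
for key in (
    "eval_loss",
    "eval_accuracy",
    "eval_precision",
    "eval_recall",
    "eval_f1",
    "eval_runtime",
    "eval_samples_per_second",
    "eval_steps_per_second",
    "epoch",
):
    print(f"{key}: {metrics[key]}")
```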
Framework Versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Tokenizers 0.13.3