---
license: mit
datasets:
  - TimKoornstra/financial-tweets-sentiment
language:
  - en
metrics:
  - accuracy
  - f1
pipeline_tag: text-classification
tags:
  - sentiment
  - finance
  - sentiment-analysis
  - financial-sentiment-analysis
  - twitter
  - tweets
  - stocks
  - stock-market
  - crypto
  - cryptocurrency
base_model: StephanAkkerman/FinTwitBERT
---

# FinTwitBERT-sentiment

FinTwitBERT-sentiment is a fine-tuned model for classifying the sentiment of financial tweets. It uses FinTwitBERT as a base model, which has been pre-trained on 1 million financial tweets. Because tweets are far more informal than other financial texts, such as news headlines, this pre-training gives the model a strong grasp of the language used on social media, where it performs particularly well.

## Intended Uses

FinTwitBERT-sentiment is intended for classifying financial tweets or other financial social media texts.

## More Information

For a comprehensive overview, including the training setup and analysis of the model, visit the FinTwitBERT GitHub repository.

## Usage

Using Hugging Face's transformers library, the model and tokenizer can be combined into a pipeline for text classification.

```python
from transformers import BertForSequenceClassification, AutoTokenizer, pipeline

model = BertForSequenceClassification.from_pretrained(
    "StephanAkkerman/FinTwitBERT-sentiment",
    num_labels=3,
    id2label={0: "NEUTRAL", 1: "BULLISH", 2: "BEARISH"},
    label2id={"NEUTRAL": 0, "BULLISH": 1, "BEARISH": 2},
)
model.config.problem_type = "single_label_classification"
model.eval()

tokenizer = AutoTokenizer.from_pretrained("StephanAkkerman/FinTwitBERT-sentiment")

# Name the pipeline object distinctly to avoid shadowing the imported `pipeline` function
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

# Sentences we want the sentiment for
sentences = ["I love you"]

# Get the predicted sentiment
print(classifier(sentences))
```
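Each prediction from the pipeline is a dict with `"label"` and `"score"` keys, the standard transformers text-classification output; passing `top_k=None` to the pipeline call instead returns scores for all three labels per input. A minimal sketch of reducing such an all-labels result to the single most likely label, using hypothetical hard-coded scores in place of a real model call:

```python
# Hypothetical scores for one tweet (shape matches a top_k=None pipeline result);
# real values come from the model.
all_scores = [
    {"label": "BULLISH", "score": 0.72},
    {"label": "NEUTRAL", "score": 0.21},
    {"label": "BEARISH", "score": 0.07},
]

# Pick the label with the highest score
best = max(all_scores, key=lambda p: p["score"])
print(best["label"])  # BULLISH
```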

## Training

The model was trained with the following parameters:

## Citing & Authors

If you use FinTwitBERT or FinTwitBERT-sentiment in your research, please cite us as follows, noting that both authors contributed equally to this work:

```bibtex
@misc{FinTwitBERT,
  author = {Stephan Akkerman and Tim Koornstra},
  title = {FinTwitBERT: A Specialized Language Model for Financial Tweets},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/TimKoornstra/FinTwitBERT}}
}
```

## License

This project is licensed under the MIT License. See the LICENSE file for details.