CryptoBERT

CryptoBERT is a pre-trained BERT (Bidirectional Encoder Representations from Transformers) model fine-tuned on a dataset of crypto-related news articles. It is designed to analyze and understand crypto news, supporting tasks such as sentiment analysis and named entity recognition on cryptocurrency-related text.

Features

  • Domain-Specific Knowledge: Trained on a diverse dataset of crypto news, CryptoBERT captures domain-specific information, enabling it to understand the unique language and context of the cryptocurrency space.

  • Sentiment Analysis: CryptoBERT is capable of sentiment analysis, helping you gauge the overall sentiment expressed in crypto news articles, whether it's positive, negative, or neutral.

  • Named Entity Recognition (NER): The model identifies key entities such as cryptocurrency names, organizations, and notable figures, making it easier to extract relevant information from articles.

  • Fine-tuned for Crypto Jargon: CryptoBERT is fine-tuned to recognize and understand the specialized jargon commonly used in the crypto industry, ensuring accurate interpretation of news articles.

Usage
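
Below is a minimal sketch of loading the model for the sentiment-analysis capability described above, using the Hugging Face transformers library. The repository ID is a placeholder (substitute the actual Hub path for this model), and the label names returned depend on how the classification head was configured.

from transformers import pipeline

# Placeholder repository ID -- replace with the actual Hub path for CryptoBERT.
MODEL_ID = "your-namespace/cryptobert"

# Sentiment analysis on crypto news headlines.
classifier = pipeline("text-classification", model=MODEL_ID)

headlines = [
    "Bitcoin climbs after a major asset manager files for a spot ETF.",
    "Exchange halts withdrawals amid an ongoing regulatory investigation.",
]

for text, result in zip(headlines, classifier(headlines)):
    # Each result is a dict with the predicted label and its confidence score.
    print(f"{result['label']:<10} {result['score']:.3f}  {text}")

If the published checkpoint also ships a token-classification head for the NER capability listed above, the same API applies with pipeline("token-classification", model=MODEL_ID, aggregation_strategy="simple").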

Model details

  • Format: Safetensors
  • Model size: 17.4M params
  • Tensor type: F32