academia-mar11Top10
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
Usage
To use this model, please install BERTopic:

```
pip install -U bertopic
```

You can use the model as follows:

```python
from bertopic import BERTopic

topic_model = BERTopic.load("Thang203/academia-mar11Top10")
topic_model.get_topic_info()
```
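Beyond `get_topic_info()`, the loaded model can be queried directly. The snippet below is a minimal sketch: the example abstracts are hypothetical, and assigning topics to new text assumes the checkpoint can resolve its underlying embedding model (otherwise pass one via `BERTopic.load(..., embedding_model=...)`).

```python
# Keyword/score pairs for a single topic, e.g. topic 1 (code/software in the table below)
print(topic_model.get_topic(1))

# Assign topics to new documents (hypothetical example abstracts)
new_docs = [
    "We evaluate large language models on code generation benchmarks.",
    "Reinforcement learning with human feedback for policy optimization.",
]
topics, probs = topic_model.transform(new_docs)
print(topics)
```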
Topic overview
- Number of topics: 10
- Number of training documents: 2353
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | models - language - language models - large - llms | 10 | -1_models_language_language models_large |
| 0 | language - models - language models - model - large | 678 | 0_language_models_language models_model |
| 1 | code - software - llms - models - generation | 1117 | 1_code_software_llms_models |
| 2 | chatgpt - ai - education - students - learning | 168 | 2_chatgpt_ai_education_students |
| 3 | adversarial - attacks - models - privacy - attack | 127 | 3_adversarial_attacks_models_privacy |
| 4 | legal - financial - models - patent - language | 73 | 4_legal_financial_models_patent |
| 5 | learning - reinforcement learning - reinforcement - rl - policy | 55 | 5_learning_reinforcement learning_reinforcement_rl |
| 6 | training - quantization - memory - transformer - models | 54 | 6_training_quantization_memory_transformer |
| 7 | materials - molecular - molecule - discovery - models | 52 | 7_materials_molecular_molecule_discovery |
| 8 | surprisal - reading - models - language - language models | 19 | 8_surprisal_reading_models_language |
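The topics above can also be matched against a free-text query. A minimal sketch, assuming the same loaded `topic_model` and an available embedding model; the query string is only an example:

```python
# Find the topics closest to a query (hypothetical query string)
similar_topics, similarity = topic_model.find_topics("llms for code generation", top_n=3)
print(similar_topics, similarity)

# Keyword bar charts for those topics (uses Plotly, version listed below)
fig = topic_model.visualize_barchart(topics=similar_topics)
fig.show()
```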
Training hyperparameters
- calculate_probabilities: False
- language: english
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: 10
- seed_topic_list: None
- top_n_words: 10
- verbose: True
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
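These hyperparameters correspond one-to-one to arguments of the `BERTopic` constructor. The sketch below shows how a model with the same settings could be instantiated and fit; the training corpus `docs` is a placeholder (the ~2,353 abstracts are not distributed with this card), and the embedding, UMAP, and HDBSCAN components are left at BERTopic's defaults since the card does not record them.

```python
from bertopic import BERTopic

# Placeholder for the training corpus: a list of ~2,353 document strings
docs: list[str] = ...

topic_model = BERTopic(
    language="english",
    top_n_words=10,
    n_gram_range=(1, 1),
    min_topic_size=10,
    nr_topics=10,
    low_memory=False,
    calculate_probabilities=False,
    seed_topic_list=None,
    zeroshot_topic_list=None,
    zeroshot_min_similarity=0.7,
    verbose=True,
)
topics, _ = topic_model.fit_transform(docs)
```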
Framework versions
- Numpy: 1.25.2
- HDBSCAN: 0.8.33
- UMAP: 0.5.5
- Pandas: 1.5.3
- Scikit-Learn: 1.2.2
- Sentence-transformers: 2.6.1
- Transformers: 4.38.2
- Numba: 0.58.1
- Plotly: 5.15.0
- Python: 3.10.12