# transformers_issues_topics
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework for generating easily interpretable topics from large datasets.
## Usage

To use this model, please install BERTopic:

```
pip install -U bertopic
```

You can use the model as follows:

```python
from bertopic import BERTopic

topic_model = BERTopic.load("Adamyu/transformers_issues_topics")
topic_model.get_topic_info()
```
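
Beyond the topic overview, you can assign topics to new documents with `transform`. A minimal sketch, assuming the model loads as above (the example issue titles here are invented):

```python
from bertopic import BERTopic

topic_model = BERTopic.load("Adamyu/transformers_issues_topics")

# Hypothetical issue titles; any list of strings works.
docs = [
    "Tokenizer produces wrong offsets for special tokens",
    "CUDA out of memory when fine-tuning",
]

# Predict the closest topic for each document.
topics, probs = topic_model.transform(docs)

# Inspect the top keywords (word, c-TF-IDF weight) of the first prediction.
print(topic_model.get_topic(topics[0]))
```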
## Topic overview
- Number of topics: 30
- Number of training documents: 9000
The table below gives an overview of all topics.
Topic ID | Topic Keywords | Topic Frequency | Label |
---|---|---|---|
-1 | bert - tf - pytorch - tensorflow - t5 | 12 | -1_bert_tf_pytorch_tensorflow |
0 | tokenizer - tokenizers - tokenization - tokenize - token | 2218 | 0_tokenizer_tokenizers_tokenization_tokenize |
1 | cuda - cuda0 - memory - tensorflow - ram | 1617 | 1_cuda_cuda0_memory_tensorflow |
2 | modelcard - modelcards - card - model - cards | 582 | 2_modelcard_modelcards_card_model |
3 | readmemd - readmetxt - readme - file - docstring | 536 | 3_readmemd_readmetxt_readme_file |
4 | gpt2 - gpt2tokenizer - gpt2tokenizerfast - gpt2model - gpt2xl | 490 | 4_gpt2_gpt2tokenizer_gpt2tokenizerfast_gpt2model |
5 | squaddataset - squad - squadpy - dataset - datasets | 430 | 5_squaddataset_squad_squadpy_dataset |
6 | summarization - summaries - sentences - sentencepiece - nlp | 387 | 6_summarization_summaries_sentences_sentencepiece |
7 | s2s - s2t - exampless2s - seq2seq - seq2seqtrainer | 335 | 7_s2s_s2t_exampless2s_seq2seq |
8 | trainertrain - trainer - trainerevaluate - trainers - training | 301 | 8_trainertrain_trainer_trainerevaluate_trainers |
9 | t5 - t5model - t5base - tf - t5large | 266 | 9_t5_t5model_t5base_tf |
10 | ci - testing - tests - speedup - test | 254 | 10_ci_testing_tests_speedup |
11 | importerror - transformerscli - transformers - transformer - transformerxl | 202 | 11_importerror_transformerscli_transformers_transformer |
12 | pipeline - pipelines - ner - nerpipeline - fillmaskpipeline | 168 | 12_pipeline_pipelines_ner_nerpipeline |
13 | glue - gluepy - glueconvertexamplestofeatures - huggingfacetransformers - huggingfacemaster | 164 | 13_glue_gluepy_glueconvertexamplestofeatures_huggingfacetransformers |
14 | questionansweringpipeline - questionanswering - answering - tfalbertforquestionanswering - questionasnwering | 162 | 14_questionansweringpipeline_questionanswering_answering_tfalbertforquestionanswering |
15 | deprecate - deprecation - warnings - deprecated - warning | 135 | 15_deprecate_deprecation_warnings_deprecated |
16 | onnx - onnxonnxruntime - onnxexport - 04onnxexport - 04onnxexportipynb | 132 | 16_onnx_onnxonnxruntime_onnxexport_04onnxexport |
17 | flaxbertmodel - flaxbertformaskedlm - flax - flaxelectraforpretraining - flaxelectraformaskedlm | 96 | 17_flaxbertmodel_flaxbertformaskedlm_flax_flaxelectraforpretraining |
18 | precision - accuracy - tf - tfbertfortokenclassification - dtypefloat32 | 95 | 18_precision_accuracy_tf_tfbertfortokenclassification |
19 | generationbeamsearchpy - generatebeamsearch - beamsearch - nonbeamsearch - beam | 81 | 19_generationbeamsearchpy_generatebeamsearch_beamsearch_nonbeamsearch |
20 | longformer - longformers - longform - longformerlayer - longformermodel | 74 | 20_longformer_longformers_longform_longformerlayer |
21 | cachedir - cache - cachedpath - caching - cached | 65 | 21_cachedir_cache_cachedpath_caching |
22 | wandbproject - wandb - wandbcallback - wandbdisabled - wandbdisabledtrue | 40 | 22_wandbproject_wandb_wandbcallback_wandbdisabled |
23 | layoutlm - layoutlmtokenizer - layout - layoutlmbaseuncased - template | 38 | 23_layoutlm_layoutlmtokenizer_layout_layoutlmbaseuncased |
24 | notebook - notebooks - blenderbot3b - community - blenderbot | 32 | 24_notebook_notebooks_blenderbot3b_community |
25 | electra - electrapretrainedmodel - electraformaskedlm - electraformultiplechoice - electrafortokenclassification | 30 | 25_electra_electrapretrainedmodel_electraformaskedlm_electraformultiplechoice |
26 | colab - cola - fixed - fix - links | 25 | 26_colab_cola_fixed_fix |
27 | pplm - pr - deprecated - variable - attributeerror | 20 | 27_pplm_pr_deprecated_variable |
28 | isort - blackisortflake8 - github - repo - version | 13 | 28_isort_blackisortflake8_github_repo |
## Training hyperparameters
- calculate_probabilities: False
- language: english
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: 30
- seed_topic_list: None
- top_n_words: 10
- verbose: True
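
For reference, these settings map directly onto the `BERTopic` constructor. A rough sketch of how a comparable model could be retrained (here `docs` stands in for the ~9000 training documents, which are not shipped with this card):

```python
from bertopic import BERTopic

docs = [...]  # placeholder for the ~9000 GitHub issue titles used in training

topic_model = BERTopic(
    calculate_probabilities=False,
    language="english",
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=30,
    seed_topic_list=None,
    top_n_words=10,
    verbose=True,
)
topics, probs = topic_model.fit_transform(docs)
```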
## Framework versions
- Numpy: 1.23.5
- HDBSCAN: 0.8.33
- UMAP: 0.5.5
- Pandas: 1.5.3
- Scikit-Learn: 1.2.2
- Sentence-transformers: 2.2.2
- Transformers: 4.35.2
- Numba: 0.58.1
- Plotly: 5.15.0
- Python: 3.10.12