# MARTINI_enrich_BERTopic_OdeToPower2
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
## Usage

To use this model, please install BERTopic:

```sh
pip install -U bertopic
```
You can use the model as follows:

```python
from bertopic import BERTopic

topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_OdeToPower2")
topic_model.get_topic_info()
```
## Topic overview
- Number of topics: 5
- Number of training documents: 492
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | telegram - freemasonry - anyone - prigozhin - hitler | 24 | -1_telegram_freemasonry_anyone_prigozhin |
| 0 | jews - freud - netanyahu - ukrainians - gaza | 208 | 0_jews_freud_netanyahu_ukrainians |
| 1 | whites - fuckers - racism - goddamn - soul | 108 | 1_whites_fuckers_racism_goddamn |
| 2 | men - libido - whore - smokers - daoist | 101 | 2_men_libido_whore_smokers |
| 3 | meme - normies - banned - subscribers - videos | 51 | 3_meme_normies_banned_subscribers |
## Training hyperparameters
- calculate_probabilities: True
- language: None
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: None
- seed_topic_list: None
- top_n_words: 10
- verbose: False
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
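The hyperparameters above correspond to BERTopic's constructor keywords, so a comparable model could be trained on your own corpus roughly as follows. This is a minimal sketch, assuming the names map one-to-one onto `BERTopic.__init__` arguments (as in recent BERTopic releases); `docs` stands in for your own list of documents:

```python
# The training hyperparameters listed above, as constructor keywords.
hyperparams = {
    "calculate_probabilities": True,
    "language": None,
    "low_memory": False,
    "min_topic_size": 10,
    "n_gram_range": (1, 1),
    "nr_topics": None,
    "seed_topic_list": None,
    "top_n_words": 10,
    "verbose": False,
    "zeroshot_min_similarity": 0.7,
    "zeroshot_topic_list": None,
}

# Sketch of refitting on your own data (requires `bertopic` installed
# and `docs`, a list of strings, which is not provided here):
# from bertopic import BERTopic
# topic_model = BERTopic(**hyperparams)
# topics, probs = topic_model.fit_transform(docs)
```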
## Framework versions
- Numpy: 1.26.4
- HDBSCAN: 0.8.40
- UMAP: 0.5.7
- Pandas: 2.2.3
- Scikit-Learn: 1.5.2
- Sentence-transformers: 3.3.1
- Transformers: 4.46.3
- Numba: 0.60.0
- Plotly: 5.24.1
- Python: 3.10.12
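To reproduce this environment, the versions above can be pinned in a requirements file. This is a sketch under the assumption that the standard PyPI package names apply (note UMAP is published as `umap-learn`); adjust for your platform:

```text
numpy==1.26.4
hdbscan==0.8.40
umap-learn==0.5.7
pandas==2.2.3
scikit-learn==1.5.2
sentence-transformers==3.3.1
transformers==4.46.3
numba==0.60.0
plotly==5.24.1
```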