
Bittensor Subnet 13 X (Twitter) Dataset

Data-universe: The finest collection of social media data the web has to offer

Dataset Summary

This dataset is part of the Bittensor Subnet 13 decentralized network, containing preprocessed data from X (formerly Twitter). The data is continuously updated by network miners, providing a real-time stream of tweets for various analytical and machine learning tasks. For more information about the dataset, please visit the official repository.

Supported Tasks

The versatility of this dataset allows researchers and data scientists to explore various aspects of social media dynamics and develop innovative applications. Users are encouraged to leverage this data creatively for their specific research or business needs. For example:

  • Sentiment Analysis
  • Trend Detection
  • Content Analysis
  • User Behavior Modeling
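
As a starting point for tasks like sentiment analysis, the dataset can be streamed and fed into an off-the-shelf classifier. The sketch below is illustrative only: it assumes the repository exposes a "train" split and uses the default transformers sentiment-analysis model, neither of which is specified by this card.

```python
from datasets import load_dataset
from transformers import pipeline

# Stream a handful of tweets and score their sentiment.
# Assumptions: a "train" split exists; the default sentiment model is adequate for a quick look.
ds = load_dataset("StormKing99/x_dataset_8191", split="train", streaming=True)
classifier = pipeline("sentiment-analysis")

for example in ds.take(5):
    result = classifier(example["text"][:512])[0]  # truncate very long text for the model
    print(result["label"], round(result["score"], 3), example["text"][:80])
```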

Languages

Primary language: English. Because the data is collected by a decentralized network of miners, the dataset can also contain tweets in many other languages.

Dataset Structure

Data Instances

Each instance represents a single tweet with the following fields:

Data Fields

  • text (string): The main content of the tweet.
  • label (string): Sentiment or topic category of the tweet.
  • tweet_hashtags (list): A list of hashtags used in the tweet. May be empty if no hashtags are present.
  • datetime (string): The date and time when the tweet was posted.
  • username_encoded (string): An encoded version of the username to maintain user privacy.
  • url_encoded (string): An encoded version of any URLs included in the tweet. May be empty if no URLs are present.
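
A quick way to check these fields is to stream a single record and print them. This is a minimal sketch, assuming a "train" split exists; the field names are taken from the list above.

```python
from datasets import load_dataset

# Stream one example and inspect the documented fields without downloading the full dataset.
ds = load_dataset("StormKing99/x_dataset_8191", split="train", streaming=True)
first = next(iter(ds))
for field in ("text", "label", "tweet_hashtags", "datetime",
              "username_encoded", "url_encoded"):
    print(f"{field}: {first.get(field)!r}")
```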

Data Splits

This dataset is continuously updated and does not have fixed splits. Users should create their own splits based on their requirements and the data's timestamp.
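
For example, a timestamp-based split can be built by filtering on the datetime field. The sketch below assumes a "train" split and that datetime holds ISO 8601 strings, which compare correctly as plain strings; the cutoff date is hypothetical.

```python
from datasets import load_dataset

# Split the stream into "older" and "newer" portions around an arbitrary cutoff.
ds = load_dataset("StormKing99/x_dataset_8191", split="train", streaming=True)

cutoff = "2025-02-01"  # hypothetical cutoff, chosen for illustration

older = ds.filter(lambda ex: ex["datetime"] < cutoff)   # e.g. training portion
newer = ds.filter(lambda ex: ex["datetime"] >= cutoff)  # e.g. evaluation portion
```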

Dataset Creation

Source Data

Data is collected from public tweets on X (Twitter), adhering to the platform's terms of service and API usage guidelines.

Personal and Sensitive Information

All usernames and URLs are encoded to protect user privacy. The dataset does not intentionally include personal or sensitive information.

Considerations for Using the Data

Social Impact and Biases

Users should be aware of potential biases inherent in X (Twitter) data, including demographic and content biases. This dataset reflects the content and opinions expressed on X and should not be considered a representative sample of the general population.

Limitations

  • Data quality may vary due to the decentralized nature of collection and preprocessing.
  • The dataset may contain noise, spam, or irrelevant content typical of social media platforms.
  • Temporal biases may exist due to real-time collection methods.
  • The dataset is limited to public tweets and does not include private accounts or direct messages.
  • Not all tweets contain hashtags or URLs.

Additional Information

Licensing Information

The dataset is released under the MIT license. The use of this dataset is also subject to X Terms of Use.

Citation Information

If you use this dataset in your research, please cite it as follows:

@misc{StormKing992025datauniversex_dataset_8191,
  title={The Data Universe Datasets: The finest collection of social media data the web has to offer},
  author={StormKing99},
  year={2025},
  url={https://huggingface.co/datasets/StormKing99/x_dataset_8191}
}

Contributions

To report issues or contribute to the dataset, please contact the miner or use the Bittensor Subnet 13 governance mechanisms.

Dataset Statistics

[This section is automatically updated]

  • Total Instances: 150425608
  • Date Range: 2025-01-21T00:00:00Z to 2025-02-07T00:00:00Z
  • Last Updated: 2025-02-12T18:47:42Z

Data Distribution

  • Tweets with hashtags: 42.10%
  • Tweets without hashtags: 57.90%

Top 10 Hashtags

For full statistics, please refer to the stats.json file in the repository.

Rank  Topic                           Total Count  Percentage
1     NULL                            85002227     57.31%
2     #riyadh                         1046251      0.71%
3     #zelena                         785878       0.53%
4     #tiktok                         615020       0.41%
5     #bbb25                          382300       0.26%
6     #ad                             358410       0.24%
7     #jhope_at_galadespiècesjaunes   234370       0.16%
8     #bbmzansi                       194620       0.13%
9     #pr                             189419       0.13%
10    #trump                          182679       0.12%
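
The full statistics referenced above can be fetched programmatically. This is a sketch only, assuming the stats.json file sits at the root of the dataset repository; its schema is not documented here.

```python
import json
from huggingface_hub import hf_hub_download

# Download stats.json from the dataset repository and load it.
stats_path = hf_hub_download(
    repo_id="StormKing99/x_dataset_8191",
    filename="stats.json",
    repo_type="dataset",
)
with open(stats_path) as f:
    stats = json.load(f)

print(list(stats)[:10])  # inspect the top-level keys
```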

Update History

Date                  New Instances  Total Instances
2025-01-26T04:23:37Z  2098210        2098210
2025-01-26T04:24:18Z  2162522        4260732
2025-01-29T17:24:35Z  30495898       34756630
2025-02-02T05:41:30Z  28962209       63718839
2025-02-05T17:59:56Z  29099416       92818255
2025-02-09T06:21:50Z  29023092       121841347
2025-02-12T18:47:42Z  28584261       150425608