🚫 Hands Off! This dataset’s locked down—no downloading or messing with it unless you’re cleared. Regular Hugging Face users, this ain’t for you.
# NVIDIA GPU Scraper

This project crunches NVIDIA GPU data in a slick Docker setup. It scrapes the web with `webscraped.py`, then whips up a tidy report with `summary.py`. Think CSV gold like `nvidia_gpu_summary_report.csv`.
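For orientation, here's a minimal sketch of what that two-stage flow could look like. The URL, column names, and pandas-based aggregation are assumptions for illustration only; the real `webscraped.py` and `summary.py` may do things differently.

```python
# pipeline_sketch.py - hypothetical outline of the scrape -> summarize flow.
# Assumes requests + BeautifulSoup for scraping and pandas for the report;
# the URL and column names are placeholders, not the project's actual targets.
import requests
from bs4 import BeautifulSoup
import pandas as pd

REPORT_CSV = "nvidia_gpu_summary_report.csv"  # output name used in this README


def scrape(url: str = "https://example.com/nvidia-gpus") -> pd.DataFrame:
    """Pull a GPU spec table from a page and return it as a DataFrame."""
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for tr in soup.select("table tr")[1:]:  # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if len(cells) >= 2:
            rows.append({"model": cells[0], "memory": cells[1]})
    return pd.DataFrame(rows)


def summarize(df: pd.DataFrame) -> None:
    """Write a tidy report, e.g. how many models ship with each memory size."""
    report = df.groupby("memory", as_index=False).agg(models=("model", "count"))
    report.to_csv(REPORT_CSV, index=False)


if __name__ == "__main__":
    summarize(scrape())
```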
## What You Need
- Docker on your machine.
- Basic command-line chops.
## Quick Start

### Project Layout

```
nvidia_project/
├── Dockerfile
├── requirements.txt
├── webscraped.py
├── summary.py
└── README.md
```
### 1. Build It

Fire up the Docker image—all dependencies baked in, no virtualenv nonsense:

```bash
docker build -t nvidia_project .
```
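The `Dockerfile` itself isn't shown here, so the version below is only a plausible sketch: it assumes a slim Python base image, the two scripts copied into `/app`, and `webscraped.py` running before `summary.py`. Check it against the project's actual `Dockerfile`.

```dockerfile
# Hypothetical Dockerfile sketch: base image, paths, and command are assumptions.
FROM python:3.11-slim

WORKDIR /app

# Bake the dependencies into the image (no virtualenv needed).
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the two scripts this README references.
COPY webscraped.py summary.py ./

# Scrape first, then build the summary report.
CMD ["sh", "-c", "python webscraped.py && python summary.py"]
```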
### 2. Run It

Spin the container and let the scripts rip:

```bash
docker run --rm -it nvidia_project
```

- `--rm`: Cleans up after itself.
- `-it`: Keeps you in the loop with a terminal.
### 3. Grab the Goods

Snag the output CSV from the container:

```bash
docker cp $(docker ps -l -q):/app/nvidia_gpu_summary_report.csv /your/local/path
```

Example:

```bash
docker cp $(docker ps -l -q):/app/nvidia_gpu_summary_report.csv /Users/niladridas/Desktop/nvidia_doc
```
## Pushing to Hugging Face

Want it on Hugging Face? Here's the drill:

1. **Get the Tools:**
   ```bash
   pip install datasets
   ```
2. **Prep the Data:** Make sure it's in JSON or CSV shape.
3. **Upload It:**
   ```bash
   huggingface-cli upload bniladridas/nvidia-gpu-dataset /path/to/your/data --repo-type dataset
   ```
   (Swap `bniladridas/nvidia-gpu-dataset` for your own dataset name if needed.)
4. **Spice It Up:** Add a dataset card with the juicy details.
5. **Check It:** Hit Hugging Face to confirm it's live and legit.
More deets? Peek at the Hugging Face docs.
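Prefer to stay in Python? The `datasets` library installed above can push the CSV straight to the Hub. A minimal sketch, assuming the report sits at `nvidia_gpu_summary_report.csv` locally and you're already logged in via `huggingface-cli login`:

```python
# push_sketch.py: hypothetical upload using the datasets library.
from datasets import load_dataset

# Load the report CSV produced by the container (local path assumed).
gpu_data = load_dataset("csv", data_files="nvidia_gpu_summary_report.csv")

# Push to the Hub; swap in your own dataset name if needed.
gpu_data.push_to_hub("bniladridas/nvidia-gpu-dataset")
```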
## Debugging

Stuff breaking? Dive in:

**Peek Inside:**

```bash
docker run -it nvidia_project /bin/sh
```

Scope out `/app` with:

```bash
ls -l /app
```

**Read the Tea Leaves:**

```bash
docker logs $(docker ps -l -q)
```
## Pro Tips

- **Docker’s Your Friend:** No need to fuss with `source .venv/bin/activate`—it’s all contained.
- Keep it lean—let the container handle the heavy lifting.
- Double-check your `Dockerfile` copies `webscraped.py` and `summary.py` to `/app` and sets the entrypoint right.
- Tweak that `docker cp` path to wherever you want the CSV to land.