| column | dtype | min | max | notes |
|---|---|---|---|---|
| id | int64 | 959M | 2.55B | |
| title | string (lengths) | 3 | 133 | |
| body | string (lengths) | 1 | 65.5k | nullable (βŒ€) |
| description | string (lengths) | 5 | 65.6k | `title` + ": " + `body` |
| state | string (2 classes) | | | |
| created_at | string (lengths) | 20 | 20 | |
| updated_at | string (lengths) | 20 | 20 | |
| closed_at | string (lengths) | 20 | 20 | nullable (βŒ€) |
| user | string (174 classes) | | | |
1,981,190,891
Add missing runners to job runner factory
state: closed · created_at: 2023-11-07T12:01:16Z · updated_at: 2023-11-07T12:10:33Z · closed_at: 2023-11-07T12:10:22Z · user: lhoestq
1,981,048,062
create /healthcheck and /metrics endpoints in workers
fixes #2067
- [x] Publish /metrics and /healthcheck endpoints in workers
- [x] Check worker's health with the new endpoint
- [x] Fetch worker's Prometheus metrics
- [x] Instrument parts of the worker's code
state: closed · created_at: 2023-11-07T10:37:40Z · updated_at: 2023-11-08T18:22:55Z · closed_at: 2023-11-08T18:22:54Z · user: severo
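A minimal sketch of what these two endpoints can look like in a worker process, using Starlette and `prometheus_client` (illustrative only, not the project's actual code):

```python
# Minimal sketch: serve /healthcheck and /metrics from a worker process.
from prometheus_client import CONTENT_TYPE_LATEST, generate_latest
from starlette.applications import Starlette
from starlette.requests import Request
from starlette.responses import PlainTextResponse, Response
from starlette.routing import Route

async def healthcheck(request: Request) -> Response:
    # A liveness/readiness probe can poll this endpoint.
    return PlainTextResponse("ok")

async def metrics(request: Request) -> Response:
    # Serialize the default registry in the Prometheus text format.
    return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)

app = Starlette(routes=[Route("/healthcheck", healthcheck), Route("/metrics", metrics)])
```

Run with e.g. `uvicorn module:app`; Prometheus can then scrape `/metrics`.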
1,980,331,649
try to remove assets and cached-assets
Second try to delete all dependencies on assets and cached assets. In https://github.com/huggingface/datasets-server/pull/2040, the reverse proxy failed (maybe because of a wrong nginx configuration file; I didn't find any error log).
state: closed · created_at: 2023-11-07T00:44:07Z · updated_at: 2023-11-07T12:14:49Z · closed_at: 2023-11-07T11:29:26Z · user: AndreaFrancis
1,979,752,070
Instrument the workers to profile jobs duration
The workers do not send metrics to Prometheus, which makes it hard to understand where the time is spent.
state: closed · created_at: 2023-11-06T18:04:49Z · updated_at: 2023-11-08T18:22:55Z · closed_at: 2023-11-08T18:22:55Z · user: severo
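A minimal sketch of such instrumentation with `prometheus_client`; the metric name and label are illustrative:

```python
# Sketch: time each job with a Prometheus histogram, labeled by job type.
from prometheus_client import Histogram

JOB_DURATION_SECONDS = Histogram(
    "worker_job_duration_seconds",
    "Time spent computing a job.",
    ["job_type"],
)

def run_job(job_type: str, compute) -> None:
    # time() observes the elapsed time even if compute() raises.
    with JOB_DURATION_SECONDS.labels(job_type=job_type).time():
        compute()
```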
1,979,667,830
Revert delete assets
state: closed · created_at: 2023-11-06T17:25:08Z · updated_at: 2023-11-06T17:26:51Z · closed_at: 2023-11-06T17:26:49Z · user: AndreaFrancis
1,979,660,702
Revert "remove assets and cached assets (#2040)"
This reverts commit 7d1213c05598e2cb4aabbb829b3df7e4a9cbca6f.
Revert "remove assets and cached assets (#2040)": This reverts commit 7d1213c05598e2cb4aabbb829b3df7e4a9cbca6f.
closed
2023-11-06T17:20:48Z
2023-11-06T17:23:03Z
2023-11-06T17:22:59Z
AndreaFrancis
1,979,610,600
UnexpectedApiError on viewer
https://huggingface.co/datasets/RepoFusion/Stack-Repo/viewer/bm25_contexts

> Error code: UnexpectedApiError
state: closed · created_at: 2023-11-06T16:55:07Z · updated_at: 2023-11-06T17:54:04Z · closed_at: 2023-11-06T17:02:28Z · user: severo
1,979,357,282
fix reverse proxy
After deploying https://github.com/huggingface/datasets-server/pull/2040 in staging, the reverse-proxy component failed (I could not see the specific error, but I suspect it is because of an alias to a location that no longer exists: /storage).
state: closed · created_at: 2023-11-06T14:52:19Z · updated_at: 2023-11-06T16:59:24Z · closed_at: 2023-11-06T16:59:23Z · user: AndreaFrancis
1,979,344,129
Fix cache directory for partial datasets for `split-descriptive-statistics` runner
fixes https://github.com/huggingface/datasets-server/issues/1831. Fix the glob pattern: preserve 'partial' in the split directory name.
state: closed · created_at: 2023-11-06T14:46:20Z · updated_at: 2023-11-06T17:21:41Z · closed_at: 2023-11-06T17:21:40Z · user: polinaeterna
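A hypothetical sketch of the kind of fix described, assuming split directories named like `train` or `partial-train` (the naming is an assumption here):

```python
# Sketch: keep the split directory name as-is ("train" or "partial-train")
# when globbing the downloaded parquet files, instead of reducing it to
# the bare split name. Paths and names are illustrative.
from glob import glob

def local_parquet_files(local_dir: str, split_dir_name: str) -> list[str]:
    return sorted(glob(f"{local_dir}/{split_dir_name}/*.parquet"))
```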
1,978,834,505
Support "maxDays: -1" to disable deleting old cache entries
See https://github.com/huggingface/datasets-server/pull/2060#pullrequestreview-1714775842 for reference.
Support "maxDays: -1" to disable deleting old cache entries: See https://github.com/huggingface/datasets-server/pull/2060#pullrequestreview-1714775842 for reference.
closed
2023-11-06T10:47:56Z
2024-02-06T15:55:02Z
2024-02-06T15:55:02Z
severo
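A minimal sketch of the proposed semantics, assuming a `max_days` setting mapped from `maxDays`:

```python
# Sketch: a negative maxDays disables the cleanup of old cache entries.
from datetime import datetime, timedelta, timezone

def should_delete(updated_at: datetime, max_days: int) -> bool:
    if max_days < 0:  # e.g. "maxDays: -1" -> never delete
        return False
    # updated_at is expected to be timezone-aware here.
    return updated_at < datetime.now(timezone.utc) - timedelta(days=max_days)
```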
1,978,804,114
invalidate the datasets cache after 3 years instead of 3 months
Quick fix to avoid deleting old cache entries (in particular, to be sure to keep the old dataset cache entries for datasets with a dataset script, now that we don't support scripts anymore)
state: closed · created_at: 2023-11-06T10:31:48Z · updated_at: 2023-11-06T10:48:05Z · closed_at: 2023-11-06T10:46:32Z · user: severo
1,978,729,245
UnexpectedError on /rows for splits with images
https://huggingface.co/datasets/diegomiranda/teste-new-repo-1/discussions/1

> The dataset viewer is not working.
>
> Error details:
>
> ```
> Error code: UnexpectedError
> ```
>
> Hi,
>
> I'm encountering an issue with loading images in the dataset. I'm working with five datasets: three of them include both images and CSV files, while the remaining two contain only CSV files. When I test the datasets with only CSV files, everything works fine. However, I'm facing challenges when working with the datasets that include images. I would greatly appreciate any assistance or guidance on resolving this matter.
state: closed · created_at: 2023-11-06T09:51:54Z · updated_at: 2024-07-30T16:08:11Z · closed_at: 2024-07-30T16:08:11Z · user: severo
1,978,612,223
Stop worker loop executor on SIGTERM
Fix #1999.
state: closed · created_at: 2023-11-06T08:53:42Z · updated_at: 2023-11-07T10:29:23Z · closed_at: 2023-11-07T10:29:23Z · user: albertvillanova
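A minimal sketch of the pattern, under the assumption that the executor runs a polling loop; on SIGTERM it finishes the current job and exits instead of being killed mid-job:

```python
# Sketch: graceful SIGTERM handling for a worker loop (illustrative).
import signal
import time

class WorkerLoopExecutor:
    def __init__(self) -> None:
        self.must_stop = False
        signal.signal(signal.SIGTERM, self._handle_sigterm)

    def _handle_sigterm(self, signum, frame) -> None:
        self.must_stop = True  # finish the current job, then exit

    def run(self) -> None:
        while not self.must_stop:
            self.process_one_job()

    def process_one_job(self) -> None:
        time.sleep(1)  # placeholder for the actual job processing
```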
1,976,978,765
The "Recreate dataset" admin action first fails, then works
First click on the button:

<img width="1526" alt="Capture d'écran 2023-11-03 à 22 55 31" src="https://github.com/huggingface/datasets-server/assets/1676121/ebbcb349-ca39-4a6e-9aee-f244ef0ebbf6">

Second click (a few seconds later):

<img width="1526" alt="Capture d'écran 2023-11-03 à 22 55 38" src="https://github.com/huggingface/datasets-server/assets/1676121/96ecc3d3-e5a8-4d95-a20f-2c239589ed7d">
state: closed · created_at: 2023-11-03T21:57:55Z · updated_at: 2023-11-07T20:22:43Z · closed_at: 2023-11-07T19:09:50Z · user: severo
1,976,962,566
feat: 🎸 upgrade nginx image to 1.25.3
fixes #1961. RELEASE NOTES are here: https://nginx.org/en/CHANGES. The change we're interested in is: > *) Change: improved detection of misbehaving clients when using HTTP/2.
state: closed · created_at: 2023-11-03T21:37:53Z · updated_at: 2023-11-04T09:30:39Z · closed_at: 2023-11-04T09:30:38Z · user: severo
1,976,953,097
fix: πŸ› hide the error to avoid disclosing a token
see internal discussion in Slack: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1692803953736769
fix: πŸ› hide the error to avoid disclosing a token: see internal discussion in Slack: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1692803953736769
closed
2023-11-03T21:27:33Z
2023-11-06T17:17:57Z
2023-11-06T17:17:56Z
severo
1,976,941,021
Use ruff in vscode
Can you test if it works for you? (edit: I'm merging, but still: please give me feedback if it breaks something) For me, after some struggle, it works well. A nice improvement is that it will sort the imports. https://github.com/huggingface/datasets-server/assets/1676121/ba698454-b6df-42d6-b402-676b6d030e8c
state: closed · created_at: 2023-11-03T21:15:07Z · updated_at: 2023-11-03T21:17:01Z · closed_at: 2023-11-03T21:16:31Z · user: severo
1,976,560,588
Delete duckdb files when a split is deleted
See https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k/tree/refs%2Fconvert%2Fparquet/default/train. The `train` split existed, but has been deleted; the dataset now has `train_sft` and `train_gen`. But the duckdb index file still exists for `train` in `refs/convert/parquet`. Note that the parquet files have been deleted.
state: closed · created_at: 2023-11-03T16:33:43Z · updated_at: 2024-01-19T14:40:11Z · closed_at: 2024-01-19T14:12:47Z · user: severo
1,976,529,844
Support JWT on cookies
private conversation: https://huggingface.slack.com/archives/D030YA5BW91/p1696507761676679

When a user goes to https://huggingface.co/datasets/emrgnt-cmplxty/sciphi-textbooks-are-all-you-need, moonlanding can set a cookie on `datasets-server.huggingface.co` with the name `hf_jwt_[sha256(/datasets/emrgnt-cmplxty/sciphi-textbooks-are-all-you-need)]` and the JWT as the value. This cookie would be read by the datasets server when accessing a gated dataset.

Doing so would greatly simplify the code on the Hub (moonlanding) by removing the need to refresh the JWT (one endpoint removed) and by avoiding the frontend logic that refreshes the JWT. It would be a security improvement too, because the Hub's frontend code (JavaScript) would no longer have access to the JWT (the browser adds the cookie to the HTTP request directly).
state: closed · created_at: 2023-11-03T16:17:20Z · updated_at: 2024-03-13T09:49:15Z · closed_at: 2024-03-13T09:49:14Z · user: severo
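A sketch of the cookie naming described above; the hex digest and the exact path format are assumptions:

```python
# Sketch: compute the cookie name hf_jwt_[sha256(path)] (illustrative).
from hashlib import sha256

def jwt_cookie_name(repo_path: str) -> str:
    return f"hf_jwt_{sha256(repo_path.encode('utf-8')).hexdigest()}"

# e.g. for the dataset mentioned above:
name = jwt_cookie_name("/datasets/emrgnt-cmplxty/sciphi-textbooks-are-all-you-need")
```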
1,976,239,331
Update index.mdx
Add a sentence to highlight conversion to Parquet
state: closed · created_at: 2023-11-03T13:44:58Z · updated_at: 2023-11-03T13:51:41Z · closed_at: 2023-11-03T13:51:10Z · user: severo
1,976,215,817
Should we support video datasets?
Like https://huggingface.co/datasets/commaai/commavq. There was a previous attempt in datasets: https://github.com/huggingface/datasets/pull/5339
state: closed · created_at: 2023-11-03T13:33:00Z · updated_at: 2023-12-11T15:04:08Z · closed_at: 2023-12-11T15:04:08Z · user: severo
1,975,982,656
Retry jobs that finish with `ClientConnection` error?
Maybe here: https://github.com/huggingface/datasets-server/blob/f311a9212aaa91dd0373e5c2d4f5da9b6bdabcb5/chart/env/prod.yaml#L209. Internal conversation on Slack: https://huggingface.slack.com/archives/C0311GZ7R6K/p1698224875005729. Anyway, I'm wondering if we can still get this error now that dataset scripts are disabled by default.
state: closed · created_at: 2023-11-03T11:28:19Z · updated_at: 2024-02-06T17:29:45Z · closed_at: 2024-02-06T17:29:45Z · user: severo
1,975,936,480
Improve selection of the next job to prioritize datasets with few configs/splits
Internal Slack conversation: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698323643019099?thread_ts=1698323528.797149&cid=C04L6P8KNQ5

> `opus_euconst` is squatting the pods, and the small datasets are waiting for too long before being processed.

![image](https://github.com/huggingface/datasets-server/assets/1676121/976df781-fda8-4072-9133-9a2cdd273eb6)

> maybe switch it to "low" priority?

> It could be a good solution for these big datasets, indeed! If the number of configs or splits is over some threshold -> change the priority to low.

> I like it, because it would not change our bottleneck query, which is to find the next job to process.
state: closed · created_at: 2023-11-03T10:59:19Z · updated_at: 2024-02-06T17:04:07Z · closed_at: 2024-02-06T17:04:06Z · user: severo
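A minimal sketch of the quoted heuristic; the threshold and the priority names are illustrative:

```python
# Sketch: demote datasets with many configs/splits to low priority.
CONFIGS_OR_SPLITS_THRESHOLD = 100  # illustrative threshold

def job_priority(num_configs_or_splits: int) -> str:
    return "low" if num_configs_or_splits > CONFIGS_OR_SPLITS_THRESHOLD else "normal"
```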
1,975,908,066
descriptive-statistics job is sometimes stuck
Conversation on Slack (internal): https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698325340414399?thread_ts=1698323528.797149&cid=C04L6P8KNQ5

> Note that stats jobs get stuck very often for some reason, I think Polina was investigating
> e.g. this job has been stuck for 30min at the duckdb data ingestion step (only 22MB of data)

```
INFO: 2023-10-26 12:44:15,764 - root - [split-descriptive-statistics] compute JobManager(job_id=653a5e8e2a3bd4b73b210849 dataset=onuralp/open-otter job_info={'job_id': '653a5e8e2a3bd4b73b210849', 'type': 'split-descriptive-statistics', 'params': {'dataset': 'onuralp/open-otter', 'revision': '17db84f97d83d3d782869fa3767b812ba4f7e407', 'config': 'default', 'split': 'train'}, 'priority': <Priority.NORMAL: 'normal'>, 'difficulty': 70}
INFO: 2023-10-26 12:44:15,771 - root - Compute descriptive statistics for dataset='onuralp/open-otter', config='default', split='train'
INFO: 2023-10-26 12:44:15,774 - root - Downloading remote parquet files to a local directory /storage/stats-cache/89781749453102-split-descriptive-statistics-onuralp-open-otter-776f2760.
Downloading 0000.parquet: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 22.3M/22.3M [00:01<00:00, 17.0MB/s]
INFO: 2023-10-26 12:47:00,397 - root - Original number of threads=2
INFO: 2023-10-26 12:47:00,398 - root - Current number of threads=8
INFO: 2023-10-26 12:47:00,398 - root - Original max_memory='105.7GB'
INFO: 2023-10-26 12:47:00,399 - root - Current max_memory='28.0GB'
INFO: 2023-10-26 12:47:00,399 - root - Loading data into in-memory table.
INFO: 2023-10-26 12:47:00,399 - root - CREATE OR REPLACE TABLE data AS SELECT "input","output","instruction","data_source" FROM read_parquet('/storage/stats-cache/89781749453102-split-descriptive-statistics-onuralp-open-otter-776f2760/default/train/*.parquet');
```

> (running this duckdb command in a shell is instantaneous !!) (edited)
> (and the subsequent duckdb-index job took a few seconds)
state: closed · created_at: 2023-11-03T10:44:55Z · updated_at: 2024-07-30T17:07:22Z · closed_at: 2024-07-30T17:07:22Z · user: severo
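To narrow this down, the ingestion step from the log can be replayed outside the worker with the same settings the log reports; a sketch (paths are illustrative):

```python
# Sketch: replay the duckdb ingestion step with the settings from the log.
import duckdb

con = duckdb.connect()  # in-memory database
con.execute("SET threads TO 8")
con.execute("SET memory_limit = '28.0GB'")
con.execute(
    "CREATE OR REPLACE TABLE data AS "
    'SELECT "input","output","instruction","data_source" '
    "FROM read_parquet('/tmp/stats-cache/default/train/*.parquet')"
)
```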
1,975,787,190
Fix Hub datasets after updating datasets to 2.14.6
After deploying PR:
- #2007

We need to fix/refresh Hub datasets:
- [x] Raise `DefunctDatasetError` from datasets with taken down data
- [x] Refresh viewer of datasets with `metadata.csv`
  - [x] 27 datasets, e.g. https://huggingface.co/datasets/lukarape/public_small_papers/discussions/1
- [x] Refresh viewer of datasets with 'data' word twice
  - [x] 270 datasets, e.g. https://huggingface.co/datasets/piuba-bigdata/articles_and_comments/discussions/1
state: closed · created_at: 2023-11-03T09:36:10Z · updated_at: 2024-02-06T18:27:29Z · closed_at: 2024-02-06T18:27:29Z · user: albertvillanova
1,974,386,677
Audio asset with pydub conversion fails to write on S3
See https://huggingface.co/datasets/zeio/auto-pale/discussions/1 (ogg audio files)

```
File "/src/libs/libcommon/src/libcommon/viewer_utils/asset.py", line 111, in create_audio_file
    segment.export(f, format=suffix[1:])
File "/src/services/worker/.venv/lib/python3.9/site-packages/pydub/audio_segment.py", line 868, in export
    out_f.seek(0)
File "/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py", line 1737, in seek
    raise OSError(ESPIPE, "Seek only available in read mode")
OSError: [Errno 29] Seek only available in read mode
```

We might have to write to a BytesIO buffer before writing the actual audio asset file, since fsspec somehow doesn't allow seek on files opened in "wb" mode.
state: closed · created_at: 2023-11-02T14:30:10Z · updated_at: 2023-11-16T12:45:39Z · closed_at: 2023-11-16T12:45:39Z · user: lhoestq
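A sketch of the suggested workaround; `create_audio_file` here is a simplified stand-in for the real helper:

```python
# Sketch: export with pydub into a seekable in-memory buffer, then copy
# the bytes to the fsspec file, which does not support seek in "wb" mode.
from io import BytesIO

import fsspec
from pydub import AudioSegment

def create_audio_file(segment: AudioSegment, url: str, audio_format: str) -> None:
    buffer = BytesIO()
    segment.export(buffer, format=audio_format)  # pydub can seek here
    with fsspec.open(url, "wb") as f:
        f.write(buffer.getvalue())
```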
1,974,288,216
Support ruff in vscode
See https://github.com/huggingface/datasets-server/pull/2031#issuecomment-1790455688 and https://github.com/huggingface/datasets-server/pull/2031#issuecomment-1790718911
state: closed · created_at: 2023-11-02T13:41:26Z · updated_at: 2023-11-10T15:07:41Z · closed_at: 2023-11-10T15:07:40Z · user: severo
1,973,807,306
Explain single quote string SQL syntax in filter docs
Explain single quote string SQL syntax in filter docs. Close #2038.
state: closed · created_at: 2023-11-02T09:07:45Z · updated_at: 2023-11-02T13:53:02Z · closed_at: 2023-11-02T13:52:33Z · user: albertvillanova
1,973,757,952
Use hf-doc-builder version from GitHub to build docs locally
This PR aligns the `hf-doc-builder` version used to build docs locally with the one used by our CI: now both use the version in the main branch of the GitHub repository: https://github.com/huggingface/doc-builder

Note that before this PR, local docs building:
- used an old version of `hf-doc-builder` (0.3.0) released on Apr 29, 2022: https://pypi.org/project/hf-doc-builder/0.3.0/
- that version had a bug (it didn't install the required package `black`), which we were avoiding by requiring the "quality" extra

Now:
- the above-mentioned bug has been fixed in the main branch; see:
  - https://github.com/huggingface/doc-builder/pull/434
- we no longer need to install the "quality" extra, so no unnecessary packages are installed: `flake8`, `isort`, `mccabe`, `pycodestyle`, `pyflakes`,...
state: closed · created_at: 2023-11-02T08:35:46Z · updated_at: 2023-11-02T12:50:09Z · closed_at: 2023-11-02T12:49:36Z · user: albertvillanova
1,970,839,681
Add information about filter feature to the Hub's docs on the Dataset Viewer
Maybe I didn't find it but I think it's missing, at least [here](https://huggingface.co/docs/hub/datasets-viewer)
state: closed · created_at: 2023-10-31T16:24:18Z · updated_at: 2024-02-06T15:45:25Z · closed_at: 2024-02-06T15:45:25Z · user: polinaeterna
1,970,662,057
remove assets and cached assets
Since /storage/assets and /storage/cached-assets are no longer needed for the staging and prod environments, I am removing all chart/k8s related artifacts:
- persistence claim
- containers
- volume mounts
- pvc
state: closed · created_at: 2023-10-31T14:58:06Z · updated_at: 2023-11-06T14:13:24Z · closed_at: 2023-11-06T14:13:23Z · user: AndreaFrancis
1,970,262,386
Revert "Revert assets fsspec (#2036)"
This reverts commit 5716638a683d83652da672f0f59bf68935d0a93e.
Revert "Revert assets fsspec (#2036)": This reverts commit 5716638a683d83652da672f0f59bf68935d0a93e.
closed
2023-10-31T11:36:27Z
2023-10-31T12:13:58Z
2023-10-31T12:13:57Z
AndreaFrancis
1,969,315,934
How to pass single quote in /filter endpoint "where" parameter?
See `https://huggingface.co/datasets/albertvillanova/lm_en_dummy2/viewer/default/train?f[meta][value]='{'file': 'file_4.txt'}'`

From `https://datasets-server.huggingface.co/filter?dataset=albertvillanova/lm_en_dummy2&config=default&split=train&where=meta='{'file': 'file_4.txt'}'`, we get:

```
{"error":"Parameter 'where' is invalid"}
```

We want to search for the value `{'file': 'file_4.txt'}` in the column `meta`.
state: closed · created_at: 2023-10-30T22:21:24Z · updated_at: 2023-11-02T17:22:54Z · closed_at: 2023-11-02T13:52:34Z · user: severo
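For reference, the standard SQL escape for a single quote inside a string literal is to double it (this is what the docs change above explains); a small sketch:

```python
# Sketch: quote a value for a SQL string literal by doubling single quotes.
def sql_string_literal(value: str) -> str:
    return "'" + value.replace("'", "''") + "'"

where = "meta=" + sql_string_literal("{'file': 'file_4.txt'}")
# -> meta='{''file'': ''file_4.txt''}'
```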
1,969,160,211
feat: 🎸 allow dataset script on togethercomputer/RedPajama-Data
state: closed · created_at: 2023-10-30T20:29:46Z · updated_at: 2023-10-30T20:36:23Z · closed_at: 2023-10-30T20:36:22Z · user: severo
1,969,087,295
Revert assets fsspec
Once deployed in the staging env, it looks like URL path resolution is not working for assets and cached-assets, so I will revert the related PRs. For example, the image https://datasets-server.us.dev.moon.huggingface.tech/assets/asoria/bluey/--/2479b12a83d3ac64cecf3ef1c33cc3b2fbf88fb1/--/default/train/0/image/image.png can't be found.
state: closed · created_at: 2023-10-30T19:40:34Z · updated_at: 2023-10-30T20:09:42Z · closed_at: 2023-10-30T20:09:41Z · user: AndreaFrancis
1,969,025,760
Update pip to 23.3.1 to fix vulnerability
Update `pip` to 23.3.1 to fix vulnerability. See: https://github.com/huggingface/datasets-server/actions/runs/6697364484/job/18197081786?pr=2031

```
Found 1 known vulnerability in 1 package
Name Version ID                  Fix Versions
---- ------- ------------------- ------------
pip  23.1.2  GHSA-mq26-g339-26xf 23.3
```
state: closed · created_at: 2023-10-30T18:59:39Z · updated_at: 2023-10-30T19:45:04Z · closed_at: 2023-10-30T19:45:03Z · user: albertvillanova
1,968,937,161
Install `ps` and `htop` in workers to monitor processes inside pods
also added them to other services in case we ever need them
state: closed · created_at: 2023-10-30T18:01:15Z · updated_at: 2023-10-31T15:33:13Z · closed_at: 2023-10-31T15:33:12Z · user: polinaeterna
1,968,658,253
Add S3 configs to admin
state: closed · created_at: 2023-10-30T15:33:27Z · updated_at: 2023-10-30T15:34:20Z · closed_at: 2023-10-30T15:34:18Z · user: AndreaFrancis
1,968,607,482
Fix assets volumes
state: closed · created_at: 2023-10-30T15:10:35Z · updated_at: 2023-10-30T15:14:39Z · closed_at: 2023-10-30T15:14:38Z · user: AndreaFrancis
1,968,553,996
Use ruff for CI quality
Use `ruff` for CI quality: 10-100x faster than existing linters (like Flake8) and formatters (like Black). Note this is most useful when developing: `make style` is now much faster and additionally runs the linter (before, `flake8` was only called by `make quality`).
state: closed · created_at: 2023-10-30T14:47:14Z · updated_at: 2023-11-02T14:01:49Z · closed_at: 2023-11-02T14:01:48Z · user: albertvillanova
1,967,941,143
Support async file I/O to get duckdb index file
Support asynchronous file I/O to get duckdb index file by replacing `pathlib.Path` with `anyio.Path`.
state: closed · created_at: 2023-10-30T09:50:48Z · updated_at: 2023-10-30T12:34:09Z · closed_at: 2023-10-30T12:34:08Z · user: albertvillanova
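A minimal sketch of the swap, with an illustrative helper:

```python
# Sketch: anyio.Path mirrors pathlib.Path with awaitable methods, so
# file checks don't block the event loop (names are illustrative).
from anyio import Path

async def index_file_exists(index_folder: str, filename: str) -> bool:
    return await (Path(index_folder) / filename).exists()
```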
1,967,882,566
Update pandas dependency to 2.0.3
Update `pandas` dependency to the latest patch release 2.0.3 in the 2.0 series: https://github.com/pandas-dev/pandas/releases/tag/v2.0.3

> This is a patch release in the 2.0.x series and includes some regression and bug fixes. We recommend that all users upgrade to this version.

Note that most components had `pandas` 2.0.2. Also note that we have respected the constraint `<2.1` until we update `duckdb` as well. More details in:
- #1914
state: closed · created_at: 2023-10-30T09:19:08Z · updated_at: 2023-10-30T10:02:19Z · closed_at: 2023-10-30T10:02:18Z · user: albertvillanova
1,966,126,224
adding tests for recreate dataset
Adding tests for "recreate dataset" in the admin service. Also found a bug in cancel_jobs: it was always returning 0 after the update statement.
state: closed · created_at: 2023-10-27T20:04:15Z · updated_at: 2023-10-30T11:30:21Z · closed_at: 2023-10-30T11:30:20Z · user: AndreaFrancis
1,965,849,698
fix duckdb circular import
When deploying to staging (and also locally), the following error appeared for the /search service:

![Screenshot from 2023-10-27 12-34-59](https://github.com/huggingface/datasets-server/assets/5564745/78784b8f-94f8-41fa-8a7e-6a473bb35f2c)
state: closed · created_at: 2023-10-27T16:35:20Z · updated_at: 2023-10-27T17:45:23Z · closed_at: 2023-10-27T16:59:55Z · user: AndreaFrancis
1,965,756,311
Make create_response async and run _transform_row map async in separate thread
Make `create_response` (and inner functions) asynchronous and run `_transform_row` map asynchronously in separate thread.
state: closed · created_at: 2023-10-27T15:28:21Z · updated_at: 2023-10-31T16:03:25Z · closed_at: 2023-10-31T16:03:23Z · user: albertvillanova
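A minimal sketch of the pattern with `anyio` (function names are illustrative, not the service's actual signatures):

```python
# Sketch: run the synchronous row-transformation map in a worker thread
# from async code, so the event loop stays responsive.
from anyio import to_thread

def _transform_rows(rows: list, transform) -> list:
    return [transform(row) for row in rows]

async def create_response(rows: list, transform) -> list:
    return await to_thread.run_sync(_transform_rows, rows, transform)
```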
1,965,475,341
2023 reduce number comments
fixes #2023

note: `__PARQUET_CONVERTER_USER__` will be shown as `parquet-converter` in prod, without the bold formatting (due to the `__` characters).

<img width="597" alt="Capture d'écran 2023-10-27 à 14 53 26" src="https://github.com/huggingface/datasets-server/assets/1676121/df164999-709c-4afe-b825-c547ce7d4ca6">

---

<img width="767" alt="Capture d'écran 2023-10-27 à 15 01 00" src="https://github.com/huggingface/datasets-server/assets/1676121/08c3e0fb-a8b1-4950-ae88-b21bf0797d60">
state: closed · created_at: 2023-10-27T12:56:59Z · updated_at: 2023-10-30T20:57:12Z · closed_at: 2023-10-30T20:57:11Z · user: severo
1,965,382,173
Reduce the amount of comments after parquet conversion
See https://huggingface.co/datasets/open-source-metrics/datasets-dependents/discussions/1#6537d488b33b913282d076de

> Would it make sense to just update the original message to add the different revisions that have been converted?

> I agree that editing the original might be better. Translating this same situation to GitHub, I would not be happy with a bot that pings for every single commit. For example, the doc-builder bot does not write a comment for every commit in a PR; it just does a single comment and the link will work. I don't think most users will want to get a new notification for every single commit. We can also not silence notifications at a repo level, so as a user I would be inclined to silence the whole org.

Let's just create a discussion that points to the refs/convert/parquet page, and ignore if a discussion already exists.
state: closed · created_at: 2023-10-27T12:03:42Z · updated_at: 2023-10-30T20:57:12Z · closed_at: 2023-10-30T20:57:12Z · user: severo
1,965,321,728
fix: πŸ› do not set the JWT public key URL by default
It's only useful for the Hugging Face production datasets-server, since any other instance would not receive requests from the Hugging Face Hub. Fixes #2019
state: closed · created_at: 2023-10-27T11:27:22Z · updated_at: 2023-10-27T11:28:06Z · closed_at: 2023-10-27T11:28:05Z · user: severo
1,965,181,589
Remove duckdb as libapi dependency
Remove `duckdb` from libapi by moving the function `duckdb_connect` to the "search" service. In the future, if this function is required by other components, I would suggest creating a new duckdb-specific lib, so that we decouple API and DuckDB functionalities.

Additionally, update all components that depend on `libapi` and remove their dependency on `duckdb` if they do not need it. These are:
- admin
- api
- rows
- sse-api
state: closed · created_at: 2023-10-27T10:01:00Z · updated_at: 2023-10-27T11:19:16Z · closed_at: 2023-10-27T11:19:15Z · user: albertvillanova
1,965,046,181
Update duckdb minor version to 0.8.1 in admin_ui
Update `duckdb` dependency minor version to 0.8.1 in admin_ui, which is a patch version that includes only bug fixes: https://github.com/duckdb/duckdb/releases/tag/v0.8.1
state: closed · created_at: 2023-10-27T08:39:57Z · updated_at: 2023-10-27T09:20:52Z · closed_at: 2023-10-27T09:20:51Z · user: albertvillanova
1,964,976,711
Some jwt error after `make start` the newest dataset-server
My system is Windows10 and use the newest version of Docker Desktop, which the wsl engine was go on to execute docker together with WSL2. My WSL2 system is Ubuntu-22.04, and I cloned the newest version of datasets-server in it. Then I ran `make start` to build the server and successfully execute and can be shown in Docker Desktop dashboard. When i saw the log about them, I found the worker docker: api, rows and search docker occred the same error as follows, which make the proxy docker receive 502 when i requests them: ``` bash 2023-10-27 15:11:50 INFO: Stopping parent process [1] 2023-10-27 15:11:52 INFO: Uvicorn running on http://0.0.0.0:8180 (Press CTRL+C to quit) 2023-10-27 15:11:52 INFO: Started parent process [1] 2023-10-27 15:11:53 Process SpawnProcess-2: 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 169, in start_tls 2023-10-27 15:11:53 sock = ssl_context.wrap_socket( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 501, in wrap_socket 2023-10-27 15:11:53 return self.sslsocket_class._create( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1041, in _create 2023-10-27 15:11:53 self.do_handshake() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1310, in do_handshake 2023-10-27 15:11:53 self._sslobj.do_handshake() 2023-10-27 15:11:53 socket.timeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 262, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 245, in handle_request 2023-10-27 15:11:53 response = connection.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 99, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 76, in handle_request 2023-10-27 15:11:53 stream = self._connect(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 156, in _connect 2023-10-27 15:11:53 stream = stream.start_tls(**kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 
self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions 2023-10-27 15:11:53 raise to_exc(exc) from exc 2023-10-27 15:11:53 httpcore.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 177, in fetch_jwt_public_key_json 2023-10-27 15:11:53 response = httpx.get(url, timeout=hf_timeout_seconds) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 189, in get 2023-10-27 15:11:53 return request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 100, in request 2023-10-27 15:11:53 return client.request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 814, in request 2023-10-27 15:11:53 return self.send(request, auth=auth, follow_redirects=follow_redirects) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 901, in send 2023-10-27 15:11:53 response = self._send_handling_auth( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 929, in _send_handling_auth 2023-10-27 15:11:53 response = self._send_handling_redirects( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 966, in _send_handling_redirects 2023-10-27 15:11:53 response = self._send_single_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 1002, in _send_single_request 2023-10-27 15:11:53 response = transport.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions 2023-10-27 15:11:53 raise mapped_exc(message) from exc 2023-10-27 15:11:53 httpx.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 220, in get_jwt_public_keys 2023-10-27 15:11:53 payload = fetch_jwt_public_key_json( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 181, in fetch_jwt_public_key_json 2023-10-27 15:11:53 raise RuntimeError(f"Failed to fetch the JWT public key from {url}. ") from err 2023-10-27 15:11:53 RuntimeError: Failed to fetch the JWT public key from https://huggingface.co/api/keys/jwt. 
2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap 2023-10-27 15:11:53 self.run() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 108, in run 2023-10-27 15:11:53 self._target(*self._args, **self._kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/_subprocess.py", line 76, in subprocess_started 2023-10-27 15:11:53 target(sockets=sockets) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 60, in run 2023-10-27 15:11:53 return asyncio.run(self.serve(sockets=sockets)) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run 2023-10-27 15:11:53 return loop.run_until_complete(main) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete 2023-10-27 15:11:53 return future.result() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 67, in serve 2023-10-27 15:11:53 config.load() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/config.py", line 483, in load 2023-10-27 15:11:53 self.loaded_app = self.loaded_app() 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 28, in create_app 2023-10-27 15:11:53 return create_app_with_config(app_config=app_config, endpoint_config=endpoint_config) 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 37, in create_app_with_config 2023-10-27 15:11:53 hf_jwt_public_keys = get_jwt_public_keys( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 232, in get_jwt_public_keys 2023-10-27 15:11:53 raise JWTKeysError("Failed to create the JWT public keys.") from err 2023-10-27 15:11:53 libapi.exceptions.JWTKeysError: Failed to create the JWT public keys. 
2023-10-27 15:11:53 Process SpawnProcess-1: 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 169, in start_tls 2023-10-27 15:11:53 sock = ssl_context.wrap_socket( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 501, in wrap_socket 2023-10-27 15:11:53 return self.sslsocket_class._create( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1041, in _create 2023-10-27 15:11:53 self.do_handshake() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1310, in do_handshake 2023-10-27 15:11:53 self._sslobj.do_handshake() 2023-10-27 15:11:53 socket.timeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 262, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 245, in handle_request 2023-10-27 15:11:53 response = connection.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 99, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 76, in handle_request 2023-10-27 15:11:53 stream = self._connect(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 156, in _connect 2023-10-27 15:11:53 stream = stream.start_tls(**kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions 2023-10-27 15:11:53 raise to_exc(exc) from exc 2023-10-27 15:11:53 httpcore.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 177, in fetch_jwt_public_key_json 2023-10-27 15:11:53 response = httpx.get(url, timeout=hf_timeout_seconds) 2023-10-27 15:11:53 File 
"/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 189, in get 2023-10-27 15:11:53 return request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 100, in request 2023-10-27 15:11:53 return client.request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 814, in request 2023-10-27 15:11:53 return self.send(request, auth=auth, follow_redirects=follow_redirects) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 901, in send 2023-10-27 15:11:53 response = self._send_handling_auth( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 929, in _send_handling_auth 2023-10-27 15:11:53 response = self._send_handling_redirects( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 966, in _send_handling_redirects 2023-10-27 15:11:53 response = self._send_single_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 1002, in _send_single_request 2023-10-27 15:11:53 response = transport.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions 2023-10-27 15:11:53 raise mapped_exc(message) from exc 2023-10-27 15:11:53 httpx.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 220, in get_jwt_public_keys 2023-10-27 15:11:53 payload = fetch_jwt_public_key_json( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 181, in fetch_jwt_public_key_json 2023-10-27 15:11:53 raise RuntimeError(f"Failed to fetch the JWT public key from {url}. ") from err 2023-10-27 15:11:53 RuntimeError: Failed to fetch the JWT public key from https://huggingface.co/api/keys/jwt. 
2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap 2023-10-27 15:11:53 self.run() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 108, in run 2023-10-27 15:11:53 self._target(*self._args, **self._kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/_subprocess.py", line 76, in subprocess_started 2023-10-27 15:11:53 target(sockets=sockets) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 60, in run 2023-10-27 15:11:53 return asyncio.run(self.serve(sockets=sockets)) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run 2023-10-27 15:11:53 return loop.run_until_complete(main) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete 2023-10-27 15:11:53 return future.result() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 67, in serve 2023-10-27 15:11:53 config.load() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/config.py", line 483, in load 2023-10-27 15:11:53 self.loaded_app = self.loaded_app() 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 28, in create_app 2023-10-27 15:11:53 return create_app_with_config(app_config=app_config, endpoint_config=endpoint_config) 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 37, in create_app_with_config 2023-10-27 15:11:53 hf_jwt_public_keys = get_jwt_public_keys( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 232, in get_jwt_public_keys 2023-10-27 15:11:53 raise JWTKeysError("Failed to create the JWT public keys.") from err 2023-10-27 15:11:53 libapi.exceptions.JWTKeysError: Failed to create the JWT public keys. ``` At first I thought the main error is network and i use `ping huggingface.co` and `curl https://huggingface.co/api/keys/jwt` in these dockers, but I had 200 request later, which means the network in dockers are not the main reason cause the prolems. Can anyone give a solutions or mothod to solve this bug, thanks !
2023-10-27 15:11:53 Process SpawnProcess-1: 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 169, in start_tls 2023-10-27 15:11:53 sock = ssl_context.wrap_socket( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 501, in wrap_socket 2023-10-27 15:11:53 return self.sslsocket_class._create( 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1041, in _create 2023-10-27 15:11:53 self.do_handshake() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/ssl.py", line 1310, in do_handshake 2023-10-27 15:11:53 self._sslobj.do_handshake() 2023-10-27 15:11:53 socket.timeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 66, in map_httpcore_exceptions 2023-10-27 15:11:53 yield 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 262, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection_pool.py", line 245, in handle_request 2023-10-27 15:11:53 response = connection.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 99, in handle_request 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 76, in handle_request 2023-10-27 15:11:53 stream = self._connect(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 156, in _connect 2023-10-27 15:11:53 stream = stream.start_tls(**kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_backends/sync.py", line 174, in start_tls 2023-10-27 15:11:53 raise exc 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions 2023-10-27 15:11:53 raise to_exc(exc) from exc 2023-10-27 15:11:53 httpcore.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 177, in fetch_jwt_public_key_json 2023-10-27 15:11:53 response = httpx.get(url, timeout=hf_timeout_seconds) 2023-10-27 15:11:53 File 
"/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 189, in get 2023-10-27 15:11:53 return request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_api.py", line 100, in request 2023-10-27 15:11:53 return client.request( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 814, in request 2023-10-27 15:11:53 return self.send(request, auth=auth, follow_redirects=follow_redirects) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 901, in send 2023-10-27 15:11:53 response = self._send_handling_auth( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 929, in _send_handling_auth 2023-10-27 15:11:53 response = self._send_handling_redirects( 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 966, in _send_handling_redirects 2023-10-27 15:11:53 response = self._send_single_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_client.py", line 1002, in _send_single_request 2023-10-27 15:11:53 response = transport.handle_request(request) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 228, in handle_request 2023-10-27 15:11:53 resp = self._pool.handle_request(req) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/contextlib.py", line 137, in __exit__ 2023-10-27 15:11:53 self.gen.throw(typ, value, traceback) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/httpx/_transports/default.py", line 83, in map_httpcore_exceptions 2023-10-27 15:11:53 raise mapped_exc(message) from exc 2023-10-27 15:11:53 httpx.ConnectTimeout: _ssl.c:1112: The handshake operation timed out 2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 220, in get_jwt_public_keys 2023-10-27 15:11:53 payload = fetch_jwt_public_key_json( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 181, in fetch_jwt_public_key_json 2023-10-27 15:11:53 raise RuntimeError(f"Failed to fetch the JWT public key from {url}. ") from err 2023-10-27 15:11:53 RuntimeError: Failed to fetch the JWT public key from https://huggingface.co/api/keys/jwt. 
2023-10-27 15:11:53 2023-10-27 15:11:53 The above exception was the direct cause of the following exception: 2023-10-27 15:11:53 2023-10-27 15:11:53 Traceback (most recent call last): 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap 2023-10-27 15:11:53 self.run() 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/multiprocessing/process.py", line 108, in run 2023-10-27 15:11:53 self._target(*self._args, **self._kwargs) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/_subprocess.py", line 76, in subprocess_started 2023-10-27 15:11:53 target(sockets=sockets) 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 60, in run 2023-10-27 15:11:53 return asyncio.run(self.serve(sockets=sockets)) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/runners.py", line 44, in run 2023-10-27 15:11:53 return loop.run_until_complete(main) 2023-10-27 15:11:53 File "/usr/local/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete 2023-10-27 15:11:53 return future.result() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/server.py", line 67, in serve 2023-10-27 15:11:53 config.load() 2023-10-27 15:11:53 File "/src/services/api/.venv/lib/python3.9/site-packages/uvicorn/config.py", line 483, in load 2023-10-27 15:11:53 self.loaded_app = self.loaded_app() 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 28, in create_app 2023-10-27 15:11:53 return create_app_with_config(app_config=app_config, endpoint_config=endpoint_config) 2023-10-27 15:11:53 File "/src/services/api/src/api/app.py", line 37, in create_app_with_config 2023-10-27 15:11:53 hf_jwt_public_keys = get_jwt_public_keys( 2023-10-27 15:11:53 File "/src/libs/libapi/src/libapi/jwt_token.py", line 232, in get_jwt_public_keys 2023-10-27 15:11:53 raise JWTKeysError("Failed to create the JWT public keys.") from err 2023-10-27 15:11:53 libapi.exceptions.JWTKeysError: Failed to create the JWT public keys. ``` At first I thought the root cause was a network error, so I ran `ping huggingface.co` and `curl https://huggingface.co/api/keys/jwt` inside these containers, but both requests returned 200, which means the network inside the containers is not the real cause of the problem. Can anyone suggest a solution or a way to debug this? Thanks!
closed
2023-10-27T07:57:38Z
2023-10-27T11:28:07Z
2023-10-27T11:28:06Z
vagitablebirdcode
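A quick way to narrow down an error like the one above is to reproduce, from inside one of the failing containers, the exact request the api service makes at startup. A minimal diagnostic sketch, assuming `httpx` is available in the container (it is a dependency of the service); the 30-second timeout is deliberately generous and not the production value:

```python
import httpx

# The api service fetches the JWT public keys from this URL at startup
# (see the fetch_jwt_public_key_json frames in the traceback above).
# Trying it with a long timeout helps tell a real connectivity problem
# apart from a TLS handshake that is merely slower than hf_timeout_seconds.
response = httpx.get("https://huggingface.co/api/keys/jwt", timeout=30.0)
print(response.status_code)
print(response.json())
```

If this returns 200 while the service still fails, the handshake is probably just exceeding the configured timeout, which points at slow DNS/TLS inside the WSL2 network rather than a blocked route.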
1,964,946,941
Download duckdb index async in separate thread
Download the `duckdb` index asynchronously in a separate thread, and make `get_index_file_location_and_download_if_missing` asynchronous.
Download duckdb index async in separate thread: Download the `duckdb` index asynchronously in a separate thread, and make `get_index_file_location_and_download_if_missing` asynchronous.
closed
2023-10-27T07:38:18Z
2023-10-27T09:10:43Z
2023-10-27T09:10:42Z
albertvillanova
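For context on the PR above, the usual pattern for moving a blocking download off the event loop is to hand it to a worker thread. A minimal sketch, assuming the blocking helper is called `download_index_file` (a hypothetical stand-in for the real download logic, which may differ in `libapi`):

```python
import asyncio

def download_index_file(url: str, local_path: str) -> str:
    # Blocking I/O, e.g. hf_hub_download or a plain HTTP download.
    ...
    return local_path

async def get_index_file_location_and_download_if_missing(url: str, local_path: str) -> str:
    # Run the blocking download in a separate thread so the event loop
    # keeps serving other /search and /filter requests in the meantime.
    return await asyncio.to_thread(download_index_file, url, local_path)
```

`asyncio.to_thread` (Python 3.9+) is equivalent to `loop.run_in_executor(None, ...)` with a nicer signature; either keeps the service responsive while a large index file is being fetched.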
1,964,224,729
Use get_request_parameter in endpoint
Refactor the "endpoint" route in the "api" service and use libapi `get_request_parameter`.
Use get_request_parameter in endpoint: Refactor the "endpoint" route in the "api" service and use libapi `get_request_parameter`.
closed
2023-10-26T19:19:29Z
2023-10-31T08:06:14Z
2023-10-31T08:05:43Z
albertvillanova
1,963,763,881
add cron job to clean duckdb index cache
We have a cache folder in `/storage/duckdb-index`; this folder is used for cached downloads in `/search` and `/filter` when calling `hf_hub_download` https://github.com/huggingface/datasets-server/blob/ef8a1cb34ba661b859962ac8d6c138374a0b0190/libs/libapi/src/libapi/duckdb.py#L78 (which needs a cache directory). I noticed that we have old files in `/storage/duckdb-index/cache` which are using about 185GB: ![image](https://github.com/huggingface/datasets-server/assets/5564745/3af4c5b6-eb5e-4bbb-9df3-5672dbc84173) We can apply the same cleaning as we do for other folders, so I added a new cron job. Another approach that would avoid the `hf_hub_download` cache entirely is to use `hf_transfer` directly, but I will investigate its advantages further; meanwhile, the cron job will keep storage usage from growing.
add cron job to clean duckdb index cache: We have a cache folder in `/storage/duckdb-index`; this folder is used for cached downloads in `/search` and `/filter` when calling `hf_hub_download` https://github.com/huggingface/datasets-server/blob/ef8a1cb34ba661b859962ac8d6c138374a0b0190/libs/libapi/src/libapi/duckdb.py#L78 (which needs a cache directory). I noticed that we have old files in `/storage/duckdb-index/cache` which are using about 185GB: ![image](https://github.com/huggingface/datasets-server/assets/5564745/3af4c5b6-eb5e-4bbb-9df3-5672dbc84173) We can apply the same cleaning as we do for other folders, so I added a new cron job. Another approach that would avoid the `hf_hub_download` cache entirely is to use `hf_transfer` directly, but I will investigate its advantages further; meanwhile, the cron job will keep storage usage from growing.
closed
2023-10-26T14:55:37Z
2023-10-26T18:47:21Z
2023-10-26T18:47:20Z
AndreaFrancis
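The cleanup itself can be as simple as deleting cache entries that have not been accessed for a while. A minimal sketch of such a job; the retention period and the exact path handling are illustrative, not the production configuration:

```python
import time
from pathlib import Path

def delete_stale_cache_files(cache_dir: str, max_age_days: float = 3.0) -> None:
    """Remove files in the hf_hub_download cache that were not accessed recently."""
    cutoff = time.time() - max_age_days * 24 * 3600
    for path in Path(cache_dir).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            path.unlink()

delete_stale_cache_files("/storage/duckdb-index/cache")
```

A real job would also prune the empty directories left behind, but the age-based scan is the core of it.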
1,963,481,907
increase number of replicas
Internal references: API: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698224417526029 workers: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698323528797149
increase number of replicas: Internal references: API: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698224417526029 workers: https://huggingface.slack.com/archives/C04L6P8KNQ5/p1698323528797149
closed
2023-10-26T12:42:46Z
2023-10-26T16:37:50Z
2023-10-26T16:37:48Z
severo
1,961,824,392
Update werkzeug
fix ``` Name Version ID Fix Versions -------- ------- ------------------- ------------ werkzeug 3.0.0 GHSA-hrfv-mqp8-q5rw 3.0.1 ```
Update werkzeug: fix ``` Name Version ID Fix Versions -------- ------- ------------------- ------------ werkzeug 3.0.0 GHSA-hrfv-mqp8-q5rw 3.0.1 ```
closed
2023-10-25T16:40:25Z
2023-10-25T17:18:23Z
2023-10-25T17:18:22Z
lhoestq
1,961,471,039
Factorize getting request parameters
Factorize getting request parameters for both required and non-required ones, by using the function `get_request_parameter` and merging `get_required_request_parameter` into it, with the addition of the parameters "required" and "default". Use `get_request_parameter` to get the "cursor", "priority" and "all" parameters.
Factorize getting request parameters: Factorize getting request parameters for both required and non-required ones, by using the function `get_request_parameter` and merging `get_required_request_parameter` into it, with the addition of the parameters "required" and "default". Use `get_request_parameter` to get the "cursor", "priority" and "all" parameters.
closed
2023-10-25T13:45:54Z
2023-10-26T17:35:22Z
2023-10-26T17:35:21Z
albertvillanova
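A minimal sketch of what the merged helper could look like on top of Starlette; the exact signature and exception type in `libapi` may differ, but `required` and `default` are the parameters described above:

```python
from starlette.requests import Request

class MissingRequiredParameterError(Exception):
    pass

def get_request_parameter(
    request: Request, parameter_name: str, required: bool = False, default: str = ""
) -> str:
    # Read a query-string parameter, falling back to the default,
    # and fail loudly if a required parameter is missing or empty.
    parameter = request.query_params.get(parameter_name, default)
    if required and not parameter:
        raise MissingRequiredParameterError(f"Parameter '{parameter_name}' is required")
    return parameter
```

Callers then read optional parameters like `get_request_parameter(request, "cursor")` and required ones like `get_request_parameter(request, "dataset", required=True)`.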
1,961,360,943
Update doc-build.yml - add missing token
Add missing token to doc build workflow
Update doc-build.yml - add missing token: Add missing token to doc build workflow
closed
2023-10-25T12:55:28Z
2023-10-25T12:57:43Z
2023-10-25T12:57:42Z
mishig25
1,961,249,824
Remove no_max_size_limit_datasets (no more full parquet conversion for Open-Orca/OpenOrca to fix the viewer)
The full Open-Orca/OpenOrca is too big to be processed correctly by most services, so let it work like all the other datasets (use the first 5GB)
Remove no_max_size_limit_datasets (no more full parquet conversion for Open-Orca/OpenOrca to fix the viewer): The full Open-Orca/OpenOrca is too big to be processed correctly by most services, so let it work like all the other datasets (use the first 5GB)
closed
2023-10-25T12:05:53Z
2023-10-25T12:28:49Z
2023-10-25T12:28:48Z
lhoestq
1,961,082,276
Parquet stream-conversion fails to embed images/audio files from gated repos
e.g. for https://huggingface.co/datasets/SinKove/synthetic_mammography_csaw it seems to be an issue with `datasets` not passing the token to `embed_table_storage` when generating a dataset ```json { "error": "An error occurred while generating the dataset", "cause_exception": "DatasetGenerationError", "cause_message": "An error occurred while generating the dataset", "cause_traceback": [ "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py\", line 261, in hf_raise_for_status\n response.raise_for_status()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/requests/models.py\", line 1021, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\n", "requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/datasets/SinKove/synthetic_mammography_csaw/revision/490f9d72f59e2599b5d1eca5973b37a913270445\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 100, in _repo_and_revision_exist\n self._api.repo_info(repo_id, revision=revision, repo_type=repo_type)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py\", line 118, in _inner_fn\n return fn(*args, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_api.py\", line 1868, in repo_info\n return method(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py\", line 118, in _inner_fn\n return fn(*args, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_api.py\", line 1741, in dataset_info\n hf_raise_for_status(r)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py\", line 277, in hf_raise_for_status\n raise GatedRepoError(message, response) from e\n", "huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6538ec77-55cac38800ae3763321e921e;1f6f0b56-19bb-4ea0-bfd6-0a1081822277)\n\nCannot access gated repo for url https://huggingface.co/api/datasets/SinKove/synthetic_mammography_csaw/revision/490f9d72f59e2599b5d1eca5973b37a913270445.\nRepo dataset SinKove/synthetic_mammography_csaw is gated. 
You must be authenticated to access it.\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1703, in _prepare_split_single\n num_examples, num_bytes = writer.finalize()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 586, in finalize\n self.write_examples_on_file()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 448, in write_examples_on_file\n self.write_batch(batch_examples=batch_examples)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 559, in write_batch\n self.write_table(pa_table, writer_batch_size)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 574, in write_table\n pa_table = embed_table_storage(pa_table)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2306, in embed_table_storage\n arrays = [\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2307, in <listcomp>\n embed_array_storage(table[name], feature) if require_storage_embed(feature) else table[name]\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1831, in wrapper\n return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1831, in <listcomp>\n return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2176, in embed_array_storage\n return feature.embed_storage(array)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 266, in embed_storage\n [\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 267, in <listcomp>\n (path_to_bytes(x[\"path\"]) if x[\"bytes\"] is None else x[\"bytes\"]) if x is not None else None\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/py_utils.py\", line 306, in wrapper\n return func(value) if value is not None else None\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 261, in path_to_bytes\n with xopen(path, \"rb\") as f:\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/download/streaming_download_manager.py\", line 496, in xopen\n file_obj = fsspec.open(file, mode=mode, *args, **kwargs).open()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 439, in open\n return open_files(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 282, in open_files\n fs, fs_token, paths = get_fs_token_paths(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 606, in get_fs_token_paths\n fs = filesystem(protocol, **inkwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/registry.py\", line 261, in filesystem\n return cls(**storage_options)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 76, in __call__\n obj = super().__call__(*args, **kwargs)\n", " File 
\"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/implementations/zip.py\", line 58, in __init__\n self.fo = fo.__enter__() # the whole instance is a context\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 102, in __enter__\n f = self.fs.open(self.path, mode=mode)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1199, in open\n f = self._open(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 200, in _open\n return HfFileSystemFile(self, path, mode=mode, revision=revision, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 392, in __init__\n super().__init__(fs, path, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1555, in __init__\n self.size = self.details[\"size\"]\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1568, in details\n self._details = self.fs.info(self.path)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 351, in info\n resolved_path = self.resolve_path(path)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 149, in resolve_path\n raise FileNotFoundError(path) from err\n", "FileNotFoundError: SinKove/synthetic_mammography_csaw@490f9d72f59e2599b5d1eca5973b37a913270445/data/train.zip\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/src/worker/job_manager.py\", line 168, in process\n job_result = self.job_runner.compute()\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 1348, in compute\n compute_config_parquet_and_info_response(\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 1255, in compute_config_parquet_and_info_response\n parquet_operations, partial = stream_convert_to_parquet(\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 852, in stream_convert_to_parquet\n builder._prepare_split(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1555, in _prepare_split\n for job_id, done, content in self._prepare_split_single(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1712, in _prepare_split_single\n raise DatasetGenerationError(\"An error occurred while generating the dataset\") from e\n", "datasets.builder.DatasetGenerationError: An error occurred while generating the dataset\n" ] } ```
Parquet stream-conversion fails to embed images/audio files from gated repos: e.g. for https://huggingface.co/datasets/SinKove/synthetic_mammography_csaw it seems to be an issue with `datasets` not passing the token to `embed_table_storage` when generating a dataset ```json { "error": "An error occurred while generating the dataset", "cause_exception": "DatasetGenerationError", "cause_message": "An error occurred while generating the dataset", "cause_traceback": [ "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py\", line 261, in hf_raise_for_status\n response.raise_for_status()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/requests/models.py\", line 1021, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\n", "requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/datasets/SinKove/synthetic_mammography_csaw/revision/490f9d72f59e2599b5d1eca5973b37a913270445\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 100, in _repo_and_revision_exist\n self._api.repo_info(repo_id, revision=revision, repo_type=repo_type)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py\", line 118, in _inner_fn\n return fn(*args, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_api.py\", line 1868, in repo_info\n return method(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py\", line 118, in _inner_fn\n return fn(*args, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_api.py\", line 1741, in dataset_info\n hf_raise_for_status(r)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py\", line 277, in hf_raise_for_status\n raise GatedRepoError(message, response) from e\n", "huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6538ec77-55cac38800ae3763321e921e;1f6f0b56-19bb-4ea0-bfd6-0a1081822277)\n\nCannot access gated repo for url https://huggingface.co/api/datasets/SinKove/synthetic_mammography_csaw/revision/490f9d72f59e2599b5d1eca5973b37a913270445.\nRepo dataset SinKove/synthetic_mammography_csaw is gated. 
You must be authenticated to access it.\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1703, in _prepare_split_single\n num_examples, num_bytes = writer.finalize()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 586, in finalize\n self.write_examples_on_file()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 448, in write_examples_on_file\n self.write_batch(batch_examples=batch_examples)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 559, in write_batch\n self.write_table(pa_table, writer_batch_size)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/arrow_writer.py\", line 574, in write_table\n pa_table = embed_table_storage(pa_table)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2306, in embed_table_storage\n arrays = [\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2307, in <listcomp>\n embed_array_storage(table[name], feature) if require_storage_embed(feature) else table[name]\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1831, in wrapper\n return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 1831, in <listcomp>\n return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py\", line 2176, in embed_array_storage\n return feature.embed_storage(array)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 266, in embed_storage\n [\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 267, in <listcomp>\n (path_to_bytes(x[\"path\"]) if x[\"bytes\"] is None else x[\"bytes\"]) if x is not None else None\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/utils/py_utils.py\", line 306, in wrapper\n return func(value) if value is not None else None\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/features/image.py\", line 261, in path_to_bytes\n with xopen(path, \"rb\") as f:\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/download/streaming_download_manager.py\", line 496, in xopen\n file_obj = fsspec.open(file, mode=mode, *args, **kwargs).open()\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 439, in open\n return open_files(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 282, in open_files\n fs, fs_token, paths = get_fs_token_paths(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 606, in get_fs_token_paths\n fs = filesystem(protocol, **inkwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/registry.py\", line 261, in filesystem\n return cls(**storage_options)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 76, in __call__\n obj = super().__call__(*args, **kwargs)\n", " File 
\"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/implementations/zip.py\", line 58, in __init__\n self.fo = fo.__enter__() # the whole instance is a context\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/core.py\", line 102, in __enter__\n f = self.fs.open(self.path, mode=mode)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1199, in open\n f = self._open(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 200, in _open\n return HfFileSystemFile(self, path, mode=mode, revision=revision, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 392, in __init__\n super().__init__(fs, path, **kwargs)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1555, in __init__\n self.size = self.details[\"size\"]\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/fsspec/spec.py\", line 1568, in details\n self._details = self.fs.info(self.path)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 351, in info\n resolved_path = self.resolve_path(path)\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/huggingface_hub/hf_file_system.py\", line 149, in resolve_path\n raise FileNotFoundError(path) from err\n", "FileNotFoundError: SinKove/synthetic_mammography_csaw@490f9d72f59e2599b5d1eca5973b37a913270445/data/train.zip\n", "\nThe above exception was the direct cause of the following exception:\n\n", "Traceback (most recent call last):\n", " File \"/src/services/worker/src/worker/job_manager.py\", line 168, in process\n job_result = self.job_runner.compute()\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 1348, in compute\n compute_config_parquet_and_info_response(\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 1255, in compute_config_parquet_and_info_response\n parquet_operations, partial = stream_convert_to_parquet(\n", " File \"/src/services/worker/src/worker/job_runners/config/parquet_and_info.py\", line 852, in stream_convert_to_parquet\n builder._prepare_split(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1555, in _prepare_split\n for job_id, done, content in self._prepare_split_single(\n", " File \"/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py\", line 1712, in _prepare_split_single\n raise DatasetGenerationError(\"An error occurred while generating the dataset\") from e\n", "datasets.builder.DatasetGenerationError: An error occurred while generating the dataset\n" ] } ```
open
2023-10-25T10:38:47Z
2023-11-03T11:23:49Z
null
lhoestq
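The failure is reproducible outside the worker: resolving a file of a gated repo through `HfFileSystem` without a token raises, while an authenticated filesystem resolves it. A rough repro sketch (the token value is a placeholder, and you need an account with access to the repo):

```python
from huggingface_hub import HfFileSystem

path = "datasets/SinKove/synthetic_mammography_csaw/data/train.zip"

fs = HfFileSystem()  # anonymous: the gated repo looks like it does not exist
try:
    fs.info(path)
except FileNotFoundError as err:
    print("unauthenticated access failed:", err)

fs_authed = HfFileSystem(token="hf_xxx")  # placeholder token with access granted
print(fs_authed.info(path)["size"])  # resolves once the token is actually passed
```

This matches the traceback above: `xopen` ends up constructing an `HfFileSystem` without the worker's token, so `resolve_path` raises `FileNotFoundError` even though the worker is authenticated elsewhere.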
1,959,490,015
Are URLs in rows response sanitized?
see https://github.com/huggingface/moon-landing/pull/7798#discussion_r1369813236 (internal) > Is "src" validated / sanitized? > if not there is a potential XSS exploit here (you can inject javascript code in an image src) > Are S3 object names sanitized? If no, it should be the case in dataset-server side
Are URLs in rows response sanitized?: see https://github.com/huggingface/moon-landing/pull/7798#discussion_r1369813236 (internal) > Is "src" validated / sanitized? > if not there is a potential XSS exploit here (you can inject javascript code in an image src) > Are S3 object names sanitized? If no, it should be the case in dataset-server side
closed
2023-10-24T15:10:29Z
2023-11-21T15:39:13Z
2023-11-21T15:39:12Z
severo
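For reference, one way such sanitization could look on the datasets-server side: percent-encode the object name and only accept http(s) URLs before they are exposed as an image `src`. A hedged sketch, not the actual implementation:

```python
from urllib.parse import quote, urlsplit

def sanitized_asset_url(base_url: str, object_name: str) -> str:
    # Percent-encoding everything except "/" means quotes, spaces and angle
    # brackets in the object name cannot break out of an HTML attribute.
    url = f"{base_url}/{quote(object_name, safe='/')}"
    if urlsplit(url).scheme not in ("http", "https"):
        raise ValueError(f"unexpected URL scheme in {url!r}")
    return url
```

Whether the rows responses already do this is exactly the question raised in the issue.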
1,959,370,917
Fix trending datasets dashboard
<img width="794" alt="image" src="https://github.com/huggingface/datasets-server/assets/42851186/67a3310d-54ba-42d7-8f33-9e47d10179ab"> (already deployed on spaces)
Fix trending datasets dashboard: <img width="794" alt="image" src="https://github.com/huggingface/datasets-server/assets/42851186/67a3310d-54ba-42d7-8f33-9e47d10179ab"> (already deployed on spaces)
closed
2023-10-24T14:19:01Z
2023-10-26T10:21:53Z
2023-10-24T16:09:09Z
lhoestq
1,959,150,821
Update datasets to 2.14.6
Update `datasets` to 2.14.6. Fix #2006.
Update datasets to 2.14.6: Update `datasets` to 2.14.6. Fix #2006.
closed
2023-10-24T12:30:03Z
2023-10-25T06:02:01Z
2023-10-25T06:02:00Z
albertvillanova
1,959,132,059
Update datasets to 2.14.6
Update datasets to 2.14.6: https://github.com/huggingface/datasets/releases/tag/2.14.6 > Create DefunctDatasetError by @albertvillanova in huggingface/datasets#6286 - We will be able to raise the `DefunctDatasetError` from defunct datasets
Update datasets to 2.14.6: Update datasets to 2.14.6: https://github.com/huggingface/datasets/releases/tag/2.14.6 > Create DefunctDatasetError by @albertvillanova in huggingface/datasets#6286 - We will be able to raise the `DefunctDatasetError` from defunct datasets
closed
2023-10-24T12:17:43Z
2023-10-25T06:02:01Z
2023-10-25T06:02:01Z
albertvillanova
1,952,766,158
Fix another test
from https://github.com/huggingface/datasets-server/pull/2003
Fix another test: from https://github.com/huggingface/datasets-server/pull/2003
closed
2023-10-19T17:40:52Z
2023-10-19T17:53:51Z
2023-10-19T17:53:50Z
lhoestq
1,952,749,167
More disable_dataset_scripts_support
Dataset scripts were still loaded in some cases. Follow-up to https://github.com/huggingface/datasets-server/pull/2001.
More disable_dataset_scripts_support: Dataset scripts were still loaded in some cases. Follow-up to https://github.com/huggingface/datasets-server/pull/2001.
closed
2023-10-19T17:28:31Z
2023-10-19T17:32:23Z
2023-10-19T17:32:22Z
lhoestq
1,952,675,563
Fix test
null
Fix test:
closed
2023-10-19T16:42:48Z
2023-10-19T16:44:35Z
2023-10-19T16:44:34Z
lhoestq
1,952,481,405
Disable dataset scripts
The `config-parquet-and-info` step will now raise a `DatasetWithScriptNotSupportedError` for datasets with a script, except those in the allow list. This will prevent users from running arbitrary code using dataset scripts. The error message is shown to the user on the website and it says ```python raise DatasetWithScriptNotSupportedError( "The dataset viewer doesn't support this dataset because it runs " "arbitrary python code. Please open a discussion in the discussion tab " "if you think this is an error and tag @lhoestq and @severo." ) ``` The allow list is hardcoded for now: `DATASET_SCRIPTS_ALLOW_LIST = ["canonical"]` The keyword "canonical" means all the datasets without namespaces. We can add other datasets to the allow list, and it supports `fnmatch`, for example to support all the datasets from `huggingface` we can add `huggingface/*` to the allow list. cc @severo @XciD
Disable dataset scripts: The `config-parquet-and-info` step will now raise a `DatasetWithScriptNotSupportedError` for datasets with a script, except those in the allow list. This will prevent users from running arbitrary code using dataset scripts. The error message is shown to the user on the website and it says ```python raise DatasetWithScriptNotSupportedError( "The dataset viewer doesn't support this dataset because it runs " "arbitrary python code. Please open a discussion in the discussion tab " "if you think this is an error and tag @lhoestq and @severo." ) ``` The allow list is hardcoded for now: `DATASET_SCRIPTS_ALLOW_LIST = ["canonical"]` The keyword "canonical" means all the datasets without namespaces. We can add other datasets to the allow list, and it supports `fnmatch`, for example to support all the datasets from `huggingface` we can add `huggingface/*` to the allow list. cc @severo @XciD
closed
2023-10-19T15:02:13Z
2023-10-19T16:33:47Z
2023-10-19T16:33:46Z
lhoestq
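A minimal sketch of how the allow-list check can be expressed with `fnmatch`, treating the special `"canonical"` keyword as "no namespace in the dataset name"; the helper name is illustrative:

```python
from fnmatch import fnmatch

DATASET_SCRIPTS_ALLOW_LIST = ["canonical"]  # e.g. add "huggingface/*" to allow an org

def is_dataset_script_allowed(dataset: str) -> bool:
    return any(
        "/" not in dataset if pattern == "canonical" else fnmatch(dataset, pattern)
        for pattern in DATASET_SCRIPTS_ALLOW_LIST
    )

assert is_dataset_script_allowed("squad")            # canonical: no namespace
assert not is_dataset_script_allowed("user/dataset") # namespaced repo with a script
```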
1,952,166,518
fix: remove useless token
This token is not used by your action. Secret is removed from the repository.
fix: remove useless token: This token is not used by your action. Secret is removed from the repository.
closed
2023-10-19T12:42:15Z
2023-10-19T12:42:37Z
2023-10-19T12:42:36Z
rtrompier
1,950,231,325
Better handling of k8s SIGTERM when stopping workers
We've recently observed that worker pods can take more than 1h to get killed by kubernetes (see the internal [slack thread](https://huggingface.slack.com/archives/C04L6P8KNQ5/p1697641144036899) for an example). According to the k8s docs, kubernetes first sends a SIGTERM and then, after a grace period, a SIGKILL to all the running processes. If this keeps happening we need to improve the way we handle those signals so that the pods stop faster. One idea is to catch SIGTERM in the worker executor (the main python process of the workers) and have it stop the worker loop (the subprocess where the actual code runs).
Better handling of k8s SIGTERM when stopping workers: We've recently observed that worker pods can take more than 1h to get killed by kubernetes (see the internal [slack thread](https://huggingface.slack.com/archives/C04L6P8KNQ5/p1697641144036899) for an example). According to the k8s docs, kubernetes first sends a SIGTERM and then, after a grace period, a SIGKILL to all the running processes. If this keeps happening we need to improve the way we handle those signals so that the pods stop faster. One idea is to catch SIGTERM in the worker executor (the main python process of the workers) and have it stop the worker loop (the subprocess where the actual code runs).
closed
2023-10-18T16:57:09Z
2023-11-07T10:29:24Z
2023-11-07T10:29:24Z
lhoestq
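The idea sketched at the end of the issue could look roughly like this: the executor installs a SIGTERM handler that flips a flag, and the loop re-checks the flag between jobs. A minimal sketch with illustrative names, not the worker's actual code:

```python
import signal

class WorkerExecutor:
    def __init__(self) -> None:
        self.must_stop = False
        # k8s sends SIGTERM first and SIGKILL only after the grace period;
        # reacting to SIGTERM lets the pod exit before being force-killed.
        signal.signal(signal.SIGTERM, self._handle_sigterm)

    def _handle_sigterm(self, signum, frame) -> None:
        self.must_stop = True

    def loop(self) -> None:
        while not self.must_stop:
            self.process_next_job()  # finish the current job, then re-check

    def process_next_job(self) -> None:
        ...  # poll the queue and run one job
```

The handler only sets a flag because signal handlers should do as little as possible; the actual shutdown happens in the loop between jobs.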
1,949,834,163
Store data in a local database file for stats instead of an in-memory connection
This is almost my last idea on how to make things stop freezing on big datasets. It's the only difference from the `duckdb-index` worker, which apparently works fine for 5GB datasets: it uses a local database file, while the stats worker didn't.
Store data in a local database file for stats instead of an in-memory connection: This is almost my last idea on how to make things stop freezing on big datasets. It's the only difference from the `duckdb-index` worker, which apparently works fine for 5GB datasets: it uses a local database file, while the stats worker didn't.
closed
2023-10-18T14:00:23Z
2023-10-18T14:11:33Z
2023-10-18T14:11:32Z
polinaeterna
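The change amounts to pointing duckdb at a file instead of the default in-memory database, which lets it spill intermediate data to disk instead of holding everything in RAM. A minimal sketch of the difference; the file and parquet paths are illustrative:

```python
import duckdb

# In-memory: the whole working set must fit in RAM, which can hang on big datasets.
con_mem = duckdb.connect(":memory:")

# File-backed: duckdb can offload data to disk while computing statistics.
con_file = duckdb.connect("stats.db")
con_file.execute("CREATE TABLE data AS SELECT * FROM 'data/*.parquet'")
```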
1,948,988,533
Fix quality in search by removing unused import
Fix quality issue in search by removing unused import. This quality issue was introduced by: - #1983
Fix quality in search by removing unused import: Fix quality issue in search by removing unused import. This quality issue was introduced by: - #1983
closed
2023-10-18T06:52:47Z
2023-10-18T07:07:44Z
2023-10-18T07:07:42Z
albertvillanova
1,948,970,916
Update urllib3 from 1.26.17 to fix vulnerability
Update `urllib3` from 1.26.17 to fix vulnerability: - to 2.0.7 in e2e - to 1.26.18 in the rest of the components This should fix 12 dependabot alerts. Supersede and close #1994. Supersede and close #1995.
Update urllib3 from 1.26.17 to fix vulnerability: Update `urllib3` from 1.26.17 to fix vulnerability: - to 2.0.7 in e2e - to 1.26.18 in the rest of the components This should fix 12 dependabot alerts. Supersede and close #1994. Supersede and close #1995.
closed
2023-10-18T06:40:30Z
2023-10-18T15:39:11Z
2023-10-18T15:39:10Z
albertvillanova
1,948,568,266
Bump urllib3 from 1.26.17 to 1.26.18 in /libs/libcommon
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.17 to 1.26.18. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/releases">urllib3's releases</a>.</em></p> <blockquote> <h2>1.26.18</h2> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses. (GHSA-g4mx-q9vg-27p4)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's changelog</a>.</em></p> <blockquote> <h1>1.26.18 (2023-10-17)</h1> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/urllib3/urllib3/commit/9c2c2307dd1d6af504e09aac0326d86ee3597a0b"><code>9c2c230</code></a> Release 1.26.18 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3159">#3159</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/b594c5ceaca38e1ac215f916538fb128e3526a36"><code>b594c5c</code></a> Merge pull request from GHSA-g4mx-q9vg-27p4</li> <li><a href="https://github.com/urllib3/urllib3/commit/944f0eb134485f41bc531be52de12ba5a37bca73"><code>944f0eb</code></a> [1.26] Use vendored six in urllib3.contrib.securetransport</li> <li>See full diff in <a href="https://github.com/urllib3/urllib3/compare/1.26.17...1.26.18">compare view</a></li> </ul> </details> <br /> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=1.26.17&new-version=1.26.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/huggingface/datasets-server/network/alerts). </details>
Bump urllib3 from 1.26.17 to 1.26.18 in /libs/libcommon: Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.17 to 1.26.18. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/releases">urllib3's releases</a>.</em></p> <blockquote> <h2>1.26.18</h2> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses. (GHSA-g4mx-q9vg-27p4)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's changelog</a>.</em></p> <blockquote> <h1>1.26.18 (2023-10-17)</h1> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/urllib3/urllib3/commit/9c2c2307dd1d6af504e09aac0326d86ee3597a0b"><code>9c2c230</code></a> Release 1.26.18 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3159">#3159</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/b594c5ceaca38e1ac215f916538fb128e3526a36"><code>b594c5c</code></a> Merge pull request from GHSA-g4mx-q9vg-27p4</li> <li><a href="https://github.com/urllib3/urllib3/commit/944f0eb134485f41bc531be52de12ba5a37bca73"><code>944f0eb</code></a> [1.26] Use vendored six in urllib3.contrib.securetransport</li> <li>See full diff in <a href="https://github.com/urllib3/urllib3/compare/1.26.17...1.26.18">compare view</a></li> </ul> </details> <br /> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=1.26.17&new-version=1.26.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/huggingface/datasets-server/network/alerts). </details>
closed
2023-10-18T01:20:51Z
2023-10-18T15:39:22Z
2023-10-18T15:39:11Z
dependabot[bot]
1,948,565,114
Bump urllib3 from 1.26.17 to 1.26.18 in /e2e
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.17 to 1.26.18. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/releases">urllib3's releases</a>.</em></p> <blockquote> <h2>1.26.18</h2> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses. (GHSA-g4mx-q9vg-27p4)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's changelog</a>.</em></p> <blockquote> <h1>1.26.18 (2023-10-17)</h1> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/urllib3/urllib3/commit/9c2c2307dd1d6af504e09aac0326d86ee3597a0b"><code>9c2c230</code></a> Release 1.26.18 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3159">#3159</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/b594c5ceaca38e1ac215f916538fb128e3526a36"><code>b594c5c</code></a> Merge pull request from GHSA-g4mx-q9vg-27p4</li> <li><a href="https://github.com/urllib3/urllib3/commit/944f0eb134485f41bc531be52de12ba5a37bca73"><code>944f0eb</code></a> [1.26] Use vendored six in urllib3.contrib.securetransport</li> <li>See full diff in <a href="https://github.com/urllib3/urllib3/compare/1.26.17...1.26.18">compare view</a></li> </ul> </details> <br /> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=1.26.17&new-version=1.26.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/huggingface/datasets-server/network/alerts). </details>
Bump urllib3 from 1.26.17 to 1.26.18 in /e2e: Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.17 to 1.26.18. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/releases">urllib3's releases</a>.</em></p> <blockquote> <h2>1.26.18</h2> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses. (GHSA-g4mx-q9vg-27p4)</li> </ul> </blockquote> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's changelog</a>.</em></p> <blockquote> <h1>1.26.18 (2023-10-17)</h1> <ul> <li>Made body stripped from HTTP requests changing the request method to GET after HTTP 303 &quot;See Other&quot; redirect responses.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/urllib3/urllib3/commit/9c2c2307dd1d6af504e09aac0326d86ee3597a0b"><code>9c2c230</code></a> Release 1.26.18 (<a href="https://redirect.github.com/urllib3/urllib3/issues/3159">#3159</a>)</li> <li><a href="https://github.com/urllib3/urllib3/commit/b594c5ceaca38e1ac215f916538fb128e3526a36"><code>b594c5c</code></a> Merge pull request from GHSA-g4mx-q9vg-27p4</li> <li><a href="https://github.com/urllib3/urllib3/commit/944f0eb134485f41bc531be52de12ba5a37bca73"><code>944f0eb</code></a> [1.26] Use vendored six in urllib3.contrib.securetransport</li> <li>See full diff in <a href="https://github.com/urllib3/urllib3/compare/1.26.17...1.26.18">compare view</a></li> </ul> </details> <br /> [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=1.26.17&new-version=1.26.18)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/huggingface/datasets-server/network/alerts). </details>
closed
2023-10-18T01:18:28Z
2023-10-18T15:39:21Z
2023-10-18T15:39:11Z
dependabot[bot]
1,948,016,264
making dataset_git_revision field mandatory
See https://github.com/huggingface/datasets-server/pull/1988#issuecomment-1766330413 for context. Depends on https://github.com/huggingface/datasets-server/pull/1996
making dataset_git_revision field mandatory: See https://github.com/huggingface/datasets-server/pull/1988#issuecomment-1766330413 for context. Depends on https://github.com/huggingface/datasets-server/pull/1996
closed
2023-10-17T18:31:45Z
2023-10-18T16:47:22Z
2023-10-18T16:47:21Z
AndreaFrancis
1,947,840,586
Partial duckdb index
For big datasets, instead of using all the parquet files, use only the first ones until the limit of 5GB is reached. It's used for both `search` and `filter`. ## Indexing I added three fields in the duckdb-index job: - partial (if it used a partial parquet export or if it didn't index the full parquet export) - num_rows - num_bytes ## New jobs I also added two jobs and their responses can be used in the front-end instead of the `size` endpoint: - config-duckdb-index-size - dataset-duckdb-index-size Both jobs return the same response format as the config-size and the dataset-size jobs, but with information about the size of the indexed data instead of the size of the parquet-exported data. ## Update of search and filter I also added the `partial` field in the responses of `/search` and `/filter`, in case they use a partial index. ## Index file The file name is `index.duckdb` if it indexes all the rows from the parquet export, otherwise it's named `partial-duckdb.index`. It is placed in the same folder as the parquet export in `refs/convert/parquet`. Fix https://github.com/huggingface/datasets-server/issues/1742 TODO - [x] tests - [x] docs - [x] openapi - [x] update /search and /filter for partial indexes
Partial duckdb index: For big datasets, instead of using all the parquet files, use only the first ones until the limit of 5GB is reached. It's used for both `search` and `filter`. ## Indexing I added three fields in the duckdb-index job: - partial (if it used a partial parquet export or if it didn't index the full parquet export) - num_rows - num_bytes ## New jobs I also added two jobs and their responses can be used in the front-end instead of the `size` endpoint: - config-duckdb-index-size - dataset-duckdb-index-size Both jobs return the same response format as the config-size and the dataset-size jobs, but with information about the size of the indexed data instead of the size of the parquet-exported data. ## Update of search and filter I also added the `partial` field in the responses of `/search` and `/filter`, in case they use a partial index. ## Index file The file name is `index.duckdb` if it indexes all the rows from the parquet export, otherwise it's named `partial-duckdb.index`. It is placed in the same folder as the parquet export in `refs/convert/parquet`. Fix https://github.com/huggingface/datasets-server/issues/1742 TODO - [x] tests - [x] docs - [x] openapi - [x] update /search and /filter for partial indexes
closed
2023-10-17T16:43:29Z
2023-11-07T09:53:03Z
2023-11-07T09:52:27Z
lhoestq
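A minimal sketch of the "first files until the 5GB limit" selection described in the PR above; the helper name and the file-metadata shape are assumptions, not the actual job runner code:

```python
MAX_SIZE = 5_000_000_000  # the 5GB limit mentioned in the PR

def select_files_to_index(parquet_files: list[dict]) -> tuple[list[str], bool]:
    """parquet_files: ordered list of {"url": str, "size": int} entries."""
    selected: list[str] = []
    total = 0
    for f in parquet_files:
        selected.append(f["url"])
        total += f["size"]
        if total >= MAX_SIZE:
            break
    # true as soon as at least one file is skipped
    partial = len(selected) < len(parquet_files)
    return selected, partial
```

Note that, per the PR description, the real `partial` field is also true when the parquet export itself was partial, not only when some parquet files are skipped.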
1,947,739,879
Limit max RAM that duckdb can use
The current value of `max_memory` in prod is 105.7Gb (even though the pod has only 34Gb: duckdb sets it automatically to 80% of the RAM, and the node has 128Gb in total) <img width="382" alt="image" src="https://github.com/huggingface/datasets-server/assets/16348744/b29d7829-43e3-431c-a453-546f80e722a5"> I want to try to lower it manually, since we assume there might be some memory issue with duckdb causing the `split-descriptive-statistics` process to get stuck at the step of loading data into the in-memory db. (I also want to try loading data into a local db file; this is the only difference with the `duckdb-index` step)
Limit max RAM that duckdb can use: The current value of `max_memory` in prod is 105.7Gb (even though the pod has only 34Gb: duckdb sets it automatically to 80% of the RAM, and the node has 128Gb in total) <img width="382" alt="image" src="https://github.com/huggingface/datasets-server/assets/16348744/b29d7829-43e3-431c-a453-546f80e722a5"> I want to try to lower it manually, since we assume there might be some memory issue with duckdb causing the `split-descriptive-statistics` process to get stuck at the step of loading data into the in-memory db. (I also want to try loading data into a local db file; this is the only difference with the `duckdb-index` step)
closed
2023-10-17T15:47:35Z
2023-10-18T13:50:53Z
2023-10-17T16:22:15Z
polinaeterna
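For reference, a hedged sketch of lowering the limit manually via DuckDB's `memory_limit` setting (the 10GB value is illustrative, not the value chosen for prod):

```python
import duckdb

con = duckdb.connect()  # in-memory database
con.sql("SET memory_limit='10GB'")  # override the 80%-of-host-RAM default
print(con.sql("SELECT current_setting('memory_limit')").fetchone())
```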
1,947,439,157
/filter returns Unexpected error for string columns
For example: https://datasets-server.huggingface.co/filter?dataset=tapaco&config=af&split=train&where=language=%27af%27 ``` {"error":"Unexpected error."} ```
/filter returns Unexpected error for string columns: For example: https://datasets-server.huggingface.co/filter?dataset=tapaco&config=af&split=train&where=language=%27af%27 ``` {"error":"Unexpected error."} ```
closed
2023-10-17T13:28:57Z
2023-10-19T14:45:10Z
2023-10-19T14:45:10Z
severo
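The root cause isn't shown in the issue, but one robust way to pass string literals like `'af'` to DuckDB is a bound parameter instead of splicing quoted values into the SQL text; a small illustrative sketch:

```python
import duckdb

con = duckdb.connect()
con.sql("CREATE TABLE data AS SELECT 'af' AS language, 'hallo' AS text")
# the value is bound by the driver, so embedded quotes cannot break the query
rows = con.execute("SELECT * FROM data WHERE language = ?", ["af"]).fetchall()
print(rows)  # [('af', 'hallo')]
```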
1,947,101,711
more search and filter pods
null
more search and filter pods:
closed
2023-10-17T10:35:27Z
2023-10-17T10:36:37Z
2023-10-17T10:36:36Z
lhoestq
1,945,813,806
adding revision for assets creation path
Fix for https://github.com/huggingface/datasets-server/issues/1981 As suggested by @lhoestq in https://github.com/huggingface/datasets-server/issues/1981#issuecomment-1764559346, adding the dataset version (revision) as part of the assets path: - Removing the old code that periodically cleaned the cached-assets storage - Add the revision as part of the asset path {dataset}-{revision}-{config}-{split}; if there is no revision, the default is "main", same as in https://github.com/huggingface/datasets-server/blob/main/jobs/mongodb_migration/src/mongodb_migration/migrations/_20230516101500_queue_job_add_revision.py#L22 - For cached-assets, we don't need to delete previous versions since they will be automatically removed by the bucket TTL policy (after 1 day of inactivity) - We can keep overwrite=False in /search, /filter and /rows since a new version will be created in another folder - For previous assets, I think there is no problem with backwards compatibility since we stored the file URL in the db Open questions for upcoming PRs: - Since all our cache records in the [cachedResponsesBlue](https://github.com/huggingface/datasets-server/blob/main/libs/libcommon/src/libcommon/simple_cache.py#L129) collection have a revision value, should we change it to required and non-Optional, same as in [Jobs](https://github.com/huggingface/datasets-server/blob/main/libs/libcommon/src/libcommon/queue.py#L178)? -> Done in https://github.com/huggingface/datasets-server/pull/1993 - For assets, if a new version is processed, we will be storing the previous files in S3; we need to find a way to clean them, or maybe it will make more sense in https://github.com/huggingface/datasets-server/issues/1823 since a "versioned" approach was proposed.
adding revision for assets creation path: Fix for https://github.com/huggingface/datasets-server/issues/1981 As suggested by @lhoestq in https://github.com/huggingface/datasets-server/issues/1981#issuecomment-1764559346, adding the dataset version (revision) as part of the assets path: - Removing the old code that periodically cleaned the cached-assets storage - Add the revision as part of the asset path {dataset}-{revision}-{config}-{split}; if there is no revision, the default is "main", same as in https://github.com/huggingface/datasets-server/blob/main/jobs/mongodb_migration/src/mongodb_migration/migrations/_20230516101500_queue_job_add_revision.py#L22 - For cached-assets, we don't need to delete previous versions since they will be automatically removed by the bucket TTL policy (after 1 day of inactivity) - We can keep overwrite=False in /search, /filter and /rows since a new version will be created in another folder - For previous assets, I think there is no problem with backwards compatibility since we stored the file URL in the db Open questions for upcoming PRs: - Since all our cache records in the [cachedResponsesBlue](https://github.com/huggingface/datasets-server/blob/main/libs/libcommon/src/libcommon/simple_cache.py#L129) collection have a revision value, should we change it to required and non-Optional, same as in [Jobs](https://github.com/huggingface/datasets-server/blob/main/libs/libcommon/src/libcommon/queue.py#L178)? -> Done in https://github.com/huggingface/datasets-server/pull/1993 - For assets, if a new version is processed, we will be storing the previous files in S3; we need to find a way to clean them, or maybe it will make more sense in https://github.com/huggingface/datasets-server/issues/1823 since a "versioned" approach was proposed.
closed
2023-10-16T18:11:08Z
2023-10-19T15:23:40Z
2023-10-19T15:23:04Z
AndreaFrancis
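A hypothetical sketch of the `{dataset}-{revision}-{config}-{split}` path scheme described above (the function name is illustrative):

```python
from typing import Optional

DEFAULT_REVISION = "main"  # the default when no revision is available

def asset_dir_name(dataset: str, config: str, split: str, revision: Optional[str] = None) -> str:
    return f"{dataset}-{revision or DEFAULT_REVISION}-{config}-{split}"

print(asset_dir_name("user/dataset", "default", "train", revision="666e73a"))
# user/dataset-666e73a-default-train
```

Because the revision is part of the path, a new dataset revision writes under a fresh prefix, which is why `overwrite=False` can stay in /search, /filter and /rows.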
1,945,237,423
Add missing filter errors to openapi.json
Add missing filter errors to `openapi.json`.
Add missing filter errors to openapi.json: Add missing filter errors to `openapi.json`.
closed
2023-10-16T13:35:03Z
2023-10-20T07:29:18Z
2023-10-20T07:28:43Z
albertvillanova
1,945,181,398
adding clean stats cache cron job
Same as we already do for `duckdb-index` and `hf-datasets-cache` (periodically cleaning obsolete/old files): files can be left behind in the stats storage because of a job runner crash or the zombie killer. Adding a cron job to delete files/folders in /storage/stats-cache that are older than 3 hours.
adding clean stats cache cron job: Same as we already do for `duckdb-index` and `hf-datasets-cache` (periodically cleaning obsolete/old files): files can be left behind in the stats storage because of a job runner crash or the zombie killer. Adding a cron job to delete files/folders in /storage/stats-cache that are older than 3 hours.
closed
2023-10-16T13:06:51Z
2023-10-16T16:51:26Z
2023-10-16T16:51:25Z
AndreaFrancis
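A minimal sketch of what such a cron job does, assuming a flat layout under /storage/stats-cache; the helper is illustrative, not the actual job code:

```python
import shutil
import time
from pathlib import Path

CUTOFF_SECONDS = 3 * 60 * 60  # "older than 3 hours", as in the PR

def clean_stats_cache(root: str = "/storage/stats-cache") -> None:
    now = time.time()
    for entry in Path(root).iterdir():
        if now - entry.stat().st_mtime > CUTOFF_SECONDS:
            if entry.is_dir():
                shutil.rmtree(entry, ignore_errors=True)
            else:
                entry.unlink()
```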
1,945,108,580
DuckDB parsing errors if a column name has quotes
Not super important, but it affects the `split-descriptive-statistics` and `split-duckdb-index` jobs: e.g., this dataset has a pretty long column name that contains quotes, and it raises this error ``` β”‚ INFO: 2023-10-16 12:27:43,231 - root - Compute descriptive statistics for dataset='lunaluan/chatbox3_history', config='default', split='train' β”‚ β”‚ INFO: 2023-10-16 12:27:43,233 - root - Downloading remote parquet files to a local directory /storage/stats-cache/81004481536938-split-descriptive-statistics-lunaluan-chatbox3_hi-0d92723b. β”‚ β”‚ Downloading 0000.parquet: 0%| | 0.00/6.75k [00:00<?, ?B/s]Downloading 0000.parquet: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 6.75k/6.75k [00:00<00:00, 5.67MB/s] β”‚ β”‚ INFO: 2023-10-16 12:27:43,912 - root - Loading data into in-memory table. β”‚ β”‚ ERROR: 2023-10-16 12:27:44,068 - root - Parser Error: syntax error at or near "detail" β”‚ β”‚ LINE 2: ... over his Professor's face. Mention "in detail description" how the professor ... β”‚ β”‚ ^ β”‚ β”‚ Traceback (most recent call last): β”‚ β”‚ File "/src/services/worker/src/worker/job_manager.py", line 168, in process β”‚ β”‚ job_result = self.job_runner.compute() β”‚ β”‚ File "/src/services/worker/src/worker/job_runners/split/descriptive_statistics.py", line 591, in compute β”‚ β”‚ compute_descriptive_statistics_response( β”‚ β”‚ File "/src/services/worker/src/worker/job_runners/split/descriptive_statistics.py", line 485, in compute_descriptive_statistics_response β”‚ β”‚ con.sql( β”‚ β”‚ duckdb.ParserException: Parser Error: syntax error at or near "detail" β”‚ β”‚ LINE 2: ... over his Professor's face. Mention "in detail description" how the professor ... β”‚ β”‚ ^ β”‚ ```
DuckDB parsing errors if a column name has quotes: Not super important, but it affects the `split-descriptive-statistics` and `split-duckdb-index` jobs: e.g., this dataset has a pretty long column name that contains quotes, and it raises this error ``` β”‚ INFO: 2023-10-16 12:27:43,231 - root - Compute descriptive statistics for dataset='lunaluan/chatbox3_history', config='default', split='train' β”‚ β”‚ INFO: 2023-10-16 12:27:43,233 - root - Downloading remote parquet files to a local directory /storage/stats-cache/81004481536938-split-descriptive-statistics-lunaluan-chatbox3_hi-0d92723b. β”‚ β”‚ Downloading 0000.parquet: 0%| | 0.00/6.75k [00:00<?, ?B/s]Downloading 0000.parquet: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 6.75k/6.75k [00:00<00:00, 5.67MB/s] β”‚ β”‚ INFO: 2023-10-16 12:27:43,912 - root - Loading data into in-memory table. β”‚ β”‚ ERROR: 2023-10-16 12:27:44,068 - root - Parser Error: syntax error at or near "detail" β”‚ β”‚ LINE 2: ... over his Professor's face. Mention "in detail description" how the professor ... β”‚ β”‚ ^ β”‚ β”‚ Traceback (most recent call last): β”‚ β”‚ File "/src/services/worker/src/worker/job_manager.py", line 168, in process β”‚ β”‚ job_result = self.job_runner.compute() β”‚ β”‚ File "/src/services/worker/src/worker/job_runners/split/descriptive_statistics.py", line 591, in compute β”‚ β”‚ compute_descriptive_statistics_response( β”‚ β”‚ File "/src/services/worker/src/worker/job_runners/split/descriptive_statistics.py", line 485, in compute_descriptive_statistics_response β”‚ β”‚ con.sql( β”‚ β”‚ duckdb.ParserException: Parser Error: syntax error at or near "detail" β”‚ β”‚ LINE 2: ... over his Professor's face. Mention "in detail description" how the professor ... β”‚ β”‚ ^ β”‚ ```
open
2023-10-16T12:32:27Z
2024-06-19T16:10:52Z
null
lhoestq
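One standard fix is to quote column identifiers and double any embedded double quotes before interpolating them into SQL; a hedged sketch, not the job runners' actual code:

```python
def quote_identifier(name: str) -> str:
    # SQL identifier quoting: wrap in double quotes, doubling embedded ones
    return '"' + name.replace('"', '""') + '"'

column = 'Mention "in detail description" how the professor reacts'
print(f"SELECT {quote_identifier(column)} FROM data")
```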
1,944,692,347
Run duckdb queries async in separate thread
Run `duckdb` queries asynchronously in a separate thread.
Run duckdb queries async in separate thread: Run `duckdb` queries asynchronously in a separate thread.
closed
2023-10-16T08:44:19Z
2023-10-20T08:00:00Z
2023-10-20T07:59:59Z
albertvillanova
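The PR's exact mechanism isn't shown here, but a minimal sketch of keeping a blocking `duckdb` call off the event loop with `asyncio.to_thread`:

```python
import asyncio
import duckdb

con = duckdb.connect()

def run_query(sql: str):
    return con.execute(sql).fetchall()  # blocking call, run outside the loop

async def main() -> None:
    rows = await asyncio.to_thread(run_query, "SELECT 42")
    print(rows)  # [(42,)]

asyncio.run(main())
```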
1,944,606,683
Move modification time update of duckdb index file to libapi
EDIT: After the explanation by @AndreaFrancis, this PR moves the updating of the modification time of the duckdb index file from the /search endpoint to the `libapi` function `get_index_file_location_and_download_if_missing`. This way, the modification time is also updated by the /filter endpoint. ~~Remove unnecessary `Path.touch` in /search.~~ ~~I think this is not needed.~~ CC: @AndreaFrancis
Move modification time update of duckdb index file to libapi: EDIT: After the explanation by @AndreaFrancis, this PR moves the updating of the modification time of the duckdb index file from the /search endpoint to the `libapi` function `get_index_file_location_and_download_if_missing`. This way, the modification time is also updated by the /filter endpoint. ~~Remove unnecessary `Path.touch` in /search.~~ ~~I think this is not needed.~~ CC: @AndreaFrancis
closed
2023-10-16T07:56:24Z
2023-10-16T15:48:45Z
2023-10-16T15:48:44Z
albertvillanova
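For context, the moved logic boils down to bumping the index file's mtime so that TTL-based cleanup treats it as recently used; a tiny sketch with an illustrative path:

```python
from pathlib import Path

index_file = Path("/tmp/index.duckdb")  # illustrative location
index_file.touch()  # creates the file if missing, otherwise updates its mtime
```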
1,942,211,081
Remove unused httpfs duckdb extension
It's no longer used for file downloads; we use `huggingface_hub` directly. **+ In this PR I added some logging to check what's going on with the stats computation and make it fast, to be removed!**
Remove unused httpfs duckdb extension: It's no longer used for file downloads; we use `huggingface_hub` directly. **+ In this PR I added some logging to check what's going on with the stats computation and make it fast, to be removed!**
closed
2023-10-13T15:57:09Z
2023-10-17T12:13:33Z
2023-10-17T12:13:32Z
polinaeterna
1,942,060,762
Overwrite image and audio cached assets when necessary
Currently we never overwrite image and audio cached assets, which can lead to outdated data in the viewer like here: https://huggingface.co/datasets/ccmusic-database/instrument_timbre_eval/discussions/1 In particular we have `overwrite=False` here: https://github.com/huggingface/datasets-server/blob/b84c8210ec023664c4780ab348b0232468afe116/libs/libapi/src/libapi/response.py#L36-L42 cc @AndreaFrancis @severo
Overwrite image and audio cached assets when necessary: Currently we never overwrite image and audio cached assets, which can lead to outdated data in the viewer like here: https://huggingface.co/datasets/ccmusic-database/instrument_timbre_eval/discussions/1 In particular we have `overwrite=False` here: https://github.com/huggingface/datasets-server/blob/b84c8210ec023664c4780ab348b0232468afe116/libs/libapi/src/libapi/response.py#L36-L42 cc @AndreaFrancis @severo
closed
2023-10-13T14:21:51Z
2023-10-19T17:49:42Z
2023-10-19T17:49:42Z
lhoestq
1,942,002,712
increase ttl time for clean cron jobs
null
increase ttl time for clean cron jobs:
closed
2023-10-13T13:47:54Z
2023-10-13T13:50:33Z
2023-10-13T13:50:32Z
AndreaFrancis
1,941,961,874
Remove requests as dependency of libapi
Remove `requests` as `libapi` dependency by using `httpx` instead.
Remove requests as dependency of libapi: Remove `requests` as `libapi` dependency by using `httpx` instead.
closed
2023-10-13T13:24:09Z
2023-10-19T06:35:21Z
2023-10-19T06:35:19Z
albertvillanova
1,941,617,329
Make admin authentication asynchronous
Make admin service authentication asynchronous by using `httpx` instead of `requests`. Related to: - #1975
Make admin authentication asynchronous: Make admin service authentication asynchronous by using `httpx` instead of `requests`. Related to: - #1975
closed
2023-10-13T09:45:14Z
2023-10-13T12:37:24Z
2023-10-13T12:37:23Z
albertvillanova
1,941,379,218
Remove requests as dependency of libcommon
Remove `requests` as dependency of `libcommon` because it is not used.
Remove requests as dependency of libcommon: Remove `requests` as dependency of `libcommon` because it is not used.
closed
2023-10-13T07:13:43Z
2023-10-13T08:09:56Z
2023-10-13T08:09:55Z
albertvillanova
1,940,736,477
fsspec storage instead of boto3
Related to comment https://github.com/huggingface/datasets-server/pull/1882#issuecomment-1740688443 - Use a storage-agnostic tool (fsspec) to store images/audios for assets and cached assets. - Adding an abstraction layer for "file" and "s3" storage. - For docker containers, we will use "file" storage, but for staging/prod we will use "s3". - I removed all the dependencies on boto3 and moto (and added s3fs) - I removed all the dependencies on `/storage/assets` and `/storage/cached-assets` (we don't need them anymore since images and audios are written directly to the file system) - I removed the disk metrics (they need to be added back, getting the values from S3; maybe this can be handled in another PR) - Will close https://github.com/huggingface/datasets-server/issues/1887 and https://github.com/huggingface/datasets-server/issues/1885 Pending issue: - For reverse-proxy, I am not able to get the files stored in the /assets and /cached-assets volume (my generated URL is http://localhost:8100/assets/asoria/image/--/666e73a/--/default/train/99/image/image.jpg but it only works as http://localhost:8100/assets/assets/asoria/image/--/666e73a/--/default/train/99/image/image.jpg); maybe it is an issue with the nginx redirect configuration.
fsspec storage instead of boto3: Related to comment https://github.com/huggingface/datasets-server/pull/1882#issuecomment-1740688443 - Use a storage-agnostic tool (fsspec) to store images/audios for assets and cached assets. - Adding an abstraction layer for "file" and "s3" storage. - For docker containers, we will use "file" storage, but for staging/prod we will use "s3". - I removed all the dependencies on boto3 and moto (and added s3fs) - I removed all the dependencies on `/storage/assets` and `/storage/cached-assets` (we don't need them anymore since images and audios are written directly to the file system) - I removed the disk metrics (they need to be added back, getting the values from S3; maybe this can be handled in another PR) - Will close https://github.com/huggingface/datasets-server/issues/1887 and https://github.com/huggingface/datasets-server/issues/1885 Pending issue: - For reverse-proxy, I am not able to get the files stored in the /assets and /cached-assets volume (my generated URL is http://localhost:8100/assets/asoria/image/--/666e73a/--/default/train/99/image/image.jpg but it only works as http://localhost:8100/assets/assets/asoria/image/--/666e73a/--/default/train/99/image/image.jpg); maybe it is an issue with the nginx redirect configuration.
closed
2023-10-12T20:21:53Z
2023-10-30T14:34:45Z
2023-10-30T14:34:43Z
AndreaFrancis
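A minimal sketch of the storage-agnostic write that fsspec enables; the protocol switch between "file" and "s3" follows the PR description, while the paths are illustrative:

```python
import fsspec

protocol = "file"  # "s3" in staging/prod (requires s3fs), "file" in docker
fs = fsspec.filesystem(protocol)
fs.makedirs("/tmp/assets/user/dataset", exist_ok=True)
with fs.open("/tmp/assets/user/dataset/image.jpg", "wb") as f:
    f.write(b"...image bytes...")
```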
1,940,390,374
Make authentication asynchronous
Make authentication asynchronous by using `httpx` instead of `requests`.
Make authentication asynchronous: Make authentication asynchronous by using `httpx` instead of `requests`.
closed
2023-10-12T16:46:02Z
2023-10-13T08:04:22Z
2023-10-13T08:04:20Z
albertvillanova
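A hedged sketch of what the switch enables, awaiting the auth check instead of blocking the event loop; the endpoint shown is the Hub's public whoami route, not necessarily the one the service calls:

```python
import httpx

async def is_authenticated(token: str, hub_url: str = "https://huggingface.co") -> bool:
    async with httpx.AsyncClient() as client:
        response = await client.get(
            f"{hub_url}/api/whoami-v2",
            headers={"Authorization": f"Bearer {token}"},
        )
    return response.status_code == 200
```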
1,940,369,820
Increase the number of CPUs from 2 to 8 for heavy workers
Currently, loading data into the table takes too much time, but it should decrease proportionally to the number of CPUs. For example, on my local machine, loading the 4Gb of parquet files of [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) takes 20 sec with 2 CPUs and 5 sec with 8 CPUs. Can we try this?
Increase the number of CPUs from 2 to 8 for heavy workers: Currently, loading data into the table takes too much time, but it should decrease proportionally to the number of CPUs. For example, on my local machine, loading the 4Gb of parquet files of [Open-Orca/OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) takes 20 sec with 2 CPUs and 5 sec with 8 CPUs. Can we try this?
closed
2023-10-12T16:36:05Z
2023-10-12T21:55:36Z
2023-10-12T21:55:35Z
polinaeterna
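DuckDB picks up extra cores through its `threads` setting; a small illustrative sketch (the value mirrors the 8 CPUs requested above):

```python
import duckdb

con = duckdb.connect()
con.sql("SET threads TO 8")  # match the pod's CPU request
print(con.sql("SELECT current_setting('threads')").fetchone())
```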
1,940,292,868
fix cron job metadata name length
When deploying to staging, this error appeared: `one or more objects failed to apply, reason: CronJob.batch "staging-datasets-server-job-clean-duckdb-index-downloads" is invalid: metadata.name: Invalid value: "staging-datasets-server-job-clean-duckdb-index-downloads": must be no more than 52 characters,CronJob.batch "staging-datasets-server-job-clean-duckdb-index-job-runner" is invalid: metadata.name: Invalid value: "staging-datasets-server-job-clean-duckdb-index-job-runner": must be no more than 52 characters` Reducing the job names to see if it fixes the error
fix cron job metadata name length: When deploying to staging, this error appeared: `one or more objects failed to apply, reason: CronJob.batch "staging-datasets-server-job-clean-duckdb-index-downloads" is invalid: metadata.name: Invalid value: "staging-datasets-server-job-clean-duckdb-index-downloads": must be no more than 52 characters,CronJob.batch "staging-datasets-server-job-clean-duckdb-index-job-runner" is invalid: metadata.name: Invalid value: "staging-datasets-server-job-clean-duckdb-index-job-runner": must be no more than 52 characters` Reducing the job names to see if it fixes the error
closed
2023-10-12T15:55:38Z
2023-10-12T16:16:45Z
2023-10-12T16:16:45Z
AndreaFrancis
1,940,267,701
Fix admin service in dev docker compose
It was missing libapi and a better timeout. On my laptop, the import of soundfile from datasets without libsndfile causes an error, so I also added it to the dev dockerfile.
Fix admin service in dev docker compose: It was missing libapi and a better timeout. On my laptop, the import of soundfile from datasets without libsndfile causes an error, so I also added it to the dev dockerfile.
closed
2023-10-12T15:41:21Z
2023-10-19T09:23:09Z
2023-10-19T09:23:08Z
lhoestq
1,940,255,563
Add date type support in cache responses
Mongo only supports dates with time (not date alone) because of bson. So the `date` python type from `date32` pyarrow type couldn't be saved in mongo. I fixed it by saving the `date` the same way as `string` objects. Fix https://github.com/huggingface/datasets-server/issues/86 e.g. in first-rows for [aborruso](https://huggingface.co/aborruso)[/pnrr](https://huggingface.co/datasets/aborruso/pnrr) ```json { "dataset": "__DUMMY_DATASETS_SERVER_USER__/pnrr", "config": "default", "split": "train", "features": [ { "feature_idx": 0, "name": "Programma", "type": { "dtype": "string", "_type": "Value" } }, ... { "feature_idx": 50, "name": "Data di Estrazione", "type": { "dtype": "date32", "_type": "Value" } } ], "rows": [ { "row_idx": 0, "row": { "Programma": "PNRR", ... "Data di Estrazione": "2023-06-13" }, "truncated_cells": [] }, ... ] } ``` EDIT: also added support for time, timedelta and decimal I also checked binary but it is already supported, though its format is not ideal (we get a list of integers from mongo)
Add date type support in cache responses: Mongo only supports dates with time (not date alone) because of bson. So the `date` python type from `date32` pyarrow type couldn't be saved in mongo. I fixed it by saving the `date` the same way as `string` objects. Fix https://github.com/huggingface/datasets-server/issues/86 e.g. in first-rows for [aborruso](https://huggingface.co/aborruso)[/pnrr](https://huggingface.co/datasets/aborruso/pnrr) ```json { "dataset": "__DUMMY_DATASETS_SERVER_USER__/pnrr", "config": "default", "split": "train", "features": [ { "feature_idx": 0, "name": "Programma", "type": { "dtype": "string", "_type": "Value" } }, ... { "feature_idx": 50, "name": "Data di Estrazione", "type": { "dtype": "date32", "_type": "Value" } } ], "rows": [ { "row_idx": 0, "row": { "Programma": "PNRR", ... "Data di Estrazione": "2023-06-13" }, "truncated_cells": [] }, ... ] } ``` EDIT: also added support for time, timedelta and decimal I also checked binary but it is already supported, though its format is not ideal (we get a list of integers from mongo)
closed
2023-10-12T15:34:20Z
2023-10-17T12:56:02Z
2023-10-16T15:40:43Z
lhoestq
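A minimal sketch of the "save dates the same way as strings" approach described above, assuming a hypothetical serialization helper:

```python
from datetime import date, datetime

def to_mongo_value(value):
    # BSON has no date-without-time type, so plain dates are stored as strings
    if isinstance(value, date) and not isinstance(value, datetime):
        return value.isoformat()
    return value

print(to_mongo_value(date(2023, 6, 13)))  # "2023-06-13"
```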
1,939,480,752
Remove duplicate admin utils
Remove duplicate `admin.utils` and use `libapi.utils` instead.
Remove duplicate admin utils: Remove duplicate `admin.utils` and use `libapi.utils` instead.
closed
2023-10-12T08:27:28Z
2023-10-12T09:27:44Z
2023-10-12T09:27:43Z
albertvillanova
1,938,302,717
minor logging improvement in clean hf dset cache
I noticed that the errors counter was incorrect (it was counting subdirectories).
minor logging improvement in clean hf dset cache: I noticed that the errors counter was incorrect (it was counting subdirectories).
closed
2023-10-11T17:10:38Z
2023-10-12T12:53:57Z
2023-10-12T12:53:56Z
lhoestq