repo_name | topic | issue_number | title | body | state | created_at | updated_at | url | labels | user_login | comments_count |
---|---|---|---|---|---|---|---|---|---|---|---|
plotly/dash-core-components | dash | 953 | Release v1.16.0 of DashCoreComponents for Julia | This issue will trigger v1.16.0 of `DashCoreComponents` for Dash.jl. | closed | 2021-04-13T19:09:48Z | 2021-04-14T01:13:56Z | https://github.com/plotly/dash-core-components/issues/953 | [] | rpkyle | 3 |
betodealmeida/shillelagh | sqlalchemy | 47 | Make schema work | Right now if we query `main.table` the query fails. | closed | 2021-07-01T14:30:02Z | 2021-07-01T16:55:02Z | https://github.com/betodealmeida/shillelagh/issues/47 | [
"bug"
] | betodealmeida | 0 |
feder-cr/Jobs_Applier_AI_Agent_AIHawk | automation | 296 | The program shows applying to jobs completed but no jobs are actually applied to | ## Description
I set up the program and ran it according to the instructions. It successfully logged in -> navigated to the LinkedIn jobs page -> scrolled through the page, but I don't see any ongoing action after the message `Starting the application process for this page...`. After a couple of minutes it says `Applying to jobs on this page has been completed!`, but none of the jobs are actually applied to. I am currently running it in the default mode (`python main.py`), where it should generate the resumes for me.
## Config used
```
remote: true
experienceLevel:
internship: false
entry: true
associate: true
mid-senior level: true
director: false
executive: false
jobTypes:
full-time: true
contract: true
part-time: false
temporary: false
internship: false
other: false
volunteer: false
date:
all time: false
month: false
week: false
24 hours: true
positions:
- Product Manager
locations:
- United States
distance: 100
companyBlacklist:
- Falkonry
- Peloton
- Helm.ai
titleBlacklist:
- Sales
```
## Screenshot

How do I fix this/proceed? Any debugging notes shall be helpful too.
| closed | 2024-09-05T23:33:29Z | 2024-09-25T14:09:44Z | https://github.com/feder-cr/Jobs_Applier_AI_Agent_AIHawk/issues/296 | [] | kevalhb | 8 |
miguelgrinberg/flasky | flask | 32 | Chapter 10 - There is an error | closed | 2015-01-20T15:23:47Z | 2015-01-20T15:40:20Z | https://github.com/miguelgrinberg/flasky/issues/32 | [] | ghost | 0 |
|
man-group/arctic | pandas | 884 | Panel not available in pandas 1.2 | Starting from pandas 1.2 Panel is not available anymore, so this causes an error:
arctic/store/_pandas_ndarray_store.py", line 6, in <module>
from pandas import DataFrame, Series, Panel
ImportError: cannot import name 'Panel' from 'pandas' | closed | 2021-01-21T18:04:35Z | 2021-03-12T11:51:04Z | https://github.com/man-group/arctic/issues/884 | [] | grinisrit | 4 |
netbox-community/netbox | django | 18,129 | Create custom views to extend export templates feature | ### NetBox version
v4.1.7
### Feature type
New functionality
### Triage priority
N/A
### Proposed functionality
This feature would allow creating a custom view that can cross-display columns from different tables based on a reconciliation key. For example, a custom view that displays a device's display name, primary IP, serial, and site tenant (or any other field from the Site view).
### Use case
This could be used to spare users from creating export templates to cross-reference the data, or from exporting both tables and cross-referencing them themselves, especially when the goal is only to view the data and not work with it. Additionally, it could extend export templates to use those views as a base, with filtering logic if needed.
### Database changes
_No response_
### External dependencies
_No response_ | closed | 2024-12-02T10:44:58Z | 2025-03-06T03:09:07Z | https://github.com/netbox-community/netbox/issues/18129 | [
"type: feature",
"plugin candidate"
] | YoucefYousfi | 1 |
automl/auto-sklearn | scikit-learn | 1,670 | [Question] Trouble installing `auto-sklearn`? | How do I troubleshoot the installation of `auto-sklearn`?
I tried to run `pipenv install auto-sklearn` in my Linux environment, but I got this (in the end):
```sh
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for scikit-learn
ERROR: Could not build wheels for scikit-learn, which is required to install pyproject.toml-based projects
[0m
⠇ Installing auto-sklearn...✘ Installation Failed
```
I installed `swig` through `sudo apt -y install swig` and it returned me no error.
In order to address the error, I tried a manual installation of scikit-learn, following these steps, [link1](https://stackoverflow.com/questions/70303758/error-could-not-build-wheels-for-scikit-learn-which-is-required-to-install-pyp), [link2](https://stackoverflow.com/questions/38866758/filename-whl-is-not-a-supported-wheel-on-this-platform), but I always end up with missing wheels.
Any help?/Guidance?
*I use WSL to run Ubuntu 22.04.2 LTS on my windows machine.
I also have C++ compiler. | open | 2023-06-15T12:45:34Z | 2023-08-25T07:48:07Z | https://github.com/automl/auto-sklearn/issues/1670 | [] | guilhermeparreira | 6 |
PaddlePaddle/models | nlp | 4,742 | Error when saving the PaddleNLP LAC segmentation baseline model as a Serving inference model | Saving the model:
```
# baseline model
export PYTHONIOENCODING=UTF-8 # the model output is Unicode-encoded; without this setting Python 2 tends to raise errors
python3.7 inference_model.py \
--init_checkpoint ./model_baseline \
--inference_save_dir ./inference_model
```
Save it as a Serving inference model:
```
import paddle_serving_client.io as serving_io
serving_io.inference_model_to_serving('./inference_model', serving_server="serving_server", serving_client="serving_client", model_filename='model.pdmodel', params_filename='params.pdparams')
```
The generated serving_server_conf.prototxt is wrong, causing requests to return `{"result":"Request Value Error"}`
```
feed_var {
name: "words"
alias_name: "words"
is_lod_tensor: true
feed_type: 0
shape: -1
}
fetch_var {
name: "crf_decoding_0.tmp_0"
alias_name: "crf_decoding_0.tmp_0"
is_lod_tensor: true
fetch_type: 1 <--- this should be 0 (0 means int, 1 means float)
shape: -1
}
```
| closed | 2020-07-06T08:14:18Z | 2020-07-13T10:52:18Z | https://github.com/PaddlePaddle/models/issues/4742 | [] | levinxo | 1 |
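The mismatch above (the CRF output is an int tensor, but the converter declares it as float) can be worked around by patching the generated config before starting the server. A minimal sketch, assuming the prototxt layout shown above; `fix_fetch_type` is a hypothetical helper, not part of Paddle Serving's API:

```python
# Workaround sketch: rewrite the generated serving config so the CRF
# decoding output is declared as an int tensor (fetch_type 0) instead
# of float (fetch_type 1), as the issue describes.

def fix_fetch_type(conf_text: str) -> str:
    """Rewrite fetch_type 1 -> 0 inside every fetch_var block."""
    fixed_lines = []
    in_fetch_var = False
    for line in conf_text.splitlines():
        stripped = line.strip()
        if stripped.startswith("fetch_var"):
            in_fetch_var = True
        elif stripped == "}":
            in_fetch_var = False
        if in_fetch_var and stripped == "fetch_type: 1":
            line = line.replace("fetch_type: 1", "fetch_type: 0")
        fixed_lines.append(line)
    return "\n".join(fixed_lines)
```

Running `serving_server_conf.prototxt` through `fix_fetch_type` and writing it back should make the server accept requests again, assuming the wrong `fetch_type` is the only problem.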
benbusby/whoogle-search | flask | 390 | [QUESTION] Disable autocompletion | As the title suggests. i dont know if this is already in there and I skipped past it because im dumb and/or blind but if not im sure theres a way to disable that, right? | closed | 2021-08-20T08:34:12Z | 2021-09-07T16:37:04Z | https://github.com/benbusby/whoogle-search/issues/390 | [
"question"
] | ClaraCrazy | 4 |
huggingface/datasets | pandas | 6,989 | cache in nfs error | ### Describe the bug
- When reading a dataset, a cache is generated under the ~/.cache/huggingface/datasets directory
- When using .map and .filter operations, a runtime cache is generated under the /tmp/hf_datasets-* directory
- The default is to use the path of tempfile.tempdir
- If I change this path to an NFS disk, an error is reported, but the program continues to run
- https://github.com/huggingface/datasets/blob/main/src/datasets/config.py#L257
```
Traceback (most recent call last):
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/process.py", line 315, in _bootstrap
self.run()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/managers.py", line 616, in _run_server
server.serve_forever()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/managers.py", line 182, in serve_forever
sys.exit(0)
SystemExit: 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 300, in _run_finalizers
finalizer()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 224, in __call__
res = self._callback(*self._args, **self._kwargs)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 133, in _remove_temp_dir
rmtree(tempdir)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 718, in rmtree
_rmtree_safe_fd(fd, path, onerror)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 675, in _rmtree_safe_fd
onerror(os.unlink, fullname, sys.exc_info())
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 673, in _rmtree_safe_fd
os.unlink(entry.name, dir_fd=topfd)
OSError: [Errno 16] Device or resource busy: '.nfs000000038330a012000030b4'
Traceback (most recent call last):
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/process.py", line 315, in _bootstrap
self.run()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/managers.py", line 616, in _run_server
server.serve_forever()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/managers.py", line 182, in serve_forever
sys.exit(0)
SystemExit: 0
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 300, in _run_finalizers
finalizer()
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 224, in __call__
res = self._callback(*self._args, **self._kwargs)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/site-packages/multiprocess/util.py", line 133, in _remove_temp_dir
rmtree(tempdir)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 718, in rmtree
_rmtree_safe_fd(fd, path, onerror)
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 675, in _rmtree_safe_fd
onerror(os.unlink, fullname, sys.exc_info())
File "/home/wzp/miniconda3/envs/dask/lib/python3.8/shutil.py", line 673, in _rmtree_safe_fd
os.unlink(entry.name, dir_fd=topfd)
OSError: [Errno 16] Device or resource busy: '.nfs0000000400064d4a000030e5'
```
### Steps to reproduce the bug
```
import os
import time
import tempfile
from datasets import load_dataset
def add_column(sample):
# print(type(sample))
# time.sleep(0.1)
sample['__ds__stats__'] = {'data': 123}
return sample
def filt_column(sample):
# print(type(sample))
if len(sample['content']) > 10:
return True
else:
return False
if __name__ == '__main__':
input_dir = '/mnt/temp/CN/small' # some json dataset
dataset = load_dataset('json', data_dir=input_dir)
temp_dir = '/media/release/release/temp/temp' # a nfs folder
os.makedirs(temp_dir, exist_ok=True)
# change huggingface-datasets runtime cache in nfs(default in /tmp)
tempfile.tempdir = temp_dir
aa = dataset.map(add_column, num_proc=64)
aa = aa.filter(filt_column, num_proc=64)
print(aa)
```
### Expected behavior
no error occur
### Environment info
datasets==2.18.0
ubuntu 20.04 | open | 2024-06-21T02:09:22Z | 2025-01-29T11:44:04Z | https://github.com/huggingface/datasets/issues/6989 | [] | simplew2011 | 1 |
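The reproduction script relies on `tempfile.tempdir` being honored by the library. That part of the mechanism can be checked with the standard library alone — no NFS and no `datasets` needed for this sketch:

```python
import os
import tempfile

# Redirect Python's default temp location, as the reproduction script does,
# then verify that new temporary directories are created under it.
custom_root = tempfile.mkdtemp(prefix="hf_cache_root_")
old_tempdir = tempfile.tempdir
tempfile.tempdir = custom_root
try:
    work_dir = tempfile.mkdtemp(prefix="hf_datasets-")
finally:
    tempfile.tempdir = old_tempdir  # always restore the global default

assert os.path.dirname(work_dir) == custom_root
```

If the same check passes with `custom_root` pointing at the NFS mount, the `OSError: [Errno 16]` above likely comes from NFS's silly-rename behaviour (the `.nfs*` files held open during `rmtree`) rather than from the override itself.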
streamlit/streamlit | machine-learning | 10,181 | `st.data_editor` should remain at the edited cell (x,y) after the (`on_change`) refresh. | ### Checklist
- [X] I have searched the [existing issues](https://github.com/streamlit/streamlit/issues) for similar feature requests.
- [X] I added a descriptive title and summary to this issue.
### Summary
**Description**
When a user scrolls down in the `st.data_editor` table and inputs data into a cell, the on_change event is triggered. After this, the table re-renders and resets the scroll position to the top-left corner (0,0) instead of maintaining the user’s previous scroll position.
**Steps to Reproduce**
Scroll down in the st.data_editor table.
Input data into a cell to trigger the on_change event.
Observe that the table re-renders and the scroll position resets to the top-left corner (0,0).
**Expected Behavior**
The scroll position should remain where the user had scrolled before inputting data, even after the table re-renders.
The position could be the (x, y) of the last edited cell.
**Versions**
Python: 3.11.10
Streamlit: 1.41.1
### Why?
_No response_
### How?
_No response_
### Additional Context
_No response_ | open | 2025-01-14T08:49:33Z | 2025-03-17T18:25:32Z | https://github.com/streamlit/streamlit/issues/10181 | [
"type:enhancement",
"feature:st.data_editor"
] | mariod13 | 3 |
ray-project/ray | pytorch | 50,718 | [core] Fix mock dependency | ### What happened + What you expected to happen
Ray core's mock object dependencies are a mess; one concrete example:
https://github.com/ray-project/ray/blob/master/src/mock/ray/raylet/agent_manager.h
It should depend on `AgentManager` and `DefaultAgentManagerServiceHandler`, but somehow it doesn't...
As a result, we have to include `class_a.h` before `mock_class_a.h` and prevent the linter from reordering them via `clang-format off`; otherwise the build breaks.
A concrete example:
https://github.com/ray-project/ray/blob/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/ray/gcs/gcs_server/test/gcs_worker_manager_test.cc#L17-L26
This issue could be split into multiple sub-issues:
- [x] (small) https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/common/ray_syncer
- [ ] https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/core_worker
- [ ] (small) https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/gcs/gcs_client
- [ ] https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/gcs/gcs_server
- [ ] (small) (https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/gcs/pubsub)
- [ ] (small) (https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/gcs/store_client)
- [ ] (small) (https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/pubsub)
- [ ] https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/raylet
- [ ] (small) https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/raylet_client
- [ ] (small) https://github.com/ray-project/ray/tree/051798b384e1e1d0ae9aa3d17d8a8fcf8d69fc74/src/mock/ray/rpc/worker
**Note:
For large folders, you don't need to do everything in one PR.**
Steps to take:
1. Create a subissue, which links back to this main issue
2. Pick one items you're interested in, properly add dependency (header file and bazel build file)
3. Update use cases in unit tests and remove `clang-format off` mark
### Versions / Dependencies
N/A
### Reproduction script
N/A
### Issue Severity
None | open | 2025-02-19T05:25:20Z | 2025-03-12T01:52:56Z | https://github.com/ray-project/ray/issues/50718 | [
"bug",
"good-first-issue",
"enhancement",
"core",
"help-wanted"
] | dentiny | 6 |
modin-project/modin | data-science | 7,123 | Preserve shape_hint for dropna | This is to avoid index materialization in Series.columnarize.
| closed | 2024-03-26T09:34:08Z | 2024-03-26T12:14:25Z | https://github.com/modin-project/modin/issues/7123 | [
"Performance 🚀"
] | YarShev | 0 |
saleor/saleor | graphql | 17,377 | Bug: Variable "$password" is never used in operation "SetPassword". | ### What are you trying to achieve?
Hello, when trying to create a mutation to reset a password, I found it impossible to do so.
In the documentation https://docs.saleor.io/developer/users it is written that I first need to send a RequestPasswordReset mutation (and I succeed — the email arrives), after which I need to send a SetPassword mutation, and at this stage I get an error
'{
"errors": [
{
"message": "Variable \"$password\" is never used in operation \"SetPassword\".",
"locations": [
{
"line": 2,
"column": 60
}
],
"extensions": {
"exception": {
"code": "GraphQLError",
"stacktrace": [
"graphql.error.base.GraphQLError: Variable \"$password\" is never used in operation \"SetPassword\"."
]
}
}
}
],
"extensions": {
"cost": {
"requestedQueryCost": 0,
"maximumAvailable": 50000
}
}
}'
This is also confirmed by graphql-playground
### Steps to reproduce the problem
1. Send a SetPassword request as per the documentation
' mutation SetPassword($email: String!, $token: String!, $password: String!) {
setPassword(email: $email, token: $token, password: $token) {
errors {
field
message
}
}
}'
2. get an error ' Variable "$password" is never used in operation "SetPassword"."'
### What did you expect to happen?
Resetting the password
### Logs
_No response_
### Environment
Saleor version: 3.20
OS and version: in docker ghcr.io/saleor/saleor:3.20
| closed | 2025-02-17T19:10:51Z | 2025-02-17T19:22:23Z | https://github.com/saleor/saleor/issues/17377 | [
"bug",
"triage"
] | ilyLovesCode | 0 |
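The error message points at the mutation quoted in the steps above: `$password` is declared but the `password` argument is given `$token`, so the variable is indeed never used. A corrected operation, sanity-checked here as a plain string (how it is sent to the Saleor server is unchanged):

```python
# Corrected SetPassword operation: $password is actually used as the
# `password` argument instead of repeating $token.
SET_PASSWORD = """
mutation SetPassword($email: String!, $token: String!, $password: String!) {
  setPassword(email: $email, token: $token, password: $password) {
    errors {
      field
      message
    }
  }
}
"""

# Sanity check: every declared variable appears in the operation body.
body = SET_PASSWORD.split("{", 1)[1]
for var in ("$email", "$token", "$password"):
    assert var in body, f"{var} is never used"
```

With the argument fixed, the `Variable "$password" is never used` validation error should disappear — it is raised by query validation on the client-supplied operation, not by a server bug.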
netbox-community/netbox | django | 18,856 | Related Objects not showing for a user that has limited permissions | ### Deployment Type
NetBox Cloud
### NetBox Version
v4.2.5
### Python Version
3.11
### Steps to Reproduce
Create a user with limited permissions to certain objects (I.E. we have production users who only can view prefixes, VRFs, and IP addresses with a production tag on them. They have read-only access to all sites).
Using this user, navigate to the sites view for a particular site.
Related objects will display none instead of the prefixes they have access to.
Navigating to the IPAM prefix view will show all of the prefixes they have access to.
### Expected Behavior
The user should be able to see the related objects they have permissions to view
### Observed Behavior
The user cannot see related objects | open | 2025-03-10T16:51:45Z | 2025-03-17T14:50:06Z | https://github.com/netbox-community/netbox/issues/18856 | [
"type: bug",
"status: revisions needed"
] | ZachHoiberg | 5 |
facebookresearch/fairseq | pytorch | 4,687 | The normalization settings of input audio | ## ❓ Questions and Help
### Before asking:
1. search the issues.
2. search the docs.
<!-- If you still can't find what you need: -->
#### What is your question?
In wav2vec 2.0 and HuBERT, the config `task.normalize` is set to `False` (which means the input audio is not normalized), but in data2vec it is set to `True`, and the original paper also mentions it. Will it have a big effect on the experimental results?
#### Code
<!-- Please paste a code snippet if your question requires it! -->
#### What have you tried?
#### What's your environment?
- fairseq Version (e.g., 1.0 or main):
- PyTorch Version (e.g., 1.0)
- OS (e.g., Linux):
- How you installed fairseq (`pip`, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
| closed | 2022-09-01T13:50:32Z | 2022-09-18T06:06:09Z | https://github.com/facebookresearch/fairseq/issues/4687 | [
"question",
"needs triage"
] | Ther-nullptr | 2 |
AUTOMATIC1111/stable-diffusion-webui | deep-learning | 16,016 | [Feature Request]: Since the SD3 effect is so poor, can we support the use of HunyuanDIT? | ### Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
### What would your feature do ?
Support inference with the HunyuanDiT model.
### Proposed workflow
Able to perform t2i
### Additional information
Since SD3's results are so poor, can we support HunyuanDiT?
https://www.reddit.com/r/StableDiffusion/comments/1dehbpo/sd3_vs_hunyuandit/#lightbox
https://huggingface.co/Tencent-Hunyuan/HunyuanDiT
https://github.com/Tencent/HunyuanDiT
https://dit.hunyuan.tencent.com/


| open | 2024-06-14T01:54:26Z | 2024-06-18T04:22:00Z | https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/16016 | [
"enhancement"
] | yuno779 | 8 |
gradio-app/gradio | machine-learning | 10,755 | Update User interface for all users without restarting the server | Hello
I am reposting #10699 with more info as there was obviously a misunderstanding in my previous my request:
I know how to share data between user sessions. What I need is to be able to use this shared data to adapt the gradio user interface without having to restart the gradio server.
So here is a more concrete example: In one my video generator application I have embeddded a config menu (also made with gradio). If one user enables / disables some features I want other users to be able to see these new features without restarting the server. Right now only the user who changed the config can see the changes while all the other users are stuck with the user interface that was frozen when starting the app. This is also an issue for the user who made the config changes if he opens a new tab.
This frozen default user interface behaviour prevents even designing simple features like a news feed.
Maybe there is already a workaround and I would be grateful if you could share it with me..
Otherwise one solution could be to implement a gr.restart() that would produce an event where one could provide a new 'demo' object (or just call again the existing function that produced the demo object so that it can take into account new data). | closed | 2025-03-07T14:55:51Z | 2025-03-10T18:14:10Z | https://github.com/gradio-app/gradio/issues/10755 | [] | deepbeepmeep | 3 |
robotframework/robotframework | automation | 4,688 | Do not exclude files during parsing if using `--suite` option | Currently when the `--suite` option is used, files not matching the specified suite aren't parsed at all. This is a useful performance optimization, but it doesn't work well with the new `Name` setting (#4583) that allows configuring the suite name in the parsed file itself. In addition to that, suites not being parsed and not thus not being available for pre-run modifiers can cause surprises. To avoid all these issues, it is better to not use `--suite` for limiting what files are parsed at all.
This change isn't functionally backwards incompatible, but it obviously affects those who `--suite` to make parsing faster. A precondition to such a change is having an explicit way to limit what files are parsed (#4687). | closed | 2023-03-14T13:42:47Z | 2023-06-09T22:54:34Z | https://github.com/robotframework/robotframework/issues/4688 | [
"enhancement",
"priority: medium",
"backwards incompatible",
"rc 1",
"effort: small"
] | pekkaklarck | 1 |
jina-ai/serve | deep-learning | 5,539 | chore: draft release note v3.13.1 | # Release Note (3.13.1)
This release contains 3 bug fixes and 1 documentation improvement.
## 🐞 Bug Fixes
### Support Gateway with multiple protocols for Kubernetes export ([#5532](https://github.com/jina-ai/jina/pull/5532))
You can now export Flows with multiple protocols to Kubernetes. Previously this would cause an error.
```python
flow = Flow().config_gateway(protocol=['http', 'grpc'])
flow.to_kubernetes_yaml('k8s_flow_folder')
```
### Fix Python 3.11 support ([#5529](https://github.com/jina-ai/jina/pull/5529))
It was previously impossible to install Jina with Python 3.11 due to a `grpcio` dependency problem. `grpcio` added support for Python 3.11 only with version 1.49.0, [causing potential problems when used by Jina and other projects](https://github.com/grpc/grpc/issues/30303).
In this release `grpcio>=1.49.0` is installed alongside Jina when using Python 3.11. However, be aware of potential problems related to [grpc hanging](https://github.com/grpc/grpc/issues/30843).
### Unary RPC from Client respects `results_in_order` ([#5513](https://github.com/jina-ai/jina/pull/5513))
In prior releases, calling the `post` method of a client with `grpc` and using `stream=False` did not respect the `results_in_order` parameter and results were always returned in order:
```python
# this wrongly returns results in order
c = Client(protocol='grpc')
c.post(on='/', inputs=DocumentArray.empty(10000), stream=False, results_in_order=False)
```
Also this implied that using the Client with `asyncio=True` and `stream=False` in the post call would return results in the order that they were returned by the Flow, rather than respecting the input order:
```python
# this wrongly returns results in order
c = Client(protocol='grpc', asyncio=True)
async for resp in c.post(on='/', inputs=DocumentArray.empty(10000), stream=False, results_in_order=False)
print(resp)
```
This release fixes the ordering bug.
## 📗 Documentation Improvements
- Document inheritance of arguments from Flow API to Executors and Gateway ([#5535](https://github.com/jina-ai/jina/pull/5535))
## 🤘 Contributors
We would like to thank all contributors to this release:
- AlaeddineAbdessalem ([@alaeddine-13](https://github.com/alaeddine-13))
- Joan Fontanals ([@JoanFM](https://github.com/JoanFM))
- Jackmin801 ([@Jackmin801](https://github.com/Jackmin801))
- Anne Yang ([@AnneYang720](https://github.com/AnneYang720))
| closed | 2022-12-20T10:47:35Z | 2022-12-22T20:07:31Z | https://github.com/jina-ai/serve/issues/5539 | [] | alexcg1 | 0 |
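The ordering semantics fixed in that release can be illustrated with plain `asyncio`, independent of Jina's API: `gather` preserves submission order (the `results_in_order=True` behaviour), while `as_completed` yields results as they finish.

```python
import asyncio
import random

async def fake_request(i):
    # Simulate network jitter so completion order can differ from send order.
    await asyncio.sleep(random.uniform(0, 0.01))
    return i

async def main():
    # Preserves input order regardless of which request finishes first.
    ordered = await asyncio.gather(*(fake_request(i) for i in range(20)))
    # Yields results in completion order instead.
    completed = [await t for t in asyncio.as_completed(
        [asyncio.create_task(fake_request(i)) for i in range(20)])]
    return list(ordered), completed

ordered, completed = asyncio.run(main())
assert ordered == list(range(20))            # gather keeps input order
assert sorted(completed) == list(range(20))  # same results, order may differ
```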
pydata/xarray | pandas | 9,376 | xarray.open_datatree is taking too long to open datatree in a s3 bucket | ### What is your issue?
Hi all,
I was trying to open a datatree stored in a s3 bucket but it is taking too long.
```python
from xarray.backends.api import open_datatree
URL = 'https://js2.jetstream-cloud.org:8001/'
path = f'pythia/radar/erad2024'
fs = s3fs.S3FileSystem(anon=True, client_kwargs=dict(endpoint_url=URL))
file = s3fs.S3Map(f"{path}/zarr_radar/erad_2024.zarr", s3=fs)
dt = open_datatree(file, engine='zarr', consolidated=True)
```
When digging around, I discovered that some parameters/arguments, such as `mode`, `consolidated`, ..., were not being passed to the `ZarrStore.open_store` function here.
https://github.com/pydata/xarray/blob/da9e7ec131379dd45f8f2b03d8488eda203c2bcb/xarray/backends/zarr.py#L1236
| closed | 2024-08-17T22:10:58Z | 2024-08-20T16:01:56Z | https://github.com/pydata/xarray/issues/9376 | [
"topic-performance",
"topic-zarr",
"topic-DataTree"
] | aladinor | 1 |
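As described, the root cause is a keyword-forwarding gap: options accepted by `open_datatree` never reach `ZarrStore.open_store`. The shape of the fix, shown with stand-in functions rather than the real xarray internals, is just threading `**kwargs` through:

```python
# Stand-ins for the datatree opener and ZarrStore.open_store; the point is
# that every caller-supplied backend option must be threaded through.

def open_store(path, mode="r", consolidated=None, **extra):
    return {"path": path, "mode": mode, "consolidated": consolidated, **extra}

def open_datatree_buggy(path, **kwargs):
    return open_store(path)            # kwargs silently dropped

def open_datatree_fixed(path, **kwargs):
    return open_store(path, **kwargs)  # forward everything to the store

buggy = open_datatree_buggy("s3://bucket/tree.zarr", consolidated=True)
fixed = open_datatree_fixed("s3://bucket/tree.zarr", consolidated=True)
assert buggy["consolidated"] is None   # option lost -> un-consolidated reads
assert fixed["consolidated"] is True
```

With `consolidated=True` dropped, each group's metadata has to be fetched separately over HTTP, which would explain the slowdown reported above.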
flasgger/flasgger | rest-api | 196 | use swaggerUiPrefix in template_file | I want to use the template_file argument with a single YAML file.
But I'm behind a reverse proxy and I need to use the "swaggerUiPrefix" option in the template.
`swaggerUiPrefix=LazyString(lambda : request.environ.get('HTTP_X_SCRIPT_NAME', ''))`
But when I use a YAML file, this option doesn't work; flasgger treats it as a plain string.
How can I use 'swaggerUiPrefix' in a YAML file? | open | 2018-05-01T18:59:04Z | 2018-10-01T17:31:19Z | https://github.com/flasgger/flasgger/issues/196 | [
"enhancement",
"hacktoberfest"
] | phil2fer | 2 |
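One workaround, until `template_file` supports lazy values, is to load the YAML into a dict yourself and inject the prefix per request before handing the dict to flasgger as `template=`. All names below (`BASE_TEMPLATE`, `template_for_request`) are hypothetical, and the template is shown as a plain dict to keep the sketch dependency-free — `yaml.safe_load` on the file would produce the same structure:

```python
# Workaround sketch: compute swaggerUiPrefix per request in Python instead
# of trying to express it inside the YAML file. `environ` stands in for
# Flask's request.environ.

BASE_TEMPLATE = {
    "info": {"title": "My API", "version": "1.0"},
}

def template_for_request(environ: dict) -> dict:
    template = dict(BASE_TEMPLATE)
    template["swaggerUiPrefix"] = environ.get("HTTP_X_SCRIPT_NAME", "")
    return template

behind_proxy = template_for_request({"HTTP_X_SCRIPT_NAME": "/api"})
direct = template_for_request({})
assert behind_proxy["swaggerUiPrefix"] == "/api"
assert direct["swaggerUiPrefix"] == ""
```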
gradio-app/gradio | data-visualization | 10,267 | gradio 5.0 unable to load javascript file | ### Describe the bug
If I provide JavaScript code in a variable, it is executed perfectly well, but when I put the same code in a file "app.js" and then pass the file path in the `js` parameter of `Blocks`, it doesn't work. I have added the code in the reproduction below; if the same code is put in a file, the block will be unable to execute it.
It was working fine in version 4. Now I am upgrading to 5.0.
### Have you searched existing issues? 🔎
- [X] I have searched and found no existing issues
### Reproduction
```python
import gradio as gr
login_page_js = """
() => {
//handle launch
let reload = false;
let gradioURL = new URL(window.location.href);
if(
!gradioURL.searchParams.has('__theme') ||
(gradioURL.searchParams.has('__theme') && gradioURL.searchParams.get('__theme') !== 'dark')
) {
gradioURL.searchParams.delete('__theme');
gradioURL.searchParams.set('__theme', 'dark');
reload = true;
}
if(reload) {
window.location.replace(gradioURL.href);
}
}
"""
with gr.Blocks(
js = login_page_js
) as login_page:
gr.Button("Sign in with Microsoft", elem_classes="icon-button" ,link="/login")
if __name__ == "__main__":
login_page.launch()
```
### Screenshot
_No response_
### Logs
_No response_
### System Info
```shell
linux 2204
```
### Severity
I can work around it | open | 2024-12-30T15:09:28Z | 2024-12-30T16:19:48Z | https://github.com/gradio-app/gradio/issues/10267 | [
"bug"
] | git-hamza | 2 |
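Since the string form is reported to work, one workaround until file paths load correctly is to read the script into a string yourself (stdlib only; the temp file here just stands in for the project's `app.js`):

```python
from pathlib import Path
import tempfile

# Stand-in for the project's app.js; in the real app this path already exists.
js_file = Path(tempfile.mkdtemp()) / "app.js"
js_file.write_text("() => { console.log('dark theme check'); }")

# Read the file contents and pass the *string* to gr.Blocks(js=...),
# which is the form reported to work in gradio 5.
login_page_js = js_file.read_text()
assert login_page_js.lstrip().startswith("() =>")
```

`with gr.Blocks(js=login_page_js) as login_page:` then behaves like the inline-string version in the reproduction.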
holoviz/panel | jupyter | 7,754 | TextAreaInput - implementing/dropping `cols` param? | #### ALL software version info
Panel 1.6.2a1
#### Description of expected behavior and the observed behavior
The [Panel docs](https://panel.holoviz.org/reference/widgets/TextAreaInput.html) say `cols` is the number of columns in the text input field, which is quite confusing. The [Bokeh docs](https://docs.bokeh.org/en/3.6.0/docs/reference/models/widgets/inputs.html#bokeh.models.TextAreaInput.cols) make it clearer: `cols` specifies the width of the text area (in average character widths). However, instantiating TextAreaInput widgets with different `cols` values doesn't seem to make any difference. Looking at the code, I'm not sure `cols` was ever actually implemented. Since we already allow setting the widget's width with the `width` parameter, how about dropping `cols` entirely?
#### Complete, minimal, self-contained example code that reproduces the issue
```python
import panel as pn
# looks like we also allow negative/0 cols
pn.widgets.TextAreaInput(cols=-1).servable()
pn.widgets.TextAreaInput(cols=0).servable()
pn.widgets.TextAreaInput(cols=1).servable()
pn.widgets.TextAreaInput(cols=100).servable()
pn.widgets.TextAreaInput(cols=200).servable()
```
| open | 2025-03-03T09:58:21Z | 2025-03-11T14:23:00Z | https://github.com/holoviz/panel/issues/7754 | [] | thuydotm | 0 |
openapi-generators/openapi-python-client | fastapi | 1,051 | Missing lazy import in post body to_multipart function | **Describe the bug**
It is possible to generate a model with a missing import leading to invalid python code.
The `to_multipart` function below is generated in the body model for create_upload_file_uploadfile__post and references `UploadConfig`; however, it is not in scope.
```python
def to_multipart(self) -> Dict[str, Any]:
file = self.file.to_tuple()
config: Union[Tuple[None, bytes, str], Unset]
if isinstance(self.config, Unset):
config = UNSET
elif isinstance(self.config, UploadConfig):
config = (None, json.dumps(self.config.to_dict()).encode(), "application/json")
else:
config = (None, str(self.config).encode(), "text/plain")
field_dict: Dict[str, Any] = {}
for prop_name, prop in self.additional_properties.items():
field_dict[prop_name] = (None, str(prop).encode(), "text/plain")
field_dict.update(
{
"file": file,
}
)
if config is not UNSET:
field_dict["config"] = config
return field_dict
```
The function `to_dict` right above it in the file lazily imports `UploadConfig`
```python
def to_dict(self) -> Dict[str, Any]:
from ..models.upload_config import UploadConfig
file = self.file.to_tuple()
```
**OpenAPI Spec File**
```
{
  "openapi": "3.1.0",
  "info": {
    "title": "FastAPI",
    "version": "0.1.0"
  },
  "paths": {
    "/uploadfile/": {
      "post": {
        "summary": "Create Upload File",
        "operationId": "create_upload_file_uploadfile__post",
        "requestBody": {
          "content": {
            "multipart/form-data": {
              "schema": {
                "$ref": "#/components/schemas/Body_create_upload_file_uploadfile__post"
              }
            }
          },
          "required": true
        },
        "responses": {
          "200": {
            "description": "Successful Response",
            "content": {
              "application/json": {
                "schema": {}
              }
            }
          },
          "422": {
            "description": "Validation Error",
            "content": {
              "application/json": {
                "schema": {
                  "$ref": "#/components/schemas/HTTPValidationError"
                }
              }
            }
          }
        }
      }
    }
  },
  "components": {
    "schemas": {
      "Body_create_upload_file_uploadfile__post": {
        "properties": {
          "file": {
            "type": "string",
            "format": "binary",
            "title": "File"
          },
          "config": {
            "anyOf": [
              {
                "$ref": "#/components/schemas/UploadConfig"
              },
              {
                "type": "null"
              }
            ]
          }
        },
        "type": "object",
        "required": [
          "file"
        ],
        "title": "Body_create_upload_file_uploadfile__post"
      },
      "HTTPValidationError": {
        "properties": {
          "detail": {
            "items": {
              "$ref": "#/components/schemas/ValidationError"
            },
            "type": "array",
            "title": "Detail"
          }
        },
        "type": "object",
        "title": "HTTPValidationError"
      },
      "UploadConfig": {
        "properties": {
          "test": {
            "type": "string",
            "title": "Test",
            "default": "test"
          },
          "config": {
            "type": "object",
            "title": "Config",
            "default": {}
          }
        },
        "type": "object",
        "title": "UploadConfig"
      },
      "ValidationError": {
        "properties": {
          "loc": {
            "items": {
              "anyOf": [
                {
                  "type": "string"
                },
                {
                  "type": "integer"
                }
              ]
            },
            "type": "array",
            "title": "Location"
          },
          "msg": {
            "type": "string",
            "title": "Message"
          },
          "type": {
            "type": "string",
            "title": "Error Type"
          }
        },
        "type": "object",
        "required": [
          "loc",
          "msg",
          "type"
        ],
        "title": "ValidationError"
      }
    }
  }
}
```
**Desktop:**
- OS: All
- Python Version: All
- openapi-python-client version 0.20.0
**Additional context**
I have been able to fix this locally by modifying the following:
https://github.com/openapi-generators/openapi-python-client/blob/73f92ea7da8daa758bb237daa7cc26030cd32225/openapi_python_client/templates/model.py.jinja#L131-L134
The solution is to add the lazy-import template that is already used in `to_dict`:
https://github.com/openapi-generators/openapi-python-client/blob/73f92ea7da8daa758bb237daa7cc26030cd32225/openapi_python_client/templates/model.py.jinja#L126-L128
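For illustration, here is a minimal, self-contained sketch of the branching with the lazy-import idea applied. The `Unset` and `UploadConfig` classes below are hypothetical stand-ins for the generated types (not the real generated code), and `serialize_config` is a free-function version of the method body:

```python
import json
from typing import Any, Dict, Tuple, Union

# Hypothetical stand-ins so the sketch runs on its own; in the generated
# client these live in ..types and ..models.upload_config.
class Unset:
    pass

UNSET = Unset()

class UploadConfig:
    def to_dict(self) -> Dict[str, Any]:
        return {"test": "test", "config": {}}

def serialize_config(config: Any) -> Union[Tuple[None, bytes, str], Unset]:
    # The fix mirrors to_dict: in the generated file, UploadConfig would be
    # resolved lazily here via `from ..models.upload_config import UploadConfig`,
    # so the name is always in scope when the isinstance check runs.
    if isinstance(config, Unset):
        return UNSET
    if isinstance(config, UploadConfig):
        return (None, json.dumps(config.to_dict()).encode(), "application/json")
    return (None, str(config).encode(), "text/plain")
```

With this shape, an `UploadConfig` value is JSON-encoded into the multipart field and anything else falls back to `text/plain`, matching the generated branching above.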
| open | 2024-06-07T10:07:22Z | 2024-06-07T11:07:21Z | https://github.com/openapi-generators/openapi-python-client/issues/1051 | [] | mthanded | 0 |
ultralytics/ultralytics | machine-learning | 18,973 | Tensorrt cannot speed up inference time well | ### Search before asking
- [x] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/orgs/ultralytics/discussions) and found no similar questions.
### Question
I have tried to use TensorRT to speed up inference, but it has no impact. Is that normal?
The way I export to TensorRT:
```
from ultralytics import YOLO
model = YOLO("yolo11m.pt")
model.export(format="engine")
```
The way I test the speed:
```
from ultralytics import YOLO
from loguru import logger
import datetime
import cv2

model = YOLO("yolo11m.pt")
start_time = datetime.datetime.today()
img = cv2.imread('bus.jpg')
for _ in range(1000):
    model.predict(img)
running_time1 = (datetime.datetime.today() - start_time).seconds

model = YOLO("yolo11m.engine", task='detect')
start_time = datetime.datetime.today()
for _ in range(1000):
    model.predict(img)
running_time2 = (datetime.datetime.today() - start_time).seconds

logger.info(f"running time: {running_time1}")
logger.info(f"running time: {running_time2}")
```
The results are both 7 seconds.
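A note on the measurement itself (an observation about the timer, not a claim about TensorRT): `timedelta.seconds` truncates to whole seconds, and `predict()` also includes image pre- and post-processing. A sketch with warm-up and sub-second resolution may separate the two backends better; the commented usage lines are assumptions and need a GPU to run:

```python
import datetime
import time

# timedelta.seconds truncates to an integer component, so a 7.05 s run and a
# 7.9 s run both report "7" — sub-second speedups are invisible with this timer.
same = datetime.timedelta(seconds=7.05).seconds == datetime.timedelta(seconds=7.9).seconds  # → True

def bench(predict, warmup: int = 10, n: int = 100) -> float:
    """Average per-call latency, with warm-up and sub-second resolution."""
    for _ in range(warmup):       # first calls pay CUDA / engine initialisation
        predict()
    start = time.perf_counter()
    for _ in range(n):
        predict()
    return (time.perf_counter() - start) / n

# Hypothetical usage with the models from the issue (commented out; needs a GPU):
# avg_pt = bench(lambda: model_pt.predict(img, verbose=False))
# avg_trt = bench(lambda: model_trt.predict(img, verbose=False))
```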
Environment:
```
Ultralytics 8.3.18 🚀 Python-3.10.12 torch-2.4.1+cu124 CUDA:0 (NVIDIA RTX 4000 Ada Generation, 20040MiB)
Setup complete ✅ (24 CPUs, 62.4 GB RAM, 143.4/913.8 GB disk)

OS                  Linux-6.8.0-47-generic-x86_64-with-glibc2.35
Environment         Linux
Python              3.10.12
Install             git
RAM                 62.44 GB
Disk                143.4/913.8 GB
CPU                 Intel Xeon w5-3425
CPU count           24
GPU                 NVIDIA RTX 4000 Ada Generation, 20040MiB
GPU count           2
CUDA                12.4

numpy               ✅ 1.26.3>=1.23.0
matplotlib          ✅ 3.9.2>=3.3.0
opencv-python       ✅ 4.10.0.84>=4.6.0
pillow              ✅ 10.2.0>=7.1.2
pyyaml              ✅ 6.0.2>=5.3.1
requests            ✅ 2.32.3>=2.23.0
scipy               ✅ 1.14.1>=1.4.1
torch               ✅ 2.4.1+cu124>=1.8.0
torchvision         ✅ 0.19.1+cu124>=0.9.0
tqdm                ✅ 4.66.5>=4.64.0
psutil              ✅ 6.0.0
py-cpuinfo          ✅ 9.0.0
pandas              ✅ 2.2.3>=1.1.4
seaborn             ✅ 0.13.2>=0.11.0
ultralytics-thop    ✅ 2.0.9>=2.0.0
torch               ✅ 2.4.1+cu124!=2.4.0,>=1.8.0; sys_platform == "win32"
```
### Additional
_No response_ | open | 2025-02-03T03:20:32Z | 2025-02-10T09:03:23Z | https://github.com/ultralytics/ultralytics/issues/18973 | [
"question",
"detect",
"exports"
] | darouwan | 9 |
ultralytics/ultralytics | python | 19,294 | Cannot access segment model in mobile hub | Hi
When I try to use my segment model I get the message that currently only detection models are supported.
Ok, but how does this fit with the remark
> @AstroCIEL Segment models also automatically are Detect models, they output both bounding boxes and segment masks.
_Originally posted by @glenn-jocher in [#14648](https://github.com/ultralytics/ultralytics/issues/14648#issuecomment-2247479874)_
Thanks for any clarification | open | 2025-02-18T09:51:38Z | 2025-02-20T23:51:27Z | https://github.com/ultralytics/ultralytics/issues/19294 | [
"question",
"HUB",
"segment"
] | metagic | 4 |
Sanster/IOPaint | pytorch | 498 | Can't find an option to use a mask or alpha channel in inpaint | **Model**
Fooocus inpaint + canny controlnet
**Describe the bug**
In previous versions (lama-cleaner) it was possible to upload a mask together with the image. I can't find this option anymore, nor a way to use the image's alpha channel as a mask.
**System Info**
Software version used
- lama-cleaner: iopaint
- pytorch:
- CUDA:
| closed | 2024-03-28T20:25:58Z | 2025-01-12T02:06:13Z | https://github.com/Sanster/IOPaint/issues/498 | [
"stale"
] | ShmuelRonen | 4 |
aleju/imgaug | deep-learning | 118 | Probability value to use in OneOf | Hi @aleju,
I am using the OneOf option to do Fliplr, Flipud and rotate.
My question: since in this case I want exactly one of the augmentations, is it better to use a probability of 1 for Fliplr and Flipud? My code looks like this:
```python
augmentation = imgaug.augmenters.OneOf([
    imgaug.augmenters.Fliplr(1.0),
    imgaug.augmenters.Affine(rotate=(-10, 10)),
    imgaug.augmenters.Flipud(1.0)
])
```
Or is this a bad idea, and should we use Fliplr(0.5) inside OneOf instead?
Thanks
Mehul
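For reference, the effective flip probability under each option can be simulated in plain Python. This sketch assumes `OneOf` picks one of its children uniformly at random and then applies it with that child's own probability, which matches imgaug's documented default behaviour:

```python
import random

def effective_flip_prob(p_child: float, n_children: int = 3, trials: int = 60_000) -> float:
    """Monte-Carlo estimate of how often Fliplr actually fires inside OneOf."""
    flips = 0
    for _ in range(trials):
        if random.randrange(n_children) == 0:   # Fliplr happened to be the chosen child
            if random.random() < p_child:       # Fliplr's own probability fires
                flips += 1
    return flips / trials

random.seed(0)
effective_flip_prob(1.0)   # ≈ 1/3 — Fliplr(1.0) flips every time it is chosen
effective_flip_prob(0.5)   # ≈ 1/6 — Fliplr(0.5) flips only half the times it is chosen
```

So with `Fliplr(0.5)` inside `OneOf`, roughly half the images that "drew" the flip branch come out unchanged.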
| open | 2018-04-05T15:52:40Z | 2022-06-09T11:27:07Z | https://github.com/aleju/imgaug/issues/118 | [] | mpsampat | 4 |
Kanaries/pygwalker | plotly | 277 | Allow single spec file same for multiple cell (different Dataframes) visualization | **Issue:**
Current functionality does not allow saving different visualizations (different Jupyter cells and different DataFrames) to a single config file.
If you plan to share your repo with someone for collaboration, this clutters the filespace.
| open | 2023-10-21T02:58:33Z | 2023-11-06T01:53:31Z | https://github.com/Kanaries/pygwalker/issues/277 | [
"enhancement"
] | rishabh-dream11 | 3 |
biolab/orange3 | numpy | 6,720 | Orange and PyMC? |
**What's your use case?**
Bayesian modelling, e.g. using [PyMC](https://www.pymc.io/welcome.html), already has a visual representation built into its concept, e.g. rendering a model with `pm.model_to_graphviz(model)`.

Adapting the Orange framework to Bayesian or probabilistic modelling in general might lower the barrier for students learning these concepts.
**What's your proposed solution?**
I haven't looked into the details, but there is functionality within PyMC to convert a model to NetworkX:
https://www.pymc.io/projects/docs/en/stable/_modules/pymc/model_graph.html
Would it be possible to do the inverse step? There is already an extension for NetworkX implemented in Orange.
That might be a starting point to generate PyMC models in Orange graphically.
I am neither a professional software developer nor an expert in Bayesian modelling, but I can code and would be willing to invest some time developing a prototype if someone else is interested.
**Are there any alternative solutions?**
I haven't found any by googling.
| closed | 2024-01-28T08:02:06Z | 2024-02-16T09:42:32Z | https://github.com/biolab/orange3/issues/6720 | [] | MoritzImendoerffer | 1 |
babysor/MockingBird | pytorch | 270 | Crackling electronic-voice artifacts during synthesis | The results are not very good; sometimes the voice suddenly cuts out and turns into an electronic buzzing sound.

| closed | 2021-12-14T10:37:51Z | 2021-12-16T01:58:28Z | https://github.com/babysor/MockingBird/issues/270 | [] | a1140314577 | 2 |
PokeAPI/pokeapi | graphql | 587 | GraphQL implementation | I am willing to try out implementing GraphQL with graphene-django. Does it seem like a good/useful idea?
| closed | 2021-03-08T23:37:04Z | 2021-06-10T09:20:40Z | https://github.com/PokeAPI/pokeapi/issues/587 | [] | AgustinRamiroDiaz | 2 |
flairNLP/flair | nlp | 3,415 | [Question]: Combining two different Models? | ### Question
Hello,
I am a complete beginner in the whole NLP/NER topic and unfortunately don't have much knowledge of Python. I am experimenting with NER in Flair and was happy to get it up and running (with good results using the ner-german-large model). Now there are a few entities, like dates or even languages, in the English model or the 18-entity OntoNotes model that I want to extract as well. I was wondering whether it is possible to combine the German model with, for example, the English one and extract certain entities from both models.

Due to my limited knowledge of Python I am hesitant to train a new model and would rather rely on already existing models.

What could such code look like? I would highly appreciate it if someone could help and enlighten me!
Here is my code so far:
```python
from flair.nn import Classifier
from flair.data import Sentence

tagger = Classifier.load("flair/ner-german-large")

txt_file_path = …
with open(txt_file_path, 'r', encoding='utf-8') as file:
    text_content = file.read()

sentence = Sentence(text_content)
tagger.predict(sentence)

for label in sentence.get_labels('ner'):
    print(label)
```
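One possible approach (a sketch under assumptions, not a confirmed Flair recipe): load both taggers, run each on its own `Sentence` built from the same text, then keep only the entity types you want from each model. The tag names below are illustrative picks, and the merge itself is plain Python:

```python
# Hypothetical picks: person/org/location from the German model,
# dates and languages from the English OntoNotes model.
WANT_FROM_GERMAN = {"PER", "ORG", "LOC"}
WANT_FROM_ENGLISH = {"DATE", "LANGUAGE"}

def merge_entities(german_labels, english_labels):
    """Each item is an (entity_text, tag) pair, e.g. ("Berlin", "LOC")."""
    kept = [(text, tag) for text, tag in german_labels if tag in WANT_FROM_GERMAN]
    kept += [(text, tag) for text, tag in english_labels if tag in WANT_FROM_ENGLISH]
    return kept

# With Flair (not runnable here), the pairs could come from something like:
#   [(lbl.data_point.text, lbl.value) for lbl in sentence.get_labels('ner')]
merge_entities([("Berlin", "LOC"), ("Anna", "PER")],
               [("1999", "DATE"), ("Anna", "PERSON")])
# → [('Berlin', 'LOC'), ('Anna', 'PER'), ('1999', 'DATE')]
```

Running each model on a separate `Sentence` object avoids the two taggers overwriting each other's `'ner'` labels.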
| closed | 2024-03-01T11:39:44Z | 2025-03-12T08:02:46Z | https://github.com/flairNLP/flair/issues/3415 | [
"question"
] | ghost | 1 |
koxudaxi/datamodel-code-generator | fastapi | 1,848 | TypeAlias does not work for python versions <3.10 | **Describe the bug**
When `from typing import TypeAlias` is used in the output models, the generator does not take into account that `typing.TypeAlias` was only introduced in Python 3.10, so models generated for any earlier version will not run.

The potential solution would be to use `from typing_extensions import TypeAlias` instead.
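A sketch of the version-gated import the generator could emit instead (an illustration, not the project's actual template; the `else` branch requires the `typing_extensions` package, and `ModelId` is a hypothetical alias just to exercise the import):

```python
import sys

if sys.version_info >= (3, 10):
    from typing import TypeAlias
else:
    from typing_extensions import TypeAlias  # needs typing_extensions on pre-3.10

ModelId: TypeAlias = str  # hypothetical alias using the imported name
```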
**To Reproduce**
Example schema:
```graphql
type DummyType {
  id: ID!
  label: String!
  value: Int!
}
```
Used commandline:
```
$ datamodel-codegen --input ./example.graphql --input-file-type=graphql --output-model-type pydantic_v2.BaseModel --target-python-version 3.8 --output example_models.py
```
**Expected behavior**
The generated `example_models.py` should be code that runs under Python 3.8.
**Version:**
- OS: macOS 14, AWS Lambda runtime
- Python version: 3.8
- datamodel-code-generator version: 0.25.3
| open | 2024-02-09T09:43:13Z | 2025-02-07T18:17:18Z | https://github.com/koxudaxi/datamodel-code-generator/issues/1848 | [
"bug",
"help wanted"
] | paultop6 | 5 |
HumanSignal/labelImg | deep-learning | 395 | Orientation issue exif related | Photos taken by iPhones in Portrait mode show up as landscape. This could be an Exif related issue. Photos taken in landscape mode seem fine. I will provide more details as I work with this tool more to characterize what's really wrong. | closed | 2018-11-18T04:01:09Z | 2023-02-03T18:26:46Z | https://github.com/HumanSignal/labelImg/issues/395 | [] | kechan | 1 |
AUTOMATIC1111/stable-diffusion-webui | pytorch | 15,519 | [Bug]: Worse results with latest version | ### Checklist
- [X] The issue exists after disabling all extensions
- [ ] The issue exists on a clean installation of webui
- [ ] The issue is caused by an extension, but I believe it is caused by a bug in the webui
- [x] The issue exists in the current version of the webui
- [X] The issue has not been reported before recently
- [ ] The issue has been reported before but has not been fixed yet
### What happened?
I use DPM++ 3M SDE Karras. After this new update, where sampler and schedule type are now separated, my results are much worse; they look like my generations from a year ago. It also seems to be partly ignoring my negative prompts now: I get watermarks in all pictures, and hands are horrible. This didn't happen yesterday, before I upgraded to this new version.
### Steps to reproduce the problem
DPM++ 3M SDE Karras
30 Steps
CFG 5
Same Model
Same Resolution
### What should have happened?
Something related to this scheduler type. Negative prompt is weird now too
### What browsers do you use to access the UI ?
Google Chrome, Other
### Sysinfo
[sysinfo-2024-04-14-22-16.json](https://github.com/AUTOMATIC1111/stable-diffusion-webui/files/14972331/sysinfo-2024-04-14-22-16.json)
### Console logs
```Shell
Already up to date.
venv "F:\DeepFaceLab\stable-diffusion-webui-SDXL\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug 1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: v1.9.0
Commit hash: adadb4e3c7382bf3e4f7519126cd6c70f4f8557b
CUDA 11.8
Launching Web UI with arguments: --opt-split-attention --xformers --no-half-vae --medvram --disable-safe-unpickle
==============================================================================
You are running torch 2.0.1+cu118.
The program is tested to work with torch 2.1.2.
To reinstall the desired version, run with commandline flag --reinstall-torch.
Beware that this will cause a lot of large files to be downloaded, as well as
there are reports of issues with training tab on the latest version.
Use --skip-version-check commandline argument to disable this check.
==============================================================================
=================================================================================
You are running xformers 0.0.20.
The program is tested to work with xformers 0.0.23.post1.
To reinstall the desired version, run with commandline flag --reinstall-xformers.
Use --skip-version-check commandline argument to disable this check.
=================================================================================
ControlNet preprocessor location: F:\DeepFaceLab\stable-diffusion-webui-SDXL\extensions\__sd-webui-controlnet\annotator\downloads
2024-04-14 23:24:54,441 - ControlNet - INFO - ControlNet v1.1.443
2024-04-14 23:24:54,694 - ControlNet - INFO - ControlNet v1.1.443
[-] ADetailer initialized. version: 24.4.1, num models: 12
23:24:56 - ReActor - STATUS - Running v0.7.0-b7 on Device: CUDA
23:24:56 - ReActor - STATUS - Running v0.7.0-b7 on Device: CUDA
[-] ADetailer initialized. version: 24.4.1, num models: 12
[AddNet] Updating model hashes...
0it [00:00, ?it/s]
[AddNet] Updating model hashes...
0it [00:00, ?it/s]
[Vec. CC] Style Sheet Loaded...
Loading weights [a6698c0f30] from F:\DeepFaceLab\stable-diffusion-webui-SDXL\models\Stable-diffusion\STOCK-NSFW.safetensors
Creating model from config: F:\DeepFaceLab\stable-diffusion-webui-SDXL\repositories\generative-models\configs\inference\sd_xl_base.yaml
Applying attention optimization: xformers... done.
Model loaded in 7.3s (load weights from disk: 0.6s, create model: 0.5s, apply weights to model: 5.5s, calculate empty prompt: 0.6s).
2024-04-14 23:25:06,073 - ControlNet - INFO - ControlNet UI callback registered.
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
Startup time: 42.3s (prepare environment: 18.1s, import torch: 5.4s, import gradio: 1.6s, setup paths: 1.2s, initialize shared: 0.2s, other imports: 1.3s, list SD models: 0.1s, load scripts: 5.9s, create ui: 8.2s, gradio launch: 0.3s).
INFO:sd_dynamic_prompts.dynamic_prompting:Prompt matrix will create 5 images in a total of 5 batches.
```
### Additional information
I have updated the WebUI version. No other changes were made. | open | 2024-04-14T22:19:29Z | 2024-09-14T10:09:35Z | https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/15519 | [
"bug-report"
] | Pisha99 | 26 |
amdegroot/ssd.pytorch | computer-vision | 513 | train.py:KeyError: 'van' | This is my bug:
```
iter 0 || Loss: 20.2327 || timer: 0.1706 sec.
iter 10 || Loss: 12.8549 || Traceback (most recent call last):
  File "train.py", line 259, in <module>
    train()
  File "train.py", line 165, in train
    images, targets = next(batch_iterator)
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 801, in __next__
    return self._process_data(data)
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 846, in _process_data
    data.reraise()
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/_utils.py", line 385, in reraise
    raise self.exc_type(msg)
KeyError: Caught KeyError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/its/anaconda3/envs/torch13/lib/python3.5/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/its/ssd.pytorch/data/voc0712.py", line 118, in __getitem__
    im, gt, h, w = self.pull_item(index)
  File "/home/its/ssd.pytorch/data/voc0712.py", line 133, in pull_item
    target = self.target_transform(target, width, height)
  File "/home/its/ssd.pytorch/data/voc0712.py", line 74, in __call__
    label_idx = self.class_to_ind[name]
KeyError: 'van'
```
I think the issue is caused by the capital letters.
My VOC_CLASSES is `'Car', 'Bus', 'Truck', 'Van', 'Minibus'`, and I have changed `name = obj.find('name').text.lower().strip()` into `name = obj.find('name').text.strip()` (in voc0712.py).
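One way to sidestep the case mismatch is to lowercase both sides when building the lookup. This is a sketch (not the exact `voc0712.py` patch), using the class names from this issue:

```python
# Build a case-insensitive class lookup so 'Van', 'van' or ' VAN ' in the
# annotation XML all map to the same index.
VOC_CLASSES = ('Car', 'Bus', 'Truck', 'Van', 'Minibus')
class_to_ind = {cls.lower(): idx for idx, cls in enumerate(VOC_CLASSES)}

def label_index(name: str) -> int:
    return class_to_ind[name.lower().strip()]

label_index('Van')    # → 3
label_index(' bus ')  # → 1
```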
Where else should I change it? | closed | 2020-08-27T06:13:31Z | 2022-03-20T12:16:48Z | https://github.com/amdegroot/ssd.pytorch/issues/513 | [] | zhao-97 | 3 |
SciTools/cartopy | matplotlib | 1,869 | Cartopy crashes on Google Colab notebook: URLError: <urlopen error [Errno -3] Temporary failure in name resolution> | ### Description
Trying to run Cartopy on this Colab notebook: https://colab.research.google.com/drive/12AbSjq5gJd5mhcZDplH-WFl9vtUFc-wi#scrollTo=gsPHPY_xxmeL
However, any attempt to download coastlines or features leads to a URLError. Is this simply because the website, https://naciscdn.org/, is down? Or perhaps there's some discrepancy between Python versions? I'm not sure what the cause is here, but is there any effort to fix this issue?
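As a diagnostic note (an observation about the traceback, not a confirmed cause): `gaierror -3`, "Temporary failure in name resolution", means DNS lookup failed inside the runtime itself; a server that is merely down would typically fail with a connection error instead. A quick stdlib check can distinguish the two:

```python
import socket

def can_resolve(host: str) -> bool:
    """True if the runtime can resolve the hostname via DNS."""
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False

# can_resolve("naciscdn.org") returning False here would suggest the runtime
# has no working DNS (e.g. Colab started without internet access), rather
# than a cartopy bug or the site being down.
```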
#### Code to reproduce
```
!apt-get install -qq libgdal-dev libproj-dev
#!pip install --no-binary shapely shapely
!pip install --no-binary shapely shapely --force
!pip install cartopy
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import matplotlib.pyplot as plt
%matplotlib inline
ax = plt.axes(projection=ccrs.PlateCarree())
ax.coastlines()
plt.show()
```
#### Traceback
```
/usr/local/lib/python3.7/dist-packages/cartopy/io/__init__.py:241: DownloadWarning: Downloading: https://naciscdn.org/naturalearth/110m/physical/ne_110m_coastline.zip
warnings.warn('Downloading: {}'.format(url), DownloadWarning)
---------------------------------------------------------------------------
gaierror Traceback (most recent call last)
/usr/lib/python3.7/urllib/request.py in do_open(self, http_class, req, **http_conn_args)
1349 h.request(req.get_method(), req.selector, req.data, headers,
-> 1350 encode_chunked=req.has_header('Transfer-encoding'))
1351 except OSError as err: # timeout error
35 frames
/usr/lib/python3.7/http/client.py in request(self, method, url, body, headers, encode_chunked)
1280 """Send a complete request to the server."""
-> 1281 self._send_request(method, url, body, headers, encode_chunked)
1282
/usr/lib/python3.7/http/client.py in _send_request(self, method, url, body, headers, encode_chunked)
1326 body = _encode(body, 'body')
-> 1327 self.endheaders(body, encode_chunked=encode_chunked)
1328
/usr/lib/python3.7/http/client.py in endheaders(self, message_body, encode_chunked)
1275 raise CannotSendHeader()
-> 1276 self._send_output(message_body, encode_chunked=encode_chunked)
1277
/usr/lib/python3.7/http/client.py in _send_output(self, message_body, encode_chunked)
1035 del self._buffer[:]
-> 1036 self.send(msg)
1037
/usr/lib/python3.7/http/client.py in send(self, data)
975 if self.auto_open:
--> 976 self.connect()
977 else:
/usr/lib/python3.7/http/client.py in connect(self)
1442
-> 1443 super().connect()
1444
/usr/lib/python3.7/http/client.py in connect(self)
947 self.sock = self._create_connection(
--> 948 (self.host,self.port), self.timeout, self.source_address)
949 self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
/usr/lib/python3.7/socket.py in create_connection(address, timeout, source_address)
706 err = None
--> 707 for res in getaddrinfo(host, port, 0, SOCK_STREAM):
708 af, socktype, proto, canonname, sa = res
/usr/lib/python3.7/socket.py in getaddrinfo(host, port, family, type, proto, flags)
751 addrlist = []
--> 752 for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
753 af, socktype, proto, canonname, sa = res
gaierror: [Errno -3] Temporary failure in name resolution
During handling of the above exception, another exception occurred:
URLError Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/IPython/core/formatters.py in __call__(self, obj)
332 pass
333 else:
--> 334 return printer(obj)
335 # Finally look for special method names
336 method = get_real_method(obj, self.print_method)
/usr/local/lib/python3.7/dist-packages/IPython/core/pylabtools.py in <lambda>(fig)
239
240 if 'png' in formats:
--> 241 png_formatter.for_type(Figure, lambda fig: print_figure(fig, 'png', **kwargs))
242 if 'retina' in formats or 'png2x' in formats:
243 png_formatter.for_type(Figure, lambda fig: retina_figure(fig, **kwargs))
/usr/local/lib/python3.7/dist-packages/IPython/core/pylabtools.py in print_figure(fig, fmt, bbox_inches, **kwargs)
123
124 bytes_io = BytesIO()
--> 125 fig.canvas.print_figure(bytes_io, **kw)
126 data = bytes_io.getvalue()
127 if fmt == 'svg':
/usr/local/lib/python3.7/dist-packages/matplotlib/backend_bases.py in print_figure(self, filename, dpi, facecolor, edgecolor, orientation, format, bbox_inches, **kwargs)
2098 else suppress())
2099 with ctx:
-> 2100 self.figure.draw(renderer)
2101 bbox_artists = kwargs.pop("bbox_extra_artists", None)
2102 bbox_inches = self.figure.get_tightbbox(renderer,
/usr/local/lib/python3.7/dist-packages/matplotlib/artist.py in draw_wrapper(artist, renderer, *args, **kwargs)
36 renderer.start_filter()
37
---> 38 return draw(artist, renderer, *args, **kwargs)
39 finally:
40 if artist.get_agg_filter() is not None:
/usr/local/lib/python3.7/dist-packages/matplotlib/figure.py in draw(self, renderer)
1734 self.patch.draw(renderer)
1735 mimage._draw_list_compositing_images(
-> 1736 renderer, self, artists, self.suppressComposite)
1737
1738 renderer.close_group('figure')
/usr/local/lib/python3.7/dist-packages/matplotlib/image.py in _draw_list_compositing_images(renderer, parent, artists, suppress_composite)
135 if not_composite or not has_images:
136 for a in artists:
--> 137 a.draw(renderer)
138 else:
139 # Composite any adjacent images together
/usr/local/lib/python3.7/dist-packages/matplotlib/artist.py in draw_wrapper(artist, renderer, *args, **kwargs)
36 renderer.start_filter()
37
---> 38 return draw(artist, renderer, *args, **kwargs)
39 finally:
40 if artist.get_agg_filter() is not None:
/usr/local/lib/python3.7/dist-packages/cartopy/mpl/geoaxes.py in draw(self, renderer, **kwargs)
515 self._done_img_factory = True
516
--> 517 return matplotlib.axes.Axes.draw(self, renderer=renderer, **kwargs)
518
519 def _update_title_position(self, renderer):
/usr/local/lib/python3.7/dist-packages/matplotlib/artist.py in draw_wrapper(artist, renderer, *args, **kwargs)
36 renderer.start_filter()
37
---> 38 return draw(artist, renderer, *args, **kwargs)
39 finally:
40 if artist.get_agg_filter() is not None:
/usr/local/lib/python3.7/dist-packages/matplotlib/axes/_base.py in draw(self, renderer, inframe)
2628 renderer.stop_rasterizing()
2629
-> 2630 mimage._draw_list_compositing_images(renderer, self, artists)
2631
2632 renderer.close_group('axes')
/usr/local/lib/python3.7/dist-packages/matplotlib/image.py in _draw_list_compositing_images(renderer, parent, artists, suppress_composite)
135 if not_composite or not has_images:
136 for a in artists:
--> 137 a.draw(renderer)
138 else:
139 # Composite any adjacent images together
/usr/local/lib/python3.7/dist-packages/matplotlib/artist.py in draw_wrapper(artist, renderer, *args, **kwargs)
36 renderer.start_filter()
37
---> 38 return draw(artist, renderer, *args, **kwargs)
39 finally:
40 if artist.get_agg_filter() is not None:
/usr/local/lib/python3.7/dist-packages/cartopy/mpl/feature_artist.py in draw(self, renderer, *args, **kwargs)
151 except ValueError:
152 warnings.warn('Unable to determine extent. Defaulting to global.')
--> 153 geoms = self._feature.intersecting_geometries(extent)
154
155 # Combine all the keyword args in priority order.
/usr/local/lib/python3.7/dist-packages/cartopy/feature/__init__.py in intersecting_geometries(self, extent)
295 """
296 self.scaler.scale_from_extent(extent)
--> 297 return super().intersecting_geometries(extent)
298
299 def with_scale(self, new_scale):
/usr/local/lib/python3.7/dist-packages/cartopy/feature/__init__.py in intersecting_geometries(self, extent)
104 extent_geom = sgeom.box(extent[0], extent[2],
105 extent[1], extent[3])
--> 106 return (geom for geom in self.geometries() if
107 geom is not None and extent_geom.intersects(geom))
108 else:
/usr/local/lib/python3.7/dist-packages/cartopy/feature/__init__.py in geometries(self)
279 path = shapereader.natural_earth(resolution=self.scale,
280 category=self.category,
--> 281 name=self.name)
282 geometries = tuple(shapereader.Reader(path).geometries())
283 _NATURAL_EARTH_GEOM_CACHE[key] = geometries
/usr/local/lib/python3.7/dist-packages/cartopy/io/shapereader.py in natural_earth(resolution, category, name)
280 format_dict = {'config': config, 'category': category,
281 'name': name, 'resolution': resolution}
--> 282 return ne_downloader.path(format_dict)
283
284
/usr/local/lib/python3.7/dist-packages/cartopy/io/__init__.py in path(self, format_dict)
201 else:
202 # we need to download the file
--> 203 result_path = self.acquire_resource(target_path, format_dict)
204
205 return result_path
/usr/local/lib/python3.7/dist-packages/cartopy/io/shapereader.py in acquire_resource(self, target_path, format_dict)
335 url = self.url(format_dict)
336
--> 337 shapefile_online = self._urlopen(url)
338
339 zfh = ZipFile(io.BytesIO(shapefile_online.read()), 'r')
/usr/local/lib/python3.7/dist-packages/cartopy/io/__init__.py in _urlopen(self, url)
240 """
241 warnings.warn('Downloading: {}'.format(url), DownloadWarning)
--> 242 return urlopen(url)
243
244 @staticmethod
/usr/lib/python3.7/urllib/request.py in urlopen(url, data, timeout, cafile, capath, cadefault, context)
220 else:
221 opener = _opener
--> 222 return opener.open(url, data, timeout)
223
224 def install_opener(opener):
/usr/lib/python3.7/urllib/request.py in open(self, fullurl, data, timeout)
523 req = meth(req)
524
--> 525 response = self._open(req, data)
526
527 # post-process response
/usr/lib/python3.7/urllib/request.py in _open(self, req, data)
541 protocol = req.type
542 result = self._call_chain(self.handle_open, protocol, protocol +
--> 543 '_open', req)
544 if result:
545 return result
/usr/lib/python3.7/urllib/request.py in _call_chain(self, chain, kind, meth_name, *args)
501 for handler in handlers:
502 func = getattr(handler, meth_name)
--> 503 result = func(*args)
504 if result is not None:
505 return result
/usr/lib/python3.7/urllib/request.py in https_open(self, req)
1391 def https_open(self, req):
1392 return self.do_open(http.client.HTTPSConnection, req,
-> 1393 context=self._context, check_hostname=self._check_hostname)
1394
1395 https_request = AbstractHTTPHandler.do_request_
/usr/lib/python3.7/urllib/request.py in do_open(self, http_class, req, **http_conn_args)
1350 encode_chunked=req.has_header('Transfer-encoding'))
1351 except OSError as err: # timeout error
-> 1352 raise URLError(err)
1353 r = h.getresponse()
1354 except:
URLError: <urlopen error [Errno -3] Temporary failure in name resolution>
```
<details>
<summary>Full environment definition</summary>
### Operating system
Google Colab on Google Chrome Browser
### Cartopy version
Not sure. 0.19.0?
</details>
| closed | 2021-09-15T21:46:52Z | 2021-09-17T09:26:15Z | https://github.com/SciTools/cartopy/issues/1869 | [] | HaynesStephens | 13 |
taverntesting/tavern | pytest | 596 | Help needed: content-type; request body elements order, printing the response | I am trying out Tavern with the Restful-Booker API. One of the test cases is for [creating a booking](https://restful-booker.herokuapp.com/apidoc/index.html#api-Booking-CreateBooking). For some reason, I keep getting content-type `text/html` in the response instead of `application/json`, although that is the content-type I am sending through the request headers. Can anybody spot the issue here?
It also seems that the request body key-value pairs are not being sent in the order I wrote them; can I somehow ensure the order? I was also unable to print the actual response to see what I actually get. **EDIT**: I did manage to print the response; it is `<Response [200]>`, but there is nothing else in the body. I expect to get a booking id and a booking object in JSON format.
```
---
test_name: Create a booking

includes:
  - !include ..\utils.yaml

stages:
  - name: Post a new booking
    request:
      url: "{environment.url}/booking"
      headers:
        content-type: application/json
        accept: application/json
      method: POST
      json:
        firstname: "{data.user_name}"
        lastname: "{data.user_surname}"
        totalprice: "{data.booking_price}"
        depositpaid: "{data.dep_paid}"
        bookingdates:
          checkin: "{data.date_checkin}"
          checkout: "{data.date_checkout}"
        additionalneeds: "{data.add_needs}"
    response:
      status_code: 200
      headers:
        content-type: application/json
      save:
        json:
          booking_id: bookingid
```
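One possible culprit (an assumption on my part, not something confirmed by the logs): Tavern substitutes formatted values such as `"{data.booking_price}"` as strings, and restful-booker may reply with an HTML error page when the JSON field types are wrong. If your Tavern version supports type-conversion tokens, the numeric fields can be forced to the right type, roughly:

```yaml
      json:
        totalprice: !int "{data.booking_price}"
        # depositpaid may need a similar boolean conversion if your
        # Tavern version provides one (e.g. !bool)
        depositpaid: "{data.dep_paid}"
```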
This is the error summary I get when running this test with tox (tox -e tavern):
```
Errors:
E tavern.util.exceptions.TestFailError: Test 'Post a new booking' failed:
- Key mismatch: (expected["content-type"] = 'application/json' (type = <class 'tavern.util.dict_util._FormattedString'>), actual["content-type"] = 'text/html; charset=utf-8' (type = <class 'str'>))
- No json in response (wanted to save {'booking_id': 'bookingid'})
```
 | closed | 2020-09-07T18:26:37Z | 2021-01-30T16:38:51Z | https://github.com/taverntesting/tavern/issues/596 | [] | api-girl | 4 |
ydataai/ydata-profiling | jupyter | 800 | Ambiguous config usage documentation | Documentation about [how to alter the configurations](URL) directly starts with `profile = df.profile_report(title="Pandas Profiling Report", pool_size=1)` and it is absolutely not clear what the `df` is. If it is a pandas DataFrame, it does not have `profile_report` as a defined method. | open | 2021-06-19T13:03:36Z | 2021-10-14T09:45:46Z | https://github.com/ydataai/ydata-profiling/issues/800 | [
"documentation 📖",
"Hacktoberfest :fireworks:"
] | nitinmnsn | 3 |
jupyterhub/jupyterhub-deploy-docker | jupyter | 107 | build failed "No module named 'conda'" | ### description
build failed
### context:
- no custom environment variables
- Docker version 20.10.5, build 55c4c88
- docker-compose version 1.29.2, build 5becea4c
### how to reproduce
1. git clone https://github.com/jupyterhub/jupyterhub-deploy-docker
1. mkdir secrets
1. echo -e admin admin > secrets/userlist
1. generate self signed key/cert:
- openssl genrsa 2048 > server.key
- chmod 400 server.key
- openssl req -new -x509 -nodes -sha256 -days 365 -key server.key -out server.crt
1. make build
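For step 4, a non-interactive version of the key/cert generation (a sketch; the `-subj` value is an assumption, and the file locations may need to match wherever the Makefile expects them):

```shell
# Self-signed key/cert for local testing; -subj skips the interactive prompts.
openssl genrsa -out server.key 2048
chmod 400 server.key
openssl req -new -x509 -nodes -sha256 -days 365 \
  -key server.key -out server.crt -subj "/CN=localhost"
```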
### logs
```
> make build
Generating postgres password in secrets/postgres.env
fa0f2834b7b97136c8539fc21aaf25fdc9a07b2c14fb6a2a257278654c4b1a6e
jupyterhub-data
jupyterhub-db-data
docker-compose build
hub-db uses an image, skipping
Building hub
Sending build context to Docker daemon 295.4kB
Step 1/9 : ARG JUPYTERHUB_VERSION
Step 2/9 : FROM jupyterhub/jupyterhub-onbuild:$JUPYTERHUB_VERSION
0.9.2: Pulling from jupyterhub/jupyterhub-onbuild
c64513b74145: Pull complete
01b8b12bad90: Pull complete
c5d85cf7a05f: Pull complete
b6b268720157: Pull complete
e12192999ff1: Pull complete
256ccce151e7: Pull complete
f0d04906c48f: Pull complete
3f8d9ed60d29: Pull complete
9a4e3ab6e62d: Pull complete
39d4f72bd19a: Pull complete
Digest: sha256:c44611b52a3a64494ba51cf4f2abb2cbc17ceba325e5fe172be80c6e6c3a5be8
Status: Downloaded newer image for jupyterhub/jupyterhub-onbuild:0.9.2
# Executing 1 build trigger
---> 119c8c9149d7
Step 3/9 : RUN /opt/conda/bin/conda install -yq psycopg2=2.7 && /opt/conda/bin/conda clean -tipsy && /opt/conda/bin/pip install --no-cache-dir jupyterhub-dummyauthenticator dockerspawner==12.0.*
---> Running in 65a57f248a0d
Solving environment: ...working... done
## Package Plan ##
environment location: /opt/conda
added / updated specs:
- psycopg2=2.7
The following packages will be downloaded:
package | build
---------------------------|-----------------
libstdcxx-ng-9.3.0 | hd4cf53a_17 4.0 MB
ca-certificates-2021.5.25 | h06a4308_1 118 KB
libffi-3.2.1 | hf484d3e_1007 52 KB
openssl-1.1.1k | h27cfd23_0 3.8 MB
ncurses-6.2 | he6710b0_1 1.1 MB
libgomp-9.3.0 | h5101ec6_17 378 KB
psycopg2-2.7.6.1 | py37h1ba5d50_0 308 KB
readline-7.0 | h7b6447c_5 392 KB
wheel-0.36.2 | pyhd3eb1b0_0 31 KB
zlib-1.2.11 | h7b6447c_3 120 KB
libpq-11.2 | h20c2e04_0 2.7 MB
pip-21.1.2 | py37h06a4308_0 2.0 MB
_libgcc_mutex-0.1 | main 3 KB
setuptools-52.0.0 | py37h06a4308_0 921 KB
ld_impl_linux-64-2.35.1 | h7274673_9 637 KB
krb5-1.16.4 | h173b8e3_0 1.4 MB
sqlite-3.33.0 | h62c20be_0 2.0 MB
libedit-3.1.20210216 | h27cfd23_1 190 KB
certifi-2021.5.30 | py37h06a4308_0 141 KB
xz-5.2.5 | h7b6447c_0 438 KB
libgcc-ng-9.3.0 | h5101ec6_17 7.8 MB
python-3.7.7 |h191fe78_0_cpython 52.6 MB
_openmp_mutex-4.5 | 1_gnu 22 KB
------------------------------------------------------------
Total: 81.0 MB
The following NEW packages will be INSTALLED:
_libgcc_mutex: 0.1-main
_openmp_mutex: 4.5-1_gnu
krb5: 1.16.4-h173b8e3_0
ld_impl_linux-64: 2.35.1-h7274673_9
libgomp: 9.3.0-h5101ec6_17
libpq: 11.2-h20c2e04_0
psycopg2: 2.7.6.1-py37h1ba5d50_0
The following packages will be UPDATED:
ca-certificates: 2018.4.16-0 conda-forge --> 2021.5.25-h06a4308_1
certifi: 2018.8.13-py36_0 conda-forge --> 2021.5.30-py37h06a4308_0
decorator: 4.3.0-py_0 conda-forge --> 4.3.0-py_0 conda-forge
ipython_genutils: 0.2.0-py_1 conda-forge --> 0.2.0-py_1 conda-forge
jinja2: 2.10-py_1 conda-forge --> 2.10-py_1 conda-forge
libedit: 3.1.20170329-haf1bffa_1 conda-forge --> 3.1.20210216-h27cfd23_1
libffi: 3.2.1-hd88cf55_4 --> 3.2.1-hf484d3e_1007
libgcc-ng: 7.2.0-hdf63c60_3 --> 9.3.0-h5101ec6_17
libstdcxx-ng: 7.2.0-hdf63c60_3 --> 9.3.0-hd4cf53a_17
ncurses: 6.1-hfc679d8_1 conda-forge --> 6.2-he6710b0_1
openssl: 1.0.2o-h470a237_1 conda-forge --> 1.1.1k-h27cfd23_0
pip: 18.0-py36_1 conda-forge --> 21.1.2-py37h06a4308_0
python: 3.6.6-h5001a0f_0 conda-forge --> 3.7.7-h191fe78_0_cpython
readline: 7.0-ha6073c6_4 --> 7.0-h7b6447c_5
setuptools: 39.0.1-py36_0 --> 52.0.0-py37h06a4308_0
sqlite: 3.24.0-h2f33b56_0 conda-forge --> 3.33.0-h62c20be_0
wheel: 0.31.0-py36_0 --> 0.36.2-pyhd3eb1b0_0
xz: 5.2.3-h55aa19d_2 --> 5.2.5-h7b6447c_0
zlib: 1.2.11-ha838bed_2 --> 1.2.11-h7b6447c_3
Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working...
done
Traceback (most recent call last):
File "/opt/conda/bin/conda", line 7, in <module>
from conda.cli import main
ModuleNotFoundError: No module named 'conda'
The command '/bin/sh -c /opt/conda/bin/conda install -yq psycopg2=2.7 && /opt/conda/bin/conda clean -tipsy && /opt/conda/bin/pip install --no-cache-dir jupyterhub-dummyauthenticator dockerspawner==12.0.*' returned a non-zero code: 1
ERROR: Service 'hub' failed to build : Build failed
make: *** [build] Error 1
``` | closed | 2021-06-30T15:33:30Z | 2022-12-05T00:47:26Z | https://github.com/jupyterhub/jupyterhub-deploy-docker/issues/107 | [] | juleslagarde | 7 |
jonaswinkler/paperless-ng | django | 816 | [BUG] Cannot change base href |
**Describe the bug**
I'm using paperless behind a Caddy reverse proxy and I cannot get it working correctly, because when I open:
https://example.com/paperlessng/dashboard I'm being redirected by the paperless-ng server to:
https://example.com/accounts/login/?next=/paperlessng/dashboard
If I run it locally (192.168.X.X) I can log in using 192.168.0.2/accounts/login/ and after that I'm redirected to 192.168.0.2/paperlessng/accounts/login/.
What could be important is that the '/accounts/login/' page doesn't have a `<base href="/paperlessng/">` node, but the 'paperlessng/dashboard' page has it.
**To Reproduce**
Create paperless docker container using PAPERLESS_FORCE_SCRIPT_NAME option.
**Expected behavior**
During redirection base dir (/paperlessng) should not be stripped and I should be able to open https://example.com/paperlessng/accounts/login/?next=/paperlessng/dashboard
**Relevant information**
Caddy2 config
```
redir /paperlessng /paperlessng/
handle /paperlessng/* {
    reverse_proxy paperlessng:8000 {
        header_up X-Real-IP {remote_host}
    }
}
```
Docker compose
```
paperless:
  image: jonaswinkler/paperless-ng:latest
  container_name: paperlessng
  hostname: paperlessng
  restart: unless-stopped
  depends_on:
    - paperlessng_db
    - broker
  ports:
    - 8000:8000
  healthcheck:
    test: ["CMD", "curl", "-f", "http://localhost:8000"]
    interval: 30s
    timeout: 10s
    retries: 5
  volumes:
    - ./paperlessng/paperless/data:/usr/src/paperless/data
    - ./paperlessng/paperless/media:/usr/src/paperless/media
    - ./paperlessng/paperless/export:/usr/src/paperless/export
    - ./paperlessng/paperless/consume:/usr/src/paperless/consume
  environment:
    PAPERLESS_REDIS: redis://broker:6379
    PAPERLESS_DBHOST: paperlessng_db
    COMPOSE_PROJECT_NAME: paperless
    PAPERLESS_ALLOWED_HOSTS: "example.com,localhost,192.168.0.2"
    PAPERLESS_CORS_ALLOWED_HOSTS: "https://example.com,http://localhost,http://192.168.0.2"
    PAPERLESS_SECRET_KEY: "secret"
    PAPERLESS_FORCE_SCRIPT_NAME: "/paperlessng/"
```
| closed | 2021-03-24T08:45:57Z | 2021-03-24T11:35:40Z | https://github.com/jonaswinkler/paperless-ng/issues/816 | [] | ahaw | 1 |
AirtestProject/Airtest | automation | 1,035 | In the report layout, poco-recorded elements overflow the screen | In the report layout, elements recorded with poco overflow the screen. It is a styling issue, as shown in the image below.
Environment:
mac
python 3.9
airtest==1.2.4
<img width="1395" alt="image" src="https://user-images.githubusercontent.com/29191106/161370843-0cf8ea9b-cec8-44cd-b69a-03a1118b0444.png">
| open | 2022-04-02T06:49:39Z | 2022-04-02T06:49:39Z | https://github.com/AirtestProject/Airtest/issues/1035 | [] | Pactortester | 0 |
ashnkumar/sketch-code | tensorflow | 21 | How to use multi .h5 files generated by training my own data? | When I trained on my own data, it generated multiple .h5 files; how do I use them? | closed | 2019-05-27T05:26:49Z | 2019-07-17T13:45:55Z | https://github.com/ashnkumar/sketch-code/issues/21 | [] | lewis617 | 1 |
thp/urlwatch | automation | 247 | Allow --edit even when the urls.yaml file is currently invalid | Once there is a configuration error the `--edit`, `--edit-config` and `--edit-hooks` commands no longer work.
For example if the `urls.yaml` looks like this:
```
name: test
ur: abc
```
There will be an error when invoking the `--edit` command:
```
EDITOR=vim urlwatch --edit
Traceback (most recent call last):
  File "/usr/bin/urlwatch", line 107, in <module>
    urlwatch = Urlwatch(command_config, config_storage, cache_storage, urls_storage)
  File "/usr/lib/python3.6/site-packages/urlwatch/main.py", line 63, in __init__
    self.load_jobs()
  File "/usr/lib/python3.6/site-packages/urlwatch/main.py", line 84, in load_jobs
    jobs = self.urls_storage.load_secure()
  File "/usr/lib/python3.6/site-packages/urlwatch/storage.py", line 233, in load_secure
    jobs = self.load()
  File "/usr/lib/python3.6/site-packages/urlwatch/storage.py", line 322, in load
    return [JobBase.unserialize(job) for job in yaml.load_all(fp) if job is not None]
  File "/usr/lib/python3.6/site-packages/urlwatch/storage.py", line 322, in <listcomp>
    return [JobBase.unserialize(job) for job in yaml.load_all(fp) if job is not None]
  File "/usr/lib/python3.6/site-packages/urlwatch/jobs.py", line 118, in unserialize
    raise ValueError('Kind is not specified, and no job matches: %r' % (data,))
ValueError: Kind is not specified, and no job matches: {'name': 'test', 'ur': 'abc'}
```
urlwatch should not parse the config when trying to edit it. Once misconfigured, it cannot be fixed without editing the files directly. There should also be a basic syntax check when saving the file through the `--edit` command. The same is true for any other configuration file (hooks.py, urlwatch.yaml). | closed | 2018-06-06T18:22:10Z | 2018-11-11T20:48:37Z | https://github.com/thp/urlwatch/issues/247 | [] | kbabioch | 4 |
ydataai/ydata-profiling | data-science | 837 | importing ProfileReport seems to interfere with pandas plotting | **Describe the bug**
Running
```
from pandas_profiling import ProfileReport
import pandas as pd
```
seems to interfere with pandas plotting
**To Reproduce**
Run this as a cell in JupyterLab (the behavior may be the same in Jupyter Notebook)
```
import pandas as pd
from pandas_profiling import ProfileReport
df = pd.DataFrame(data={"col1": [1, 2]})
df.plot()
```
<img width="507" alt="Screen Shot 2021-09-28 at 5 02 21 PM" src="https://user-images.githubusercontent.com/17162724/135165192-00755b6e-b275-436e-92f1-0958059b1903.png">
Now the same but comment out pandas_profiling
<img width="503" alt="Screen Shot 2021-09-28 at 5 03 39 PM" src="https://user-images.githubusercontent.com/17162724/135165389-0489d2a0-d6bf-418f-b0a5-f784fef759e2.png">
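If the import is switching the active matplotlib backend (an assumption based on the symptom, not a confirmed diagnosis), a quick way to check and work around it is to capture the backend before the import and restore it afterwards:

```python
import matplotlib

saved = matplotlib.get_backend()  # backend the notebook session was using
# from pandas_profiling import ProfileReport  # the import under suspicion

if matplotlib.get_backend() != saved:
    matplotlib.use(saved, force=True)  # put the original backend back
```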
**Version information:**
pandas-profiling 3.1.0 | open | 2021-09-28T21:04:02Z | 2021-10-11T01:21:43Z | https://github.com/ydataai/ydata-profiling/issues/837 | [
"bug 🐛",
"Hacktoberfest :fireworks:"
] | raybellwaves | 2 |
dask/dask | pandas | 11,044 | FileExpired exception when reading parquet from a Minio bucket using Dask | <!-- Please include a self-contained copy-pastable example that generates the issue if possible.
Please be concise with code posted. See guidelines below on how to provide a good bug report:
- Craft Minimal Bug Reports http://matthewrocklin.com/blog/work/2018/02/28/minimal-bug-reports
- Minimal Complete Verifiable Examples https://stackoverflow.com/help/mcve
Bug reports that follow these guidelines are easier to diagnose, and so are often handled much more quickly.
-->
**Describe the issue**:
I have a list of dataframes in a Minio bucket that are updated every 15 minutes. My script runs inside a Docker container in a loop and every 15 minutes a list of futures is created to read and preprocess every dataframe in the bucket. When computing the result it happens sometimes that the following exception is triggered:
```
s3fs.utils.FileExpired: [Errno 16] The remote file corresponding to filename filename.part.0.parquet and Etag "76b9a0ed3044b29e8a326d6f4ade2036" no longer exists
```
and triggers `TypeError: __init__() missing 1 required positional argument: 'e_tag'`. Even catching the exception does not solve the problem since during the next iteration it is triggered again. I checked on Minio and the Etag effectively does not correspond to that in the exception but I do not know how to solve this problem. The code to read data is this
```
df = dd.read_parquet('s3://{}/{}/{}'.format(bucket, path, filename),
storage_options={
"key": key,
"secret": secret,
"client_kwargs": {'endpoint_url': 'http://{}'.format(minio_endpoint)}
})
```
**Minimal Complete Verifiable Example**:
Providing a verifiable example is difficult since it runs on Docker and it is the result of various interacting scripts. I tried to replicate it running a simple script outside of docker but the problem does not appear. This is the script I used that is similar to what the original script does.
```
import logging

import dask.dataframe as dd
from dask.distributed import Client, LocalCluster
from minio import Minio


def load_user(bucket, path, file, minio_endpoint, key, secret):
    print(f'Reading {file}...')
    df = dd.read_parquet('s3://{}/{}/{}'.format(bucket, path, hash_filename(file)),
                         storage_options={
                             "key": key,
                             "secret": secret,
                             "client_kwargs": {'endpoint_url': 'http://{}'.format(minio_endpoint)}
                         },
                         ignore_metadata_file=True)
    return df


if __name__ == "__main__":
    bucket = ...  # bucket name
    path = ...  # path
    filenames = ...  # list of filenames
    cluster = LocalCluster(silence_logs=logging.CRITICAL)
    dask_client = Client(cluster)
    minio_client = Minio(endpoint, key, secret, secure=False)
    minio_endpoint = minio_client._base_url.host
    key = minio_client._provider._credentials.access_key
    secret = minio_client._provider._credentials.secret_key
    while True:
        futures = []
        for file in filenames:
            future = dask_client.submit(load_user,
                                        bucket,
                                        path,
                                        file,
                                        minio_endpoint,
                                        key,
                                        secret)
            futures.append(future)
        for user, future in zip(filenames, futures):
            df = future.result()
```
**Anything else we need to know?**:
I have tried using s3fs's `invalidate_cache()` function and setting `ignore_metadata_file=True` when reading the data, but it didn't work. Catching the exception works, but the problem is not solved during the following iteration.
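Since catching the exception once is not enough here, one generic pattern worth trying (a sketch, not a confirmed fix; it assumes `s3fs.utils.FileExpired` subclasses `OSError` and that the stale state clears between attempts, e.g. after invalidating caches) is to retry the read a few times:

```python
import time

def retry_on_expired(fn, retries=3, delay=1.0, retryable=(OSError,)):
    """Call fn(), retrying when a retryable error is raised (s3fs's
    FileExpired is an OSError subclass); re-raise after the last attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except retryable:
            if attempt == retries - 1:
                raise
            time.sleep(delay)
```

For example, `df = retry_on_expired(lambda: future.result())` in the loop above.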
Here is the complete traceback if you find it useful
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 113, in _error_wrapper
    return await func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/aiobotocore/client.py", line 408, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (PreconditionFailed) when calling the GetObject operation: At least one of the pre-conditions you specified did not hold

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 2300, in _fetch_range
    return _fetch_range(
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 2462, in _fetch_range
    return sync(fs.loop, _inner_fetch, fs, bucket, key, version_id, start, end, req_kw)
  File "/usr/local/lib/python3.9/site-packages/fsspec/asyn.py", line 103, in sync
    raise return_result
  File "/usr/local/lib/python3.9/site-packages/fsspec/asyn.py", line 56, in _runner
    result[0] = await coro
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 2480, in _inner_fetch
    return await _error_wrapper(_call_and_read, retries=fs.retries)
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 142, in _error_wrapper
    raise err
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 113, in _error_wrapper
    return await func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 2467, in _call_and_read
    resp = await fs._call_s3(
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 362, in _call_s3
    return await _error_wrapper(
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 142, in _error_wrapper
    raise err
OSError: [Errno 22] At least one of the pre-conditions you specified did not hold

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/dask/backends.py", line 141, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/dask/dataframe/io/parquet/core.py", line 529, in read_parquet
    read_metadata_result = engine.read_metadata(
  File "/usr/local/lib/python3.9/site-packages/dask/dataframe/io/parquet/arrow.py", line 546, in read_metadata
    dataset_info = cls._collect_dataset_info(
  File "/usr/local/lib/python3.9/site-packages/dask/dataframe/io/parquet/arrow.py", line 1061, in _collect_dataset_info
    ds = pa_ds.dataset(
  File "/usr/local/lib/python3.9/site-packages/pyarrow/dataset.py", line 785, in dataset
    return _filesystem_dataset(source, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/pyarrow/dataset.py", line 475, in _filesystem_dataset
    return factory.finish(schema)
  File "pyarrow/_dataset.pyx", line 3025, in pyarrow._dataset.DatasetFactory.finish
  File "pyarrow/error.pxi", line 154, in pyarrow.lib.pyarrow_internal_check_status
  File "pyarrow/error.pxi", line 88, in pyarrow.lib.check_status
  File "/usr/local/lib/python3.9/site-packages/fsspec/spec.py", line 1846, in read
    out = self.cache._fetch(self.loc, self.loc + length)
  File "/usr/local/lib/python3.9/site-packages/fsspec/caching.py", line 189, in _fetch
    self.cache = self.fetcher(start, end)  # new block replaces old
  File "/usr/local/lib/python3.9/site-packages/s3fs/core.py", line 2312, in _fetch_range
    raise FileExpired(
s3fs.utils.FileExpired: [Errno 16] The remote file corresponding to filename filename.part.0.parquet and Etag "76b9a0ed3044b29e8a326d6f4ade2036" no longer exists.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/python-docker/run.py", line 19, in <module>
    service.start()
  File "/python-docker/chimera_ml_app/services.py", line 130, in start
    self.channel.start_consuming()
  File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 1883, in start_consuming
    self._process_data_events(time_limit=None)
  File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 2044, in _process_data_events
    self.connection.process_data_events(time_limit=time_limit)
  File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 851, in process_data_events
    self._dispatch_channel_events()
  File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 567, in _dispatch_channel_events
    impl_channel._get_cookie()._dispatch_events()
  File "/usr/local/lib/python3.9/site-packages/pika/adapters/blocking_connection.py", line 1510, in _dispatch_events
    consumer_info.on_message_callback(self, evt.method,
  File "/python-docker/chimera_ml_app/services.py", line 157, in on_request
    response = chimera.ml.predict(body=request)
  File "src/dependency_injector/_cwiring.pyx", line 28, in dependency_injector._cwiring._get_sync_patched._patched
  File "/python-docker/chimera_ml_app/ml.py", line 20, in predict
    predictions = predictor.predict()
  File "/usr/local/lib/python3.9/site-packages/hijacking_detection/pipelines/PipelinePredictorKATANA.py", line 125, in predict
    preprocessing.preprocess(path, save_path)
  File "/usr/local/lib/python3.9/site-packages/hijacking_detection/preprocessing/PreprocessingTestResample.py", line 55, in preprocess
    dataframe = future.result()
  File "/usr/local/lib/python3.9/site-packages/distributed/client.py", line 323, in result
    return self.client.sync(self._result, callback_timeout=timeout)
  File "/usr/local/lib/python3.9/site-packages/hijacking_detection/preprocessing/PreprocessingTestResample.py", line 91, in load_user
    df = dd.read_parquet('s3://{}/{}'.format(bucket, path),
  File "/usr/local/lib/python3.9/site-packages/dask/backends.py", line 143, in wrapper
    raise type(e)(
TypeError: __init__() missing 1 required positional argument: 'e_tag'
```
**Environment**:
- Dask version: 2024.2.1
- Python version: 3.9.13
- Operating System: Docker running [python:3.9.17-slim-buster](https://hub.docker.com/layers/library/python/3.9.17-slim-buster/images/sha256-d5cca64dca485c37ccf06721e36a93bf4331b0404bfd3ef2c7873867965359b7?context=explore)
- Install method (conda, pip, source): pip
| open | 2024-04-11T07:24:36Z | 2024-04-18T07:53:51Z | https://github.com/dask/dask/issues/11044 | [
"dataframe",
"parquet"
] | PietroSpalluto | 6 |
ageitgey/face_recognition | machine-learning | 1,280 | Failing to Detect faces from IPhone images | * face_recognition version: 1.2.3
* Python version: 3.9.1
* Operating System: Mac
### Description
I am working on a website to do emotion detection on faces from images. The first step is to detect faces in an image.
This mostly works fine, but whenever I upload images straight from my iPhone to the website, no faces are detected.
The images are properly converted to numpy arrays, but no faces are recognized.
### What I Did
`image = face_recognition.load_image_file(image_fp)`
`face_locations = face_recognition.face_locations(image)`
`print(image.shape)` returns the expected shape so `face_recognition.load_image_file` is working. What could be going wrong?
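One thing worth ruling out (an assumption about the cause, not a confirmed diagnosis): iPhone photos usually carry an EXIF Orientation tag, and loaders that ignore it hand you an array rotated by 90 or 180 degrees, which can make detection fail even though `image.shape` looks fine. A sketch that normalizes orientation before detection:

```python
import numpy as np
from PIL import Image, ImageOps

def load_upright(image_fp):
    """Open an image, apply its EXIF Orientation tag if present, and return
    an RGB uint8 array suitable for face_recognition.face_locations()."""
    img = ImageOps.exif_transpose(Image.open(image_fp))
    return np.array(img.convert("RGB"))
```

Then `face_locations = face_recognition.face_locations(load_upright(image_fp))`. Another possibility to check (also an assumption) is that the uploads are HEIC rather than JPEG, in which case they would need conversion before loading.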
| open | 2021-02-07T18:13:27Z | 2021-11-29T19:53:55Z | https://github.com/ageitgey/face_recognition/issues/1280 | [] | cullena20 | 3 |
OpenInterpreter/open-interpreter | python | 1,297 | Should not crash from running out of money | ### Describe the bug
Crashes when OpenAI account runs out of money
### Reproduce
Use Open Interpreter with little money in the account until it runs out.
### Expected behavior
It should print the error message without crashing or dropping the conversation, so you can replenish the account and continue.
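A sketch of the requested behavior (hypothetical wrapper, not open-interpreter's actual code path): catch the provider error, report it, and keep the session alive so the conversation is not lost:

```python
def safe_chat(interpreter, message):
    """Run one chat turn; on a provider error (e.g. litellm's RateLimitError
    for an exhausted quota) print it instead of letting the process die."""
    try:
        return interpreter.chat(message)
    except Exception as e:
        print(f"Provider error: {e}")
        print("Replenish the account, then resend your message to continue.")
        return None
```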
### Screenshots
```
raise RateLimitError(
litellm.exceptions.RateLimitError: RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}
```
### Open Interpreter version
0.2.5
### Python version
Python 3.12.3
### Operating System name and version
Windows 10
### Additional context
_No response_ | open | 2024-06-08T15:18:19Z | 2024-06-08T15:18:52Z | https://github.com/OpenInterpreter/open-interpreter/issues/1297 | [] | endolith | 0 |
waditu/tushare | pandas | 1,376 | pro.hk_hold - suggestions on returned data and API parameters | ### 1. Suggestion: support fetching SH and SZ data at the same time
Problem description: the `exchange` parameter cannot be set to `['SH', 'SZ']` or `'SH, SZ'` to fetch data for both markets at once.
level: minor
### 2. Suggestion: exclude Saturday data
Problem description: `pro.hk_hold` returns data for Saturdays, identical to Friday's.
level: minor
```
In [211]: df = pro.hk_hold(trade_date='20200516', exchange='SZ')
In [212]: df
Out[212]:
code trade_date ts_code name vol ratio exchange
0 70001 20200516 000001.SZ 平安银行 1659800356 8.55 SZ
1 70002 20200516 000002.SZ 万 科A 478036252 4.91 SZ
2 70005 20200516 000005.SZ 世纪星源 2010 0.00 SZ
3 70006 20200516 000006.SZ 深振业A 19060581 1.41 SZ
4 70008 20200516 000008.SZ 神州高铁 17847662 0.64 SZ
... ... ... ... ... ... ... ...
1154 77765 20200516 300765.SZ 新诺威 405425 0.09 SZ
1155 77766 20200516 300766.SZ 每日互动 637412 0.15 SZ
1156 77768 20200516 300768.SZ 迪普科技 718660 0.17 SZ
1157 77770 20200516 300770.SZ 新媒股份 964811 0.75 SZ
1158 77773 20200516 300773.SZ 拉卡拉 2311627 0.57 SZ
[1159 rows x 7 columns]
In [213]: df = pro.hk_hold(trade_date='20200515', exchange='SZ')
In [214]: df
Out[214]:
code trade_date ts_code name vol ratio exchange
0 70001 20200515 000001.SZ 平安银行 1659800356 8.55 SZ
1 70002 20200515 000002.SZ 万 科A 478036252 4.91 SZ
2 70005 20200515 000005.SZ 世纪星源 2010 0.00 SZ
3 70006 20200515 000006.SZ 深振业A 19060581 1.41 SZ
4 70008 20200515 000008.SZ 神州高铁 17847662 0.64 SZ
... ... ... ... ... ... ... ...
1154 77765 20200515 300765.SZ 新诺威 405425 0.09 SZ
1155 77766 20200515 300766.SZ 每日互动 637412 0.15 SZ
1156 77768 20200515 300768.SZ 迪普科技 718660 0.17 SZ
1157 77770 20200515 300770.SZ 新媒股份 964811 0.75 SZ
1158 77773 20200515 300773.SZ 拉卡拉 2311627 0.57 SZ
[1159 rows x 7 columns]
In [215]: df = pro.hk_hold(trade_date='20200517', exchange='SZ')
In [216]: df
Out[216]:
Empty DataFrame
Columns: [code, trade_date, ts_code, name, vol, ratio, exchange]
Index: []
``` | closed | 2020-06-14T03:32:27Z | 2020-06-15T10:48:46Z | https://github.com/waditu/tushare/issues/1376 | [] | chrjxj | 1 |
gyli/PyWaffle | data-visualization | 19 | Font awesome license | I notice that files from the Font Awesome project are being used. The project's license [requires attribution](https://fontawesome.com/license/free). IANAL, and I am not affiliated with that project, but I think there probably needs to be a license file for these files, and this needs to be in the MANIFEST.in file (see #18) so it is included in sdists. | closed | 2019-11-19T16:37:48Z | 2020-01-27T01:43:39Z | https://github.com/gyli/PyWaffle/issues/19 | [] | toddrme2178 | 2 |
ultralytics/ultralytics | pytorch | 19,283 | Loss of frames in inference on Yolo11 and greyed out output video | ### Search before asking
- [x] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and found no similar bug report.
### Ultralytics YOLO Component
Predict
### Bug
I'm having the same problem. I'm processing RTSP streams and generating videos based on the inferences made, but the video runs normally for a few minutes, then it starts to turn gray and I lose a lot of frames: from 10 minutes of processing I get an output video of only around 6 to 7 minutes. I've already tried all the suggestions above, but nothing worked. I'm using YOLO11.
During inference I also get the following error: `[hevc @ 0x7f323821d480] Could not find ref with POC 39`
I saw in some other posts that people changed the video codec, testing XVID, MP4V, and MJPG, but I'm still having both problems. Does anyone have any suggestions?
**Example image from video**

### Environment
`[hevc @ 0x7f323821d480] Could not find ref with POC 39`
### Minimal Reproducible Example
```python
import cv2
import time
import threading
from ultralytics import YOLO

def process_stream(rtsp_url, output_name, model_path, id):
    capture = cv2.VideoCapture(rtsp_url, cv2.CAP_FFMPEG)
    if not capture.isOpened():
        print(f"Erro ao abrir stream: {rtsp_url}, tentando novamente em 5 segundos...")
        return
    print(f"Iniciando processamento: {id} com modelo {model_path}")
    model = YOLO(model_path)
    capture.set(cv2.CAP_PROP_BUFFERSIZE, 2)
    capture.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
    capture.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
    fps = capture.get(cv2.CAP_PROP_FPS)
    width, height = 1920, 1080
    fourcc = cv2.VideoWriter_fourcc(*"MJPG")
    output = cv2.VideoWriter(output_name, fourcc, fps_target, (width, height))
    frame_count = 0
    last_valid_frame = None
    start_total_time = time.time()
    last_inference = time.time()
    while capture.isOpened():
        if time.time() - start_total_time >= 600:
            break
        ret, frame = capture.read()
        if not ret:
            print(f"Erro ao capturar frame de {rtsp_url}, tentando reconectar...")
            capture.release()
            time.sleep(2)
            break
        if frame_count % max(1, int(fps // fps_target)) == 0:
            results = model.predict(frame, device=1, imgsz=1920, stream_buffer=True)
            if results and len(results) > 0:
                frame = results[0].plot()
                last_valid_frame = frame
            else:
                frame = last_valid_frame if last_valid_frame is not None else frame
            output.write(frame)
            print(f"Inferencia demorou: {(time.time() - last_inference):.3f}s id: {id} usando {model_path}")
            last_inference = time.time()
        frame_count += 1
    capture.release()
    output.release()
    print(f"Stream salvo como: {output_name}")

with open("list.streams", "r") as file:
    streams = [line.strip() for line in file.readlines() if line.strip()]
with open("models.txt", "r") as fileModels:
    model_paths = [line.strip() for line in fileModels.readlines() if line.strip()]

fps_target = 5.0
if len(model_paths) < len(streams):
    print("Erro: número de modelos insuficiente para o número de streams.")
    exit(1)

threads = []
for id, (rtsp, model_path) in enumerate(zip(streams, model_paths)):
    output_file = f"output_stream_{id+1}.avi"
    t = threading.Thread(target=process_stream, args=(rtsp, output_file, model_path, id))
    t.start()
    threads.append(t)
for t in threads:
    t.join()
```
### Additional
_No response_
### Are you willing to submit a PR?
- [x] Yes I'd like to help by submitting a PR! | open | 2025-02-17T13:51:09Z | 2025-02-19T10:08:40Z | https://github.com/ultralytics/ultralytics/issues/19283 | [
"question",
"detect"
] | kaueciglioni | 7 |
qubvel-org/segmentation_models.pytorch | computer-vision | 591 | wish more implements~ | closed | 2022-04-20T08:51:30Z | 2022-06-27T02:20:18Z | https://github.com/qubvel-org/segmentation_models.pytorch/issues/591 | [
"Stale"
] | yonghongwu | 2 |
mage-ai/mage-ai | data-science | 5,402 | [BUG] Kubernetes API Failures During Job Creation | ### Mage version
0.9.72
### Describe the bug
Currently, when using Mage to run jobs in a Kubernetes environment, there is an issue where failures caused by the Kubernetes API being unavailable or unreachable are not handled properly. These failures can happen due to reasons like temporary unavailability of Kubernetes nodes, control plane issues, or network-related problems.
When the Kubernetes API fails to respond during the job creation process, the job fails to start, and no retries are attempted. As a result, the job is never executed, and no logs are generated, leading to silent failures that are difficult to detect and debug.
While logs and error handling work correctly when there are coding issues during block execution, no mechanism currently exists to handle failures due to API unavailability. This makes it difficult to identify and resolve problems when the Kubernetes API is temporarily down, as these failures are not being captured or retried, leaving critical jobs unexecuted without any notification or trace of the issue.
This behavior negatively impacts the reliability of job execution in Mage, especially in environments where the Kubernetes control plane can experience intermittent downtime.
### Additional Concern
Also, I have seen that you can pass values for the job that is created in the metadata.yaml:
```
k8s_executor_config:
  job:
    active_deadline_seconds: 1400
    backoff_limit: 6
    ttl_seconds_after_finished: 4000
```
These values are used when creating the job:
```
spec = client.V1JobSpec(
    template=template,
    active_deadline_seconds=k8s_config.job_config.active_deadline_seconds,
    backoff_limit=k8s_config.job_config.backoff_limit,
    ttl_seconds_after_finished=k8s_config.job_config.ttl_seconds_after_finished)
```
I’m experiencing an issue where the Kubernetes Job does not retry the number of times specified in the backoffLimit. Instead, the job gets deleted after the first failure or when it is successful. Occasionally, the job runs twice before being deleted, even though the backoffLimit is set to a higher number (e.g., 4). This suggests that the Job is being deleted prematurely, preventing Kubernetes from managing the retries as intended.
I think that this is the relevant code:
```
while not job_completed:
    api_response = self.batch_api_client.read_namespaced_job(
        name=self.job_name,
        namespace=self.namespace
    )
    if api_response.status.succeeded is not None or \
            api_response.status.failed is not None:
        job_completed = True
    time.sleep(5)
    # self._print(f'Job {self.job_name} status={api_response.status}')

self.delete_job()
self._print(f'Job {self.job_name} status={api_response.status}')
if api_response.status.succeeded is None:
    raise Exception(f'Failed to execute k8s job {self.job_name}')
```
I think that in the code above, the job status is polled until it reports either success or failure. But if one of the runs fails, `status.failed` is incremented and is no longer `None`, which doesn't mean the job is finished, since Kubernetes may still retry it. That's why, I think, it is sometimes executed twice.
Because the job fails while the loop is waiting 5 seconds before checking again, the retry launches the pod again; but since `api_response.status.failed` is not `None`, the loop exits and the job is marked as failed.
Is this expected? I don't quite understand why it is allowed to add these Job settings, but then they are not being used. Maybe it is my lack of understanding in the repository and they are being used elsewhere in the code.
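The completion check discussed above can be made robust by looking at the Job's terminal conditions instead of `status.failed` alone (a sketch based on Kubernetes Job semantics, where the `Complete`/`Failed` conditions mark the terminal state and retries up to `backoffLimit` happen before the `Failed` condition appears; this is not Mage's actual code):

```python
def job_finished(status):
    """True only when the Job reached a terminal state. A nonzero
    status.failed just counts failed pod attempts so far; Kubernetes may
    still retry them up to backoffLimit."""
    if status.succeeded:
        return True
    for cond in (status.conditions or []):
        if cond.type in ("Complete", "Failed") and cond.status == "True":
            return True
    return False
```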
### To reproduce
1. Set up a Kubernetes cluster and ensure that Mage is configured to execute jobs within the cluster.
2. Introduce an artificial failure or unavailability of the Kubernetes API, for example, by:
- Stopping the Kubernetes API temporarily (kubectl stop or equivalent on your platform).
- Simulating network issues between Mage and the Kubernetes control plane (e.g., by using a firewall rule or network policy to block traffic).
3. Attempt to run a job in Mage during the Kubernetes API unavailability.
- You can use any standard Mage job that interacts with the Kubernetes API for this test.
4. Observe the behavior:
- The job creation will fail silently because the Kubernetes API is unreachable.
- No retries will be attempted, and no logs or errors will be generated in Mage.
- The job will not be executed, and there is no feedback indicating that the failure occurred.
5. Once the Kubernetes API is available again, verify that the job did not get created or executed and that no error or retry mechanism was triggered.
### Expected behavior
1. Job Creation Attempt: When Mage tries to create a Kubernetes Job, if the Kubernetes API is unavailable or unreachable (due to network issues, control plane downtime, or other temporary failures), the job creation should not fail silently.
2. Retry on API Failures:
- If the Kubernetes API returns specific errors such as 502 (Bad Gateway), 503 (Service Unavailable), or 504 (Gateway Timeout), the system should attempt to retry the job creation rather than failing immediately.
- The retry attempts should occur automatically based on the configuration and should ensure the system tries to reach the Kubernetes API before failing.
3.Configurable Retry Mechanism:
- Developers should be able to configure the number of retries through a variable (e.g., in the k8s_executor_config), which allows flexibility based on system needs.
- The retry mechanism should handle specific error codes (e.g., 502, 503, 504), while ensuring that non-retryable errors (like invalid job configurations or authentication issues) fail immediately.
4. Logging and Feedback:
- If the Kubernetes API is temporarily unavailable, logs should capture these API failures and retries.
- Clear error messages should be generated if all retries fail, ensuring that the issue is visible and not a silent failure.
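A minimal sketch of the retry behavior described in points 2–4 (the helper and error class names are hypothetical, not Mage's actual API): retry only the transient gateway errors, fail fast on everything else, and log every attempt so the failure is never silent.

```python
import time

class ApiError(Exception):
    """Stand-in for a Kubernetes API error carrying an HTTP status code."""
    def __init__(self, status):
        super().__init__(f"API returned {status}")
        self.status = status

RETRYABLE = {502, 503, 504}  # Bad Gateway, Service Unavailable, Gateway Timeout

def create_job_with_retry(create, retries=3, delay=0.0, log=print):
    """Call `create()` up to `retries` times, retrying only transient errors."""
    for attempt in range(1, retries + 1):
        try:
            return create()
        except ApiError as err:
            if err.status not in RETRYABLE or attempt == retries:
                raise  # non-retryable error, or retry budget exhausted
            log(f"attempt {attempt} failed with {err.status}; retrying")
            time.sleep(delay)

# Simulated control plane that is unreachable twice, then recovers:
calls = {"n": 0}
def flaky_create():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ApiError(503)
    return "job-created"

assert create_job_with_retry(flaky_create, retries=4, log=lambda m: None) == "job-created"
assert calls["n"] == 3
```

The retry count and delay would come from `k8s_executor_config`, so the behavior stays configurable per deployment.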
# Additional Concern:
While configuring `backoffLimit` and `ttl_seconds_after_finished` is already possible, it seems that they are not functioning as expected. Even when these values are configured, the job is deleted immediately after it finishes or fails.
Is this behavior expected, or is there an issue with how these configurations are being handled in Mage? Specifically, `ttl_seconds_after_finished` seems to be ignored, as the job is removed instantly upon completion.
### Screenshots
_No response_
### Operating system
- Google Kubernetes Engine.
- Official Docker image
### Additional context
_No response_ | open | 2024-09-09T14:56:40Z | 2025-02-06T23:05:16Z | https://github.com/mage-ai/mage-ai/issues/5402 | [
"bug"
] | jordi-ydteam | 5 |
lanpa/tensorboardX | numpy | 152 | Implement support of Matplotlib figures | Hello @lanpa,
First of all, thanks a lot for investing the time and developing this project. I find it super useful whenever I try (and usually fail ;) hacking _PyTorch_.
Frequently, when I work, I need to create a _Matplotlib_ plot and add it to _TensorBoard_. There are known and well-documented ways to do this by converting the figure to an image. One of your [examples](https://github.com/lanpa/tensorboard-pytorch/blob/master/examples/matplotlib_demo.py) in this repo actually demonstrates it, and there are a [few other ways](https://github.com/wookayin/tensorflow-plot) to achieve the same result.
I was wondering if it would make sense to implement a `SummaryWriter.add_figure(...)` method that would accept a `matplotlib.Figure`, convert it into an RGB image, and then add it to _TensorBoard_. I think the implementation could look roughly like this:
```python
import matplotlib.pyplot as plt
import matplotlib.backends.backend_agg as plt_backend_agg
import numpy as np


class SummaryWriter:
    def add_figure(self, tag, figure, global_step=None, close_figure=True):
        # Render the figure with the Agg backend and grab the RGBA buffer.
        canvas = plt_backend_agg.FigureCanvasAgg(figure)
        canvas.draw()
        data = np.frombuffer(canvas.buffer_rgba(), dtype=np.uint8)
        w, h = figure.canvas.get_width_height()
        # Drop the alpha channel to get an HxWx3 RGB image.
        image = data.reshape([h, w, 4])[:, :, 0:3]
        if close_figure:
            plt.close(figure)
        self.add_image(tag, image, global_step)
```
Have a good one! | closed | 2018-05-31T17:18:56Z | 2018-06-02T17:07:37Z | https://github.com/lanpa/tensorboardX/issues/152 | [] | vojtamolda | 5 |
python-gino/gino | asyncio | 30 | Integrate with Sanic | Perhaps at least offer a request-scoped lazy-borrowed connection. | closed | 2017-08-07T06:25:26Z | 2017-08-14T04:43:53Z | https://github.com/python-gino/gino/issues/30 | [
"help wanted",
"task"
] | fantix | 0 |
opengeos/leafmap | plotly | 663 | Support add_stac_layer for leafmap.deckgl backend | <!-- Please search existing issues to avoid creating duplicates. -->
### Description
lonboard support was added for the deck.gl backend. However, I'm having difficulty integrating this functionality with STAC item visualizations that are supported in the ipyleaflet backend.
Below is a reproducible example
Acquiring a gdf of solar farm predictions from [SATLAS](https://github.com/allenai/satlas)
```python
import requests
import geopandas as gpd
import json
from pathlib import Path
solar_farm_detections_url = 'https://pub-956f3eb0f5974f37b9228e0a62f449bf.r2.dev/outputs/renewable/latest.geojson'
local_filename = 'renewable.geojson'
file_path = Path(local_filename)
if file_path.exists():
with open(file_path, 'r') as file:
geojson_data = json.load(file)
else:
response = requests.get(solar_farm_detections_url)
if response.status_code == 200:
geojson_data = response.json()
with open(file_path, 'w') as file:
json.dump(geojson_data, file)
else:
raise Exception(f"Failed to download the file: Status code {response.status_code}")
gdf = gpd.GeoDataFrame.from_features(geojson_data['features'])
solar_farms = gdf[gdf['category']=="solar_farm"]
```
Trying to visualize STAC Items with the predictions
```python
import pystac_client
import planetary_computer
from shapely.geometry import Point
import leafmap.deckgl as leafmap
area_of_interest = Point((-120.963122, 37.025915)) # wright solar farm lon lat
catalog = pystac_client.Client.open(
"https://planetarycomputer.microsoft.com/api/stac/v1",
modifier=planetary_computer.sign_inplace,
)
range_old = "2010-01-01/2013-01-01"
range_new = "2020-01-01/2021-01-01"
search_old = catalog.search(
collections=["naip"], intersects=area_of_interest, datetime=range_old
)
search_new = catalog.search(
collections=["naip"], intersects=area_of_interest, datetime=range_new
)
items_old = search_old.item_collection()
items_new = search_new.item_collection()
print(f"{len(items_old)} Items found in the 'old' range")
print(f"{len(items_new)} Items found in the 'new' range")
map = leafmap.Map(height=600, show_tooltip=True)
map.center = (area_of_interest.y, area_of_interest.x)
map.zoom = 2
# leafmap.stac_assets(collection="naip", item=items_old[0].id, titiler_endpoint="pc")
map.add_stac_layer(
collection="naip",
item=items_old[0].id,
assets=["image"],
name="Old naip image 2012 before solar development",
)
map.add_stac_layer(
collection="naip",
item=items_new[0].id,
assets=["image"],
name="New naip image 2016 before solar development",
)
range_old = "2015-01-01/2016-01-01"
range_new = "2020-01-01/2021-01-01"
max_cloud_cover = 15
search_old = catalog.search(
collections=["sentinel-2-l2a"], intersects=area_of_interest, datetime=range_old, query={"eo:cloud_cover": {"lt": max_cloud_cover}}
)
search_new = catalog.search(
collections=["sentinel-2-l2a"], intersects=area_of_interest, datetime=range_new, query={"eo:cloud_cover": {"lt": max_cloud_cover}}
)
items_old = search_old.item_collection()
items_new = search_new.item_collection()
print(f"{len(items_old)} Items found in the 'old' range")
print(f"{len(items_new)} Items found in the 'new' range")
# map = leafmap.Map()
# map.center = (area_of_interest.y, area_of_interest.x)
# map.zoom = 1
leafmap.stac_assets(collection="sentinel-2-l2a", item=items_old[0].id, titiler_endpoint="pc")
map.add_stac_layer(
collection="sentinel-2-l2a",
item=items_old[0].id,
assets=["B04", "B03", "B02"],
name="Old s2 image 2012 before solar development",
fit_bounds=False
)
map.add_stac_layer(
collection="sentinel-2-l2a",
item=items_new[0].id,
assets=["B04", "B03", "B02"],
name="New s2 image 2016 before solar development",
fit_bounds=False
)
map.add_vector(solar_farms)
map
```
### the error
```python
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[12], line 36
31 map.zoom = 2
33 # leafmap.stac_assets(collection="naip", item=items_old[0].id, titiler_endpoint="pc")
---> 36 map.add_stac_layer(
37 collection="naip",
38 item=items_old[0].id,
39 assets=["image"],
40 name="Old naip image 2012 before solar development",
41 )
42 map.add_stac_layer(
43 collection="naip",
44 item=items_new[0].id,
45 assets=["image"],
46 name="New naip image 2016 before solar development",
47 )
50 range_old = "2015-01-01/2016-01-01"
AttributeError: 'Map' object has no attribute 'add_stac_layer'
```
## Feature Request
Ideally it would be possible to visualize imagery with vectors plotted with lonboard.
| closed | 2024-01-18T20:47:32Z | 2024-01-20T04:40:36Z | https://github.com/opengeos/leafmap/issues/663 | [
"Feature Request"
] | rbavery | 2 |
lensacom/sparkit-learn | scikit-learn | 20 | Implement ArrayRDD.std() method | see http://docs.scipy.org/doc/numpy/reference/generated/numpy.ndarray.std.html#numpy.ndarray.std
see ArrayRDD.mean()
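A sketch of the moment-based reduction an `ArrayRDD.std()` could use, mirroring how `mean()` aggregates per block (plain Python lists stand in for the RDD's numpy blocks; the names are illustrative, not sparkit-learn's internals):

```python
import math
import statistics

def block_moments(block):
    """Per-block partial results: count, sum, and sum of squares."""
    return len(block), sum(block), sum(x * x for x in block)

def combined_std(blocks):
    """Combine block moments into the population std (numpy's default ddof=0)."""
    n = s = s2 = 0
    for block in blocks:  # on an RDD this would be a map followed by a reduce
        bn, bs, bs2 = block_moments(block)
        n, s, s2 = n + bn, s + bs, s2 + bs2
    mean = s / n
    return math.sqrt(s2 / n - mean * mean)

blocks = [[1.0, 2.0, 3.0], [4.0, 5.0]]
assert abs(combined_std(blocks) - statistics.pstdev([1, 2, 3, 4, 5])) < 1e-12
```

Each block only ships three scalars to the driver, so the reduction stays cheap regardless of block size.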
| open | 2015-06-11T12:09:49Z | 2015-06-11T15:30:49Z | https://github.com/lensacom/sparkit-learn/issues/20 | [
"enhancement"
] | kszucs | 0 |
jumpserver/jumpserver | django | 14,572 | [Question] Failed to connect to an asset server using VS Code | ### Product version
v4.4.1
### Version type
- [X] Community Edition
- [ ] Enterprise Edition
- [ ] Enterprise Trial Edition
### Installation method
- [ ] Online installation (one-command install)
- [X] Offline package installation
- [ ] All-in-One
- [ ] 1Panel
- [ ] Kubernetes
- [ ] Source installation
### Environment information
ubuntu-22.04.5 server
Internet access available
### 🤔 Problem description
I added the configuration to config.txt, but connecting to the asset server with VS Code fails.
The asset server's account and password are managed on the platform. The JumpServer login log shows that the JumpServer user logged in successfully, but a connection to the asset server cannot be established, and the following error is reported:
[17:17:34.170] > Welcome to JumpServer SSH Server
[17:17:34.187] > admin@root@10.168.1.5@202.115.11.78's password:
[17:17:34.187] Showing password prompt
[17:17:39.735] Got password response
[17:17:39.735] "install" wrote data to terminal: "*********"
[17:17:39.751] >
[17:17:40.240] > match asset failed: No found asset
[17:17:41.507] "install" terminal command done
[17:17:41.507] Install terminal quit with output: match asset failed: No found asset
[17:17:41.507] Received install output: match asset failed: No found asset
[17:17:41.507] Failed to parse remote port from server output
[17:17:41.509] Resolver error: Error:
at v.Create (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:493431)
at t.handleInstallOutput (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:490753)
at t.tryInstall (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:608797)
at async c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:568008
at async t.withShowDetailsEvent (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:571256)
at async P (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:564794)
at async t.resolve (c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:568667)
at async c:\Users\btcz7\.vscode\extensions\ms-vscode-remote.remote-ssh-0.115.1\out\extension.js:2:839059
[17:17:41.513] ------
### Expected result
The issue is resolved and the connection succeeds.
### Additional information
_No response_ | closed | 2024-12-03T09:55:09Z | 2024-12-05T01:57:08Z | https://github.com/jumpserver/jumpserver/issues/14572 | [
"🤔 Question"
] | steven22tom | 2 |
exaloop/codon | numpy | 528 | Why is the AST cloned in some places and then assigned back to itself? | For example:
```C++
f.ast = std::static_pointer_cast<FunctionStmt>(clone(f.ast));
```
What is the reason for incurring the cost of this copy?
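For intuition, here is a generic Python illustration (not Codon's actual classes): when an AST node is cached or shared, mutating it in place leaks the change to every holder of the reference, so a pass that rewrites the node typically deep-clones it first — which may be what `clone(f.ast)` is paying for.

```python
import copy

# Generic illustration, not Codon's AST: a function statement cached in a
# registry. Cloning before mutation keeps the cached original intact;
# without the clone, the in-place edit would leak to every reference.
class FunctionStmt:
    def __init__(self, name, body):
        self.name = name
        self.body = body  # list of statements

cache = {"f": FunctionStmt("f", ["pass"])}

node = copy.deepcopy(cache["f"])  # analogous to clone(f.ast)
node.body.append("return 0")      # mutate the private copy only

assert cache["f"].body == ["pass"]        # cached original untouched
assert node.body == ["pass", "return 0"]  # clone carries the mutation
```

If that guess is right, the copy is the price of keeping the original AST reusable for later instantiations.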
| closed | 2024-01-15T08:03:22Z | 2024-01-29T19:27:49Z | https://github.com/exaloop/codon/issues/528 | [] | WutingjiaX | 1 |
pydata/xarray | pandas | 9,890 | ASV Benchmark warning and timeouts | ### What is your issue?
https://github.com/pydata/xarray/pull/9889#issuecomment-2543054655
Quite a few timeouts and warnings nowadays with the benchmarks, but it runs now at least.
Failing after 115m
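For the per-benchmark timeouts, asv honors a `timeout` attribute (in seconds, default 60) on benchmark classes, so the slow suites could declare a larger budget — a sketch with illustrative names, not necessarily the real xarray benchmark classes:

```python
# asv reads a `timeout` attribute (in seconds) on benchmark classes; the
# default is 60 s. A slow I/O suite could raise it like this:
class IOReadMultipleNetCDF4:
    timeout = 600  # ten minutes before asv reports a timeout

    def setup(self):
        # a real benchmark would create the test files here
        self.paths = ["file1.nc", "file2.nc"]

    def time_open_dataset_netcdf4(self):
        # would call xr.open_mfdataset(self.paths, engine="netcdf4")
        pass

assert IOReadMultipleNetCDF4.timeout == 600
```

Raising the budget per suite avoids inflating the global run time for the benchmarks that already fit in the default.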
<details>
```
2024-12-14T09:52:03.3320213Z [50.00%] · For xarray commit 36e46460 (round 2/2):
2024-12-14T09:52:03.3321514Z [50.00%] ·· Benchmarking mamba-py3.11-bottleneck-cftime-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-setuptools_scm-sparse
2024-12-14T09:52:03.3323019Z [50.12%] ··· accessors.DateTimeAccessor.time_dayofyear ok
2024-12-14T09:52:03.3323734Z [50.12%] ··· ========== =============
2024-12-14T09:52:03.3324121Z calendar
2024-12-14T09:52:03.3324570Z ---------- -------------
2024-12-14T09:52:03.3325130Z standard 428±3μs
2024-12-14T09:52:03.3325696Z noleap 1.01±0.01ms
2024-12-14T09:52:03.3326136Z ========== =============
2024-12-14T09:52:03.3326413Z
2024-12-14T09:52:03.3326884Z [50.25%] ··· accessors.DateTimeAccessor.time_floor ok
2024-12-14T09:52:03.3327467Z [50.25%] ··· ========== ==========
2024-12-14T09:52:03.3327733Z calendar
2024-12-14T09:52:03.3328247Z ---------- ----------
2024-12-14T09:52:03.3328567Z standard 406±4μs
2024-12-14T09:52:03.3328926Z noleap 81.8±1ms
2024-12-14T09:52:03.3329177Z ========== ==========
2024-12-14T09:52:03.3329314Z
2024-12-14T09:52:03.3329569Z [50.37%] ··· accessors.DateTimeAccessor.time_year ok
2024-12-14T09:52:03.3329968Z [50.37%] ··· ========== =========
2024-12-14T09:52:03.3330190Z calendar
2024-12-14T09:52:03.3330410Z ---------- ---------
2024-12-14T09:52:03.3330659Z standard 401±1μs
2024-12-14T09:52:03.3330913Z noleap 911±2μs
2024-12-14T09:52:03.3331115Z ========== =========
2024-12-14T09:52:03.3331256Z
2024-12-14T09:52:03.3331495Z [50.50%] ··· alignment.Align.time_already_aligned ok
2024-12-14T09:52:03.3331869Z [50.50%] ··· ========== ============
2024-12-14T09:52:03.3332097Z join
2024-12-14T09:52:03.3332327Z ---------- ------------
2024-12-14T09:52:03.3332588Z outer 28.3±0.8ms
2024-12-14T09:52:03.3332845Z inner 28.3±0.8ms
2024-12-14T09:52:03.3333096Z left 28.8±0.7ms
2024-12-14T09:52:03.3333352Z right 28.2±0.7ms
2024-12-14T09:52:03.3333615Z exact 28.5±0.8ms
2024-12-14T09:52:03.3333876Z override 176±1μs
2024-12-14T09:52:03.3334091Z ========== ============
2024-12-14T09:52:03.3334233Z
2024-12-14T09:52:03.3334468Z [50.62%] ··· alignment.Align.time_not_aligned ok
2024-12-14T09:52:03.3334825Z [50.62%] ··· ======= =============
2024-12-14T09:52:03.3335048Z join
2024-12-14T09:52:03.3335261Z ------- -------------
2024-12-14T09:52:03.3335521Z outer 29.7±0.9ms
2024-12-14T09:52:03.3335779Z inner 1.05±0.02ms
2024-12-14T09:52:03.3336028Z left 29.0±0.6ms
2024-12-14T09:52:03.3336371Z right 885±10μs
2024-12-14T09:52:03.3336613Z ======= =============
2024-12-14T09:52:03.3336747Z
2024-12-14T09:52:03.3336973Z [50.74%] ··· ...nment.Align.time_not_aligned_random_integers ok
2024-12-14T09:52:03.3337330Z [50.74%] ··· ======= ============
2024-12-14T09:52:03.3337759Z join
2024-12-14T09:52:03.3337967Z ------- ------------
2024-12-14T09:52:03.3338227Z outer 30.3±0.4ms
2024-12-14T09:52:03.3338482Z inner 14.0±0.3ms
2024-12-14T09:52:03.3338746Z left 29.2±0.6ms
2024-12-14T09:52:03.3339007Z right 13.3±0.2ms
2024-12-14T09:52:03.3339216Z ======= ============
2024-12-14T09:52:03.3339351Z
2024-12-14T09:52:03.3339595Z [50.87%] ··· alignment.AlignCFTime.time_already_aligned ok
2024-12-14T09:52:03.3339979Z [50.87%] ··· ========== ============
2024-12-14T09:52:03.3340213Z join
2024-12-14T09:52:03.3340431Z ---------- ------------
2024-12-14T09:52:03.3340692Z outer 28.0±1ms
2024-12-14T09:52:03.3340953Z inner 28.0±0.6ms
2024-12-14T09:52:03.3341209Z left 28.3±0.6ms
2024-12-14T09:52:03.3341465Z right 28.3±0.7ms
2024-12-14T09:52:03.3341732Z exact 28.4±0.6ms
2024-12-14T09:52:03.3341990Z override 228±3μs
2024-12-14T09:52:03.3342266Z ========== ============
2024-12-14T09:52:03.3342464Z
2024-12-14T09:52:03.3343064Z [50.99%] ··· alignment.AlignCFTime.time_not_aligned ok
2024-12-14T09:52:03.3343475Z [50.99%] ··· ======= =============
2024-12-14T09:52:03.3343703Z join
2024-12-14T09:52:03.3343925Z ------- -------------
2024-12-14T09:52:03.3344182Z outer 42.4±0.9ms
2024-12-14T09:52:03.3344439Z inner 10.3±0.06ms
2024-12-14T09:52:03.3344884Z left 40.4±0.5ms
2024-12-14T09:52:03.3345144Z right 1.37±0.02ms
2024-12-14T09:52:03.3345355Z ======= =============
2024-12-14T09:52:03.3345493Z
2024-12-14T09:52:03.3345732Z [51.11%] ··· ...AlignCFTime.time_not_aligned_random_integers ok
2024-12-14T09:52:03.3346102Z [51.11%] ··· ======= ============
2024-12-14T09:52:03.3346321Z join
2024-12-14T09:52:03.3346536Z ------- ------------
2024-12-14T09:52:03.3346785Z outer 47.1±0.6ms
2024-12-14T09:52:03.3347039Z inner 33.0±0.2ms
2024-12-14T09:52:03.3347293Z left 43.3±0.7ms
2024-12-14T09:52:03.3347537Z right 23.4±0.3ms
2024-12-14T09:52:03.3347751Z ======= ============
2024-12-14T09:52:03.3347884Z
2024-12-14T09:52:03.3348126Z [51.24%] ··· alignment.AlignDask.time_already_aligned ok
2024-12-14T09:52:03.3348513Z [51.24%] ··· ========== ==========
2024-12-14T09:52:03.3348733Z join
2024-12-14T09:52:03.3348954Z ---------- ----------
2024-12-14T09:52:03.3349207Z outer 606±7μs
2024-12-14T09:52:03.3349464Z inner 611±10μs
2024-12-14T09:52:03.3349719Z left 604±5μs
2024-12-14T09:52:03.3349975Z right 601±4μs
2024-12-14T09:52:03.3350227Z exact 613±2μs
2024-12-14T09:52:03.3350525Z override 180±3μs
2024-12-14T09:52:03.3350792Z ========== ==========
2024-12-14T09:52:03.3350951Z
2024-12-14T09:52:47.4447019Z [51.36%] ··· alignment.AlignDask.time_not_aligned ok
2024-12-14T09:52:47.4447861Z [51.36%] ··· ======= =============
2024-12-14T09:52:47.4448359Z join
2024-12-14T09:52:47.4448881Z ------- -------------
2024-12-14T09:52:47.4449506Z outer 1.29±0.02ms
2024-12-14T09:52:47.4450099Z inner 1.60±0.02ms
2024-12-14T09:52:47.4450886Z left 1.13±0.01ms
2024-12-14T09:52:47.4451640Z right 1.44±0.02ms
2024-12-14T09:52:47.4452269Z ======= =============
2024-12-14T09:52:47.4452685Z
2024-12-14T09:52:47.4453367Z [51.49%] ··· ...t.AlignDask.time_not_aligned_random_integers ok
2024-12-14T09:52:47.4454699Z [51.49%] ··· ======= =============
2024-12-14T09:52:47.4455231Z join
2024-12-14T09:52:47.4455715Z ------- -------------
2024-12-14T09:52:47.4456306Z outer 1.88±0.01ms
2024-12-14T09:52:47.4456909Z inner 12.4±0.1ms
2024-12-14T09:52:47.4457485Z left 1.17±0.02ms
2024-12-14T09:52:47.4458097Z right 11.7±0.2ms
2024-12-14T09:52:47.4458636Z ======= =============
2024-12-14T09:52:47.4458973Z
2024-12-14T09:52:47.4459581Z [51.61%] ··· coding.EncodeCFDatetime.time_encode_cf_datetime ok
2024-12-14T09:52:47.4460567Z [51.61%] ··· ========== =========
2024-12-14T09:52:47.4461095Z calendar
2024-12-14T09:52:47.4461598Z ---------- ---------
2024-12-14T09:52:47.4462228Z standard 801±6μs
2024-12-14T09:52:47.4462992Z noleap 128±3ms
2024-12-14T09:52:47.4463476Z ========== =========
2024-12-14T09:52:47.4463842Z
2024-12-14T09:52:47.4464388Z [51.73%] ··· combine.Combine1d.time_combine_by_coords 1.02±0.02ms
2024-12-14T09:52:47.4465386Z [51.86%] ··· combine.Combine1dDask.time_combine_by_coords 176±3ms
2024-12-14T09:52:47.4466274Z [51.98%] ··· combine.Combine3d.time_combine_by_coords 65.6±0.9ms
2024-12-14T09:52:47.4467129Z [52.10%] ··· combine.Combine3d.time_combine_nested 63.5±1ms
2024-12-14T09:52:47.4468039Z [52.23%] ··· ...issing.DataArrayMissingBottleneck.time_bfill ok
2024-12-14T09:52:47.4468799Z [52.23%] ··· =============== ==================== ======= ============
2024-12-14T09:52:47.4469538Z shape chunks limit
2024-12-14T09:52:47.4470076Z --------------- -------------------- ------- ------------
2024-12-14T09:52:47.4470711Z (365, 75, 75) None None 5.05±0.1ms
2024-12-14T09:52:47.4471318Z (365, 75, 75) None 3 5.04±0.1ms
2024-12-14T09:52:47.4471945Z (365, 75, 75) {'x': 25, 'y': 25} None 15.0±0.4ms
2024-12-14T09:52:47.4472561Z (365, 75, 75) {'x': 25, 'y': 25} 3 15.5±0.5ms
2024-12-14T09:52:47.4473031Z =============== ==================== ======= ============
2024-12-14T09:52:47.4473357Z
2024-12-14T09:52:47.4473908Z [52.35%] ··· ...issing.DataArrayMissingBottleneck.time_ffill ok
2024-12-14T09:52:47.4474755Z [52.35%] ··· =============== ==================== ======= ============
2024-12-14T09:52:47.4475252Z shape chunks limit
2024-12-14T09:52:47.4475772Z --------------- -------------------- ------- ------------
2024-12-14T09:52:47.4476405Z (365, 75, 75) None None 6.21±0.3ms
2024-12-14T09:52:47.4477036Z (365, 75, 75) None 3 6.17±0.3ms
2024-12-14T09:52:47.4477688Z (365, 75, 75) {'x': 25, 'y': 25} None 14.1±0.7ms
2024-12-14T09:52:47.4478229Z (365, 75, 75) {'x': 25, 'y': 25} 3 14.2±0.5ms
2024-12-14T09:52:47.4478583Z =============== ==================== ======= ============
2024-12-14T09:52:47.4478821Z
2024-12-14T09:52:47.4479091Z [52.48%] ··· ...rrayMissingInterpolateNA.time_interpolate_na ok
2024-12-14T09:52:47.4479644Z [52.48%] ··· =============== ==================== ======= ===========
2024-12-14T09:52:47.4480045Z shape chunks limit
2024-12-14T09:52:47.4480370Z --------------- -------------------- ------- -----------
2024-12-14T09:52:47.4480831Z (365, 75, 75) None None 102±0.7ms
2024-12-14T09:52:47.4481290Z (365, 75, 75) None 3 121±0.9ms
2024-12-14T09:52:47.4481660Z (365, 75, 75) {'x': 25, 'y': 25} None 165±5ms
2024-12-14T09:52:47.4482326Z (365, 75, 75) {'x': 25, 'y': 25} 3 317±7ms
2024-12-14T09:52:47.4482673Z =============== ==================== ======= ===========
2024-12-14T09:52:47.4482844Z
2024-12-14T09:52:47.4483216Z [52.60%] ··· dataset.DatasetBinaryOp.time_normalize 499±8μs
2024-12-14T09:52:47.4483841Z [52.72%] ··· dataset.DatasetChunk.time_chunk 160±3ms
2024-12-14T09:52:47.4484439Z [52.85%] ··· dataset_io.IOReadCustomEngine.time_open_dataset ok
2024-12-14T09:52:47.4484829Z [52.85%] ··· ============== ==========
2024-12-14T09:52:47.4485058Z chunks
2024-12-14T09:52:47.4485301Z -------------- ----------
2024-12-14T09:52:47.4485571Z None 62.6±1ms
2024-12-14T09:52:47.4485824Z {} 447±9ms
2024-12-14T09:52:47.4486200Z {'time': 10} 499±2ms
2024-12-14T09:52:47.4486417Z ============== ==========
2024-12-14T09:52:47.4486571Z
2024-12-14T09:52:47.4486810Z [52.97%] ··· ...adDataTreeNetCDF4.time_load_datatree_netcdf4 n/a
2024-12-14T09:52:47.4487281Z [52.97%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6728058Z [53.09%] ··· ...adDataTreeNetCDF4.time_open_datatree_netcdf4 n/a
2024-12-14T09:52:47.6728954Z [53.09%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6729997Z [53.22%] ··· ...eadMultipleNetCDF3.time_load_dataset_netcdf4 n/a
2024-12-14T09:52:47.6731032Z [53.22%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6732665Z [53.34%] ··· ...OReadMultipleNetCDF3.time_load_dataset_scipy n/a
2024-12-14T09:52:47.6733949Z [53.34%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6735169Z [53.47%] ··· ...eadMultipleNetCDF3.time_open_dataset_netcdf4 n/a
2024-12-14T09:52:47.6736332Z [53.47%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6737465Z [53.59%] ··· ...OReadMultipleNetCDF3.time_open_dataset_scipy n/a
2024-12-14T09:52:47.6738611Z [53.59%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6739523Z [53.71%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:47.6740459Z [53.71%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6741357Z [53.84%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:47.6742267Z [53.84%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6743480Z [53.96%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:52:47.6744313Z [53.96%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6745176Z [54.08%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:52:47.6746060Z [54.08%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6746890Z [54.21%] ··· ...sk.time_load_dataset_scipy_with_block_chunks n/a
2024-12-14T09:52:47.6747727Z [54.21%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6748568Z [54.33%] ··· ...ask.time_load_dataset_scipy_with_time_chunks n/a
2024-12-14T09:52:47.6749361Z [54.33%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6750145Z [54.46%] ··· ....time_open_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:47.6750938Z [54.46%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6751744Z [54.58%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:47.6752623Z [54.58%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6753454Z [54.70%] ··· ...k.time_open_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:52:47.6754493Z [54.70%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6755284Z [54.83%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:52:47.6756079Z [54.83%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6757002Z [54.95%] ··· ...sk.time_open_dataset_scipy_with_block_chunks n/a
2024-12-14T09:52:47.6757813Z [54.95%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6758464Z [55.07%] ··· ...ask.time_open_dataset_scipy_with_time_chunks n/a
2024-12-14T09:52:47.6759049Z [55.07%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6759645Z [55.20%] ··· ...eadMultipleNetCDF4.time_load_dataset_netcdf4 n/a
2024-12-14T09:52:47.6760220Z [55.20%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6760786Z [55.32%] ··· ...eadMultipleNetCDF4.time_open_dataset_netcdf4 n/a
2024-12-14T09:52:47.6761371Z [55.32%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6761932Z [55.45%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:47.6762496Z [55.45%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6763090Z [55.57%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:47.6763662Z [55.57%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6764221Z [55.69%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:52:47.6764916Z [55.69%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6765382Z [55.82%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:52:47.6765960Z [55.82%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6766415Z [55.94%] ··· ....time_open_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:47.6766859Z [55.94%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6767301Z [56.06%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:47.6767739Z [56.06%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:47.6768174Z [56.19%] ··· ...k.time_open_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:52:47.6768612Z [56.19%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7572980Z [56.31%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:52:55.7574160Z [56.31%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7575041Z [56.44%] ··· dataset_io.IOReadSingleFile.time_read_dataset ok
2024-12-14T09:52:55.7575760Z [56.44%] ··· ========= ============= =============
2024-12-14T09:52:55.7576238Z -- chunks
2024-12-14T09:52:55.7576686Z --------- ---------------------------
2024-12-14T09:52:55.7577118Z engine None {}
2024-12-14T09:52:55.7577535Z ========= ============= =============
2024-12-14T09:52:55.7578060Z scipy 2.18±0.03ms 3.06±0ms
2024-12-14T09:52:55.7578583Z netcdf4 5.08±0.01ms 6.12±0.02ms
2024-12-14T09:52:55.7579014Z ========= ============= =============
2024-12-14T09:52:55.7579295Z
2024-12-14T09:52:55.7579740Z [56.56%] ··· ...OReadSingleNetCDF3.time_load_dataset_netcdf4 n/a
2024-12-14T09:52:55.7580590Z [56.56%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7581423Z [56.68%] ··· ....IOReadSingleNetCDF3.time_load_dataset_scipy n/a
2024-12-14T09:52:55.7582245Z [56.68%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7583249Z [56.81%] ··· ...IOReadSingleNetCDF3.time_orthogonal_indexing n/a
2024-12-14T09:52:55.7584389Z [56.81%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7585224Z [56.93%] ··· ...IOReadSingleNetCDF3.time_vectorized_indexing n/a
2024-12-14T09:52:55.7586051Z [56.93%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7586832Z [57.05%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:55.7587611Z [57.05%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7588422Z [57.18%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:55.7589217Z [57.18%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7590012Z [57.30%] ··· ..._dataset_netcdf4_with_block_chunks_oindexing n/a
2024-12-14T09:52:55.7590799Z [57.30%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7591597Z [57.43%] ··· ..._dataset_netcdf4_with_block_chunks_vindexing n/a
2024-12-14T09:52:55.7592386Z [57.43%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7593275Z [57.55%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:52:55.7594047Z [57.55%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7594845Z [57.67%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:52:55.7595630Z [57.67%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7596636Z [57.80%] ··· ...sk.time_load_dataset_scipy_with_block_chunks n/a
2024-12-14T09:52:55.7597539Z [57.80%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7598329Z [57.92%] ··· ...ad_dataset_scipy_with_block_chunks_oindexing n/a
2024-12-14T09:52:55.7599123Z [57.92%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7599975Z [58.04%] ··· ...ad_dataset_scipy_with_block_chunks_vindexing n/a
2024-12-14T09:52:55.7600832Z [58.04%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7601617Z [58.17%] ··· ...ask.time_load_dataset_scipy_with_time_chunks n/a
2024-12-14T09:52:55.7602391Z [58.17%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7603212Z [58.29%] ··· ...OReadSingleNetCDF4.time_load_dataset_netcdf4 n/a
2024-12-14T09:52:55.7604027Z [58.29%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7604856Z [58.42%] ··· ...IOReadSingleNetCDF4.time_orthogonal_indexing n/a
2024-12-14T09:52:55.7605680Z [58.42%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7606501Z [58.54%] ··· ...IOReadSingleNetCDF4.time_vectorized_indexing n/a
2024-12-14T09:52:55.7607312Z [58.54%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7608107Z [58.66%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T09:52:55.7608881Z [58.66%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7609669Z [58.79%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T09:52:55.7610459Z [58.79%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7611251Z [58.91%] ··· ..._dataset_netcdf4_with_block_chunks_oindexing n/a
2024-12-14T09:52:55.7612036Z [58.91%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:52:55.7612842Z [59.03%] ··· ..._dataset_netcdf4_with_block_chunks_vindexing n/a
2024-12-14T09:52:55.7613629Z [59.03%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4824833Z [59.16%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T09:53:00.4825918Z [59.16%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4826512Z [59.28%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T09:53:00.4827346Z [59.28%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4828217Z [59.41%] ··· ...teMultipleNetCDF3.time_write_dataset_netcdf4 n/a
2024-12-14T09:53:00.4829111Z [59.41%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4829944Z [59.53%] ··· ...riteMultipleNetCDF3.time_write_dataset_scipy n/a
2024-12-14T09:53:00.4830850Z [59.53%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4831725Z [59.65%] ··· dataset_io.IOWriteNetCDFDask.time_write n/a
2024-12-14T09:53:00.4832611Z [59.65%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4833523Z [59.78%] ··· ...t_io.IOWriteNetCDFDaskDistributed.time_write n/a
2024-12-14T09:53:00.4834508Z [59.78%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4835384Z [59.90%] ··· ...riteSingleNetCDF3.time_write_dataset_netcdf4 n/a
2024-12-14T09:53:00.4836248Z [59.90%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4837136Z [60.02%] ··· ...OWriteSingleNetCDF3.time_write_dataset_scipy n/a
2024-12-14T09:53:00.4838010Z [60.02%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:00.4838789Z [60.15%] ··· datatree.Datatree.time_from_dict_few 891±20μs
2024-12-14T09:53:00.4839869Z [60.27%] ··· datatree.Datatree.time_from_dict_many 20.0±0.2ms
2024-12-14T09:53:00.4840675Z [60.40%] ··· groupby.GroupBy.peakmem_binary_op_1d 172M
2024-12-14T09:53:00.4841465Z [60.52%] ··· groupby.GroupBy.peakmem_binary_op_2d 172M
2024-12-14T09:53:00.4842257Z [60.64%] ··· groupby.GroupBy.time_agg_large_num_groups ok
2024-12-14T09:53:00.4842932Z [60.64%] ··· ======== ====== ============= ===========
2024-12-14T09:53:00.4843364Z -- use_flox
2024-12-14T09:53:00.4843774Z --------------- -------------------------
2024-12-14T09:53:00.4844209Z method ndim True False
2024-12-14T09:53:00.4844616Z ======== ====== ============= ===========
2024-12-14T09:53:00.4845104Z sum 1 2.95±0.06ms 101±2ms
2024-12-14T09:53:00.4845611Z sum 2 4.04±0.01ms 133±0.6ms
2024-12-14T09:53:00.4846124Z mean 1 3.11±0.01ms 107±1ms
2024-12-14T09:53:00.4846646Z mean 2 4.17±0.01ms 144±1ms
2024-12-14T09:53:00.4847042Z ======== ====== ============= ===========
2024-12-14T09:53:00.4847323Z
2024-12-14T09:53:00.4847589Z [60.64%] ···· For parameters: 'sum', 1, False
2024-12-14T09:53:00.4849478Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T09:53:00.4851527Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T09:53:00.4869496Z
2024-12-14T09:53:00.4869762Z For parameters: 'sum', 2, False
2024-12-14T09:53:00.4870845Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T09:53:00.4872007Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T09:53:00.4884315Z
2024-12-14T09:53:00.4884513Z For parameters: 'mean', 1, False
2024-12-14T09:53:00.4885585Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T09:53:00.4886752Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T09:53:02.3501257Z
2024-12-14T09:53:02.3501574Z For parameters: 'mean', 2, False
2024-12-14T09:53:02.3503720Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T09:53:02.3505799Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T09:53:02.3528658Z
2024-12-14T09:53:02.3529502Z [60.77%] ··· groupby.GroupBy.time_agg_small_num_groups ok
2024-12-14T09:53:02.3530247Z [60.77%] ··· ======== ====== ============= =============
2024-12-14T09:53:02.3530738Z -- use_flox
2024-12-14T09:53:02.3531248Z --------------- ---------------------------
2024-12-14T09:53:02.3531731Z method ndim True False
2024-12-14T09:53:02.3532212Z ======== ====== ============= =============
2024-12-14T09:53:02.3532794Z sum 1 2.90±0.06ms 2.42±0.02ms
2024-12-14T09:53:02.3533363Z sum 2 3.97±0.02ms 4.72±0.03ms
2024-12-14T09:53:02.3533950Z mean 1 2.96±0.01ms 2.49±0.02ms
2024-12-14T09:53:02.3534556Z mean 2 4.09±0.01ms 4.77±0.05ms
2024-12-14T09:53:02.3534986Z ======== ====== ============= =============
2024-12-14T09:53:02.3535274Z
/home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T09:53:16.8805088Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T09:53:16.8805549Z
2024-12-14T09:53:16.8806203Z [60.89%] ··· groupby.GroupBy.time_binary_op_1d 1.39±0ms
2024-12-14T09:53:16.8807075Z [61.01%] ··· groupby.GroupBy.time_binary_op_2d 2.49±0.01ms
2024-12-14T09:53:16.8807913Z [61.14%] ··· groupby.GroupBy.time_init ok
2024-12-14T09:53:16.8808541Z [61.14%] ··· ====== =============
2024-12-14T09:53:16.8808925Z ndim
2024-12-14T09:53:16.8809290Z ------ -------------
2024-12-14T09:53:16.8809729Z 1 349±6μs
2024-12-14T09:53:16.8810152Z 2 1.28±0.01ms
2024-12-14T09:53:16.8810500Z ====== =============
2024-12-14T09:53:16.8810748Z
2024-12-14T09:53:16.8811181Z [61.26%] ··· groupby.GroupByDask.peakmem_binary_op_1d 195M
2024-12-14T09:53:16.8812076Z [61.39%] ··· groupby.GroupByDask.peakmem_binary_op_2d 236M
2024-12-14T09:53:16.8812907Z [61.51%] ··· groupby.GroupByDask.time_agg_large_num_groups ok
2024-12-14T09:53:16.8813595Z [61.51%] ··· ======== ====== ============= =========
2024-12-14T09:53:16.8814041Z -- use_flox
2024-12-14T09:53:16.8814489Z --------------- -----------------------
2024-12-14T09:53:16.8814923Z method ndim True False
2024-12-14T09:53:16.8815551Z ======== ====== ============= =========
2024-12-14T09:53:16.8816119Z sum 1 5.68±0.2ms 289±4ms
2024-12-14T09:53:16.8816623Z sum 2 12.3±0.1ms 397±4ms
2024-12-14T09:53:16.8817140Z mean 1 5.96±0.08ms 287±2ms
2024-12-14T09:53:16.8817682Z mean 2 13.0±0.05ms 390±4ms
2024-12-14T09:53:16.8818100Z ======== ====== ============= =========
2024-12-14T09:53:16.8818380Z
2024-12-14T09:53:16.8818654Z [61.51%] ···· For parameters: 'sum', 1, False
2024-12-14T09:53:31.4432467Z
2024-12-14T09:53:31.4433269Z [61.76%] ··· groupby.GroupByDask.time_binary_op_1d 102±2ms
2024-12-14T09:53:31.4434375Z [61.88%] ··· groupby.GroupByDask.time_binary_op_2d 1.28±0.01s
2024-12-14T09:53:31.4435423Z [62.00%] ··· groupby.GroupByDask.time_init ok
2024-12-14T09:53:31.4436200Z [62.00%] ··· ====== =============
2024-12-14T09:53:31.4436946Z ndim
2024-12-14T09:53:31.4437360Z ------ -------------
2024-12-14T09:53:31.4437812Z 1 321±1μs
2024-12-14T09:53:31.4438240Z 2 1.79±0.02ms
2024-12-14T09:53:31.4438597Z ====== =============
2024-12-14T09:53:31.4438831Z
2024-12-14T09:53:31.4439312Z [62.13%] ··· ...by.GroupByDaskDataFrame.peakmem_binary_op_1d n/a
2024-12-14T09:53:31.4440166Z [62.13%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4441026Z [62.25%] ··· ...by.GroupByDaskDataFrame.peakmem_binary_op_2d n/a
2024-12-14T09:53:31.4441873Z [62.25%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4442738Z [62.38%] ··· ...oupByDaskDataFrame.time_agg_large_num_groups ok
2024-12-14T09:53:31.4443486Z [62.38%] ··· ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4443988Z -- ndim / use_flox
2024-12-14T09:53:31.4444478Z -------- ---------------------------------------------
2024-12-14T09:53:31.4444988Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:53:31.4445462Z ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4445930Z sum n/a n/a n/a n/a
2024-12-14T09:53:31.4446410Z mean n/a n/a n/a n/a
2024-12-14T09:53:31.4446876Z ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4447167Z
2024-12-14T09:53:31.4447430Z [62.38%] ···· For parameters: 'sum', 1, True
2024-12-14T09:53:31.4447991Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4448516Z
2024-12-14T09:53:31.4448837Z For parameters: 'sum', 1, False
2024-12-14T09:53:31.4449382Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4449939Z
2024-12-14T09:53:31.4450261Z For parameters: 'sum', 2, True
2024-12-14T09:53:31.4450800Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4451320Z
2024-12-14T09:53:31.4451634Z For parameters: 'sum', 2, False
2024-12-14T09:53:31.4452438Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4452949Z
2024-12-14T09:53:31.4453265Z For parameters: 'mean', 1, True
2024-12-14T09:53:31.4453805Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4454318Z
2024-12-14T09:53:31.4454631Z For parameters: 'mean', 1, False
2024-12-14T09:53:31.4455175Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4455541Z
2024-12-14T09:53:31.4455806Z For parameters: 'mean', 2, True
2024-12-14T09:53:31.4456193Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4456607Z
2024-12-14T09:53:31.4456799Z For parameters: 'mean', 2, False
2024-12-14T09:53:31.4457221Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4457468Z
2024-12-14T09:53:31.4457876Z [62.50%] ··· ...oupByDaskDataFrame.time_agg_small_num_groups ok
2024-12-14T09:53:31.4458391Z [62.50%] ··· ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4458689Z -- ndim / use_flox
2024-12-14T09:53:31.4459071Z -------- ---------------------------------------------
2024-12-14T09:53:31.4459368Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:53:31.4459735Z ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4460002Z sum n/a n/a n/a n/a
2024-12-14T09:53:31.4460543Z mean n/a n/a n/a n/a
2024-12-14T09:53:31.4460931Z ======== ========== =========== ========== ===========
2024-12-14T09:53:31.4461109Z
2024-12-14T09:53:31.4461277Z [62.50%] ···· For parameters: 'sum', 1, True
2024-12-14T09:53:31.4461706Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4462012Z
2024-12-14T09:53:31.4462207Z For parameters: 'sum', 1, False
2024-12-14T09:53:31.4462516Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4463037Z
2024-12-14T09:53:31.4463227Z For parameters: 'sum', 2, True
2024-12-14T09:53:31.4463532Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4463816Z
2024-12-14T09:53:31.4463998Z For parameters: 'sum', 2, False
2024-12-14T09:53:31.4464310Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:53:31.4464593Z
2024-12-14T09:55:29.2189949Z For parameters: 'mean', 1, True
2024-12-14T09:55:29.2190392Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2190940Z
2024-12-14T09:55:29.2191304Z For parameters: 'mean', 1, False
2024-12-14T09:55:29.2191857Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2192390Z
2024-12-14T09:55:29.2192720Z For parameters: 'mean', 2, True
2024-12-14T09:55:29.2193258Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2193769Z
2024-12-14T09:55:29.2194087Z For parameters: 'mean', 2, False
2024-12-14T09:55:29.2194618Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2195023Z
2024-12-14T09:55:29.2195758Z [62.62%] ··· groupby.GroupByDaskDataFrame.time_binary_op_1d n/a
2024-12-14T09:55:29.2196639Z [62.62%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2197503Z [62.75%] ··· groupby.GroupByDaskDataFrame.time_binary_op_2d n/a
2024-12-14T09:55:29.2198354Z [62.75%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2199540Z [62.87%] ··· groupby.GroupByDaskDataFrame.time_init ok
2024-12-14T09:55:29.2200206Z [62.87%] ··· ====== =====
2024-12-14T09:55:29.2200535Z ndim
2024-12-14T09:55:29.2200875Z ------ -----
2024-12-14T09:55:29.2201206Z 1 n/a
2024-12-14T09:55:29.2201519Z 2 n/a
2024-12-14T09:55:29.2201836Z ====== =====
2024-12-14T09:55:29.2202039Z
2024-12-14T09:55:29.2202260Z [62.87%] ···· For parameters: 1
2024-12-14T09:55:29.2202754Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2203267Z
2024-12-14T09:55:29.2203613Z For parameters: 2
2024-12-14T09:55:29.2204124Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2204520Z
2024-12-14T09:55:29.2204962Z [63.00%] ··· groupby.GroupByLongTime.time_mean ok
2024-12-14T09:55:29.2205673Z [63.00%] ··· ============ ============ ============
2024-12-14T09:55:29.2206115Z -- use_flox
2024-12-14T09:55:29.2206550Z ------------ -------------------------
2024-12-14T09:55:29.2206978Z use_cftime True False
2024-12-14T09:55:29.2207397Z ============ ============ ============
2024-12-14T09:55:29.2207915Z True 8.63±0.2ms 13.2±0.2ms
2024-12-14T09:55:29.2208431Z False 7.51±0.2ms 12.5±0.1ms
2024-12-14T09:55:29.2208842Z ============ ============ ============
2024-12-14T09:55:29.2209118Z
2024-12-14T09:55:29.2209784Z [63.12%] ··· groupby.GroupByLongTime.time_setup ok
2024-12-14T09:55:29.2210500Z [63.12%] ··· ============ ============= =============
2024-12-14T09:55:29.2210947Z -- use_flox
2024-12-14T09:55:29.2211388Z ------------ ---------------------------
2024-12-14T09:55:29.2211827Z use_cftime True False
2024-12-14T09:55:29.2212244Z ============ ============= =============
2024-12-14T09:55:29.2212775Z True 3.58±0.08ms 3.60±0.06ms
2024-12-14T09:55:29.2213155Z False 2.60±0.08ms 2.50±0.02ms
2024-12-14T09:55:29.2213470Z ============ ============= =============
2024-12-14T09:55:29.2213683Z
2024-12-14T09:55:29.2213941Z [63.24%] ··· ....GroupByPandasDataFrame.peakmem_binary_op_1d n/a
2024-12-14T09:55:29.2214539Z [63.24%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2215142Z [63.37%] ··· ....GroupByPandasDataFrame.peakmem_binary_op_2d n/a
2024-12-14T09:55:29.2215727Z [63.37%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2216318Z [63.49%] ··· ...pByPandasDataFrame.time_agg_large_num_groups ok
2024-12-14T09:55:29.2216842Z [63.49%] ··· ======== ========== =========== ========== ===========
2024-12-14T09:55:29.2217184Z -- ndim / use_flox
2024-12-14T09:55:29.2217530Z -------- ---------------------------------------------
2024-12-14T09:55:29.2217829Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:55:29.2218203Z ======== ========== =========== ========== ===========
2024-12-14T09:55:29.2218464Z sum n/a n/a n/a n/a
2024-12-14T09:55:29.2218732Z mean n/a n/a n/a n/a
2024-12-14T09:55:29.2218996Z ======== ========== =========== ========== ===========
2024-12-14T09:55:29.2219159Z
2024-12-14T09:55:29.2219330Z [63.49%] ···· For parameters: 'sum', 1, True
2024-12-14T09:55:29.2219753Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2220058Z
2024-12-14T09:55:29.2220251Z For parameters: 'sum', 1, False
2024-12-14T09:55:29.2220726Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2221016Z
2024-12-14T09:55:29.2221203Z For parameters: 'sum', 2, True
2024-12-14T09:55:29.2221495Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2221779Z
2024-12-14T09:55:29.2221961Z For parameters: 'sum', 2, False
2024-12-14T09:55:29.2222253Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2222536Z
2024-12-14T09:55:29.2223039Z For parameters: 'mean', 1, True
2024-12-14T09:55:29.2223572Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2223864Z
2024-12-14T09:55:29.2224049Z For parameters: 'mean', 1, False
2024-12-14T09:55:29.2224344Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:29.2224632Z
2024-12-14T09:55:34.3592945Z For parameters: 'mean', 2, True
2024-12-14T09:55:34.3593636Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3594184Z
2024-12-14T09:55:34.3594539Z For parameters: 'mean', 2, False
2024-12-14T09:55:34.3595081Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3595480Z
2024-12-14T09:55:34.3596198Z [63.61%] ··· ...pByPandasDataFrame.time_agg_small_num_groups ok
2024-12-14T09:55:34.3596964Z [63.61%] ··· ======== ========== =========== ========== ===========
2024-12-14T09:55:34.3597752Z -- ndim / use_flox
2024-12-14T09:55:34.3598240Z -------- ---------------------------------------------
2024-12-14T09:55:34.3598727Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:55:34.3599180Z ======== ========== =========== ========== ===========
2024-12-14T09:55:34.3599638Z sum n/a n/a n/a n/a
2024-12-14T09:55:34.3600122Z mean n/a n/a n/a n/a
2024-12-14T09:55:34.3600565Z ======== ========== =========== ========== ===========
2024-12-14T09:55:34.3600868Z
2024-12-14T09:55:34.3601146Z [63.61%] ···· For parameters: 'sum', 1, True
2024-12-14T09:55:34.3601674Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3602181Z
2024-12-14T09:55:34.3602493Z For parameters: 'sum', 1, False
2024-12-14T09:55:34.3603029Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3603526Z
2024-12-14T09:55:34.3603829Z For parameters: 'sum', 2, True
2024-12-14T09:55:34.3604342Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3604838Z
2024-12-14T09:55:34.3605152Z For parameters: 'sum', 2, False
2024-12-14T09:55:34.3605667Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3606130Z
2024-12-14T09:55:34.3606414Z For parameters: 'mean', 1, True
2024-12-14T09:55:34.3606924Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3607437Z
2024-12-14T09:55:34.3607732Z For parameters: 'mean', 1, False
2024-12-14T09:55:34.3608246Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3608716Z
2024-12-14T09:55:34.3609032Z For parameters: 'mean', 2, True
2024-12-14T09:55:34.3609545Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3610028Z
2024-12-14T09:55:34.3610333Z For parameters: 'mean', 2, False
2024-12-14T09:55:34.3610849Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3611481Z
2024-12-14T09:55:34.3611940Z [63.74%] ··· ...pby.GroupByPandasDataFrame.time_binary_op_1d n/a
2024-12-14T09:55:34.3612774Z [63.74%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3613597Z [63.86%] ··· ...pby.GroupByPandasDataFrame.time_binary_op_2d n/a
2024-12-14T09:55:34.3614430Z [63.86%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3615275Z [63.99%] ··· groupby.GroupByPandasDataFrame.time_init ok
2024-12-14T09:55:34.3615934Z [63.99%] ··· ====== =====
2024-12-14T09:55:34.3616268Z ndim
2024-12-14T09:55:34.3616602Z ------ -----
2024-12-14T09:55:34.3616922Z 1 n/a
2024-12-14T09:55:34.3617235Z 2 n/a
2024-12-14T09:55:34.3617539Z ====== =====
2024-12-14T09:55:34.3617741Z
2024-12-14T09:55:34.3617963Z [63.99%] ···· For parameters: 1
2024-12-14T09:55:34.3618464Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3618958Z
2024-12-14T09:55:34.3619252Z For parameters: 2
2024-12-14T09:55:34.3619714Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T09:55:34.3620105Z
2024-12-14T09:55:34.3620523Z [64.11%] ··· groupby.Resample.time_agg_large_num_groups ok
2024-12-14T09:55:34.3621208Z [64.11%] ··· ======== ====== ============= ============
2024-12-14T09:55:34.3621640Z -- use_flox
2024-12-14T09:55:34.3622076Z --------------- --------------------------
2024-12-14T09:55:34.3622903Z method ndim True False
2024-12-14T09:55:34.3623364Z ======== ====== ============= ============
2024-12-14T09:55:34.3623919Z sum 1 3.04±0.02ms 55.2±0.2ms
2024-12-14T09:55:34.3624459Z sum 2 3.46±0.02ms 55.3±0.2ms
2024-12-14T09:55:34.3625023Z mean 1 3.11±0.01ms 48.2±0.1ms
2024-12-14T09:55:34.3625564Z mean 2 3.56±0.01ms 47.6±0.2ms
2024-12-14T09:55:34.3625989Z ======== ====== ============= ============
2024-12-14T09:55:34.3626272Z
2024-12-14T09:55:34.3626744Z [64.23%] ··· groupby.Resample.time_agg_small_num_groups ok
2024-12-14T09:55:34.3627473Z [64.23%] ··· ======== ====== ============= =============
2024-12-14T09:55:34.3627937Z -- use_flox
2024-12-14T09:55:34.3628394Z --------------- ---------------------------
2024-12-14T09:55:34.3628868Z method ndim True False
2024-12-14T09:55:34.3629317Z ======== ====== ============= =============
2024-12-14T09:55:34.3629873Z sum 1 3.16±0.02ms 3.77±0.04ms
2024-12-14T09:55:34.3630417Z sum 2 3.57±0.03ms 3.98±0.02ms
2024-12-14T09:55:34.3630988Z mean 1 3.23±0.01ms 3.51±0.03ms
2024-12-14T09:55:34.3631549Z mean 2 3.63±0.04ms 3.64±0.06ms
2024-12-14T09:55:34.3631982Z ======== ====== ============= =============
2024-12-14T09:55:34.3632267Z
2024-12-14T09:56:46.9330765Z [64.36%] ··· groupby.Resample.time_init ok
2024-12-14T09:56:46.9331409Z [64.36%] ··· ====== =============
2024-12-14T09:56:46.9331805Z ndim
2024-12-14T09:56:46.9332223Z ------ -------------
2024-12-14T09:56:46.9332704Z 1 1.23±0.01ms
2024-12-14T09:56:46.9333192Z 2 1.24±0.01ms
2024-12-14T09:56:46.9333647Z ====== =============
2024-12-14T09:56:46.9333907Z
2024-12-14T09:56:46.9334378Z [64.48%] ··· ...pby.ResampleCFTime.time_agg_large_num_groups ok
2024-12-14T09:56:46.9335168Z [64.48%] ··· ======== ====== ============= ============
2024-12-14T09:56:46.9335703Z -- use_flox
2024-12-14T09:56:46.9336529Z --------------- --------------------------
2024-12-14T09:56:46.9337044Z method ndim True False
2024-12-14T09:56:46.9337533Z ======== ====== ============= ============
2024-12-14T09:56:46.9338129Z sum 1 3.68±0.07ms 51.7±0.3ms
2024-12-14T09:56:46.9338715Z sum 2 4.05±0.04ms 53.0±0.4ms
2024-12-14T09:56:46.9339313Z mean 1 3.70±0.02ms 45.0±0.6ms
2024-12-14T09:56:46.9339898Z mean 2 4.12±0.01ms 44.0±0.3ms
2024-12-14T09:56:46.9340651Z ======== ====== ============= ============
2024-12-14T09:56:46.9340966Z
2024-12-14T09:56:46.9341439Z [64.60%] ··· ...pby.ResampleCFTime.time_agg_small_num_groups ok
2024-12-14T09:56:46.9342209Z [64.60%] ··· ======== ====== ============= =============
2024-12-14T09:56:46.9342898Z -- use_flox
2024-12-14T09:56:46.9343346Z --------------- ---------------------------
2024-12-14T09:56:46.9343783Z method ndim True False
2024-12-14T09:56:46.9344179Z ======== ====== ============= =============
2024-12-14T09:56:46.9344703Z sum 1 2.82±0.03ms 3.23±0.06ms
2024-12-14T09:56:46.9345203Z sum 2 3.23±0.02ms 3.48±0.03ms
2024-12-14T09:56:46.9345697Z mean 1 2.81±0.03ms 3.03±0.09ms
2024-12-14T09:56:46.9346189Z mean 2 3.26±0.04ms 3.16±0.09ms
2024-12-14T09:56:46.9346592Z ======== ====== ============= =============
2024-12-14T09:56:46.9346867Z
2024-12-14T09:56:46.9347281Z [64.73%] ··· groupby.ResampleCFTime.time_init ok
2024-12-14T09:56:46.9347873Z [64.73%] ··· ====== ============
2024-12-14T09:56:46.9348198Z ndim
2024-12-14T09:56:46.9348525Z ------ ------------
2024-12-14T09:56:46.9348944Z 1 2.78±0.1ms
2024-12-14T09:56:46.9349326Z 2 2.85±0.2ms
2024-12-14T09:56:46.9349639Z ====== ============
2024-12-14T09:56:46.9349863Z
2024-12-14T09:56:46.9350284Z [64.85%] ··· groupby.ResampleDask.time_agg_large_num_groups ok
2024-12-14T09:56:46.9350967Z [64.85%] ··· ======== ========== =========== ========== ============
2024-12-14T09:56:46.9351440Z -- ndim / use_flox
2024-12-14T09:56:46.9351914Z -------- ----------------------------------------------
2024-12-14T09:56:46.9352413Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:56:46.9352871Z ======== ========== =========== ========== ============
2024-12-14T09:56:46.9355269Z sum 312±3ms 883±10ms 828±7ms 1.26±0.01s
2024-12-14T09:56:46.9355970Z mean 370±5ms 502±6ms 994±10ms 845±5ms
2024-12-14T09:56:46.9356701Z ======== ========== =========== ========== ============
2024-12-14T09:56:46.9357003Z
2024-12-14T09:56:46.9357444Z [64.98%] ··· groupby.ResampleDask.time_agg_small_num_groups ok
2024-12-14T09:56:46.9358158Z [64.98%] ··· ======== =========== =========== ========== ===========
2024-12-14T09:56:46.9358620Z -- ndim / use_flox
2024-12-14T09:56:46.9359089Z -------- ----------------------------------------------
2024-12-14T09:56:46.9359589Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T09:56:46.9360048Z ======== =========== =========== ========== ===========
2024-12-14T09:56:46.9360624Z sum 128±0.8ms 105±1ms 359±3ms 271±3ms
2024-12-14T09:56:46.9361222Z mean 145±2ms 66.9±1ms 410±5ms 179±4ms
2024-12-14T09:56:46.9361679Z ======== =========== =========== ========== ===========
2024-12-14T09:56:46.9361973Z
2024-12-14T09:56:46.9362372Z [65.10%] ··· groupby.ResampleDask.time_init ok
2024-12-14T09:56:46.9362984Z [65.10%] ··· ====== =============
2024-12-14T09:56:46.9363343Z ndim
2024-12-14T09:56:46.9363696Z ------ -------------
2024-12-14T09:56:46.9364118Z 1 1.23±0.01ms
2024-12-14T09:56:46.9364537Z 2 1.25±0.01ms
2024-12-14T09:56:46.9364916Z ====== =============
2024-12-14T09:56:46.9365151Z
2024-12-14T09:56:46.9365559Z [65.22%] ··· import.Import.timeraw_import_xarray 336±9ms
2024-12-14T09:56:46.9366596Z [65.35%] ··· import.Import.timeraw_import_xarray_backends 332±6ms
2024-12-14T09:56:46.9367476Z [65.47%] ··· import.Import.timeraw_import_xarray_only 54.0±0.8ms
2024-12-14T09:56:46.9368342Z [65.59%] ··· import.Import.timeraw_import_xarray_plot 332±5ms
2024-12-14T09:56:46.9369113Z [65.72%] ··· indexing.Assignment.time_assignment_basic ok
2024-12-14T09:56:46.9369681Z [65.72%] ··· ================== ===========
2024-12-14T09:56:46.9369933Z key
2024-12-14T09:56:46.9370293Z ------------------ -----------
2024-12-14T09:56:46.9370585Z 1scalar 137±0.9μs
2024-12-14T09:56:46.9370987Z 1slice 126±1μs
2024-12-14T09:57:38.2688992Z 1slice-1scalar 132±2μs
2024-12-14T09:57:38.2689761Z 2slicess-1scalar 134±1μs
2024-12-14T09:57:38.2690322Z ================== ===========
2024-12-14T09:57:38.2690678Z
2024-12-14T09:57:38.2691485Z [65.84%] ··· indexing.Assignment.time_assignment_outer ok
2024-12-14T09:57:38.2692544Z [65.84%] ··· ============ =============
2024-12-14T09:57:38.2693184Z key
2024-12-14T09:57:38.2693776Z ------------ -------------
2024-12-14T09:57:38.2694441Z 1d 477±8μs
2024-12-14T09:57:38.2695116Z 2d 2.50±0.01ms
2024-12-14T09:57:38.2695774Z 2d-1scalar 209±2μs
2024-12-14T09:57:38.2696319Z ============ =============
2024-12-14T09:57:38.2696692Z
2024-12-14T09:57:38.2697301Z [65.97%] ··· indexing.Assignment.time_assignment_vectorized ok
2024-12-14T09:57:38.2698242Z [65.97%] ··· ====== ==========
2024-12-14T09:57:38.2698737Z key
2024-12-14T09:57:38.2699281Z ------ ----------
2024-12-14T09:57:38.2699867Z 1-1d 584±20μs
2024-12-14T09:57:38.2700446Z 2-1d 380±6μs
2024-12-14T09:57:38.2701033Z 3-2d 498±2μs
2024-12-14T09:57:38.2701528Z ====== ==========
2024-12-14T09:57:38.2701848Z
2024-12-14T09:57:38.2702292Z [66.09%] ··· ...nmentOptimized.time_assign_identical_indexes 286±1μs
2024-12-14T09:57:38.2703348Z [66.21%] ··· ...g.AssignmentOptimized.time_assign_no_reindex 247±2μs
2024-12-14T09:57:38.2704340Z [66.34%] ··· indexing.BooleanIndexing.time_indexing 110±0.7ms
2024-12-14T09:57:38.2704866Z [66.46%] ··· ...ing.HugeAxisSmallSliceIndexing.time_indexing 58.9±0.5μs
2024-12-14T09:57:38.2705361Z [66.58%] ··· indexing.Indexing.time_indexing_basic ok
2024-12-14T09:57:38.2705748Z [66.58%] ··· ================== ===========
2024-12-14T09:57:38.2705998Z key
2024-12-14T09:57:38.2706233Z ------------------ -----------
2024-12-14T09:57:38.2706514Z 1scalar 119±0.3μs
2024-12-14T09:57:38.2706786Z 1slice 114±2μs
2024-12-14T09:57:38.2707072Z 1slice-1scalar 146±1μs
2024-12-14T09:57:38.2707355Z 2slicess-1scalar 198±4μs
2024-12-14T09:57:38.2707584Z ================== ===========
2024-12-14T09:57:38.2707730Z
2024-12-14T09:57:38.2707966Z [66.71%] ··· indexing.Indexing.time_indexing_basic_ds_large ok
2024-12-14T09:57:38.2708359Z [66.71%] ··· ================== =============
2024-12-14T09:57:38.2708598Z key
2024-12-14T09:57:38.2708827Z ------------------ -------------
2024-12-14T09:57:38.2709108Z 1scalar 2.04±0.03ms
2024-12-14T09:57:38.2709388Z 1slice 2.01±0.02ms
2024-12-14T09:57:38.2709660Z 1slice-1scalar 2.08±0.01ms
2024-12-14T09:57:38.2709943Z 2slicess-1scalar 2.17±0.01ms
2024-12-14T09:57:38.2710173Z ================== =============
2024-12-14T09:57:38.2710325Z
2024-12-14T09:57:38.2710737Z [66.83%] ··· indexing.Indexing.time_indexing_outer ok
2024-12-14T09:57:38.2711124Z [66.83%] ··· ============ =============
2024-12-14T09:57:38.2711390Z key
2024-12-14T09:57:38.2711631Z ------------ -------------
2024-12-14T09:57:38.2711897Z 1d 420±20μs
2024-12-14T09:57:38.2712168Z 2d 1.12±0.08ms
2024-12-14T09:57:38.2712433Z 2d-1scalar 434±6μs
2024-12-14T09:57:38.2712645Z ============ =============
2024-12-14T09:57:38.2712793Z
2024-12-14T09:57:38.2713033Z [66.96%] ··· indexing.Indexing.time_indexing_vectorized ok
2024-12-14T09:57:38.2713414Z [66.96%] ··· ====== ==========
2024-12-14T09:57:38.2713619Z key
2024-12-14T09:57:38.2713822Z ------ ----------
2024-12-14T09:57:38.2714062Z 1-1d 542±20μs
2024-12-14T09:57:38.2714292Z 2-1d 506±10μs
2024-12-14T09:57:38.2714535Z 3-2d 741±4μs
2024-12-14T09:57:38.2714731Z ====== ==========
2024-12-14T09:57:38.2714856Z
2024-12-14T09:57:38.2715096Z [67.08%] ··· indexing.IndexingDask.time_indexing_basic ok
2024-12-14T09:57:38.2715485Z [67.08%] ··· ================== ============
2024-12-14T09:57:38.2715732Z key
2024-12-14T09:57:38.2715963Z ------------------ ------------
2024-12-14T09:57:38.2716241Z 1scalar 7.42±0.1ms
2024-12-14T09:57:38.2716517Z 1slice 7.58±0.1ms
2024-12-14T09:57:38.2716790Z 1slice-1scalar 7.52±0.1ms
2024-12-14T09:57:38.2717076Z 2slicess-1scalar 50.0±0.9ms
2024-12-14T09:57:38.2717305Z ================== ============
2024-12-14T09:57:38.2717454Z
2024-12-14T09:57:38.2717678Z [67.20%] ··· ...ng.IndexingDask.time_indexing_basic_ds_large ok
2024-12-14T09:57:38.2718062Z [67.20%] ··· ================== =============
2024-12-14T09:57:38.2718299Z key
2024-12-14T09:57:38.2718528Z ------------------ -------------
2024-12-14T09:57:38.2718808Z 1scalar 2.05±0.02ms
2024-12-14T09:57:38.2719085Z 1slice 2.04±0.02ms
2024-12-14T09:57:38.2719503Z 1slice-1scalar 2.13±0.01ms
2024-12-14T09:57:38.2719790Z 2slicess-1scalar 2.19±0.02ms
2024-12-14T09:57:38.2720020Z ================== =============
2024-12-14T09:57:38.2720168Z
2024-12-14T09:57:38.2720408Z [67.33%] ··· indexing.IndexingDask.time_indexing_outer ok
2024-12-14T09:57:38.2720773Z [67.33%] ··· ============ ==========
2024-12-14T09:57:38.2720989Z key
2024-12-14T09:57:38.2721208Z ------------ ----------
2024-12-14T09:57:38.2721458Z 1d 223±5ms
2024-12-14T09:57:38.2721710Z 2d 460±10ms
2024-12-14T09:57:38.2721967Z 2d-1scalar 211±4ms
2024-12-14T09:57:38.2722181Z ============ ==========
2024-12-14T09:57:38.2722321Z
2024-12-14T09:57:38.2722557Z [67.45%] ··· indexing.IndexingDask.time_indexing_vectorized ok
2024-12-14T09:57:38.2722927Z [67.45%] ··· ====== ============
2024-12-14T09:57:48.8440310Z key
2024-12-14T09:57:48.8440821Z ------ ------------
2024-12-14T09:57:48.8441543Z 1-1d 220±6ms
2024-12-14T09:57:48.8442053Z 2-1d 90.9±2ms
2024-12-14T09:57:48.8442497Z 3-2d 37.8±0.6ms
2024-12-14T09:57:48.8442869Z ====== ============
2024-12-14T09:57:48.8443130Z
2024-12-14T09:57:48.8443639Z [67.57%] ··· interp.Interpolation.time_interpolation ok
2024-12-14T09:57:48.8444369Z [67.57%] ··· ======== ============ ============
2024-12-14T09:57:48.8444841Z -- is_short
2024-12-14T09:57:48.8445649Z -------- -------------------------
2024-12-14T09:57:48.8446160Z method True False
2024-12-14T09:57:48.8446645Z ======== ============ ============
2024-12-14T09:57:48.8447224Z linear 11.1±0.5ms 13.8±0.5ms
2024-12-14T09:57:48.8447897Z cubic 67.0±2ms 67.9±0.9ms
2024-12-14T09:57:48.8448405Z ======== ============ ============
2024-12-14T09:57:48.8448740Z
2024-12-14T09:57:48.8449280Z [67.70%] ··· interp.Interpolation.time_interpolation_2d ok
2024-12-14T09:57:48.8450020Z [67.70%] ··· ========= ============
2024-12-14T09:57:48.8450386Z method
2024-12-14T09:57:48.8450756Z --------- ------------
2024-12-14T09:57:48.8451188Z linear 18.7±0.2ms
2024-12-14T09:57:48.8451632Z nearest 15.2±0.5ms
2024-12-14T09:57:48.8451992Z ========= ============
2024-12-14T09:57:48.8452223Z
2024-12-14T09:57:48.8452669Z [67.82%] ··· interp.InterpolationDask.time_interpolation ok
2024-12-14T09:57:48.8453368Z [67.82%] ··· ======== ============ ==========
2024-12-14T09:57:48.8453776Z -- is_short
2024-12-14T09:57:48.8454146Z -------- -----------------------
2024-12-14T09:57:48.8454528Z method True False
2024-12-14T09:57:48.8454927Z ======== ============ ==========
2024-12-14T09:57:48.8455389Z linear 28.7±0.5ms 36.4±1ms
2024-12-14T09:57:48.8455866Z cubic 74.7±0.5ms 86.0±1ms
2024-12-14T09:57:48.8456254Z ======== ============ ==========
2024-12-14T09:57:48.8456502Z
2024-12-14T09:57:48.8456936Z [67.95%] ··· interp.InterpolationDask.time_interpolation_2d ok
2024-12-14T09:57:48.8457586Z [67.95%] ··· ========= ==========
2024-12-14T09:57:48.8457940Z method
2024-12-14T09:57:48.8458288Z --------- ----------
2024-12-14T09:57:48.8458726Z linear 38.5±1ms
2024-12-14T09:57:48.8459157Z nearest 33.3±1ms
2024-12-14T09:57:48.8459498Z ========= ==========
2024-12-14T09:57:48.8459733Z
2024-12-14T09:57:48.8460146Z [68.07%] ··· ...e.DatasetAddVariable.time_merge_two_datasets ok
2024-12-14T09:57:48.8460810Z [68.07%] ··· =================== ============
2024-12-14T09:57:48.8461474Z existing_elements
2024-12-14T09:57:48.8461877Z ------------------- ------------
2024-12-14T09:57:48.8462359Z 0 77.7±0.2μs
2024-12-14T09:57:48.8463043Z 10 183±2μs
2024-12-14T09:57:48.8463529Z 100 980±10μs
2024-12-14T09:57:48.8463998Z 1000 9.06±0.3ms
2024-12-14T09:57:48.8464381Z =================== ============
2024-12-14T09:57:48.8464639Z
2024-12-14T09:57:48.8465072Z [68.19%] ··· ...e.DatasetAddVariable.time_variable_insertion ok
2024-12-14T09:57:48.8465745Z [68.19%] ··· =================== =============
2024-12-14T09:57:48.8466165Z existing_elements
2024-12-14T09:57:48.8466572Z ------------------- -------------
2024-12-14T09:57:48.8467052Z 0 84.9±0.6μs
2024-12-14T09:57:48.8467537Z 10 137±1μs
2024-12-14T09:57:48.8467995Z 100 535±7μs
2024-12-14T09:57:48.8468463Z 1000 4.58±0.02ms
2024-12-14T09:57:48.8468845Z =================== =============
2024-12-14T09:57:48.8469099Z
2024-12-14T09:57:48.8469520Z [68.32%] ··· merge.DatasetCreation.time_dataset_creation ok
2024-12-14T09:57:48.8470213Z [68.32%] ··· ==================== ======= =============
2024-12-14T09:57:48.8470650Z strategy count
2024-12-14T09:57:48.8471075Z -------------------- ------- -------------
2024-12-14T09:57:48.8471809Z dict_of_DataArrays 0 145±1μs
2024-12-14T09:57:48.8472382Z dict_of_DataArrays 1 241±6μs
2024-12-14T09:57:48.8472930Z dict_of_DataArrays 10 736±4μs
2024-12-14T09:57:48.8473484Z dict_of_DataArrays 100 5.23±0.04ms
2024-12-14T09:57:48.8474013Z dict_of_DataArrays 1000 49.8±1ms
2024-12-14T09:57:48.8474389Z dict_of_Variables 0 146±2μs
2024-12-14T09:57:48.8474771Z dict_of_Variables 1 168±3μs
2024-12-14T09:57:48.8475196Z dict_of_Variables 10 260±4μs
2024-12-14T09:57:48.8475507Z dict_of_Variables 100 1.10±0.02ms
2024-12-14T09:57:48.8475954Z dict_of_Variables 1000 9.35±0.2ms
2024-12-14T09:57:48.8476358Z dict_of_Tuples 0 148±2μs
2024-12-14T09:57:48.8476688Z dict_of_Tuples 1 160±2μs
2024-12-14T09:57:48.8477115Z dict_of_Tuples 10 232±4μs
2024-12-14T09:57:48.8477423Z dict_of_Tuples 100 865±10μs
2024-12-14T09:57:48.8477836Z dict_of_Tuples 1000 7.34±0.1ms
2024-12-14T09:57:48.8478102Z ==================== ======= =============
2024-12-14T09:57:48.8478361Z
2024-12-14T09:57:48.8478637Z [68.44%] ··· pandas.MultiIndexSeries.time_from_series ok
2024-12-14T09:57:48.8479163Z [68.44%] ··· ======= ============= ============
2024-12-14T09:57:48.8479499Z -- subset
2024-12-14T09:57:48.8479764Z ------- --------------------------
2024-12-14T09:58:15.7738413Z dtype True False
2024-12-14T09:58:15.7739039Z ======= ============= ============
2024-12-14T09:58:15.7739829Z int 1.99±0.07ms 3.87±0.1ms
2024-12-14T09:58:15.7740159Z float 1.99±0.09ms 3.88±0.2ms
2024-12-14T09:58:15.7740441Z ======= ============= ============
2024-12-14T09:58:15.7740614Z
2024-12-14T09:58:15.7740893Z [68.56%] ··· pandas.ToDataFrame.peakmem_to_dataframe 2.97G
2024-12-14T09:58:15.7741549Z [68.69%] ··· pandas.ToDataFrame.time_to_dataframe 722±4ms
2024-12-14T09:58:15.7742360Z [68.81%] ··· pandas.ToDataFrameDask.peakmem_to_dataframe 230M
2024-12-14T09:58:15.7743087Z [68.94%] ··· pandas.ToDataFrameDask.time_to_dataframe 449±2ms
2024-12-14T09:58:15.7743589Z [69.06%] ··· polyfit.Polyval.peakmem_polyval ok
2024-12-14T09:58:15.7743972Z [69.06%] ··· ========= ====== ====== ======
2024-12-14T09:58:15.7744246Z -- ndeg
2024-12-14T09:58:15.7744485Z --------- --------------------
2024-12-14T09:58:15.7744716Z nx 2 5 20
2024-12-14T09:58:15.7744941Z ========= ====== ====== ======
2024-12-14T09:58:15.7745174Z 100 170M 170M 170M
2024-12-14T09:58:15.7745400Z 1000000 185M 185M 185M
2024-12-14T09:58:15.7745622Z ========= ====== ====== ======
2024-12-14T09:58:15.7745769Z
2024-12-14T09:58:15.7745999Z [69.18%] ··· polyfit.Polyval.time_polyval ok
2024-12-14T09:58:15.7746407Z [69.18%] ··· ========= ============= ============= ============
2024-12-14T09:58:15.7746682Z -- ndeg
2024-12-14T09:58:15.7746958Z --------- ----------------------------------------
2024-12-14T09:58:15.7747223Z nx 2 5 20
2024-12-14T09:58:15.7747472Z ========= ============= ============= ============
2024-12-14T09:58:15.7747786Z 100 813±4μs 1.29±0.01ms 3.62±0.2ms
2024-12-14T09:58:15.7748127Z 1000000 2.20±0.03ms 4.25±0.2ms 13.7±0.5ms
2024-12-14T09:58:15.7748584Z ========= ============= ============= ============
2024-12-14T09:58:15.7748758Z
2024-12-14T09:58:15.7749017Z [69.31%] ··· polyfit.PolyvalDask.peakmem_polyval ok
2024-12-14T09:58:15.7749407Z [69.31%] ··· ========= ====== ====== ======
2024-12-14T09:58:15.7749650Z -- ndeg
2024-12-14T09:58:15.7749897Z --------- --------------------
2024-12-14T09:58:15.7750122Z nx 2 5 20
2024-12-14T09:58:15.7750344Z ========= ====== ====== ======
2024-12-14T09:58:15.7750567Z 100 194M 195M 195M
2024-12-14T09:58:15.7750785Z 1000000 214M 214M 220M
2024-12-14T09:58:15.7751006Z ========= ====== ====== ======
2024-12-14T09:58:15.7751153Z
2024-12-14T09:58:15.7751391Z [69.43%] ··· polyfit.PolyvalDask.time_polyval ok
2024-12-14T09:58:15.7751791Z [69.43%] ··· ========= ============= ============ ============
2024-12-14T09:58:15.7752069Z -- ndeg
2024-12-14T09:58:15.7752340Z --------- ---------------------------------------
2024-12-14T09:58:15.7752596Z nx 2 5 20
2024-12-14T09:58:15.7752848Z ========= ============= ============ ============
2024-12-14T09:58:15.7753196Z 100 5.30±0.08ms 10.0±0.4ms 33.7±0.2ms
2024-12-14T09:58:15.7753514Z 1000000 51.5±1ms 88.7±0.7ms 342±5ms
2024-12-14T09:58:15.7753776Z ========= ============= ============ ============
2024-12-14T09:58:15.7753939Z
2024-12-14T09:58:15.7754179Z [69.55%] ··· reindexing.Reindex.time_1d_coarse 506±8μs
2024-12-14T09:58:15.7754662Z [69.68%] ··· reindexing.Reindex.time_1d_fine_all_found 1.74±0.05ms
2024-12-14T09:58:15.7755144Z [69.80%] ··· reindexing.Reindex.time_1d_fine_some_missing 9.33±0.4ms
2024-12-14T09:58:15.7755628Z [69.93%] ··· reindexing.Reindex.time_2d_coarse 1.94±0.02ms
2024-12-14T09:58:15.7756097Z [70.05%] ··· reindexing.Reindex.time_2d_fine_all_found 24.0±0.07ms
2024-12-14T09:58:15.7756575Z [70.17%] ··· reindexing.Reindex.time_2d_fine_some_missing 36.6±0.2ms
2024-12-14T09:58:15.7757056Z [70.30%] ··· reindexing.ReindexDask.time_1d_coarse 3.43±0.7ms
2024-12-14T09:58:15.7757709Z [70.42%] ··· reindexing.ReindexDask.time_1d_fine_all_found 9.62±0.9ms
2024-12-14T09:58:15.7758186Z [70.54%] ··· ...dexing.ReindexDask.time_1d_fine_some_missing 19.6±0.9ms
2024-12-14T09:58:15.7758665Z [70.67%] ··· reindexing.ReindexDask.time_2d_coarse 5.50±0.2ms
2024-12-14T09:58:15.7759153Z [70.79%] ··· reindexing.ReindexDask.time_2d_fine_all_found 23.7±0.5ms
2024-12-14T09:58:15.7759630Z [70.92%] ··· ...dexing.ReindexDask.time_2d_fine_some_missing 40.8±0.5ms
2024-12-14T09:58:15.7760112Z [71.04%] ··· renaming.SwapDims.time_swap_dims ok
2024-12-14T09:58:15.7760490Z [71.04%] ··· ========== ============
2024-12-14T09:58:15.7760709Z size
2024-12-14T09:58:15.7760931Z ---------- ------------
2024-12-14T09:58:15.7761191Z 1000 35.1±0.4μs
2024-12-14T09:58:15.7761435Z 100000 35.0±0.5μs
2024-12-14T09:58:15.7761695Z 10000000 35.7±0.4μs
2024-12-14T09:58:15.7761907Z ========== ============
2024-12-14T09:58:15.7762044Z
2024-12-14T09:58:15.7762275Z [71.16%] ··· renaming.SwapDims.time_swap_dims_newindex ok
2024-12-14T09:58:40.2459837Z [71.16%] ··· ========== =========
2024-12-14T09:58:40.2461155Z size
2024-12-14T09:58:40.2461883Z ---------- ---------
2024-12-14T09:58:40.2462496Z 1000 193±2μs
2024-12-14T09:58:40.2463162Z 100000 190±4μs
2024-12-14T09:58:40.2463626Z 10000000 194±1μs
2024-12-14T09:58:40.2464345Z ========== =========
2024-12-14T09:58:40.2464623Z
2024-12-14T09:58:40.2465062Z [71.29%] ··· repr.Repr.time_repr 7.66±0.06ms
2024-12-14T09:58:40.2465939Z [71.41%] ··· repr.Repr.time_repr_html 101±1ms
2024-12-14T09:58:40.2467071Z [71.53%] ··· repr.ReprMultiIndex.time_repr 703±1μs
2024-12-14T09:58:40.2468078Z [71.66%] ··· repr.ReprMultiIndex.time_repr_html 2.86±0.02ms
2024-12-14T09:58:40.2469038Z [71.78%] ··· ...rayRollingMemory.peakmem_1drolling_construct ok
2024-12-14T09:58:40.2469771Z [71.78%] ··· ======== ======
2024-12-14T09:58:40.2470159Z stride
2024-12-14T09:58:40.2470715Z -------- ------
2024-12-14T09:58:40.2471111Z None 199M
2024-12-14T09:58:40.2471492Z 5 199M
2024-12-14T09:58:40.2471843Z 50 199M
2024-12-14T09:58:40.2472176Z ======== ======
2024-12-14T09:58:40.2472413Z
2024-12-14T09:58:40.2472861Z [71.91%] ··· ...aArrayRollingMemory.peakmem_1drolling_reduce ok
2024-12-14T09:58:40.2473514Z [71.91%] ··· ====== ====== =======
2024-12-14T09:58:40.2473896Z -- use_bottleneck
2024-12-14T09:58:40.2474275Z ------ --------------
2024-12-14T09:58:40.2474655Z func True False
2024-12-14T09:58:40.2475018Z ====== ====== =======
2024-12-14T09:58:40.2475544Z sum 172M 173M
2024-12-14T09:58:40.2475903Z max 172M 173M
2024-12-14T09:58:40.2476266Z mean 172M 173M
2024-12-14T09:58:40.2476625Z ====== ====== =======
2024-12-14T09:58:40.2476862Z
2024-12-14T09:58:40.2477307Z [72.03%] ··· ...aArrayRollingMemory.peakmem_ndrolling_reduce ok
2024-12-14T09:58:40.2477965Z [72.03%] ··· ====== ====== =======
2024-12-14T09:58:40.2478333Z -- use_bottleneck
2024-12-14T09:58:40.2478711Z ------ --------------
2024-12-14T09:58:40.2479083Z func True False
2024-12-14T09:58:40.2479444Z ====== ====== =======
2024-12-14T09:58:40.2479797Z sum 191M 191M
2024-12-14T09:58:40.2480320Z max 191M 191M
2024-12-14T09:58:40.2480685Z mean 191M 191M
2024-12-14T09:58:40.2481302Z ====== ====== =======
2024-12-14T09:58:40.2481542Z
2024-12-14T09:58:40.2481989Z [72.15%] ··· ...setRollingMemory.peakmem_1drolling_construct ok
2024-12-14T09:58:40.2482632Z [72.15%] ··· ======== ======
2024-12-14T09:58:40.2482970Z stride
2024-12-14T09:58:40.2483323Z -------- ------
2024-12-14T09:58:40.2483666Z None 199M
2024-12-14T09:58:40.2483999Z 5 199M
2024-12-14T09:58:40.2484332Z 50 199M
2024-12-14T09:58:40.2484845Z ======== ======
2024-12-14T09:58:40.2485063Z
2024-12-14T09:58:40.2485513Z [72.28%] ··· ...atasetRollingMemory.peakmem_1drolling_reduce ok
2024-12-14T09:58:40.2486165Z [72.28%] ··· ====== ====== =======
2024-12-14T09:58:40.2486575Z -- use_bottleneck
2024-12-14T09:58:40.2486947Z ------ --------------
2024-12-14T09:58:40.2487324Z func True False
2024-12-14T09:58:40.2487701Z ====== ====== =======
2024-12-14T09:58:40.2488055Z sum 196M 299M
2024-12-14T09:58:40.2488418Z max 196M 299M
2024-12-14T09:58:40.2488781Z mean 196M 299M
2024-12-14T09:58:40.2489133Z ====== ====== =======
2024-12-14T09:58:40.2489381Z
2024-12-14T09:58:40.2490038Z [72.40%] ··· ...atasetRollingMemory.peakmem_ndrolling_reduce ok
2024-12-14T09:58:40.2490701Z [72.40%] ··· ====== ====== =======
2024-12-14T09:58:40.2491064Z -- use_bottleneck
2024-12-14T09:58:40.2491434Z ------ --------------
2024-12-14T09:58:40.2491969Z func True False
2024-12-14T09:58:40.2492338Z ====== ====== =======
2024-12-14T09:58:40.2492699Z sum 215M 311M
2024-12-14T09:58:40.2493062Z max 215M 311M
2024-12-14T09:58:40.2493419Z mean 215M 306M
2024-12-14T09:58:40.2493779Z ====== ====== =======
2024-12-14T09:58:40.2494025Z
2024-12-14T09:58:40.2494626Z [72.52%] ··· rolling.Rolling.time_rolling ok
2024-12-14T09:58:40.2495302Z [72.52%] ··· ======= ======== ============ ============
2024-12-14T09:58:40.2495761Z -- use_bottleneck
2024-12-14T09:58:40.2496213Z ---------------- -------------------------
2024-12-14T09:58:40.2496650Z func center True False
2024-12-14T09:58:40.2497079Z ======= ======== ============ ============
2024-12-14T09:58:40.2497611Z mean True 22.5±0.2ms 143±0.3ms
2024-12-14T09:58:40.2498174Z mean False 13.5±0.9ms 144±0.9ms
2024-12-14T09:58:40.2498866Z count True 58.6±0.3ms 58.5±0.3ms
2024-12-14T09:58:40.2499413Z count False 58.7±0.4ms 58.7±0.4ms
2024-12-14T09:58:40.2499829Z ======= ======== ============ ============
2024-12-14T09:58:40.2500109Z
2024-12-14T09:58:40.2500538Z [72.65%] ··· rolling.Rolling.time_rolling_construct ok
2024-12-14T09:58:40.2501239Z [72.65%] ··· ======== ======== ========= =========
2024-12-14T09:58:40.2501677Z -- use_bottleneck
2024-12-14T09:58:40.2502105Z ----------------- -------------------
2024-12-14T09:58:40.2502530Z center stride True False
2024-12-14T09:58:40.2503074Z ======== ======== ========= =========
2024-12-14T09:58:40.2503754Z True 1 (0) 230±3ms 231±4ms
2024-12-14T09:58:40.2504270Z True 1 (1) 230±5ms 232±5ms
2024-12-14T09:58:40.2504785Z False 1 (0) 230±5ms 232±7ms
2024-12-14T09:58:40.2505294Z False 1 (1) 230±5ms 231±5ms
2024-12-14T09:58:40.2505703Z ======== ======== ========= =========
2024-12-14T09:58:40.2505973Z
2024-12-14T10:18:30.0516874Z [72.77%] ··· rolling.Rolling.time_rolling_long ok
2024-12-14T10:18:30.0517727Z [72.77%] ··· ======= ======== ============= =============
2024-12-14T10:18:30.0518075Z -- use_bottleneck
2024-12-14T10:18:30.0518402Z ---------------- ---------------------------
2024-12-14T10:18:30.0518738Z func pandas True False
2024-12-14T10:18:30.0519045Z ======= ======== ============= =============
2024-12-14T10:18:30.0519410Z mean True 496±3μs 493±2μs
2024-12-14T10:18:30.0519776Z mean False 255±2μs 6.33±0.07ms
2024-12-14T10:18:30.0520159Z count True 594±10μs 589±20μs
2024-12-14T10:18:30.0520523Z count False 2.15±0.04ms 2.10±0.02ms
2024-12-14T10:18:30.0520821Z ======= ======== ============= =============
2024-12-14T10:18:30.0521015Z
2024-12-14T10:18:30.0521302Z [72.90%] ··· rolling.Rolling.time_rolling_np ok
2024-12-14T10:18:30.0521786Z [72.90%] ··· ========= ============= ========= =========
2024-12-14T10:18:30.0522100Z -- use_bottleneck
2024-12-14T10:18:30.0522412Z ----------------------- -------------------
2024-12-14T10:18:30.0522721Z window_ min_periods True False
2024-12-14T10:18:30.0523026Z ========= ============= ========= =========
2024-12-14T10:18:30.0523374Z 20 5 (0) 199±1ms 198±3ms
2024-12-14T10:18:30.0523720Z 20 5 (1) 198±4ms 197±5ms
2024-12-14T10:18:30.0524067Z 40 5 (0) 355±5ms 358±7ms
2024-12-14T10:18:30.0524590Z 40 5 (1) 359±8ms 356±7ms
2024-12-14T10:18:30.0524880Z ========= ============= ========= =========
2024-12-14T10:18:30.0525070Z
2024-12-14T10:18:30.0544456Z ##[error][73.02%] ··· rolling.RollingDask.time_rolling 2/8 failed
2024-12-14T10:18:30.0550963Z [73.02%] ··· ======= ======== =========== ============
2024-12-14T10:18:30.0551284Z -- use_bottleneck
2024-12-14T10:18:30.0551563Z ---------------- ------------------------
2024-12-14T10:18:30.0551836Z func center True False
2024-12-14T10:18:30.0552077Z ======= ======== =========== ============
2024-12-14T10:18:30.0552385Z mean True 888±50ms failed
2024-12-14T10:18:30.0552693Z mean False 659±40ms failed
2024-12-14T10:18:30.0552993Z count True 21.8±0.3s 21.3±0.2s
2024-12-14T10:18:30.0553312Z count False 22.0±0.2s 21.4±0.06s
2024-12-14T10:18:30.0553562Z ======= ======== =========== ============
2024-12-14T10:18:30.0553813Z For parameters: 'mean', True, False
2024-12-14T10:18:30.0554062Z
2024-12-14T10:18:30.0554232Z
2024-12-14T10:18:30.0554435Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:30.0554681Z
2024-12-14T10:18:30.0554871Z For parameters: 'mean', False, False
2024-12-14T10:18:30.0555106Z
2024-12-14T10:18:30.0555264Z
2024-12-14T10:18:30.0555458Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:30.0555634Z
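The "asv: benchmark timed out (timeout 60.0s)" failures above come from ASV's default 60-second per-benchmark limit. A benchmark class can raise that limit via the documented `timeout` attribute. The sketch below is illustrative only — the class name, data sizes, and windowed-mean logic are invented for the example and are not part of the xarray benchmark suite:

```python
# Illustrative ASV-style benchmark class. ASV reads the `timeout`
# attribute (seconds) and kills the benchmark if it runs longer;
# the default is 60 s, which is what the failures above hit.
import numpy as np


class SlowRolling:
    # Raise the per-benchmark limit from the 60 s default.
    timeout = 180.0

    def setup(self):
        # Fixed-size input so the timing is reproducible.
        self.data = np.random.randn(10_000)

    def time_mean(self):
        # Windowed mean of width w via a cumulative-sum difference:
        # mean over data[i:i+w] == (c[i+w] - c[i]) / w.
        w = 20
        c = np.cumsum(np.insert(self.data, 0, 0.0))
        self.means = (c[w:] - c[:-w]) / w
```

Under ASV's conventions, `setup` runs outside the timed region and every method named `time_*` is timed; only the timing loop itself counts against `timeout`.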
2024-12-14T10:18:30.0557856Z [73.14%] ··· rolling.RollingDask.time_rolling_construct ok
2024-12-14T10:18:30.0558509Z [73.14%] ··· ======== ======== ============ ============
2024-12-14T10:18:30.0558777Z -- use_bottleneck
2024-12-14T10:18:30.0559033Z ----------------- -------------------------
2024-12-14T10:18:30.0559290Z center stride True False
2024-12-14T10:18:30.0559536Z ======== ======== ============ ============
2024-12-14T10:18:30.0559832Z True 1 (0) 21.0±0.2s 20.8±0.2s
2024-12-14T10:18:30.0560144Z True 1 (1) 20.8±0.2s 21.0±0.04s
2024-12-14T10:18:30.0560449Z False 1 (0) 20.9±0.08s 20.5±0.04s
2024-12-14T10:18:30.0560743Z False 1 (1) 20.5±0.03s 20.6±0.01s
2024-12-14T10:18:30.0560983Z ======== ======== ============ ============
2024-12-14T10:18:30.0561138Z
2024-12-14T10:18:30.0561380Z [73.27%] ··· rolling.RollingDask.time_rolling_long ok
2024-12-14T10:18:30.0561787Z [73.27%] ··· ======= ======== ============= =============
2024-12-14T10:18:30.0562053Z -- use_bottleneck
2024-12-14T10:18:30.0562317Z ---------------- ---------------------------
2024-12-14T10:18:30.0562572Z func pandas True False
2024-12-14T10:18:30.0562818Z ======= ======== ============= =============
2024-12-14T10:18:30.0563125Z mean True 1.29±0.04ms 1.29±0.04ms
2024-12-14T10:18:30.0563421Z mean False 5.27±0.07ms 29.6±0.4ms
2024-12-14T10:18:30.0563881Z count True 1.41±0.04ms 1.41±0.04ms
2024-12-14T10:18:30.0564195Z count False 13.0±0.2ms 13.1±0.2ms
2024-12-14T10:18:30.0564432Z ======= ======== ============= =============
2024-12-14T10:18:30.0564595Z
2024-12-14T10:18:30.0565389Z ##[error][73.39%] ··· rolling.RollingDask.time_rolling_np failed
2024-12-14T10:18:30.0566269Z [73.39%] ··· ========= ============= ======== ========
2024-12-14T10:18:30.0566553Z -- use_bottleneck
2024-12-14T10:18:30.0566819Z ----------------------- -----------------
2024-12-14T10:18:31.6136360Z window_ min_periods True False
2024-12-14T10:18:31.6136873Z ========= ============= ======== ========
2024-12-14T10:18:31.6137310Z 20 5 (0) failed failed
2024-12-14T10:18:31.6137739Z 20 5 (1) failed failed
2024-12-14T10:18:31.6138151Z 40 5 (0) failed failed
2024-12-14T10:18:31.6138600Z 40 5 (1) failed failed
2024-12-14T10:18:31.6139011Z ========= ============= ======== ========
2024-12-14T10:18:31.6139337Z For parameters: 20, 5 (0), True
2024-12-14T10:18:31.6139590Z
2024-12-14T10:18:31.6139759Z
2024-12-14T10:18:31.6139973Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6140228Z
2024-12-14T10:18:31.6140412Z For parameters: 20, 5 (0), False
2024-12-14T10:18:31.6140639Z
2024-12-14T10:18:31.6140796Z
2024-12-14T10:18:31.6140985Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6141224Z
2024-12-14T10:18:31.6141402Z For parameters: 20, 5 (1), True
2024-12-14T10:18:31.6141626Z
2024-12-14T10:18:31.6141783Z
2024-12-14T10:18:31.6141973Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6142230Z
2024-12-14T10:18:31.6142410Z For parameters: 20, 5 (1), False
2024-12-14T10:18:31.6142914Z
2024-12-14T10:18:31.6143086Z
2024-12-14T10:18:31.6143271Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6143515Z
2024-12-14T10:18:31.6143696Z For parameters: 40, 5 (0), True
2024-12-14T10:18:31.6144170Z
2024-12-14T10:18:31.6144326Z
2024-12-14T10:18:31.6144519Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6144759Z
2024-12-14T10:18:31.6144937Z For parameters: 40, 5 (0), False
2024-12-14T10:18:31.6145167Z
2024-12-14T10:18:31.6145319Z
2024-12-14T10:18:31.6145506Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6145745Z
2024-12-14T10:18:31.6145915Z For parameters: 40, 5 (1), True
2024-12-14T10:18:31.6146142Z
2024-12-14T10:18:31.6146298Z
2024-12-14T10:18:31.6146485Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6146727Z
2024-12-14T10:18:31.6146898Z For parameters: 40, 5 (1), False
2024-12-14T10:18:31.6147128Z
2024-12-14T10:18:31.6147282Z
2024-12-14T10:18:31.6147463Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6147649Z
2024-12-14T10:18:31.6148013Z [73.39%] ···· For parameters: 20, 5 (0), True
2024-12-14T10:18:31.6148252Z
2024-12-14T10:18:31.6148413Z
2024-12-14T10:18:31.6148615Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6148862Z
2024-12-14T10:18:31.6149036Z For parameters: 20, 5 (0), False
2024-12-14T10:18:31.6149267Z
2024-12-14T10:18:31.6149427Z
2024-12-14T10:18:31.6149612Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6149855Z
2024-12-14T10:18:31.6150209Z For parameters: 20, 5 (1), True
2024-12-14T10:18:31.6150435Z
2024-12-14T10:18:31.6150593Z
2024-12-14T10:18:31.6150783Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6151015Z
2024-12-14T10:18:31.6151195Z For parameters: 20, 5 (1), False
2024-12-14T10:18:31.6151431Z
2024-12-14T10:18:31.6151581Z
2024-12-14T10:18:31.6151771Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6152009Z
2024-12-14T10:18:31.6152182Z For parameters: 40, 5 (0), True
2024-12-14T10:18:31.6152407Z
2024-12-14T10:18:31.6184189Z
2024-12-14T10:18:31.6184504Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6184914Z
2024-12-14T10:18:31.6185203Z For parameters: 40, 5 (0), False
2024-12-14T10:18:31.6185568Z
2024-12-14T10:18:31.6185833Z
2024-12-14T10:18:31.6186163Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6186554Z
2024-12-14T10:18:31.6186837Z For parameters: 40, 5 (1), True
2024-12-14T10:18:31.6187222Z
2024-12-14T10:18:31.6187489Z
2024-12-14T10:18:31.6187794Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6188173Z
2024-12-14T10:18:31.6188471Z For parameters: 40, 5 (1), False
2024-12-14T10:18:31.6188863Z
2024-12-14T10:18:31.6189133Z
2024-12-14T10:18:31.6189486Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:18:31.6189814Z
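The repeated "asv: benchmark timed out (timeout 60.0s)" lines above mean these parameterized benchmarks exceeded asv's default per-benchmark time budget. A minimal sketch of how such a benchmark could raise its own budget via the `timeout` class attribute — the class name and workload here are made up; only the `timeout`/`params`/`param_names` conventions are asv's:

```python
# Hypothetical asv benchmark raising its per-benchmark timeout above the
# 60 s default seen in the log. The workload is a stand-in, not xarray's.


class TimeUnstackLarge:
    # asv reads this attribute and allows the benchmark this many seconds
    # before reporting "benchmark timed out".
    timeout = 180.0

    # params/param_names produce the "For parameters: 20, 5 (0), True"
    # style lines seen in the log output above.
    params = ([20, 40], [5], [True, False])
    param_names = ["size", "depth", "sparse"]

    def setup(self, size, depth, sparse):
        # Build a small 2-D list as stand-in data.
        self.data = [[i * j for j in range(size)] for i in range(size * depth)]

    def time_unstack(self, size, depth, sparse):
        # Stand-in workload; a real benchmark would call xarray's unstack here.
        return list(zip(*self.data))
```

With this attribute set, asv would report a result (or a slower timing) for these parameter combinations instead of the timeout marker.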
2024-12-14T10:18:31.6190408Z [73.51%] ··· unstacking.Unstacking.time_unstack_fast 2.99±0.08ms
2024-12-14T10:18:31.6191356Z [73.64%] ··· unstacking.Unstacking.time_unstack_pandas_slow 3.35±0.2ms
2024-12-14T10:18:31.6192319Z [73.76%] ··· unstacking.Unstacking.time_unstack_slow 3.00±0.07ms
2024-12-14T10:18:31.6193267Z [73.89%] ··· unstacking.UnstackingDask.time_unstack_fast 19.1±0.4ms
2024-12-14T10:18:31.6194219Z [74.01%] ··· ...king.UnstackingDask.time_unstack_pandas_slow 2.94±0.03ms
2024-12-14T10:18:31.6195247Z [74.13%] ··· unstacking.UnstackingDask.time_unstack_slow 3.04±0.02ms
2024-12-14T10:18:31.6196227Z [74.26%] ··· ...nstackingSparse.peakmem_unstack_to_sparse_2d n/a
2024-12-14T10:18:31.6197407Z [74.26%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6198234Z [74.38%] ··· ...nstackingSparse.peakmem_unstack_to_sparse_3d n/a
2024-12-14T10:18:31.6199035Z [74.38%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6199864Z [74.50%] ··· unstacking.UnstackingSparse.time_unstack_fast n/a
2024-12-14T10:18:31.6200680Z [74.50%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6201469Z [74.63%] ··· ...ng.UnstackingSparse.time_unstack_pandas_slow n/a
2024-12-14T10:18:31.6202253Z [74.63%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6203093Z [74.75%] ··· unstacking.UnstackingSparse.time_unstack_slow n/a
2024-12-14T10:18:31.6203790Z [74.75%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6204240Z [74.88%] ··· ...g.UnstackingSparse.time_unstack_to_sparse_2d n/a
2024-12-14T10:18:31.6204728Z [74.88%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:18:31.6205140Z [75.00%] ··· ...g.UnstackingSparse.time_unstack_to_sparse_3d n/a
2024-12-14T10:20:04.2974284Z [75.00%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:20:04.2974982Z [75.00%] · For xarray commit 755581c8 (round 2/2):
2024-12-14T10:20:04.2976388Z [75.00%] ·· Building for mamba-py3.11-bottleneck-cftime-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-setuptools_scm-sparse
2024-12-14T10:20:04.2978429Z [75.00%] ·· Benchmarking mamba-py3.11-bottleneck-cftime-dask-distributed-flox-netcdf4-numpy-numpy_groupies-pandas-scipy-setuptools_scm-sparse
2024-12-14T10:20:04.2980449Z [75.12%] ··· accessors.DateTimeAccessor.time_dayofyear ok
2024-12-14T10:20:04.2981349Z [75.12%] ··· ========== =============
2024-12-14T10:20:04.2981875Z calendar
2024-12-14T10:20:04.2982365Z ---------- -------------
2024-12-14T10:20:04.2983178Z standard 424±7μs
2024-12-14T10:20:04.2983741Z noleap 1.03±0.01ms
2024-12-14T10:20:04.2984175Z ========== =============
2024-12-14T10:20:04.2984471Z
2024-12-14T10:20:04.2984972Z [75.25%] ··· accessors.DateTimeAccessor.time_floor ok
2024-12-14T10:20:04.2985733Z [75.25%] ··· ========== ============
2024-12-14T10:20:04.2986187Z calendar
2024-12-14T10:20:04.2986676Z ---------- ------------
2024-12-14T10:20:04.2987208Z standard 405±2μs
2024-12-14T10:20:04.2987733Z noleap 78.7±0.4ms
2024-12-14T10:20:04.2988156Z ========== ============
2024-12-14T10:20:04.2988487Z
2024-12-14T10:20:04.2988990Z [75.37%] ··· accessors.DateTimeAccessor.time_year ok
2024-12-14T10:20:04.2989770Z [75.37%] ··· ========== ==========
2024-12-14T10:20:04.2990195Z calendar
2024-12-14T10:20:04.2990631Z ---------- ----------
2024-12-14T10:20:04.2991154Z standard 399±3μs
2024-12-14T10:20:04.2991593Z noleap 926±30μs
2024-12-14T10:20:04.2991956Z ========== ==========
2024-12-14T10:20:04.2992187Z
2024-12-14T10:20:04.2992597Z [75.50%] ··· alignment.Align.time_already_aligned ok
2024-12-14T10:20:04.2993226Z [75.50%] ··· ========== ============
2024-12-14T10:20:04.2993588Z join
2024-12-14T10:20:04.2993960Z ---------- ------------
2024-12-14T10:20:04.2994411Z outer 29.2±0.5ms
2024-12-14T10:20:04.2994846Z inner 28.9±0.9ms
2024-12-14T10:20:04.2995302Z left 29.1±0.7ms
2024-12-14T10:20:04.2995743Z right 29.0±0.6ms
2024-12-14T10:20:04.2996194Z exact 29.4±0.6ms
2024-12-14T10:20:04.2996644Z override 177±3μs
2024-12-14T10:20:04.2997010Z ========== ============
2024-12-14T10:20:04.2997282Z
2024-12-14T10:20:04.2997974Z [75.62%] ··· alignment.Align.time_not_aligned ok
2024-12-14T10:20:04.2998612Z [75.62%] ··· ======= =============
2024-12-14T10:20:04.2998971Z join
2024-12-14T10:20:04.2999328Z ------- -------------
2024-12-14T10:20:04.2999766Z outer 30.0±0.7ms
2024-12-14T10:20:04.3000204Z inner 1.04±0.03ms
2024-12-14T10:20:04.3000639Z left 30.1±0.8ms
2024-12-14T10:20:04.3001092Z right 883±20μs
2024-12-14T10:20:04.3001457Z ======= =============
2024-12-14T10:20:04.3001686Z
2024-12-14T10:20:04.3002113Z [75.74%] ··· ...nment.Align.time_not_aligned_random_integers ok
2024-12-14T10:20:04.3002740Z [75.74%] ··· ======= ============
2024-12-14T10:20:04.3003100Z join
2024-12-14T10:20:04.3003449Z ------- ------------
2024-12-14T10:20:04.3003875Z outer 31.4±1ms
2024-12-14T10:20:04.3004313Z inner 14.4±0.3ms
2024-12-14T10:20:04.3004724Z left 30.3±0.5ms
2024-12-14T10:20:04.3005146Z right 13.6±0.1ms
2024-12-14T10:20:04.3005504Z ======= ============
2024-12-14T10:20:04.3005747Z
2024-12-14T10:20:04.3006203Z [75.87%] ··· alignment.AlignCFTime.time_already_aligned ok
2024-12-14T10:20:04.3006879Z [75.87%] ··· ========== ============
2024-12-14T10:20:04.3007259Z join
2024-12-14T10:20:04.3007553Z ---------- ------------
2024-12-14T10:20:04.3007890Z outer 30.0±0.7ms
2024-12-14T10:20:04.3008372Z inner 29.1±0.4ms
2024-12-14T10:20:04.3008988Z left 29.4±0.5ms
2024-12-14T10:20:04.3009270Z right 29.4±0.5ms
2024-12-14T10:20:04.3009536Z exact 28.4±0.6ms
2024-12-14T10:20:04.3009795Z override 230±7μs
2024-12-14T10:20:04.3010010Z ========== ============
2024-12-14T10:20:04.3010163Z
2024-12-14T10:20:04.3010419Z [75.99%] ··· alignment.AlignCFTime.time_not_aligned ok
2024-12-14T10:20:04.3010799Z [75.99%] ··· ======= =============
2024-12-14T10:20:04.3011019Z join
2024-12-14T10:20:04.3011235Z ------- -------------
2024-12-14T10:20:04.3011491Z outer 44.5±0.5ms
2024-12-14T10:20:04.3011739Z inner 10.5±0.09ms
2024-12-14T10:20:04.3011984Z left 41.1±0.1ms
2024-12-14T10:20:04.3012430Z right 1.36±0.03ms
2024-12-14T10:20:04.3012679Z ======= =============
2024-12-14T10:20:04.3012815Z
2024-12-14T10:20:04.3013065Z [76.11%] ··· ...AlignCFTime.time_not_aligned_random_integers ok
2024-12-14T10:20:04.3013424Z [76.11%] ··· ======= ============
2024-12-14T10:20:04.3013635Z join
2024-12-14T10:20:04.3013843Z ------- ------------
2024-12-14T10:20:04.3014092Z outer 48.2±0.3ms
2024-12-14T10:20:04.3014344Z inner 33.6±0.4ms
2024-12-14T10:20:04.3014592Z left 43.7±0.5ms
2024-12-14T10:20:04.3014831Z right 23.3±0.2ms
2024-12-14T10:20:04.3015036Z ======= ============
2024-12-14T10:20:04.3015169Z
2024-12-14T10:20:04.3015410Z [76.24%] ··· alignment.AlignDask.time_already_aligned ok
2024-12-14T10:20:04.3015769Z [76.24%] ··· ========== ==========
2024-12-14T10:20:04.3015982Z join
2024-12-14T10:20:04.3016198Z ---------- ----------
2024-12-14T10:20:04.3016442Z outer 603±4μs
2024-12-14T10:20:04.3016700Z inner 607±8μs
2024-12-14T10:20:04.3016948Z left 613±5μs
2024-12-14T10:20:04.3017189Z right 618±7μs
2024-12-14T10:20:04.3017436Z exact 610±10μs
2024-12-14T10:21:00.8872768Z override 175±2μs
2024-12-14T10:21:00.8873296Z ========== ==========
2024-12-14T10:21:00.8873826Z
2024-12-14T10:21:00.8874326Z [76.36%] ··· alignment.AlignDask.time_not_aligned ok
2024-12-14T10:21:00.8875042Z [76.36%] ··· ======= =============
2024-12-14T10:21:00.8875424Z join
2024-12-14T10:21:00.8875808Z ------- -------------
2024-12-14T10:21:00.8876162Z outer 1.27±0.01ms
2024-12-14T10:21:00.8876431Z inner 1.61±0.02ms
2024-12-14T10:21:00.8876682Z left 1.13±0.01ms
2024-12-14T10:21:00.8876941Z right 1.42±0.01ms
2024-12-14T10:21:00.8877152Z ======= =============
2024-12-14T10:21:00.8877308Z
2024-12-14T10:21:00.8877547Z [76.49%] ··· ...t.AlignDask.time_not_aligned_random_integers ok
2024-12-14T10:21:00.8877913Z [76.49%] ··· ======= =============
2024-12-14T10:21:00.8878127Z join
2024-12-14T10:21:00.8878338Z ------- -------------
2024-12-14T10:21:00.8878606Z outer 1.87±0.02ms
2024-12-14T10:21:00.8878857Z inner 12.3±0.1ms
2024-12-14T10:21:00.8879101Z left 1.17±0.01ms
2024-12-14T10:21:00.8879350Z right 11.6±0.08ms
2024-12-14T10:21:00.8879557Z ======= =============
2024-12-14T10:21:00.8879692Z
2024-12-14T10:21:00.8879937Z [76.61%] ··· coding.EncodeCFDatetime.time_encode_cf_datetime ok
2024-12-14T10:21:00.8880326Z [76.61%] ··· ========== ===========
2024-12-14T10:21:00.8880553Z calendar
2024-12-14T10:21:00.8880773Z ---------- -----------
2024-12-14T10:21:00.8881263Z standard 785±5μs
2024-12-14T10:21:00.8881547Z noleap 125±0.8ms
2024-12-14T10:21:00.8881756Z ========== ===========
2024-12-14T10:21:00.8881900Z
2024-12-14T10:21:00.8882147Z [76.73%] ··· combine.Combine1d.time_combine_by_coords 1.01±0ms
2024-12-14T10:21:00.8882649Z [76.86%] ··· combine.Combine1dDask.time_combine_by_coords 175±2ms
2024-12-14T10:21:00.8883146Z [76.98%] ··· combine.Combine3d.time_combine_by_coords 65.7±0.7ms
2024-12-14T10:21:00.8883622Z [77.10%] ··· combine.Combine3d.time_combine_nested 64.3±0.6ms
2024-12-14T10:21:00.8884118Z [77.23%] ··· ...issing.DataArrayMissingBottleneck.time_bfill ok
2024-12-14T10:21:00.8884543Z [77.23%] ··· =============== ==================== ======= =============
2024-12-14T10:21:00.8884835Z shape chunks limit
2024-12-14T10:21:00.8885155Z --------------- -------------------- ------- -------------
2024-12-14T10:21:00.8885523Z (365, 75, 75) None None 5.23±0.08ms
2024-12-14T10:21:00.8885862Z (365, 75, 75) None 3 5.12±0.2ms
2024-12-14T10:21:00.8886204Z (365, 75, 75) {'x': 25, 'y': 25} None 15.1±0.3ms
2024-12-14T10:21:00.8886553Z (365, 75, 75) {'x': 25, 'y': 25} 3 14.9±0.5ms
2024-12-14T10:21:00.8886824Z =============== ==================== ======= =============
2024-12-14T10:21:00.8886993Z
2024-12-14T10:21:00.8887244Z [77.35%] ··· ...issing.DataArrayMissingBottleneck.time_ffill ok
2024-12-14T10:21:00.8887674Z [77.35%] ··· =============== ==================== ======= ============
2024-12-14T10:21:00.8887965Z shape chunks limit
2024-12-14T10:21:00.8888255Z --------------- -------------------- ------- ------------
2024-12-14T10:21:00.8888614Z (365, 75, 75) None None 6.25±0.3ms
2024-12-14T10:21:00.8888949Z (365, 75, 75) None 3 6.21±0.3ms
2024-12-14T10:21:00.8889278Z (365, 75, 75) {'x': 25, 'y': 25} None 14.0±0.4ms
2024-12-14T10:21:00.8889611Z (365, 75, 75) {'x': 25, 'y': 25} 3 14.4±0.4ms
2024-12-14T10:21:00.8890018Z =============== ==================== ======= ============
2024-12-14T10:21:00.8890186Z
2024-12-14T10:21:00.8890433Z [77.48%] ··· ...rrayMissingInterpolateNA.time_interpolate_na ok
2024-12-14T10:21:00.8890851Z [77.48%] ··· =============== ==================== ========= =========
2024-12-14T10:21:00.8891124Z -- limit
2024-12-14T10:21:00.8891393Z ------------------------------------ -------------------
2024-12-14T10:21:00.8891677Z shape chunks None 3
2024-12-14T10:21:00.8891949Z =============== ==================== ========= =========
2024-12-14T10:21:00.8892274Z (365, 75, 75) None 101±1ms 121±1ms
2024-12-14T10:21:00.8892602Z (365, 75, 75) {'x': 25, 'y': 25} 156±4ms 310±6ms
2024-12-14T10:21:00.8892858Z =============== ==================== ========= =========
2024-12-14T10:21:00.8893031Z
2024-12-14T10:21:00.8893275Z [77.60%] ··· dataset.DatasetBinaryOp.time_normalize 496±5μs
2024-12-14T10:21:00.8893767Z [77.72%] ··· dataset.DatasetChunk.time_chunk 155±5ms
2024-12-14T10:21:00.8894253Z [77.85%] ··· dataset_io.IOReadCustomEngine.time_open_dataset ok
2024-12-14T10:21:00.8894631Z [77.85%] ··· ============== ==========
2024-12-14T10:21:00.8894851Z chunks
2024-12-14T10:21:00.8895076Z -------------- ----------
2024-12-14T10:21:00.8895341Z None 65.3±2ms
2024-12-14T10:21:00.8895599Z {} 445±2ms
2024-12-14T10:21:00.8895985Z {'time': 10} 500±6ms
2024-12-14T10:21:00.8896207Z ============== ==========
2024-12-14T10:21:00.8896347Z
2024-12-14T10:21:00.8896585Z [77.97%] ··· ...adDataTreeNetCDF4.time_load_datatree_netcdf4 n/a
2024-12-14T10:21:01.1328307Z [77.97%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1329277Z [78.09%] ··· ...adDataTreeNetCDF4.time_open_datatree_netcdf4 n/a
2024-12-14T10:21:01.1330108Z [78.09%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1330740Z [78.22%] ··· ...eadMultipleNetCDF3.time_load_dataset_netcdf4 n/a
2024-12-14T10:21:01.1331364Z [78.22%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1331976Z [78.34%] ··· ...OReadMultipleNetCDF3.time_load_dataset_scipy n/a
2024-12-14T10:21:01.1332583Z [78.34%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1333207Z [78.47%] ··· ...eadMultipleNetCDF3.time_open_dataset_netcdf4 n/a
2024-12-14T10:21:01.1333811Z [78.47%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1334420Z [78.59%] ··· ...OReadMultipleNetCDF3.time_open_dataset_scipy n/a
2024-12-14T10:21:01.1335029Z [78.59%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1335612Z [78.71%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:01.1336192Z [78.71%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1336802Z [78.84%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:01.1337424Z [78.84%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1338007Z [78.96%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:01.1338597Z [78.96%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1339198Z [79.08%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:01.1339794Z [79.08%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1340375Z [79.21%] ··· ...sk.time_load_dataset_scipy_with_block_chunks n/a
2024-12-14T10:21:01.1341244Z [79.21%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1341829Z [79.33%] ··· ...ask.time_load_dataset_scipy_with_time_chunks n/a
2024-12-14T10:21:01.1342424Z [79.33%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1343210Z [79.46%] ··· ....time_open_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:01.1343670Z [79.46%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1344119Z [79.58%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:01.1344572Z [79.58%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1345015Z [79.70%] ··· ...k.time_open_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:01.1345451Z [79.70%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1345889Z [79.83%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:01.1346339Z [79.83%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1346802Z [79.95%] ··· ...sk.time_open_dataset_scipy_with_block_chunks n/a
2024-12-14T10:21:01.1347243Z [79.95%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1347686Z [80.07%] ··· ...ask.time_open_dataset_scipy_with_time_chunks n/a
2024-12-14T10:21:01.1348137Z [80.07%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1348600Z [80.20%] ··· ...eadMultipleNetCDF4.time_load_dataset_netcdf4 n/a
2024-12-14T10:21:01.1349256Z [80.20%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1349735Z [80.32%] ··· ...eadMultipleNetCDF4.time_open_dataset_netcdf4 n/a
2024-12-14T10:21:01.1350185Z [80.32%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1350625Z [80.45%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:01.1351108Z [80.45%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1351576Z [80.57%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:01.1352063Z [80.57%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1352573Z [80.69%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:01.1353012Z [80.69%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1353452Z [80.82%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:01.1353902Z [80.82%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1354343Z [80.94%] ··· ....time_open_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:01.1354776Z [80.94%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1355383Z [81.06%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:01.1355826Z [81.06%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:01.1356290Z [81.19%] ··· ...k.time_open_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:09.3498995Z [81.19%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3499945Z [81.31%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:09.3500836Z [81.31%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3501711Z [81.44%] ··· dataset_io.IOReadSingleFile.time_read_dataset ok
2024-12-14T10:21:09.3502416Z [81.44%] ··· ========= ============= =============
2024-12-14T10:21:09.3503101Z -- chunks
2024-12-14T10:21:09.3503532Z --------- ---------------------------
2024-12-14T10:21:09.3503960Z engine None {}
2024-12-14T10:21:09.3504399Z ========= ============= =============
2024-12-14T10:21:09.3504932Z scipy 4.99±0.07ms 6.29±0.08ms
2024-12-14T10:21:09.3505462Z netcdf4 2.23±0ms 3.14±0.01ms
2024-12-14T10:21:09.3505882Z ========= ============= =============
2024-12-14T10:21:09.3506164Z
2024-12-14T10:21:09.3506602Z [81.56%] ··· ...OReadSingleNetCDF3.time_load_dataset_netcdf4 n/a
2024-12-14T10:21:09.3507460Z [81.56%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3508310Z [81.68%] ··· ....IOReadSingleNetCDF3.time_load_dataset_scipy n/a
2024-12-14T10:21:09.3509454Z [81.68%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3510322Z [81.81%] ··· ...IOReadSingleNetCDF3.time_orthogonal_indexing n/a
2024-12-14T10:21:09.3511164Z [81.81%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3512019Z [81.93%] ··· ...IOReadSingleNetCDF3.time_vectorized_indexing n/a
2024-12-14T10:21:09.3512849Z [81.93%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3513647Z [82.05%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:09.3514441Z [82.05%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3515253Z [82.18%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:09.3516055Z [82.18%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3516857Z [82.30%] ··· ..._dataset_netcdf4_with_block_chunks_oindexing n/a
2024-12-14T10:21:09.3517642Z [82.30%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3518429Z [82.43%] ··· ..._dataset_netcdf4_with_block_chunks_vindexing n/a
2024-12-14T10:21:09.3519213Z [82.43%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3519996Z [82.55%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:09.3520770Z [82.55%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3521559Z [82.67%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:09.3522336Z [82.67%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3523116Z [82.80%] ··· ...sk.time_load_dataset_scipy_with_block_chunks n/a
2024-12-14T10:21:09.3523891Z [82.80%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3524686Z [82.92%] ··· ...ad_dataset_scipy_with_block_chunks_oindexing n/a
2024-12-14T10:21:09.3525470Z [82.92%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3526253Z [83.04%] ··· ...ad_dataset_scipy_with_block_chunks_vindexing n/a
2024-12-14T10:21:09.3527032Z [83.04%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3528052Z [83.17%] ··· ...ask.time_load_dataset_scipy_with_time_chunks n/a
2024-12-14T10:21:09.3528832Z [83.17%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3529649Z [83.29%] ··· ...OReadSingleNetCDF4.time_load_dataset_netcdf4 n/a
2024-12-14T10:21:09.3530503Z [83.29%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3531324Z [83.42%] ··· ...IOReadSingleNetCDF4.time_orthogonal_indexing n/a
2024-12-14T10:21:09.3532141Z [83.42%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3532967Z [83.54%] ··· ...IOReadSingleNetCDF4.time_vectorized_indexing n/a
2024-12-14T10:21:09.3533785Z [83.54%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3534562Z [83.66%] ··· ....time_load_dataset_netcdf4_with_block_chunks n/a
2024-12-14T10:21:09.3535354Z [83.66%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3536142Z [83.79%] ··· ...et_netcdf4_with_block_chunks_multiprocessing n/a
2024-12-14T10:21:09.3536927Z [83.79%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3537714Z [83.91%] ··· ..._dataset_netcdf4_with_block_chunks_oindexing n/a
2024-12-14T10:21:09.3538497Z [83.91%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:09.3539285Z [84.03%] ··· ..._dataset_netcdf4_with_block_chunks_vindexing n/a
2024-12-14T10:21:09.3540233Z [84.03%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0497166Z [84.16%] ··· ...k.time_load_dataset_netcdf4_with_time_chunks n/a
2024-12-14T10:21:14.0497958Z [84.16%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0498503Z [84.28%] ··· ...set_netcdf4_with_time_chunks_multiprocessing n/a
2024-12-14T10:21:14.0499367Z [84.28%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0500226Z [84.41%] ··· ...teMultipleNetCDF3.time_write_dataset_netcdf4 n/a
2024-12-14T10:21:14.0501104Z [84.41%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0501917Z [84.53%] ··· ...riteMultipleNetCDF3.time_write_dataset_scipy n/a
2024-12-14T10:21:14.0502925Z [84.53%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0503734Z [84.65%] ··· dataset_io.IOWriteNetCDFDask.time_write n/a
2024-12-14T10:21:14.0504525Z [84.65%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0505359Z [84.78%] ··· ...t_io.IOWriteNetCDFDaskDistributed.time_write n/a
2024-12-14T10:21:14.0506220Z [84.78%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0507073Z [84.90%] ··· ...riteSingleNetCDF3.time_write_dataset_netcdf4 n/a
2024-12-14T10:21:14.0507906Z [84.90%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0508770Z [85.02%] ··· ...OWriteSingleNetCDF3.time_write_dataset_scipy n/a
2024-12-14T10:21:14.0509626Z [85.02%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:14.0510467Z [85.15%] ··· datatree.Datatree.time_from_dict_few 887±10μs
2024-12-14T10:21:14.0511346Z [85.27%] ··· datatree.Datatree.time_from_dict_many 19.8±0.07ms
2024-12-14T10:21:14.0512207Z [85.40%] ··· groupby.GroupBy.peakmem_binary_op_1d 171M
2024-12-14T10:21:14.0513040Z [85.52%] ··· groupby.GroupBy.peakmem_binary_op_2d 171M
2024-12-14T10:21:14.0513877Z [85.64%] ··· groupby.GroupBy.time_agg_large_num_groups ok
2024-12-14T10:21:14.0514549Z [85.64%] ··· ======== ====== ============= ============
2024-12-14T10:21:14.0514989Z -- use_flox
2024-12-14T10:21:14.0515717Z --------------- --------------------------
2024-12-14T10:21:14.0516162Z method ndim True False
2024-12-14T10:21:14.0516601Z ======== ====== ============= ============
2024-12-14T10:21:14.0517124Z sum 1 2.90±0.02ms 99.4±0.3ms
2024-12-14T10:21:14.0517649Z sum 2 4.11±0.03ms 134±2ms
2024-12-14T10:21:14.0518174Z mean 1 3.01±0.01ms 106±0.6ms
2024-12-14T10:21:14.0518673Z mean 2 4.16±0.05ms 142±2ms
2024-12-14T10:21:14.0519106Z ======== ====== ============= ============
2024-12-14T10:21:14.0519380Z
2024-12-14T10:21:14.0519654Z [85.64%] ···· For parameters: 'sum', 1, False
2024-12-14T10:21:14.0521617Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9730172Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9731458Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9732615Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9733782Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9734935Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9736096Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9737261Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9738690Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9739862Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9741034Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9742190Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9743560Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9744729Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9745902Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9747067Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9748350Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9749536Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:15.9750697Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:15.9752017Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:30.4696696Z
2024-12-14T10:21:30.4697241Z [85.89%] ··· groupby.GroupBy.time_binary_op_1d 1.36±0.02ms
2024-12-14T10:21:30.4698115Z [86.01%] ··· groupby.GroupBy.time_binary_op_2d 2.47±0.04ms
2024-12-14T10:21:30.4699083Z [86.14%] ··· groupby.GroupBy.time_init ok
2024-12-14T10:21:30.4699847Z [86.14%] ··· ====== ==========
2024-12-14T10:21:30.4700236Z ndim
2024-12-14T10:21:30.4700631Z ------ ----------
2024-12-14T10:21:30.4701102Z 1 338±1μs
2024-12-14T10:21:30.4701644Z 2 1.27±0ms
2024-12-14T10:21:30.4702044Z ====== ==========
2024-12-14T10:21:30.4702306Z
2024-12-14T10:21:30.4702964Z [86.26%] ··· groupby.GroupByDask.peakmem_binary_op_1d 194M
2024-12-14T10:21:30.4704009Z [86.39%] ··· groupby.GroupByDask.peakmem_binary_op_2d 237M
2024-12-14T10:21:30.4705230Z [86.51%] ··· groupby.GroupByDask.time_agg_large_num_groups ok
2024-12-14T10:21:30.4706114Z [86.51%] ··· ======== ============ =========== ============ ===========
2024-12-14T10:21:30.4706725Z -- ndim / use_flox
2024-12-14T10:21:30.4707621Z -------- -------------------------------------------------
2024-12-14T10:21:30.4708303Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:21:30.4708920Z ======== ============ =========== ============ ===========
2024-12-14T10:21:30.4709696Z sum 5.55±0.1ms 289±2ms 12.2±0.1ms 392±3ms
2024-12-14T10:21:30.4710482Z mean 5.84±0.2ms 280±3ms 12.8±0.1ms 381±3ms
2024-12-14T10:21:30.4710968Z ======== ============ =========== ============ ===========
2024-12-14T10:21:30.4711260Z
2024-12-14T10:21:30.4711515Z [86.51%] ···· For parameters: 'sum', 1, False
2024-12-14T10:21:30.4713345Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T10:21:30.4715403Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:30.4735113Z
2024-12-14T10:21:30.4735309Z For parameters: 'sum', 2, False
2024-12-14T10:21:30.4736596Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T10:21:30.4737965Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:30.4750747Z
2024-12-14T10:21:30.4751107Z For parameters: 'mean', 1, False
2024-12-14T10:21:30.4752198Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T10:21:30.4753365Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:34.6686210Z
2024-12-14T10:21:34.6686452Z For parameters: 'mean', 2, False
2024-12-14T10:21:34.6688365Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension b because variable b is not a coordinate. To create an index for b, please first call `.set_coords('b')` on this object.
2024-12-14T10:21:34.6690347Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:34.6711815Z
2024-12-14T10:21:34.6712484Z [86.63%] ··· groupby.GroupByDask.time_agg_small_num_groups ok
2024-12-14T10:21:34.6713227Z [86.63%] ··· ======== ====== ============ ============
2024-12-14T10:21:34.6713675Z -- use_flox
2024-12-14T10:21:34.6714127Z --------------- -------------------------
2024-12-14T10:21:34.6714558Z method ndim True False
2024-12-14T10:21:34.6714969Z ======== ====== ============ ============
2024-12-14T10:21:34.6715473Z sum 1 5.79±0.1ms 8.88±0.2ms
2024-12-14T10:21:34.6715969Z sum 2 11.8±0.3ms 19.5±0.4ms
2024-12-14T10:21:34.6716501Z mean 1 5.73±0.1ms 8.79±0.1ms
2024-12-14T10:21:34.6717045Z mean 2 12.1±0.1ms 18.9±0.1ms
2024-12-14T10:21:34.6717463Z ======== ====== ============ ============
2024-12-14T10:21:34.6717748Z
2024-12-14T10:21:34.6718040Z [86.63%] ···· For parameters: 'sum', 1, False
2024-12-14T10:21:34.6719456Z /home/runner/work/xarray/xarray/asv_bench/.asv/env/ddf8d2ba3c8ff735eaf962648983d514/lib/python3.11/site-packages/xarray/core/concat.py:540: UserWarning: No index created for dimension a because variable a is not a coordinate. To create an index for a, please first call `.set_coords('a')` on this object.
2024-12-14T10:21:34.6720840Z ds.expand_dims(dim_name, create_index_for_new_dim=create_index_for_new_dim)
2024-12-14T10:21:44.9335343Z
2024-12-14T10:21:44.9335812Z [86.76%] ··· groupby.GroupByDask.time_binary_op_1d 103±3ms
2024-12-14T10:21:44.9336327Z [86.88%] ··· groupby.GroupByDask.time_binary_op_2d 1.27±0.01s
2024-12-14T10:21:44.9336815Z [87.00%] ··· groupby.GroupByDask.time_init ok
2024-12-14T10:21:44.9337180Z [87.00%] ··· ====== =============
2024-12-14T10:21:44.9337397Z ndim
2024-12-14T10:21:44.9337616Z ------ -------------
2024-12-14T10:21:44.9337869Z 1 321±4μs
2024-12-14T10:21:44.9338107Z 2 1.75±0.02ms
2024-12-14T10:21:44.9338309Z ====== =============
2024-12-14T10:21:44.9338449Z
2024-12-14T10:21:44.9338882Z [87.13%] ··· ...by.GroupByDaskDataFrame.peakmem_binary_op_1d n/a
2024-12-14T10:21:44.9339372Z [87.13%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9339850Z [87.25%] ··· ...by.GroupByDaskDataFrame.peakmem_binary_op_2d n/a
2024-12-14T10:21:44.9340314Z [87.25%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9340791Z [87.38%] ··· ...oupByDaskDataFrame.time_agg_large_num_groups ok
2024-12-14T10:21:44.9341199Z [87.38%] ··· ======== ========== =========== ========== ===========
2024-12-14T10:21:44.9341493Z -- ndim / use_flox
2024-12-14T10:21:44.9341775Z -------- ---------------------------------------------
2024-12-14T10:21:44.9342066Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:21:44.9342339Z ======== ========== =========== ========== ===========
2024-12-14T10:21:44.9342786Z sum n/a n/a n/a n/a
2024-12-14T10:21:44.9343096Z mean n/a n/a n/a n/a
2024-12-14T10:21:44.9343375Z ======== ========== =========== ========== ===========
2024-12-14T10:21:44.9343553Z
2024-12-14T10:21:44.9343712Z [87.38%] ···· For parameters: 'sum', 1, True
2024-12-14T10:21:44.9344045Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9344347Z
2024-12-14T10:21:44.9344558Z For parameters: 'sum', 1, False
2024-12-14T10:21:44.9344870Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9345157Z
2024-12-14T10:21:44.9345347Z For parameters: 'sum', 2, True
2024-12-14T10:21:44.9345649Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9345931Z
2024-12-14T10:21:44.9346112Z For parameters: 'sum', 2, False
2024-12-14T10:21:44.9346417Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:21:44.9346705Z
2024-12-14T10:23:41.3610394Z For parameters: 'mean', 1, True
2024-12-14T10:23:41.3611058Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3611610Z
2024-12-14T10:23:41.3612300Z For parameters: 'mean', 1, False
2024-12-14T10:23:41.3612867Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3613400Z
2024-12-14T10:23:41.3613730Z For parameters: 'mean', 2, True
2024-12-14T10:23:41.3614285Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3614806Z
2024-12-14T10:23:41.3615127Z For parameters: 'mean', 2, False
2024-12-14T10:23:41.3615666Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3616072Z
2024-12-14T10:23:41.3616785Z [87.50%] ··· ...oupByDaskDataFrame.time_agg_small_num_groups ok
2024-12-14T10:23:41.3617552Z [87.50%] ··· ======== ========== =========== ========== ===========
2024-12-14T10:23:41.3618079Z -- ndim / use_flox
2024-12-14T10:23:41.3618568Z -------- ---------------------------------------------
2024-12-14T10:23:41.3619100Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:23:41.3619575Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.3620024Z sum n/a n/a n/a n/a
2024-12-14T10:23:41.3620502Z mean n/a n/a n/a n/a
2024-12-14T10:23:41.3620954Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.3621248Z
2024-12-14T10:23:41.3621514Z [87.50%] ···· For parameters: 'sum', 1, True
2024-12-14T10:23:41.3622066Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3623075Z
2024-12-14T10:23:41.3623417Z For parameters: 'sum', 1, False
2024-12-14T10:23:41.3623965Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3624476Z
2024-12-14T10:23:41.3624790Z For parameters: 'sum', 2, True
2024-12-14T10:23:41.3625328Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3625830Z
2024-12-14T10:23:41.3626139Z For parameters: 'sum', 2, False
2024-12-14T10:23:41.3626667Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3627171Z
2024-12-14T10:23:41.3627477Z For parameters: 'mean', 1, True
2024-12-14T10:23:41.3628003Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3628509Z
2024-12-14T10:23:41.3628817Z For parameters: 'mean', 1, False
2024-12-14T10:23:41.3629358Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3629857Z
2024-12-14T10:23:41.3630169Z For parameters: 'mean', 2, True
2024-12-14T10:23:41.3630696Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3631203Z
2024-12-14T10:23:41.3631510Z For parameters: 'mean', 2, False
2024-12-14T10:23:41.3632036Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3632425Z
2024-12-14T10:23:41.3632906Z [87.62%] ··· groupby.GroupByDaskDataFrame.time_binary_op_1d n/a
2024-12-14T10:23:41.3633764Z [87.62%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3634616Z [87.75%] ··· groupby.GroupByDaskDataFrame.time_binary_op_2d n/a
2024-12-14T10:23:41.3635464Z [87.75%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3636363Z [87.87%] ··· groupby.GroupByDaskDataFrame.time_init ok
2024-12-14T10:23:41.3637030Z [87.87%] ··· ====== =====
2024-12-14T10:23:41.3637379Z ndim
2024-12-14T10:23:41.3637722Z ------ -----
2024-12-14T10:23:41.3638061Z 1 n/a
2024-12-14T10:23:41.3638385Z 2 n/a
2024-12-14T10:23:41.3638940Z ====== =====
2024-12-14T10:23:41.3639074Z
2024-12-14T10:23:41.3639245Z [87.87%] ···· For parameters: 1
2024-12-14T10:23:41.3639708Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3640118Z
2024-12-14T10:23:41.3640324Z For parameters: 2
2024-12-14T10:23:41.3640669Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.3640948Z
2024-12-14T10:23:41.3641248Z [88.00%] ··· groupby.GroupByLongTime.time_mean ok
2024-12-14T10:23:41.3641752Z [88.00%] ··· ============ ============ ============
2024-12-14T10:23:41.3642137Z -- use_flox
2024-12-14T10:23:41.3642395Z ------------ -------------------------
2024-12-14T10:23:41.3642770Z use_cftime True False
2024-12-14T10:23:41.3643016Z ============ ============ ============
2024-12-14T10:23:41.3643435Z True 8.44±0.2ms 13.1±0.1ms
2024-12-14T10:23:41.3643782Z False 7.11±0.2ms 11.6±0.1ms
2024-12-14T10:23:41.3644091Z ============ ============ ============
2024-12-14T10:23:41.3644256Z
2024-12-14T10:23:41.3644622Z [88.12%] ··· groupby.GroupByLongTime.time_setup ok
2024-12-14T10:23:41.3645043Z [88.12%] ··· ============ ============= =============
2024-12-14T10:23:41.3645407Z -- use_flox
2024-12-14T10:23:41.3645670Z ------------ ---------------------------
2024-12-14T10:23:41.3645925Z use_cftime True False
2024-12-14T10:23:41.3646328Z ============ ============= =============
2024-12-14T10:23:41.3646643Z True 3.52±0.02ms 3.53±0.02ms
2024-12-14T10:23:41.3647083Z False 2.43±0.02ms 2.44±0.01ms
2024-12-14T10:23:41.3647327Z ============ ============= =============
2024-12-14T10:23:41.3647496Z
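The `use_flox` columns in the groupby tables above compare xarray's flox-backed groupby path against the default implementation. That switch is controlled with `xr.set_options`; a minimal sketch of how the two paths are toggled (the array, dimension, and group names are illustrative, not from the benchmark code):

```python
import numpy as np
import xarray as xr

# Illustrative data: 1000 points split into 10 groups.
da = xr.DataArray(
    np.random.default_rng(0).random(1000),
    dims="x",
    coords={"g": ("x", np.repeat(np.arange(10), 100))},
)

# Run the same reduction with and without the flox-backed path
# that the use_flox=True / use_flox=False columns benchmark.
with xr.set_options(use_flox=False):
    plain = da.groupby("g").mean()
with xr.set_options(use_flox=True):
    floxed = da.groupby("g").mean()

# Both paths agree numerically (up to floating-point tolerance).
xr.testing.assert_allclose(plain, floxed)
```

If flox is not installed, `use_flox=True` simply falls back to the default implementation, so the option is safe to set unconditionally.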
2024-12-14T10:23:41.5419105Z [88.24%] ··· ....GroupByPandasDataFrame.peakmem_binary_op_1d n/a
2024-12-14T10:23:41.5419732Z [88.24%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5420461Z [88.37%] ··· ....GroupByPandasDataFrame.peakmem_binary_op_2d n/a
2024-12-14T10:23:41.5421386Z [88.37%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5422322Z [88.49%] ··· ...pByPandasDataFrame.time_agg_large_num_groups ok
2024-12-14T10:23:41.5423393Z [88.49%] ··· ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5424056Z -- ndim / use_flox
2024-12-14T10:23:41.5424681Z -------- ---------------------------------------------
2024-12-14T10:23:41.5425302Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:23:41.5425886Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5426456Z sum n/a n/a n/a n/a
2024-12-14T10:23:41.5426981Z mean n/a n/a n/a n/a
2024-12-14T10:23:41.5427507Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5427830Z
2024-12-14T10:23:41.5428131Z [88.49%] ···· For parameters: 'sum', 1, True
2024-12-14T10:23:41.5428729Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5429329Z
2024-12-14T10:23:41.5429702Z For parameters: 'sum', 1, False
2024-12-14T10:23:41.5430316Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5430948Z
2024-12-14T10:23:41.5431314Z For parameters: 'sum', 2, True
2024-12-14T10:23:41.5431857Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5432368Z
2024-12-14T10:23:41.5432959Z For parameters: 'sum', 2, False
2024-12-14T10:23:41.5433506Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5434016Z
2024-12-14T10:23:41.5434338Z For parameters: 'mean', 1, True
2024-12-14T10:23:41.5434877Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5435383Z
2024-12-14T10:23:41.5435707Z For parameters: 'mean', 1, False
2024-12-14T10:23:41.5436253Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5436763Z
2024-12-14T10:23:41.5437090Z For parameters: 'mean', 2, True
2024-12-14T10:23:41.5437632Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5438138Z
2024-12-14T10:23:41.5438459Z For parameters: 'mean', 2, False
2024-12-14T10:23:41.5439001Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5439408Z
2024-12-14T10:23:41.5439859Z [88.61%] ··· ...pByPandasDataFrame.time_agg_small_num_groups ok
2024-12-14T10:23:41.5440603Z [88.61%] ··· ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5441101Z -- ndim / use_flox
2024-12-14T10:23:41.5441618Z -------- ---------------------------------------------
2024-12-14T10:23:41.5442126Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:23:41.5442635Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5443319Z sum n/a n/a n/a n/a
2024-12-14T10:23:41.5443815Z mean n/a n/a n/a n/a
2024-12-14T10:23:41.5444270Z ======== ========== =========== ========== ===========
2024-12-14T10:23:41.5444561Z
2024-12-14T10:23:41.5444824Z [88.61%] ···· For parameters: 'sum', 1, True
2024-12-14T10:23:41.5445388Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5445913Z
2024-12-14T10:23:41.5446246Z For parameters: 'sum', 1, False
2024-12-14T10:23:41.5446792Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5447307Z
2024-12-14T10:23:41.5447623Z For parameters: 'sum', 2, True
2024-12-14T10:23:41.5448083Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5448531Z
2024-12-14T10:23:41.5448734Z For parameters: 'sum', 2, False
2024-12-14T10:23:41.5449180Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5449486Z
2024-12-14T10:23:41.5449772Z For parameters: 'mean', 1, True
2024-12-14T10:23:41.5450089Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5450501Z
2024-12-14T10:23:41.5450689Z For parameters: 'mean', 1, False
2024-12-14T10:23:41.5451097Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5451386Z
2024-12-14T10:23:41.5451666Z For parameters: 'mean', 2, True
2024-12-14T10:23:41.5451971Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5452361Z
2024-12-14T10:23:41.5452545Z For parameters: 'mean', 2, False
2024-12-14T10:23:41.5452946Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5453166Z
2024-12-14T10:23:41.5453566Z [88.74%] ··· ...pby.GroupByPandasDataFrame.time_binary_op_1d n/a
2024-12-14T10:23:41.5454150Z [88.74%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5454662Z [88.86%] ··· ...pby.GroupByPandasDataFrame.time_binary_op_2d n/a
2024-12-14T10:23:41.5455310Z [88.86%] ···· asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:23:41.5455908Z [88.99%] ··· groupby.GroupByPandasDataFrame.time_init ok
2024-12-14T10:23:41.5456302Z [88.99%] ··· ====== =====
2024-12-14T10:24:31.9203627Z ndim
2024-12-14T10:24:31.9204081Z ------ -----
2024-12-14T10:24:31.9204381Z 1 n/a
2024-12-14T10:24:31.9204580Z 2 n/a
2024-12-14T10:24:31.9204916Z ====== =====
2024-12-14T10:24:31.9205137Z
2024-12-14T10:24:31.9205596Z [88.99%] ···· For parameters: 1
2024-12-14T10:24:31.9206165Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:24:31.9206725Z
2024-12-14T10:24:31.9207083Z For parameters: 2
2024-12-14T10:24:31.9207643Z asv: skipped: NotImplementedError('Skipping this test...')
2024-12-14T10:24:31.9208056Z
2024-12-14T10:24:31.9208520Z [89.11%] ··· groupby.Resample.time_agg_large_num_groups ok
2024-12-14T10:24:31.9209261Z [89.11%] ··· ======== ====== ============= ============
2024-12-14T10:24:31.9209720Z -- use_flox
2024-12-14T10:24:31.9210165Z --------------- --------------------------
2024-12-14T10:24:31.9210615Z method ndim True False
2024-12-14T10:24:31.9211044Z ======== ====== ============= ============
2024-12-14T10:24:31.9211569Z sum 1 3.00±0.01ms 54.0±0.4ms
2024-12-14T10:24:31.9212106Z sum 2 3.37±0.02ms 54.5±0.2ms
2024-12-14T10:24:31.9212919Z mean 1 3.07±0.01ms 47.2±0.2ms
2024-12-14T10:24:31.9213458Z mean 2 3.52±0.02ms 47.1±0.7ms
2024-12-14T10:24:31.9213880Z ======== ====== ============= ============
2024-12-14T10:24:31.9214166Z
2024-12-14T10:24:31.9214594Z [89.23%] ··· groupby.Resample.time_agg_small_num_groups ok
2024-12-14T10:24:31.9215315Z [89.23%] ··· ======== ====== ============= =============
2024-12-14T10:24:31.9215759Z -- use_flox
2024-12-14T10:24:31.9216202Z --------------- ---------------------------
2024-12-14T10:24:31.9216651Z method ndim True False
2024-12-14T10:24:31.9217079Z ======== ====== ============= =============
2024-12-14T10:24:31.9217601Z sum 1 3.10±0.01ms 3.68±0.03ms
2024-12-14T10:24:31.9218131Z sum 2 3.49±0.01ms 3.94±0.01ms
2024-12-14T10:24:31.9218658Z mean 1 3.17±0.01ms 3.44±0.01ms
2024-12-14T10:24:31.9219210Z mean 2 3.59±0.02ms 3.52±0.01ms
2024-12-14T10:24:31.9219633Z ======== ====== ============= =============
2024-12-14T10:24:31.9219912Z
2024-12-14T10:24:31.9220322Z [89.36%] ··· groupby.Resample.time_init ok
2024-12-14T10:24:31.9220989Z [89.36%] ··· ====== ==========
2024-12-14T10:24:31.9221346Z ndim
2024-12-14T10:24:31.9221693Z ------ ----------
2024-12-14T10:24:31.9222112Z 1 1.20±0ms
2024-12-14T10:24:31.9222522Z 2 1.22±0ms
2024-12-14T10:24:31.9223028Z ====== ==========
2024-12-14T10:24:31.9223258Z
2024-12-14T10:24:31.9223690Z [89.48%] ··· ...pby.ResampleCFTime.time_agg_large_num_groups ok
2024-12-14T10:24:31.9224386Z [89.48%] ··· ======== ====== ============= ============
2024-12-14T10:24:31.9224822Z -- use_flox
2024-12-14T10:24:31.9225279Z --------------- --------------------------
2024-12-14T10:24:31.9225726Z method ndim True False
2024-12-14T10:24:31.9226149Z ======== ====== ============= ============
2024-12-14T10:24:31.9226667Z sum 1 3.68±0.04ms 50.8±0.2ms
2024-12-14T10:24:31.9227189Z sum 2 3.97±0.03ms 51.3±0.2ms
2024-12-14T10:24:31.9227980Z mean 1 3.64±0.01ms 44.6±0.6ms
2024-12-14T10:24:31.9228510Z mean 2 4.11±0.01ms 44.2±0.7ms
2024-12-14T10:24:31.9228927Z ======== ====== ============= ============
2024-12-14T10:24:31.9229204Z
2024-12-14T10:24:31.9229627Z [89.60%] ··· ...pby.ResampleCFTime.time_agg_small_num_groups ok
2024-12-14T10:24:31.9230314Z [89.60%] ··· ======== ====== ============= =============
2024-12-14T10:24:31.9230757Z -- use_flox
2024-12-14T10:24:31.9231204Z --------------- ---------------------------
2024-12-14T10:24:31.9231661Z method ndim True False
2024-12-14T10:24:31.9232096Z ======== ====== ============= =============
2024-12-14T10:24:31.9232625Z sum 1 2.78±0.04ms 3.21±0.02ms
2024-12-14T10:24:31.9233153Z sum 2 3.13±0.04ms 3.49±0.07ms
2024-12-14T10:24:31.9236327Z mean 1 2.80±0.02ms 2.98±0.01ms
2024-12-14T10:24:31.9236908Z mean 2 3.23±0.03ms 3.07±0.04ms
2024-12-14T10:24:31.9237339Z ======== ====== ============= =============
2024-12-14T10:24:31.9237628Z
2024-12-14T10:24:31.9238076Z [89.73%] ··· groupby.ResampleCFTime.time_init ok
2024-12-14T10:24:31.9238751Z [89.73%] ··· ====== =============
2024-12-14T10:24:31.9239118Z ndim
2024-12-14T10:24:31.9239487Z ------ -------------
2024-12-14T10:24:31.9239929Z 1 2.63±0.02ms
2024-12-14T10:24:31.9240582Z 2 2.64±0.02ms
2024-12-14T10:24:31.9240948Z ====== =============
2024-12-14T10:24:31.9241183Z
2024-12-14T10:24:31.9241638Z [89.85%] ··· groupby.ResampleDask.time_agg_large_num_groups ok
2024-12-14T10:24:31.9242397Z [89.85%] ··· ======== ========== =========== ========== ============
2024-12-14T10:24:31.9242751Z -- ndim / use_flox
2024-12-14T10:24:31.9243046Z -------- ----------------------------------------------
2024-12-14T10:24:31.9243345Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:24:31.9243621Z ======== ========== =========== ========== ============
2024-12-14T10:24:31.9243986Z sum 306±1ms 875±8ms 806±3ms 1.25±0.02s
2024-12-14T10:25:21.3551464Z mean 366±3ms 494±2ms 976±4ms 824±7ms
2024-12-14T10:25:21.3551896Z ======== ========== =========== ========== ============
2024-12-14T10:25:21.3552156Z
2024-12-14T10:25:21.3552558Z [89.98%] ··· groupby.ResampleDask.time_agg_small_num_groups ok
2024-12-14T10:25:21.3553331Z [89.98%] ··· ======== ========== =========== ========== ===========
2024-12-14T10:25:21.3553824Z -- ndim / use_flox
2024-12-14T10:25:21.3554383Z -------- ---------------------------------------------
2024-12-14T10:25:21.3554914Z method 1 / True 1 / False 2 / True 2 / False
2024-12-14T10:25:21.3555365Z ======== ========== =========== ========== ===========
2024-12-14T10:25:21.3556009Z sum 126±1ms 106±0.8ms 358±6ms 268±3ms
2024-12-14T10:25:21.3556685Z mean 143±2ms 68.2±1ms 405±1ms 174±3ms
2024-12-14T10:25:21.3557252Z ======== ========== =========== ========== ===========
2024-12-14T10:25:21.3557615Z
2024-12-14T10:25:21.3558142Z [90.10%] ··· groupby.ResampleDask.time_init ok
2024-12-14T10:25:21.3558951Z [90.10%] ··· ====== =============
2024-12-14T10:25:21.3559412Z ndim
2024-12-14T10:25:21.3559915Z ------ -------------
2024-12-14T10:25:21.3560368Z 1 1.21±0ms
2024-12-14T10:25:21.3560876Z 2 1.24±0.02ms
2024-12-14T10:25:21.3561635Z ====== =============
2024-12-14T10:25:21.3561936Z
2024-12-14T10:25:21.3562443Z [90.22%] ··· import.Import.timeraw_import_xarray 323±10ms
2024-12-14T10:25:21.3563474Z [90.35%] ··· import.Import.timeraw_import_xarray_backends 333±3ms
2024-12-14T10:25:21.3564521Z [90.47%] ··· import.Import.timeraw_import_xarray_only 53.3±0.4ms
2024-12-14T10:25:21.3565578Z [90.59%] ··· import.Import.timeraw_import_xarray_plot 337±5ms
2024-12-14T10:25:21.3566649Z [90.72%] ··· indexing.Assignment.time_assignment_basic ok
2024-12-14T10:25:21.3567465Z [90.72%] ··· ================== =========
2024-12-14T10:25:21.3567923Z key
2024-12-14T10:25:21.3568349Z ------------------ ---------
2024-12-14T10:25:21.3568826Z 1scalar 136±1μs
2024-12-14T10:25:21.3569290Z 1slice 124±4μs
2024-12-14T10:25:21.3569786Z 1slice-1scalar 129±1μs
2024-12-14T10:25:21.3570267Z 2slicess-1scalar 133±2μs
2024-12-14T10:25:21.3570637Z ================== =========
2024-12-14T10:25:21.3570891Z
2024-12-14T10:25:21.3571313Z [90.84%] ··· indexing.Assignment.time_assignment_outer ok
2024-12-14T10:25:21.3571953Z [90.84%] ··· ============ =============
2024-12-14T10:25:21.3572326Z key
2024-12-14T10:25:21.3572699Z ------------ -------------
2024-12-14T10:25:21.3573161Z 1d 488±8μs
2024-12-14T10:25:21.3573601Z 2d 2.50±0.03ms
2024-12-14T10:25:21.3574291Z 2d-1scalar 203±3μs
2024-12-14T10:25:21.3574662Z ============ =============
2024-12-14T10:25:21.3574905Z
2024-12-14T10:25:21.3575333Z [90.97%] ··· indexing.Assignment.time_assignment_vectorized ok
2024-12-14T10:25:21.3575974Z [90.97%] ··· ====== ==========
2024-12-14T10:25:21.3576343Z key
2024-12-14T10:25:21.3576697Z ------ ----------
2024-12-14T10:25:21.3577122Z 1-1d 613±20μs
2024-12-14T10:25:21.3577529Z 2-1d 379±4μs
2024-12-14T10:25:21.3577930Z 3-2d 488±3μs
2024-12-14T10:25:21.3578265Z ====== ==========
2024-12-14T10:25:21.3578481Z
2024-12-14T10:25:21.3578909Z [91.09%] ··· ...nmentOptimized.time_assign_identical_indexes 283±1μs
2024-12-14T10:25:21.3579765Z [91.21%] ··· ...g.AssignmentOptimized.time_assign_no_reindex 246±1μs
2024-12-14T10:25:21.3580646Z [91.34%] ··· indexing.BooleanIndexing.time_indexing 109±2ms
2024-12-14T10:25:21.3581522Z [91.46%] ··· ...ing.HugeAxisSmallSliceIndexing.time_indexing 58.2±0.4μs
2024-12-14T10:25:21.3582341Z [91.58%] ··· indexing.Indexing.time_indexing_basic ok
2024-12-14T10:25:21.3583183Z [91.58%] ··· ================== ===========
2024-12-14T10:25:21.3583594Z key
2024-12-14T10:25:21.3583986Z ------------------ -----------
2024-12-14T10:25:21.3584463Z 1scalar 119±0.8μs
2024-12-14T10:25:21.3584931Z 1slice 113±0.9μs
2024-12-14T10:25:21.3585412Z 1slice-1scalar 143±2μs
2024-12-14T10:25:21.3585892Z 2slicess-1scalar 195±1μs
2024-12-14T10:25:21.3586292Z ================== ===========
2024-12-14T10:25:21.3586552Z
2024-12-14T10:25:21.3586970Z [91.71%] ··· indexing.Indexing.time_indexing_basic_ds_large ok
2024-12-14T10:25:21.3587710Z [91.71%] ··· ================== =============
2024-12-14T10:25:21.3588143Z key
2024-12-14T10:25:21.3588557Z ------------------ -------------
2024-12-14T10:25:21.3588922Z 1scalar 2.03±0.02ms
2024-12-14T10:25:21.3589354Z 1slice 2.02±0.01ms
2024-12-14T10:25:21.3590008Z 1slice-1scalar 2.10±0.02ms
2024-12-14T10:25:21.3590323Z 2slicess-1scalar 2.20±0.02ms
2024-12-14T10:25:21.3590669Z ================== =============
2024-12-14T10:25:21.3590826Z
2024-12-14T10:25:21.3591124Z [91.83%] ··· indexing.Indexing.time_indexing_outer ok
2024-12-14T10:25:21.3591567Z [91.83%] ··· ============ =============
2024-12-14T10:25:21.3591905Z key
2024-12-14T10:25:21.3592136Z ------------ -------------
2024-12-14T10:25:21.3592516Z 1d 411±5μs
2024-12-14T10:25:21.3592784Z 2d 1.06±0.02ms
2024-12-14T10:25:21.3593173Z 2d-1scalar 433±6μs
2024-12-14T10:25:57.7874965Z ============ =============
2024-12-14T10:25:57.7875321Z
2024-12-14T10:25:57.7876007Z [91.96%] ··· indexing.Indexing.time_indexing_vectorized ok
2024-12-14T10:25:57.7876795Z [91.96%] ··· ====== ==========
2024-12-14T10:25:57.7877226Z key
2024-12-14T10:25:57.7877611Z ------ ----------
2024-12-14T10:25:57.7878095Z 1-1d 537±10μs
2024-12-14T10:25:57.7878569Z 2-1d 490±9μs
2024-12-14T10:25:57.7878852Z 3-2d 725±8μs
2024-12-14T10:25:57.7879089Z ====== ==========
2024-12-14T10:25:57.7879243Z
2024-12-14T10:25:57.7879524Z [92.08%] ··· indexing.IndexingDask.time_indexing_basic ok
2024-12-14T10:25:57.7879965Z [92.08%] ··· ================== =============
2024-12-14T10:25:57.7880234Z key
2024-12-14T10:25:57.7880773Z ------------------ -------------
2024-12-14T10:25:57.7881099Z 1scalar 7.26±0.1ms
2024-12-14T10:25:57.7881415Z 1slice 7.25±0.06ms
2024-12-14T10:25:57.7881729Z 1slice-1scalar 7.62±0.1ms
2024-12-14T10:25:57.7882047Z 2slicess-1scalar 48.4±0.4ms
2024-12-14T10:25:57.7882321Z ================== =============
2024-12-14T10:25:57.7882497Z
2024-12-14T10:25:57.7882750Z [92.20%] ··· ...ng.IndexingDask.time_indexing_basic_ds_large ok
2024-12-14T10:25:57.7883167Z [92.20%] ··· ================== =============
2024-12-14T10:25:57.7883426Z key
2024-12-14T10:25:57.7883690Z ------------------ -------------
2024-12-14T10:25:57.7884002Z 1scalar 2.03±0.01ms
2024-12-14T10:25:57.7884334Z 1slice 2.03±0.01ms
2024-12-14T10:25:57.7884643Z 1slice-1scalar 2.08±0.01ms
2024-12-14T10:25:57.7884971Z 2slicess-1scalar 2.18±0.02ms
2024-12-14T10:25:57.7885235Z ================== =============
2024-12-14T10:25:57.7885407Z
2024-12-14T10:25:57.7885670Z [92.33%] ··· indexing.IndexingDask.time_indexing_outer ok
2024-12-14T10:25:57.7886095Z [92.33%] ··· ============ ===========
2024-12-14T10:25:57.7886352Z key
2024-12-14T10:25:57.7886596Z ------------ -----------
2024-12-14T10:25:57.7886884Z 1d 213±0.9ms
2024-12-14T10:25:57.7887170Z 2d 445±3ms
2024-12-14T10:25:57.7887454Z 2d-1scalar 208±1ms
2024-12-14T10:25:57.7887696Z ============ ===========
2024-12-14T10:25:57.7887854Z
2024-12-14T10:25:57.7888124Z [92.45%] ··· indexing.IndexingDask.time_indexing_vectorized ok
2024-12-14T10:25:57.7888550Z [92.45%] ··· ====== ============
2024-12-14T10:25:57.7888783Z key
2024-12-14T10:25:57.7889023Z ------ ------------
2024-12-14T10:25:57.7889290Z 1-1d 216±2ms
2024-12-14T10:25:57.7889546Z 2-1d 88.0±0.4ms
2024-12-14T10:25:57.7889787Z 3-2d 36.8±0.7ms
2024-12-14T10:25:57.7889984Z ====== ============
2024-12-14T10:25:57.7890125Z
2024-12-14T10:25:57.7890579Z [92.57%] ··· interp.Interpolation.time_interpolation ok
2024-12-14T10:25:57.7891003Z [92.57%] ··· ======== ============ ============
2024-12-14T10:25:57.7891265Z -- is_short
2024-12-14T10:25:57.7891518Z -------- -------------------------
2024-12-14T10:25:57.7891760Z method True False
2024-12-14T10:25:57.7891990Z ======== ============ ============
2024-12-14T10:25:57.7892273Z linear 10.3±0.5ms 12.7±0.6ms
2024-12-14T10:25:57.7892557Z cubic 65.5±0.8ms 67.1±2ms
2024-12-14T10:25:57.7892786Z ======== ============ ============
2024-12-14T10:25:57.7892957Z
2024-12-14T10:25:57.7893199Z [92.70%] ··· interp.Interpolation.time_interpolation_2d ok
2024-12-14T10:25:57.7893581Z [92.70%] ··· ========= ============
2024-12-14T10:25:57.7893799Z method
2024-12-14T10:25:57.7894017Z --------- ------------
2024-12-14T10:25:57.7894283Z linear 17.5±0.2ms
2024-12-14T10:25:57.7894536Z nearest 14.9±0.5ms
2024-12-14T10:25:57.7894747Z ========= ============
2024-12-14T10:25:57.7894883Z
2024-12-14T10:25:57.7895135Z [92.82%] ··· interp.InterpolationDask.time_interpolation ok
2024-12-14T10:25:57.7895523Z [92.82%] ··· ======== ============ ============
2024-12-14T10:25:57.7895768Z -- is_short
2024-12-14T10:25:57.7896008Z -------- -------------------------
2024-12-14T10:25:57.7896242Z method True False
2024-12-14T10:25:57.7896616Z ======== ============ ============
2024-12-14T10:25:57.7896909Z linear 27.9±0.6ms 35.2±2ms
2024-12-14T10:25:57.7897189Z cubic 74.5±0.5ms 84.0±0.5ms
2024-12-14T10:25:57.7897420Z ======== ============ ============
2024-12-14T10:25:57.7897574Z
2024-12-14T10:25:57.7897821Z [92.95%] ··· interp.InterpolationDask.time_interpolation_2d ok
2024-12-14T10:25:57.7898205Z [92.95%] ··· ========= ============
2024-12-14T10:25:57.7898430Z method
2024-12-14T10:25:57.7898644Z --------- ------------
2024-12-14T10:25:57.7898895Z linear 37.9±2ms
2024-12-14T10:25:57.7899147Z nearest 32.9±0.8ms
2024-12-14T10:25:57.7899359Z ========= ============
2024-12-14T10:25:57.7899495Z
2024-12-14T10:25:57.7899721Z [93.07%] ··· ...e.DatasetAddVariable.time_merge_two_datasets ok
2024-12-14T10:25:57.7900109Z [93.07%] ··· =================== =============
2024-12-14T10:25:57.7900370Z existing_elements
2024-12-14T10:25:57.7900609Z ------------------- -------------
2024-12-14T10:25:57.7900893Z 0 76.8±0.5μs
2024-12-14T10:25:57.7901173Z 10 182±1μs
2024-12-14T10:25:57.7901443Z 100 986±10μs
2024-12-14T10:25:57.7901728Z 1000 8.84±0.05ms
2024-12-14T10:25:57.7901958Z =================== =============
2024-12-14T10:25:57.7902109Z
2024-12-14T10:25:57.7902340Z [93.19%] ··· ...e.DatasetAddVariable.time_variable_insertion ok
2024-12-14T10:25:57.7903066Z [93.19%] ··· =================== =============
2024-12-14T10:26:22.8155985Z existing_elements
2024-12-14T10:26:22.8156564Z ------------------- -------------
2024-12-14T10:26:22.8157363Z 0 84.5±0.3μs
2024-12-14T10:26:22.8157898Z 10 137±2μs
2024-12-14T10:26:22.8158425Z 100 531±1μs
2024-12-14T10:26:22.8158925Z 1000 4.46±0.04ms
2024-12-14T10:26:22.8159355Z =================== =============
2024-12-14T10:26:22.8159633Z
2024-12-14T10:26:22.8160119Z [93.32%] ··· merge.DatasetCreation.time_dataset_creation ok
2024-12-14T10:26:22.8161161Z [93.32%] ··· ==================== ======= =============
2024-12-14T10:26:22.8161621Z strategy count
2024-12-14T10:26:22.8162036Z -------------------- ------- -------------
2024-12-14T10:26:22.8162551Z dict_of_DataArrays 0 145±0.2μs
2024-12-14T10:26:22.8163079Z dict_of_DataArrays 1 235±1μs
2024-12-14T10:26:22.8163626Z dict_of_DataArrays 10 736±2μs
2024-12-14T10:26:22.8164228Z dict_of_DataArrays 100 5.27±0.04ms
2024-12-14T10:26:22.8164789Z dict_of_DataArrays 1000 49.7±0.2ms
2024-12-14T10:26:22.8165359Z dict_of_Variables 0 145±1μs
2024-12-14T10:26:22.8165890Z dict_of_Variables 1 164±0.4μs
2024-12-14T10:26:22.8166425Z dict_of_Variables 10 256±1μs
2024-12-14T10:26:22.8166966Z dict_of_Variables 100 1.09±0.01ms
2024-12-14T10:26:22.8167518Z dict_of_Variables 1000 9.28±0.02ms
2024-12-14T10:26:22.8168055Z dict_of_Tuples 0 145±1μs
2024-12-14T10:26:22.8168566Z dict_of_Tuples 1 159±0.4μs
2024-12-14T10:26:22.8169062Z dict_of_Tuples 10 232±1μs
2024-12-14T10:26:22.8169579Z dict_of_Tuples 100 869±3μs
2024-12-14T10:26:22.8170072Z dict_of_Tuples 1000 7.28±0.1ms
2024-12-14T10:26:22.8170452Z ==================== ======= =============
2024-12-14T10:26:22.8170723Z
2024-12-14T10:26:22.8171375Z [93.44%] ··· pandas.MultiIndexSeries.time_from_series ok
2024-12-14T10:26:22.8172081Z [93.44%] ··· ======= ============= =============
2024-12-14T10:26:22.8172505Z -- subset
2024-12-14T10:26:22.8172925Z ------- ---------------------------
2024-12-14T10:26:22.8173339Z dtype True False
2024-12-14T10:26:22.8173736Z ======= ============= =============
2024-12-14T10:26:22.8174224Z int 1.95±0.04ms 3.75±0.02ms
2024-12-14T10:26:22.8174719Z float 1.98±0.06ms 3.75±0.01ms
2024-12-14T10:26:22.8175106Z ======= ============= =============
2024-12-14T10:26:22.8175385Z
2024-12-14T10:26:22.8175815Z [93.56%] ··· pandas.ToDataFrame.peakmem_to_dataframe 2.97G
2024-12-14T10:26:22.8176686Z [93.69%] ··· pandas.ToDataFrame.time_to_dataframe 746±3ms
2024-12-14T10:26:22.8177575Z [93.81%] ··· pandas.ToDataFrameDask.peakmem_to_dataframe 229M
2024-12-14T10:26:22.8178465Z [93.94%] ··· pandas.ToDataFrameDask.time_to_dataframe 447±5ms
2024-12-14T10:26:22.8179335Z [94.06%] ··· polyfit.Polyval.peakmem_polyval ok
2024-12-14T10:26:22.8180002Z [94.06%] ··· ========= ====== ====== ======
2024-12-14T10:26:22.8180423Z -- ndeg
2024-12-14T10:26:22.8180823Z --------- --------------------
2024-12-14T10:26:22.8181217Z nx 2 5 20
2024-12-14T10:26:22.8181599Z ========= ====== ====== ======
2024-12-14T10:26:22.8181982Z 100 170M 170M 170M
2024-12-14T10:26:22.8182363Z 1000000 184M 184M 184M
2024-12-14T10:26:22.8182939Z ========= ====== ====== ======
2024-12-14T10:26:22.8183215Z
2024-12-14T10:26:22.8183625Z [94.18%] ··· polyfit.Polyval.time_polyval ok
2024-12-14T10:26:22.8184338Z [94.18%] ··· ========= ============ ============= =============
2024-12-14T10:26:22.8184795Z -- ndeg
2024-12-14T10:26:22.8185245Z --------- ----------------------------------------
2024-12-14T10:26:22.8185694Z nx 2 5 20
2024-12-14T10:26:22.8186333Z ========= ============ ============= =============
2024-12-14T10:26:22.8186894Z 100 805±20μs 1.25±0.01ms 3.40±0.01ms
2024-12-14T10:26:22.8187420Z 1000000 2.04±0.2ms 3.89±0.3ms 13.4±0.4ms
2024-12-14T10:26:22.8187866Z ========= ============ ============= =============
2024-12-14T10:26:22.8188170Z
2024-12-14T10:26:22.8188629Z [94.31%] ··· polyfit.PolyvalDask.peakmem_polyval ok
2024-12-14T10:26:22.8189379Z [94.31%] ··· ========= ====== ====== ======
2024-12-14T10:26:22.8189815Z -- ndeg
2024-12-14T10:26:22.8190109Z --------- --------------------
2024-12-14T10:26:22.8190433Z nx 2 5 20
2024-12-14T10:26:22.8190715Z ========= ====== ====== ======
2024-12-14T10:26:22.8190956Z 100 195M 195M 195M
2024-12-14T10:26:22.8191281Z 1000000 214M 215M 220M
2024-12-14T10:26:22.8191511Z ========= ====== ====== ======
2024-12-14T10:26:22.8191760Z
2024-12-14T10:26:22.8192032Z [94.43%] ··· polyfit.PolyvalDask.time_polyval ok
2024-12-14T10:26:22.8192557Z [94.43%] ··· ========= ============ ============ ============
2024-12-14T10:26:22.8192858Z -- ndeg
2024-12-14T10:26:22.8193205Z --------- --------------------------------------
2024-12-14T10:26:33.9352597Z nx 2 5 20
2024-12-14T10:26:33.9353145Z ========= ============ ============ ============
2024-12-14T10:26:33.9354311Z 100 5.12±0.3ms 9.34±0.2ms 32.6±0.2ms
2024-12-14T10:26:33.9354968Z 1000000 49.9±0.5ms 88.0±0.6ms 339±2ms
2024-12-14T10:26:33.9355411Z ========= ============ ============ ============
2024-12-14T10:26:33.9355729Z
2024-12-14T10:26:33.9356219Z [94.55%] ··· reindexing.Reindex.time_1d_coarse 495±5μs
2024-12-14T10:26:33.9357105Z [94.68%] ··· reindexing.Reindex.time_1d_fine_all_found 1.62±0.08ms
2024-12-14T10:26:33.9357919Z [94.80%] ··· reindexing.Reindex.time_1d_fine_some_missing 9.14±1ms
2024-12-14T10:26:33.9358746Z [94.93%] ··· reindexing.Reindex.time_2d_coarse 1.90±0.01ms
2024-12-14T10:26:33.9359552Z [95.05%] ··· reindexing.Reindex.time_2d_fine_all_found 23.9±0.2ms
2024-12-14T10:26:33.9360371Z [95.17%] ··· reindexing.Reindex.time_2d_fine_some_missing 36.7±0.4ms
2024-12-14T10:26:33.9361196Z [95.30%] ··· reindexing.ReindexDask.time_1d_coarse 3.56±0.6ms
2024-12-14T10:26:33.9362028Z [95.42%] ··· reindexing.ReindexDask.time_1d_fine_all_found 8.32±0.5ms
2024-12-14T10:26:33.9362858Z [95.54%] ··· ...dexing.ReindexDask.time_1d_fine_some_missing 18.7±0.3ms
2024-12-14T10:26:33.9363679Z [95.67%] ··· reindexing.ReindexDask.time_2d_coarse 5.73±0.4ms
2024-12-14T10:26:33.9364526Z [95.79%] ··· reindexing.ReindexDask.time_2d_fine_all_found 22.8±0.6ms
2024-12-14T10:26:33.9365364Z [95.92%] ··· ...dexing.ReindexDask.time_2d_fine_some_missing 40.0±1ms
2024-12-14T10:26:33.9366153Z [96.04%] ··· renaming.SwapDims.time_swap_dims ok
2024-12-14T10:26:33.9366769Z [96.04%] ··· ========== ============
2024-12-14T10:26:33.9367138Z size
2024-12-14T10:26:33.9367490Z ---------- ------------
2024-12-14T10:26:33.9367904Z 1000 34.7±0.3μs
2024-12-14T10:26:33.9368320Z 100000 35.0±0.2μs
2024-12-14T10:26:33.9368764Z 10000000 35.4±0.1μs
2024-12-14T10:26:33.9369095Z ========== ============
2024-12-14T10:26:33.9369330Z
2024-12-14T10:26:33.9369736Z [96.16%] ··· renaming.SwapDims.time_swap_dims_newindex ok
2024-12-14T10:26:33.9370360Z [96.16%] ··· ========== =========
2024-12-14T10:26:33.9370988Z size
2024-12-14T10:26:33.9371332Z ---------- ---------
2024-12-14T10:26:33.9371763Z 1000 188±2μs
2024-12-14T10:26:33.9372162Z 100000 188±2μs
2024-12-14T10:26:33.9372581Z 10000000 191±2μs
2024-12-14T10:26:33.9372978Z ========== =========
2024-12-14T10:26:33.9373196Z
2024-12-14T10:26:33.9373565Z [96.29%] ··· repr.Repr.time_repr 7.59±0.05ms
2024-12-14T10:26:33.9374321Z [96.41%] ··· repr.Repr.time_repr_html 101±1ms
2024-12-14T10:26:33.9375304Z [96.53%] ··· repr.ReprMultiIndex.time_repr 689±4μs
2024-12-14T10:26:33.9376151Z [96.66%] ··· repr.ReprMultiIndex.time_repr_html 2.80±0.02ms
2024-12-14T10:26:33.9377008Z [96.78%] ··· ...rayRollingMemory.peakmem_1drolling_construct ok
2024-12-14T10:26:33.9377641Z [96.78%] ··· ======== ======
2024-12-14T10:26:33.9377973Z stride
2024-12-14T10:26:33.9378330Z -------- ------
2024-12-14T10:26:33.9378671Z None 198M
2024-12-14T10:26:33.9378993Z 5 198M
2024-12-14T10:26:33.9379323Z 50 198M
2024-12-14T10:26:33.9379641Z ======== ======
2024-12-14T10:26:33.9379849Z
2024-12-14T10:26:33.9380276Z [96.91%] ··· ...aArrayRollingMemory.peakmem_1drolling_reduce ok
2024-12-14T10:26:33.9380917Z [96.91%] ··· ====== ====== =======
2024-12-14T10:26:33.9381287Z -- use_bottleneck
2024-12-14T10:26:33.9381653Z ------ --------------
2024-12-14T10:26:33.9382027Z func True False
2024-12-14T10:26:33.9382376Z ====== ====== =======
2024-12-14T10:26:33.9383055Z sum 171M 172M
2024-12-14T10:26:33.9383428Z max 171M 172M
2024-12-14T10:26:33.9383788Z mean 171M 172M
2024-12-14T10:26:33.9384141Z ====== ====== =======
2024-12-14T10:26:33.9384390Z
2024-12-14T10:26:33.9384841Z [97.03%] ··· ...aArrayRollingMemory.peakmem_ndrolling_reduce ok
2024-12-14T10:26:33.9385479Z [97.03%] ··· ====== ====== =======
2024-12-14T10:26:33.9385840Z -- use_bottleneck
2024-12-14T10:26:33.9386201Z ------ --------------
2024-12-14T10:26:33.9386564Z func True False
2024-12-14T10:26:33.9386898Z ====== ====== =======
2024-12-14T10:26:33.9387246Z sum 190M 190M
2024-12-14T10:26:33.9387610Z max 189M 189M
2024-12-14T10:26:33.9387978Z mean 190M 190M
2024-12-14T10:26:33.9388347Z ====== ====== =======
2024-12-14T10:26:33.9388584Z
2024-12-14T10:26:33.9389035Z [97.15%] ··· ...setRollingMemory.peakmem_1drolling_construct ok
2024-12-14T10:26:33.9389621Z [97.15%] ··· ======== ======
2024-12-14T10:26:33.9389881Z stride
2024-12-14T10:26:33.9390177Z -------- ------
2024-12-14T10:26:33.9390719Z None 198M
2024-12-14T10:26:33.9390920Z 5 198M
2024-12-14T10:26:33.9391111Z 50 198M
2024-12-14T10:26:33.9391395Z ======== ======
2024-12-14T10:26:33.9391536Z
2024-12-14T10:26:33.9391851Z [97.28%] ··· ...atasetRollingMemory.peakmem_1drolling_reduce ok
2024-12-14T10:26:33.9392293Z [97.28%] ··· ====== ====== =======
2024-12-14T10:26:33.9392604Z -- use_bottleneck
2024-12-14T10:26:33.9392837Z ------ --------------
2024-12-14T10:26:33.9393099Z func True False
2024-12-14T10:26:33.9393357Z ====== ====== =======
2024-12-14T10:26:33.9393579Z sum 196M 298M
2024-12-14T10:26:33.9393889Z max 196M 298M
2024-12-14T10:26:33.9394095Z mean 196M 298M
2024-12-14T10:32:25.5958630Z ====== ====== =======
2024-12-14T10:32:25.5959023Z
2024-12-14T10:32:25.5959786Z [97.40%] ··· ...atasetRollingMemory.peakmem_ndrolling_reduce ok
2024-12-14T10:32:25.5960579Z [97.40%] ··· ====== ====== =======
2024-12-14T10:32:25.5960862Z -- use_bottleneck
2024-12-14T10:32:25.5961103Z ------ --------------
2024-12-14T10:32:25.5961334Z func True False
2024-12-14T10:32:25.5961543Z ====== ====== =======
2024-12-14T10:32:25.5961758Z sum 214M 310M
2024-12-14T10:32:25.5962055Z max 214M 310M
2024-12-14T10:32:25.5962277Z mean 214M 305M
2024-12-14T10:32:25.5962497Z ====== ====== =======
2024-12-14T10:32:25.5962636Z
2024-12-14T10:32:25.5963203Z [97.52%] ··· rolling.Rolling.time_rolling ok
2024-12-14T10:32:25.5963644Z [97.52%] ··· ======= ======== ============ ============
2024-12-14T10:32:25.5963927Z -- use_bottleneck
2024-12-14T10:32:25.5964199Z ---------------- -------------------------
2024-12-14T10:32:25.5964473Z func center True False
2024-12-14T10:32:25.5964717Z ======= ======== ============ ============
2024-12-14T10:32:25.5965024Z mean True 22.1±0.5ms 142±1ms
2024-12-14T10:32:25.5965329Z mean False 13.3±0.5ms 142±2ms
2024-12-14T10:32:25.5965633Z count True 57.6±0.4ms 57.9±0.6ms
2024-12-14T10:32:25.5965935Z count False 57.5±0.3ms 57.8±0.6ms
2024-12-14T10:32:25.5966195Z ======= ======== ============ ============
2024-12-14T10:32:25.5966353Z
2024-12-14T10:32:25.5966599Z [97.65%] ··· rolling.Rolling.time_rolling_construct ok
2024-12-14T10:32:25.5967003Z [97.65%] ··· ======== ======== =========== =========
2024-12-14T10:32:25.5967273Z -- use_bottleneck
2024-12-14T10:32:25.5967525Z ----------------- ---------------------
2024-12-14T10:32:25.5967781Z center stride True False
2024-12-14T10:32:25.5968035Z ======== ======== =========== =========
2024-12-14T10:32:25.5968321Z True 1 (0) 226±3ms 227±2ms
2024-12-14T10:32:25.5968611Z True 1 (1) 228±2ms 227±2ms
2024-12-14T10:32:25.5968902Z False 1 (0) 227±2ms 229±2ms
2024-12-14T10:32:25.5969194Z False 1 (1) 228±0.3ms 228±2ms
2024-12-14T10:32:25.5969431Z ======== ======== =========== =========
2024-12-14T10:32:25.5969595Z
2024-12-14T10:32:25.5969823Z [97.77%] ··· rolling.Rolling.time_rolling_long ok
2024-12-14T10:32:25.5970227Z [97.77%] ··· ======= ======== ============= =============
2024-12-14T10:32:25.5970492Z -- use_bottleneck
2024-12-14T10:32:25.5970758Z ---------------- ---------------------------
2024-12-14T10:32:25.5971022Z func pandas True False
2024-12-14T10:32:25.5971480Z ======= ======== ============= =============
2024-12-14T10:32:25.5971792Z mean True 495±3μs 492±1μs
2024-12-14T10:32:25.5972100Z mean False 253±1μs 6.11±0.07ms
2024-12-14T10:32:25.5972400Z count True 578±3μs 576±2μs
2024-12-14T10:32:25.5972708Z count False 1.98±0.05ms 2.07±0.04ms
2024-12-14T10:32:25.5972957Z ======= ======== ============= =============
2024-12-14T10:32:25.5973117Z
2024-12-14T10:32:25.5973346Z [97.90%] ··· rolling.Rolling.time_rolling_np ok
2024-12-14T10:32:25.5973750Z [97.90%] ··· ========= ============= =========== ===========
2024-12-14T10:32:25.5974026Z -- use_bottleneck
2024-12-14T10:32:25.5974290Z ----------------------- -----------------------
2024-12-14T10:32:25.5974568Z window_ min_periods True False
2024-12-14T10:32:25.5974837Z ========= ============= =========== ===========
2024-12-14T10:32:25.5975134Z 20 5 (0) 193±2ms 194±0.8ms
2024-12-14T10:32:25.5975436Z 20 5 (1) 195±0.5ms 192±1ms
2024-12-14T10:32:25.5975738Z 40 5 (0) 353±3ms 353±2ms
2024-12-14T10:32:25.5976032Z 40 5 (1) 356±3ms 356±1ms
2024-12-14T10:32:25.5976273Z ========= ============= =========== ===========
2024-12-14T10:32:25.5976433Z
2024-12-14T10:32:25.5977356Z ##[error][98.02%] ··· rolling.RollingDask.time_rolling 2/8 failed
2024-12-14T10:32:25.5980292Z [98.02%] ··· ======= ======== ============ ============
2024-12-14T10:32:25.5980753Z -- use_bottleneck
2024-12-14T10:32:25.5981161Z ---------------- -------------------------
2024-12-14T10:32:25.5981562Z func center True False
2024-12-14T10:32:25.5981925Z ======= ======== ============ ============
2024-12-14T10:32:25.5982259Z mean True 847±2ms failed
2024-12-14T10:32:25.5982794Z mean False 648±10ms failed
2024-12-14T10:32:25.5983250Z count True 21.0±0.06s 21.2±0.07s
2024-12-14T10:32:25.5983548Z count False 21.1±0.04s 21.2±0.01s
2024-12-14T10:32:25.5983793Z ======= ======== ============ ============
2024-12-14T10:32:25.5984056Z
2024-12-14T10:32:25.5986091Z [98.02%] ···· For parameters: 'mean', True, False
2024-12-14T10:32:25.5986332Z
2024-12-14T10:32:25.5986498Z
2024-12-14T10:46:35.1276758Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1277331Z
2024-12-14T10:46:35.1277619Z For parameters: 'mean', False, False
2024-12-14T10:46:35.1278005Z
2024-12-14T10:46:35.1278290Z
2024-12-14T10:46:35.1278638Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1278964Z
2024-12-14T10:46:35.1279729Z [98.14%] ··· rolling.RollingDask.time_rolling_construct ok
2024-12-14T10:46:35.1280531Z [98.14%] ··· ======== ======== ============ ============
2024-12-14T10:46:35.1280993Z -- use_bottleneck
2024-12-14T10:46:35.1281459Z ----------------- -------------------------
2024-12-14T10:46:35.1282215Z center stride True False
2024-12-14T10:46:35.1282606Z ======== ======== ============ ============
2024-12-14T10:46:35.1283096Z True 1 (0) 20.5±0.07s 20.4±0.06s
2024-12-14T10:46:35.1283587Z True 1 (1) 20.4±0.08s 20.6±0.1s
2024-12-14T10:46:35.1284058Z False 1 (0) 20.6±0.07s 21.0±0.2s
2024-12-14T10:46:35.1284549Z False 1 (1) 21.0±0.1s 21.0±0.1s
2024-12-14T10:46:35.1284953Z ======== ======== ============ ============
2024-12-14T10:46:35.1285219Z
2024-12-14T10:46:35.1285638Z [98.27%] ··· rolling.RollingDask.time_rolling_long ok
2024-12-14T10:46:35.1286368Z [98.27%] ··· ======= ======== ============= =============
2024-12-14T10:46:35.1286841Z -- use_bottleneck
2024-12-14T10:46:35.1287312Z ---------------- ---------------------------
2024-12-14T10:46:35.1287857Z func pandas True False
2024-12-14T10:46:35.1288380Z ======= ======== ============= =============
2024-12-14T10:46:35.1288958Z mean True 1.32±0.05ms 1.29±0.03ms
2024-12-14T10:46:35.1289563Z mean False 5.21±0.2ms 29.9±0.2ms
2024-12-14T10:46:35.1290155Z count True 1.49±0.04ms 1.46±0.05ms
2024-12-14T10:46:35.1290743Z count False 13.1±0.4ms 13.5±0.2ms
2024-12-14T10:46:35.1291233Z ======= ======== ============= =============
2024-12-14T10:46:35.1291558Z
2024-12-14T10:46:35.1294992Z ##[error][98.39%] ··· rolling.RollingDask.time_rolling_np failed
2024-12-14T10:46:35.1296784Z [98.39%] ··· ========= ============= ======== ========
2024-12-14T10:46:35.1297249Z -- use_bottleneck
2024-12-14T10:46:35.1297696Z ----------------------- -----------------
2024-12-14T10:46:35.1298148Z window_ min_periods True False
2024-12-14T10:46:35.1298589Z ========= ============= ======== ========
2024-12-14T10:46:35.1298991Z 20 5 (0) failed failed
2024-12-14T10:46:35.1299410Z 20 5 (1) failed failed
2024-12-14T10:46:35.1299833Z 40 5 (0) failed failed
2024-12-14T10:46:35.1300251Z 40 5 (1) failed failed
2024-12-14T10:46:35.1300645Z ========= ============= ======== ========
2024-12-14T10:46:35.1301065Z For parameters: 20, 5 (0), True
2024-12-14T10:46:35.1301460Z
2024-12-14T10:46:35.1301741Z
2024-12-14T10:46:35.1302083Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1302499Z
2024-12-14T10:46:35.1302997Z For parameters: 20, 5 (0), False
2024-12-14T10:46:35.1303401Z
2024-12-14T10:46:35.1303677Z
2024-12-14T10:46:35.1304011Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1304444Z
2024-12-14T10:46:35.1304750Z For parameters: 20, 5 (1), True
2024-12-14T10:46:35.1305141Z
2024-12-14T10:46:35.1305412Z
2024-12-14T10:46:35.1305731Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1306172Z
2024-12-14T10:46:35.1306492Z For parameters: 20, 5 (1), False
2024-12-14T10:46:35.1306889Z
2024-12-14T10:46:35.1307160Z
2024-12-14T10:46:35.1307489Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1307886Z
2024-12-14T10:46:35.1308175Z For parameters: 40, 5 (0), True
2024-12-14T10:46:35.1308562Z
2024-12-14T10:46:35.1308825Z
2024-12-14T10:46:35.1309151Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1309556Z
2024-12-14T10:46:35.1309861Z For parameters: 40, 5 (0), False
2024-12-14T10:46:35.1310456Z
2024-12-14T10:46:35.1310724Z
2024-12-14T10:46:35.1311034Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1311437Z
2024-12-14T10:46:35.1311711Z For parameters: 40, 5 (1), True
2024-12-14T10:46:35.1312099Z
2024-12-14T10:46:35.1312371Z
2024-12-14T10:46:35.1312716Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1313152Z
2024-12-14T10:46:35.1313472Z For parameters: 40, 5 (1), False
2024-12-14T10:46:35.1313877Z
2024-12-14T10:46:35.1314169Z
2024-12-14T10:46:35.1314389Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1314626Z
2024-12-14T10:46:35.1314898Z [98.39%] ···· For parameters: 20, 5 (0), True
2024-12-14T10:46:35.1315161Z
2024-12-14T10:46:35.1315432Z
2024-12-14T10:46:35.1315641Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1315959Z
2024-12-14T10:46:35.1316204Z For parameters: 20, 5 (0), False
2024-12-14T10:46:35.1316448Z
2024-12-14T10:46:35.1316720Z
2024-12-14T10:46:35.1316928Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1317267Z
2024-12-14T10:46:35.1317475Z For parameters: 20, 5 (1), True
2024-12-14T10:46:35.1317721Z
2024-12-14T10:46:35.1317970Z
2024-12-14T10:46:35.1318175Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1318513Z
2024-12-14T10:46:35.1318709Z For parameters: 20, 5 (1), False
2024-12-14T10:46:35.1319198Z
2024-12-14T10:46:35.1319393Z
2024-12-14T10:46:35.1319631Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1319974Z
2024-12-14T10:46:35.1320167Z For parameters: 40, 5 (0), True
2024-12-14T10:46:35.1320493Z
2024-12-14T10:46:35.1320675Z
2024-12-14T10:46:35.1320872Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:35.1321122Z
2024-12-14T10:46:35.1321308Z For parameters: 40, 5 (0), False
2024-12-14T10:46:36.7115122Z
2024-12-14T10:46:36.7115545Z
2024-12-14T10:46:36.7115893Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:36.7116232Z
2024-12-14T10:46:36.7116532Z For parameters: 40, 5 (1), True
2024-12-14T10:46:36.7116967Z
2024-12-14T10:46:36.7117271Z
2024-12-14T10:46:36.7117563Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:36.7117860Z
2024-12-14T10:46:36.7118078Z For parameters: 40, 5 (1), False
2024-12-14T10:46:36.7118357Z
2024-12-14T10:46:36.7118543Z
2024-12-14T10:46:36.7118771Z asv: benchmark timed out (timeout 60.0s)
2024-12-14T10:46:36.7119004Z
2024-12-14T10:46:36.7119553Z [98.51%] ··· unstacking.Unstacking.time_unstack_fast 2.96±0.08ms
2024-12-14T10:46:36.7120182Z [98.64%] ··· unstacking.Unstacking.time_unstack_pandas_slow 3.25±0.2ms
2024-12-14T10:46:36.7120785Z [98.76%] ··· unstacking.Unstacking.time_unstack_slow 2.97±0.07ms
2024-12-14T10:46:36.7121384Z [98.89%] ··· unstacking.UnstackingDask.time_unstack_fast 19.3±0.2ms
2024-12-14T10:46:36.7121985Z [99.01%] ··· ...king.UnstackingDask.time_unstack_pandas_slow 2.99±0.02ms
2024-12-14T10:46:36.7122609Z [99.13%] ··· unstacking.UnstackingDask.time_unstack_slow 3.05±0.02ms
2024-12-14T10:46:36.7123227Z [99.26%] ··· ...nstackingSparse.peakmem_unstack_to_sparse_2d n/a
2024-12-14T10:46:36.7123715Z [99.26%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7124171Z [99.38%] ··· ...nstackingSparse.peakmem_unstack_to_sparse_3d n/a
2024-12-14T10:46:36.7124616Z [99.38%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7125337Z [99.50%] ··· unstacking.UnstackingSparse.time_unstack_fast n/a
2024-12-14T10:46:36.7125781Z [99.50%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7126209Z [99.63%] ··· ...ng.UnstackingSparse.time_unstack_pandas_slow n/a
2024-12-14T10:46:36.7126624Z [99.63%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7127066Z [99.75%] ··· unstacking.UnstackingSparse.time_unstack_slow n/a
2024-12-14T10:46:36.7127500Z [99.75%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7127918Z [99.88%] ··· ...g.UnstackingSparse.time_unstack_to_sparse_2d n/a
2024-12-14T10:46:36.7128343Z [99.88%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7128774Z [100.00%] ··· ...g.UnstackingSparse.time_unstack_to_sparse_3d n/a
2024-12-14T10:46:36.7129203Z [100.00%] ···· asv: skipped: NotImplementedError()
2024-12-14T10:46:36.7129567Z + grep 'Traceback \|failed\|PERFORMANCE DECREASED' benchmarks.log
2024-12-14T10:46:36.7134089Z + exit 1
2024-12-14T10:46:36.7139509Z ++ '[' 1 = 1 ']'
2024-12-14T10:46:36.7139835Z ++ '[' -x /usr/bin/clear_console ']'
```
</details> | open | 2024-12-14T11:11:36Z | 2025-03-21T19:24:23Z | https://github.com/pydata/xarray/issues/9890 | [
"CI"
] | Illviljan | 6 |
ultralytics/ultralytics | deep-learning | 19,134 | Different behavior (NOT good) of yolov11n on Jetson Orin Nano Super | ### Search before asking
- [x] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/orgs/ultralytics/discussions) and found no similar questions.
### Question
Recently, we upgraded our Jetson Orin Nano 8GB DevKit to JetPack 6.2, which is supposed to bring a big performance boost (from 40 to 67 TOPS).

In our tests we see problems compared to the results on JetPack 5.1.4: many object-detection bounding boxes are lost in the test video here:
- Test video: [Detect bbox lost of yolov11n on Jetson Orin Nano Super/Jetpack 6.2](https://youtu.be/JQrGnXisGqw)
- Test code: https://github.com/SnapDragonfly/jetson-fpv/blob/main/utils/yolo.py
- Test script
```
$ cat test.txt
simulation rtp streaming source:
video-viewer file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 rtp://@:5600 --input-loop=-1 --headless
test yolo with 11n model
python3 ./utils/yolo.py rtp://@:5600
```
- Test Env:
```
Software part of jetson-stats 4.3.1 - (c) 2024, Raffaello Bonghi
Model: NVIDIA Jetson Orin Nano Developer Kit - Jetpack 6.2 [L4T 36.4.3]
NV Power Mode[0]: 15W
Serial Number: [XXX Show with: jetson_release -s XXX]
Hardware:
- P-Number: p3767-0005
- Module: NVIDIA Jetson Orin Nano (Developer kit)
Platform:
- Distribution: Ubuntu 22.04 Jammy Jellyfish
- Release: 5.15.148-tegra
jtop:
- Version: 4.3.1
- Service: Active
Libraries:
- CUDA: 12.6.68
- cuDNN: 9.3.0.75
- TensorRT: 10.3.0.30
- VPI: 3.2.4
- OpenCV: 4.11.0 - with CUDA: YES
DeepStream C/C++ SDK version: 7.1
Python Environment:
Python 3.10.12
GStreamer: YES (1.20.3)
NVIDIA CUDA: YES (ver 12.6, CUFFT CUBLAS FAST_MATH)
OpenCV version: 4.11.0 CUDA True
YOLO version: 8.3.68
PYCUDA version: 2024.1.2
Torch version: 2.5.0a0+872d972e41.nv24.08
Torchvision version: 0.20.0
DeepStream SDK version: 1.2.0
onnxruntime version: 1.20.1
onnxruntime-gpu version: 1.19.2
FPV Environment:
MSPOSD version: c28d645 20250205_151537
```
Any ideas or suggestions would be much appreciated.
### Additional
_No response_ | closed | 2025-02-08T02:21:13Z | 2025-03-10T02:08:47Z | https://github.com/ultralytics/ultralytics/issues/19134 | [
"question",
"detect",
"embedded"
] | lida2003 | 41 |
taverntesting/tavern | pytest | 619 | Does MQTT publish support the use of external functions? | Hello, I encountered a problem: I want to use an external method in the `mqtt_publish` section of my YAML, the same way I do for HTTP requests, but when I run it locally my external function is not resolved and substituted.
- tavern yaml
```yaml
stages:
  - name: Success Request
    mqtt_publish:
      topic: xxx
      qos: 1
      json:
        $ext:
          function: testing_request:save_static_data_request
    mqtt_response:
      topic: xxx
      timeout: 5
      json:
        msg: "success"
        code: 0
```
- python external file
```python
def save_static_data_request():
    current_timestamp = int(time.time())
    request = {
        ...
        "time": current_timestamp,
        ...
    }
    return request
```
With the configuration above, my external function is not parsed, although the same `$ext` call works normally in an HTTP request stage.
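For what it's worth, independent of Tavern's MQTT support, the external function itself must be importable from where pytest runs (e.g. via `PYTHONPATH`) and must return a JSON-serializable dict. A self-contained sketch of the function above (the `device_id` key is a placeholder, not from the original):

```python
import time

def save_static_data_request():
    # Build the JSON payload to publish; every value must be
    # JSON-serializable for tavern to substitute it into the stage.
    current_timestamp = int(time.time())
    request = {
        "device_id": "placeholder-id",  # placeholder key for illustration
        "time": current_timestamp,
    }
    return request

payload = save_static_data_request()
print(sorted(payload))  # ['device_id', 'time']
```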
Please help me, thanks!! | closed | 2020-11-12T09:33:37Z | 2020-12-11T16:53:07Z | https://github.com/taverntesting/tavern/issues/619 | [] | Sokkam | 2 |
pytest-dev/pytest-cov | pytest | 216 | Pluggy incompatibility with pytest==3.7.0 when using pipenv | When updating in pipenv today:
```
$> pipenv install
Pipfile.lock (bd627c) out of date, updating to (5f00e3)...
Locking [dev-packages] dependencies...
Warning: Your dependencies could not be resolved. You likely have a mismatch in your sub-dependencies.
You can use $ pipenv install --skip-lock to bypass this mechanism, then run $ pipenv graph to inspect the situation.
Hint: try $ pipenv lock --pre if it is a pre-release dependency.
Could not find a version that matches pluggy<0.7,>=0.5,>=0.7
Tried: 0.3.0, 0.3.0, 0.3.1, 0.3.1, 0.4.0, 0.4.0, 0.5.0, 0.5.1, 0.5.1, 0.5.2, 0.5.2, 0.6.0, 0.6.0, 0.6.0, 0.7.1, 0.7.1
There are incompatible versions in the resolved dependencies.
```
I cannot see a way in which pytest-cov directly depends on pluggy though? | closed | 2018-07-31T12:37:01Z | 2018-08-01T08:11:46Z | https://github.com/pytest-dev/pytest-cov/issues/216 | [] | benhowes | 5 |
ClimbsRocks/auto_ml | scikit-learn | 139 | for adding predicted features: add a flag for exists or not | Inside `transform` for `AddPredictedFeature`, check whether this feature is already present in the dataset. If it is, don't do anything; if it isn't, add it in.

And maybe add it in every time for a single row, just to be sure we're doing that right at prediction time.
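A minimal sketch of the exists-check described above (function and argument names are hypothetical, not auto_ml's actual API):

```python
def add_predicted_feature(row, feature_name, predict):
    # If the feature was already computed (e.g. for the training set),
    # leave it untouched; otherwise compute and attach it.
    if feature_name in row:
        return row
    row[feature_name] = predict(row)
    return row

row = {"x": 3}
add_predicted_feature(row, "pred", lambda r: r["x"] * 2)
print(row)  # {'x': 3, 'pred': 6}
add_predicted_feature(row, "pred", lambda r: 0)  # already present: no change
print(row)  # {'x': 3, 'pred': 6}
```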
Doing this will let us get predicted features once for our training set, and once for our test set, rather than having to re-compute them each and every new model we train up. | closed | 2016-11-09T22:58:28Z | 2017-03-12T01:13:50Z | https://github.com/ClimbsRocks/auto_ml/issues/139 | [] | ClimbsRocks | 1 |
paperless-ngx/paperless-ngx | machine-learning | 8,476 | [BUG] Custom_field selectbox order | ### Description
When creating a custom field as a select box and assigning its contents to documents, the data is stored on the document as follows:
...'custom_fields': [{'value': None, 'field': 1}, {'value': 13, 'field': 2}],...
The order of the select box elements is therefore critical since they are numbered from 0 to x and stored as their value in the end.
If an element is deleted from the select box, the values of the custom fields across all documents are not updated. This means that all fields with a value higher than the deleted one will be assigned the next value, which is, of course, incorrect.
Proposed solution: Instead of storing the number as the field value, store the select box value (string). This way, the select box can later be reordered without affecting the values stored in documents. Moreover, this would save significant resources, as it would eliminate the need to iterate through all documents whenever the custom field list changes.
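A toy illustration of the index shift and of the proposed string-based fix (this only models the renumbering; Paperless's real storage layer differs in detail):

```python
options = ["draft", "review", "done"]  # select options, addressed by index
stored_index = 1                       # a document saved with "review"
stored_label = "review"                # proposed: store the string instead

options.remove("draft")                # remaining options renumber to 0..n-1

# Index-based storage now silently resolves to the wrong option:
print(options[stored_index])           # done

# Label-based storage is unaffected by deletion or reordering:
print(stored_label in options)         # True
```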
I found this issue while developing [this script](https://github.com/paperless-ngx/paperless-ngx/discussions/8475); maybe it is worth integrating into the Paperless-ngx core in some form.
### Steps to reproduce
1. Create custom field select box
2. assign different values to several documents
3. delete the first value of the custom field
Result: All values of the existing documents are moved forward by 1.
And there is another ui issue with this type of custom field (selectbox):

### Webserver logs
```bash
no logs
```
### Browser logs
_No response_
### Paperless-ngx version
2.13.5
### Host OS
Debian 12.8
### Installation method
Docker - official image
### System status
_No response_
### Browser
_No response_
### Configuration changes
_No response_
### Please confirm the following
- [X] I believe this issue is a bug that affects all users of Paperless-ngx, not something specific to my installation.
- [X] This issue is not about the OCR or archive creation of a specific file(s). Otherwise, please see above regarding OCR tools.
- [X] I have already searched for relevant existing issues and discussions before opening this report.
- [X] I have updated the title field above with a concise description. | closed | 2024-12-12T15:26:11Z | 2025-01-12T03:12:26Z | https://github.com/paperless-ngx/paperless-ngx/issues/8476 | [
"not a bug"
] | AndryXY | 3 |
autogluon/autogluon | computer-vision | 4,185 | [BUG] | [V] I provided code that demonstrates a minimal reproducible example. <!-- Ideal, especially via source install -->
[X] I confirmed bug exists on the latest mainline of AutoGluon via source install. <!-- Preferred -->
[V] I confirmed bug exists on the latest stable version of AutoGluon. <!-- Unnecessary if prior items are checked -->
**Describe the bug**
When attempting to run hyperparameter tuning with NN_TORCH as the model on a TabularDataset, an exception is thrown related to URI handling by the pyarrow library. The error message indicates an "ArrowInvalid: URI has empty scheme".
**Expected behavior**
The training should proceed without errors, and the model should handle the URI scheme appropriately or provide more specific guidance on expected URI formats.
**To Reproduce**
Install a fresh environment with Python 3.10 and AutoGluon 1.1.0
Run the following script:
```python
from autogluon.tabular import TabularDataset, TabularPredictor
data_url = 'https://raw.githubusercontent.com/mli/ag-docs/main/knot_theory/'
train_data = TabularDataset(f'{data_url}train.csv')
label = 'signature'
hp_args = {"num_trials": 3, "scheduler": "local", "searcher": "random"}
fit_args = {"hyperparameter_tune_kwargs": hp_args, "included_model_types": ["NN_TORCH"]}
predictor = TabularPredictor(label=label).fit(train_data, **fit_args)
```

**Screenshots / Logs**
Logs from the error:
`pyarrow.lib.ArrowInvalid: URI has empty scheme: 'AutogluonModels/ag-20240509_084509/models'`
**Installed Versions**

<details>

INSTALLED VERSIONS
------------------
date : 2024-05-09
time : 08:47:49.707205
python : 3.10.14.final.0
OS : Linux
OS-release : 5.15.0-1040-azure
Version : #47~20.04.1-Ubuntu SMP Fri Jun 2 21:38:08 UTC 2023
machine : x86_64
processor : x86_64
num_cores : 16
cpu_ram_mb : 128812.6796875
cuda version : None
num_gpus : 0
gpu_ram_mb : []
avail_disk_size_mb : 4284286
accelerate : 0.21.0
autogluon : 1.1.0
autogluon.common : 1.1.0
autogluon.core : 1.1.0
autogluon.features : 1.1.0
autogluon.multimodal : 1.1.0
autogluon.tabular : 1.1.0
autogluon.timeseries : 1.1.0
boto3 : 1.34.101
catboost : 1.2.5
defusedxml : 0.7.1
evaluate : 0.4.2
fastai : 2.7.15
gluonts : 0.14.3
hyperopt : 0.2.7
imodels : None
jinja2 : 3.1.4
joblib : 1.4.2
jsonschema : 4.21.1
lightgbm : 4.3.0
lightning : 2.1.4
matplotlib : 3.8.4
mlforecast : 0.10.0
networkx : 3.3
nlpaug : 1.1.11
nltk : 3.8.1
nptyping : 2.4.1
numpy : 1.26.4
nvidia-ml-py3 : 7.352.0
omegaconf : 2.2.3
onnxruntime-gpu : None
openmim : 0.3.9
optimum : 1.18.1
optimum-intel : None
orjson : 3.10.3
pandas : 2.2.2
pdf2image : 1.17.0
Pillow : 10.3.0
psutil : 5.9.8
pytesseract : 0.3.10
pytorch-lightning : 2.1.4
pytorch-metric-learning: 2.3.0
ray : 2.10.0
requests : 2.28.2
scikit-image : 0.20.0
scikit-learn : 1.4.0
scikit-learn-intelex : None
scipy : 1.12.0
seqeval : 1.2.2
setuptools : 60.2.0
skl2onnx : None
statsforecast : 1.4.0
tabpfn : None
tensorboard : 2.16.2
text-unidecode : 1.3
timm : 0.9.16
torch : 2.1.2
torchmetrics : 1.2.1
torchvision : 0.16.2
tqdm : 4.65.2
transformers : 4.38.2
utilsforecast : 0.0.10
vowpalwabbit : None
xgboost : 2.0.3
</details> | closed | 2024-05-09T08:54:13Z | 2024-05-10T18:25:37Z | https://github.com/autogluon/autogluon/issues/4185 | [
"duplicate"
] | giladrubin1 | 1 |
HIT-SCIR/ltp | nlp | 189 | glibc detected corrupted double-linked list |
cat input | bin/examples/pos_cmdline --postagger-model ./ltp_data/pos.model
```
*** glibc detected *** bin/examples/pos_cmdline: corrupted double-linked list: 0x00000000009193b0 ***
======= Backtrace: =========
/lib64/libc.so.6[0x3671275f3e]
/lib64/libc.so.6[0x36712790b6]
/usr/lib64/libstdc++.so.6(_ZdlPv+0x18)[0x3a1aa6748d]
/usr/lib64/libstdc++.so.6(_ZdaPv+0x18)[0x3a1aa674c5]
bin/examples/pos_cmdline(_ZN3ltp9postagger9PostaggerD1Ev+0x12c)[0x438cfc]
bin/examples/pos_cmdline(_Z26postagger_create_postaggerPKcS0_+0xe3)[0x42b7d3]
bin/examples/pos_cmdline(main+0xbd2)[0x423442]
/lib64/libc.so.6(__libc_start_main+0xfd)[0x367121ed1d]
bin/examples/pos_cmdline[0x422249]
======= Memory map: ========
00400000-00466000 r-xp 00000000 08:05 798521 /data/nlp/ltp/bin/examples/pos_cmdline
00666000-00667000 rw-p 00066000 08:05 798521 /data/nlp/ltp/bin/examples/pos_cmdline
00667000-00668000 rw-p 00000000 00:00 0
00912000-00933000 rw-p 00000000 00:00 0 [heap]
3670e00000-3670e20000 r-xp 00000000 08:02 1048626 /lib64/ld-2.12.so
367101f000-3671020000 r--p 0001f000 08:02 1048626 /lib64/ld-2.12.so
3671020000-3671021000 rw-p 00020000 08:02 1048626 /lib64/ld-2.12.so
3671021000-3671022000 rw-p 00000000 00:00 0
3671200000-367138a000 r-xp 00000000 08:02 1048709 /lib64/libc-2.12.so
367138a000-367158a000 ---p 0018a000 08:02 1048709 /lib64/libc-2.12.so
367158a000-367158e000 r--p 0018a000 08:02 1048709 /lib64/libc-2.12.so
367158e000-3671590000 rw-p 0018e000 08:02 1048709 /lib64/libc-2.12.so
3671590000-3671594000 rw-p 00000000 00:00 0
3671a00000-3671a17000 r-xp 00000000 08:02 1048793 /lib64/libpthread-2.12.so
3671a17000-3671c17000 ---p 00017000 08:02 1048793 /lib64/libpthread-2.12.so
3671c17000-3671c18000 r--p 00017000 08:02 1048793 /lib64/libpthread-2.12.so
3671c18000-3671c19000 rw-p 00018000 08:02 1048793 /lib64/libpthread-2.12.so
3671c19000-3671c1d000 rw-p 00000000 00:00 0
3671e00000-3671e83000 r-xp 00000000 08:02 1048990 /lib64/libm-2.12.so
3671e83000-3672082000 ---p 00083000 08:02 1048990 /lib64/libm-2.12.so
3672082000-3672083000 r--p 00082000 08:02 1048990 /lib64/libm-2.12.so
3672083000-3672084000 rw-p 00083000 08:02 1048990 /lib64/libm-2.12.so
3a1aa00000-3a1ab18000 r-xp 00000000 08:02 1330322 /usr/lib64/libstdc++.so.6.0.20
3a1ab18000-3a1ad17000 ---p 00118000 08:02 1330322 /usr/lib64/libstdc++.so.6.0.20
3a1ad17000-3a1ad20000 r--p 00117000 08:02 1330322 /usr/lib64/libstdc++.so.6.0.20
3a1ad20000-3a1ad23000 rw-p 00120000 08:02 1330322 /usr/lib64/libstdc++.so.6.0.20
3a1ad23000-3a1ad38000 rw-p 00000000 00:00 0
3a1ae00000-3a1ae16000 r-xp 00000000 08:02 1048641 /lib64/libgcc_s-4.4.7-20120601.so.1
3a1ae16000-3a1b015000 ---p 00016000 08:02 1048641 /lib64/libgcc_s-4.4.7-20120601.so.1
3a1b015000-3a1b016000 rw-p 00015000 08:02 1048641 /lib64/libgcc_s-4.4.7-20120601.so.1
7fdf74000000-7fdf74021000 rw-p 00000000 00:00 0
7fdf74021000-7fdf78000000 ---p 00000000 00:00 0
7fdf781e5000-7fdf781eb000 rw-p 00000000 00:00 0
7fdf781f4000-7fdf781f6000 rw-p 00000000 00:00 0
7fffc36f6000-7fffc370b000 rw-p 00000000 00:00 0 [stack]
7fffc37ff000-7fffc3800000 r-xp 00000000 00:00 0 [vdso]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]
Aborted (core dumped)
``` | closed | 2016-11-01T06:23:40Z | 2016-11-01T07:46:34Z | https://github.com/HIT-SCIR/ltp/issues/189 | [] | luyee | 2 |
NullArray/AutoSploit | automation | 781 | Divided by zero exception51 | Error: Attempted to divide by zero.51 | closed | 2019-04-19T16:00:42Z | 2019-04-19T16:37:55Z | https://github.com/NullArray/AutoSploit/issues/781 | [] | AutosploitReporter | 0 |
MycroftAI/mycroft-core | nlp | 2,294 | Strange Skill regex behaviour | Interested in any comments or hypotheses on this one...
It was flagged that some utterances like "tell me about the royal bank of canada" were not receiving any response from Mycroft. So first stop was to investigate the Skill itself.
1. With the existing Skills regex, the Article Title is extracted correctly and returns a confidence of 0.375 however the intent does not seem to be triggered.
2. While playing around, [adding an additional regex line](https://github.com/krisgesling/skill-wiki/tree/gez/fix-long-match):
`.*(wiki|for|about|wikipedia)(?! (for|about)) the (?P<ArticleTitle>.+)`
correctly triggered the intent despite it now returning a lower confidence of 0.1875.
- Doing this as a one liner:
`.*(wiki|for|about|wikipedia)(?! (for|about))( the|) (?P<ArticleTitle>.+)`
extracted `ArticleTitle` as "the" instead of "royal bank of canada"
3. Reverting the regex back to the original single line now matches the utterance and triggers the intent correctly.
So I'm presuming that it's something in the regex registration or caching, but thought I'd seek advice before diving too deep into the rabbit hole. | closed | 2019-09-09T02:10:03Z | 2024-09-08T08:34:02Z | https://github.com/MycroftAI/mycroft-core/issues/2294 | [] | krisgesling | 2 |
deepset-ai/haystack | pytorch | 8,216 | clean up docstrings: MetadataRouter | closed | 2024-08-13T12:25:43Z | 2024-08-14T06:40:19Z | https://github.com/deepset-ai/haystack/issues/8216 | [] | dfokina | 0 |
|
PhantomInsights/subreddit-analyzer | matplotlib | 3 | thank you for putting this out there! | First of all, thank you for making this public :) Have always wanted to do something like this but have been lazy.
It would be great if you could put some simple instructions on how to install / use it without reading the source too much :P (like if I could see what you're running in cmd line)
Merry Christmas :)
| closed | 2019-12-24T22:03:52Z | 2019-12-25T19:08:41Z | https://github.com/PhantomInsights/subreddit-analyzer/issues/3 | [] | brianwu02 | 2 |
pallets/flask | flask | 4,979 | url_for() add support for explicit query params | I have a flask route that takes the query params of the request, and uses them to build a new URL.
A simplified example looks like this:
```python
@app.route("/my-route")
def my_route():
    request_query = request.args.to_dict()
    request_query['foo'] = 'bar'
    url = url_for("my_route", **request_query)
    return app.redirect(url)
```
So for example if you request `/my-route?value=1` you will be redirected to `/my-route?value=1&foo=bar`
Mypy started raising a type error after upgrading to Flask 2.2.
```
error: Argument 3 to "url_for" has incompatible type "**Dict[str, str]"; expected "Optional[bool]" [arg-type]
```
I understand the source of the error: The ``**request_query`` dict could technically contain reserved keys like `"_external"` in it, which would lead to an invalid function signature. But this points to a larger bug that I had not considered, which is that unless I sanitize all of the reserved keywords (`_external`, `_method`, etc.) from my query parameters, this could lead to unexpected URLs and a possible vulnerability (e.g. somebody sending `/my-route?_scheme=http` and downgrading the redirect to http).
Even if I sanitize those reserved keywords:
- Future versions of flask could add new reserved keywords, making my code invisibly vulnerable again.
- I don't know how I would reconcile the type error without switching away from ``**request_query`` to explicit keywords, which would be very difficult in my case.
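As an interim mitigation (not the proposed API), reserved keys can be stripped before the call. A sketch that assumes every reserved keyword starts with an underscore, which also means legitimate query parameters starting with `_` get dropped:

```python
def strip_reserved(params):
    # Drop keys that could collide with url_for's underscore-prefixed
    # reserved keywords (_external, _scheme, _method, _anchor, ...).
    return {k: v for k, v in params.items() if not k.startswith("_")}

query = {"value": "1", "_scheme": "http", "_external": "1"}
print(strip_reserved(query))  # {'value': '1'}
```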
What I think would fix this, and would be a nice API to have in general, is to have an additional `_query=` keyword argument. This would be an alternative to `**values` but would be explicitly for additional query params, with no reserved keywords.
```python
url = url_for("my_route", _query=request_query)
```
| closed | 2023-02-15T16:29:18Z | 2023-02-15T19:49:23Z | https://github.com/pallets/flask/issues/4979 | [] | michael-lazar | 3 |
nerfstudio-project/nerfstudio | computer-vision | 3,134 | Google Drive Permission Error when running `ns-download-data nerfstudio --capture-name=poster` | **Describe the bug**
Running `ns-download-data nerfstudio --capture-name=poster` results in a gdown Google Drive permission error.
**To Reproduce**
Steps to reproduce the behavior:
1. Run `ns-download-data nerfstudio --capture-name=poster` after installation.
**Expected behavior**
Download Data
**Screenshots**
<img width="567" alt="image" src="https://github.com/nerfstudio-project/nerfstudio/assets/107154811/dd3b5fed-f525-4aed-b62d-d7230dc803d3">
| open | 2024-05-10T01:22:54Z | 2024-05-12T21:51:23Z | https://github.com/nerfstudio-project/nerfstudio/issues/3134 | [] | branyang02 | 4 |
aimhubio/aim | tensorflow | 2,888 | Continue runs in callback | ## 🚀 Feature
Even though there is [continue run](https://aimstack.readthedocs.io/en/latest/using/manage_runs.html#continue-runs) functionality for the `Run` object, this support is not available for callbacks when using the Hugging Face Trainer API.
<!-- A clear and concise description of the feature proposal -->
### Motivation
Each run starts fresh when training is resumed via the Hugging Face Trainer API, but the run should continue from its previous state.
```
# Initialize aim_callback
aim_callback = AimCallback(experiment='huggingface_experiment')
# Initialize trainer
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=small_train_dataset,
    eval_dataset=small_eval_dataset,
    compute_metrics=compute_metrics,
    callbacks=[aim_callback]
)
```
<!-- Please outline the motivation for the proposal. Is your feature request related to a problem? e.g., I'm always frustrated when [...]. If this is related to another GitHub issue, please link here too -->
| open | 2023-07-03T04:16:00Z | 2023-07-03T16:06:23Z | https://github.com/aimhubio/aim/issues/2888 | [
"type / enhancement"
] | prince14322 | 1 |
amisadmin/fastapi-amis-admin | fastapi | 67 | QuickSaveItemApi error, please fix the code | In the file admin/admin.py:
primaryField=self.pk_name,
quickSaveItemApi=f"put:{self.router_path}/item/" + "${id}",
Change it to:
primaryField=self.pk_name,
quickSaveItemApi=f"put:{self.router_path}/item/${self.pk_name}"
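Note that the corrected f-string interpolates `${self.pk_name}` as a literal `$` followed by the attribute value (the `$key` shorthand), rather than `${key}` with braces. A quick check with hypothetical values:

```python
router_path = "/admin/user"  # hypothetical values for illustration
pk_name = "uuid"

# `${pk_name}` in an f-string is a literal "$" plus the interpolated value.
quick_save_item_api = f"put:{router_path}/item/${pk_name}"
print(quick_save_item_api)  # put:/admin/user/item/$uuid
```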
Otherwise, when pk_name is not id, a 405 error is returned. | closed | 2022-11-16T02:13:18Z | 2023-09-17T08:51:16Z | https://github.com/amisadmin/fastapi-amis-admin/issues/67 | [] | zinohome | 3 |
jowilf/starlette-admin | sqlalchemy | 585 | Bug: Empty bulk actions dropdown menu when no bulk actions available. | **Describe the bug**
Empty bulk actions dropdown menu when no bulk actions available.
**To Reproduce**
Allow only "view" row actions, e.g.:
```python
class ReadOnlyModelView(ModelView):
    def can_create(self, request):
        return False
    def can_edit(self, request):
        return False
    def can_delete(self, request):
        return False
    row_actions = ["view"]
    row_actions_display_type = RowActionsDisplayType.ICON_LIST
    actions = []
```
**Environment (please complete the following information):**
- starlette-admin: 0.14.1
- fastapi: 0.99.0
- sqlmodel: 0.0.8
Can provide more packages on demand, but most likely not relevant.
**Additional context**
The best thing when no bulk actions are available would be to:
* automatically hide the row checkboxes
* never show the bulk actions button/dropdown
* alternatively, config to hide checkboxes and bulk actions button

---
PS: thank you for the great software, it's very useful and productive! ♥️ | open | 2024-10-03T12:30:30Z | 2024-10-14T13:17:35Z | https://github.com/jowilf/starlette-admin/issues/585 | [
"bug"
] | essenciary | 0 |
jofpin/trape | flask | 42 | No me sale el registro cuando me conecto a la dirección | 
| closed | 2018-04-26T17:14:51Z | 2018-09-19T16:26:03Z | https://github.com/jofpin/trape/issues/42 | [] | Blask0 | 3 |
psf/requests | python | 6,877 | List of documented exceptions doesn't match reality / unused `URLRequired` exception | The documentation lists a [relatively small list](https://requests.readthedocs.io/en/latest/api/#exceptions) of exceptions, among them:
> exception `requests.URLRequired(*args, **kwargs)`
> A valid URL is required to make a request.
which would imply that passing an invalid URL raises `URLRequired`. However, that exception is actually dead code and not raised anywhere ever since ab27027aa8916e6e199bbb35083beb5d6339f6aa in 2012. Instead, with requests 2.32.3, invalid URLs raise something like `MissingSchema`, `InvalidSchema` or `InvalidURL`, none of which are documented.
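For illustration, a minimal check of the current behavior (no network access is involved; the exception is raised before any request is sent, assuming a recent requests version):

```python
import requests
from requests.exceptions import MissingSchema, URLRequired, RequestException

try:
    requests.get("example.com")  # no scheme in the URL
except MissingSchema as exc:
    print(type(exc).__name__)    # MissingSchema, not URLRequired

# The undocumented classes still honour the documented hierarchy:
print(issubclass(MissingSchema, ValueError))      # True
print(issubclass(URLRequired, RequestException))  # True
```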
Looking at [exceptions.py](https://github.com/psf/requests/blob/main/src/requests/exceptions.py), there seem to be various other undocumented exceptions in there:
- `class InvalidJSONError(RequestException):` (only `JSONDecodeError` which inherits from it)
- `class ProxyError(ConnectionError):`
- `class SSLError(ConnectionError):`
- `class ConnectTimeout(ConnectionError, Timeout):` (`Timeout` is documented)
- `class ReadTimeout(Timeout):` (`Timeout` is documented)
- `class MissingSchema(RequestException, ValueError):`
- `class InvalidSchema(RequestException, ValueError):`
- `class InvalidURL(RequestException, ValueError):`
- `class InvalidHeader(RequestException, ValueError):`
- `class InvalidProxyURL(InvalidURL):`
- `class ChunkedEncodingError(RequestException):`
- `class ContentDecodingError(RequestException, BaseHTTPError):`
- `class StreamConsumedError(RequestException, TypeError):`
- `class RetryError(RequestException):`
- `class UnrewindableBodyError(RequestException):`
(Some of those might be internal, or considered not worth documenting since they can be caught by `except ValueError:`. However, even e.g. [Errors and Exceptions](https://docs.python-requests.org/en/latest/user/quickstart/#errors-and-exceptions) or the reference docs don't seem to point out that either. | open | 2025-01-31T16:15:01Z | 2025-02-01T19:57:14Z | https://github.com/psf/requests/issues/6877 | [] | The-Compiler | 4 |
mwaskom/seaborn | pandas | 3,182 | Missing xticklabels in non-rectangular shared plots | In the objects interface, one can instantiate a faceted plot where the faceted column is wrapped. When this results in a non-rectangular grid of axes, there are no xtick labels for the bottom axes that are not on the bottom row. Example code:
```python
import seaborn as sns
import seaborn.objects as so
iris = sns.load_dataset("iris")
(
so.Plot(iris, "sepal_length", 'sepal_width')
.add(so.Dots())
.facet("species", wrap=2)
)
```

Using `relplot` to generate a similar plot results in a correct plot that has the x-tick labels for the rightmost axes.
Analysis: The objects interface instantiates a plot using a rectangular `Figure.subplots` call (with shared axes) and then removes the axes that do not correspond to a subset of the faceted variable:
https://github.com/mwaskom/seaborn/blob/22cdfb0c93f8ec78492d87edb810f10cb7f57a31/seaborn/_core/subplots.py#L199-L208
Behind the scenes, `Figure.subplots` creates a rectangular grid and disables the inner axes' xticklabels (since all axes are shared):
https://github.com/matplotlib/matplotlib/blob/cc20d3f7bbb18a21683fb499d98b0814546583c6/lib/matplotlib/axes/_base.py#L4587-L4596
But the deletion of the extra axes by seaborn doesn't restore the ticks and labels that `Figure.subplots` removed, and instead only operates on the visibility of the ticks (which are absent at this point due to the axes instantiation):
https://github.com/mwaskom/seaborn/blob/22cdfb0c93f8ec78492d87edb810f10cb7f57a31/seaborn/_core/plot.py#L1040-L1048
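Until this is fixed, a possible workaround (a plain-matplotlib sketch of the same mechanics; with the objects interface one could apply the same call to the axes of a figure passed via `Plot.on(...)`):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, just for the example
import matplotlib.pyplot as plt

# Reproduce the situation: shared 2x2 grid, one axes removed -> 3 facets
fig, axes = plt.subplots(2, 2, sharex=True, sharey=True)
axes[1, 1].remove()

# axes[0, 1] is now the bottom-most axes of its column, but subplots()
# already hid its x tick labels; turn them back on explicitly:
axes[0, 1].xaxis.set_tick_params(labelbottom=True)
```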
Expected behavior: the ticks and their labels should be visible for non-last-row bottom axes of non-rectangular wrapped shared facets. | closed | 2022-12-11T18:59:54Z | 2022-12-11T19:15:32Z | https://github.com/mwaskom/seaborn/issues/3182 | [] | MaozGelbart | 1 |
benbusby/whoogle-search | flask | 747 | [BUG] support.google.com link using removed /url endpoint | **Describe the bug**
Results for support.google.com had a href of /url?q=&{google_url}
1. the / in the href breaks compatibility with WHOOGLE_URL_PREFIX
2. the url endpoint is removed in 2490089645f2cc2659f75be1398d13bbb6fec726
Probably another, possibly related bug: with anonymous view enabled, searching "google support" adds an anonymous view link to the result, but searching "support google" does not.
**To Reproduce**
Steps to reproduce the behavior:
1. search "support google"
2. click first result
**Deployment Method**
- [x] Heroku (one-click deploy)
- [ ] Docker
- [ ] `run` executable
- [ ] pip/pipx
- [ ] Other: [describe setup]
**Version of Whoogle Search**
- [x] Latest build from afc93b8a2178eb1d78dcb3aab229966e03d133f8
- [ ] Version [version number]
- [ ] Not sure
**Desktop (please complete the following information):**
- OS: Windows 10
- Browser chrome
- Version 100
| closed | 2022-05-05T11:37:40Z | 2022-05-16T15:57:44Z | https://github.com/benbusby/whoogle-search/issues/747 | [
"bug"
] | invis-z | 0 |
deezer/spleeter | deep-learning | 615 | Error: AttributeError: module 'ffmpeg' has no attribute '_run' | - [x ] I didn't find a similar issue already open.
- [x ] I read the documentation (README AND Wiki)
- [x ] I have installed FFMpeg
- [ x] My problem is related to Spleeter only, not a derivative product (such as Webapplication, or GUI provided by others)
## Description
When running spleeter to extract stems from a song. I keep getting this error message:
`File "/Users/afolabi/Library/Python/3.8/lib/python/site-packages/spleeter/audio/ffmpeg.py", line 102, in load
except ffmpeg._run.Error as e:
AttributeError: module 'ffmpeg' has no attribute '_run'`
I've installed ffmpeg and read the docs. I've googled extensively for this but can't find anything.
| closed | 2021-04-22T22:59:31Z | 2022-10-17T14:05:43Z | https://github.com/deezer/spleeter/issues/615 | [
"bug",
"invalid"
] | afolabiaji | 3 |
huggingface/transformers | deep-learning | 36,513 | Allow parameters of the ViTPooler to be configurable, with the default values set to the current hardcoded values | ### Feature request
I would like to add `pooled_output_size: int | None` and `pooler_act: str` fields in ViTConfig such that the following https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_vit.py#L665-L666 can be configured if needed.
The two fields would have default values of `None` (which falls back to using the value of `hidden_size`, as is currently done) and `"tanh"`. This would result in no change to the default behaviour, while allowing some flexibility.
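A minimal sketch of what the configurable pooler could look like (field names as proposed here; not the actual implementation, and the real code would likely map the activation string through `ACT2FN`):

```python
import torch
import torch.nn as nn

class ConfigurableViTPooler(nn.Module):
    """Pooler whose output size and activation come from the config,
    defaulting to the current hardcoded behaviour (hidden_size + tanh)."""

    def __init__(self, config):
        super().__init__()
        pooled = getattr(config, "pooled_output_size", None) or config.hidden_size
        act = getattr(config, "pooler_act", "tanh")
        self.dense = nn.Linear(config.hidden_size, pooled)
        self.activation = nn.Tanh() if act == "tanh" else nn.Identity()

    def forward(self, hidden_states):
        # Pool by taking the hidden state of the first ([CLS]) token
        return self.activation(self.dense(hidden_states[:, 0]))
```

For example, a config with `hidden_size=768, pooled_output_size=512, pooler_act="linear"` would give a 512-dimensional pooled output with no activation, matching the HeAR use case.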
### Motivation
This is part of the open-sourcing of the HeAR model (https://arxiv.org/abs/2403.02522) in PyTorch format. The encoder uses the ViT architecture, and the pooler has no activation (identity function instead of tanh) and the pooling dimension is smaller than the hidden dimension.
### Your contribution
I will submit a PR | open | 2025-03-03T14:22:02Z | 2025-03-07T14:12:29Z | https://github.com/huggingface/transformers/issues/36513 | [
"Feature request",
"Vision"
] | sebbaur | 6 |
marimo-team/marimo | data-science | 3,889 | Read script metadata when running `marimo edit` | ### Description
Script metadata is only read for a file when running `marimo edit run.py`. However, it would be nice to have it work for `marimo edit` also, and then any file that is opened within that session.
### Suggested solution
Probably reading from the file key / query param?
### Alternative
_No response_
### Additional context
_No response_ | closed | 2025-02-23T22:07:56Z | 2025-03-03T04:43:13Z | https://github.com/marimo-team/marimo/issues/3889 | [
"enhancement"
] | riyavsinha | 0 |
neuml/txtai | nlp | 746 | Support max_seq_length parameter with model pooling | txtai has an internal mechanism to run mean and cls pooling. This adds support for `sentence-transformers` models without having to install the library. This component was added to limit the number of package dependencies.
One difference noted between `sentence-transformers` and txtai is that txtai doesn't read the `max_seq_length` parameter. A change should be added to optionally read this parameter. The default behavior will continue to use the tokenizer/model max length unless this parameter is specified. | closed | 2024-07-04T13:17:57Z | 2024-07-04T14:05:48Z | https://github.com/neuml/txtai/issues/746 | [] | davidmezzetti | 0 |
tensorlayer/TensorLayer | tensorflow | 949 | Warning in tensorlayer.layers.utils.set_name_reuse | Hello Hao Dong,
I am using your library for my project on tumor segmentation. The code has stopped working because of the issue of tensorlayer.layers.utils.set_name_reuse.
How to solve this issue?
Thank you. | closed | 2019-03-25T06:05:15Z | 2019-04-03T12:34:32Z | https://github.com/tensorlayer/TensorLayer/issues/949 | [] | rupalkapdi | 2 |
keras-team/keras | tensorflow | 20,964 | User guide to use Rematerialization | Gradient checkpointing (rematerialization) was introduced in https://github.com/keras-team/keras/pull/20743, but there is currently no user guide on how to adopt or use it. | open | 2025-02-26T01:06:43Z | 2025-02-27T17:11:10Z | https://github.com/keras-team/keras/issues/20964 | [
"type:docs"
] | pure-rgb | 2 |
Miserlou/Zappa | flask | 2,242 | Unable to deploy Zappa to AWS Lambda, returns 502 every time | Zappa settings
```
{
"prod": {
"aws_region": "ap-south-1",
"django_settings": "app.main.settings",
"profile_name": "project",
"project_name": "project-admin-backend",
"runtime": "python3.7",
"s3_bucket": "project-backend-api",
"cors": true,
"slim_handler": true,
"memory_size": 1024,
"exclude": [".env", "seed_data.json", "*pyc"]
},
"preprod": {
"aws_region": "ap-south-1",
"django_settings": "app.main.settings",
"profile_name": "project",
"project_name": "project-admin-backend",
"runtime": "python3.7",
"cors": true,
"slim_handler": true,
"s3_bucket": "project-backend-api",
"memory_size": 1024,
"exclude": [".env", "seed_data.json", "*pyc"]
}
}
```
Requirements
```
Django==3.2
djangorestframework==3.13.1
environs==9.5.0
psycopg2-binary==2.9.3
Pillow==9.2.0
djangorestframework-simplejwt==5.2.0
django-cors-headers==3.3.0
celery==5.2.7
django-ses==3.1.2
django-storages==1.13.1
django-redis-cache==3.0.1
pandas==1.3.5
pyfcm==1.5.4
qrcode==7.4.2
importlib-metadata==4.13.0
zappa==0.56.1
```
When I try to execute `zappa manage prod "check"`, this was my traceback:
0230415T043909Z
x-amz-security-token:IQoJb3JpZ2luX2VjEBUaCmFwLXNvdXRoLTEiRjBEAiAa1acqBoQXOxbiIqrrH4ZDyIrdRqwwwdWMlAX4sz2nNgIgI8zuEtRmAUgi8zAVp1SOJ7Rqz/YQK32WwfKi9Ss5O1AqkwMI/v//////////ARAAGgwyNjY1OTEyODM4MzAiDH1JaqtIeqgCdEJZPyrnApt1KZdSuY6dRu9Hn/O13iC8u5vWbhSOd7pxTBHEpiA96SnphRG1xQXuGPbtaqBYJCad4mSNwai6wZLfmW09lPmQ5hY9rnh1d5XS5jasnzVewRFmn78ks6Z32rP1fBKbuVmGqRziw2DXM/j5Rdavmtcf2ArQdzWZJj9uZNyhR31x/YCT1FOOXknk9W80hiX96ZUSJmFG0eyususAJazRdIz5bXu+EOC8+VEjQvdIFdnvPq0vECg5Cci9hogJDOOTBl5CwwlZT8nuDXW9vXcA1PQB1CXQUUIRv0TSvvnUdOTn/DrH4JwFMdwXhoyrnNB7bmDpGaadX/pfw29FtO2CKtCrFBTM0XJqZAv3vnw1BzeVXe40RnIa+io6230duUY2DLrXP6WXwZYeQBqnvGYxGCltCjT15Dmq7N6D6SoYt7R7pZxu2ro5vX0lE/iiTSnqLtqN5a3+D2vlLJwPwQJzzE1hAR/TNzXKMJ7U6KEGOp4BfXH9coWUVOLJzplfzBkalNw1lS5VXMuwo6+Lmx7ufomKTeFg63qii/d4RhLOqRm2ChV7QEchWQ0NIZpQmgbvH3Y1MhucHqtkH5HGiEdnXmogZMDQ3pti8tVJJ/bgkPwdOg9qNfYhWRInqLpalUCddyhrs0jC4f9XlZQ9D97AIKH6mesW7TKv9PzHcB3w+9aK3UmU+PRglWrj10C93Lk=
host;x-amz-content-sha256;x-amz-date;x-amz-security-token
e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855
[DEBUG] 2023-04-15T04:39:09.852Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 StringToSign:
AWS4-HMAC-SHA256
20230415T043909Z
20230415/ap-south-1/s3/aws4_request
2a86e6bfa44c779be8ca80e30662d8f4e0d553109dee16a0ca9f380b1c6ab995
[DEBUG] 2023-04-15T04:39:09.853Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Signature:
59ba1efd32d3f6e64c4374ca94e80397fa96b8bcd585f693a122f468c35affa9
[DEBUG] 2023-04-15T04:39:09.853Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Event request-created.s3.GetObject: calling handler <function add_retry_headers at 0x7f6a1537e7a0>
[DEBUG] 2023-04-15T04:39:09.853Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Sending http request: <AWSPreparedRequest stream_output=True, method=GET, url=https://project-backend-api.s3.ap-south-1.amazonaws.com/prod_project-admin-backend_current_project.tar.gz, headers={'User-Agent': b'Boto3/1.26.114 Python/3.7.16 Linux/4.14.255-301-238.520.amzn2.x86_64 exec-env/AWS_Lambda_python3.7 Botocore/1.29.114 Resource', 'X-Amzn-Trace-Id': b'Root=1-643a2a6d-539eb256603bad4f729d972b;Parent=34f0a9b81b762cc0;Sampled=0;Lineage=e64e4ea5:0', 'X-Amz-Date': b'20230415T043909Z', 'X-Amz-Security-Token': b'IQoJb3JpZ2luX2VjEBUaCmFwLXNvdXRoLTEiRjBEAiAa1acqBoQXOxbiIqrrH4ZDyIrdRqwwwdWMlAX4sz2nNgIgI8zuEtRmAUgi8zAVp1SOJ7Rqz/YQK32WwfKi9Ss5O1AqkwMI/v//////////ARAAGgwyNjY1OTEyODM4MzAiDH1JaqtIeqgCdEJZPyrnApt1KZdSuY6dRu9Hn/O13iC8u5vWbhSOd7pxTBHEpiA96SnphRG1xQXuGPbtaqBYJCad4mSNwai6wZLfmW09lPmQ5hY9rnh1d5XS5jasnzVewRFmn78ks6Z32rP1fBKbuVmGqRziw2DXM/j5Rdavmtcf2ArQdzWZJj9uZNyhR31x/YCT1FOOXknk9W80hiX96ZUSJmFG0eyususAJazRdIz5bXu+EOC8+VEjQvdIFdnvPq0vECg5Cci9hogJDOOTBl5CwwlZT8nuDXW9vXcA1PQB1CXQUUIRv0TSvvnUdOTn/DrH4JwFMdwXhoyrnNB7bmDpGaadX/pfw29FtO2CKtCrFBTM0XJqZAv3vnw1BzeVXe40RnIa+io6230duUY2DLrXP6WXwZYeQBqnvGYxGCltCjT15Dmq7N6D6SoYt7R7pZxu2ro5vX0lE/iiTSnqLtqN5a3+D2vlLJwPwQJzzE1hAR/TNzXKMJ7U6KEGOp4BfXH9coWUVOLJzplfzBkalNw1lS5VXMuwo6+Lmx7ufomKTeFg63qii/d4RhLOqRm2ChV7QEchWQ0NIZpQmgbvH3Y1MhucHqtkH5HGiEdnXmogZMDQ3pti8tVJJ/bgkPwdOg9qNfYhWRInqLpalUCddyhrs0jC4f9XlZQ9D97AIKH6mesW7TKv9PzHcB3w+9aK3UmU+PRglWrj10C93Lk=', 'X-Amz-Content-SHA256': b'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855', 'Authorization': b'AWS4-HMAC-SHA256 Credential=ASIAT4EQUFJ3IDFCONNI/20230415/ap-south-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date;x-amz-security-token, Signature=59ba1efd32d3f6e64c4374ca94e80397fa96b8bcd585f693a122f468c35affa9', 'amz-sdk-invocation-id': b'7672d133-1180-4e30-8ff4-6cff1570f65e', 'amz-sdk-request': b'attempt=1'}>
[DEBUG] 2023-04-15T04:39:09.854Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Certificate path: /var/task/certifi/cacert.pem
[DEBUG] 2023-04-15T04:39:09.854Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Starting new HTTPS connection (1): project-backend-api.s3.ap-south-1.amazonaws.com:443
2023-04-15T04:39:39.558Z a9d26f60-693a-49ef-ae31-5c1c17d82b22 Task timed out after 30.03 seconds
[END] RequestId: a9d26f60-693a-49ef-ae31-5c1c17d82b22
[REPORT] RequestId: a9d26f60-693a-49ef-ae31-5c1c17d82b22
Duration: 30033.53 ms
Billed Duration: 30000 ms
Memory Size: 1024 MB
Max Memory Used: 45 MB
Error: Unhandled error occurred while invoking command.
| open | 2023-04-15T04:41:14Z | 2023-04-25T11:29:21Z | https://github.com/Miserlou/Zappa/issues/2242 | [] | cartoonmangodev1 | 1 |
iperov/DeepFaceLab | machine-learning | 5,214 | XSeg training GPU unavailable | ## Expected behavior
It should be able to use GPU for training.
## Actual behavior
```Press enter in 2 seconds to override model settings.Error: Cannot assign a device for operation XSeg/conv01/conv/weight: node XSeg/conv01/conv/weight (defined at /code/DeepFaceLab_Linux/DeepFaceLab/core/leras/layers/Conv2D.py:76) was explicitly assigned to /device:GPU:0 but available devices are [ /job:localhost/replica:0/task:0/device:CPU:0 ]. Make sure the device specification refers to a valid device.
[[XSeg/conv01/conv/weight]]
Traceback (most recent call last):
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1375, in _do_call
return fn(*args)
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1358, in _run_fn
self._extend_graph()
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1398, in _extend_graph
tf_session.ExtendSession(self._session)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Cannot assign a device for operation XSeg/conv01/conv/weight: {{node XSeg/conv01/conv/weight}} was explicitly assigned to /device:GPU:0 but available devices are [ /job:localhost/replica:0/task:0/device:CPU:0 ]. Make sure the device specification refers to a valid device.
[[XSeg/conv01/conv/weight]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/mainscripts/Trainer.py", line 58, in trainerThread
debug=debug,
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/models/Model_XSeg/Model.py", line 17, in __init__
super().__init__(*args, force_model_class_name='XSeg', **kwargs)
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/models/ModelBase.py", line 189, in __init__
self.on_initialize()
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/models/Model_XSeg/Model.py", line 68, in on_initialize
data_format=nn.data_format)
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/facelib/XSegNet.py", line 79, in __init__
model.init_weights()
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/core/leras/layers/Saveable.py", line 104, in init_weights
nn.init_weights(self.get_weights())
File "/home/shaoyu/code/DeepFaceLab_Linux/DeepFaceLab/core/leras/ops/__init__.py", line 48, in init_weights
nn.tf_sess.run (ops)
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 968, in run
run_metadata_ptr)
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1191, in _run
feed_dict_tensor, options, run_metadata)
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1369, in _do_run
run_metadata)
File "/home/shaoyu/.conda/envs/deepfacelab/lib/python3.7/site-packages/tensorflow/python/client/session.py", line 1394, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Cannot assign a device for operation XSeg/conv01/conv/weight: node XSeg/conv01/conv/weight (defined at /code/DeepFaceLab_Linux/DeepFaceLab/core/leras/layers/Conv2D.py:76) was explicitly assigned to /device:GPU:0 but available devices are [ /job:localhost/replica:0/task:0/device:CPU:0 ]. Make sure the device specification refers to a valid device.
[[XSeg/conv01/conv/weight]]
```
## Steps to reproduce
Follow the instructions to go through the process, and start training XSeg model.
## Other relevant information
- **Command lined used (if not specified in steps to reproduce)**: main.py ...
- **Operating system and version:** Linux
- **Python version:** 3.7 | open | 2020-12-24T19:44:38Z | 2023-06-08T22:20:57Z | https://github.com/iperov/DeepFaceLab/issues/5214 | [] | 1over137 | 7 |
widgetti/solara | jupyter | 996 | Meta tags: Potential bug or documentation inconsistency | Hi!
When looking at the docstrings of `solara.Head` we see this:
```
A component that manager the "head" tag of the page to avoid duplicate tags, such as titles.
Currently only supports the [title](/documentation/components/page/title) tag as child, e.g.:
```python
import solara
@solara.component
def Page():
with solara.VBox() as main:
MyAwesomeComponent()
with solara.Head():
solara.Title("My page title")
return main
```
```
However in the docs of `solara.Meta` we see that this can also be in the `solara.Head` :
```
"""Add a meta tag to the head element, or replace a meta tag with the same name and or property.
This component should be used inside a [Head](/documentation/components/page/head) component, e.g.:
```python
import solara
@solara.component
def Page():
with solara.VBox() as main:
MyAwesomeComponent()
with solara.Head():
solara.Meta(name="description", property="og:description", content="My page description")
solara.Meta(property="og:title", content="My page title for social media")
solara.Meta(property="og:image", content="https://solara.dev/static/assets/images/logo.svg")
solara.Meta(property="og:type", content="website")
return main
```
```
Anyway, my point is: I have been trying various approaches and I can't get the Open Graph tags to propagate to the head correctly.
I have tried including them in my `Layout` component (which is practically identical to the one Solara offers for multi-page apps, plus an attempt to add the meta OG tags).
I have also tried following the example from the `solara.dev` page itself, which is also in the Solara repo, but with no luck.
Either there is a bug, or I am doing something wrong (probably the latter, since the solara.dev website works).
Can someone please explain what is the expected way of using these components to add the OG/Twitter tags?
Many thanks! | open | 2025-02-06T20:36:08Z | 2025-02-07T11:42:06Z | https://github.com/widgetti/solara/issues/996 | [] | JovanVeljanoski | 2 |
PokeAPI/pokeapi | api | 933 | Okidogi Wrong Hidden Ability | Okidogi's hidden ability is listed as Zero to Hero instead of the proper ability of Guard Dog. | closed | 2023-10-01T01:42:54Z | 2023-10-05T05:09:41Z | https://github.com/PokeAPI/pokeapi/issues/933 | [] | tmbocheeko | 1 |
newpanjing/simpleui | django | 409 | Save button on change_list.html page update all table rows not only the editable row | I have reated `list_editable = ("pickup_date", "pickup_available_time")` in admin.py file but in the page change_list.html when I edit pickup_date for an selected row in the table I noticed that all the table rows are updated not only the selected one
| closed | 2021-11-14T00:12:58Z | 2022-06-13T02:29:02Z | https://github.com/newpanjing/simpleui/issues/409 | [
"bug"
] | AlyRadwan2020 | 2 |
replicate/cog | tensorflow | 1,524 | cog multimodel setup issue |
```
    output_path = "/tmp/super_resolution.png"
if high_resolution:
cv2.imwrite(output_path, cv2.cvtColor(output, cv2.COLOR_RGB2BGR))
else:
output_path = "/tmp/out.png"
output.save(output_path)
return output_path
```
My run command is:
```
$ cog predict -i image_path=@img.jpg
```
The error I got:
```
Running prediction...
{"prediction_id": null, "logger": "cog.server.http", "timestamp": "2024-02-09T06:08:01.061146Z", "severity": "ERROR", "message": " The return value of predict()
was not valid:\n\n 1 validation error for PredictionResponse\noutput -> __root__\n '' is not a valid URL scheme. 'data', 'http', or 'https' is supported. (type=value_error)\n\n Check that your predict function is in this form, where `output_type` is the same as the type you are returning (e.g. `str`):\n\n
def predict(...) -> output_type:\n ...\n"}
ⅹ /predictions call returned status 500
```
| closed | 2024-02-09T04:53:00Z | 2024-06-21T22:35:16Z | https://github.com/replicate/cog/issues/1524 | [] | tzktz | 1 |
xlwings/xlwings | automation | 2,498 | Remove vendored mistune | Anaconda now includes >2.0, so it is safe to use as an optional external dependency. | closed | 2024-08-12T20:48:29Z | 2024-08-13T09:51:01Z | https://github.com/xlwings/xlwings/issues/2498 | [] | fzumstein | 0 |
biosustain/potion | sqlalchemy | 11 | Eliminate 'DeferredSchema' | Instead return new instances using the `Schema.bind()` method if needed.
| closed | 2015-01-15T13:37:46Z | 2015-02-15T14:14:29Z | https://github.com/biosustain/potion/issues/11 | [] | lyschoening | 0 |
hankcs/HanLP | nlp | 664 | A bug with "立體=三維" in the traditional-to-simplified conversion dictionary | In t2s.txt there is the entry
立體=三維
But the "維" in "三維" is itself a traditional character; it should be "三维". | closed | 2017-11-02T04:33:33Z | 2017-11-02T06:45:57Z | https://github.com/hankcs/HanLP/issues/664 | [
"improvement"
] | searchserver | 1 |
google-research/bert | tensorflow | 914 | computing self-attention for tokens in a sentence | Hi,
I follow the intuition behind BERT, but I am still not quite sure how to compute self-attention for tokens in a sentence.
For instance:
['The cat sat on the mat', 'The cat lay on the rug']
I want to compute the attention scores for all the tokens in both the sentences. Also, if I plan to use the multi-head self-attention, how do I know which heads give the best scores?
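For reference, the per-head weights BERT exposes (e.g. via `output_attentions=True` in the transformers library) are just scaled dot-product attention; here is a toy NumPy sketch of the computation for one head, with random weights standing in for learned ones:

```python
import numpy as np

def attention_weights(X, Wq, Wk):
    """softmax(Q K^T / sqrt(d)) -- one head's token-to-token attention."""
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 16))          # 6 tokens, hidden size 16
Wq = rng.normal(size=(16, 8))         # hypothetical per-head projections
Wk = rng.normal(size=(16, 8))
A = attention_weights(X, Wq, Wk)      # A[i, j]: how much token i attends to j
```

There is no single "best" head; in practice one inspects or aggregates the per-layer, per-head attention tensors (tools like BertViz help visualize them).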
Thanks! | open | 2019-11-13T03:36:44Z | 2019-11-13T03:36:44Z | https://github.com/google-research/bert/issues/914 | [] | vr25 | 0 |
feder-cr/Jobs_Applier_AI_Agent_AIHawk | automation | 613 | [QUESTION]: What is the point of MINIMUM_WAIT_TIME? | ### Summary of your question
Why do we have the bot idle on pages if they complete all jobs before MINIMUM_WAIT_TIME (in app_config.py) has elapsed?
### Question details
If several pages need to be skipped because jobs have already been applied to, then this will cause the bot to idle for several minutes before starting the real workload. Also slowing down developers from testing when they need to restart the bot often.
### Context for the question
_No response_
### Additional context
_No response_ | closed | 2024-10-26T05:31:48Z | 2024-10-31T16:51:50Z | https://github.com/feder-cr/Jobs_Applier_AI_Agent_AIHawk/issues/613 | [
"question"
] | sloganking | 1 |
waditu/tushare | pandas | 1,720 | (pro version) Keep adj_factor update times consistent and backfill missing data | The adj_factor documentation says the update time is 9:30 AM, but only part of the data is actually updated then.
Taking today, 2023-10-09, as an example: '689009.SH' was still not updated at 16:45, and was only updated around 17:00.
In addition, '836504.BJ' has quote data starting 2022-08-22, but its adjustment factor data only starts on 2023-08-17.
| open | 2023-10-09T12:16:09Z | 2023-10-09T12:27:09Z | https://github.com/waditu/tushare/issues/1720 | [] | edward852 | 0 |