| text1 (string, lengths 2 to 269k) | text2 (string, lengths 2 to 242k) | label (int64: 0 or 1) |
|---|---|---|
### Is there an existing issue for this?
* I have searched the existing issues
### This issue exists in the latest npm version
* I am using the latest npm
### Current Behavior
Create this package in an empty directory:
{
"name": "missing-fs-extra",
"private": true,
"description": "A simple package to reproduce missing dependencies when installing with the --production flag",
"version": "0.1.0",
"dependencies": {
"gatsby": "^4.4.0"
},
"devDependencies": {
"@storybook/react": "^6.4.9"
},
"license": "MIT",
"scripts": {
"build": "gatsby build"
}
}
Then attempt to run:
npm install --production && npm run build
Notice how `fs-extra`, a direct dependency of `gatsby`, is missing from
the installed `node_modules`. `fs-extra` is also a direct dependency (with a
mismatched version) of `@storybook/react`; however, `@storybook/react` is a
`devDependency`, so my understanding is that it should be ignored completely
when installing with the `--production` flag.
### Expected Behavior
`fs-extra` should be installed in `node_modules` because it is a direct
dependency of `gatsby` which is a direct dependency of the root package.
### Steps To Reproduce
1. Using `npm` version 8.3.0
2. With this package:
{
"name": "missing-fs-extra",
"private": true,
"description": "A simple package to reproduce missing dependencies when installing with the --production flag",
"version": "0.1.0",
"dependencies": {
"gatsby": "^4.4.0"
},
"devDependencies": {
"@storybook/react": "^6.4.9"
},
"license": "MIT",
"scripts": {
"build": "gatsby build"
}
}
3. Run `npm install --production`
4. Notice the `fs-extra` module is missing from the installed modules, and when gatsby runs (`npm run build`) it fails to resolve the dependency.
### Environment
* npm: 8.3.0
* Node: v14.18.2
* OS: macOS Catalina 10.15.7 (19H1615)
* platform: MacBook Pro
* npm config:
; node bin location = /Users/fernandotoledo/.nvm/versions/node/v14.18.2/bin/node
; cwd = /Users/fernandotoledo/Desktop/test
; HOME = /Users/fernandotoledo
; Run `npm config ls -l` to show all defaults.
|
### Is there an existing issue for this?
* I have searched the existing issues
### Current Behavior
If a package contains a node_modules that is symlinked then an npm install
will issue the message
npm WARN reify Removing non-directory ....
and delete the node_modules directory.
### Expected Behavior
A symlinked node_modules directory should be used as is and not recreated -
many people symlink node_modules to a local filesystem so that it does not get
synced to cloud storage (like Dropbox).
I suspect that the npm code just checks whether an existing file object named
node_modules is a directory, and does not also check whether a non-directory
file object is actually a symlink to a directory.
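I haven't read the reify code, but the distinction I suspect can be illustrated in Python (a sketch only; npm itself is JavaScript): a stat-based check follows the symlink and reports a directory, while an lstat-based check sees only the link.

```python
import os
import stat
import tempfile

# Set up a directory and a symlink pointing at it (like a symlinked node_modules).
tmp = tempfile.mkdtemp()
real_dir = os.path.join(tmp, "real")
os.mkdir(real_dir)
link = os.path.join(tmp, "node_modules")
os.symlink(real_dir, link)

# stat (which os.path.isdir uses) follows the symlink: it reports a directory.
follows = os.path.isdir(link)
# lstat inspects the link itself: not a directory, but a symlink to one.
raw_mode = os.lstat(link).st_mode
is_dir_raw = stat.S_ISDIR(raw_mode)
is_link = stat.S_ISLNK(raw_mode)
```

If the reify code only performs the lstat-style check, it would classify the symlinked node_modules as a "non-directory" and remove it, which matches the warning message.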
### Steps To Reproduce
cd /tmp
mkdir symlinked
mkdir z
cd z
npm init -y
ln -s /tmp/symlinked node_modules
npm install lodash
### Environment
* OS: Ubuntu 20.04.1
* Node: 16.7
* npm: 7.21.0
| 0 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7
* Operating System version:
* Java version: 1.8
### Steps to reproduce this issue
1. Export the service normally. The provider URL is http://169.254.71.3:8081/com.luban.mock_demo.api.HelloService?anyhost=true&application=xml-demo-provider&bean.name=com.luban.mock_demo.api.HelloService&bind.ip=169.254.71.3&bind.port=8081&default.deprecated=false&default.dynamic=false&default.register=true&deprecated=false&dubbo=2.0.2&dynamic=false&generic=false&interface=com.luban.mock_demo.api.HelloService&methods=sayHello&pid=15854&register=true&release=2.7.0&server=tomcat&side=provider&timeout=2000&timestamp=1553146018199
2. After modifying the service's dynamic configuration, the provider URL becomes http://169.254.71.3:8081/com.luban.mock_demo.api.HelloService?anyhost=true&application=xml-demo-provider&bean.name=com.luban.mock_demo.api.HelloService&bind.ip=169.254.71.3&bind.port=8081&compatible_config=true&default.deprecated=false&default.dynamic=false&default.register=true&deprecated=false&dubbo=2.0.2&dynamic=false&generic=false&interface=com.luban.mock_demo.api.HelloService&methods=sayHello&pid=15854&register=true&release=2.7.0&server=tomcat&side=provider&timeout=6000&timestamp=1553146018199, which now contains the parameter compatible_config=true.
3. The consumer fails to reference the service, reporting that no provider address can be found.
The root cause is that the new provider URL carries the compatible_config=true parameter; when the consumer references the service, that URL is filtered out, so no provider address can be found.
The compatible_config=true parameter appears because the dynamic configuration center was modified, so configurators now contains override%3A%2F%2F0.0.0.0%2Fcom.luban.mock_demo.api.HelloService%3Fcategory%3Dconfigurators%26compatible_config%3Dtrue%26dynamic%3Dfalse%26enabled%3Dtrue%26timeout%3D6000. That URL carries the compatible_config parameter; the provider listens for changes to it and rewrites the provider URL, which is how compatible_config=true ends up in the provider URL.
Pls. provide [GitHub address] to reproduce this issue.
### Expected Result
The consumer can reference the service normally.
### Actual Result
The consumer cannot reference the service normally.
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Confusion
* Is there a feature for batched RPC in Dubbo?
By batched RPC I mean that the individual invocations in a batch are
dispatched by a single thread on the server side, in the order in which they
were placed into the batch message, which guarantees that the individual
operations in a batched RPC are processed in order on the server.
* The official site docs say we can set a retry timeout in an XML file, and then the client will retry all failed invocations regardless of what caused the failure.
* Does the Dubbo follow at-most-once semantics when handling failures?
* How does Dubbo handle failed invocations in the following cases:
1. A connection could not be established due to the crash of the remote physical machine (connection unreachable error)
2. A connection could not be established due to the crash of the remote process (connection refused error)
3. A connection lost due to the accidental shutdown of the remote physical machine (connection unreachable error)
4. A connection lost due to the crash of the remote process (connection refused error)
5. A remote invocation timeout expired due to a software bug such as thread overflow, deadlock, high CPU load, or OOM
6. An exception occurred while sending the request
7. An exception occurred while receiving the reply
8. An exception occurred after sending the request
9. An exception occurred after receiving the reply
Here cases 1-5 are the failure causes and cases 6-9 are the states of the
system when the error happens,
which means there are 20 combinations (5 causes x 4 states) that need to be taken into account...
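The combination count above can be checked mechanically; a Python sketch (the case descriptions are paraphrased from the list):

```python
import itertools

# Cases 1-5: what caused the failure.
failure_causes = [
    "machine crash (connection unreachable)",
    "process crash (connection refused)",
    "machine shutdown, connection lost",
    "process crash, connection lost",
    "timeout (deadlock, high CPU, OOM, ...)",
]
# Cases 6-9: where the system was when the error surfaced.
states = [
    "while sending the request",
    "while receiving the reply",
    "after sending the request",
    "after receiving the reply",
]
# Every cause can surface in every state: 5 x 4 = 20 combinations.
combinations = list(itertools.product(failure_causes, states))
```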
Regards,
Thanks.
| 0 |
## Problem Description
In the most recent alpha with the most recent release of normalize.css
(4.0.0), EnhancedButtons are styled like actual buttons.

After doing some digging, the problem is with recent changes in both projects.
In material-ui, commit `00b6ba1` added passing the type attribute to the inner
element, and in normalize.css a recent commit lowered the specificity of a
rule giving any element with `[type=button]` `-webkit-appearance: button`.
I've already opened an issue in their project, but because type is not a valid
attribute for span and because lowering the specificity is slightly more
performant, they don't seem too keen on reverting those changes.
## Versions
* Material-UI: 0.15.0-alpha.2
* React: 0.14.7
* Browser: Google Chrome 49.0.2623.87 (64-bit)
|
While working on mui-app-container, I ran into an issue with `AppBar` that
appears to be present only on Safari (it does not occur on Chrome or IE 11,
at least): when toggling a persistent drawer closed and then reopening it, the
`AppBar` does not account for the width of the drawer.
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
An `AppBar` should offset to the right by the amount of the drawer after
reopening.
## Current Behavior
After closing and reopening a persistent `Drawer`, the left side of `AppBar`
is now under the Drawer instead of being shifted to the right. This is present
on the v1 docs as of today (v1.0.0-beta.28). Notice the "Drawers" title not
being present.
## Safari (problem)

## Chrome (no problem)

You can also see this problem when toggling the persistent drawer example
(notice "Persistent drawer" no longer being visible when the drawer is open)

You can also see this problem reproduced in mui-app-container's storybook with
the story source here
## Steps to Reproduce (for bugs)
1. Go to persistent drawer example
2. Click on menu icon to toggle drawer
3. Notice "Persistent drawer" title is hidden under the drawer instead of being shifted
## Context
I have been working on creating an `<AppContainer>` component to help
orchestrate `<Drawer>`, `<AppBar>`, and content components, and came across
this issue.
## Your Environment
Tech | Version
---|---
Material-UI | v1.0.0-beta.28
React | 16.1.1
browser | Safari 11.0.2 (desktop and iOS)
| 0 |
This code:
import keras.models as krm
import keras.layers.core as klc
m = krm.Graph()
m.add_input('x', input_shape=(10,))
m.add_node(klc.Dense(5), 'y', input='x', create_output=True)
m.compile(loss={'y': 'mse'}, optimizer='adam')
with open('test.json', 'w') as f:
f.write(m.to_json())
with open('test.json', 'r') as f:
m = krm.model_from_json(f.read())
crashes with the following error:
... models.py", line 166, in model_from_json
... models.py", line 177, in model_from_config
... layer_utils.py", line 64, in container_from_config
... containers.py", line 568, in add_output
raise Exception('Duplicate output identifier: ' + name)
Exception: Duplicate output identifier: y
I'll see if I can fix it.
|
This is probably a duplicate of #4302 (where only tensorflow was tested).
### Problem
When using the Reshape layer, it is not clear whether one can use an unknown
dimension (`-1`) as in numpy. Using it results in an error. See the
following example (please bear with me that this is not a useful example; it
is just to show the actual problem):
from keras.layers.core import Dense, Reshape
from keras.models import Sequential
import numpy as np
X = np.random.random((1000, 50))
y = np.random.random((1000, 1))
model = Sequential()
model.add(Dense(30, input_shape=(50,)))
model.add(Reshape((-1, 6)))
model.add(Reshape((30,)))
model.add(Dense(1))
model.compile(loss='mse', optimizer='sgd')
model.fit(X, y)
This results in the following error (tested with Keras 1.2.0 and
Theano-0.9.0.dev4)
File "test.py", line 10, in <module>
model.add(Reshape((-1, 6)))
File "keras/models.py", line 327, in add
output_tensor = layer(self.outputs[0])
File "keras/engine/topology.py", line 569, in __call__
self.add_inbound_node(inbound_layers, node_indices, tensor_indices)
File "keras/engine/topology.py", line 632, in add_inbound_node
Node.create_node(self, inbound_layers, node_indices, tensor_indices)
File "keras/engine/topology.py", line 164, in create_node
output_tensors = to_list(outbound_layer.call(input_tensors[0], mask=input_masks[0]))
File "keras/layers/core.py", line 354, in call
return K.reshape(x, (-1,) + target_shape)
File "keras/backend/theano_backend.py", line 567, in reshape
return T.reshape(x, shape)
File "theano/tensor/basic.py", line 4722, in reshape
newshape = as_tensor_variable(newshape)
File "theano/tensor/basic.py", line 212, in as_tensor_variable
raise AsTensorError("Cannot convert %s to TensorType" % str_x, type(x))
theano.tensor.var.AsTensorError: ('Cannot convert (-1, None, 5, 6) to TensorType', <type 'tuple'>)
whereas
model.add(Reshape((5, 6)))
instead of
model.add(Reshape((-1, 6)))
would work as expected.
To me it looks like the Keras backend supports `-1` dims, so the question is
why users can't access this from the Reshape layer. Is there any way around it?
Where should I look to implement this feature?
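For comparison, numpy's inferred dimension works exactly as hoped here; a small sketch with shapes chosen to match the example above:

```python
import numpy as np

x = np.random.random((1000, 30))   # e.g. the Dense(30) output for 1000 samples
y = x.reshape(1000, -1, 6)         # -1 is inferred from the remaining size: 30 / 6 = 5
z = y.reshape(1000, 30)            # and it round-trips back
```

The per-sample target shape `(-1, 6)` is unambiguous as long as the known batch and trailing dimensions divide the total size, which is exactly the case in the failing `Reshape((-1, 6))` example.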
### Applications
Connecting the output of a 2D CNN to an RNN is difficult because the exact
output shape after pooling/downsampling operations needs to be known. See
the following Keras built-in application. If the input dimensions changed, or
even just the pooling stride, the reshape operation would need to be adjusted
manually, which could be cumbersome in a network with many layers.
| 0 |
_From @wangyibu on April 29, 2016 9:34_
* VSCode Version: 1.0.0
* OS Version: Windows 7 SP1 build 7601
* VS2015: Community 2015 Update 1
* system path: C:\Program Files (x86)\Microsoft SDKs\TypeScript\1.7;
Steps to Reproduce:
1. Add two TS files and compile them


2. Add a new folder

3. Move all the TS and JS files to the new folder

_Copied from original issue:microsoft/vscode#5980_
|
Using destructuring in function signatures is a relief, but when combining it
with type declarations a lot of duplication is necessary:
function foo(
{paramA, paramB, paramC, paramD = null}
:{paramA: number, paramB: string, paramC: boolean, paramD?: Object}
) {}
function bar(
[paramA, paramB, paramC, paramD]
:[number, string, boolean, Object]
) {}
Would it make sense to introduce the following syntax to reduce duplication of
code?
function foo(
{paramA as number, paramB as string, paramC as boolean, paramD? as Object}
) {}
function bar(
[paramA as number, paramB as string, paramC as boolean, paramD as Object]
) {}
Originally I wanted to suggest using the colon ":" instead of "as", but since
the colon is used for renaming destructured parameters, "as" is the only
option, I guess.
With this syntax change, the following would be possible too, I guess:
let {paramA as number, paramB as string, paramC as boolean, paramD as Object} = anyObject;
let {paramA: a as number, paramB: b as string, paramC: c as boolean, paramD: d as Object} = anyObject;
let [paramA as number, paramB as string, paramC as boolean, paramD as Object] = anyArray;
| 0 |
//== tests/cases/compiler/conflictingMemberTypesInBases.ts (1 errors) ====
interface A {
m: string;
}
interface B extends A {
}
interface C {
m: number;
}
interface D extends C {
}
interface E extends B { } // Error here for extending B and D
//!!! Interface 'E' cannot simultaneously extend types 'B' and 'D':
//!!! Named properties 'm' of types 'B' and 'D' are not identical.
interface E extends D { } // No duplicate error here
Expected: errors as above
Actual: no errors
|
The early discussions about generics had this feature outlined, but it never
made it into the final version for some reason. Meanwhile, it would be
extremely helpful in some cases. The use case that is most dear to my heart is
Knockout:
interface Observable<T> {
<U>( this: U, value: T ): U;
}
var viewModel = {
Str: ko.observable( "abc" ),
Num: ko.observable( 5 )
};
viewModel.Str( "xyz" ).Num( 10 );
| 0 |
Hi guys, thanks for such a wonderful library. I want a special type of image
transformation.
This custom transformation should scale the image down maintaining the
original aspect ratio so that one dimension of the image is exactly equal to
the target dimensions and the other dimension of the image is greater than the
target dimensions.
Can I achieve this effect using `CENTER_CROP`, or should I write a custom
transformation? Please help me out.
If I need to write a custom transformation, please help me write it.
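For what it's worth, the scaling rule being described ("fill" the target while preserving the aspect ratio) comes down to taking the larger of the two per-axis scale factors. A Python sketch of just the arithmetic (the function name is made up; Glide itself is Java):

```python
def scale_to_fill(width, height, target_width, target_height):
    """Scale (width, height) preserving aspect ratio so that one dimension
    equals the target and the other is greater than or equal to it."""
    scale = max(target_width / width, target_height / height)
    return round(width * scale), round(height * scale)
```

For example, a 400x200 image filling a 100x100 target scales to 200x100: the height matches the target exactly and the width overflows, which is the part a center-crop would then trim away.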
|
**Glide Version/Integration library (if any)** : 3.6.0
**Device/Android Version** : any
**Issue details/Repro steps/Use case background** :
How to reproduce:
I have a demo app with this bug here: https://github.com/yrizk/glide-centercrop-bug
but essentially:
* no scale type declared in the XML
* centerCrop() when loading the image into an ImageView
* make sure the app is cleared from memory
The image will start without the center crop; then if you scroll past or
above it, sometimes it comes back with the correct scale type.
On a separate note, the library is really good! Thank you guys for building
and supporting it.
| 1 |
# Environment
Windows build number: Microsoft Windows [Version 10.0.18362.476]
Windows Terminal Version: 0.6.2951.0
Any other software?: NO
# Steps to reproduce
When I open the terminal on my Surface Laptop it works fine, but when I try to
drag the window to another monitor, the application crashes.
I have a Surface Laptop hooked up to a Surface Dock which is connected to 2
monitors, and the behavior appears on both of them.
# Expected behavior
The terminal stays open and may change its size and resolution depending
on the monitor.
# Actual behavior
The terminal window crashes and all the tabs in the window get closed.
|
🤣
| 0 |
https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-staging/6076/
Failed: Horizontal pod autoscaling (scale resource: CPU) [Serial] [Slow]
ReplicaSet Should scale from 1 pod to 3 pods and from 3 to 5 and verify
decision stability {Kubernetes e2e suite}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/horizontal_pod_autoscaling.go:61
Jun 30 14:37:23.294: timeout waiting 10m0s for pods size to be 5
Previous issues for this test: #27316 #27773
|
https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gke-serial/1619/
Failed: [k8s.io] [HPA] Horizontal pod autoscaling (scale resource: CPU)
[k8s.io] [Serial] [Slow] ReplicationController Should scale from 1 pod to 3
pods and from 3 to 5 and verify decision stability {Kubernetes e2e suite}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/horizontal_pod_autoscaling.go:70
Expected error:
<*errors.StatusError | 0xc8214e6200>: {
  ErrStatus: {
    TypeMeta: {Kind: "", APIVersion: ""},
    ListMeta: {SelfLink: "", ResourceVersion: ""},
    Status: "Failure",
    Message: "the server has asked for the client to provide credentials (get replicationControllers rc)",
    Reason: "Unauthorized",
    Details: {
      Name: "rc",
      Group: "",
      Kind: "replicationControllers",
      Causes: [
        {
          Type: "UnexpectedServerResponse",
          Message: "Unauthorized",
          Field: "",
        },
      ],
      RetryAfterSeconds: 0,
    },
    Code: 401,
  },
}
the server has asked for the client to provide credentials (get replicationControllers rc)
not to have occurred
Previous issues for this test: #27479 #27675
| 1 |
## 🐛 Bug
Hello, I’m running a GAN model with some second-order regularization. The
training went well at the beginning and the loss decreased. However, after a
random period of time, ranging from 3000-8000 updates, the segfault occurs and
the training was terminated automatically.
## To Reproduce
It is hard for me to simplify the code and reproduce it. However, it happens
every time I include the second-order gradient:
if pl_reg:
pl_dlatents.requires_grad_(True)
pl_fake = self.G(None, dlatents=pl_dlatents) # forward the generator
pl_noise = pl_fake.new(pl_fake.shape).normal_() / 1024
pl_grads = autograd.grad(
outputs=(pl_fake * pl_noise).sum(),
inputs=pl_dlatents,
grad_outputs=None,
create_graph=True,
retain_graph=True,
only_inputs=True,
)[0]
pl_lengths = pl_grads.pow(2).sum(1).mul(1/self.G.get_num_layers()).sqrt()
return pl_lengths
Another major difference for my generator design is that it has convolution
layers with **both** the input and weights calculated from other up-stream
networks.
class ModulatedConv2d(nn.Module):
def __init__(self, in_channels, out_channels, hidden_channels, kernel_size=3, stride=1, padding=1, dilation=1,
noisy=True, randomize_noise=True, up=False, demodulize=True, gain=1, lrmul=1):
super(ModulatedConv2d, self).__init__()
assert kernel_size >= 1 and kernel_size % 2 == 1
self.noisy = noisy
self.stride = stride
self.padding = padding
self.dilation = dilation
self.randomize_noise = randomize_noise
self.up = up
self.demodulize = demodulize
self.lrmul = lrmul
# Get weight.
fan_in = in_channels * kernel_size * kernel_size
self.runtime_coef = gain / math.sqrt(fan_in) * math.sqrt(lrmul)
self.weight = Parameter(torch.randn(out_channels, in_channels, kernel_size, kernel_size) / math.sqrt(lrmul), requires_grad=True) # [OIkk]
# Get bias.
self.bias = Parameter(torch.zeros(1, out_channels, 1, 1), requires_grad=True)
# Modulate layer.
self.mod = ScaleLinear(hidden_channels, in_channels, bias=True) # [BI] Transform incoming W to style.
# Noise scale.
if noisy:
self.noise_scale = Parameter(torch.zeros(1), requires_grad=True)
def forward(self, x, y, noise=None):
w = self.weight * self.runtime_coef
ww = w[np.newaxis] # [BOIkk] Introduce minibatch dimension.
# Modulate.
s = self.mod(y) + 1 # [BI] Add bias (initially 1).
ww = ww * s[:, np.newaxis, :, np.newaxis, np.newaxis] # [BOIkk] Scale input feature maps.
# Demodulate.
if self.demodulize:
d = torch.rsqrt(ww.pow(2).sum(dim=(2,3,4), keepdim=True) + 1e-8) # [BOIkk] Scaling factor.
ww = ww * d # [BOIkk] Scale output feature maps.
# Reshape/scale input.
B = y.size(0)
x = x.view(1, -1, *x.shape[2:]) # Fused [BIhw] => reshape minibatch to convolution groups [1(BI)hw].
w = ww.view(-1, *ww.shape[2:]) # [(BO)Ikk]
# Convolution with optional up/downsampling.
if self.up: x = F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=False)
x = F.conv2d(x, w, None, self.stride, self.padding, self.dilation, groups=B) # [1(BO)hw]
# Reshape/scale output.
x = x.view(B, -1, *x.shape[2:]) # [BOhw]
# Apply noise and bias
if self.noisy:
if self.randomize_noise: noise = x.new_empty(B, 1, *x.shape[2:]).normal_()
x += noise * self.noise_scale
x += self.bias * self.lrmul
return x
The above code happens inside a DataParallel module and results are gathered
afterward, followed by a loss to minimize pl_lengths.
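For readers following the shapes, the fused grouped-convolution trick in forward() amounts to the following bookkeeping (a numpy sketch with made-up small sizes, showing only the reshapes, not the convolution itself):

```python
import numpy as np

# Hypothetical sizes: B samples, O output / I input channels, k x k kernels.
B, O, I, k, H, W = 4, 16, 8, 3, 32, 32

w = np.ones((O, I, k, k))                 # shared weight [OIkk]
s = np.ones((B, I))                       # per-sample style, as from self.mod(y)
ww = w[np.newaxis] * s[:, np.newaxis, :, np.newaxis, np.newaxis]  # [BOIkk]

# Demodulate: normalize each sample's output filters.
d = 1.0 / np.sqrt((ww ** 2).sum(axis=(2, 3, 4), keepdims=True) + 1e-8)
ww = ww * d

# Fold the minibatch into the channel/group dimensions so a single
# grouped conv (groups=B) applies each sample's weights to its own input.
x = np.ones((B, I, H, W))
x_grouped = x.reshape(1, B * I, H, W)     # [1, (B*I), H, W]
w_grouped = ww.reshape(B * O, I, k, k)    # [(B*O), I, k, k]
# After F.conv2d(..., groups=B), the output is reshaped back to [B, O, H', W'].
```

The design choice here is that one grouped convolution with B groups replaces a Python loop over per-sample convolutions, at the cost of the view/reshape gymnastics above.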
## Expected behavior
After some random number of updates, around 3000-8000, the problem occurs and
training is terminated.
In PyTorch 1.3 it shows:
> *** Error in `python3’: double free or corruption (fasttop):
> 0x00007f9e280c3fe0 ***
> ======= Backtrace: =========
> /lib/x86_64-linux-gnu/libc.so.6(+0x777e5)[0x7fa1793127e5]
> /lib/x86_64-linux-gnu/libc.so.6(+0x8037a)[0x7fa17931b37a]
> /lib/x86_64-linux-gnu/libc.so.6(cfree+0x4c)[0x7fa17931f53c]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch.so(+0x39d820e)[0x7fa13bb9420e]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch.so(+0x39d82b9)[0x7fa13bb942b9]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch.so(+0x39d8435)[0x7fa13bb94435]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch.so(_ZN5torch8autograd6Engine17evaluate_functionERNS0_8NodeTaskE+0x1210)[0x7fa13bb8bb50]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch.so(_ZN5torch8autograd6Engine11thread_mainEPNS0_9GraphTaskE+0x1c4)[0x7fa13bb8da04]
> /opt/conda/lib/python3.6/site-
> packages/torch/lib/libtorch_python.so(_ZN5torch8autograd6python12PythonEngine11thread_initEi+0x2a)[0x7fa16a537eda]
> /opt/conda/lib/python3.6/site-
> packages/torch/…/…/…/libstdc++.so.6(+0xc819d)[0x7fa169ffb19d]
> /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba)[0x7fa17966c6ba]
> /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d)[0x7fa1793a241d]
and followed by a long memory map:
> ======= Memory map: ========
> 200000000-200200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 200200000-200400000 ---p 00000000 00:00 0
> 200400000-200600000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 200600000-202600000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 202600000-205600000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 205600000-206200000 ---p 00000000 00:00 0
> 206200000-206400000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 206400000-206600000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 206600000-206800000 rw-s 206600000 00:06 492 /dev/nvidia-uvm
> 206800000-206a00000 ---p 00000000 00:00 0
> 206a00000-206c00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 206c00000-206e00000 ---p 00000000 00:00 0
> 206e00000-207000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 207000000-207200000 ---p 00000000 00:00 0
> 207200000-207400000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 207400000-209400000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 209400000-20c400000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20c400000-20d000000 ---p 00000000 00:00 0
> 20d000000-20d200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20d200000-20d400000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20d400000-20d600000 rw-s 20d400000 00:06 492 /dev/nvidia-uvm
> 20d600000-20d800000 ---p 00000000 00:00 0
> 20d800000-20da00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20da00000-20dc00000 ---p 00000000 00:00 0
> 20dc00000-20de00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20de00000-20e000000 ---p 00000000 00:00 0
> 20e000000-20e200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 20e200000-210200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 210200000-213200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 213200000-213e00000 ---p 00000000 00:00 0
> 213e00000-214000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 214000000-214200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 214200000-214400000 rw-s 214200000 00:06 492 /dev/nvidia-uvm
> 214400000-214600000 ---p 00000000 00:00 0
> 214600000-214800000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 214800000-214a00000 ---p 00000000 00:00 0
> 214a00000-214c00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 214c00000-214e00000 ---p 00000000 00:00 0
> 214e00000-215000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 215000000-217000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 217000000-21a000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 21a000000-21ac00000 ---p 00000000 00:00 0
> 21ac00000-21ae00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 21ae00000-21b000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 21b000000-21b200000 rw-s 21b000000 00:06 492 /dev/nvidia-uvm
> 21b200000-21b400000 ---p 00000000 00:00 0
> 21b400000-21b600000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 21b600000-600200000 ---p 00000000 00:00 0
> 10000000000-10810000000 ---p 00000000 00:00 0
> 5578812bd000-557881314000 r--p 00000000 08:31 78220892
> /opt/conda/bin/python3.6
> 557881314000-5578814db000 r-xp 00057000 08:31 78220892
> /opt/conda/bin/python3.6
> 5578814db000-557881578000 r--p 0021e000 08:31 78220892
> /opt/conda/bin/python3.6
> 557881579000-55788157c000 r--p 002bb000 08:31 78220892
> /opt/conda/bin/python3.6
> 55788157c000-5578815df000 rw-p 002be000 08:31 78220892
> /opt/conda/bin/python3.6
> 5578815df000-557881610000 rw-p 00000000 00:00 0
> 557881cc4000-5579eb27e000 rw-p 00000000 00:00 0 [heap]
> 5579eb27e000-5579eb67e000 rw-p 00000000 00:00 0 [heap]
> 5579eb67e000-5579ed2c1000 rw-p 00000000 00:00 0 [heap]
> 5579ed2c1000-5579ee436000 rw-p 00000000 00:00 0 [heap]
> 5579ee436000-5579ee491000 rw-p 00000000 00:00 0 [heap]
> 5579ee491000-5579ef3d1000 rw-p 00000000 00:00 0 [heap]
> 5579ef3d1000-5579effc9000 rw-p 00000000 00:00 0 [heap]
> 7f98c0000000-7f9bc0a00000 ---p 00000000 00:00 0
> 7f9bc0a00000-7f9bc0c00000 rw-s 00000000 00:05 1587841 /dev/zero (deleted)
> 7f9bc0c00000-7f9c00000000 ---p 00000000 00:00 0
> 7f9c00000000-7f9c03d1e000 rw-p 00000000 00:00 0
> 7f9c03d1e000-7f9c04000000 ---p 00000000 00:00 0
> 7f9c04000000-7f9c06fa0000 rw-p 00000000 00:00 0
> 7f9c06fa0000-7f9c08000000 ---p 00000000 00:00 0
> 7f9c08000000-7f9cd0000000 ---p 00000000 00:00 0
> 7f9cd0000000-7f9cd2e22000 rw-p 00000000 00:00 0
> 7f9cd2e22000-7f9cd4000000 ---p 00000000 00:00 0
> 7f9cd6000000-7f9dd0a00000 ---p 00000000 00:00 0
> 7f9dd0a00000-7f9dd0c00000 rw-s 00000000 00:05 1604789 /dev/zero (deleted)
> 7f9dd0c00000-7f9dd8000000 ---p 00000000 00:00 0
> 7f9dd8000000-7f9ddaeae000 rw-p 00000000 00:00 0
> 7f9ddaeae000-7f9ddc000000 ---p 00000000 00:00 0
> 7f9ddc000000-7f9ddfd11000 rw-p 00000000 00:00 0
> 7f9ddfd11000-7f9de0000000 ---p 00000000 00:00 0
> 7f9de0000000-7f9e18000000 ---p 00000000 00:00 0
> 7f9e18000000-7f9e1814b000 rw-p 00000000 00:00 0
> 7f9e1814b000-7f9e1c000000 ---p 00000000 00:00 0
> 7f9e1c000000-7f9e1fd09000 rw-p 00000000 00:00 0
> 7f9e1fd09000-7f9e20000000 ---p 00000000 00:00 0
> 7f9e20000000-7f9e20150000 rw-p 00000000 00:00 0
> 7f9e20150000-7f9e24000000 ---p 00000000 00:00 0
> 7f9e28000000-7f9e282e5000 rw-p 00000000 00:00 0
> 7f9e282e5000-7f9e2c000000 ---p 00000000 00:00 0
> 7f9e2c000000-7f9e2c14b000 rw-p 00000000 00:00 0
> 7f9e2c14b000-7f9e30000000 ---p 00000000 00:00 0
> 7f9e30000000-7f9e30022000 rw-p 00000000 00:00 0
> 7f9e30022000-7f9e34000000 ---p 00000000 00:00 0
> 7f9e36000000-7f9ef8a00000 ---p 00000000 00:00 0
> 7f9ef8a00000-7f9ef8c00000 rw-s 00000000 00:05 1582741 /dev/zero (deleted)
> 7f9ef8c00000-7f9f00000000 ---p 00000000 00:00 0
> 7f9f00000000-7f9f02e3a000 rw-p 00000000 00:00 0
> 7f9f02e3a000-7f9f04000000 ---p 00000000 00:00 0
> 7f9f05000000-7f9f05040000 rw-p 00000000 00:00 0
> 7f9f06000000-7f9f08000000 ---p 00000000 00:00 0
> 7f9f08000000-7f9f0bffc000 rw-p 00000000 00:00 0
> 7f9f0bffc000-7f9f0c000000 ---p 00000000 00:00 0
> 7f9f0c000000-7f9f0fd27000 rw-p 00000000 00:00 0
> 7f9f0fd27000-7f9f10000000 ---p 00000000 00:00 0
> 7f9f10000000-7f9f13ffe000 rw-p 00000000 00:00 0
> 7f9f13ffe000-7f9f14000000 ---p 00000000 00:00 0
> 7f9f14000000-7f9f18000000 rw-p 00000000 00:00 0
> 7f9f18000000-7f9f1bffd000 rw-p 00000000 00:00 0
> 7f9f1bffd000-7f9f1c000000 ---p 00000000 00:00 0
> 7f9f1d166000-7f9f1d1a6000 rw-p 00000000 00:00 0
> 7f9f1d266000-7f9f1d2a6000 rw-p 00000000 00:00 0
> 7f9f1d2e5000-7f9f1d2e6000 ---p 00000000 00:00 0
> 7f9f1d2e6000-7f9f1dae6000 rw-p 00000000 00:00 0
> 7f9f1dae6000-7f9f1daf9000 r-xp 00000000 08:31 85822789
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/mvn.cpython-36m-x86_64-linux-gnu.so
> 7f9f1daf9000-7f9f1dcf8000 ---p 00013000 08:31 85822789
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/mvn.cpython-36m-x86_64-linux-gnu.so
> 7f9f1dcf8000-7f9f1dcfa000 rw-p 00012000 08:31 85822789
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/mvn.cpython-36m-x86_64-linux-gnu.so
> 7f9f1dcfa000-7f9f1ddf1000 rw-p 00000000 00:00 0
> 7f9f1ddf1000-7f9f1ddf3000 rw-p 00015000 08:31 85822789
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/mvn.cpython-36m-x86_64-linux-gnu.so
> 7f9f1ddf3000-7f9f1ddfd000 r-xp 00000000 08:31 85822791
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/statlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f1ddfd000-7f9f1dffc000 ---p 0000a000 08:31 85822791
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/statlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f1dffc000-7f9f1dffe000 rw-p 00009000 08:31 85822791
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/statlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f1dffe000-7f9f1e000000 rw-p 0000c000 08:31 85822791
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/statlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f1e000000-7f9f47200000 ---p 00000000 00:00 0
> 7f9f47200000-7f9f47400000 rw-s 00000000 00:05 1582740 /dev/zero (deleted)
> 7f9f47400000-7f9f47e00000 ---p 00000000 00:00 0
> 7f9f47e00000-7f9f48000000 rw-s 00000000 00:05 1590697 /dev/zero (deleted)
> 7f9f48000000-7f9f48021000 rw-p 00000000 00:00 0
> 7f9f48021000-7f9f4c000000 ---p 00000000 00:00 0
> 7f9f4c021000-7f9f4c094000 r-xp 00000000 08:31 85822779
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/_stats.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c094000-7f9f4c294000 ---p 00073000 08:31 85822779
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/_stats.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c294000-7f9f4c29a000 rw-p 00073000 08:31 85822779
> /opt/conda/lib/python3.6/site-
> packages/scipy/stats/_stats.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c29a000-7f9f4c29c000 rw-p 00000000 00:00 0
> 7f9f4c29c000-7f9f4c2f0000 r-xp 00000000 08:31 85429655
> /opt/conda/lib/python3.6/site-
> packages/scipy/interpolate/interpnd.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c2f0000-7f9f4c4f0000 ---p 00054000 08:31 85429655
> /opt/conda/lib/python3.6/site-
> packages/scipy/interpolate/interpnd.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c4f0000-7f9f4c4f5000 rw-p 00054000 08:31 85429655
> /opt/conda/lib/python3.6/site-
> packages/scipy/interpolate/interpnd.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c4f5000-7f9f4c4f6000 rw-p 00000000 00:00 0
> 7f9f4c4f6000-7f9f4c546000 r-xp 00000000 08:31 85429651 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_ppoly.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c546000-7f9f4c746000 ---p 00050000 08:31 85429651 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_ppoly.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c746000-7f9f4c74c000 rw-p 00050000 08:31 85429651 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_ppoly.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c74c000-7f9f4c74d000 rw-p 00000000 00:00 0
> 7f9f4c74d000-7f9f4c78a000 r-xp 00000000 08:31 85429645 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_bspl.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c78a000-7f9f4c989000 ---p 0003d000 08:31 85429645 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_bspl.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c989000-7f9f4c98f000 rw-p 0003c000 08:31 85429645 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_bspl.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c98f000-7f9f4c990000 rw-p 00000000 00:00 0
> 7f9f4c990000-7f9f4c993000 rw-p 00043000 08:31 85429645 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_bspl.cpython-36m-x86_64-linux-gnu.so
> 7f9f4c993000-7f9f50000000 rw-p 00000000 00:00 0
> 7f9f50000000-7f9f50021000 rw-p 00000000 00:00 0
> 7f9f50021000-7f9f54000000 ---p 00000000 00:00 0
> 7f9f54000000-7f9f5d400000 ---p 00000000 00:00 0
> 7f9f5d400000-7f9f5d600000 rw-s 00000000 00:05 1567619 /dev/zero (deleted)
> 7f9f5d600000-7f9f5d800000 ---p 00000000 00:00 0
> 7f9f5d800000-7f9f5da00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f5da00000-7f9f5dc00000 ---p 00000000 00:00 0
> 7f9f5dc00000-7f9f5de00000 rw-s 00000000 00:05 1567621 /dev/zero (deleted)
> 7f9f5de00000-7f9f5e0d6000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f5e0d6000-7f9f63200000 ---p 00000000 00:00 0
> 7f9f63200000-7f9f63400000 rw-s 00000000 00:05 1598092 /dev/zero (deleted)
> 7f9f63400000-7f9f63c00000 ---p 00000000 00:00 0
> 7f9f63c00000-7f9f63e00000 rw-s 00000000 00:05 1567618 /dev/zero (deleted)
> 7f9f63e00000-7f9f64000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f64000000-7f9f64021000 rw-p 00000000 00:00 0
> 7f9f64021000-7f9f68000000 ---p 00000000 00:00 0
> 7f9f680b9000-7f9f68117000 r-xp 00000000 08:31 85429652 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/dfitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68117000-7f9f68317000 ---p 0005e000 08:31 85429652 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/dfitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68317000-7f9f6831e000 rw-p 0005e000 08:31 85429652 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/dfitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f6831e000-7f9f68320000 rw-p 00066000 08:31 85429652 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/dfitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68320000-7f9f68354000 r-xp 00000000 08:31 85429648 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_fitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68354000-7f9f68554000 ---p 00034000 08:31 85429648 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_fitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68554000-7f9f68555000 rw-p 00034000 08:31 85429648 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_fitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68555000-7f9f68557000 rw-p 00036000 08:31 85429648 /opt/conda/lib/python3.6/site-packages/scipy/interpolate/_fitpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f68557000-7f9f68570000 r-xp 00000000 08:31 85429603 /opt/conda/lib/python3.6/site-packages/scipy/integrate/lsoda.cpython-36m-x86_64-linux-gnu.so
> 7f9f68570000-7f9f6876f000 ---p 00019000 08:31 85429603 /opt/conda/lib/python3.6/site-packages/scipy/integrate/lsoda.cpython-36m-x86_64-linux-gnu.so
> 7f9f6876f000-7f9f68774000 rw-p 00018000 08:31 85429603 /opt/conda/lib/python3.6/site-packages/scipy/integrate/lsoda.cpython-36m-x86_64-linux-gnu.so
> 7f9f68774000-7f9f6878f000 r-xp 00000000 08:31 85429576 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_dop.cpython-36m-x86_64-linux-gnu.so
> 7f9f6878f000-7f9f6898e000 ---p 0001b000 08:31 85429576 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_dop.cpython-36m-x86_64-linux-gnu.so
> 7f9f6898e000-7f9f68990000 rw-p 0001a000 08:31 85429576 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_dop.cpython-36m-x86_64-linux-gnu.so
> 7f9f68990000-7f9f68993000 rw-p 0001d000 08:31 85429576 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_dop.cpython-36m-x86_64-linux-gnu.so
> 7f9f68993000-7f9f6c000000 rw-p 00000000 00:00 0
> 7f9f6c000000-7f9f6c021000 rw-p 00000000 00:00 0
> 7f9f6c021000-7f9f70000000 ---p 00000000 00:00 0
> 7f9f700e8000-7f9f70128000 rw-p 00000000 00:00 0
> 7f9f701e8000-7f9f70219000 r-xp 00000000 08:31 85429628 /opt/conda/lib/python3.6/site-packages/scipy/integrate/vode.cpython-36m-x86_64-linux-gnu.so
> 7f9f70219000-7f9f70418000 ---p 00031000 08:31 85429628 /opt/conda/lib/python3.6/site-packages/scipy/integrate/vode.cpython-36m-x86_64-linux-gnu.so
> 7f9f70418000-7f9f7041a000 rw-p 00030000 08:31 85429628 /opt/conda/lib/python3.6/site-packages/scipy/integrate/vode.cpython-36m-x86_64-linux-gnu.so
> 7f9f7041a000-7f9f7041b000 rw-p 00000000 00:00 0
> 7f9f7041b000-7f9f7041f000 rw-p 00033000 08:31 85429628 /opt/conda/lib/python3.6/site-packages/scipy/integrate/vode.cpython-36m-x86_64-linux-gnu.so
> 7f9f7041f000-7f9f70439000 r-xp 00000000 08:31 85429600 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_quadpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70439000-7f9f70639000 ---p 0001a000 08:31 85429600 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_quadpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70639000-7f9f7063a000 rw-p 0001a000 08:31 85429600 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_quadpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f7063a000-7f9f7063d000 rw-p 0001c000 08:31 85429600 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_quadpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f7063d000-7f9f70653000 r-xp 00000000 08:31 85429598 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_odepack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70653000-7f9f70852000 ---p 00016000 08:31 85429598 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_odepack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70852000-7f9f70853000 rw-p 00015000 08:31 85429598 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_odepack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70853000-7f9f70854000 rw-p 00000000 00:00 0
> 7f9f70854000-7f9f70857000 rw-p 00017000 08:31 85429598 /opt/conda/lib/python3.6/site-packages/scipy/integrate/_odepack.cpython-36m-x86_64-linux-gnu.so
> 7f9f70857000-7f9f7085a000 r-xp 00000000 08:31 85560786 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsap_module.cpython-36m-x86_64-linux-gnu.so
> 7f9f7085a000-7f9f70a5a000 ---p 00003000 08:31 85560786 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsap_module.cpython-36m-x86_64-linux-gnu.so
> 7f9f70a5a000-7f9f70a5b000 rw-p 00003000 08:31 85560786 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsap_module.cpython-36m-x86_64-linux-gnu.so
> 7f9f70a5b000-7f9f70a9d000 r-xp 00000000 08:31 85560771 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_bglu_dense.cpython-36m-x86_64-linux-gnu.so
> 7f9f70a9d000-7f9f70c9c000 ---p 00042000 08:31 85560771 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_bglu_dense.cpython-36m-x86_64-linux-gnu.so
> 7f9f70c9c000-7f9f70ca1000 rw-p 00041000 08:31 85560771 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_bglu_dense.cpython-36m-x86_64-linux-gnu.so
> 7f9f70ca1000-7f9f70ca2000 rw-p 00000000 00:00 0
> 7f9f70ca2000-7f9f70cac000 r-xp 00000000 08:31 85560810 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_nnls.cpython-36m-x86_64-linux-gnu.so
> 7f9f70cac000-7f9f70eac000 ---p 0000a000 08:31 85560810 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_nnls.cpython-36m-x86_64-linux-gnu.so
> 7f9f70eac000-7f9f70ead000 rw-p 0000a000 08:31 85560810 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_nnls.cpython-36m-x86_64-linux-gnu.so
> 7f9f70ead000-7f9f70eaf000 rw-p 0000c000 08:31 85560810 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_nnls.cpython-36m-x86_64-linux-gnu.so
> 7f9f70eaf000-7f9f70eb2000 r-xp 00000000 08:31 85560872 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_zeros.cpython-36m-x86_64-linux-gnu.so
> 7f9f70eb2000-7f9f710b1000 ---p 00003000 08:31 85560872 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_zeros.cpython-36m-x86_64-linux-gnu.so
> 7f9f710b1000-7f9f710b2000 rw-p 00002000 08:31 85560872 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_zeros.cpython-36m-x86_64-linux-gnu.so
> 7f9f710b2000-7f9f710d9000 r-xp 00000000 08:31 85560802 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsq/givens_elimination.cpython-36m-x86_64-linux-gnu.so
> 7f9f710d9000-7f9f712d8000 ---p 00027000 08:31 85560802 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsq/givens_elimination.cpython-36m-x86_64-linux-gnu.so
> 7f9f712d8000-7f9f712dc000 rw-p 00026000 08:31 85560802 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lsq/givens_elimination.cpython-36m-x86_64-linux-gnu.so
> 7f9f712dc000-7f9f712fb000 r-xp 00000000 08:31 85560809 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_minpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f712fb000-7f9f714fb000 ---p 0001f000 08:31 85560809 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_minpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f714fb000-7f9f714fe000 rw-p 0001f000 08:31 85560809 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_minpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f714fe000-7f9f71516000 r-xp 00000000 08:31 85560825 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_slsqp.cpython-36m-x86_64-linux-gnu.so
> 7f9f71516000-7f9f71715000 ---p 00018000 08:31 85560825 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_slsqp.cpython-36m-x86_64-linux-gnu.so
> 7f9f71715000-7f9f71719000 rw-p 00017000 08:31 85560825 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_slsqp.cpython-36m-x86_64-linux-gnu.so
> 7f9f71719000-7f9f71737000 r-xp 00000000 08:31 85560772 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_cobyla.cpython-36m-x86_64-linux-gnu.so
> 7f9f71737000-7f9f71937000 ---p 0001e000 08:31 85560772 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_cobyla.cpython-36m-x86_64-linux-gnu.so
> 7f9f71937000-7f9f71938000 rw-p 0001e000 08:31 85560772 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_cobyla.cpython-36m-x86_64-linux-gnu.so
> 7f9f71938000-7f9f7193a000 rw-p 00020000 08:31 85560772 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_cobyla.cpython-36m-x86_64-linux-gnu.so
> 7f9f7193a000-7f9f71946000 r-xp 00000000 08:31 85560888 /opt/conda/lib/python3.6/site-packages/scipy/optimize/moduleTNC.cpython-36m-x86_64-linux-gnu.so
> 7f9f71946000-7f9f71b45000 ---p 0000c000 08:31 85560888 /opt/conda/lib/python3.6/site-packages/scipy/optimize/moduleTNC.cpython-36m-x86_64-linux-gnu.so
> 7f9f71b45000-7f9f71b46000 rw-p 0000b000 08:31 85560888 /opt/conda/lib/python3.6/site-packages/scipy/optimize/moduleTNC.cpython-36m-x86_64-linux-gnu.so
> 7f9f71b46000-7f9f71b63000 r-xp 00000000 08:31 85560779 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lbfgsb.cpython-36m-x86_64-linux-gnu.so
> 7f9f71b63000-7f9f71d63000 ---p 0001d000 08:31 85560779 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lbfgsb.cpython-36m-x86_64-linux-gnu.so
> 7f9f71d63000-7f9f71d64000 rw-p 0001d000 08:31 85560779 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lbfgsb.cpython-36m-x86_64-linux-gnu.so
> 7f9f71d64000-7f9f71d67000 rw-p 0001f000 08:31 85560779 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_lbfgsb.cpython-36m-x86_64-linux-gnu.so
> 7f9f71d67000-7f9f71def000 r-xp 00000000 08:31 85691771 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f71def000-7f9f71fef000 ---p 00088000 08:31 85691771 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f71fef000-7f9f71ffa000 rw-p 00088000 08:31 85691771 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f71ffa000-7f9f71ffc000 rw-p 00000000 00:00 0
> 7f9f71ffc000-7f9f72000000 rw-p 00094000 08:31 85691771 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/eigen/arpack/_arpack.cpython-36m-x86_64-linux-gnu.so
> 7f9f72000000-7f9f7a000000 ---p 00000000 00:00 0
> 7f9f7a00a000-7f9f7a035000 r-xp 00000000 08:31 85560777 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_group_columns.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a035000-7f9f7a235000 ---p 0002b000 08:31 85560777 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_group_columns.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a235000-7f9f7a238000 rw-p 0002b000 08:31 85560777 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_group_columns.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a238000-7f9f7a239000 rw-p 00000000 00:00 0
> 7f9f7a239000-7f9f7a286000 r-xp 00000000 08:31 85691749 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/dsolve/_superlu.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a286000-7f9f7a485000 ---p 0004d000 08:31 85691749 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/dsolve/_superlu.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a485000-7f9f7a487000 rw-p 0004c000 08:31 85691749 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/dsolve/_superlu.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a487000-7f9f7a48a000 rw-p 0004f000 08:31 85691749 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/dsolve/_superlu.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a48a000-7f9f7a4bb000 r-xp 00000000 08:31 85691809 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/isolve/_iterative.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a4bb000-7f9f7a6bb000 ---p 00031000 08:31 85691809 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/isolve/_iterative.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a6bb000-7f9f7a6c2000 rw-p 00031000 08:31 85691809 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/isolve/_iterative.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a6c2000-7f9f7a6c4000 rw-p 00000000 00:00 0
> 7f9f7a6c4000-7f9f7a6c7000 rw-p 00039000 08:31 85691809 /opt/conda/lib/python3.6/site-packages/scipy/sparse/linalg/isolve/_iterative.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a6c7000-7f9f7a710000 r-xp 00000000 08:31 85560832 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_trlib/_trlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a710000-7f9f7a910000 ---p 00049000 08:31 85560832 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_trlib/_trlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a910000-7f9f7a914000 rw-p 00049000 08:31 85560832 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_trlib/_trlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a914000-7f9f7a915000 rw-p 00000000 00:00 0
> 7f9f7a915000-7f9f7a919000 rw-p 0004e000 08:31 85560832 /opt/conda/lib/python3.6/site-packages/scipy/optimize/_trlib/_trlib.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a919000-7f9f7a922000 r-xp 00000000 08:31 85560887 /opt/conda/lib/python3.6/site-packages/scipy/optimize/minpack2.cpython-36m-x86_64-linux-gnu.so
> 7f9f7a922000-7f9f7ab22000 ---p 00009000 08:31 85560887 /opt/conda/lib/python3.6/site-packages/scipy/optimize/minpack2.cpython-36m-x86_64-linux-gnu.so
> 7f9f7ab22000-7f9f7ab23000 rw-p 00009000 08:31 85560887 /opt/conda/lib/python3.6/site-packages/scipy/optimize/minpack2.cpython-36m-x86_64-linux-gnu.so
> 7f9f7ab23000-7f9f7ab25000 rw-p 0000b000 08:31 85560887 /opt/conda/lib/python3.6/site-packages/scipy/optimize/minpack2.cpython-36m-x86_64-linux-gnu.so
> 7f9f7ab25000-7f9f7ab26000 ---p 00000000 00:00 0
> 7f9f7ab26000-7f9f7e993000 rw-p 00000000 00:00 0
> 7f9f7eb8f000-7f9f7ebd8000 r-xp 00000000 08:31 85560677 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_ni_label.cpython-36m-x86_64-linux-gnu.so
> 7f9f7ebd8000-7f9f7edd8000 ---p 00049000 08:31 85560677 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_ni_label.cpython-36m-x86_64-linux-gnu.so
> 7f9f7edd8000-7f9f7eddc000 rw-p 00049000 08:31 85560677 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_ni_label.cpython-36m-x86_64-linux-gnu.so
> 7f9f7eddc000-7f9f7edde000 rw-p 00000000 00:00 0
> 7f9f7edde000-7f9f7edf9000 r-xp 00000000 08:31 85560675 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_nd_image.cpython-36m-x86_64-linux-gnu.so
> 7f9f7edf9000-7f9f7eff9000 ---p 0001b000 08:31 85560675 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_nd_image.cpython-36m-x86_64-linux-gnu.so
> 7f9f7eff9000-7f9f7effa000 rw-p 0001b000 08:31 85560675 /opt/conda/lib/python3.6/site-packages/scipy/ndimage/_nd_image.cpython-36m-x86_64-linux-gnu.so
> 7f9f7effa000-7f9f7effb000 ---p 00000000 00:00 0
> 7f9f7effb000-7f9f7f7fb000 rw-p 00000000 00:00 0
> 7f9f7f7fb000-7f9f7f7fc000 ---p 00000000 00:00 0
> 7f9f7f7fc000-7f9f7fffc000 rw-p 00000000 00:00 0
> 7f9f7fffc000-7f9f7fffd000 ---p 00000000 00:00 0
> 7f9f7fffd000-7f9f807fd000 rw-p 00000000 00:00 0
> 7f9f807fd000-7f9f807fe000 ---p 00000000 00:00 0
> 7f9f807fe000-7f9f80ffe000 rw-p 00000000 00:00 0
> 7f9f811ff000-7f9f81200000 ---p 00000000 00:00 0
> 7f9f81200000-7f9f81a00000 rw-p 00000000 00:00 0
> 7f9f82000000-7f9f83400000 ---p 00000000 00:00 0
> 7f9f83400000-7f9f83600000 rw-s 00000000 00:05 1579907 /dev/zero (deleted)
> 7f9f83600000-7f9f83800000 ---p 00000000 00:00 0
> 7f9f83800000-7f9f83a00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f83a00000-7f9f83c00000 ---p 00000000 00:00 0
> 7f9f83c00000-7f9f83e00000 rw-s 00000000 00:05 1579909 /dev/zero (deleted)
> 7f9f83e00000-7f9f840d6000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f840d6000-7f9f89200000 ---p 00000000 00:00 0
> 7f9f89200000-7f9f89400000 rw-s 00000000 00:05 1603718 /dev/zero (deleted)
> 7f9f89400000-7f9f89c00000 ---p 00000000 00:00 0
> 7f9f89c00000-7f9f89e00000 rw-s 00000000 00:05 1579906 /dev/zero (deleted)
> 7f9f89e00000-7f9f8a000000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f8a000000-7f9f8a600000 rw-s 00000000 00:05 1595906 /dev/zero (deleted)
> 7f9f8a600000-7f9f8ac00000 rw-s 00000000 00:05 1595908 /dev/zero (deleted)
> 7f9f8ac00000-7f9f92400000 ---p 00000000 00:00 0
> 7f9f92400000-7f9f92a00000 rw-s 00000000 00:05 1599848 /dev/zero (deleted)
> 7f9f92a00000-7f9f93000000 rw-s 00000000 00:05 1579899 /dev/zero (deleted)
> 7f9f93000000-7f9f93600000 rw-s 00000000 00:05 1579901 /dev/zero (deleted)
> 7f9f93600000-7f9f93800000 ---p 00000000 00:00 0
> 7f9f93800000-7f9f93e00000 rw-s 00000000 00:05 1589911 /dev/zero (deleted)
> 7f9f93e00000-7f9f95200000 ---p 00000000 00:00 0
> 7f9f95200000-7f9f95800000 rw-s 00000000 00:05 1599844 /dev/zero (deleted)
> 7f9f95800000-7f9f95a00000 ---p 00000000 00:00 0
> 7f9f95a00000-7f9f96000000 rw-s 00000000 00:05 1599846 /dev/zero (deleted)
> 7f9f96000000-7f9f96600000 rw-s 00000000 00:05 1599843 /dev/zero (deleted)
> 7f9f96600000-7f9f96800000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f96800000-7f9f96a00000 ---p 00000000 00:00 0
> 7f9f96a00000-7f9f97000000 rw-s 00000000 00:05 1589908 /dev/zero (deleted)
> 7f9f97000000-7f9f97200000 rw-s 00000000 00:05 1603717 /dev/zero (deleted)
> 7f9f97200000-7f9f974d6000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f974d6000-7f9f97600000 ---p 00000000 00:00 0
> 7f9f97600000-7f9f97c00000 rw-s 00000000 00:05 1589910 /dev/zero (deleted)
> 7f9f97c00000-7f9f98000000 ---p 00000000 00:00 0
> 7f9f98000000-7f9f98021000 rw-p 00000000 00:00 0
> 7f9f98021000-7f9f9c000000 ---p 00000000 00:00 0
> 7f9f9c109000-7f9f9c11b000 r-xp 00000000 08:31 85692000 /opt/conda/lib/python3.6/site-packages/scipy/special/_ellip_harm_2.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c11b000-7f9f9c31b000 ---p 00012000 08:31 85692000 /opt/conda/lib/python3.6/site-packages/scipy/special/_ellip_harm_2.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c31b000-7f9f9c320000 rw-p 00012000 08:31 85692000 /opt/conda/lib/python3.6/site-packages/scipy/special/_ellip_harm_2.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c320000-7f9f9c3d5000 r-xp 00000000 08:31 85692041 /opt/conda/lib/python3.6/site-packages/scipy/special/specfun.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c3d5000-7f9f9c5d5000 ---p 000b5000 08:31 85692041 /opt/conda/lib/python3.6/site-packages/scipy/special/specfun.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c5d5000-7f9f9c5de000 rw-p 000b5000 08:31 85692041 /opt/conda/lib/python3.6/site-packages/scipy/special/specfun.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c5de000-7f9f9c5fb000 r-xp 00000000 08:31 85692032 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs_cxx.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c5fb000-7f9f9c7fb000 ---p 0001d000 08:31 85692032 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs_cxx.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c7fb000-7f9f9c7fc000 rw-p 0001d000 08:31 85692032 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs_cxx.cpython-36m-x86_64-linux-gnu.so
> 7f9f9c7fc000-7f9f9c7fd000 rw-p 00000000 00:00 0
> 7f9f9c7fd000-7f9f9c7fe000 ---p 00000000 00:00 0
> 7f9f9c7fe000-7f9f9cffe000 rw-p 00000000 00:00 0
> 7f9f9cffe000-7f9f9cfff000 ---p 00000000 00:00 0
> 7f9f9cfff000-7f9f9d7ff000 rw-p 00000000 00:00 0
> 7f9f9d7ff000-7f9f9d800000 ---p 00000000 00:00 0
> 7f9f9d800000-7f9f9e000000 rw-p 00000000 00:00 0
> 7f9f9e000000-7f9f9e600000 rw-s 00000000 00:05 1587833 /dev/zero (deleted)
> 7f9f9e600000-7f9f9e800000 ---p 00000000 00:00 0
> 7f9f9e800000-7f9f9ee00000 rw-s 00000000 00:05 1599841 /dev/zero (deleted)
> 7f9f9ee00000-7f9f9f000000 rw-s 00000000 00:05 1589158 /dev/zero (deleted)
> 7f9f9f000000-7f9f9f200000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7f9f9f200000-7f9f9f800000 rw-s 00000000 00:05 1603715 /dev/zero (deleted)
> 7f9f9f800000-7f9f9fa00000 rw-s 00000000 00:05 1589159 /dev/zero (deleted)
> 7f9f9fa00000-7f9fa0000000 ---p 00000000 00:00 0
> 7f9fa0000000-7f9fa0600000 rw-s 00000000 00:05 1589905 /dev/zero (deleted)
> 7f9fa0600000-7f9fa0c00000 rw-s 00000000 00:05 1589906 /dev/zero (deleted)
> 7f9fa0c00000-7f9fa1200000 rw-s 00000000 00:05 1589154 /dev/zero (deleted)
> 7f9fa1200000-7f9fa1800000 rw-s 00000000 00:05 1589156 /dev/zero (deleted)
> 7f9fa1800000-7f9fa1e00000 rw-s 00000000 00:05 1580930 /dev/zero (deleted)
> 7f9fa1e00000-7f9fa2000000 ---p 00000000 00:00 0
> 7f9fa2000000-7f9fa2600000 rw-s 00000000 00:05 1589895 /dev/zero (deleted)
> 7f9fa2600000-7f9fa2c00000 rw-s 00000000 00:05 1589897 /dev/zero (deleted)
> 7f9fa2c00000-7f9fa3200000 rw-s 00000000 00:05 1589899 /dev/zero (deleted)
> 7f9fa3200000-7f9fa3800000 rw-s 00000000 00:05 1589901 /dev/zero (deleted)
> 7f9fa3800000-7f9fa3e00000 rw-s 00000000 00:05 1589903 /dev/zero (deleted)
> 7f9fa3e00000-7f9fa4000000 ---p 00000000 00:00 0
> 7f9fa4000000-7f9fa4021000 rw-p 00000000 00:00 0
> 7f9fa4021000-7f9fa8000000 ---p 00000000 00:00 0
> 7f9fa800d000-7f9fa8013000 r-xp 00000000 08:31 85691998 /opt/conda/lib/python3.6/site-packages/scipy/special/_comb.cpython-36m-x86_64-linux-gnu.so
> 7f9fa8013000-7f9fa8213000 ---p 00006000 08:31 85691998 /opt/conda/lib/python3.6/site-packages/scipy/special/_comb.cpython-36m-x86_64-linux-gnu.so
> 7f9fa8213000-7f9fa8214000 rw-p 00006000 08:31 85691998 /opt/conda/lib/python3.6/site-packages/scipy/special/_comb.cpython-36m-x86_64-linux-gnu.so
> 7f9fa8214000-7f9fa83b7000 r-xp 00000000 08:31 85692031 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs.cpython-36m-x86_64-linux-gnu.so
> 7f9fa83b7000-7f9fa85b7000 ---p 001a3000 08:31 85692031 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs.cpython-36m-x86_64-linux-gnu.so
> 7f9fa85b7000-7f9fa85c1000 rw-p 001a3000 08:31 85692031 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs.cpython-36m-x86_64-linux-gnu.so
> 7f9fa85c1000-7f9fa85ca000 rw-p 00000000 00:00 0
> 7f9fa85ca000-7f9fa85cf000 rw-p 001ae000 08:31 85692031 /opt/conda/lib/python3.6/site-packages/scipy/special/_ufuncs.cpython-36m-x86_64-linux-gnu.so
> 7f9fa85cf000-7f9fa85f9000 r-xp 00000000 08:31 85691895 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_hausdorff.cpython-36m-x86_64-linux-gnu.so
> 7f9fa85f9000-7f9fa87f9000 ---p 0002a000 08:31 85691895 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_hausdorff.cpython-36m-x86_64-linux-gnu.so
> 7f9fa87f9000-7f9fa87fc000 rw-p 0002a000 08:31 85691895 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_hausdorff.cpython-36m-x86_64-linux-gnu.so
> 7f9fa87fc000-7f9fa87fd000 rw-p 00000000 00:00 0
> 7f9fa87fd000-7f9fa87fe000 ---p 00000000 00:00 0
> 7f9fa87fe000-7f9fa8ffe000 rw-p 00000000 00:00 0
> 7f9fa8ffe000-7f9fa8fff000 ---p 00000000 00:00 0
> 7f9fa8fff000-7f9fa97ff000 rw-p 00000000 00:00 0
> 7f9fa97ff000-7f9fa9800000 ---p 00000000 00:00 0
> 7f9fa9800000-7f9faa000000 rw-p 00000000 00:00 0
> 7f9faa000000-7f9fab400000 ---p 00000000 00:00 0
> 7f9fab400000-7f9faba00000 rw-s 00000000 00:05 1493310 /dev/zero (deleted)
> 7f9faba00000-7f9fac000000 rw-s 00000000 00:05 1589892 /dev/zero (deleted)
> 7f9fac000000-7f9facb90000 rw-p 00000000 00:00 0
> 7f9facb90000-7f9fb0000000 ---p 00000000 00:00 0
> 7f9fb0000000-7f9fb098b000 rw-p 00000000 00:00 0
> 7f9fb098b000-7f9fb4000000 ---p 00000000 00:00 0
> 7f9fb4000000-7f9fb444b000 rw-p 00000000 00:00 0
> 7f9fb444b000-7f9fb8000000 ---p 00000000 00:00 0
> 7f9fb804c000-7f9fb808c000 rw-p 00000000 00:00 0
> 7f9fb808c000-7f9fb808d000 ---p 00000000 00:00 0
> 7f9fb808d000-7f9fb888d000 rw-p 00000000 00:00 0
> 7f9fb888d000-7f9fb888e000 ---p 00000000 00:00 0
> 7f9fb888e000-7f9fb908e000 rw-p 00000000 00:00 0
> 7f9fb908e000-7f9fba28c000 r-xp 00000000 08:31 80973568 /opt/conda/lib/libnvrtc.so.10.1.168
> 7f9fba28c000-7f9fba48b000 ---p 011fe000 08:31 80973568 /opt/conda/lib/libnvrtc.so.10.1.168
> 7f9fba48b000-7f9fba70e000 r--p 011fd000 08:31 80973568 /opt/conda/lib/libnvrtc.so.10.1.168
> 7f9fba70e000-7f9fba755000 rw-p 01480000 08:31 80973568 /opt/conda/lib/libnvrtc.so.10.1.168
> 7f9fba755000-7f9fba7fc000 rw-p 00000000 00:00 0
> 7f9fba7fc000-7f9fba7fd000 rw-p 014c7000 08:31 80973568 /opt/conda/lib/libnvrtc.so.10.1.168
> 7f9fba7fd000-7f9fba7fe000 ---p 00000000 00:00 0
> 7f9fba7fe000-7f9fbaffe000 rw-p 00000000 00:00 0
> 7f9fbaffe000-7f9fbafff000 ---p 00000000 00:00 0
> 7f9fbafff000-7f9fbb7ff000 rw-p 00000000 00:00 0
> 7f9fbb7ff000-7f9fbb800000 ---p 00000000 00:00 0
> 7f9fbb800000-7f9fbc000000 rw-p 00000000 00:00 0
> 7f9fbc000000-7f9fbc3a3000 rw-p 00000000 00:00 0
> 7f9fbc3a3000-7f9fc0000000 ---p 00000000 00:00 0
> 7f9fc0000000-7f9fc01eb000 rw-p 00000000 00:00 0
> 7f9fc01eb000-7f9fc4000000 ---p 00000000 00:00 0
> 7f9fc4000000-7f9fc4772000 rw-p 00000000 00:00 0
> 7f9fc4772000-7f9fc8000000 ---p 00000000 00:00 0
> 7f9fc8000000-7f9fc8722000 rw-p 00000000 00:00 0
> 7f9fc8722000-7f9fcc000000 ---p 00000000 00:00 0
> 7f9fcc000000-7f9fcc8ac000 rw-p 00000000 00:00 0
> 7f9fcc8ac000-7f9fd0000000 ---p 00000000 00:00 0
> 7f9fd0000000-7f9fd0649000 rw-p 00000000 00:00 0
> 7f9fd0649000-7f9fd4000000 ---p 00000000 00:00 0
> 7f9fd4000000-7f9fd467f000 rw-p 00000000 00:00 0
> 7f9fd467f000-7f9fd8000000 ---p 00000000 00:00 0
> 7f9fd8000000-7f9fd852f000 rw-p 00000000 00:00 0
> 7f9fd852f000-7f9fdc000000 ---p 00000000 00:00 0
> 7f9fdc000000-7f9fdc28d000 rw-p 00000000 00:00 0
> 7f9fdc28d000-7f9fe0000000 ---p 00000000 00:00 0
> 7f9fe0000000-7f9fe0ab9000 rw-p 00000000 00:00 0
> 7f9fe0ab9000-7f9fe4000000 ---p 00000000 00:00 0
> 7f9fe4000000-7f9fe49c6000 rw-p 00000000 00:00 0
> 7f9fe49c6000-7f9fe8000000 ---p 00000000 00:00 0
> 7f9fe8000000-7f9fe83db000 rw-p 00000000 00:00 0
> 7f9fe83db000-7f9fec000000 ---p 00000000 00:00 0
> 7f9fec000000-7f9fec917000 rw-p 00000000 00:00 0
> 7f9fec917000-7f9ff0000000 ---p 00000000 00:00 0
> 7f9ff0000000-7f9ff052e000 rw-p 00000000 00:00 0
> 7f9ff052e000-7f9ff4000000 ---p 00000000 00:00 0
> 7f9ff4000000-7f9ff7800000 ---p 00000000 00:00 0
> 7f9ff7800000-7f9ff7e00000 rw-s 00000000 00:05 1591824 /dev/zero (deleted)
> 7f9ff7e00000-7f9ff9600000 ---p 00000000 00:00 0
> 7f9ff9600000-7f9ff9c00000 rw-s 00000000 00:05 1591827 /dev/zero (deleted)
> 7f9ff9c00000-7fa002000000 ---p 00000000 00:00 0
> 7fa00200c000-7fa00210c000 rw-p 00000000 00:00 0
> 7fa00210c000-7fa002123000 r-xp 00000000 08:31 85691894 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_distance_wrap.cpython-36m-x86_64-linux-gnu.so
> 7fa002123000-7fa002323000 ---p 00017000 08:31 85691894 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_distance_wrap.cpython-36m-x86_64-linux-gnu.so
> 7fa002323000-7fa002324000 rw-p 00017000 08:31 85691894 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_distance_wrap.cpython-36m-x86_64-linux-gnu.so
> 7fa002324000-7fa002325000 ---p 00000000 00:00 0
> 7fa002325000-7fa002b25000 rw-p 00000000 00:00 0
> 7fa002b25000-7fa002b26000 ---p 00000000 00:00 0
> 7fa002b26000-7fa006993000 rw-p 00000000 00:00 0
> 7fa0069ac000-7fa006b16000 r-xp 00000000 08:01 2374130 /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.430.64
> 7fa006b16000-7fa006d16000 ---p 0016a000 08:01 2374130 /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.430.64
> 7fa006d16000-7fa006d2f000 rw-p 0016a000 08:01 2374130 /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.430.64
> 7fa006d2f000-7fa006ffa000 rw-p 00000000 00:00 0
> 7fa006ffa000-7fa006ffb000 ---p 00000000 00:00 0
> 7fa006ffb000-7fa0077fb000 rw-p 00000000 00:00 0
> 7fa0077fb000-7fa0077fc000 ---p 00000000 00:00 0
> 7fa0077fc000-7fa007ffc000 rw-p 00000000 00:00 0
> 7fa007ffc000-7fa007ffd000 ---p 00000000 00:00 0
> 7fa007ffd000-7fa0087fd000 rw-p 00000000 00:00 0
> 7fa0087fd000-7fa0087fe000 ---p 00000000 00:00 0
> 7fa0087fe000-7fa008ffe000 rw-p 00000000 00:00 0
> 7fa008ffe000-7fa008fff000 ---p 00000000 00:00 0
> 7fa008fff000-7fa0097ff000 rw-p 00000000 00:00 0
> 7fa0097ff000-7fa009800000 ---p 00000000 00:00 0
> 7fa009800000-7fa00a000000 rw-p 00000000 00:00 0
> 7fa00a000000-7fa011200000 ---p 00000000 00:00 0
> 7fa011200000-7fa011400000 rw-s 00000000 00:05 1589151 /dev/zero (deleted)
> 7fa011400000-7fa017a00000 ---p 00000000 00:00 0
> 7fa017a00000-7fa018000000 rw-s 00000000 00:05 1427137 /dev/zero (deleted)
> 7fa018000000-7fa01b400000 ---p 00000000 00:00 0
> 7fa01b400000-7fa01ba00000 rw-s 00000000 00:05 1567613 /dev/zero (deleted)
> 7fa01ba00000-7fa01c000000 rw-s 00000000 00:05 1604786 /dev/zero (deleted)
> 7fa01c000000-7fa01c021000 rw-p 00000000 00:00 0
> 7fa01c021000-7fa020000000 ---p 00000000 00:00 0
> 7fa020000000-7fa020021000 rw-p 00000000 00:00 0
> 7fa020021000-7fa024000000 ---p 00000000 00:00 0
> 7fa024031000-7fa02405a000 r-xp 00000000 08:31 85691899 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_voronoi.cpython-36m-x86_64-linux-gnu.so
> 7fa02405a000-7fa02425a000 ---p 00029000 08:31 85691899 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_voronoi.cpython-36m-x86_64-linux-gnu.so
> 7fa02425a000-7fa02425d000 rw-p 00029000 08:31 85691899 /opt/conda/lib/python3.6/site-packages/scipy/spatial/_voronoi.cpython-36m-x86_64-linux-gnu.so
> 7fa02425d000-7fa02425e000 rw-p 00000000 00:00 0
> 7fa02425e000-7fa024267000 r-xp 00000000 08:31 85298773 /opt/conda/lib/python3.6/site-packages/scipy/_lib/messagestream.cpython-36m-x86_64-linux-gnu.so
> 7fa024267000-7fa024466000 ---p 00009000 08:31 85298773 /opt/conda/lib/python3.6/site-packages/scipy/_lib/messagestream.cpython-36m-x86_64-linux-gnu.so
> 7fa024466000-7fa024468000 rw-p 00008000 08:31 85298773 /opt/conda/lib/python3.6/site-packages/scipy/_lib/messagestream.cpython-36m-x86_64-linux-gnu.so
> 7fa024468000-7fa026000000 rw-p 00000000 00:00 0
> 7fa026000000-7fa026400000 ---p 00000000 00:00 0
> 7fa026400000-7fa026600000 rw-s 00000000 00:05 1589884 /dev/zero (deleted)
> 7fa026600000-7fa026800000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa026800000-7fa026a00000 rw-s 00000000 00:05 1589885 /dev/zero (deleted)
> 7fa026a00000-7fa026c00000 ---p 00000000 00:00 0
> 7fa026c00000-7fa026e00000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa026e00000-7fa027000000 ---p 00000000 00:00 0
> 7fa027000000-7fa027200000 rw-s 00000000 00:05 1589887 /dev/zero (deleted)
> 7fa027200000-7fa0274d6000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa0274d6000-7fa027600000 ---p 00000000 00:00 0
> 7fa027600000-7fa027c00000 rw-s 00000000 00:05 1589893 /dev/zero (deleted)
> 7fa027c00000-7fa028000000 ---p 00000000 00:00 0
> 7fa028000000-7fa028021000 rw-p 00000000 00:00 0
> 7fa028021000-7fa02c000000 ---p 00000000 00:00 0
> 7fa02c000000-7fa02c021000 rw-p 00000000 00:00 0
> 7fa02c021000-7fa030000000 ---p 00000000 00:00 0
> 7fa030000000-7fa030021000 rw-p 00000000 00:00 0
> 7fa030021000-7fa034000000 ---p 00000000 00:00 0
> 7fa034000000-7fa034021000 rw-p 00000000 00:00 0
> 7fa034021000-7fa038000000 ---p 00000000 00:00 0
> 7fa038000000-7fa038021000 rw-p 00000000 00:00 0
> 7fa038021000-7fa03c000000 ---p 00000000 00:00 0
> 7fa03c01c000-7fa03c05c000 rw-p 00000000 00:00 0
> 7fa03c05c000-7fa03c147000 r-xp 00000000 08:31 85691903 /opt/conda/lib/python3.6/site-packages/scipy/spatial/qhull.cpython-36m-x86_64-linux-gnu.so
> 7fa03c147000-7fa03c346000 ---p 000eb000 08:31 85691903 /opt/conda/lib/python3.6/site-packages/scipy/spatial/qhull.cpython-36m-x86_64-linux-gnu.so
> 7fa03c346000-7fa03c34f000 rw-p 000ea000 08:31 85691903 /opt/conda/lib/python3.6/site-packages/scipy/spatial/qhull.cpython-36m-x86_64-linux-gnu.so
> 7fa03c34f000-7fa03c352000 rw-p 00000000 00:00 0
> 7fa03c352000-7fa03c356000 rw-p 000f4000 08:31 85691903 /opt/conda/lib/python3.6/site-packages/scipy/spatial/qhull.cpython-36m-x86_64-linux-gnu.so
> 7fa03c356000-7fa03c406000 r-xp 00000000 08:31 85691900 /opt/conda/lib/python3.6/site-packages/scipy/spatial/ckdtree.cpython-36m-x86_64-linux-gnu.so
> 7fa03c406000-7fa03c605000 ---p 000b0000 08:31 85691900 /opt/conda/lib/python3.6/site-packages/scipy/spatial/ckdtree.cpython-36m-x86_64-linux-gnu.so
> 7fa03c605000-7fa03c60f000 rw-p 000af000 08:31 85691900 /opt/conda/lib/python3.6/site-packages/scipy/spatial/ckdtree.cpython-36m-x86_64-linux-gnu.so
> 7fa03c60f000-7fa03c611000 rw-p 00000000 00:00 0
> 7fa03c611000-7fa03c612000 ---p 00000000 00:00 0
> 7fa03c612000-7fa03ce12000 rw-p 00000000 00:00 0
> 7fa03ce12000-7fa03ce13000 ---p 00000000 00:00 0
> 7fa03ce13000-7fa03d613000 rw-p 00000000 00:00 0
> 7fa03d613000-7fa03d614000 ---p 00000000 00:00 0
> 7fa03d614000-7fa03de14000 rw-p 00000000 00:00 0
> 7fa03de14000-7fa03de15000 ---p 00000000 00:00 0
> 7fa03de15000-7fa03e615000 rw-p 00000000 00:00 0
> 7fa03e615000-7fa03edd8000 r-xp 00000000 08:01 2374115 /usr/lib/x86_64-linux-gnu/libnvidia-ptxjitcompiler.so.430.64
> 7fa03edd8000-7fa03efd7000 ---p 007c3000 08:01 2374115 /usr/lib/x86_64-linux-gnu/libnvidia-ptxjitcompiler.so.430.64
> 7fa03efd7000-7fa03eff6000 rw-p 007c2000 08:01 2374115 /usr/lib/x86_64-linux-gnu/libnvidia-ptxjitcompiler.so.430.64
> 7fa03eff6000-7fa03efff000 rw-p 00000000 00:00 0
> 7fa03f008000-7fa03f1c8000 rw-p 00000000 00:00 0
> 7fa03f1c8000-7fa03f1d3000 r-xp 00000000 08:31 59728427 /lib/x86_64-linux-gnu/libnss_files-2.23.so
> 7fa03f1d3000-7fa03f3d2000 ---p 0000b000 08:31 59728427 /lib/x86_64-linux-gnu/libnss_files-2.23.so
> 7fa03f3d2000-7fa03f3d3000 r--p 0000a000 08:31 59728427 /lib/x86_64-linux-gnu/libnss_files-2.23.so
> 7fa03f3d3000-7fa03f3d4000 rw-p 0000b000 08:31 59728427 /lib/x86_64-linux-gnu/libnss_files-2.23.so
> 7fa03f3d4000-7fa03f3da000 rw-p 00000000 00:00 0
> 7fa03f3da000-7fa03f3e5000 r-xp 00000000 08:31 59728431 /lib/x86_64-linux-gnu/libnss_nis-2.23.so
> 7fa03f3e5000-7fa03f5e4000 ---p 0000b000 08:31 59728431 /lib/x86_64-linux-gnu/libnss_nis-2.23.so
> 7fa03f5e4000-7fa03f5e5000 r--p 0000a000 08:31 59728431 /lib/x86_64-linux-gnu/libnss_nis-2.23.so
> 7fa03f5e5000-7fa03f5e6000 rw-p 0000b000 08:31 59728431 /lib/x86_64-linux-gnu/libnss_nis-2.23.so
> 7fa03f5e6000-7fa03f5fc000 r-xp 00000000 08:31 59728421 /lib/x86_64-linux-gnu/libnsl-2.23.so
> 7fa03f5fc000-7fa03f7fb000 ---p 00016000 08:31 59728421 /lib/x86_64-linux-gnu/libnsl-2.23.so
> 7fa03f7fb000-7fa03f7fc000 r--p 00015000 08:31 59728421 /lib/x86_64-linux-gnu/libnsl-2.23.so
> 7fa03f7fc000-7fa03f7fd000 rw-p 00016000 08:31 59728421 /lib/x86_64-linux-gnu/libnsl-2.23.so
> 7fa03f7fd000-7fa03f7ff000 rw-p 00000000 00:00 0
> 7fa03f7ff000-7fa03f800000 ---p 00000000 00:00 0
> 7fa03f800000-7fa040000000 rw-p 00000000 00:00 0
> 7fa040000000-7fa040021000 rw-p 00000000 00:00 0
> 7fa040021000-7fa044000000 ---p 00000000 00:00 0
> 7fa044000000-7fa044021000 rw-p 00000000 00:00 0
> 7fa044021000-7fa048000000 ---p 00000000 00:00 0
> 7fa048000000-7fa048021000 rw-p 00000000 00:00 0
> 7fa048021000-7fa04c000000 ---p 00000000 00:00 0
> 7fa04c000000-7fa04c021000 rw-p 00000000 00:00 0
> 7fa04c021000-7fa050000000 ---p 00000000 00:00 0
> 7fa050000000-7fa050021000 rw-p 00000000 00:00 0
> 7fa050021000-7fa054000000 ---p 00000000 00:00 0
> 7fa054000000-7fa054021000 rw-p 00000000 00:00 0
> 7fa054021000-7fa058000000 ---p 00000000 00:00 0
> 7fa058000000-7fa058021000 rw-p 00000000 00:00 0
> 7fa058021000-7fa05c000000 ---p 00000000 00:00 0
> 7fa05c000000-7fa05c021000 rw-p 00000000 00:00 0
> 7fa05c021000-7fa060000000 ---p 00000000 00:00 0
> 7fa060000000-7fa060021000 rw-p 00000000 00:00 0
> 7fa060021000-7fa064000000 ---p 00000000 00:00 0
> 7fa064000000-7fa064021000 rw-p 00000000 00:00 0
> 7fa064021000-7fa068000000 ---p 00000000 00:00 0
> 7fa068000000-7fa068021000 rw-p 00000000 00:00 0
> 7fa068021000-7fa06c000000 ---p 00000000 00:00 0
> 7fa06c000000-7fa06c021000 rw-p 00000000 00:00 0
> 7fa06c021000-7fa070000000 ---p 00000000 00:00 0
> 7fa070000000-7fa070021000 rw-p 00000000 00:00 0
> 7fa070021000-7fa074000000 ---p 00000000 00:00 0
> 7fa074000000-7fa074021000 rw-p 00000000 00:00 0
> 7fa074021000-7fa078000000 ---p 00000000 00:00 0
> 7fa078000000-7fa078021000 rw-p 00000000 00:00 0
> 7fa078021000-7fa07c000000 ---p 00000000 00:00 0
> 7fa07c000000-7fa07c021000 rw-p 00000000 00:00 0
> 7fa07c021000-7fa080000000 ---p 00000000 00:00 0
> 7fa080000000-7fa080001000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080001000-7fa080002000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080002000-7fa080003000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080003000-7fa080004000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080004000-7fa080005000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080005000-7fa080006000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080006000-7fa080007000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080007000-7fa080008000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080008000-7fa080009000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080009000-7fa08000a000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000a000-7fa08000b000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000b000-7fa08000c000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000c000-7fa08000d000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000d000-7fa08000e000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000e000-7fa08000f000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa08000f000-7fa080010000 rw-s 00000000 00:06 768 /dev/nvidia4
> 7fa080010000-7fa080011000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080011000-7fa080012000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080012000-7fa080013000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080013000-7fa080014000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080014000-7fa080015000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080015000-7fa080016000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080016000-7fa080017000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080017000-7fa080018000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080018000-7fa080019000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080019000-7fa08001a000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001a000-7fa08001b000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001b000-7fa08001c000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001c000-7fa08001d000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001d000-7fa08001e000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001e000-7fa08001f000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa08001f000-7fa080020000 rw-s 00000000 00:06 783 /dev/nvidia5
> 7fa080020000-7fa080021000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080021000-7fa080022000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080022000-7fa080023000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080023000-7fa080024000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080024000-7fa080025000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080025000-7fa080026000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080026000-7fa080027000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080027000-7fa080028000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080028000-7fa080029000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080029000-7fa08002a000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002a000-7fa08002b000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002b000-7fa08002c000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002c000-7fa08002d000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002d000-7fa08002e000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002e000-7fa08002f000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa08002f000-7fa080030000 rw-s 00000000 00:06 798 /dev/nvidia6
> 7fa080030000-7fa080031000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080031000-7fa080032000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080032000-7fa080033000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080033000-7fa080034000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080034000-7fa080035000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080035000-7fa080036000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080036000-7fa080037000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080037000-7fa080038000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080038000-7fa080039000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080039000-7fa08003a000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003a000-7fa08003b000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003b000-7fa08003c000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003c000-7fa08003d000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003d000-7fa08003e000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003e000-7fa08003f000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa08003f000-7fa080040000 rw-s 00000000 00:06 813 /dev/nvidia7
> 7fa080040000-7fa090000000 ---p 00000000 00:00 0
> 7fa09002d000-7fa0901ed000 rw-p 00000000 00:00 0
> 7fa0901ed000-7fa0901f5000 r-xp 00000000 08:31 59728423 /lib/x86_64-linux-gnu/libnss_compat-2.23.so
> 7fa0901f5000-7fa0903f4000 ---p 00008000 08:31 59728423 /lib/x86_64-linux-gnu/libnss_compat-2.23.so
> 7fa0903f4000-7fa0903f5000 r--p 00007000 08:31 59728423 /lib/x86_64-linux-gnu/libnss_compat-2.23.so
> 7fa0903f5000-7fa0903f6000 rw-p 00008000 08:31 59728423 /lib/x86_64-linux-gnu/libnss_compat-2.23.so
> 7fa0903f6000-7fa0903f7000 r-xp 00000000 08:31 83987909 /opt/conda/lib/python3.6/site-packages/torch/lib/libcaffe2_nvrtc.so
> 7fa0903f7000-7fa0905f7000 ---p 00001000 08:31 83987909 /opt/conda/lib/python3.6/site-packages/torch/lib/libcaffe2_nvrtc.so
> 7fa0905f7000-7fa0905f8000 r--p 00001000 08:31 83987909 /opt/conda/lib/python3.6/site-packages/torch/lib/libcaffe2_nvrtc.so
> 7fa0905f8000-7fa0905f9000 rw-p 00002000 08:31 83987909 /opt/conda/lib/python3.6/site-packages/torch/lib/libcaffe2_nvrtc.so
> 7fa0905f9000-7fa0905fa000 ---p 00000000 00:00 0
> 7fa0905fa000-7fa090dfa000 rw-p 00000000 00:00 0
> 7fa090dfe000-7fa090efe000 rw-p 00000000 00:00 0
> 7fa090efe000-7fa090eff000 ---p 00000000 00:00 0
> 7fa090eff000-7fa0916ff000 rw-p 00000000 00:00 0
> 7fa0916ff000-7fa091700000 ---p 00000000 00:00 0
> 7fa091700000-7fa091f00000 rw-p 00000000 00:00 0
> 7fa091f00000-7fa091f01000 ---p 00000000 00:00 0
> 7fa091f01000-7fa092701000 rw-p 00000000 00:00 0
> 7fa092701000-7fa092702000 ---p 00000000 00:00 0
> 7fa092702000-7fa092f02000 rw-p 00000000 00:00 0
> 7fa092f02000-7fa092f03000 ---p 00000000 00:00 0
> 7fa092f03000-7fa093703000 rw-p 00000000 00:00 0
> 7fa093703000-7fa093704000 ---p 00000000 00:00 0
> 7fa093704000-7fa094004000 rw-p 00000000 00:00 0
> 7fa094004000-7fa09403f000 r-xp 00000000 08:31 85691691 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_reordering.cpython-36m-x86_64-linux-gnu.so
> 7fa09403f000-7fa09423f000 ---p 0003b000 08:31 85691691 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_reordering.cpython-36m-x86_64-linux-gnu.so
> 7fa09423f000-7fa094244000 rw-p 0003b000 08:31 85691691 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_reordering.cpython-36m-x86_64-linux-gnu.so
> 7fa094244000-7fa094245000 rw-p 00000000 00:00 0
> 7fa094245000-7fa094273000 r-xp 00000000 08:31 85691689 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_matching.cpython-36m-x86_64-linux-gnu.so
> 7fa094273000-7fa094473000 ---p 0002e000 08:31 85691689 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_matching.cpython-36m-x86_64-linux-gnu.so
> 7fa094473000-7fa094477000 rw-p 0002e000 08:31 85691689 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_matching.cpython-36m-x86_64-linux-gnu.so
> 7fa094477000-7fa094478000 rw-p 00000000 00:00 0
> 7fa094478000-7fa0944b0000 r-xp 00000000 08:31 85691687 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_flow.cpython-36m-x86_64-linux-gnu.so
> 7fa0944b0000-7fa0946b0000 ---p 00038000 08:31 85691687 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_flow.cpython-36m-x86_64-linux-gnu.so
> 7fa0946b0000-7fa0946b6000 rw-p 00038000 08:31 85691687 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_flow.cpython-36m-x86_64-linux-gnu.so
> 7fa0946b6000-7fa0946b7000 rw-p 00000000 00:00 0
> 7fa0946b7000-7fa0946e2000 r-xp 00000000 08:31 85691690 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_min_spanning_tree.cpython-36m-x86_64-linux-gnu.so
> 7fa0946e2000-7fa0948e2000 ---p 0002b000 08:31 85691690 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_min_spanning_tree.cpython-36m-x86_64-linux-gnu.so
> 7fa0948e2000-7fa0948e6000 rw-p 0002b000 08:31 85691690 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_min_spanning_tree.cpython-36m-x86_64-linux-gnu.so
> 7fa0948e6000-7fa0948e7000 rw-p 00000000 00:00 0
> 7fa0948e7000-7fa094908000 r-xp 00000000 08:31 85691694 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_traversal.cpython-36m-x86_64-linux-gnu.so
> 7fa094908000-7fa094b07000 ---p 00021000 08:31 85691694 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_traversal.cpython-36m-x86_64-linux-gnu.so
> 7fa094b07000-7fa094b0c000 rw-p 00020000 08:31 85691694 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_traversal.cpython-36m-x86_64-linux-gnu.so
> 7fa094b0c000-7fa094b2e000 r-xp 00000000 08:31 85691693 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_tools.cpython-36m-x86_64-linux-gnu.so
> 7fa094b2e000-7fa094d2e000 ---p 00022000 08:31 85691693 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_tools.cpython-36m-x86_64-linux-gnu.so
> 7fa094d2e000-7fa094d33000 rw-p 00022000 08:31 85691693 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_tools.cpython-36m-x86_64-linux-gnu.so
> 7fa094d33000-7fa094d34000 rw-p 00000000 00:00 0
> 7fa094d34000-7fa094d91000 r-xp 00000000 08:31 85691692 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_shortest_path.cpython-36m-x86_64-linux-gnu.so
> 7fa094d91000-7fa094f90000 ---p 0005d000 08:31 85691692 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_shortest_path.cpython-36m-x86_64-linux-gnu.so
> 7fa094f90000-7fa094f99000 rw-p 0005c000 08:31 85691692 /opt/conda/lib/python3.6/site-packages/scipy/sparse/csgraph/_shortest_path.cpython-36m-x86_64-linux-gnu.so
> 7fa094f99000-7fa094fdb000 rw-p 00000000 00:00 0
> 7fa094fdb000-7fa095047000 r-xp 00000000 08:31 85691670 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_csparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa095047000-7fa095246000 ---p 0006c000 08:31 85691670 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_csparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa095246000-7fa09524c000 rw-p 0006b000 08:31 85691670 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_csparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa09524c000-7fa09528e000 rw-p 00000000 00:00 0
> 7fa09528e000-7fa0955d7000 r-xp 00000000 08:31 85691673 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_sparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa0955d7000-7fa0957d7000 ---p 00349000 08:31 85691673 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_sparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa0957d7000-7fa0957d8000 rw-p 00349000 08:31 85691673 /opt/conda/lib/python3.6/site-packages/scipy/sparse/_sparsetools.cpython-36m-x86_64-linux-gnu.so
> 7fa0957d8000-7fa095872000 r-xp 00000000 08:31 85560568 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_lapack.cpython-36m-x86_64-linux-gnu.so
> 7fa095872000-7fa095a71000 ---p 0009a000 08:31 85560568 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_lapack.cpython-36m-x86_64-linux-gnu.so
> 7fa095a71000-7fa095a75000 rw-p 00099000 08:31 85560568 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_lapack.cpython-36m-x86_64-linux-gnu.so
> 7fa095a75000-7fa095a8a000 rw-p 0009e000 08:31 85560568 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_lapack.cpython-36m-x86_64-linux-gnu.so
> 7fa095a8a000-7fa095ac3000 r-xp 00000000 08:31 85560566 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_blas.cpython-36m-x86_64-linux-gnu.so
> 7fa095ac3000-7fa095cc3000 ---p 00039000 08:31 85560566 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_blas.cpython-36m-x86_64-linux-gnu.so
> 7fa095cc3000-7fa095ccd000 rw-p 00039000 08:31 85560566 /opt/conda/lib/python3.6/site-packages/scipy/linalg/cython_blas.cpython-36m-x86_64-linux-gnu.so
> 7fa095ccd000-7fa095d15000 r-xp 00000000 08:31 85560549 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_decomp_update.cpython-36m-x86_64-linux-gnu.so
> 7fa095d15000-7fa095f15000 ---p 00048000 08:31 85560549 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_decomp_update.cpython-36m-x86_64-linux-gnu.so
> 7fa095f15000-7fa095f1b000 rw-p 00048000 08:31 85560549 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_decomp_update.cpython-36m-x86_64-linux-gnu.so
> 7fa095f1b000-7fa0b1f1c000 rw-p 00000000 00:00 0
> 7fa0b1f1c000-7fa0b1f50000 r-xp 00000000 08:31 85560561 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_solve_toeplitz.cpython-36m-x86_64-linux-gnu.so
> 7fa0b1f50000-7fa0b2150000 ---p 00034000 08:31 85560561 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_solve_toeplitz.cpython-36m-x86_64-linux-gnu.so
> 7fa0b2150000-7fa0b2154000 rw-p 00034000 08:31 85560561 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_solve_toeplitz.cpython-36m-x86_64-linux-gnu.so
> 7fa0b2154000-7fa0d6155000 rw-p 00000000 00:00 0
> 7fa0d6155000-7fa0d6162000 r-xp 00000000 08:31 85560553 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flinalg.cpython-36m-x86_64-linux-gnu.so
> 7fa0d6162000-7fa0d6362000 ---p 0000d000 08:31 85560553 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flinalg.cpython-36m-x86_64-linux-gnu.so
> 7fa0d6362000-7fa0d6365000 rw-p 0000d000 08:31 85560553 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flinalg.cpython-36m-x86_64-linux-gnu.so
> 7fa0d6365000-7fa0d6368000 rw-p 00011000 08:31 85560553 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flinalg.cpython-36m-x86_64-linux-gnu.so
> 7fa0d6368000-7fa0e6368000 rw-p 00000000 00:00 0
> 7fa0e6368000-7fa0e6486000 r-xp 00000000 08:31 85560552 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flapack.cpython-36m-x86_64-linux-gnu.so
> 7fa0e6486000-7fa0e6685000 ---p 0011e000 08:31 85560552 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flapack.cpython-36m-x86_64-linux-gnu.so
> 7fa0e6685000-7fa0e66ef000 rw-p 0011d000 08:31 85560552 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flapack.cpython-36m-x86_64-linux-gnu.so
> 7fa0e66ef000-7fa0e66f0000 rw-p 00000000 00:00 0
> 7fa0e66f0000-7fa0e66f9000 rw-p 00188000 08:31 85560552 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_flapack.cpython-36m-x86_64-linux-gnu.so
> 7fa0e66f9000-7fa0e86f9000 rw-p 00000000 00:00 0
> 7fa0e86f9000-7fa0e86fa000 ---p 00000000 00:00 0
> 7fa0e86fa000-7fa0e8efa000 rw-p 00000000 00:00 0
> 7fa0e8efa000-7fa0e8efb000 ---p 00000000 00:00 0
> 7fa0e8efb000-7fa0e96fb000 rw-p 00000000 00:00 0
> 7fa0e96fb000-7fa0e96fc000 ---p 00000000 00:00 0
> 7fa0e96fc000-7fa0e9efc000 rw-p 00000000 00:00 0
> 7fa0e9efc000-7fa0e9efd000 ---p 00000000 00:00 0
> 7fa0e9efd000-7fa0ea6fd000 rw-p 00000000 00:00 0
> 7fa0ea735000-7fa0ea7f5000 rw-p 00000000 00:00 0
> 7fa0ea7f5000-7fa0ea7f6000 ---p 00000000 00:00 0
> 7fa0ea7f6000-7fa0eaff6000 rw-p 00000000 00:00 0
> 7fa0eaff6000-7fa0eaff7000 ---p 00000000 00:00 0
> 7fa0eaff7000-7fa0eb7f7000 rw-p 00000000 00:00 0
> 7fa0eb7f7000-7fa0eb7f8000 ---p 00000000 00:00 0
> 7fa0eb7f8000-7fa0ebff8000 rw-p 00000000 00:00 0
> 7fa0ebff8000-7fa0ebff9000 ---p 00000000 00:00 0
> 7fa0ebff9000-7fa0ec7f9000 rw-p 00000000 00:00 0
> 7fa0ec7f9000-7fa0ec7fa000 ---p 00000000 00:00 0
> 7fa0ec7fa000-7fa0ecffa000 rw-p 00000000 00:00 0
> 7fa0ecffa000-7fa0ecffb000 ---p 00000000 00:00 0
> 7fa0ecffb000-7fa0ed7fb000 rw-p 00000000 00:00 0
> 7fa0ed7fb000-7fa0ed7fc000 ---p 00000000 00:00 0
> 7fa0ed7fc000-7fa0edffc000 rw-p 00000000 00:00 0
> 7fa0edffc000-7fa0edffd000 ---p 00000000 00:00 0
> 7fa0edffd000-7fa0ee7fd000 rw-p 00000000 00:00 0
> 7fa0ee7fd000-7fa0ee7fe000 ---p 00000000 00:00 0
> 7fa0ee7fe000-7fa0eeffe000 rw-p 00000000 00:00 0
> 7fa0eeffe000-7fa0eefff000 ---p 00000000 00:00 0
> 7fa0eefff000-7fa0ef7ff000 rw-p 00000000 00:00 0
> 7fa0ef7ff000-7fa0ef800000 ---p 00000000 00:00 0
> 7fa0ef800000-7fa0f0000000 rw-p 00000000 00:00 0
> 7fa0f0000000-7fa0f0021000 rw-p 00000000 00:00 0
> 7fa0f0021000-7fa0f4000000 ---p 00000000 00:00 0
> 7fa0f4011000-7fa0f4211000 rw-s 00000000 00:05 1567620 /dev/zero (deleted)
> 7fa0f4211000-7fa0f4411000 rw-s 00000000 00:05 1579908 /dev/zero (deleted)
> 7fa0f4411000-7fa0f4611000 rw-s 00000000 00:05 1589160 /dev/zero (deleted)
> 7fa0f4611000-7fa0f4612000 ---p 00000000 00:00 0
> 7fa0f4612000-7fa0f4e12000 rw-p 00000000 00:00 0
> 7fa0f4e12000-7fa0f4e13000 ---p 00000000 00:00 0
> 7fa0f4e13000-7fa0f5613000 rw-p 00000000 00:00 0
> 7fa0f5614000-7fa0f57d4000 rw-p 00000000 00:00 0
> 7fa0f57fb000-7fa0f58fb000 rw-p 00000000 00:00 0
> 7fa0f58fb000-7fa0f597b000 rw-p 00000000 00:00 0
> 7fa0f5994000-7fa0f9f14000 rw-p 00000000 00:00 0
> 7fa0f9f32000-7fa0f9fb3000 rw-p 00000000 00:00 0
> 7fa0f9fb3000-7fa0fa1b3000 rw-s 00000000 00:05 1589886 /dev/zero (deleted)
> 7fa0fa1b3000-7fa0fa5b3000 rw-p 00000000 00:00 0
> 7fa0fa5b3000-7fa1005b3000 ---p 00000000 00:00 0
> 7fa1005b3000-7fa10139c000 r-xp 00000000 08:01 1441951 /usr/lib/x86_64-linux-gnu/libcuda.so.430.64
> 7fa10139c000-7fa10159b000 ---p 00de9000 08:01 1441951 /usr/lib/x86_64-linux-gnu/libcuda.so.430.64
> 7fa10159b000-7fa101713000 rw-p 00de8000 08:01 1441951 /usr/lib/x86_64-linux-gnu/libcuda.so.430.64
> 7fa101713000-7fa103723000 rw-p 00000000 00:00 0
> 7fa103723000-7fa103724000 ---p 00000000 00:00 0
> 7fa103724000-7fa105f24000 rw-p 00000000 00:00 0
> 7fa105f24000-7fa105f25000 ---p 00000000 00:00 0
> 7fa105f25000-7fa10a725000 rw-p 00000000 00:00 0
> 7fa10a725000-7fa10a726000 r--p 00000000 08:31 78745082 /opt/conda/lib/python3.6/lib-dynload/fcntl.cpython-36m-x86_64-linux-gnu.so
> 7fa10a726000-7fa10a728000 r-xp 00001000 08:31 78745082 /opt/conda/lib/python3.6/lib-dynload/fcntl.cpython-36m-x86_64-linux-gnu.so
> 7fa10a728000-7fa10a729000 r--p 00003000 08:31 78745082 /opt/conda/lib/python3.6/lib-dynload/fcntl.cpython-36m-x86_64-linux-gnu.so
> 7fa10a729000-7fa10a72a000 r--p 00003000 08:31 78745082 /opt/conda/lib/python3.6/lib-dynload/fcntl.cpython-36m-x86_64-linux-gnu.so
> 7fa10a72a000-7fa10a72b000 rw-p 00004000 08:31 78745082 /opt/conda/lib/python3.6/lib-dynload/fcntl.cpython-36m-x86_64-linux-gnu.so
> 7fa10a72b000-7fa10a72e000 r--p 00000000 08:31 78745095 /opt/conda/lib/python3.6/lib-dynload/termios.cpython-36m-x86_64-linux-gnu.so
> 7fa10a72e000-7fa10a72f000 r-xp 00003000 08:31 78745095 /opt/conda/lib/python3.6/lib-dynload/termios.cpython-36m-x86_64-linux-gnu.so
> 7fa10a72f000-7fa10a730000 r--p 00004000 08:31 78745095 /opt/conda/lib/python3.6/lib-dynload/termios.cpython-36m-x86_64-linux-gnu.so
> 7fa10a730000-7fa10a731000 r--p 00004000 08:31 78745095 /opt/conda/lib/python3.6/lib-dynload/termios.cpython-36m-x86_64-linux-gnu.so
> 7fa10a731000-7fa10a733000 rw-p 00005000 08:31 78745095 /opt/conda/lib/python3.6/lib-dynload/termios.cpython-36m-x86_64-linux-gnu.so
> 7fa10a733000-7fa10a734000 rw-s 00000000 00:15 113 /dev/shm/8WILOt (deleted)
> 7fa10a734000-7fa10a735000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a735000-7fa10a736000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a736000-7fa10a737000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a737000-7fa10a738000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a738000-7fa10a739000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a739000-7fa10a73a000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73a000-7fa10a73b000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73b000-7fa10a73c000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73c000-7fa10a73d000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73d000-7fa10a73e000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73e000-7fa10a73f000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a73f000-7fa10a740000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a740000-7fa10a741000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a741000-7fa10a742000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a742000-7fa10a743000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a743000-7fa10a744000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a744000-7fa10a745000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a745000-7fa10a746000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a746000-7fa10a747000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a747000-7fa10a748000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a748000-7fa10a749000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a749000-7fa10a74a000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74a000-7fa10a74b000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74b000-7fa10a74c000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74c000-7fa10a74d000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74d000-7fa10a74e000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74e000-7fa10a74f000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a74f000-7fa10a750000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a750000-7fa10a751000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a751000-7fa10a752000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a752000-7fa10a753000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a753000-7fa10a754000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a754000-7fa10a755000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a755000-7fa10a756000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a756000-7fa10a757000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a757000-7fa10a758000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa10a758000-7fa10ac18000 rw-p 00000000 00:00 0
> 7fa10ac18000-7fa10acd8000 rw-p 00000000 00:00 0
> 7fa10acd8000-7fa10acd9000 ---p 00000000 00:00 0
> 7fa10acd9000-7fa10b4d9000 rw-p 00000000 00:00 0
> 7fa10b4d9000-7fa10b520000 r-xp 00000000 08:01 2374111 /usr/lib/x86_64-linux-gnu/libnvidia-fatbinaryloader.so.430.64
> 7fa10b520000-7fa10b720000 ---p 00047000 08:01 2374111 /usr/lib/x86_64-linux-gnu/libnvidia-fatbinaryloader.so.430.64
> 7fa10b720000-7fa10b722000 rw-p 00047000 08:01 2374111 /usr/lib/x86_64-linux-gnu/libnvidia-fatbinaryloader.so.430.64
> 7fa10b722000-7fa10b727000 rw-p 00000000 00:00 0
> 7fa10b727000-7fa10b817000 r-xp 00000000 08:31 85298692 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libgfortran-ed201abd.so.3.0.0
> 7fa10b817000-7fa10ba16000 ---p 000f0000 08:31 85298692 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libgfortran-ed201abd.so.3.0.0
> 7fa10ba16000-7fa10ba18000 rw-p 000ef000 08:31 85298692 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libgfortran-ed201abd.so.3.0.0
> 7fa10ba18000-7fa10ba19000 rw-p 00000000 00:00 0
> 7fa10ba19000-7fa10ba21000 rw-p 000f2000 08:31 85298692 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libgfortran-ed201abd.so.3.0.0
> 7fa10ba21000-7fa10d517000 r-xp 00000000 08:31 85298693 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libopenblasp-r0-34a18dc3.3.7.so
> 7fa10d517000-7fa10d716000 ---p 01af6000 08:31 85298693 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libopenblasp-r0-34a18dc3.3.7.so
> 7fa10d716000-7fa10d72f000 rw-p 01af5000 08:31 85298693 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libopenblasp-r0-34a18dc3.3.7.so
> 7fa10d72f000-7fa10d73a000 rw-p 00000000 00:00 0
> 7fa10d73a000-7fa10d7b2000 rw-p 01be1000 08:31 85298693 /opt/conda/lib/python3.6/site-packages/scipy/.libs/libopenblasp-r0-34a18dc3.3.7.so
> 7fa10d7b2000-7fa10d81e000 r-xp 00000000 08:31 85560551 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_fblas.cpython-36m-x86_64-linux-gnu.so
> 7fa10d81e000-7fa10da1e000 ---p 0006c000 08:31 85560551 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_fblas.cpython-36m-x86_64-linux-gnu.so
> 7fa10da1e000-7fa10da44000 rw-p 0006c000 08:31 85560551 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_fblas.cpython-36m-x86_64-linux-gnu.so
> 7fa10da44000-7fa10da48000 rw-p 00093000 08:31 85560551 /opt/conda/lib/python3.6/site-packages/scipy/linalg/_fblas.cpython-36m-x86_64-linux-gnu.so
> 7fa10da48000-7fa10dafa000 r-xp 00000000 08:31 85298866 /opt/conda/lib/python3.6/site-packages/scipy/fft/_pocketfft/pypocketfft.cpython-36m-x86_64-linux-gnu.so
> 7fa10dafa000-7fa10dcfa000 ---p 000b2000 08:31 85298866 /opt/conda/lib/python3.6/site-packages/scipy/fft/_pocketfft/pypocketfft.cpython-36m-x86_64-linux-gnu.so
> 7fa10dcfa000-7fa10dcfc000 rw-p 000b2000 08:31 85298866 /opt/conda/lib/python3.6/site-packages/scipy/fft/_pocketfft/pypocketfft.cpython-36m-x86_64-linux-gnu.so
> 7fa10dcfc000-7fa10dd00000 rw-p 00000000 00:00 0
> 7fa10dd00000-7fa10dd09000 r-xp 00000000 08:31 85298766 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_uarray/_uarray.cpython-36m-x86_64-linux-gnu.so
> 7fa10dd09000-7fa10df09000 ---p 00009000 08:31 85298766 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_uarray/_uarray.cpython-36m-x86_64-linux-gnu.so
> 7fa10df09000-7fa10df0a000 rw-p 00009000 08:31 85298766 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_uarray/_uarray.cpython-36m-x86_64-linux-gnu.so
> 7fa10df0a000-7fa10df19000 r-xp 00000000 08:31 85298749 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_ccallback_c.cpython-36m-x86_64-linux-gnu.so
> 7fa10df19000-7fa10e119000 ---p 0000f000 08:31 85298749 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_ccallback_c.cpython-36m-x86_64-linux-gnu.so
> 7fa10e119000-7fa10e11b000 rw-p 0000f000 08:31 85298749 /opt/conda/lib/python3.6/site-packages/scipy/_lib/_ccallback_c.cpython-36m-x86_64-linux-gnu.so
> 7fa10e11b000-7fa10e3dc000 rw-p 00000000 00:00 0
> 7fa10e3dc000-7fa10e3e1000 r-xp 00000000 08:31 85823089 /opt/conda/lib/python3.6/site-packages/skimage/external/tifffile/_tifffile.cpython-36m-x86_64-linux-gnu.so
> 7fa10e3e1000-7fa10e5e0000 ---p 00005000 08:31 85823089 /opt/conda/lib/python3.6/site-packages/skimage/external/tifffile/_tifffile.cpython-36m-x86_64-linux-gnu.so
> 7fa10e5e0000-7fa10e5e1000 rw-p 00004000 08:31 85823089 /opt/conda/lib/python3.6/site-packages/skimage/external/tifffile/_tifffile.cpython-36m-x86_64-linux-gnu.so
> 7fa10e5e1000-7fa10e5e3000 rw-p 00006000 08:31 85823089 /opt/conda/lib/python3.6/site-packages/skimage/external/tifffile/_tifffile.cpython-36m-x86_64-linux-gnu.so
> 7fa10e5e3000-7fa10e6a3000 rw-p 00000000 00:00 0
> 7fa10e6a3000-7fa10e6b0000 r-xp 00000000 08:31 85822855
> /opt/conda/lib/python3.6/site-
> packages/skimage/.libs/libgomp-3300acd3.so.1.0.0
> 7fa10e6b0000-7fa10e8b0000 ---p 0000d000 08:31 85822855
> /opt/conda/lib/python3.6/site-
> packages/skimage/.libs/libgomp-3300acd3.so.1.0.0
> 7fa10e8b0000-7fa10e8b3000 rw-p 0000d000 08:31 85822855
> /opt/conda/lib/python3.6/site-
> packages/skimage/.libs/libgomp-3300acd3.so.1.0.0
> 7fa10e8b3000-7fa10e8b6000 r-xp 00000000 08:31 85822879
> /opt/conda/lib/python3.6/site-
> packages/skimage/_shared/geometry.cpython-36m-x86_64-linux-gnu.so
> 7fa10e8b6000-7fa10eab6000 ---p 00003000 08:31 85822879
> /opt/conda/lib/python3.6/site-
> packages/skimage/_shared/geometry.cpython-36m-x86_64-linux-gnu.so
> 7fa10eab6000-7fa10eab9000 rw-p 00003000 08:31 85822879
> /opt/conda/lib/python3.6/site-
> packages/skimage/_shared/geometry.cpython-36m-x86_64-linux-gnu.so
> 7fa10eab9000-7fa10eacc000 r-xp 00000000 08:31 65356073
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/testing.cpython-36m-x86_64-linux-gnu.so
> 7fa10eacc000-7fa10eccc000 ---p 00013000 08:31 65356073
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/testing.cpython-36m-x86_64-linux-gnu.so
> 7fa10eccc000-7fa10ecce000 rw-p 00013000 08:31 65356073
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/testing.cpython-36m-x86_64-linux-gnu.so
> 7fa10ecce000-7fa10ed8e000 rw-p 00000000 00:00 0
> 7fa10ed8e000-7fa10eda4000 r-xp 00000000 08:31 66928961
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_unpacker.cpython-36m-x86_64-linux-gnu.so
> 7fa10eda4000-7fa10efa4000 ---p 00016000 08:31 66928961
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_unpacker.cpython-36m-x86_64-linux-gnu.so
> 7fa10efa4000-7fa10efa7000 rw-p 00016000 08:31 66928961
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_unpacker.cpython-36m-x86_64-linux-gnu.so
> 7fa10efa7000-7fa10efb8000 r-xp 00000000 08:31 66928960
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_packer.cpython-36m-x86_64-linux-gnu.so
> 7fa10efb8000-7fa10f1b8000 ---p 00011000 08:31 66928960
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_packer.cpython-36m-x86_64-linux-gnu.so
> 7fa10f1b8000-7fa10f1ba000 rw-p 00011000 08:31 66928960
> /opt/conda/lib/python3.6/site-
> packages/pandas/io/msgpack/_packer.cpython-36m-x86_64-linux-gnu.so
> 7fa10f1ba000-7fa10f1bb000 r-xp 00000000 08:31 85298548
> /opt/conda/lib/python3.6/site-
> packages/pandas/util/_move.cpython-36m-x86_64-linux-gnu.so
> 7fa10f1bb000-7fa10f3bb000 ---p 00001000 08:31 85298548
> /opt/conda/lib/python3.6/site-
> packages/pandas/util/_move.cpython-36m-x86_64-linux-gnu.so
> 7fa10f3bb000-7fa10f3bc000 rw-p 00001000 08:31 85298548
> /opt/conda/lib/python3.6/site-
> packages/pandas/util/_move.cpython-36m-x86_64-linux-gnu.so
> 7fa10f3bc000-7fa10f3ee000 r-xp 00000000 08:31 65356076
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/writers.cpython-36m-x86_64-linux-gnu.so
> 7fa10f3ee000-7fa10f5ee000 ---p 00032000 08:31 65356076
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/writers.cpython-36m-x86_64-linux-gnu.so
> 7fa10f5ee000-7fa10f5f2000 rw-p 00032000 08:31 65356076
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/writers.cpython-36m-x86_64-linux-gnu.so
> 7fa10f5f2000-7fa10f5f3000 rw-p 00000000 00:00 0
> 7fa10f5f3000-7fa10f607000 r-xp 00000000 08:31 65356063
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/json.cpython-36m-x86_64-linux-gnu.so
> 7fa10f607000-7fa10f807000 ---p 00014000 08:31 65356063
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/json.cpython-36m-x86_64-linux-gnu.so
> 7fa10f807000-7fa10f808000 rw-p 00014000 08:31 65356063
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/json.cpython-36m-x86_64-linux-gnu.so
> 7fa10f808000-7fa10f88e000 r-xp 00000000 08:31 6535606
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/parsers.cpython-36m-x86_64-linux-gnu.so
> 7fa10f88e000-7fa10fa8e000 ---p 00086000 08:31 6535606
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/parsers.cpython-36m-x86_64-linux-gnu.so
> 7fa10fa8e000-7fa10fa95000 rw-p 00086000 08:31 6535606
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/parsers.cpython-36m-x86_64-linux-gnu.so
> 7fa10fa95000-7fa10fa98000 rw-p 00000000 00:00 0
> 7fa10fa98000-7fa10faf1000 r-xp 00000000 08:31 65356069
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reduction.cpython-36m-x86_64-linux-gnu.so
> 7fa10faf1000-7fa10fcf1000 ---p 00059000 08:31 65356069
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reduction.cpython-36m-x86_64-linux-gnu.so
> 7fa10fcf1000-7fa10fcf6000 rw-p 00059000 08:31 65356069
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reduction.cpython-36m-x86_64-linux-gnu.so
> 7fa10fcf6000-7fa10fcf8000 rw-p 00000000 00:00 0
> 7fa10fcf8000-7fa10fdf3000 r-xp 00000000 08:31 65356055
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/groupby.cpython-36m-x86_64-linux-gnu.so
> 7fa10fdf3000-7fa10fff3000 ---p 000fb000 08:31 65356055
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/groupby.cpython-36m-x86_64-linux-gnu.so
> 7fa10fff3000-7fa10fffd000 rw-p 000fb000 08:31 65356055
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/groupby.cpython-36m-x86_64-linux-gnu.so
> 7fa10fffd000-7fa110000000 rw-p 00000000 00:00 0
> 7fa110000000-7fa110021000 rw-p 00000000 00:00 0
> 7fa110021000-7fa114000000 ---p 00000000 00:00 0
> 7fa114000000-7fa114001000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114001000-7fa114002000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114002000-7fa114003000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114003000-7fa114004000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114004000-7fa114005000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114005000-7fa114006000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114006000-7fa114007000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114007000-7fa1141c7000 rw-p 00000000 00:00 0
> 7fa1141c7000-7fa1141da000 r-xp 00000000 08:31 65356071
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/skiplist.cpython-36m-x86_64-linux-gnu.so
> 7fa1141da000-7fa1143d9000 ---p 00013000 08:31 65356071
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/skiplist.cpython-36m-x86_64-linux-gnu.so
> 7fa1143d9000-7fa1143db000 rw-p 00012000 08:31 65356071
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/skiplist.cpython-36m-x86_64-linux-gnu.so
> 7fa1143db000-7fa114499000 r-xp 00000000 08:31 65356075
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/window.cpython-36m-x86_64-linux-gnu.so
> 7fa114499000-7fa114698000 ---p 000be000 08:31 65356075
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/window.cpython-36m-x86_64-linux-gnu.so
> 7fa114698000-7fa1146a0000 rw-p 000bd000 08:31 65356075
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/window.cpython-36m-x86_64-linux-gnu.so
> 7fa1146a0000-7fa1147e2000 rw-p 00000000 00:00 0
> 7fa1147e2000-7fa1147e3000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1147e3000-7fa1147e4000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1147e4000-7fa1147e5000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1147e5000-7fa1147e6000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1147e6000-7fa1147e7000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1147e7000-7fa1147e8000 rw-s 00000000 00:15 58 /dev/shm/4DKcfO (deleted)
> 7fa1147e8000-7fa1147e9000 rw-s 00000000 00:15 57 /dev/shm/1bsMfw (deleted)
> 7fa1147e9000-7fa1147ea000 rw-s 00000000 00:15 56 /dev/shm/sfXmge (deleted)
> 7fa1147ea000-7fa1147eb000 rw-s 00000000 00:15 55 /dev/shm/AtAZgW (deleted)
> 7fa1147eb000-7fa1147ec000 rw-s 00000000 00:15 54 /dev/shm/wr57sE (deleted)
> 7fa1147ec000-7fa1147ed000 rw-s 00000000 00:15 53 /dev/shm/QmChFm (deleted)
> 7fa1147ed000-7fa1147ee000 rw-s 00000000 00:15 52 /dev/shm/mvltR4 (deleted)
> 7fa1147ee000-7fa1147ef000 rw-s 00000000 00:15 51 /dev/shm/2sD5eN (deleted)
> 7fa1147ef000-7fa1147f0000 rw-s 00000000 00:15 50 /dev/shm/o6IICv (deleted)
> 7fa1147f0000-7fa1147f1000 rw-s 00000000 00:15 49 /dev/shm/gRZn0d (deleted)
> 7fa1147f1000-7fa1147f2000 rw-s 00000000 00:15 48 /dev/shm/lyUyzW (deleted)
> 7fa1147f2000-7fa1147f3000 rw-s 00000000 00:15 47 /dev/shm/LfOK8E (deleted)
> 7fa1147f3000-7fa1147f4000 rw-s 00000000 00:15 46 /dev/shm/lCXYHn (deleted)
> 7fa1147f4000-7fa1147f5000 rw-s 00000000 00:15 45 /dev/shm/uWMxs6 (deleted)
> 7fa1147f5000-7fa1147f6000 rw-s 00000000 00:15 44 /dev/shm/jds7cP (deleted)
> 7fa1147f6000-7fa1147f7000 rw-s 00000000 00:15 43 /dev/shm/QF6IXx (deleted)
> 7fa1147f7000-7fa1147f8000 rw-s 00000000 00:15 42 /dev/shm/VFKKTg (deleted)
> 7fa1147f8000-7fa1147f9000 rw-s 00000000 00:15 41 /dev/shm/zmfNPZ (deleted)
> 7fa1147f9000-7fa1147fa000 rw-s 00000000 00:15 40 /dev/shm/QIFRLI (deleted)
> 7fa1147fa000-7fa1147fb000 rw-s 00000000 00:15 39 /dev/shm/lZQuTr (deleted)
> 7fa1147fb000-7fa1147fc000 rw-s 00000000 00:15 38 /dev/shm/vAR80a (deleted)
> 7fa1147fc000-7fa1147fd000 rw-s 00000000 00:15 37 /dev/shm/AZUO8T (deleted)
> 7fa1147fd000-7fa1147fe000 rw-s 00000000 00:15 36 /dev/shm/DFEgsD (deleted)
> 7fa1147fe000-7fa1147ff000 rw-s 00000000 00:15 35 /dev/shm/0ZbJLm (deleted)
> 7fa1147ff000-7fa114800000 rw-s 00000000 00:15 34 /dev/shm/zbGd55 (deleted)
> 7fa114800000-7fa114801000 rw-s 00000000 00:15 33 /dev/shm/lt85zP (deleted)
> 7fa114801000-7fa114802000 rw-s 00000000 00:15 32 /dev/shm/VsiZ4y (deleted)
> 7fa114802000-7fa114803000 rw-s 00000000 00:15 31 /dev/shm/jOiUzi (deleted)
> 7fa114803000-7fa114804000 rw-s 00000000 00:15 30 /dev/shm/VGLbg2 (deleted)
> 7fa114804000-7fa114805000 rw-s 00000000 00:15 29 /dev/shm/02auWL (deleted)
> 7fa114805000-7fa114806000 rw-s 00000000 00:15 28 /dev/shm/XzLOCv (deleted)
> 7fa114806000-7fa114807000 rw-s 00000000 00:15 27 /dev/shm/kFmRuf (deleted)
> 7fa114807000-7fa114808000 rw-s 00000000 00:15 26 /dev/shm/XwHUmZ (deleted)
> 7fa114808000-7fa114809000 rw-s 00000000 00:15 25 /dev/shm/Jfc0eJ (deleted)
> 7fa114809000-7fa11480a000 rw-s 00000000 00:15 24 /dev/shm/nqTGit (deleted)
> 7fa11480a000-7fa11480b000 rw-s 00000000 00:15 23 /dev/shm/njpomd (deleted)
> 7fa11480b000-7fa11480c000 rw-s 00000000 00:15 22 /dev/shm/uexcqX (deleted)
> 7fa11480c000-7fa11480d000 rw-s 00000000 00:15 21 /dev/shm/lA9DFH (deleted)
> 7fa11480d000-7fa11480e000 rw-s 00000000 00:15 20 /dev/shm/lQz6Ur (deleted)
> 7fa11480e000-7fa11480f000 rw-s 00000000 00:15 19 /dev/shm/AngBac (deleted)
> 7fa11480f000-7fa114810000 rw-s 00000000 00:15 18 /dev/shm/EtDYCW (deleted)
> 7fa114810000-7fa114811000 rw-s 00000000 00:15 17 /dev/shm/ARVm5G (deleted)
> 7fa114811000-7fa114812000 rw-s 00000000 00:15 16 /dev/shm/2lvNxr (deleted)
> 7fa114812000-7fa114813000 rw-s 00000000 00:15 15 /dev/shm/usDvdc (deleted)
> 7fa114813000-7fa114814000 rw-s 00000000 00:15 14 /dev/shm/4KJeTW (deleted)
> 7fa114814000-7fa114815000 rw-s 00000000 00:15 13 /dev/shm/Sea0yH (deleted)
> 7fa114815000-7fa114816000 rw-s 00000000 00:15 12 /dev/shm/EjHcss (deleted)
> 7fa114816000-7fa114817000 rw-s 00000000 00:15 11 /dev/shm/SaZpld (deleted)
> 7fa114817000-7fa114818000 rw-s 00000000 00:15 10 /dev/shm/0UPDeY (deleted)
> 7fa114818000-7fa114819000 rw-s 00000000 00:15 9 /dev/shm/qkwS7I (deleted)
> 7fa114819000-7fa11481a000 rw-s 00000000 00:15 8 /dev/shm/sdU70t (deleted)
> 7fa11481a000-7fa11481b000 rw-s 00000000 00:15 7 /dev/shm/UjOnUe (deleted)
> 7fa11481b000-7fa11481c000 rw-s 00000000 00:15 6 /dev/shm/MrcENZ (deleted)
> 7fa11481c000-7fa11481d000 rw-s 00000000 00:15 5 /dev/shm/EvrVGK (deleted)
> 7fa11481d000-7fa11481e000 rw-s 00000000 00:15 4 /dev/shm/eeAdAv (deleted)
> 7fa11481e000-7fa11481f000 rw-s 00000000 00:15 3 /dev/shm/cbCwtg (deleted)
> 7fa11481f000-7fa114820000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114820000-7fa114821000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114821000-7fa114822000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa114822000-7fa114a22000 rw-p 00000000 00:00 0
> 7fa114a22000-7fa114a5c000 r-xp 00000000 08:31 58547880
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_image.cpython-36m-x86_64-linux-gnu.so
> 7fa114a5c000-7fa114c5c000 ---p 0003a000 08:31 58547880
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_image.cpython-36m-x86_64-linux-gnu.so
> 7fa114c5c000-7fa114c5d000 rw-p 0003a000 08:31 58547880
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_image.cpython-36m-x86_64-linux-gnu.so
> 7fa114c5d000-7fa114d9e000 rw-p 00000000 00:00 0
> 7fa114d9e000-7fa114da2000 r-xp 00000000 08:31 59728475 /lib/x86_64-linux-
> gnu/libuuid.so.1.3.0
> 7fa114da2000-7fa114fa1000 ---p 00004000 08:31 59728475 /lib/x86_64-linux-
> gnu/libuuid.so.1.3.0
> 7fa114fa1000-7fa114fa2000 r--p 00003000 08:31 59728475 /lib/x86_64-linux-
> gnu/libuuid.so.1.3.0
> 7fa114fa2000-7fa114fa3000 rw-p 00004000 08:31 59728475 /lib/x86_64-linux-
> gnu/libuuid.so.1.3.0
> 7fa114fa3000-7fa115263000 rw-p 00000000 00:00 0
> 7fa115263000-7fa115290000 r-xp 00000000 08:31 58547883
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_path.cpython-36m-x86_64-linux-gnu.so
> 7fa115290000-7fa115490000 ---p 0002d000 08:31 58547883
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_path.cpython-36m-x86_64-linux-gnu.so
> 7fa115490000-7fa115491000 rw-p 0002d000 08:31 58547883
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/_path.cpython-36m-x86_64-linux-gnu.so
> 7fa115491000-7fa115512000 rw-p 00000000 00:00 0
> 7fa115512000-7fa11554e000 r-xp 00000000 08:31 58547802
> /opt/conda/lib/python3.6/site-packages/kiwisolver.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11554e000-7fa11574d000 ---p 0003c000 08:31 58547802
> /opt/conda/lib/python3.6/site-packages/kiwisolver.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11574d000-7fa115750000 rw-p 0003b000 08:31 58547802
> /opt/conda/lib/python3.6/site-packages/kiwisolver.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115750000-7fa115824000 r-xp 00000000 08:31 58548057
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/ft2font.cpython-36m-x86_64-linux-gnu.so
> 7fa115824000-7fa115a24000 ---p 000d4000 08:31 58548057
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/ft2font.cpython-36m-x86_64-linux-gnu.so
> 7fa115a24000-7fa115a2b000 rw-p 000d4000 08:31 58548057
> /opt/conda/lib/python3.6/site-
> packages/matplotlib/ft2font.cpython-36m-x86_64-linux-gnu.so
> 7fa115a2b000-7fa115c6c000 rw-p 00000000 00:00 0
> 7fa115c6c000-7fa115ca9000 r-xp 00000000 08:31 65356070
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reshape.cpython-36m-x86_64-linux-gnu.so
> 7fa115ca9000-7fa115ea9000 ---p 0003d000 08:31 65356070
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reshape.cpython-36m-x86_64-linux-gnu.so
> 7fa115ea9000-7fa115eae000 rw-p 0003d000 08:31 65356070
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/reshape.cpython-36m-x86_64-linux-gnu.so
> 7fa115eae000-7fa115eaf000 rw-p 00000000 00:00 0
> 7fa115eaf000-7fa115eb1000 r--p 00000000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb1000-7fa115eb4000 r-xp 00002000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb4000-7fa115eb5000 r--p 00005000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb5000-7fa115eb6000 ---p 00006000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb6000-7fa115eb7000 r--p 00006000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb7000-7fa115eb8000 rw-p 00007000 08:31 78745085
> /opt/conda/lib/python3.6/lib-dynload/mmap.cpython-36m-x86_64-linux-gnu.so
> 7fa115eb8000-7fa115ef8000 rw-p 00000000 00:00 0
> 7fa115ef8000-7fa115efb000 r--p 00000000 08:31 78745096
> /opt/conda/lib/python3.6/lib-dynload/unicodedata.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115efb000-7fa115f00000 r-xp 00003000 08:31 78745096
> /opt/conda/lib/python3.6/lib-dynload/unicodedata.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115f00000-7fa115fba000 r--p 00008000 08:31 78745096
> /opt/conda/lib/python3.6/lib-dynload/unicodedata.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115fba000-7fa115fbb000 r--p 000c1000 08:31 78745096
> /opt/conda/lib/python3.6/lib-dynload/unicodedata.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115fbb000-7fa115fd6000 rw-p 000c2000 08:31 78745096
> /opt/conda/lib/python3.6/lib-dynload/unicodedata.cpython-36m-x86_64-linux-
> gnu.so
> 7fa115fd6000-7fa116016000 rw-p 00000000 00:00 0
> 7fa116016000-7fa116056000 r-xp 00000000 08:31 65356060
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/internals.cpython-36m-x86_64-linux-gnu.so
> 7fa116056000-7fa116256000 ---p 00040000 08:31 65356060
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/internals.cpython-36m-x86_64-linux-gnu.so
> 7fa116256000-7fa11625b000 rw-p 00040000 08:31 65356060
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/internals.cpython-36m-x86_64-linux-gnu.so
> 7fa11625b000-7fa11629c000 rw-p 00000000 00:00 0
> 7fa11629c000-7fa1162a5000 r-xp 00000000 08:31 65356059
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/indexing.cpython-36m-x86_64-linux-gnu.so
> 7fa1162a5000-7fa1164a4000 ---p 00009000 08:31 65356059
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/indexing.cpython-36m-x86_64-linux-gnu.so
> 7fa1164a4000-7fa1164a6000 rw-p 00008000 08:31 65356059
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/indexing.cpython-36m-x86_64-linux-gnu.so
> 7fa1164a6000-7fa116626000 rw-p 00000000 00:00 0
> 7fa116626000-7fa116700000 r-xp 00000000 08:31 65356072
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/sparse.cpython-36m-x86_64-linux-gnu.so
> 7fa116700000-7fa116900000 ---p 000da00 08:31 65356072
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/sparse.cpython-36m-x86_64-linux-gnu.so
> 7fa116900000-7fa116907000 rw-p 000da00 08:31 65356072
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/sparse.cpython-36m-x86_64-linux-gnu.so
> 7fa116907000-7fa1169c9000 rw-p 00000000 00:00 0
> 7fa1169c9000-7fa116c57000 r-xp 00000000 08:31 65356062
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/join.cpython-36m-x86_64-linux-gnu.so
> 7fa116c57000-7fa116e57000 ---p 0028e000 08:31 65356062
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/join.cpython-36m-x86_64-linux-gnu.so
> 7fa116e57000-7fa116e60000 rw-p 0028e000 08:31 65356062
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/join.cpython-36m-x86_64-linux-gnu.so
> 7fa116e60000-7fa116e66000 rw-p 00000000 00:00 0
> 7fa116e66000-7fa116f00000 r-xp 00000000 08:31 65356058
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/index.cpython-36m-x86_64-linux-gnu.so
> 7fa116f00000-7fa1170ff000 ---p 0009a000 08:31 65356058
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/index.cpython-36m-x86_64-linux-gnu.so
> 7fa1170ff000-7fa117108000 rw-p 00099000 08:31 65356058
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/index.cpython-36m-x86_64-linux-gnu.so
> 7fa117108000-7fa11724b000 rw-p 00000000 00:00 0
> 7fa11724b000-7fa11727f000 r-xp 00000000 08:31 65356066
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/ops.cpython-36m-x86_64-linux-gnu.so
> 7fa11727f000-7fa11747f000 ---p 00034000 08:31 65356066
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/ops.cpython-36m-x86_64-linux-gnu.so
> 7fa11747f000-7fa117483000 rw-p 00034000 08:31 65356066
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/ops.cpython-36m-x86_64-linux-gnu.so
> 7fa117483000-7fa1174c4000 rw-p 00000000 00:00 0
> 7fa1174c4000-7fa1174ee000 r-xp 00000000 08:31 65356056
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashing.cpython-36m-x86_64-linux-gnu.so
> 7fa1174ee000-7fa1176ed000 ---p 0002a000 08:31 65356056
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashing.cpython-36m-x86_64-linux-gnu.so
> 7fa1176ed000-7fa1176f0000 rw-p 00029000 08:31 65356056
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashing.cpython-36m-x86_64-linux-gnu.so
> 7fa1176f0000-7fa1176f1000 rw-p 00000000 00:00 0
> 7fa1176f1000-7fa1176fe000 r-xp 00000000 08:31 65356068
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/properties.cpython-36m-x86_64-linux-gnu.so
> 7fa1176fe000-7fa1178fe000 ---p 0000d000 08:31 65356068
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/properties.cpython-36m-x86_64-linux-gnu.so
> 7fa1178fe000-7fa117900000 rw-p 0000d000 08:31 65356068
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/properties.cpython-36m-x86_64-linux-gnu.so
> 7fa117900000-7fa117940000 rw-p 00000000 00:00 0
> 7fa117940000-7fa117b71000 r-xp 00000000 08:31 65356061
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/interval.cpython-36m-x86_64-linux-gnu.so
> 7fa117b71000-7fa117d71000 ---p 00231000 08:31 65356061
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/interval.cpython-36m-x86_64-linux-gnu.so
> 7fa117d71000-7fa117d83000 rw-p 00231000 08:31 65356061
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/interval.cpython-36m-x86_64-linux-gnu.so
> 7fa117d83000-7fa117dc7000 rw-p 00000000 00:00 0
> 7fa117dc7000-7fa117f71000 r-xp 00000000 08:31 65356054
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/algos.cpython-36m-x86_64-linux-gnu.so
> 7fa117f71000-7fa118170000 ---p 001aa000 08:31 65356054
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/algos.cpython-36m-x86_64-linux-gnu.so
> 7fa118170000-7fa11817c000 rw-p 001a9000 08:31 65356054
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/algos.cpython-36m-x86_64-linux-gnu.so
> 7fa11817c000-7fa118182000 rw-p 00000000 00:00 0
> 7fa118182000-7fa1181d0000 r-xp 00000000 08:31 65356074
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslib.cpython-36m-x86_64-linux-gnu.so
> 7fa1181d0000-7fa1183cf000 ---p 0004e000 08:31 65356074
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslib.cpython-36m-x86_64-linux-gnu.so
> 7fa1183cf000-7fa1183d5000 rw-p 0004d000 08:31 65356074
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslib.cpython-36m-x86_64-linux-gnu.so
> 7fa1183d5000-7fa118416000 rw-p 00000000 00:00 0
> 7fa118416000-7fa1184a2000 r-xp 00000000 08:31 65356064
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/lib.cpython-36m-x86_64-linux-gnu.so
> 7fa1184a2000-7fa1186a1000 ---p 0008c00 08:31 65356064
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/lib.cpython-36m-x86_64-linux-gnu.so
> 7fa1186a1000-7fa1186ae000 rw-p 0008b000 08:31 65356064
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/lib.cpython-36m-x86_64-linux-gnu.so
> 7fa1186ae000-7fa1186b1000 rw-p 00000000 00:00 0
> 7fa1186b1000-7fa1186c3000 r-xp 00000000 08:31 65356065
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/missing.cpython-36m-x86_64-linux-gnu.so
> 7fa1186c3000-7fa1188c2000 ---p 00012000 08:31 65356065
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/missing.cpython-36m-x86_64-linux-gnu.so
> 7fa1188c2000-7fa1188c4000 rw-p 00011000 08:31 65356065
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/missing.cpython-36m-x86_64-linux-gnu.so
> 7fa1188c4000-7fa1188c5000 rw-p 00000000 00:00 0
> 7fa1188c5000-7fa118954000 r-xp 00000000 08:31 65356057
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashtable.cpython-36m-x86_64-linux-gnu.so
> 7fa118954000-7fa118b53000 ---p 0008f000 08:31 65356057
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashtable.cpython-36m-x86_64-linux-gnu.so
> 7fa118b53000-7fa118b5f000 rw-p 0008e000 08:31 65356057
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/hashtable.cpython-36m-x86_64-linux-gnu.so
> 7fa118b5f000-7fa118b61000 rw-p 00000000 00:00 0
> 7fa118b61000-7fa118b9e000 r-xp 00000000 08:31 65618804
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/resolution.cpython-36m-x86_64-linux-gnu.so
> 7fa118b9e000-7fa118d9e000 ---p 0003d000 08:31 65618804
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/resolution.cpython-36m-x86_64-linux-gnu.so
> 7fa118d9e000-7fa118da3000 rw-p 0003d000 08:31 65618804
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/resolution.cpython-36m-x86_64-linux-gnu.so
> 7fa118da3000-7fa118da4000 rw-p 00000000 00:00 0
> 7fa118da4000-7fa118dee000 r-xp 00000000 08:31 65618807
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timestamps.cpython-36m-x86_64-linux-gnu.so
> 7fa118dee000-7fa118fee000 ---p 0004a000 08:31 65618807
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timestamps.cpython-36m-x86_64-linux-gnu.so
> 7fa118fee000-7fa118ff6000 rw-p 0004a000 08:31 65618807
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timestamps.cpython-36m-x86_64-linux-gnu.so
> 7fa118ff6000-7fa118ff8000 rw-p 00000000 00:00 0
> 7fa118ff8000-7fa119018000 r-xp 00000000 08:31 65618798
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/frequencies.cpython-36m-x86_64-linux-gnu.so
> 7fa119018000-7fa119217000 ---p 00020000 08:31 65618798
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/frequencies.cpython-36m-x86_64-linux-gnu.so
> 7fa119217000-7fa11921a000 rw-p 0001f000 08:31 65618798
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/frequencies.cpython-36m-x86_64-linux-gnu.so
> 7fa11921a000-7fa11921b000 rw-p 00000000 00:00 0
> 7fa11921b000-7fa11928a000 r-xp 00000000 08:31 65618803
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/period.cpython-36m-x86_64-linux-gnu.so
> 7fa11928a000-7fa119489000 ---p 0006f000 08:31 65618803
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/period.cpython-36m-x86_64-linux-gnu.so
> 7fa119489000-7fa119492000 rw-p 0006e000 08:31 65618803
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/period.cpython-36m-x86_64-linux-gnu.so
> 7fa119492000-7fa1194d4000 rw-p 00000000 00:00 0
> 7fa1194d4000-7fa11953c000 r-xp 00000000 08:31 65618802
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/parsing.cpython-36m-x86_64-linux-gnu.so
> 7fa11953c000-7fa11973c000 ---p 00068000 08:31 65618802
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/parsing.cpython-36m-x86_64-linux-gnu.so
> 7fa11973c000-7fa119744000 rw-p 00068000 08:31 65618802
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/parsing.cpython-36m-x86_64-linux-gnu.so
> 7fa119744000-7fa119746000 rw-p 00000000 00:00 0
> 7fa119746000-7fa119786000 r-xp 00000000 08:31 65618797
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/fields.cpython-36m-x86_64-linux-gnu.so
> 7fa119786000-7fa119985000 ---p 00040000 08:31 65618797
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/fields.cpython-36m-x86_64-linux-gnu.so
> 7fa119985000-7fa119989000 rw-p 0003f000 08:31 65618797
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/fields.cpython-36m-x86_64-linux-gnu.so
> 7fa119989000-7fa11998b000 rw-p 00000000 00:00 0
> 7fa11998b000-7fa1199f4000 r-xp 00000000 08:31 65618805
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/strptime.cpython-36m-x86_64-linux-gnu.so
> 7fa1199f4000-7fa119bf3000 ---p 00069000 08:31 65618805
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/strptime.cpython-36m-x86_64-linux-gnu.so
> 7fa119bf3000-7fa119bfb000 rw-p 00068000 08:31 65618805
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/strptime.cpython-36m-x86_64-linux-gnu.so
> 7fa119bfb000-7fa119c3d000 rw-p 00000000 00:00 0
> 7fa119c3d000-7fa119c4a000 r-xp 00000000 08:31 65618795
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/ccalendar.cpython-36m-x86_64-linux-gnu.so
> 7fa119c4a000-7fa119e49000 ---p 0000d000 08:31 65618795
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/ccalendar.cpython-36m-x86_64-linux-gnu.so
> 7fa119e49000-7fa119e4c000 rw-p 0000c000 08:31 65618795
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/ccalendar.cpython-36m-x86_64-linux-gnu.so
> 7fa119e4c000-7fa119eb2000 r-xp 00000000 08:31 65618801
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/offsets.cpython-36m-x86_64-linux-gnu.so
> 7fa119eb2000-7fa11a0b1000 ---p 00066000 08:31 65618801
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/offsets.cpython-36m-x86_64-linux-gnu.so
> 7fa11a0b1000-7fa11a0ba000 rw-p 0006500 08:31 65618801
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/offsets.cpython-36m-x86_64-linux-gnu.so
> 7fa11a0ba000-7fa11a0bd000 rw-p 00000000 00:00 0
> 7fa11a0bd000-7fa11a133000 r-xp 00000000 08:31 65618806
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timedeltas.cpython-36m-x86_64-linux-gnu.so
> 7fa11a133000-7fa11a333000 ---p 00076000 08:31 65618806
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timedeltas.cpython-36m-x86_64-linux-gnu.so
> 7fa11a333000-7fa11a33b000 rw-p 00076000 08:31 65618806
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timedeltas.cpython-36m-x86_64-linux-gnu.so
> 7fa11a33b000-7fa11a33e000 rw-p 00000000 00:00 0
> 7fa11a33e000-7fa11a38c000 r-xp 00000000 08:31 65618809
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/tzconversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11a38c000-7fa11a58b000 ---p 0004e000 08:31 65618809
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/tzconversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11a58b000-7fa11a590000 rw-p 0004d000 08:31 65618809
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/tzconversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11a590000-7fa11a5d1000 rw-p 00000000 00:00 0
> 7fa11a5d1000-7fa11a609000 r-xp 00000000 08:31 65618808
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timezones.cpython-36m-x86_64-linux-gnu.so
> 7fa11a609000-7fa11a809000 ---p 00038000 08:31 65618808
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timezones.cpython-36m-x86_64-linux-gnu.so
> 7fa11a809000-7fa11a80d000 rw-p 00038000 08:31 65618808
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/timezones.cpython-36m-x86_64-linux-gnu.so
> 7fa11a80d000-7fa11a80e000 rw-p 00000000 00:00 0
> 7fa11a80e000-7fa11a819000 r-xp 00000000 08:31 65618800
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/np_datetime.cpython-36m-x86_64-linux-gnu.so
> 7fa11a819000-7fa11aa19000 ---p 0000b000 08:31 65618800
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/np_datetime.cpython-36m-x86_64-linux-gnu.so
> 7fa11aa19000-7fa11aa1a000 rw-p 0000b000 08:31 65618800
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/np_datetime.cpython-36m-x86_64-linux-gnu.so
> 7fa11aa1a000-7fa11aa45000 r-xp 00000000 08:31 65618799
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/nattype.cpython-36m-x86_64-linux-gnu.so
> 7fa11aa45000-7fa11ac45000 ---p 0002b000 08:31 65618799
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/nattype.cpython-36m-x86_64-linux-gnu.so
> 7fa11ac45000-7fa11ac4a000 rw-p 0002b000 08:31 65618799
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/nattype.cpython-36m-x86_64-linux-gnu.so
> 7fa11ac4a000-7fa11ac4b000 rw-p 00000000 00:00 0
> 7fa11ac4b000-7fa11ac8c000 r-xp 00000000 08:31 65618794
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/c_timestamp.cpython-36m-x86_64-linux-gnu.so
> 7fa11ac8c000-7fa11ae8b000 ---p 00041000 08:31 65618794
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/c_timestamp.cpython-36m-x86_64-linux-gnu.so
> 7fa11ae8b000-7fa11ae90000 rw-p 00040000 08:31 65618794
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/c_timestamp.cpython-36m-x86_64-linux-gnu.so
> 7fa11ae90000-7fa11ae91000 rw-p 00000000 00:00 0
> 7fa11ae91000-7fa11aed9000 r-xp 00000000 08:31 6561879
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/conversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11aed9000-7fa11b0d9000 ---p 00048000 08:31 6561879
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/conversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11b0d9000-7fa11b0de000 rw-p 00048000 08:31 6561879
> /opt/conda/lib/python3.6/site-
> packages/pandas/_libs/tslibs/conversion.cpython-36m-x86_64-linux-gnu.so
> 7fa11b0de000-7fa11b11f000 rw-p 00000000 00:00 0
> 7fa11b11f000-7fa11b166000 r--p 00000000 08:31 84118908
> /opt/conda/lib/python3.6/site-
> packages/torchvision/_C.cpython-36m-x86_64-linux-gnu.so
> 7fa11b166000-7fa11b1c7000 r-xp 00047000 08:31 84118908
> /opt/conda/lib/python3.6/site-
> packages/torchvision/_C.cpython-36m-x86_64-linux-gnu.so
> 7fa11b1c7000-7fa11b2c7000 r--p 000a8000 08:31 84118908
> /opt/conda/lib/python3.6/site-
> packages/torchvision/_C.cpython-36m-x86_64-linux-gnu.so
> 7fa11b2c7000-7fa11b2c8000 r--p 001a7000 08:31 84118908
> /opt/conda/lib/python3.6/site-
> packages/torchvision/_C.cpython-36m-x86_64-linux-gnu.so
> 7fa11b2c8000-7fa11b2cc000 rw-p 001a8000 08:31 84118908
> /opt/conda/lib/python3.6/site-
> packages/torchvision/_C.cpython-36m-x86_64-linux-gnu.so
> 7fa11b2cc000-7fa11b30d000 rw-p 00000000 00:00 0
> 7fa11b30d000-7fa11b30f000 r--p 00000000 08:31 78745055
> /opt/conda/lib/python3.6/lib-dynload/_json.cpython-36m-x86_64-linux-gnu.so
> 7fa11b30f000-7fa11b31d000 r-xp 00002000 08:31 78745055
> /opt/conda/lib/python3.6/lib-dynload/_json.cpython-36m-x86_64-linux-gnu.so
> 7fa11b31d000-7fa11b31f000 r--p 00010000 08:31 78745055
> /opt/conda/lib/python3.6/lib-dynload/_json.cpython-36m-x86_64-linux-gnu.so
> 7fa11b31f000-7fa11b320000 r--p 00011000 08:31 78745055
> /opt/conda/lib/python3.6/lib-dynload/_json.cpython-36m-x86_64-linux-gnu.so
> 7fa11b320000-7fa11b321000 rw-p 00012000 08:31 78745055
> /opt/conda/lib/python3.6/lib-dynload/_json.cpython-36m-x86_64-linux-gnu.so
> 7fa11b321000-7fa11b328000 r--p 00000000 08:31 78745089
> /opt/conda/lib/python3.6/lib-dynload/pyexpat.cpython-36m-x86_64-linux-gnu.so
> 7fa11b328000-7fa11b354000 r-xp 00007000 08:31 78745089
> /opt/conda/lib/python3.6/lib-dynload/pyexpat.cpython-36m-x86_64-linux-gnu.so
> 7fa11b354000-7fa11b35f000 r--p 00033000 08:31 78745089
> /opt/conda/lib/python3.6/lib-dynload/pyexpat.cpython-36m-x86_64-linux-gnu.so
> 7fa11b35f000-7fa11b362000 r--p 0003d000 08:31 78745089
> /opt/conda/lib/python3.6/lib-dynload/pyexpat.cpython-36m-x86_64-linux-gnu.so
> 7fa11b362000-7fa11b364000 rw-p 00040000 08:31 78745089
> /opt/conda/lib/python3.6/lib-dynload/pyexpat.cpython-36m-x86_64-linux-gnu.so
> 7fa11b364000-7fa11b368000 r--p 00000000 08:31 78745052
> /opt/conda/lib/python3.6/lib-dynload/_elementtree.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11b368000-7fa11b371000 r-xp 00004000 08:31 78745052
> /opt/conda/lib/python3.6/lib-dynload/_elementtree.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11b371000-7fa11b374000 r--p 0000d000 08:31 78745052
> /opt/conda/lib/python3.6/lib-dynload/_elementtree.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11b374000-7fa11b375000 r--p 0000f000 08:31 78745052
> /opt/conda/lib/python3.6/lib-dynload/_elementtree.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11b375000-7fa11b377000 rw-p 00010000 08:31 78745052
> /opt/conda/lib/python3.6/lib-dynload/_elementtree.cpython-36m-x86_64-linux-
> gnu.so
> 7fa11b377000-7fa11b4b7000 rw-p 00000000 00:00 0
> 7fa11b4b7000-7fa11b4c1000 r--p 00000000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b4c1000-7fa11b54d000 r-xp 0000a000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b54d000-7fa11b559000 r--p 00096000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b559000-7fa11b55a000 ---p 000a2000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b55a000-7fa11b55b000 r--p 000a2000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b55b000-7fa11b55c000 rw-p 000a3000 08:31 78482777
> /opt/conda/lib/libzstd.so.1.3.7
> 7fa11b55c000-7fa11b566000 r--p 00000000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b566000-7fa11b5a9000 r-xp 0000a000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b5a9000-7fa11b5d4000 r--p 0004d000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b5d4000-7fa11b5d5000 ---p 00078000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b5d5000-7fa11b5d9000 r--p 00078000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b5d9000-7fa11b5da000 rw-p 0007c000 08:31 80973581
> /opt/conda/lib/libtiff.so.5.4.0
> 7fa11b5da000-7fa11b615000 r-xp 00000000 08:31 80973514
> /opt/conda/lib/libjpeg.so.9.2.0
> 7fa11b615000-7fa11b814000 ---p 0003b000 08:31 80973514
> /opt/conda/lib/libjpeg.so.9.2.0
> 7fa11b814000-7fa11b815000 r--p 0003a000 08:31 80973514
> /opt/conda/lib/libjpeg.so.9.2.0
> 7fa11b815000-7fa11b816000 rw-p 0003b000 08:31 80973514
> /opt/conda/lib/libjpeg.so.9.2.0
> 7fa11b816000-7fa11b828000 r--p 00000000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b828000-7fa11b876000 r-xp 00012000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b876000-7fa11b884000 r--p 00060000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b884000-7fa11b885000 ---p 0006e000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b885000-7fa11b889000 r--p 0006e000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b889000-7fa11b88c000 rw-p 00072000 08:31 82677504
> /opt/conda/lib/python3.6/site-
> packages/PIL/_imaging.cpython-36m-x86_64-linux-gnu.so
> 7fa11b88c000-7fa11b94d000 rw-p 00000000 00:00 0
> 7fa11b94d000-7fa11b94e000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa11b94e000-7fa11bb8e000 rw-p 00000000 00:00 0
> 7fa11bb8e000-7fa11bbad000 r--p 00000000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bbad000-7fa11bbf7000 r-xp 0001f000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bbf7000-7fa11bc10000 r--p 00069000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bc10000-7fa11bc11000 ---p 00082000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bc11000-7fa11bc1a000 r--p 00082000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bc1a000-7fa11bc1e000 rw-p 0008b000 08:31 78482740
> /opt/conda/lib/libssl.so.1.1
> 7fa11bc1e000-7fa11bc28000 r--p 00000000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc28000-7fa11bc33000 r-xp 0000a000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc33000-7fa11bc39000 r--p 00015000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc39000-7fa11bc3a000 ---p 0001b000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc3a000-7fa11bc3b000 r--p 0001b000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc3b000-7fa11bc40000 rw-p 0001c000 08:31 78745071
> /opt/conda/lib/python3.6/lib-dynload/_ssl.cpython-36m-x86_64-linux-gnu.so
> 7fa11bc40000-7fa11bd80000 rw-p 00000000 00:00 0
> 7fa11bd80000-7fa11bd82000 r--p 00000000 08:31 78745060
> /opt/conda/lib/python3.6/lib-
> dynload/_multiprocessing.cpython-36m-x86_64-linux-gnu.so
> 7fa11bd82000-7fa11bd83000 r-xp 00002000 08:31 78745060
> /opt/conda/lib/python3.6/lib-
> dynload/_multiprocessing.cpython-36m-x86_64-linux-gnu.so
> 7fa11bd83000-7fa11bd84000 r--p 00003000 08:31 78745060
> /opt/conda/lib/python3.6/lib-
> dynload/_multiprocessing.cpython-36m-x86_64-linux-gnu.so
> 7fa11bd84000-7fa11bd85000 r--p 00003000 08:31 78745060
> /opt/conda/lib/python3.6/lib-
> dynload/_multiprocessing.cpython-36m-x86_64-linux-gnu.so
> 7fa11bd85000-7fa11bd86000 rw-p 00004000 08:31 78745060
> /opt/conda/lib/python3.6/lib-
> dynload/_multiprocessing.cpython-36m-x86_64-linux-gnu.so
> 7fa11bd86000-7fa11c0c6000 rw-p 00000000 00:00 0
> 7fa11c0c6000-7fa11c0ca000 r--p 00000000 08:31 78745078
> /opt/conda/lib/python3.6/lib-dynload/array.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0ca000-7fa11c0d2000 r-xp 00004000 08:31 78745078
> /opt/conda/lib/python3.6/lib-dynload/array.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0d2000-7fa11c0d5000 r--p 0000c000 08:31 78745078
> /opt/conda/lib/python3.6/lib-dynload/array.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0d5000-7fa11c0d6000 r--p 0000e000 08:31 78745078
> /opt/conda/lib/python3.6/lib-dynload/array.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0d6000-7fa11c0d9000 rw-p 0000f000 08:31 78745078
> /opt/conda/lib/python3.6/lib-dynload/array.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0d9000-7fa11c0de000 r--p 00000000 08:31 78745069
> /opt/conda/lib/python3.6/lib-dynload/_socket.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0de000-7fa11c0ec000 r-xp 00005000 08:31 78745069
> /opt/conda/lib/python3.6/lib-dynload/_socket.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0ec000-7fa11c0f1000 r--p 00013000 08:31 78745069
> /opt/conda/lib/python3.6/lib-dynload/_socket.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0f1000-7fa11c0f2000 r--p 00017000 08:31 78745069
> /opt/conda/lib/python3.6/lib-dynload/_socket.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0f2000-7fa11c0f7000 rw-p 00018000 08:31 78745069
> /opt/conda/lib/python3.6/lib-dynload/_socket.cpython-36m-x86_64-linux-gnu.so
> 7fa11c0f7000-7fa11c447000 rw-p 00000000 00:00 0
> 7fa11c447000-7fa11e31b000 r-xp 00000000 08:31 80973485
> /opt/conda/lib/libcublasLt.so.10.2.0.168
> 7fa11e31b000-7fa11e51b000 ---p 01ed4000 08:31 80973485
> /opt/conda/lib/libcublasLt.so.10.2.0.168
> 7fa11e51b000-7fa11e58f000 rw-p 01ed4000 08:31 80973485
> /opt/conda/lib/libcublasLt.so.10.2.0.168
> 7fa11e58f000-7fa11e598000 rw-p 00000000 00:00 0
> 7fa11e598000-7fa11e59a000 rw-p 01f48000 08:31 80973485
> /opt/conda/lib/libcublasLt.so.10.2.0.168
> 7fa11e59a000-7fa1259f5000 r-xp 00000000 08:31 80973503
> /opt/conda/lib/libcusparse.so.10.1.168
> 7fa1259f5000-7fa125bf5000 ---p 0745b000 08:31 80973503
> /opt/conda/lib/libcusparse.so.10.1.168
> 7fa125bf5000-7fa125c04000 rw-p 0745b000 08:31 80973503
> /opt/conda/lib/libcusparse.so.10.1.168
> 7fa125c04000-7fa125c0b000 rw-p 00000000 00:00 0
> 7fa125c0b000-7fa125c11000 rw-p 0746b000 08:31 80973503
> /opt/conda/lib/libcusparse.so.10.1.168
> 7fa125c11000-7fa125c1b000 r--p 00000000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c1b000-7fa125c32000 r-xp 0000a000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c32000-7fa125c3b000 r--p 00021000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c3b000-7fa125c3c000 ---p 0002a000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c3c000-7fa125c3d000 r--p 0002a000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c3d000-7fa125c3e000 rw-p 0002b000 08:31 78482634
> /opt/conda/lib/libgomp.so.1.0.0
> 7fa125c3e000-7fa12979f000 r-xp 00000000 08:31 80973482
> /opt/conda/lib/libcublas.so.10.2.0.168
> 7fa12979f000-7fa12999e000 ---p 03b61000 08:31 80973482
> /opt/conda/lib/libcublas.so.10.2.0.168
> 7fa12999e000-7fa1299ac000 rw-p 03b60000 08:31 80973482
> /opt/conda/lib/libcublas.so.10.2.0.168
> 7fa1299ac000-7fa1299b6000 rw-p 00000000 00:00 0
> 7fa1299b6000-7fa1299b9000 rw-p 03b6f000 08:31 80973482
> /opt/conda/lib/libcublas.so.10.2.0.168
> 7fa1299b9000-7fa12bef5000 r-xp 00000000 08:31 80973497
> /opt/conda/lib/libcurand.so.10.1.168
> 7fa12bef5000-7fa12c0f4000 ---p 0253c000 08:31 80973497
> /opt/conda/lib/libcurand.so.10.1.168
> 7fa12c0f4000-7fa12d4c4000 rw-p 0253b000 08:31 80973497
> /opt/conda/lib/libcurand.so.10.1.168
> 7fa12d4c4000-7fa12da1a000 rw-p 00000000 00:00 0
> 7fa12da1a000-7fa12da1b000 rw-p 0390b000 08:31 80973497
> /opt/conda/lib/libcurand.so.10.1.168
> 7fa12da1b000-7fa135dd3000 r-xp 00000000 08:31 80973491
> /opt/conda/lib/libcufft.so.10.1.168
> 7fa135dd3000-7fa135fd3000 ---p 083b8000 08:31 80973491
> /opt/conda/lib/libcufft.so.10.1.168
> 7fa135fd3000-7fa135fe5000 rw-p 083b8000 08:31 80973491
> /opt/conda/lib/libcufft.so.10.1.168
> 7fa135fe5000-7fa13605d000 rw-p 00000000 00:00 0
> 7fa13605d000-7fa13605f000 rw-p 083ca000 08:31 80973491
> /opt/conda/lib/libcufft.so.10.1.168
> 7fa13605f000-7fa1360d6000 r-xp 00000000 08:31 80973488
> /opt/conda/lib/libcudart.so.10.1.168
> 7fa1360d6000-7fa1362d6000 ---p 00077000 08:31 80973488
> /opt/conda/lib/libcudart.so.10.1.168
> 7fa1362d6000-7fa1362da000 rw-p 00077000 08:31 80973488
> /opt/conda/lib/libcudart.so.10.1.168
> 7fa1362da000-7fa1362db000 rw-p 00000000 00:00 0
> 7fa1362db000-7fa1362de000 rw-p 0007c000 08:31 80973488
> /opt/conda/lib/libcudart.so.10.1.168
> 7fa1362de000-7fa1362e6000 r-xp 00000000 08:31 80973553
> /opt/conda/lib/libnvToolsExt.so.1.0.0
> 7fa1362e6000-7fa1364e6000 ---p 00008000 08:31 80973553
> /opt/conda/lib/libnvToolsExt.so.1.0.0
> 7fa1364e6000-7fa1364e7000 rw-p 00008000 08:31 80973553
> /opt/conda/lib/libnvToolsExt.so.1.0.0
> 7fa1364e7000-7fa1364e8000 rw-p 0000a000 08:31 80973553
> /opt/conda/lib/libnvToolsExt.so.1.0.0
> 7fa1364e8000-7fa136531000 r-xp 00000000 08:31 83987905
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10.so
> 7fa136531000-7fa136731000 ---p 00049000 08:31 83987905
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10.so
> 7fa136731000-7fa136732000 r--p 00049000 08:31 83987905
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10.so
> 7fa136732000-7fa136733000 rw-p 0004a000 08:31 83987905
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10.so
> 7fa136733000-7fa136734000 rw-p 00000000 00:00 0
> 7fa136734000-7fa13673e000 rw-p 00063000 08:31 83987905
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10.so
> 7fa13673e000-7fa136764000 r-xp 00000000 08:31 83987906
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10_cuda.so
> 7fa136764000-7fa136963000 ---p 00026000 08:31 83987906
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10_cuda.so
> 7fa136963000-7fa136964000 r--p 00025000 08:31 83987906
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10_cuda.so
> 7fa136964000-7fa136969000 rw-p 00026000 08:31 83987906
> /opt/conda/lib/python3.6/site-packages/torch/lib/libc10_cuda.so
> 7fa136969000-7fa137fa1000 r-xp 00000000 08:31 78482684
> /opt/conda/lib/libmkl_gnu_thread.so
> 7fa137fa1000-7fa1381a0000 ---p 01638000 08:31 78482684
> /opt/conda/lib/libmkl_gnu_thread.so
> 7fa1381a0000-7fa1381a4000 r--p 01637000 08:31 78482684
> /opt/conda/lib/libmkl_gnu_thread.so
> 7fa1381a4000-7fa1381bc000 rw-p 0163b000 08:31 78482684
> /opt/conda/lib/libmkl_gnu_thread.so
> 7fa1381bc000-7fa169a60000 r-xp 00000000 08:31 83987912
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch.so
> 7fa169a60000-7fa169c5f000 ---p 318a4000 08:31 83987912
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch.so
> 7fa169c5f000-7fa169da3000 r--p 318a3000 08:31 83987912
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch.so
> 7fa169da3000-7fa169de4000 rw-p 319e7000 08:31 83987912
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch.so
> 7fa169de4000-7fa169f33000 rw-p 00000000 00:00 0
> 7fa169f33000-7fa169fd5000 r--p 00000000 08:31 78482743
> /opt/conda/lib/libstdc++.so.6.0.26
> 7fa169fd5000-7fa16a054000 r-xp 000a2000 08:31 78482743
> /opt/conda/lib/libstdc++.so.6.0.26
> 7fa16a054000-7fa16a095000 r--p 00121000 08:31 78482743
> /opt/conda/lib/libstdc++.so.6.0.26
> 7fa16a095000-7fa16a0a0000 r--p 00161000 08:31 78482743
> /opt/conda/lib/libstdc++.so.6.0.26
> 7fa16a0a0000-7fa16a0a4000 rw-p `0016c00` 08:31 78482743
> /opt/conda/lib/libstdc++.so.6.0.26
> 7fa16a0a4000-7fa16a0a7000 rw-p 00000000 00:00 0
> 7fa16a0a7000-7fa16ab20000 r-xp 00000000 08:31 83987913
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch_python.so
> 7fa16ab20000-7fa16ad20000 ---p 00a79000 08:31 83987913
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch_python.so
> 7fa16ad20000-7fa16ad31000 r--p 00a79000 08:31 83987913
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch_python.so
> 7fa16ad31000-7fa16ad4d000 rw-p 00a8a000 08:31 83987913
> /opt/conda/lib/python3.6/site-packages/torch/lib/libtorch_python.so
> 7fa16ad4d000-7fa16ad9f000 rw-p 00000000 00:00 0
> 7fa16ad9f000-7fa16ada7000 r-xp 00000000 08:31 83987911
> /opt/conda/lib/python3.6/site-packages/torch/lib/libshm.so
> 7fa16ada7000-7fa16afa7000 ---p 00008000 08:31 83987911
> /opt/conda/lib/python3.6/site-packages/torch/lib/libshm.so
> 7fa16afa7000-7fa16afa8000 r--p 00008000 08:31 83987911
> /opt/conda/lib/python3.6/site-packages/torch/lib/libshm.so
> 7fa16afa8000-7fa16afa9000 rw-p 00009000 08:31 83987911
> /opt/conda/lib/python3.6/site-packages/torch/lib/libshm.so
> 7fa16afa9000-7fa16afaa000 r--p 00000000 08:31 83594736
> /opt/conda/lib/python3.6/site-packages/torch/_C.cpython-36m-x86_64-linux-
> gnu.so
> 7fa16afaa000-7fa16afab000 r-xp 00001000 08:31 83594736
> /opt/conda/lib/python3.6/site-packages/torch/_C.cpython-36m-x86_64-linux-
> gnu.so
> 7fa16afab000-7fa16afac000 r--p 00002000 08:31 83594736
> /opt/conda/lib/python3.6/site-packages/torch/_C.cpython-36m-x86_64-linux-
> gnu.so
> 7fa16afac000-7fa16afad000 r--p 00002000 08:31 83594736
> /opt/conda/lib/python3.6/site-packages/torch/_C.cpython-36m-x86_64-linux-
> gnu.so
> 7fa16afad000-7fa16afae000 rw-p 00003000 08:31 83594736
> /opt/conda/lib/python3.6/site-packages/torch/_C.cpython-36m-x86_64-linux-
> gnu.so
> 7fa16afae000-7fa16bb38000 r-xp 00000000 08:31 78482697
> /opt/conda/lib/libmkl_vml_avx2.so
> 7fa16bb38000-7fa16bd37000 ---p 00b8a000 08:31 78482697
> /opt/conda/lib/libmkl_vml_avx2.so
> 7fa16bd37000-7fa16bd3a000 r--p 00b89000 08:31 78482697
> /opt/conda/lib/libmkl_vml_avx2.so
> 7fa16bd3a000-7fa16bd4f000 rw-p 00b8c000 08:31 78482697
> /opt/conda/lib/libmkl_vml_avx2.so
> 7fa16bd4f000-7fa16bd50000 rw-p 00000000 00:00 0
> 7fa16bd50000-7fa16f229000 r-xp 00000000 08:31 78482670
> /opt/conda/lib/libmkl_avx2.so
> 7fa16f229000-7fa16f429000 ---p 034d9000 08:31 78482670
> /opt/conda/lib/libmkl_avx2.so
> 7fa16f429000-7fa16f430000 r--p 034d9000 08:31 78482670
> /opt/conda/lib/libmkl_avx2.so
> 7fa16f430000-7fa16f43e000 rw-p 034e0000 08:31 78482670
> /opt/conda/lib/libmkl_avx2.so
> 7fa16f43e000-7fa16f445000 rw-p 00000000 00:00 0
> 7fa16f445000-7fa16fda4000 r-xp 00000000 08:31 78482686
> /opt/conda/lib/libmkl_intel_lp64.so
> 7fa16fda4000-7fa16ffa4000 ---p 0095f000 08:31 78482686
> /opt/conda/lib/libmkl_intel_lp64.so
> 7fa16ffa4000-7fa16ffa5000 r--p 0095f000 08:31 78482686
> /opt/conda/lib/libmkl_intel_lp64.so
> 7fa16ffa5000-7fa16ffb8000 rw-p 00960000 08:31 78482686
> /opt/conda/lib/libmkl_intel_lp64.so
> 7fa16ffb8000-7fa16ffbd000 rw-p 00000000 00:00 0
> 7fa16ffbd000-7fa172026000 r-xp 00000000 08:31 78482687
> /opt/conda/lib/libmkl_intel_thread.so
> 7fa172026000-7fa172226000 ---p 02069000 08:31 78482687
> /opt/conda/lib/libmkl_intel_thread.so
> 7fa172226000-7fa17222a000 r--p 02069000 08:31 78482687
> /opt/conda/lib/libmkl_intel_thread.so
> 7fa17222a000-7fa172490000 rw-p 0206d000 08:31 78482687
> /opt/conda/lib/libmkl_intel_thread.so
> 7fa172490000-7fa172499000 rw-p 00000000 00:00 0
> 7fa172499000-7fa176513000 r-xp 00000000 08:31 78482680
> /opt/conda/lib/libmkl_core.so
> 7fa176513000-7fa176712000 ---p 0407a000 08:31 78482680
> /opt/conda/lib/libmkl_core.so
> 7fa176712000-7fa176719000 r--p 04079000 08:31 78482680
> /opt/conda/lib/libmkl_core.so
> 7fa176719000-7fa176747000 rw-p 04080000 08:31 78482680
> /opt/conda/lib/libmkl_core.so
> 7fa176747000-7fa1768ae000 rw-p 00000000 00:00 0
> 7fa1768ae000-7fa1768bb000 r--p 00000000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa1768bb000-7fa176914000 r-xp 0000d000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176914000-7fa176940000 r--p 00066000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176940000-7fa176941000 ---p 00092000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176941000-7fa176942000 r--p 00092000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176942000-7fa176964000 rw-p 00093000 08:31 79793411
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176964000-7fa176966000 rw-p 00000000 00:00 0
> 7fa176966000-7fa17696a000 r--p 00000000 08:31 79793418
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/sfc64.cpython-36m-x86_64-linux-gnu.so
> 7fa17696a000-7fa176971000 r-xp 00004000 08:31 79793418
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/sfc64.cpython-36m-x86_64-linux-gnu.so
> 7fa176971000-7fa176974000 r--p 0000b000 08:31 79793418
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/sfc64.cpython-36m-x86_64-linux-gnu.so
> 7fa176974000-7fa176975000 r--p 0000d000 08:31 79793418
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/sfc64.cpython-36m-x86_64-linux-gnu.so
> 7fa176975000-7fa176976000 rw-p 0000e000 08:31 79793418
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/sfc64.cpython-36m-x86_64-linux-gnu.so
> 7fa176976000-7fa17697a000 r--p 00000000 08:31 79793415
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/pcg64.cpython-36m-x86_64-linux-gnu.so
> 7fa17697a000-7fa176984000 r-xp 00004000 08:31 79793415
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/pcg64.cpython-36m-x86_64-linux-gnu.so
> 7fa176984000-7fa176987000 r--p 0000e000 08:31 79793415
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/pcg64.cpython-36m-x86_64-linux-gnu.so
> 7fa176987000-7fa176988000 r--p 00010000 08:31 79793415
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/pcg64.cpython-36m-x86_64-linux-gnu.so
> 7fa176988000-7fa17698a000 rw-p 00011000 08:31 79793415
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/pcg64.cpython-36m-x86_64-linux-gnu.so
> 7fa17698a000-7fa17698e000 r--p 00000000 08:31 79793416
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/philox.cpython-36m-x86_64-linux-gnu.so
> 7fa17698e000-7fa17699b000 r-xp 00004000 08:31 79793416
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/philox.cpython-36m-x86_64-linux-gnu.so
> 7fa17699b000-7fa17699f000 r--p 00011000 08:31 79793416
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/philox.cpython-36m-x86_64-linux-gnu.so
> 7fa17699f000-7fa1769a0000 r--p 00014000 08:31 79793416
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/philox.cpython-36m-x86_64-linux-gnu.so
> 7fa1769a0000-7fa1769a2000 rw-p 00015000 08:31 79793416
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/philox.cpython-36m-x86_64-linux-gnu.so
> 7fa1769a2000-7fa1769a9000 r--p 00000000 08:31 79793410
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/entropy.cpython-36m-x86_64-linux-gnu.so
> 7fa1769a9000-7fa1769c9000 r-xp 00007000 08:31 79793410
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/entropy.cpython-36m-x86_64-linux-gnu.so
> 7fa1769c9000-7fa1769d0000 r--p 00027000 08:31 79793410
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/entropy.cpython-36m-x86_64-linux-gnu.so
> 7fa1769d0000-7fa1769d1000 r--p 0002d000 08:31 79793410
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/entropy.cpython-36m-x86_64-linux-gnu.so
> 7fa1769d1000-7fa1769d4000 rw-p 0002e000 08:31 79793410
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/entropy.cpython-36m-x86_64-linux-gnu.so
> 7fa1769d4000-7fa1769d5000 rw-p 00000000 00:00 0
> 7fa1769d5000-7fa1769d7000 r--p 00000000 08:31 78745064
> /opt/conda/lib/python3.6/lib-dynload/_random.cpython-36m-x86_64-linux-gnu.so
> 7fa1769d7000-7fa1769d9000 r-xp 00002000 08:31 78745064
> /opt/conda/lib/python3.6/lib-dynload/_random.cpython-36m-x86_64-linux-gnu.so
> 7fa1769d9000-7fa1769da000 r--p 00004000 08:31 78745064
> /opt/conda/lib/python3.6/lib-dynload/_random.cpython-36m-x86_64-linux-gnu.so
> 7fa1769da000-7fa1769db000 r--p 00004000 08:31 78745064
> /opt/conda/lib/python3.6/lib-dynload/_random.cpython-36m-x86_64-linux-gnu.so
> 7fa1769db000-7fa1769dc000 rw-p 00005000 08:31 78745064
> /opt/conda/lib/python3.6/lib-dynload/_random.cpython-36m-x86_64-linux-gnu.so
> 7fa1769dc000-7fa1769dd000 r--p 00000000 08:31 78745035
> /opt/conda/lib/python3.6/lib-dynload/_bisect.cpython-36m-x86_64-linux-gnu.so
> 7fa1769dd000-7fa1769de000 r-xp 00001000 08:31 78745035
> /opt/conda/lib/python3.6/lib-dynload/_bisect.cpython-36m-x86_64-linux-gnu.so
> 7fa1769de000-7fa1769df000 r--p 00002000 08:31 78745035
> /opt/conda/lib/python3.6/lib-dynload/_bisect.cpython-36m-x86_64-linux-gnu.so
> 7fa1769df000-7fa1769e0000 r--p 00002000 08:31 78745035
> /opt/conda/lib/python3.6/lib-dynload/_bisect.cpython-36m-x86_64-linux-gnu.so
> 7fa1769e0000-7fa1769e1000 rw-p 00003000 08:31 78745035
> /opt/conda/lib/python3.6/lib-dynload/_bisect.cpython-36m-x86_64-linux-gnu.so
> 7fa1769e1000-7fa1769e4000 r--p 00000000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769e4000-7fa1769f7000 r-xp 00003000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769f7000-7fa1769f8000 r--p 00016000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769f8000-7fa1769f9000 ---p 00017000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769f9000-7fa1769fa000 r--p 00017000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769fa000-7fa1769fc000 rw-p 00018000 08:31 78745067
> /opt/conda/lib/python3.6/lib-dynload/_sha3.cpython-36m-x86_64-linux-gnu.so
> 7fa1769fc000-7fa1769fe000 r--p 00000000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa1769fe000-7fa176a08000 r-xp 00002000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa176a08000-7fa176a09000 r--p 0000c000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa176a09000-7fa176a0a000 ---p 0000d000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa176a0a000-7fa176a0b000 r--p 0000d000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa176a0b000-7fa176a0c000 rw-p 0000e000 08:31 78745036
> /opt/conda/lib/python3.6/lib-dynload/_blake2.cpython-36m-x86_64-linux-gnu.so
> 7fa176a0c000-7fa176a87000 r--p 00000000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176a87000-7fa176bfd000 r-xp 0007b000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176bfd000-7fa176c88000 r--p 001f1000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176c88000-7fa176c89000 ---p 0027c000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176c89000-7fa176cb4000 r--p 0027c000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176cb4000-7fa176cb6000 rw-p 002a7000 08:31 78482609
> /opt/conda/lib/libcrypto.so.1.1
> 7fa176cb6000-7fa176cba000 rw-p 00000000 00:00 0
> 7fa176cba000-7fa176cbc000 r--p 00000000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cbc000-7fa176cbf000 r-xp 00002000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cbf000-7fa176cc0000 r--p 00005000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc0000-7fa176cc1000 ---p 00006000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc1000-7fa176cc2000 r--p 00006000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc2000-7fa176cc3000 rw-p 00007000 08:31 78745053
> /opt/conda/lib/python3.6/lib-dynload/_hashlib.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc3000-7fa176cc5000 r--p 00000000 08:31 78745080
> /opt/conda/lib/python3.6/lib-dynload/binascii.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc5000-7fa176cc8000 r-xp 00002000 08:31 78745080
> /opt/conda/lib/python3.6/lib-dynload/binascii.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cc8000-7fa176cca000 r--p 00005000 08:31 78745080
> /opt/conda/lib/python3.6/lib-dynload/binascii.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176cca000-7fa176ccb000 r--p 00006000 08:31 78745080
> /opt/conda/lib/python3.6/lib-dynload/binascii.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176ccb000-7fa176ccc000 rw-p 00007000 08:31 78745080
> /opt/conda/lib/python3.6/lib-dynload/binascii.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176ccc000-7fa176cd3000 r--p 00000000 08:31 79793405
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bit_generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176cd3000-7fa176ced000 r-xp 00007000 08:31 79793405
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bit_generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176ced000-7fa176cf5000 r--p 00021000 08:31 79793405
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bit_generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176cf5000-7fa176cf6000 r--p 00028000 08:31 79793405
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bit_generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176cf6000-7fa176cfb000 rw-p 00029000 08:31 79793405
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bit_generator.cpython-36m-x86_64-linux-gnu.so
> 7fa176cfb000-7fa176cff000 r--p 00000000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176cff000-7fa176d0f000 r-xp 00004000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176d0f000-7fa176d17000 r--p 00014000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176d17000-7fa176d18000 ---p 0001c000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176d18000-7fa176d19000 r--p 0001c000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176d19000-7fa176d1b000 rw-p 0001d000 08:31 79793413
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mt19937.cpython-36m-x86_64-linux-gnu.so
> 7fa176d1b000-7fa176d20000 r--p 00000000 08:31 79793407
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bounded_integers.cpython-36m-x86_64-linux-gnu.so
> 7fa176d20000-7fa176d6c000 r-xp 00005000 08:31 79793407
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bounded_integers.cpython-36m-x86_64-linux-gnu.so
> 7fa176d6c000-7fa176d75000 r--p 00051000 08:31 79793407
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bounded_integers.cpython-36m-x86_64-linux-gnu.so
> 7fa176d75000-7fa176d76000 r--p 00059000 08:31 79793407
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bounded_integers.cpython-36m-x86_64-linux-gnu.so
> 7fa176d76000-7fa176d77000 rw-p 0005a000 08:31 79793407
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/bounded_integers.cpython-36m-x86_64-linux-gnu.so
> 7fa176d77000-7fa176db8000 rw-p 00000000 00:00 0
> 7fa176db8000-7fa176dbc000 r--p 00000000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176dbc000-7fa176ded000 r-xp 00004000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176ded000-7fa176df0000 r--p 00035000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176df0000-7fa176df1000 ---p 00038000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176df1000-7fa176df2000 r--p 00038000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176df2000-7fa176df4000 rw-p 00039000 08:31 79793408
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/common.cpython-36m-x86_64-linux-gnu.so
> 7fa176df4000-7fa176e00000 r--p 00000000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e00000-7fa176e48000 r-xp 0000c000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e48000-7fa176e71000 r--p 00054000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e71000-7fa176e72000 ---p 0007d000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e72000-7fa176e73000 r--p 0007d000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e73000-7fa176e96000 rw-p 0007e000 08:31 79793414
> /opt/conda/lib/python3.6/site-
> packages/numpy/random/mtrand.cpython-36m-x86_64-linux-gnu.so
> 7fa176e96000-7fa176ed8000 rw-p 00000000 00:00 0
> 7fa176ed8000-7fa176ee0000 r--p 00000000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176ee0000-7fa176f24000 r-xp 00008000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176f24000-7fa176f2a000 r--p 0004c000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176f2a000-7fa176f2b000 ---p 00052000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176f2b000-7fa176f2c000 r--p 00052000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176f2c000-7fa176f30000 rw-p 00053000 08:31 79531601
> /opt/conda/lib/python3.6/site-
> packages/mkl_fft/_pydfti.cpython-36m-x86_64-linux-gnu.so
> 7fa176f30000-7fa176f31000 rw-p 00000000 00:00 0
> 7fa176f31000-7fa176f32000 r--p 00000000 08:31 79662705
> /opt/conda/lib/python3.6/site-
> packages/numpy/fft/pocketfft_internal.cpython-36m-x86_64-linux-gnu.so
> 7fa176f32000-7fa176f44000 r-xp 00001000 08:31 79662705
> /opt/conda/lib/python3.6/site-
> packages/numpy/fft/pocketfft_internal.cpython-36m-x86_64-linux-gnu.so
> 7fa176f44000-7fa176f46000 r--p 00013000 08:31 79662705
> /opt/conda/lib/python3.6/site-
> packages/numpy/fft/pocketfft_internal.cpython-36m-x86_64-linux-gnu.so
> 7fa176f46000-7fa176f47000 r--p 00014000 08:31 79662705
> /opt/conda/lib/python3.6/site-
> packages/numpy/fft/pocketfft_internal.cpython-36m-x86_64-linux-gnu.so
> 7fa176f47000-7fa176f48000 rw-p 00015000 08:31 79662705
> /opt/conda/lib/python3.6/site-
> packages/numpy/fft/pocketfft_internal.cpython-36m-x86_64-linux-gnu.so
> 7fa176f48000-7fa176f88000 rw-p 00000000 00:00 0
> 7fa176f88000-7fa176f8f000 r--p 00000000 08:31 78745051
> /opt/conda/lib/python3.6/lib-dynload/_decimal.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176f8f000-7fa176fc3000 r-xp 00007000 08:31 78745051
> /opt/conda/lib/python3.6/lib-dynload/_decimal.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176fc3000-7fa176fcd000 r--p 0003b000 08:31 78745051
> /opt/conda/lib/python3.6/lib-dynload/_decimal.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176fcd000-7fa176fce000 r--p 00044000 08:31 78745051
> /opt/conda/lib/python3.6/lib-dynload/_decimal.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176fce000-7fa176fd6000 rw-p 00045000 08:31 78745051
> /opt/conda/lib/python3.6/lib-dynload/_decimal.cpython-36m-x86_64-linux-
> gnu.so
> 7fa176fd6000-7fa176fd8000 r--p 00000000 08:31 78745083
> /opt/conda/lib/python3.6/lib-dynload/grp.cpython-36m-x86_64-linux-gnu.so
> 7fa176fd8000-7fa176fd9000 r-xp 00002000 08:31 78745083
> /opt/conda/lib/python3.6/lib-dynload/grp.cpython-36m-x86_64-linux-gnu.so
> 7fa176fd9000-7fa176fda000 r--p 00003000 08:31 78745083
> /opt/conda/lib/python3.6/lib-dynload/grp.cpython-36m-x86_64-linux-gnu.so
> 7fa176fda000-7fa176fdb000 r--p 00003000 08:31 78745083
> /opt/conda/lib/python3.6/lib-dynload/grp.cpython-36m-x86_64-linux-gnu.so
> 7fa176fdb000-7fa176fdc000 rw-p 00004000 08:31 78745083
> /opt/conda/lib/python3.6/lib-dynload/grp.cpython-36m-x86_64-linux-gnu.so
> 7fa176fdc000-7fa177001000 r-xp 00000000 08:31 7848265
> /opt/conda/lib/liblzma.so.5.2.4
> 7fa177001000-7fa177200000 ---p 00025000 08:31 7848265
> /opt/conda/lib/liblzma.so.5.2.4
> 7fa177200000-7fa177201000 r--p 00024000 08:31 7848265
> /opt/conda/lib/liblzma.so.5.2.4
> 7fa177201000-7fa177202000 rw-p 00025000 08:31 7848265
> /opt/conda/lib/liblzma.so.5.2.4
> 7fa177202000-7fa177205000 r--p 00000000 08:31 78745057
> /opt/conda/lib/python3.6/lib-dynload/_lzma.cpython-36m-x86_64-linux-gnu.so
> 7fa177205000-7fa177209000 r-xp 00003000 08:31 78745057
> /opt/conda/lib/python3.6/lib-dynload/_lzma.cpython-36m-x86_64-linux-gnu.so
> 7fa177209000-7fa17720b000 r--p 00007000 08:31 78745057
> /opt/conda/lib/python3.6/lib-dynload/_lzma.cpython-36m-x86_64-linux-gnu.so
> 7fa17720b000-7fa17720c000 r--p 00008000 08:31 78745057
> /opt/conda/lib/python3.6/lib-dynload/_lzma.cpython-36m-x86_64-linux-gnu.so
> 7fa17720c000-7fa17720e000 rw-p 00009000 08:31 78745057
> /opt/conda/lib/python3.6/lib-dynload/_lzma.cpython-36m-x86_64-linux-gnu.so
> 7fa17720e000-7fa177211000 r--p 00000000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177211000-7fa177220000 r-xp 00003000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177220000-7fa177222000 r--p 00012000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177222000-7fa177223000 ---p 00014000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177223000-7fa177224000 r--p 00014000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177224000-7fa177226000 rw-p 00015000 08:31 78745037
> /opt/conda/lib/python3.6/lib-dynload/_bz2.cpython-36m-x86_64-linux-gnu.so
> 7fa177226000-7fa177229000 r--p 00000000 08:31 78482774
> /opt/conda/lib/libz.so.1.2.11
> 7fa177229000-7fa17723d000 r-xp 00003000 08:31 78482774
> /opt/conda/lib/libz.so.1.2.11
> 7fa17723d000-7fa177244000 r--p 00017000 08:31 78482774
> /opt/conda/lib/libz.so.1.2.11
> 7fa177244000-7fa177245000 r--p 0001d000 08:31 78482774
> /opt/conda/lib/libz.so.1.2.11
> 7fa177245000-7fa177246000 rw-p 0001e000 08:31 78482774
> /opt/conda/lib/libz.so.1.2.11
> 7fa177246000-7fa177248000 r--p 00000000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa177248000-7fa17724c000 r-xp 00002000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa17724c000-7fa17724d000 r--p 00006000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa17724d000-7fa17724e000 ---p 00007000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa17724e000-7fa17724f000 r--p 00007000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa17724f000-7fa177251000 rw-p 00008000 08:31 78745098
> /opt/conda/lib/python3.6/lib-dynload/zlib.cpython-36m-x86_64-linux-gnu.so
> 7fa177251000-7fa177311000 rw-p 00000000 00:00 0
> 7fa177311000-7fa177319000 r--p 00000000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa177319000-7fa177334000 r-xp 00008000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa177334000-7fa177339000 r--p 00023000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa177339000-7fa17733a000 ---p 00028000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa17733a000-7fa17733b000 r--p 00028000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa17733b000-7fa17733c000 rw-p 00029000 08:31 79793272
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/_umath_linalg.cpython-36m-x86_64-linux-gnu.so
> 7fa17733c000-7fa17733d000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa17733d000-7fa17733e000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa17733e000-7fa17733f000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa17733f000-7fa177340000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa177340000-7fa177341000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa177341000-7fa177342000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa177342000-7fa177344000 r--p 00000000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa177344000-7fa177348000 r-xp 00002000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa177348000-7fa177349000 r--p 00006000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa177349000-7fa17734a000 ---p 00007000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa17734a000-7fa17734b000 r--p 00007000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa17734b000-7fa17734d000 rw-p 00008000 08:31 78745045
> /opt/conda/lib/python3.6/lib-dynload/_csv.cpython-36m-x86_64-linux-gnu.so
> 7fa17734d000-7fa17740d000 rw-p 00000000 00:00 0
> 7fa17740d000-7fa177416000 r--p 00000000 08:31 79662234
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_tests.cpython-36m-x86_64-linux-gnu.so
> 7fa177416000-7fa177428000 r-xp 00009000 08:31 79662234
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_tests.cpython-36m-x86_64-linux-gnu.so
> 7fa177428000-7fa17742d000 r--p 0001b000 08:31 79662234
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_tests.cpython-36m-x86_64-linux-gnu.so
> 7fa17742d000-7fa17742e000 r--p 0001f000 08:31 79662234
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_tests.cpython-36m-x86_64-linux-gnu.so
> 7fa17742e000-7fa17742f000 rw-p 00020000 08:31 79662234
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_tests.cpython-36m-x86_64-linux-gnu.so
> 7fa17742f000-7fa17752f000 rw-p 00000000 00:00 0
> 7fa17752f000-7fa177534000 r--p 00000000 08:31 78745062
> /opt/conda/lib/python3.6/lib-dynload/_pickle.cpython-36m-x86_64-linux-gnu.so
> 7fa177534000-7fa177547000 r-xp 00005000 08:31 78745062
> /opt/conda/lib/python3.6/lib-dynload/_pickle.cpython-36m-x86_64-linux-gnu.so
> 7fa177547000-7fa17754b000 r--p 00018000 08:31 78745062
> /opt/conda/lib/python3.6/lib-dynload/_pickle.cpython-36m-x86_64-linux-gnu.so
> 7fa17754b000-7fa17754c000 r--p 0001b000 08:31 78745062
> /opt/conda/lib/python3.6/lib-dynload/_pickle.cpython-36m-x86_64-linux-gnu.so
> 7fa17754c000-7fa17754f000 rw-p 0001c000 08:31 78745062
> /opt/conda/lib/python3.6/lib-dynload/_pickle.cpython-36m-x86_64-linux-gnu.so
> 7fa17754f000-7fa1775cf000 rw-p 00000000 00:00 0
> 7fa1775cf000-7fa1775fa000 r--p 00000000 08:31 79662235
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_umath.cpython-36m-x86_64-linux-gnu.so
> 7fa1775fa000-7fa177883000 r-xp 0002b000 08:31 79662235
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_umath.cpython-36m-x86_64-linux-gnu.so
> 7fa177883000-7fa17791a000 r--p 002b4000 08:31 79662235
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_umath.cpython-36m-x86_64-linux-gnu.so
> 7fa17791a000-7fa17791e000 r--p 0034a000 08:31 79662235
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_umath.cpython-36m-x86_64-linux-gnu.so
> 7fa17791e000-7fa17793b000 rw-p 0034e000 08:31 79662235
> /opt/conda/lib/python3.6/site-
> packages/numpy/core/_multiarray_umath.cpython-36m-x86_64-linux-gnu.so
> 7fa17793b000-7fa17799b000 rw-p 00000000 00:00 0
> 7fa17799b000-7fa1779a1000 r--p 00000000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779a1000-7fa1779b6000 r-xp 00006000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779b6000-7fa1779b9000 r--p 0001b000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779b9000-7fa1779ba000 ---p 0001e000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779ba000-7fa1779bb000 r--p 0001e000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779bb000-7fa1779be000 rw-p 0001f000 08:31 79531589
> /opt/conda/lib/python3.6/site-
> packages/mkl/_py_mkl_service.cpython-36m-x86_64-linux-gnu.so
> 7fa1779be000-7fa1779bf000 rw-p 00000000 00:00 0
> 7fa1779bf000-7fa1779c2000 r--p 00000000 08:31 78482628
> /opt/conda/lib/libgcc_s.so.1
> 7fa1779c2000-7fa1779ce000 r-xp 00003000 08:31 78482628
> /opt/conda/lib/libgcc_s.so.1
> 7fa1779ce000-7fa1779d1000 r--p 0000f000 08:31 78482628
> /opt/conda/lib/libgcc_s.so.1
> 7fa1779d1000-7fa1779d2000 r--p 00011000 08:31 78482628
> /opt/conda/lib/libgcc_s.so.1
> 7fa1779d2000-7fa1779d3000 rw-p 00012000 08:31 78482628
> /opt/conda/lib/libgcc_s.so.1
> 7fa1779d3000-7fa177b83000 r-xp 00000000 08:31 78482639
> /opt/conda/lib/libiomp5.so
> 7fa177b83000-7fa177d82000 ---p 001b0000 08:31 78482639
> /opt/conda/lib/libiomp5.so
> 7fa177d82000-7fa177d85000 r--p 001af000 08:31 78482639
> /opt/conda/lib/libiomp5.so
> 7fa177d85000-7fa177d8f000 rw-p 001b2000 08:31 78482639
> /opt/conda/lib/libiomp5.so
> 7fa177d8f000-7fa177dbd000 rw-p 00000000 00:00 0
> 7fa177dbd000-7fa178288000 r-xp 00000000 08:31 78482691
> /opt/conda/lib/libmkl_rt.so
> 7fa178288000-7fa178488000 ---p 004cb000 08:31 78482691
> /opt/conda/lib/libmkl_rt.so
> 7fa178488000-7fa17848e000 r--p 004cb000 08:31 78482691
> /opt/conda/lib/libmkl_rt.so
> 7fa17848e000-7fa178490000 rw-p 004d1000 08:31 78482691
> /opt/conda/lib/libmkl_rt.so
> 7fa178490000-7fa1784a4000 rw-p 00000000 00:00 0
> 7fa1784a4000-7fa1784ab000 r-xp 00000000 08:31 78482618
> /opt/conda/lib/libffi.so.6.0.4
> 7fa1784ab000-7fa1786ab000 ---p 00007000 08:31 78482618
> /opt/conda/lib/libffi.so.6.0.4
> 7fa1786ab000-7fa1786ac000 r--p 00007000 08:31 78482618
> /opt/conda/lib/libffi.so.6.0.4
> 7fa1786ac000-7fa1786ad000 rw-p 00008000 08:31 78482618
> /opt/conda/lib/libffi.so.6.0.4
> 7fa1786ad000-7fa1786b5000 r--p 00000000 08:31 78745046
> /opt/conda/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so
> 7fa1786b5000-7fa1786c6000 r-xp 00008000 08:31 78745046
> /opt/conda/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so
> 7fa1786c6000-7fa1786cd000 r--p 00019000 08:31 78745046
> /opt/conda/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so
> 7fa1786cd000-7fa1786ce000 r--p 0001f000 08:31 78745046
> /opt/conda/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so
> 7fa1786ce000-7fa1786d2000 rw-p 00020000 08:31 78745046
> /opt/conda/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so
> 7fa1786d2000-7fa1787d2000 rw-p 00000000 00:00 0
> 7fa1787d2000-7fa1787d4000 r--p 00000000 08:31 78745092
> /opt/conda/lib/python3.6/lib-dynload/select.cpython-36m-x86_64-linux-gnu.so
> 7fa1787d4000-7fa1787d8000 r-xp 00002000 08:31 78745092
> /opt/conda/lib/python3.6/lib-dynload/select.cpython-36m-x86_64-linux-gnu.so
> 7fa1787d8000-7fa1787d9000 r--p 00006000 08:31 78745092
> /opt/conda/lib/python3.6/lib-dynload/select.cpython-36m-x86_64-linux-gnu.so
> 7fa1787d9000-7fa1787da000 r--p 00006000 08:31 78745092
> /opt/conda/lib/python3.6/lib-dynload/select.cpython-36m-x86_64-linux-gnu.so
> 7fa1787da000-7fa1787dc000 rw-p 00007000 08:31 78745092
> /opt/conda/lib/python3.6/lib-dynload/select.cpython-36m-x86_64-linux-gnu.so
> 7fa1787dc000-7fa1787e1000 r--p 00000000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787e1000-7fa1787f1000 r-xp 00005000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787f1000-7fa1787f6000 r--p 00015000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787f6000-7fa1787f7000 ---p 0001a000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787f7000-7fa1787f8000 r--p 0001a000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787f8000-7fa1787fa000 rw-p 0001b000 08:31 78745050
> /opt/conda/lib/python3.6/lib-dynload/_datetime.cpython-36m-x86_64-linux-
> gnu.so
> 7fa1787fa000-7fa17883a000 rw-p 00000000 00:00 0
> 7fa17883a000-7fa17883b000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa17883b000-7fa17883c000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa17883c000-7fa17883e000 r--p 00000000 08:31 79793274
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/lapack_lite.cpython-36m-x86_64-linux-gnu.so
> 7fa17883e000-7fa178840000 r-xp 00002000 08:31 79793274
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/lapack_lite.cpython-36m-x86_64-linux-gnu.so
> 7fa178840000-7fa178841000 r--p 00004000 08:31 79793274
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/lapack_lite.cpython-36m-x86_64-linux-gnu.so
> 7fa178841000-7fa178842000 r--p 00004000 08:31 79793274
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/lapack_lite.cpython-36m-x86_64-linux-gnu.so
> 7fa178842000-7fa178843000 rw-p 00005000 08:31 79793274
> /opt/conda/lib/python3.6/site-
> packages/numpy/linalg/lapack_lite.cpython-36m-x86_64-linux-gnu.so
> 7fa178843000-7fa178983000 rw-p 00000000 00:00 0
> 7fa178983000-7fa178a8b000 r-xp 00000000 08:31 59728410 /lib/x86_64-linux-
> gnu/libm-2.23.so
> 7fa178a8b000-7fa178c8a000 ---p 00108000 08:31 59728410 /lib/x86_64-linux-
> gnu/libm-2.23.so
> 7fa178c8a000-7fa178c8b000 r--p 00107000 08:31 59728410 /lib/x86_64-linux-
> gnu/libm-2.23.so
> 7fa178c8b000-7fa178c8c000 rw-p 00108000 08:31 59728410 /lib/x86_64-linux-
> gnu/libm-2.23.so
> 7fa178c8c000-7fa178c93000 r-xp 00000000 08:31 59728452 /lib/x86_64-linux-
> gnu/librt-2.23.so
> 7fa178c93000-7fa178e92000 ---p 00007000 08:31 59728452 /lib/x86_64-linux-
> gnu/librt-2.23.so
> 7fa178e92000-7fa178e93000 r--p 00006000 08:31 59728452 /lib/x86_64-linux-
> gnu/librt-2.23.so
> 7fa178e93000-7fa178e94000 rw-p 00007000 08:31 59728452 /lib/x86_64-linux-
> gnu/librt-2.23.so
> 7fa178e94000-7fa178e96000 r-xp 00000000 08:31 59728472 /lib/x86_64-linux-
> gnu/libutil-2.23.so
> 7fa178e96000-7fa179095000 ---p 00002000 08:31 59728472 /lib/x86_64-linux-
> gnu/libutil-2.23.so
> 7fa179095000-7fa179096000 r--p 00001000 08:31 59728472 /lib/x86_64-linux-
> gnu/libutil-2.23.so
> 7fa179096000-7fa179097000 rw-p 00002000 08:31 59728472 /lib/x86_64-linux-
> gnu/libutil-2.23.so
> 7fa179097000-7fa17909a000 r-xp 00000000 08:31 59728391 /lib/x86_64-linux-
> gnu/libdl-2.23.so
> 7fa17909a000-7fa179299000 ---p 00003000 08:31 59728391 /lib/x86_64-linux-
> gnu/libdl-2.23.so
> 7fa179299000-7fa17929a000 r--p 00002000 08:31 59728391 /lib/x86_64-linux-
> gnu/libdl-2.23.so
> 7fa17929a000-7fa17929b000 rw-p 00003000 08:31 59728391 /lib/x86_64-linux-
> gnu/libdl-2.23.so
> 7fa17929b000-7fa17945b000 r-xp 00000000 08:31 59728378 /lib/x86_64-linux-
> gnu/libc-2.23.so
> 7fa17945b000-7fa17965b000 ---p 001c0000 08:31 59728378 /lib/x86_64-linux-
> gnu/libc-2.23.so
> 7fa17965b000-7fa17965f000 r--p 001c0000 08:31 59728378 /lib/x86_64-linux-
> gnu/libc-2.23.so
> 7fa17965f000-7fa179661000 rw-p 001c4000 08:31 59728378 /lib/x86_64-linux-
> gnu/libc-2.23.so
> 7fa179661000-7fa179665000 rw-p 00000000 00:00 0
> 7fa179665000-7fa17967d000 r-xp 00000000 08:31 59728446 /lib/x86_64-linux-
> gnu/libpthread-2.23.so
> 7fa17967d000-7fa17987c000 ---p 00018000 08:31 59728446 /lib/x86_64-linux-
> gnu/libpthread-2.23.so
> 7fa17987c000-7fa17987d000 r--p 00017000 08:31 59728446 /lib/x86_64-linux-
> gnu/libpthread-2.23.so
> 7fa17987d000-7fa17987e000 rw-p 00018000 08:31 59728446 /lib/x86_64-linux-
> gnu/libpthread-2.23.so
> 7fa17987e000-7fa179882000 rw-p 00000000 00:00 0
> 7fa179882000-7fa1798a8000 r-xp 00000000 08:31 59728358 /lib/x86_64-linux-
> gnu/ld-2.23.so
> 7fa1798a8000-7fa1798a9000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1798a9000-7fa1798aa000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1798aa000-7fa1798ab000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1798ab000-7fa1798ac000 r--p 00000000 08:31 79531588
> /opt/conda/lib/python3.6/site-
> packages/mkl/_mklinit.cpython-36m-x86_64-linux-gnu.so
> 7fa1798ac000-7fa1798ad000 r-xp 00001000 08:31 79531588
> /opt/conda/lib/python3.6/site-
> packages/mkl/_mklinit.cpython-36m-x86_64-linux-gnu.so
> 7fa1798ad000-7fa1798ae000 r--p 00002000 08:31 79531588
> /opt/conda/lib/python3.6/site-
> packages/mkl/_mklinit.cpython-36m-x86_64-linux-gnu.so
> 7fa1798ae000-7fa1798af000 r--p 00002000 08:31 79531588
> /opt/conda/lib/python3.6/site-
> packages/mkl/_mklinit.cpython-36m-x86_64-linux-gnu.so
> 7fa1798af000-7fa1798b0000 rw-p 00003000 08:31 79531588
> /opt/conda/lib/python3.6/site-
> packages/mkl/_mklinit.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b0000-7fa1798b1000 r--p 00000000 08:31 78745054
> /opt/conda/lib/python3.6/lib-dynload/_heapq.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b1000-7fa1798b3000 r-xp 00001000 08:31 78745054
> /opt/conda/lib/python3.6/lib-dynload/_heapq.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b3000-7fa1798b4000 r--p 00003000 08:31 78745054
> /opt/conda/lib/python3.6/lib-dynload/_heapq.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b4000-7fa1798b5000 r--p 00003000 08:31 78745054
> /opt/conda/lib/python3.6/lib-dynload/_heapq.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b5000-7fa1798b7000 rw-p 00004000 08:31 78745054
> /opt/conda/lib/python3.6/lib-dynload/_heapq.cpython-36m-x86_64-linux-gnu.so
> 7fa1798b7000-7fa1798f7000 rw-p 00000000 00:00 0
> 7fa1798f7000-7fa1798f8000 rw-s 00000000 00:06 493 /dev/nvidiactl
> 7fa1798f8000-7fa1798fa000 r--p 00000000 08:31 78745063
> /opt/conda/lib/python3.6/lib-
> dynload/_posixsubprocess.cpython-36m-x86_64-linux-gnu.so
> 7fa1798fa000-7fa1798fc000 r-xp 00002000 08:31 78745063
> /opt/conda/lib/python3.6/lib-
> dynload/_posixsubprocess.cpython-36m-x86_64-linux-gnu.so
> 7fa1798fc000-7fa1798fd000 r--p 00004000 08:31 78745063
> /opt/conda/lib/python3.6/lib-
> dynload/_posixsubprocess.cpython-36m-x86_64-linux-gnu.so
> 7fa1798fd000-7fa1798fe000 r--p 00004000 08:31 78745063
> /opt/conda/lib/python3.6/lib-
> dynload/_posixsubprocess.cpython-36m-x86_64-linux-gnu.so
> 7fa1798fe000-7fa1798ff000 rw-p 00005000 08:31 78745063
> /opt/conda/lib/python3.6/lib-
> dynload/_posixsubprocess.cpython-36m-x86_64-linux-gnu.so
> 7fa1798ff000-7fa179902000 r--p 00000000 08:31 78745084
> /opt/conda/lib/python3.6/lib-dynload/math.cpython-36m-x86_64-linux-gnu.so
> 7fa179902000-7fa179908000 r-xp 00003000 08:31 78745084
> /opt/conda/lib/python3.6/lib-dynload/math.cpython-36m-x86_64-linux-gnu.so
> 7fa179908000-7fa17990a000 r--p 00009000 08:31 78745084
> /opt/conda/lib/python3.6/lib-dynload/math.cpython-36m-x86_64-linux-gnu.so
> 7fa17990a000-7fa17990b000 r--p 0000a000 08:31 78745084
> /opt/conda/lib/python3.6/lib-dynload/math.cpython-36m-x86_64-linux-gnu.so
> 7fa17990b000-7fa17990d000 rw-p 0000b000 08:31 78745084
> /opt/conda/lib/python3.6/lib-dynload/math.cpython-36m-x86_64-linux-gnu.so
> 7fa17990d000-7fa179910000 r--p 00000000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa179910000-7fa179916000 r-xp 00003000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa179916000-7fa179918000 r--p 00009000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa179918000-7fa179919000 ---p 0000b000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa179919000-7fa17991a000 r--p 0000b000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa17991a000-7fa17991c000 rw-p 0000c000 08:31 78745072
> /opt/conda/lib/python3.6/lib-dynload/_struct.cpython-36m-x86_64-linux-gnu.so
> 7fa17991c000-7fa179aa1000 rw-p 00000000 00:00 0
> 7fa179aa1000-7fa179aa2000 rwxp 00000000 00:00 0
> 7fa179aa2000-7fa179aa3000 r--p 00000000 08:31 78745061
> /opt/conda/lib/python3.6/lib-dynload/_opcode.cpython-36m-x86_64-linux-gnu.so
> 7fa179aa3000-7fa179aa4000 r-xp 00001000 08:31 78745061
> /opt/conda/lib/python3.6/lib-dynload/_opcode.cpython-36m-x86_64-linux-gnu.so
> 7fa179aa4000-7fa179aa5000 r--p 00002000 08:31 78745061
> /opt/conda/lib/python3.6/lib-dynload/_opcode.cpython-36m-x86_64-linux-gnu.so
> 7fa179aa5000-7fa179aa6000 r--p 00002000 08:31 78745061
> /opt/conda/lib/python3.6/lib-dynload/_opcode.cpython-36m-x86_64-linux-gnu.so
> 7fa179aa6000-7fa179aa7000 rw-p 00003000 08:31 78745061
> /opt/conda/lib/python3.6/lib-dynload/_opcode.cpython-36m-x86_64-linux-gnu.so
> 7fa179aa7000-7fa179aa8000 r--p 00025000 08:31 59728358 /lib/x86_64-linux-
> gnu/ld-2.23.so
> 7fa179aa8000-7fa179aa9000 rw-p 00026000 08:31 59728358 /lib/x86_64-linux-
> gnu/ld-2.23.so
> 7fa179aa9000-7fa179aaa000 rw-p 00000000 00:00 0
> 7fff4fa0b000-7fff4fa2d000 rw-p 00000000 00:00 0 [stack]
> 7fff4fb0e000-7fff4fb11000 r--p 00000000 00:00 0 [vvar]
> 7fff4fb11000-7fff4fb13000 r-xp 00000000 00:00 0 [vdso]
> ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]
In PyTorch 1.4 it shows:
> free(): invalid pointer
with no other information.
## Environment
I'm using the official PyTorch Docker image to run my training. I tried both
1.3-cuda10.1-cudnn7-devel and 1.4-cuda10.1-cudnn7-devel (note that PyTorch
versions older than 1.3 will not successfully run the second-order gradient;
in that case the pl_lengths returned by grad always has requires_grad=False
for some reason).
I'm using 4 Titan X (Pascal) GPUs for data-parallel training.
## Additional context
|
## 🐛 Bug
Hello, I'm running a GAN model with some second-order regularization. The
training went well at the beginning and the loss decreased. However, after a
random period of time, ranging from 3000 to 8000 updates, a segfault occurs and
the training is terminated automatically.
## To Reproduce
Run the following script and wait; the error will eventually occur. In my case
it took 2 days to occur (total number of iterations = 443298).
import torch
from torch.utils.data import Dataset, DataLoader
import torchvision.models as models
import torch.nn as nn
import torch.autograd as autograd
import torch.nn.functional as F
class data_gpu_prefetcher():
    def __init__(self, loader, stop=False):
        self.original_loader = loader
        self.loader = iter(loader)
        self.stream = torch.cuda.Stream()
        self.stop = stop
        self._count = 0
        self.preload()

    def preload(self):
        self.next_input = next(self.loader, None)
        if self.next_input is None and not self.stop:
            self.loader = iter(self.original_loader)
            self.next_input = next(self.loader)
        with torch.cuda.stream(self.stream):
            if isinstance(self.next_input, list):
                self.next_input = [item.cuda(non_blocking=True) for item in self.next_input]
            elif self.next_input is not None:
                self.next_input = self.next_input.cuda(non_blocking=True)

    def next(self):
        self._count += 1
        torch.cuda.current_stream().wait_stream(self.stream)
        input = self.next_input
        self.preload()
        return input

    def reset_count(self):
        self._count = 0

    def get_count(self):
        return self._count

class RandomDataset(Dataset):
    def __init__(self, size, length):
        self.len = length
        self.data = torch.randn(length, 3, size, size)
        self.label = torch.randint(0, 10, (length,))

    def __getitem__(self, index):
        return self.data[index], self.label[index]

    def __len__(self):
        return self.len

class resnet_cls(nn.Module):
    def __init__(self):
        super(resnet_cls, self).__init__()
        self.model = models.resnet50()
        self.cls = nn.Linear(1000, 10)
        self._count = 0

    def forward(self, im, label, r1_reg=False):
        im.requires_grad_(r1_reg)
        score = self.model(im)
        loss = F.nll_loss(F.log_softmax(self.cls(score), dim=1), target=label, reduction='none')
        if r1_reg:
            gradients = autograd.grad(
                outputs=score.sum(),
                inputs=im,
                grad_outputs=None,
                create_graph=True,
                retain_graph=True,
                only_inputs=True,
            )[0]
            gradients = gradients.view(gradients.size(0), -1)
            gradient_norms = gradients.pow(2).sum(1)
            loss += gradient_norms * 1
        return loss

rand_loader = DataLoader(dataset=RandomDataset(224, 1000),
                         batch_size=32, shuffle=True)
train_loader_iterator = data_gpu_prefetcher(rand_loader, stop=False)
model = resnet_cls()
model.cuda()
model_para = nn.DataParallel(model)
opt = torch.optim.Adam(model.parameters(), lr=0.001, betas=(0., 0.99))

count = 0
while True:
    r1_reg = count % 2 == 0
    print(r1_reg)
    count += 1
    imgs, labels = train_loader_iterator.next()
    loss = model_para(imgs, labels, r1_reg).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(loss.item(), flush=True)
## Expected behavior
After some random number of updates, in my case 443298 iterations, the problem
occurs and training is terminated.
In PyTorch 1.3 it shows:
> Segmentation fault (core dumped)
It prints "True" for the value of "r1_reg" for this iteration, but the loss
from the forward pass did not print out. No other information is shown.
## Environment
I'm using the official PyTorch Docker image to run my training. I used the
official image 1.3-cuda10.1-cudnn7-devel.
I'm using 4 Titan X (Pascal) GPUs for data-parallel training.
## Additional context
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
The controlled/uncontrolled behavior is determined at the mount time.
## Current Behavior
The controlled/uncontrolled behavior is determined at each render.
## Steps to Reproduce (for bugs)
Look at the source code.
## Context
@kgregory raised this issue in #9523. It's important to notice that the
`Input` component is the only component behaving this way. It's not the case
for `input`, `Checkbox`, `Switch`, `Radio`, `ExpansionPanel`, and `Tooltip`.
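For illustration only, a framework-free sketch of what "determined at mount time" means (the `InputModel` class below is hypothetical and not Material-UI's actual implementation): the controlled flag is captured once in the constructor, so later renders cannot flip the component between controlled and uncontrolled.

```javascript
// Hypothetical model of an input that decides controlled-ness once, at mount.
// This is a sketch of the expected behavior, not Material-UI source code.
class InputModel {
  constructor(props) {
    // Captured once at "mount"; later renders never re-evaluate this.
    this.isControlled = props.value !== undefined;
    this.internalValue = props.defaultValue;
  }

  // Returns the value the input would display for the given render props.
  render(props) {
    return this.isControlled ? props.value : this.internalValue;
  }
}

// Mounted uncontrolled: passing a value on a later render does not switch modes.
const uncontrolled = new InputModel({ defaultValue: 'x' });
console.log(uncontrolled.render({ value: 'y' })); // 'x' — still uncontrolled
```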
## Your Environment
Tech | Version
---|---
Material-UI | v1.0.0-beta.24
React | v16.2.0
|
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
The GridListTile should look the same when wrapped in a component as when it's
not wrapped.
## Current Behavior
Wrapping GridListTile in a component collapses it; it does not fill up the
space it should.
## Steps to Reproduce (for bugs)
Here is a demo: https://stackblitz.com/edit/react-ershgc
Output here: https://react-ershgc.stackblitz.io/

## Context
1. I would like to be able to have onClick pass back the id of the clicked GridListTile without using a wrapper arrow function for the GridListTile onClick prop. Using a wrapper function creates a new function on every re-render of the main component, which causes an update to each GridListTile.
2. Also having a wrapper component for GridListTile enables me to optimize renders with PureComponent/shouldComponentUpdate/onlyUpdateForKeys etc.
## Your Environment
Tech | Version
---|---
Material-UI | 1.0.0-beta.29
React | Latest
browser | Chrome, Mac, latest
I am not sure if StackBlitz deletes demo apps after some period of time, so
I'm pasting the code here too, just in case:
/*
  Experiment with moving GridListTile to its own component.

  The reason for wanting this is to avoid using anonymous functions in the
  GridListTile props. Generating functions makes every GridListTile update
  on every update to 'list'; shouldComponentUpdate or PureComponent will
  also re-render since the passed onClick function technically differs from
  the previous one.
*/
import React, {Component} from 'react';
import {render} from 'react-dom';
import GridList, {GridListTile} from 'material-ui/GridList';

const list = ['id1', 'id2', 'id3'];

class App extends Component {
  handleClick = (event, id) => {
    console.log('id', id);
  };

  render() {
    return (
      <GridList cellHeight={192} cols={3}>
        {list.map(id => (
          <GridListTile
            // An anonymous event handler is needed to be able to pass
            // 'id' to the actual event handler
            onClick={event => this.handleClick(event, id)}
            key={id}
            cols={1}
            style={{background: 'red'}}
          >
            {id}
          </GridListTile>
        ))}
        {/*
          Wrapping GridListTile in a separate component solves the event
          handler issue, but this does also not render as expected
        */}
        {list.map(id => <MyTile key={id} id={id} onClick={this.handleClick} />)}
      </GridList>
    );
  }
}

class MyTile extends Component {
  handleClick = event => {
    // Passing the id without using an anonymous function in the JSX props
    this.props.onClick(event, this.props.id);
  };

  render() {
    return (
      <GridListTile cols={1} style={{background: 'green'}} onClick={this.handleClick}>
        {this.props.id}
      </GridListTile>
    );
  }
}

render(<App />, document.getElementById('root'));
| 0 |
After upgrading from 2.4.10 to 2.5.6 I get the error
_Variable "welcome_title" does not exist ..._
config.yml
twig:
globals:
welcome_title: Welcome!
After investigation I found a difference in the
app/cache/dev/appDevDebugProjectContainer.php
**2.4.10**
...
$instance->addGlobal('app', $this->get('templating.globals'));
$instance->addGlobal('welcome_title', 'Welcome!');
return $instance;
**2.5.x** (I've tried a few versions: 2.5.1, 2.5.4, 2.5.6)
only
...
$instance->addGlobal('app', $this->get('templating.globals'));
return $instance;
Bug?
Suggestions?
|
When rendering a _Date_ field type using a _Choice_ widget, the value
attribute (e.g. 1901) is correct but the text part is empty.
<option value="1900"></option>
<option value="1901"></option>
<option value="1902">1902</option>
This is most obvious when using a Birthday field as it by default starts 120
years ago.
Symfony 2.1.2
PHP 5.4.6
| 0 |
#### Summary
User code usually expects a specific type for the data returned from a
request. Currently, however, that type information is lost, and when using
AxiosResponse and AxiosPromise we have to cast.
Here is how you could allow setting the type while not breaking backward
compatibility:
redefine the types:
export interface AxiosResponse<T = any> {
data: T;
status: number;
statusText: string;
headers: any;
config: AxiosRequestConfig;
}
export interface AxiosPromise<TData = any> extends Promise<AxiosResponse<TData>> {
}
export interface AxiosInstance {
/// ...redacted code
request<T = any>(config: AxiosRequestConfig): AxiosPromise<T>;
get<T = any>(url: string, config?: AxiosRequestConfig): AxiosPromise<T>;
delete<T = any>(url: string, config?: AxiosRequestConfig): AxiosPromise<T>;
head<T = any>(url: string, config?: AxiosRequestConfig): AxiosPromise<T>;
post<T = any>(url: string, data?: any, config?: AxiosRequestConfig): AxiosPromise<T>;
put<T = any>(url: string, data?: any, config?: AxiosRequestConfig): AxiosPromise<T>;
patch<T = any>(url: string, data?: any, config?: AxiosRequestConfig): AxiosPromise<T>;
}
Now you can do this and get an error
let promise = axios.request<{email:string}>(config);
promise.then(res=>{
console.log('The user email is ' + res.data.eail); /// ERROR
});
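For readers without the library at hand, here is a self-contained sketch of the proposal. The `request` function below is a mock that just echoes its payload, not axios itself, and the `AxiosPromise` alias is a simplification of the interface form proposed above:

```typescript
interface AxiosRequestConfig { url?: string; }
interface AxiosResponse<T = any> {
  data: T;
  status: number;
}
type AxiosPromise<T = any> = Promise<AxiosResponse<T>>;

// Mock of axios.request: resolves with the payload it was given.
function request<T = any>(payload: T, _config?: AxiosRequestConfig): AxiosPromise<T> {
  return Promise.resolve({ data: payload, status: 200 });
}

request<{ email: string }>({ email: 'user@example.com' }).then(res => {
  // res.data is typed as { email: string }; a typo like res.data.eail
  // would now be a compile-time error instead of a silent undefined.
  console.log('The user email is ' + res.data.email);
});
```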
I can look at the actual code (not the d.ts files) and make a pull request if
you want.
#### Context
* axios version: 1.6
* typescript >= 2.3 (for the default generic type)
|
Hello,
As axios automatically converts JSON responses could we get something like
this:
get<T>(url: string, config?: AxiosRequestConfig): AxiosPromise<T>;
export interface AxiosPromise<T> extends Promise<AxiosResponse<T>> {
}
export interface AxiosResponse<T> {
data: T;
status: number;
statusText: string;
headers: any;
config: AxiosRequestConfig;
}
| 1 |
`kubectl get <resource_name>` prints all the resource info as a table by
default. Sometimes there is just too much data and one row of the table is not
able to fit in one line. It takes time to comprehend that entire output. It
may be useful to restrict columns by name. So, for example, in the
context of the pod resource, something like
`kubectl get pods --filter names`
that would print only pod names, instead of an entire table containing all columns
like POD, HOST, STATUS, CREATED, etc., would be particularly useful.
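Until such a flag exists, a rough workaround is post-processing the table with standard tools. This sketch uses a hard-coded sample table (not real kubectl output) and picks out the first column:

```shell
# Select just the NAME column from a kubectl-style table by skipping the
# header row and printing the first field of each remaining line.
printf 'NAME READY STATUS\npod-a 1/1 Running\npod-b 0/1 Pending\n' |
  awk 'NR > 1 { print $1 }'
```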
Exact interface for this might need to be discussed and planned on a wiki.
(I'll try to update this issue with a more detailed suggestion if I can think of
something.)
|
I am working with the Huawei PaaS team on the Ubernetes Phase I implementation,
focusing on the Ubernetes scheduler. I created a sequence diagram to illustrate
the process of the Ubernetes scheduler and want to start some discussions.
**A general principle design spec**
Currently clusterSelector (implemented as a LabelSelector) only supports a
simple list of acceptable clusters. Workloads will be evenly distributed on
these acceptable clusters in phase one.
* We need to agree that "evenly distributed" simply means cloning the Federation RC as sub-RCs and assigning target clusters; replicas will not be managed at the Federation level, i.e. each sub-RC keeps the same replicas value as the Federation RC.
**Supported cases in phase I**
#1. User specifies clusterSelector in RC Spec, e.g. The following RC is
created in Federation-1 with 3 running clusters a, b and c.
apiVersion: v1
kind: ReplicationController
metadata:
name: nginx-controller
spec:
replicas: 5
...
clusterSelector:
name in (a, b)
The result will be, if scheduling succeeds, that two sub-RCs are created, each
with 5 replicas; one is assigned to cluster a, the other
to cluster b.
#2. User does not specify clusterSelector in RC Spec, e.g. The following RC is
created in Federation-1 with 3 running clusters a, b and c.
* need to discuss whether we should support this case
apiVersion: v1
kind: ReplicationController
metadata:
name: nginx-controller
spec:
replicas: 5
...
As a result, the RC will be cloned to 3 sub-RC and assigned to all running
clusters a, b, and c in Federation-1.
**Code logics, details in the diagram**
* most of the code of package plugin/cmd/kube-scheduler could be reused; minor changes include updating package imports for ube-scheduler and removing some unnecessary logic.
* The design of ube-scheduler will be created based on k8s scheduler code, but with some logical changes, highlighted in RED in the sequence diagram.
1. There is no need for pod binding, so the Ubernetes scheduler will not check scheduled Federation RCs (no code like scheduledPodPopulator)
2. Only default predicate and priority plugins are supported in P1
3. No need to watch Federation Services, as there is no need for spreading sub-RC of same Federation Service
4. About scheduler plugins
a). There will be two predicates plugins, MatchClusterSelector and
RCFitsResources
b).
From the case descriptions above, I did not see a requirement for prioritization, so
only a dummy priority plugin will be created.
5. The plugin will return a list of clusters, the Federation RC will be cloned as sub-RC and each element of the cluster list will be assigned as target.
6. As #1 described, no need for cluster binding, scheduling done after step 5.
**Dependencies**
Need definition of cluster, federation RC and sub RC
* Cluster entity is already in WIP
* Federation RC and sub RC definition need to be settled ASAP.
* Need api of accessing new defined entities in pkg/client/cache/listers.go

| 0 |
trait Foo {
type Bar;
fn bar(&self) -> Option<<Self as Foo>::Bar>;
}
Gives:
<anon>:7:32: 7:34 error: expected `;` or `{`, found `<<`
<anon>:7 fn bar(&self) -> Option<<Self as Foo>::Bar>;
Adding a space so it becomes `< <` fixes it.
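For completeness, a compilable sketch of the workaround (the `S` impl is my addition for illustration):

```rust
trait Foo {
    type Bar;
    // A space between the two `<` keeps the parser from lexing them as `<<`.
    fn bar(&self) -> Option< <Self as Foo>::Bar >;
}

struct S;

impl Foo for S {
    type Bar = i32;
    fn bar(&self) -> Option< <Self as Foo>::Bar > {
        Some(42)
    }
}

fn main() {
    assert_eq!(S.bar(), Some(42));
    println!("ok");
}
```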
|
### STR
#![feature(associated_types)]
trait Trait {
type Type;
// OK
fn method() -> <Self as Trait>::Type;
// Can't parse
fn method() -> Box<<Self as Trait>::Type>;
}
fn main() {}
### Output
ai.rs:9:23: 9:25 error: expected `;` or `{`, found `<<`
ai.rs:9 fn method() -> Box<<Self as Trait>::Type>;
### Version
rustc 0.12.0-pre (9508faa22 2014-09-17 23:45:36 +0000)
cc @pcwalton
| 1 |
I have rules set like this
rules = [
Rule(LinkExtractor(
allow= '/topic/\d+/organize$',
restrict_xpaths = '//div[@id= "zh-topic-organize-child-editor"]'
),
process_request='request_tagPage', callback = "parse_tagPage", follow = True)
]
`request_tagPage()` is used to add a cookie and headers to a request. I found
that once I used the `process_request` parameter, the callback function
`parse_tagPage()` doesn't get called.
Then I manually set `parse_tagPage()` as the callback function in
`request_tagPage()`. Now when a response is returned, `parse_tagPage()` is
called, but the spider only crawls the links from the `start_urls`.
My full spider is here:
class ZhihuSpider(CrawlSpider):
name = "zhihu"
BASE_URL = "www.zhihu.com"
get_xsrf_url = "https://www.zhihu.com" # url to visit first to get xsrf information
login_url = "https://www.zhihu.com/login/email" # url to visit to login and get valid cookie
start_urls = [
"https://www.zhihu.com/topic/19776749/organize",
]
headers = {
"Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
"Accept-Encoding": "gzip,deflate",
"Accept-Language": "en-US,en;q=0.8,zh-TW;q=0.6,zh;q=0.4",
"Connection": "keep-alive",
"Content-Type":" application/x-www-form-urlencoded; charset=UTF-8",
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36",
"Referer": "http://www.zhihu.com"
}
# Rules to enforce using cookie every time request a tag information page
#restrict_xpaths = '//div[@id= "zh-topic-organize-page-children"]/ul/li/ul[@class= "zm-topic-organize-list"]',
rules = [
Rule(LinkExtractor(
allow= '/topic/\d+/organize$',
restrict_xpaths = '//div[@id= "zh-topic-organize-child-editor"]'
),
process_request='request_tagPage', callback = "parse_tagPage", follow = True)
# Pages matching the rule are handled by parse_tagPage
# follow = True, so after a page is handled by parse_tagPage, its internal links are still followed for crawling
] # using a list makes rules automatically iterable
# Big dictionary holding the tag structure; updated in the item pipeline
d = {"「根话题」":{}}
# List holding every tag entry; updated in the item pipeline
l = []
# Function to get the login response; only called once
# Scrapy calls this function at startup; its purpose is to obtain the xsrf token
def start_requests(self):
print("---"*5)
print("start to request for getting the hidden info and cookie")
print("---"*5)
return [Request(self.get_xsrf_url, headers= self.headers, meta= \
{"cookiejar":1}, callback= self.post_login)]
# Function to post a login form; notice it gets the xsrf string before sending the form
# This function extracts the xsrf token (only obtainable when attempting to log in to Zhihu) to build a login form, then obtains the logged-in cookie
def post_login(self, response):
print("---"*5)
print("preparing login...")
print("---"*5)
# Get the xsrf string
xsrf = Selector(response).xpath('//div[@data-za-module="SignInForm"]//form//input[@name="_xsrf"]/@value').extract()[0]
return FormRequest(self.login_url,
meta = {"cookiejar": response.meta["cookiejar"]},
headers = self.headers,
# create form
formdata = {
"_xsrf": xsrf,
"password": "zhihu_19891217",
"email": "skywalker.ljc@gmail.com",
"remeber_me": "true",
},
callback = self.after_login,
)
# After login, this function requests the urls in start_urls, initiating the whole process
# It passes the logged-in cookie along with the start_urls requests
def after_login(self, response):
for url in self.start_urls:
# No need to callback since the rules has set the process_request parameter,
# which specifies a function to send the actual request.
yield Request(url, meta = {"cookiejar": 1}, headers = self.headers, \
callback = self.parse, dont_filter = True)
# self.parse is the default parser used by CrawlSpider to apply rules
# dont_filter = True so that this start_url request won't be filtered
print("A start_url has been requested:", url)
print("---"*5)
print("All start_urls cookies are have been requested!")
print("---"*5)
# Function to request a tag information page
# This function makes scrapy carry the cookie when crawling subsequent pages;
# CrawlSpider first builds a bare Request(), but a bare request cannot pass zhihu.com's anti-crawler checks,
# so the rules make the spider rebuild the bare request into one with the cookie and headers that can pass Zhihu's anti-crawler checks
def request_tagPage(self, request):
return Request(request.url, meta = {"cookiejar": 1}, \
headers = self.headers, callback=self.parse_tagPage)
# When process_request is used, the callback in the Rule object won't work, so a callback has to be assigned here
# Finally, the function to actually parse the tag information page
# This is the function that really receives Zhihu's tag-structure pages and parses them
# It also updates the dictionary the spider uses to store the tag structure, based on each tag's info
def parse_tagPage(self, response):
print("---"*5)
print("parse_tagPage is called!")
print("---"*5)
sel = Selector(response)
# tag name and link
name = sel.xpath('//h1[@class= "zm-editable-content"]/text()').extract()[0]
relative_link = sel.xpath('//div[@class= "zm-topic-topbar"]//a/@href').extract()[0]
# the tag's parents
parents = sel.xpath('//div[@id= "zh-topic-organize-parent-editor"]//a[@class= "zm-item-tag"]/text()').extract()
parents = [s.replace("\n", "") for s in parents]
# the tag's children
children = sel.xpath('//div[@id= "zh-topic-organize-child-editor"]//a[@class= "zm-item-tag"]/text()').extract()
children = [s.replace("\n", "") for s in children]
# create a new tag item
item = {}
item["name"] = name
item["relative_link"] = relative_link
item["parents"] = parents
item["children"] = children
self.l.append(item)
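The usual fix is to stop building a fresh `Request` in `request_tagPage` (which drops the callback the Rule attached) and instead copy the original, e.g. with Scrapy's `Request.replace(...)`. The sketch below models that idea with a stdlib dataclass; it is not Scrapy itself:

```python
from dataclasses import dataclass, field, replace
from typing import Optional

@dataclass(frozen=True)
class Request:
    url: str
    callback: Optional[str] = None
    headers: dict = field(default_factory=dict)
    meta: dict = field(default_factory=dict)

# The Rule attaches a callback when it builds the request.
rule_request = Request("https://www.zhihu.com/topic/19776749/organize",
                       callback="parse_tagPage")

# Building a brand-new Request, as request_tagPage does, loses the callback...
lost = Request(rule_request.url, headers={"Referer": "http://www.zhihu.com"})
assert lost.callback is None

# ...while replace() keeps everything that is not explicitly overridden.
kept = replace(rule_request, headers={"Referer": "http://www.zhihu.com"})
assert kept.callback == "parse_tagPage"
print("ok")
```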
|
Was trying to send email through scrapy.
Settings I used are:-
#settings for mail
MAIL_FROM = '...@gmail.com'
MAIL_HOST = 'smtp.gmail.com'
MAIL_PORT = 587
MAIL_USER = '...@gmail.com'
MAIL_PASS = 'xxxxxxx'
MAIL_TLS = True
MAIL_SSL = False
Email was sent successfully. But the following exception is thrown;
2019-04-25 18:10:15 [twisted] CRITICAL: Unhandled Error
Traceback (most recent call last):
File "/lib/python3.6/site-packages/twisted/python/log.py", line 103, in
callWithLogger
return callWithContext({"system": lp}, func, *args, **kw)
File "/python3.6/site-packages/twisted/python/log.py", line 86, in
callWithContext
return context.call({ILogContext: newCtx}, func, *args, **kw)
File "/lib/python3.6/site-packages/twisted/python/context.py", line 122, in
callWithContext
return self.currentContext().callWithContext(ctx, func, *args, **kw)
File "/lib/python3.6/site-packages/twisted/python/context.py", line 85, in
callWithContext
return func(*args,**kw)
\--- ---
File "/lib/python3.6/site-packages/twisted/internet/posixbase.py", line 614,
in _doReadOrWrite
why = selectable.doRead()
File "/lib/python3.6/site-packages/twisted/internet/tcp.py", line 243, in
doRead
return self._dataReceived(data)
File "/lib/python3.6/site-packages/twisted/internet/tcp.py", line 249, in
_dataReceived
rval = self.protocol.dataReceived(data)
File "/lib/python3.6/site-packages/twisted/protocols/tls.py", line 330, in
dataReceived
self._flushReceiveBIO()
File "/lib/python3.6/site-packages/twisted/protocols/tls.py", line 300, in
_flushReceiveBIO
self._flushSendBIO()
File "/lib/python3.6/site-packages/twisted/protocols/tls.py", line 252, in
_flushSendBIO
bytes = self._tlsConnection.bio_read(2 ** 15)
builtins.AttributeError: 'NoneType' object has no attribute 'bio_read'
| 0 |
I don't see what version of tsc you support so this might be a version
problem:
error.ts
/// <reference path="node.d.ts" />
> tsc error.ts
> node.d.ts(101,36): error TS1005: ';' expected.
> node.d.ts(690,23): error TS1005: ';' expected.
> node.d.ts(702,23): error TS1005: ';' expected.
> node.d.ts(1228,36): error TS1005: ';' expected.
> tsc
> Version 1.0.3.0
|
A definition file for the small color library chroma.js would be quite useful.
| 0 |
**Migrated issue, originally created by Michael Bayer (@zzzeek)**
ML thread https://groups.google.com/forum/?hl=en#!topic/sqlalchemy/t8a6SLTYrIc
even though all the docs show string names being sent to insert.values(), you
can use columns as keys inside a dictionary as well, just as we sometimes need
to do with UPDATE. It doesn't work for multi-values though:
from sqlalchemy import MetaData, String, Integer, Table, Column
from sqlalchemy.dialects.postgresql.base import PGDialect
m = MetaData()
t = Table('mytable', m,
Column('int_col', Integer),
Column('str_col', String),
)
print("Case 1")
print(t.insert().values(
{t.c.str_col:"string", t.c.int_col:2}
).compile(dialect=PGDialect()))
print("Case 2")
print(t.insert().values(
[
{t.c.str_col:"str", t.c.int_col:1}
]
).compile(dialect=PGDialect()))
print("Case 3")
print(t.insert().values(
[
{t.c.str_col:"string", t.c.int_col:1},
{t.c.str_col:"text", t.c.int_col:2}
]
).compile(dialect=PGDialect()))
gerrit at https://gerrit.sqlalchemy.org/#/c/zzzeek/sqlalchemy/+/626 fixes.
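The essence of the fix is normalizing `Column` keys to their string names before the multi-values path processes parameters. A stdlib-only sketch of that normalization (my illustration, not SQLAlchemy's actual code):

```python
class Column:
    """Minimal stand-in for a SQLAlchemy Column with a string key."""
    def __init__(self, key):
        self.key = key

def normalize_params(params):
    # Map Column objects used as dict keys onto their string names,
    # leaving plain string keys untouched.
    return {k.key if isinstance(k, Column) else k: v for k, v in params.items()}

int_col, str_col = Column("int_col"), Column("str_col")

rows = [
    {str_col: "string", int_col: 1},
    {str_col: "text", "int_col": 2},  # mixed key styles normalize too
]
print([normalize_params(r) for r in rows])
```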
|
**Migrated issue, originally created by Michael Bayer (@zzzeek)**
even though columns-as-keys is de-emphasized in the documentation for insert,
it does work, except for multi-values:
from sqlalchemy import *
from sqlalchemy.dialects import sqlite
t = table('t', column('a'), column('b'))
print t.insert().values([
{"a": 1, "b": 2},
{"a": 2, "b": 2},
{"a": 3, "b": 2}
]).compile(dialect=sqlite.dialect())
print t.insert().values([
{t.c.a: 1, t.c.b: 2},
{t.c.a: 2, t.c.b: 2},
{t.c.a: 3, t.c.b: 2},
]).compile(dialect=sqlite.dialect())
output:
#!
INSERT INTO t (a, b) VALUES (?, ?), (?, ?), (?, ?)
Traceback (most recent call last):
...
File "/home/classic/dev/sqlalchemy/lib/sqlalchemy/sql/crud.py", line 419, in _process_multiparam_default_bind
"a Python-side value or SQL expression is required" % c)
sqlalchemy.exc.CompileError: INSERT value for column t.a is explicitly rendered as a boundparameter in the VALUES clause; a Python-side value or SQL expression is required
| 1 |
* Electron version: 1.3.5
* Operating system: Windows 10
It seems like certain menu accelerators don't work when the app is running on
Windows, but they do work on macOS. In the example below, none of the
accelerators work on Windows, but clicking the menu item does log the correct
message. `CmdOrCtrl+B`, however, _does_ work on Windows.
Here's the code:
const electron = require('electron');
const app = electron.app;
const Menu = electron.Menu;
const BrowserWindow = electron.BrowserWindow;
app.on('ready', function() {
let mainWindow = new BrowserWindow({width: 800, height: 600});
mainWindow.loadURL(`file://${__dirname}/index.html`);
let menu = Menu.buildFromTemplate([
{
label: 'Edit',
submenu: [
{
label: 'Cut',
accelerator: 'CmdOrCtrl+X',
click: () => console.log('cut'),
},
{
label: 'Copy',
accelerator: 'CmdOrCtrl+C',
click: () => console.log('copy'),
},
{
label: 'Paste',
accelerator: 'CmdOrCtrl+V',
click: () => console.log('paste'),
},
{
label: 'Select All',
accelerator: 'CmdOrCtrl+A',
click: () => console.log('select all'),
},
],
},
]);
Menu.setApplicationMenu(menu);
});
app.on('window-all-closed', function() {
app.quit();
});
|
* Electron version: All
* Operating system: Windows vs Mac
When I define custom undo behaviour, I found it behaves differently on
Windows and Mac devices.
// the main menu I overwrite:
let menuTemplate = [
// Edit
{
label: 'Edit',
submenu: [
{
label: 'Undo',
accelerator: 'CmdOrCtrl+Z',
click () {
console.log('Undo');
}
},
{
label: 'Redo',
accelerator: 'Shift+CmdOrCtrl+Z',
click () {
console.log('Redo');
}
},
]
}
];
As you can see, I override the Undo behaviour and use the same shortcut as
the browser. When I press `ctrl/cmd+z` on Mac, things go fine: the menu
click fires first and takes over. But on Windows, it seems the original
`webContents.undo()` is executed first if the undo stack has states
(which means you have typed some letters into an `<input>` first).
This leaves me no way to override undo on the Windows platform. Is this
expected on Windows? If we cannot change the menu behaviour, is there a
way to disable/stop `webContents.undo` when pressing `ctrl+z` so that the main
menu can receive it?
| 1 |
### Preflight Checklist
* I have read the Contributing Guidelines for this project.
* I agree to follow the Code of Conduct that this project adheres to.
* I have searched the issue tracker for a feature request that matches the one I want to file, without success.
### Electron Version
13.1.17
### What operating system are you using?
Windows
### Operating System Version
win10 20H2
### What arch are you using?
x64
### Last Known Working Electron version
9.0.0
### Expected Behavior
const options = {
deviceName: "233B",
silent: true,
printBackground: true,
margins: {
marginsType: 'custom',
top: 0,
bottom: 0,
left: 0,
right: 0
},
pageSize: {
width: curTagBaseSet.printSet.width * 1000 - 1000,
height: curTagBaseSet.printSet.height * 1000,
}
}
### Actual Behavior
webview print is not working with deviceName
### Testcase Gist URL
_No response_
### Additional Information
_No response_
|
### Preflight Checklist
* I have read the Contributing Guidelines for this project.
* I agree to follow the Code of Conduct that this project adheres to.
* I have searched the issue tracker for a feature request that matches the one I want to file, without success.
### Electron Version
12.0.5,11.3.0,12.0.7
### What operating system are you using?
Windows
### Operating System Version
Windows 10
### What arch are you using?
x64
### Last Known Working Electron version
12.0.7
### Expected Behavior
Window Set printer dynamically
code:
let printCon = { silent: true, printBackground: false, ...printOptions }
log.log('url = ' + strUrl + '\n printCon=' + JSON.stringify(printCon))
try {
// did-finish-load
printWindow.webContents.on('did-finish-load', () => {
log.log('did-finish-load')
printWindow.webContents.print(
printCon,
(success, failureReason) => {
log.log('success=' + success + ' failureReason =' + failureReason)
if (callback) {
callback(success ? null : failureReason)
}
}
)
})
### Actual Behavior
Windows Error: webContents.print(): Invalid deviceName provided!
mac is working

### Testcase Gist URL
_No response_
### Additional Information
code:
let printCon = { silent: true, printBackground: false, ...printOptions }
log.log('url = ' + strUrl + '\n printCon=' + JSON.stringify(printCon))
try {
// did-finish-load
printWindow.webContents.on('did-finish-load', () => {
log.log('did-finish-load')
printWindow.webContents.print(
printCon,
(success, failureReason) => {
log.log('success=' + success + ' failureReason =' + failureReason)
if (callback) {
callback(success ? null : failureReason)
}
}
)
})
| 1 |
After upgrading to babel 7, the command to transpile a directory, that worked
well in the same project with Babel 6 is failing with a `Maximum call stack
size exceeded` after transpiling **369 files**.
The command executed is:
babel --copy-files --ignore '**/__tests__/*.js,node_modules' --out-dir ./functions ./modules
When executed on a subset of the folder, it works well. It seems to be an
issue with the number of files being transpiled.
### Stack trace
(The stack trace changes with the number of files being transpiled.)
RangeError: Maximum call stack size exceeded
at normalizeStringPosix (path.js:78:30)
at Object.normalize (path.js:1193:12)
at Object.join (path.js:1228:18)
at getDest (<project>/node_modules/@babel/cli/lib/babel/dir.js:57:26)
at <project>/node_modules/@babel/cli/lib/babel/dir.js:70:20
at write (<project>/node_modules/@babel/cli/lib/babel/dir.js:30:14)
at handleFile (<project>/node_modules/@babel/cli/lib/babel/dir.js:66:5)
at sequentialHandleFile (<project>/node_modules/@babel/cli/lib/babel/dir.js:89:5)
at <project>/node_modules/@babel/cli/lib/babel/dir.js:94:9
at <project>/node_modules/@babel/cli/lib/babel/dir.js:75:14
### Babel Configuration
{
"presets": [
["@babel/env", {
"targets": {
"node": "6.11.5"
}
}],
"@babel/flow",
"@babel/react"
],
"plugins": [
"@babel/plugin-proposal-object-rest-spread",
"@babel/plugin-proposal-class-properties",
"transform-decorators",
[ "module-resolver", {
"alias": {
"@gitbook": "./modules",
"aphrodite": "aphrodite/no-important"
},
"cwd": "babelrc"
}]
],
"ignore": [
"modules/node_modules/**/*",
"modules/styleguide/icons/*.js"
]
}
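The shape of the failure is easy to reproduce without Babel: dir.js handled files through nested calls, one stack frame per file, so a large enough file count overflows the stack where a plain loop would not. A minimal model of that pattern (my illustration, not Babel's actual code):

```javascript
// One non-tail recursive call per "file": the stack grows linearly with
// the number of files, like the sequentialHandleFile call chain.
function handleRecursively(files, i = 0) {
  if (i >= files.length) return;
  handleRecursively(files, i + 1);
}

// The iterative equivalent uses constant stack space.
function handleIteratively(files) {
  for (const f of files) { /* transpile f */ }
}

const files = Array.from({ length: 1000000 }, (_, i) => `file${i}.js`);

try {
  handleRecursively(files);
  console.log('recursive: ok');
} catch (e) {
  console.log('recursive: ' + e.constructor.name); // RangeError on V8
}

handleIteratively(files);
console.log('iterative: ok');
```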
|
### Babel/Babylon Configuration (.babelrc, package.json, cli command)
.babelrc
{
"presets": [
["@babel/env", {
"targets": { "node": "6.11.5" }
}]
],
"plugins": ["@babel/plugin-proposal-object-rest-spread"]
}
CLI Command
`babel 'functions/src' --out-dir 'functions/bundle' --copy-files --ignore
'functions/src/node_modules' `
### Expected Behavior
After executing this command, resulting files were correctly copied at
functions/bundle folder using beta.37
### Current Behavior
With beta.38 and beta.39, following error arises after processing last file
RangeError: Maximum call stack size exceeded
at /Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:71:56
at write (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:30:14)
at handleFile (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:66:5)
at sequentialHandleFile (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:89:5)
at /Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:94:9
at /Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:75:14
at write (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:30:14)
at handleFile (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:66:5)
at sequentialHandleFile (/Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:89:5)
at /Users/karlas/myapp/node_modules/@babel/cli/lib/babel/dir.js:94:9
### Possible Solution
Check `26e4911`#diff-f93fbf6dce06b5cbf6cc65f5f7dac101
This is the last commit where this dir.js file was modified. Tags are beta.38
and beta.39 (the affected versions)
### Context
Just transpiling about 20 files to be used on Firebase Functions
### Your Environment
software | version(s)
---|---
Babel | 7.0.0-beta.39
node | 8.9.3
npm | 5.5.1
yarn | 1.3.2
Operating System | macOS High Sierra
| 1 |
I'm just reporting this error I got when using atom.
Error report:
[Enter steps to reproduce below:]
1. I opened a folder in atom with the windows context menu item
2. I right-clicked on a folder in the sidebar and up popped the error.
**Atom Version** : 0.210.0
**System** : Microsoft Windows 8.1 Pro with Media Center
**Thrown From** : Atom Core
### Stack Trace
Uncaught Error: Cannot find module './context-menu'
Error: Cannot find module './context-menu'
at Function.Module._resolveFilename (module.js:328:15)
at Function.Module._load (module.js:270:25)
at Module.require (module.js:357:17)
at require (module.js:376:17)
at BrowserWindow.
(C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\src\browser\atom-
window.js:152:27)
at emitOne (events.js:77:13)
at BrowserWindow.emit (events.js:166:7)
at callFunction
(C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\browser\lib\rpc-
server.js:116:18)
at EventEmitter.
(C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\browser\lib\rpc-
server.js:208:14)
at emitMany (events.js:108:13)
At C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\renderer\api\lib\remote.js:77
Error: Cannot find module './context-menu'
Error: Cannot find module './context-menu'
at Function.Module._resolveFilename (module.js:328:15)
at Function.Module._load (module.js:270:25)
at Module.require (module.js:357:17)
at require (module.js:376:17)
at BrowserWindow.<anonymous> (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\src\browser\atom-window.js:152:27)
at emitOne (events.js:77:13)
at BrowserWindow.emit (events.js:166:7)
at callFunction (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\browser\lib\rpc-server.js:116:18)
at EventEmitter.<anonymous> (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\browser\lib\rpc-server.js:208:14)
at emitMany (events.js:108:13)
at metaToValue (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\renderer\api\lib\remote.js:77:15)
at BrowserWindow.RemoteMemberFunction [as emit] (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\atom.asar\renderer\api\lib\remote.js:111:26)
at ContextMenuManager.module.exports.ContextMenuManager.showForEvent (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\src\context-menu-manager.js:170:31)
at HTMLDocument.<anonymous> (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\src\window-event-handler.js:150:33)
at HTMLDocument.handler (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\src\space-pen-extensions.js:112:34)
at HTMLDocument.jQuery.event.dispatch (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\node_modules\space-pen\vendor\jquery.js:4681:9)
at HTMLDocument.elemData.handle (C:\Users\Philip\AppData\Local\atom\app-0.210.0\resources\app.asar\node_modules\space-pen\vendor\jquery.js:4359:46)
### Commands
### Config
{
"core": {
"themes": [
"one-light-ui",
"one-light-syntax"
],
"disabledPackages": [
"atom-sharp",
"autocomplete-paths",
"linter"
]
},
"editor": {
"invisibles": {}
}
}
### Installed Packages
# User
atom-ternjs, v0.5.30
autoclose-html, v0.18.0
build, v0.38.0
javascript-snippets, v1.0.0
language-fsharp, v0.8.4
language-rust, v0.4.3
linter-rust, v0.1.0
racer, v0.14.1
scheme-syntax, v0.4.0
simple-drag-drop-text, v0.2.3
terminal-status, v1.6.8
# Dev
No dev packages
|
I right-clicked on a folder in the tree view
**Atom Version** : 0.194.0
**System** : Windows 7 Entreprise
**Thrown From** : Atom Core
### Stack Trace
Uncaught Error: Cannot find module './context-menu'
Error: Cannot find module './context-menu'
at Function.Module._resolveFilename (module.js:328:15)
at Function.Module._load (module.js:270:25)
at Module.require (module.js:357:17)
at require (module.js:376:17)
at BrowserWindow.
(C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\src\browser\atom-
window.js:152:27)
at emitOne (events.js:77:13)
at BrowserWindow.emit (events.js:166:7)
at callFunction
(C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\browser\lib\rpc-
server.js:116:18)
at EventEmitter.
(C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\browser\lib\rpc-
server.js:208:14)
at emitMany (events.js:108:13)
At C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\renderer\api\lib\remote.js:77
Error: Cannot find module './context-menu'
Error: Cannot find module './context-menu'
at Function.Module._resolveFilename (module.js:328:15)
at Function.Module._load (module.js:270:25)
at Module.require (module.js:357:17)
at require (module.js:376:17)
at BrowserWindow.<anonymous> (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\src\browser\atom-window.js:152:27)
at emitOne (events.js:77:13)
at BrowserWindow.emit (events.js:166:7)
at callFunction (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\browser\lib\rpc-server.js:116:18)
at EventEmitter.<anonymous> (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\browser\lib\rpc-server.js:208:14)
at emitMany (events.js:108:13)
at metaToValue (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\renderer\api\lib\remote.js:77:15)
at BrowserWindow.RemoteMemberFunction [as emit] (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\atom.asar\renderer\api\lib\remote.js:111:26)
at ContextMenuManager.module.exports.ContextMenuManager.showForEvent (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\src\context-menu-manager.js:170:31)
at HTMLDocument.<anonymous> (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\src\window-event-handler.js:149:33)
at HTMLDocument.handler (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\src\space-pen-extensions.js:112:34)
at HTMLDocument.jQuery.event.dispatch (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\node_modules\space-pen\vendor\jquery.js:4681:9)
at HTMLDocument.elemData.handle (C:\Users\jbrichardet\AppData\Local\atom\app-0.194.0\resources\app.asar\node_modules\space-pen\vendor\jquery.js:4359:46)
### Commands
-4:55.5.0 editor:checkout-head-revision (atom-text-editor.editor.is-focused)
2x -3:41.6.0 window:focus-pane-on-right (atom-text-editor.editor)
-3:17.5.0 application:add-project-folder (ol.tree-view.full-menu.list-tree.has-collapsable-children.focusable-panel)
-2:47.4.0 editor:newline (atom-text-editor.editor.is-focused)
-2:38.2.0 core:cut (atom-text-editor.editor)
-2:36.5.0 core:paste (atom-text-editor.editor.is-focused)
-2:26.6.0 core:save (atom-text-editor.editor.is-focused)
-2:20.6.0 core:move-down (atom-text-editor.editor)
-2:20.4.0 autocomplete-plus:confirm (atom-text-editor.editor)
-2:15.8.0 core:save (atom-text-editor.editor)
-2:08.7.0 core:copy (atom-text-editor.editor.is-focused)
-2:01.2.0 core:paste (atom-text-editor.editor.is-focused)
-1:59.7.0 core:save (atom-text-editor.editor.is-focused)
-1:52.2.0 core:paste (atom-text-editor.editor.is-focused)
-1:51.6.0 core:save (atom-text-editor.editor.is-focused)
-1:30.6.0 core:backspace (atom-text-editor.editor)
### Config
{
"core": {
"ignoredNames": [
"node_modules"
],
"themes": [
"atom-dark-ui",
"seti-syntax"
],
"disabledPackages": [
"Tern"
],
"projectHome": "Y:\\app-tfoumax"
},
"editor": {
"invisibles": {},
"softWrap": true,
"showIndentGuide": true
}
}
### Installed Packages
# User
autocomplete-plus, v2.12.0
autocomplete-snippets, v1.2.0
javascript-snippets, v1.0.0
jshint, v1.3.5
language-ejs, v0.1.0
linter, v0.12.1
pretty-json, v0.3.3
save-session, v0.14.0
Search, v0.4.0
seti-syntax, v0.4.0
# Dev
No dev packages
| 1 |
I have tested this input code in Chrome, and also your compiled output, and
discovered an issue, which I have described in the comments in the following
code:
class B extends RegExp {
constructor(source, flags) {
super(source, flags);
}
getName() {
return 'name';
}
}
new B('test', 'i').test('TeSt') // is true as should be
new B('test', 'i').getName() // throws an error but should return "name"
|
The current Babel transform, when it comes to calling the parent:
function _possibleConstructorReturn(self, call) {
if (!self) {
throw new ReferenceError("this hasn't been initialised - super() hasn't been called");
}
return call && (typeof call === "object" || typeof call === "function") ? call : self;
}
This is far too naive an implementation.
If we take the most basic ES6 class syntax:
class List extends Array {}
we'll realize Babel does a bad job:
var l = new List();
l instanceof List; // false
The reason is simple: Babel replaces the returns and exits without caring about
userland expectations.
This is how above basic extend should desugar:
function List() {
return Object.setPrototypeOf(
Array.apply(this, arguments),
List.prototype
);
}
It's a very ad-hoc fix for the initial example that currently fails, but it's
good enough to understand that inheriting the prototype is the least of the
problems.
Indeed, we have three ways to do that within transpiled code:
// losing the initial prototype
// for polyfilled ES5+ or lower
List.prototype = Object.create(
Array.prototype,
{constructor: {
configurable: true,
writable: true,
value: List
}}
);
// using a cleaner way
// for ES5+ compliant targets
Object.setPrototypeOf(
List.prototype,
Array.prototype
);
// using a cleaner way
// that works well with
// partially polyfilled .setPrototypeOf
List.prototype = Object.setPrototypeOf(
List.prototype,
Array.prototype
);
With the inheritance bit solved (considering Babel also sets the prototype of
each constructor), we need to address cases where a `super` call might
"_upgrade_" the current context, as it does for `HTMLElement` or other exotic
native objects.
// native ES6 static example
class List extends Array {
constructor(a, b, c) {
super(a, b);
this.push(c);
}
}
The above case should desugar to something like the following:
function List(a, b, c) {
// the super bit
var self = Object.setPrototypeOf(
Array.call(this, a, b),
List.prototype
);
// the rest with swapped context
self.push(c);
// at the end
return self;
}
which is, again, ad-hoc code for the previous example.
Considering a transpiler will easily get the arguments part right, this is how
the previous case could be generically transpiled when arguments are used in
both constructors:
// to make it as generic as possible
function Child() {
// the super call bit desugar to ...
var
instance = Object.getPrototypeOf(Child)
.apply(this, arguments),
type = instance ? typeof instance : '',
// if Parent overwrote its default return
self = (type === 'object' || type === 'function') ?
// upgrade the instance to reflect Child constructor
Object.setPrototypeOf(instance, Child.prototype) :
// otherwise use the current context as is
this
;
// at this point, the rest of the constructor
// should use `self` instead of `this`
self.push(c);
// and return `self` reference at the end
return self;
}
The last problem is that modern syntax would use Reflect to create any sort of
object, instead of the old, ES3-friendly `.call` or `.apply` way.
The following is a past-, present-, and even future-proof approach:
var
reConstruct = typeof Reflect === 'object' ?
Reflect.construct :
function (Parent, args, Child) {
return Parent.apply(this, args);
}
;
function Child() {
// the super call bit
var
instance = reConstruct.call(
this,
Object.getPrototypeOf(Child),
arguments,
Child
),
type = instance ? typeof instance : '',
self = (type === 'object' || type === 'function') ?
Object.setPrototypeOf(instance, Child.prototype) :
this
;
// the rest of the constructor body
self.push(c);
// at the end or instead of empty returns
return self;
}
The above solution would work with userland, exotic, or DOM constructors, both
in native engines and when transpiled.
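To check the proposal end to end, here is a runnable sketch that wires the `reConstruct` helper into the earlier `List extends Array` example. The prototype wiring at the bottom is an assumption of what a transpiler would emit; the constructor body follows the desugaring above, passing `[a, b]` to match the original `super(a, b)` call.

```javascript
// Reflect-based construction with a pre-Reflect fallback, as proposed above.
var reConstruct = typeof Reflect === 'object'
  ? Reflect.construct
  : function (Parent, args /*, Child */) {
      // old engines: plain apply; Array ignores `this` and returns a new array
      return Parent.apply(this, args);
    };

function List(a, b, c) {
  // the super(a, b) bit
  var instance = reConstruct.call(this, Object.getPrototypeOf(List), [a, b], List),
      type = instance ? typeof instance : '',
      // if the parent returned an object, upgrade it to reflect List
      self = (type === 'object' || type === 'function')
        ? Object.setPrototypeOf(instance, List.prototype)
        : this;
  // the rest of the constructor body, using `self` instead of `this`
  self.push(c);
  return self;
}

// prototype wiring a transpiler would emit (an assumption of this sketch)
List.prototype = Object.create(Array.prototype, {
  constructor: { configurable: true, writable: true, value: List }
});
Object.setPrototypeOf(List, Array);

var l = new List(1, 2, 3);
```

With this wiring, `l instanceof List` and `l instanceof Array` both hold, and the pushed element lands after the two constructor arguments.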
| 1 |
When I click `reset`, the input value becomes empty,
but the v-model `message` still holds the old value.
Code here:
https://jsfiddle.net/50wL7mdz/9404/
|
It clears the inputs but the data is still there
http://jsfiddle.net/5a6z6xmk/
| 1 |
### System info
* Playwright Version: 1.32.2
* Operating System: linux
* Browser: [All, Chromium, Firefox, WebKit]
* Other info:
### Source code
docker-compose up --exit-code-from e2e --build e2e
**Steps**
* Run a test in a container in CircleCI
**Expected**
Delaying the exit code until after report generation might solve the issue
**Actual**

In order to surface step failures we have to pass the `--exit-code-from` flag,
but this flag causes the container to terminate right away once the tests are
done and Playwright returns exit code 1, so the container terminates before
the report is generated.
|
### System info
* Playwright Version: [v1.0.0]
* Operating System: [Windows 10]
* Browser: [Chromium]
* Other info:
### Source code
* I provided exact source code that allows reproducing the issue locally.
**Link to the GitHub repository with the repro**
[https://github.com/your_profile/playwright_issue_title]
or
**Config file**
readonly msgContainer: Locator;
this.msgContainer = page
.locator(
".ContextContainer.activeTab #MessagesContainerId
.scil_label.messageHeaderValue.selectable.messageEmail"
)
.first();
await expect(this.msgContainer).toHaveText(global.grabEmail);
**Steps**
* [Run the test]
* [...]
**Expected**
The text of msgContainer should be grabbed as it is.
**Actual**
The value of msgContainer is grabbed with some additional '/'. The real
value is ""Spider Man" spider.man@gmail.com", but it is grabbed as ""Spider Man"
spider.man@gmail.com".
| 0 |
class FooError extends Error {
constructor(message:string) {
super(message);
}
}
console.log((new FooError('foo')).message); // <nothing>
console.log((new Error('bar')).message); // bar
Am I missing something obvious or is there a bug with 1.8.9?
|
Test case:
myError.ts
class MyError extends Error {
constructor(message: string) {
super(message);
}
}
let error = new MyError('Error Message');
console.log(error.message);
tsconfig.json
{
"compilerOptions": {
"target": "ES5",
"module": "commonjs"
}
}
compile and run using node:
node myError.js
Observe: nothing is printed to the console. Debugging it shows that the
message passed to super is not recorded. Something seems very strange with the
prototype chain in this case.
I was under the impression from #3516 that this should work now even if
generating ES5 code. If not possible for ES5 then the compiler should produce
an error.
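For reference, the usual workaround when targeting ES5 is to restore the prototype chain manually after calling the `Error` constructor. A hedged ES5-style sketch (this is the workaround pattern, not what the compiler emits):

```javascript
// Error called as a function returns a fresh error object carrying `message`;
// we then re-attach our own prototype so instanceof and methods work.
function MyError(message) {
  var instance = Error.call(this, message);
  Object.setPrototypeOf(instance, MyError.prototype);
  return instance;
}
MyError.prototype = Object.create(Error.prototype, {
  constructor: { value: MyError, writable: true, configurable: true }
});

var err = new MyError('Error Message');
console.log(err.message); // prints "Error Message"
```

In class syntax the equivalent is calling `Object.setPrototypeOf(this, MyError.prototype)` immediately after `super(message)`.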
| 1 |
"var" is not more highlighted in blue in javascript file!
|
Windows 10 is half-baked: first IE, and now this editor.
I was impressed by the built-in extensions, but I'm feeling terrible after
experiencing this issue.
I was working with this editor when suddenly my system became very slow and I
was forced to shut it down. After the restart, my current working file
(see the attachment) had been changed to some other encoding. I lost 3 hours
of work :(.
* Bala
global.txt
| 0 |
**Migrated issue, originally created by Michael Bayer (@zzzeek)**
from sqlalchemy import Integer, Column, ForeignKey
from sqlalchemy.ext.declarative import declarative_base
Base = declarative_base()
class A(Base):
__tablename__ = 'a'
id = Column(Integer, primary_key=True)
class AAbs(A):
__abstract__ = True
class B1(A):
__tablename__ = 'b1'
id = Column(ForeignKey('a.id'), primary_key=True)
class B2(AAbs):
__tablename__ = 'b2'
id = Column(ForeignKey('a.id'), primary_key=True)
assert B1.__mapper__.inherits is A.__mapper__ # passes
assert B2.__mapper__.inherits is A.__mapper__ # fails
|
**Migrated issue, originally created by David Lord (@davidism)**
I have the following models:
* `Device`
* `DeviceSource` has a foreign key and relationship to `Device`
* `LDAPDeviceSource` is a joined-table inheritance of `DeviceSource`
In order to make defining new `DeviceSource` subclasses easier, I created
`DeviceSourceMixin`, which inherits from `DeviceSource` and provides declared
attrs for `__tablename__` and the foreign primary `id`. `DeviceSourceMixin` is
`__abstract__` so that it doesn't create a table of its own.
The issue is that this intermediate `__abstract__` seems to break the
declarative model. The foreign key and relationship in `DeviceSource` do not
get inherited by `LDAPDeviceSource` when subclassing `DeviceSourceMixin`.
This code demonstrates the issue:
import sqlalchemy as sa
from sqlalchemy.ext.declarative import declared_attr, as_declarative
from sqlalchemy.orm import Session, relationship
engine = sa.create_engine('sqlite:///:memory:', echo=True)
session = Session(bind=engine)
@as_declarative(bind=engine)
class Base(object):
@declared_attr
def __tablename__(cls):
return cls.__name__.lower()
id = sa.Column(sa.Integer, primary_key=True)
class Device(Base):
pass
class DeviceSource(Base):
type = sa.Column(sa.String, nullable=False)
device_id = sa.Column(sa.Integer, sa.ForeignKey(Device.id), nullable=False)
device = relationship(Device, backref='sources')
__mapper_args__ = {
'polymorphic_on': type
}
class DeviceSourceMixin(DeviceSource):
__abstract__ = True
@declared_attr
def __tablename__(cls):
return cls.__name__.lower()
@declared_attr
def id(cls):
return sa.Column(sa.Integer, sa.ForeignKey(DeviceSource.id), primary_key=True)
class LDAPDeviceSource(DeviceSourceMixin):
name = sa.Column(sa.String, nullable=False)
__mapper_args__ = {
'polymorphic_identity': 'ldap'
}
Base.metadata.create_all()
d1 = Device()
s1 = LDAPDeviceSource(device=d1, name='s1')
session.add(s1)
session.commit()
It produces the following error:
Traceback (most recent call last):
File "/home/david/Projects/cedar/example2.py", line 62, in <module>
s1 = LDAPDeviceSource(device=d1, name='s1')
File "<string>", line 4, in __init__
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/state.py", line 260, in _initialize_instance
return manager.original_init(*mixed[1:], **kwargs)
File "<string>", line 6, in __init__
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/ext/declarative/base.py", line 526, in _declarative_constructor
setattr(self, k, kwargs[k])
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 226, in __set__
instance_dict(instance), value, None)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 812, in set
value = self.fire_replace_event(state, dict_, value, old, initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 832, in fire_replace_event
state, value, previous, initiator or self._replace_token)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 1148, in emit_backref_from_scalar_set_event
passive=PASSIVE_NO_FETCH)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 980, in append
collection.append_with_event(value, initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/collections.py", line 653, in append_with_event
self._data()._sa_appender(item, _sa_initiator=initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/collections.py", line 1047, in append
item = __set(self, item, _sa_initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/collections.py", line 1019, in __set
item = executor.fire_append_event(item, _sa_initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/collections.py", line 716, in fire_append_event
item, initiator)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 929, in fire_append_event
value = fn(state, value, initiator or self._append_token)
File "/home/david/.virtualenvs/cedar/lib/python3.4/site-packages/sqlalchemy/orm/attributes.py", line 1157, in emit_backref_from_collection_append_event
child_impl = child_state.manager[key].impl
KeyError: 'device'
| 1 |
Sometimes a private modifier on the constructor can be useful, especially if
you want to implement some kind of singleton.
|
I think it is a pretty common pattern to have a static factory method to
create a class and the constructor of this class being otherwise private so
that you cannot instantiate the class unless you use the factory method.
| 1 |
**Describe the bug**
Using SQLAlchemy 1.4.0b2, I'm joining one version of a subquery on itself, so I
get duplicate column names. The SQL produced properly disambiguates these
column names, however the keys returned in the result set are not
disambiguated.
(I'm new to SQLAlchemy, so this may not be a bug, but I would think it would
be as I can't distinguish between the two different fields in the results.)
**Expected behavior**
For the labels to be the same as the keys in the result set.
**To Reproduce**
If we have two queries joined on each other, like this:
query1 = <query>.alias('a')
query2 = <query>.alias('b')
joined_query = (
select(query1, query2)
.join(query1, query1.c.join_field == query2.c.join_field)
.set_label_style(LABEL_STYLE_TABLENAME_PLUS_COL)
)
We then get SQL produced like:
SELECT a.foo as a_foo, b.foo as b_foo
FROM (<query>) a
JOIN (<query>) b ON a.join_field = b.join_field
So far so good. The column names are disambiguated in SQL according to the
specific method.
However, when I try to get the results, as follows, that disambiguation
disappears:
result = session.execute(joined_query)
for row in result:
print(row.keys())
I get a result that looks like: `RMKeyView(['foo', 'foo'])` which means that I
can't reference the two columns separately. I would think I could do something
like `row['a_foo']` or `row.a_foo` and `row['b_foo']` or `row.b_foo`, but
these throw an error.
(Note that this is true even using the default label disambiguation style,
which would produce something like `a.foo, b.foo AS foo_1` in the SQL, but
still return both keys as `foo` in the result set.)
**Versions.**
* OS: macOS 11.1
* Python: 3.9.0
* SQLAlchemy: 1.4.0b2
* Database: PostgreSQL
* DBAPI: psycopg2==2.8.6
**Thank you for any and all help!**
I have done extensive Googling about this and can't find a resolution, which
makes me think this may be a bug introduced in the new version.
If this is _not_ a bug, any recommendations about how to resolve this problem
would be appreciated!
|
**Migrated issue, originally created by Leonardo Rossi (@hachreak)**
update: this specifically hits the _stored_in_collection assertion because the
check for `if event_key._listen_fn not in self.listeners` fails for an event
key with wrapping.
I'm using Invenio 2.0 and am trying to replace the old SQLAlchemy 0.8.7 with
the latest, 0.9.7.
The utility to automatically create the db works (inveniomanage database
recreate --yes-i-know).
But when I run the tests with: python setup.py test
it returns an error:
test_fisrt_blueprint (invenio.testsuite.test_ext_template.TemplateLoaderCase) ... --------------------------------------------------------------------------------
ERROR in wrappers [/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/ext/logging/wrappers.py:310]:
--------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/ext/legacy/__init__.py", line 124, in __call__
response = self.app.full_dispatch_request()
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/flask/app.py", line 1470, in full_dispatch_request
self.try_trigger_before_first_request_functions()
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/flask/app.py", line 1497, in try_trigger_before_first_request_functions
func()
File "/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/modules/messages/views.py", line 264, in invoke_email_alert_register
email_alert_register()
File "/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/modules/messages/models.py", line 202, in email_alert_register
event.listen(MsgMESSAGE, 'after_insert', email_alert)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/api.py", line 63, in listen
_event_key(target, identifier, fn).listen(*args, **kw)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/registry.py", line 187, in listen
self.dispatch_target.dispatch._listen(self, *args, **kw)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/orm/events.py", line 547, in _listen
event_key.base_listen(**kw)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/registry.py", line 226, in base_listen
for_modify(target.dispatch).append(self, propagate)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/attr.py", line 328, in append
event_key.append_to_list(self, self.listeners)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/registry.py", line 237, in append_to_list
_stored_in_collection(self, owner)
File "/home/vagrant/.virtualenvs/invenio2/local/lib/python2.7/site-packages/sqlalchemy/event/registry.py", line 74, in _stored_in_collection
assert dispatch_reg[owner_ref] == listen_ref
AssertionError
In
/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/modules/messages/views.py
(row 264)
# Registration of email_alert invoked from blueprint
# in order to use before_app_first_request.
# Reading config CFG_WEBMESSAGE_EMAIL_ALERT
# required app context.
@blueprint.before_app_first_request
def invoke_email_alert_register():
email_alert_register()
In
/home/vagrant/.virtualenvs/invenio2/src/invenio/invenio/modules/messages/models.py
(row 202)
# Registration of email_alert invoked from blueprint
# in order to use before_app_first_request.
# Reading config CFG_WEBMESSAGE_EMAIL_ALERT
# required app context.
def email_alert_register():
if cfg['CFG_WEBMESSAGE_EMAIL_ALERT']:
from sqlalchemy import event
# Register after insert callback.
event.listen(MsgMESSAGE, 'after_insert', email_alert)
Can someone help me? :)
(Sorry, this is my first issue and I don't know how to set "kind", etc...
T_T)
Installed:
* -e git+https://github.com/mitsuhiko/flask-sqlalchemy@c7eccba63314f3ea77e2c6217d3d3c8b0d2552fd#egg=Flask_SQLAlchemy-2.0
* MySQL-python==1.2.5
* SQLAlchemy==0.9.7
* SQLAlchemy-Utils==0.23.5
| 0 |
# Description of the new feature/enhancement
When pasting a large amount of text, or even just multiple lines, optionally
show a warning to the user. This is the behaviour in Cmder.
# Proposed technical implementation details (optional)
|
# Support win10 1803
# Local compilation
| 0 |
### Version
2.6.10
### Reproduction link
https://github.com/wxkcoder/bugs_demo
### Steps to reproduce
vue version:2.6.10
ios version:10.3.3 or 11.4
use WeChat webview
Page A uses 'this.$router.push({name:B})' to jump to page B
Page B imports component C
Component C binds a touchstart event
The problem is that component C uses a slot to render input content; clicking
that content does not trigger C's touchstart event
Accessing page B directly does trigger C's touchstart event
Changing the iOS version to 12.3.1 makes C's touchstart event trigger
Changing the Vue version to 2.5.22 makes C's touchstart event trigger
Please try to run this demo https://github.com/wxkcoder/bugs_demo on iOS
10.3.3
### What is expected?
C's touchstart event is triggered
### What is actually happening?
C's touchstart event is not triggered
|
### Version
2.6.10
### Reproduction link
https://github.com/wxkcoder/bugs_demo
### Steps to reproduce
vue version:2.6.10
ios version:10.3.3 or 11.4
use WeChat webview
Page A uses 'this.$router.push({name:B})' to jump to page B
Page B imports component C
Component C binds a touchstart event
The problem is that component C uses a slot to render input content; clicking
that content does not trigger C's touchstart event
Accessing page B directly does trigger C's touchstart event
Changing the iOS version to 12.3.1 makes C's touchstart event trigger
Changing the Vue version to 2.5.22 makes C's touchstart event trigger
Please try to run this demo https://github.com/wxkcoder/bugs_demo on iOS
10.3.3
### What is expected?
C's touchstart event is triggered
### What is actually happening?
C's touchstart event is not triggered
| 1 |
Hi all,
when I use scipy.optimize.linprog for a simple LP,
how can I get the values of the dual variables (in a simple way)?
In my problem, the dual variables have some practical significance.
|
I have a huge linprog problem with almost 1k variables and constraints. I can
calculate the solution with scipy.optimize.linprog(method='simplex'), but I
need the shadow prices (or opportunity costs / duals) of ~100 inequalities.
I'm able to calculate them by adding 1 to the right side of an inequality and
then solving that problem. I then get the shadow price by subtracting the
objective function values of the two solutions: shadow_price_i = f_max_original
- f_max_i. Then repeat 100 times. This method works but it's painfully slow
(1h). Note: I could also pose the dual problem and solve it.
It's a shame that scipy does not return the duals, so I'm opening a feature
request. I'm new to contributing to open projects, but I'm a CS graduate and
have experience with both Python and the simplex method, so I may be able to do
this with some guidance.
| 1 |
Set a user lockout after X failed logins to protect against brute-force
attacks.
* Feature toggle in config
* Configurable number of logins in config
* Admin can reset user (Is Active true/false)
* Email recovery link
|
**Is your feature request related to a problem? Please describe.**
In its default state, the Superset login page permits any number of failed
login attempts. This has been flagged as a security issue by our sysadmin
team, and I agree with them. I have only tested users created in the database
and have not tried other authentication methods.
**Describe the solution you'd like**
The login page should permit, say, 3 incorrect login attempts and should ask
the user to try again after a period of time. This time should be customizable
from the config file.
**Describe alternatives you've considered**
I am not sure if this already works for other authentication methods.
**Additional context**
None.
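The counter-plus-cooldown logic the request describes can be sketched as follows. All names here are illustrative assumptions, not Superset's actual hooks; the two constants are the values the request says should come from the config file.

```javascript
// Per-user failed-login counter with a configurable lockout window.
const MAX_FAILED_LOGINS = 3;        // configurable: attempts before lockout
const LOCKOUT_MS = 15 * 60 * 1000;  // configurable: cool-down period

const attempts = new Map();         // username -> { count, lockedUntil }

function canAttemptLogin(user, now = Date.now()) {
  const rec = attempts.get(user);
  return !rec || now >= rec.lockedUntil;
}

function recordFailedLogin(user, now = Date.now()) {
  const rec = attempts.get(user) || { count: 0, lockedUntil: 0 };
  rec.count += 1;
  if (rec.count >= MAX_FAILED_LOGINS) {
    rec.lockedUntil = now + LOCKOUT_MS; // lock out until the window passes
    rec.count = 0;
  }
  attempts.set(user, rec);
}

// successful login (or an admin reset) clears the counter
function recordSuccessfulLogin(user) {
  attempts.delete(user);
}
```

A real implementation would persist the counters (Superset's auth lives in Flask-AppBuilder) rather than keep them in process memory, but the state machine is the same.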
| 1 |
**TypeScript Version:**
1.8.9
**Code**
// A self-contained demonstration of the problem follows...
export interface IHiddenColumns {
hiddenColumns: string;
}
export class HiddenColumnsLoader {
public loadHiddenColumns(): Promise<void> {
return new Promise<void>((resolve, reject) => {
let promises: Promise<IHiddenColumns>[] = [];
// push promises
// .forEach(structureContent => {
// promises.push(this.loadStructureHiddenColumns(structureContent.rowSectionDefinition, structureContent.columnSectionDefinition, provider));
// });
Promise
.all(promises)
.then((results) => {
// results should be of type IHiddenColumns[] and not Promise<IHiddenColumns>[]. See stack below
this.resolveResults(results);
resolve();
})
});
}
private resolveResults(loadPageResults: IHiddenColumns[]) {
// resolve
}
}
**Expected behavior:**
Compilation succeeded in TypeScript 1.8.7 but fails in 1.8.9.
'results' inside Promise.all().then() should be considered of type
IHiddenColumns[]. Instead it seems to be Promise<IHiddenColumns>[]. See the
trace below.
**Actual behavior:**
Compilation fails in 1.8.9 whereas it worked in 1.8.7 (and still works if I
revert to 1.8.7).
Here is the stacktrace :
[11:39:07]
42 this.resolveResults(gridModel, results);
~~~~~~~
.../HiddenColumnsLoader.ts(42,37): error TS2345: Argument of type 'Promise[]'
is not assignable to parameter of type 'IHiddenColumns[]'.
Type 'Promise' is not assignable to type 'IHiddenColumns'.
|
**TypeScript Version:**
1.8.9
**Code**
Promise.all([Promise.resolve('')]); // has type Promise<Promise<string>[]>
**Expected behavior:**
The expression has type `Promise<string[]>` as in 1.8.7
**Actual behavior:**
The expression has type `Promise<Promise<string>[]>`
| 1 |
In my project I have:
"devDependencies": {
"babel-cli": "~6.1.2",
"babel-preset-es2015": "~6.1.2",
…
When I run babel-doctor I see this:
❯ babel-doctor
Babel Doctor
Running sanity checks on your system. This may take a few minutes...
✔ Found config at /Users/tema/Dropbox/Projects/_Repos/sweet2/.babelrc
✔ No duplicate babel packages found
✖ We found some outdated packages:
- babel-runtime - Latest is 6.0.14. Local version is 5.8.29
✔ You're on npm >=3.3.0
Found potential issues on your machine :(
But the latest version of babel-cli depends on babel-runtime ^5.0.0 so I can’t
fix this warning. It’s very confusing.
|
## bug report
### Input Code
This is only an issue when using both the react and typescript presets.
interface I {
<T>(p: T): T
}
Repl Link
### Babel/Babylon Configuration (.babelrc, package.json, cli command)
{
"presets": [
"@babel/react",
"@babel/typescript"
],
}
### Expected Behavior
It should not throw an error.
### Current Behavior
It throws:
{ SyntaxError: ./src/test.tsx: Unexpected token (2:4)
1 | interface I {
> 2 | <T>(p: T): T
| ^
3 | }
4 |
### Possible Solution
Similar issue with generic methods: #6665
Similar fix for generic methods: #7225
Duplicate but closed Issue: #7158
### Context
### Your Environment
software | version(s)
---|---
Babel | 7.0.0-beta.40
preset-react | 7.0.0-beta.40
preset-typescript | 7.0.0-beta.40
node | 8.9.0
npm | 5.5.1
Operating System | OSX
| 0 |
### System info
* Playwright Version: 1.30.0 and newer
* Operating System: Windows 11, Official Playwright Jammy image
* Browser: Chromium
* Other info: Problem happens locally on Windows 11 and on the CI on a GitLab runner running the official Playwright Jammy image
### Source code
**Config file**
// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';
export default defineConfig({
projects: [
{
name: 'chromium',
reporter: [
["html"]
],
use: {
headless: false, //Running this headless decreases the speed of the tests by 5-6x
viewport: { width: 1280, height: 720 },
video: "on",
screenshot: "on",
trace: "on",
},
},
],
});
**Steps**
Running a simple test on v1.29.2 creates a snapshot, a trace and a video.
Since version 1.30.0 the exact same code won't create the video anymore.
The latest version 1.33.0 still won't create a video.
**Expected**
Video is created on version 1.30.0 and later
**Actual**
No video is created since version 1.30.0 (1.29.2 is the latest version where it
works).
|
### System info
* Playwright Version: [v1.24.0]
* Operating System: [Ubuntu 20, macOS 13.2]
* Browser: [Chromium]
* Other info:
### Source code
* I provided exact source code that allows reproducing the issue locally.
**Link to the GitHub repository with the repro**
[https://github.com/your_profile/playwright_issue_title]
or
**Config file**
// playwright.config.ts
import { defineConfig, devices } from '@playwright/test';
export default defineConfig({
projects: [
{
name: 'chromium',
use: { ...devices['Desktop Chrome'], },
},
],
});
**Test file (self-contained)**
test('should load login form', async ({ loginPage }) => {
await loginPage.doStuff();
await loginPage.doStuff(['']);
await loginPage.doStuff('');
await loginPage.doStuff();
});
**Fixture**
import { Page, test as base } from '@playwright/test';
const v8toIstanbul = require('v8-to-istanbul');
import {
LoginPage,
} from '../src/pages';
let appPage: Page = null;
export const test = base.extend<{
loginPage: LoginPage;
}>({
page: async ({ browser }, use) => {
if (appPage === null) {
appPage = await browser.newPage();
}
await use(appPage);
},
loginPage: async ({ page }, use) => {
await use(new LoginPage(page));
},
});
test.beforeEach(async ({ page }) => {
await page.coverage.startJSCoverage();
});
test.afterEach(async ({ page }) => {
const coverage = await page.coverage.stopJSCoverage();
console.log(coverage);
for (const entry of coverage) {
const converter = v8toIstanbul('', 0, { source: entry.source });
await converter.load();
converter.applyCoverage(entry.functions);
console.log(JSON.stringify(converter.toIstanbul()));
}
});
**Console error:**
{"":{"path":"","all":false,"statementMap":{"0":{"start":{"line":1,"column":0},"end":{"line":1,"column":0}},"1":{"start":{"line":2,"column":0},"end":{"line":2,"column":46}},"2":{"start":{"line":3,"column":0},"end":{"line":3,"column":23}},"3":{"start":{"line":4,"column":0},"end":{"line":4,"column":61}},"4":{"start":{"line":5,"column":0},"end":{"line":5,"column":21}},"5":{"start":{"line":6,"column":0},"end":{"line":6,"column":22}},"6":{"start":{"line":7,"column":0},"end":{"line":7,"column":10}},"7":{"start":{"line":8,"column":0},"end":{"line":8,"column":68}},"8":{"start":{"line":9,"column":0},"end":{"line":9,"column":60}},"9":{"start":{"line":10,"column":0},"end":{"line":10,"column":58}},"10":{"start":{"line":11,"column":0},"end":{"line":11,"column":40}},"11":{"start":{"line":12,"column":0},"end":{"line":12,"column":50}},"12":{"start":{"line":13,"column":0},"end":{"line":13,"column":25}},"13":{"start":{"line":14,"column":0},"end":{"line":14,"column":58}},"14":{"start":{"line":15,"column":0},"end":{"line":15,"column":10}},"15":{"start":{"line":16,"column":0},"end":{"line":16,"column":0}},"16":{"start":{"line":17,"column":0},"end":{"line":17,"column":46}},"17":{"start":{"line":18,"column":0},"end":{"line":18,"column":51}},"18":{"start":{"line":19,"column":0},"end":{"line":19,"column":60}},"19":{"start":{"line":20,"column":0},"end":{"line":20,"column":51}},"20":{"start":{"line":21,"column":0},"end":{"line":21,"column":15}},"21":{"start":{"line":22,"column":0},"end":{"line":22,"column":24}},"22":{"start":{"line":23,"column":0},"end":{"line":23,"column":9}},"23":{"start":{"line":24,"column":0},"end":{"line":24,"column":42}},"24":{"start":{"line":25,"column":0},"end":{"line":25,"column":61}},"25":{"start":{"line":26,"column":0},"end":{"line":26,"column":43}},"26":{"start":{"line":27,"column":0},"end":{"line":27,"column":50}},"27":{"start":{"line":28,"column":0},"end":{"line":28,"column":15}},"28":{"start":{"line":29,"column":0},"end":{"line":29,"column":75}},"29":{"start":{"
line":30,"column":0},"end":{"line":30,"column":9}},"30":{"start":{"line":31,"column":0},"end":{"line":31,"column":0}},"31":{"start":{"line":32,"column":0},"end":{"line":32,"column":8}},"32":{"start":{"line":33,"column":0},"end":{"line":33,"column":0}},"33":{"start":{"line":34,"column":0},"end":{"line":34,"column":61}},"34":{"start":{"line":35,"column":0},"end":{"line":35,"column":72}},"35":{"start":{"line":36,"column":0},"end":{"line":36,"column":59}},"36":{"start":{"line":37,"column":0},"end":{"line":37,"column":64}},"37":{"start":{"line":38,"column":0},"end":{"line":38,"column":51}},"38":{"start":{"line":39,"column":0},"end":{"line":39,"column":44}},"39":{"start":{"line":40,"column":0},"end":{"line":40,"column":48}},"40":{"start":{"line":41,"column":0},"end":{"line":41,"column":48}},"41":{"start":{"line":42,"column":0},"end":{"line":42,"column":17}},"42":{"start":{"line":43,"column":0},"end":{"line":43,"column":50}},"43":{"start":{"line":44,"column":0},"end":{"line":44,"column":81}},"44":{"start":{"line":45,"column":0},"end":{"line":45,"column":112}},"45":{"start":{"line":46,"column":0},"end":{"line":46,"column":87}},"46":{"start":{"line":47,"column":0},"end":{"line":47,"column":84}},"47":{"start":{"line":48,"column":0},"end":{"line":48,"column":52}},"48":{"start":{"line":49,"column":0},"end":{"line":49,"column":40}},"49":{"start":{"line":50,"column":0},"end":{"line":50,"column":40}},"50":{"start":{"line":51,"column":0},"end":{"line":51,"column":17}},"51":{"start":{"line":52,"column":0},"end":{"line":52,"column":9}},"52":{"start":{"line":53,"column":0},"end":{"line":53,"column":31}},"53":{"start":{"line":54,"column":0},"end":{"line":54,"column":52}},"54":{"start":{"line":55,"column":0},"end":{"line":55,"column":40}},"55":{"start":{"line":56,"column":0},"end":{"line":56,"column":60}},"56":{"start":{"line":57,"column":0},"end":{"line":57,"column":17}},"57":{"start":{"line":58,"column":0},"end":{"line":58,"column":9}}},"s":{"0":1,"1":1,"2":1,"3":1,"4":1,"5":1,"6":1,"
7":1,"8":1,"9":1,"10":1,"11":1,"12":1,"13":1,"14":1,"15":1,"16":1,"17":0,"18":0,"19":0,"20":0,"21":0,"22":0,"23":1,"24":0,"25":0,"26":0,"27":0,"28":0,"29":0,"30":1,"31":1,"32":1,"33":1,"34":1,"35":1,"36":1,"37":1,"38":1,"39":1,"40":1,"41":1,"42":1,"43":0,"44":0,"45":0,"46":0,"47":0,"48":0,"49":0,"50":0,"51":0,"52":1,"53":0,"54":0,"55":0,"56":0,"57":0},"branchMap":{"0":{"type":"branch","line":2,"loc":{"start":{"line":2,"column":-1},"end":{"line":58,"column":9}},"locations":[{"start":{"line":2,"column":-1},"end":{"line":58,"column":9}}]},"1":{"type":"branch","line":43,"loc":{"start":{"line":43,"column":31},"end":{"line":43,"column":47}},"locations":[{"start":{"line":43,"column":31},"end":{"line":43,"column":47}}]},"2":{"type":"branch","line":43,"loc":{"start":{"line":43,"column":49},"end":{"line":52,"column":9}},"locations":[{"start":{"line":43,"column":49},"end":{"line":52,"column":9}}]},"3":{"type":"branch","line":53,"loc":{"start":{"line":53,"column":30},"end":{"line":58,"column":9}},"locations":[{"start":{"line":53,"column":30},"end":{"line":58,"column":9}}]}},"b":{"0":[1],"1":[0],"2":[0],"3":[0]},"fnMap":{"0":{"name":"createNode","decl":{"start":{"line":17,"column":8},"end":{"line":23,"column":9}},"loc":{"start":{"line":17,"column":8},"end":{"line":23,"column":9}},"line":17},"1":{"name":"appendNodesToDom","decl":{"start":{"line":24,"column":8},"end":{"line":30,"column":9}},"loc":{"start":{"line":24,"column":8},"end":{"line":30,"column":9}},"line":24}},"f":{"0":0,"1":0}}}
Passed: 0 Failed: 1 Pending: 0
1) login/login.test.ts:18:3 › Login scenarios › should load login page ==================
Error: An error occurred while trying to read the map file at /Users/<path>/ag-grid-chunk-bfc390cf.js.map
Error: ENOENT: no such file or directory, open '/Users/<path>/ag-grid-chunk-bfc390cf.js.map'
at readFromFileMap (/Users/<path>/node_modules/convert-source-map/index.js:60:11)
at new Converter (/Users/<path>/node_modules/convert-source-map/index.js:67:32)
at Object.exports.fromMapFileComment (/Users/<path>/node_modules/convert-source-map/index.js:153:10)
at Object.exports.fromMapFileSource (/Users/<path>/node_modules/convert-source-map/index.js:165:22)
at V8ToIstanbul.load (/Users/<path>/node_modules/v8-to-istanbul/lib/v8-to-istanbul.js:52:66)
at /Users/<path>/fixtures/blink.ts:334:21
**Steps**
* [Run the test]
* npx playwright test
**Expected**
I followed the instructions as per https://playwright.dev/docs/api/class-coverage to get the code coverage.
**Actual**
But I am facing an error complaining that the map is not present at the given
location. This seems to be happening with the latest version as well.
| 0 |
Apache Druid 0.21.0 contains around 120 new features, bug fixes, performance
enhancements, documentation improvements, and additional test coverage from 36
contributors. Refer to the complete list of changes and everything tagged to
the milestone for further details.
# New features
## Operation
### Service discovery and leader election based on Kubernetes
The new Kubernetes extension supports service discovery and leader election
based on Kubernetes. This extension works in conjunction with the HTTP-based
server view (`druid.serverview.type=http`) and task management
(`druid.indexer.runner.type=httpRemote`) to allow you to run a Druid cluster
_with zero ZooKeeper dependencies_. This extension is still **experimental**.
See Kubernetes extension for more details.
#10544
#9507
#10537
### New dynamic coordinator configuration to limit the number of segments when finding a candidate segment for segment balancing
You can set the `percentOfSegmentsToConsiderPerMove` to limit the number of
segments considered when picking a candidate segment to move. The candidates
are searched up to `maxSegmentsToMove * 2` times. This new configuration
prevents Druid from iterating through all available segments to speed up the
segment balancing process, especially if you have lots of available segments
in your cluster. See Coordinator dynamic configuration for more details.
#10284
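As a toy illustration of this candidate-limiting idea (this sketch is mine and in Python; Druid's actual implementation is in Java and differs in detail), sampling only a percentage of segments per move might look like:

```python
import random

def pick_candidate_segment(segments, percent_to_consider, rng):
    # Consider only a sampled percentage of all segments as move
    # candidates, instead of iterating over every available segment.
    pool_size = max(1, int(len(segments) * percent_to_consider / 100))
    pool = rng.sample(segments, pool_size)
    return rng.choice(pool)

rng = random.Random(0)
segments = [f"segment-{i}" for i in range(1000)]
candidate = pick_candidate_segment(segments, 25, rng)  # samples ~250 of 1000
```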
### `status` and `selfDiscovered` endpoints for Indexers
The Indexer now supports `status` and `selfDiscovered` endpoints. See
Processor information APIs for details.
#10679
## Querying
### New `grouping` aggregator function
You can use the new `grouping` aggregator SQL function with `GROUPING SETS` or
`CUBE` to indicate which grouping dimensions are included in the current
grouping set. See Aggregation functions for more details.
#10518
### Improved missing argument handling in expressions and functions
Expression processing can now be vectorized when inputs are missing, for
example when a column does not exist. When an argument is missing in an
expression, Druid can now infer the proper result type from the non-null
arguments. For
instance, for `longColumn + nonExistentColumn`, `nonExistentColumn` is treated
as `(long) 0` instead of `(double) 0.0`. Finally, in default null handling
mode, math functions can produce output properly by treating missing arguments
as zeros.
#10499
### Allow zero period for `TIMESTAMPADD`
The `TIMESTAMPADD` function now allows a zero period. This functionality is required
for some BI tools such as Tableau.
#10550
## Ingestion
### Native parallel ingestion no longer requires explicit intervals
The Parallel task no longer requires you to set explicit intervals in
`granularitySpec`. If intervals are missing, the parallel task executes an
extra step for input sampling which collects the intervals to index.
#10592
#10647
### Old Kafka version support
Druid now supports Apache Kafka older than 0.11. To read from an old version
of Kafka, set the `isolation.level` to `read_uncommitted` in
`consumerProperties`. Only version 0.10.2.1 has been tested as of this release.
See Kafka supervisor configurations for details.
#10551
### Multi-phase segment merge for native batch ingestion
A new tuningConfig, `maxColumnsToMerge`, controls how many segments can be
merged at the same time in the task. This configuration can be useful to avoid
high memory pressure during the merge. See tuningConfig for native batch
ingestion for more details.
#10689
### Native re-ingestion is less memory intensive
Parallel tasks now sort segments by ID before assigning them to subtasks. This
sorting minimizes the number of time chunks for each subtask to handle. As a
result, each subtask is expected to use less memory, especially when a single
Parallel task is issued to re-ingest segments covering a long time period.
#10646
## Web console
### Updated and improved web console styles
The new web console styles make better use of the Druid brand colors and
standardize paddings and margins throughout. The icon and background colors
are now derived from the Druid logo.

#10515
### Partitioning information is available in the web console
The web console now shows datasource partitioning information on the new
`Segment granularity` and `Partitioning` columns.
##### `Segment granularity` column in the `Datasources` tab

##### `Partitioning` column in the `Segments` tab

#10533
### The column order in the `Schema` table matches the `dimensionsSpec`
The `Schema` table now reflects the dimension ordering in the
`dimensionsSpec`.

#10588
## Metrics
### Coordinator duty runtime metrics
The Coordinator performs several 'duty' tasks, for example segment balancing
and loading new segments. There are now two new metrics to help you analyze
how fast the Coordinator is executing these duties.
* `coordinator/time`: the time for an individual duty to execute
* `coordinator/global/time`: the time for the whole duties runnable to execute
#10603
### Query timeout metric
A new metric provides the number of timed-out queries. Previously, timed-out
queries were treated as interrupted and included in the
`query/interrupted/count` (see Changed HTTP status codes for query errors for
more details).
`query/timeout/count`: the number of timed out queries during the emission
period
#10567
### Shuffle metrics for batch ingestion
Two new metrics provide shuffle statistics for MiddleManagers and Indexers.
These metrics have the `supervisorTaskId` as their dimension.
* `ingest/shuffle/bytes`: number of bytes shuffled per emission period
* `ingest/shuffle/requests`: number of shuffle requests per emission period
To enable the shuffle metrics, add
`org.apache.druid.indexing.worker.shuffle.ShuffleMonitor` in
`druid.monitoring.monitors`. See Shuffle metrics for more details.
#10359
### New clock-drift safe metrics monitor scheduler
The default metrics monitor scheduler is implemented based on
`ScheduledThreadPoolExecutor` which is prone to unbounded clock drift. A new
monitor scheduler, `ClockDriftSafeMonitorScheduler`, overcomes this
limitation. To use the new scheduler, set
`druid.monitoring.schedulerClassName` to
`org.apache.druid.java.util.metrics.ClockDriftSafeMonitorScheduler` in the
runtime.properties file.
#10448
#10732
## Others
### New extension for a password provider based on AWS RDS token
A new `PasswordProvider` type allows access to AWS RDS DB instances using
temporary AWS tokens. This extension can be useful when an RDS is used as
Druid's metadata store. See AWS RDS extension for more details.
#9518
### The `sys.servers` table shows leaders
A new long-typed column `is_leader` in the `sys.servers` table indicates
whether or not the server is the leader.
#10680
### `druid-influxdb-emitter` extension supports the HTTPS protocol
See Influxdb emitter extension for new configurations.
#9938
## Docker
### Small docker image
The docker image size is reduced by half by eliminating unnecessary
duplication.
#10506
## Development
### Extensible Kafka consumer properties via a new `DynamicConfigProvider`
A new class `DynamicConfigProvider` enables fetching consumer properties at
runtime. For instance, you can use `DynamicConfigProvider` to fetch
`bootstrap.servers` from a location such as a local environment variable when it
is not static. Currently, only a map-based config provider is supported by
default. See DynamicConfigProvider for how to implement a custom config
provider.
#10309
# Bug fixes
Druid 0.21.0 contains 30 bug fixes; you can see the complete list here.
### Post-aggregator computation with subtotals
Before 0.21.0, queries failed with an error when you used post aggregators
with subtotals. This bug is now fixed, and you can use post aggregators with
subtotals.
#10653
### Indexers announce themselves as segment servers
In 0.19.0 and 0.20.0, Indexers could not process queries against streaming
data because they did not announce themselves as segment servers. In 0.21.0,
they announce themselves properly.
#10631
### Validity check for segment files in historicals
Historicals now perform a validity check after they download segment files and
automatically re-download them if the files are corrupted.
#10650
### `StorageLocationSelectorStrategy` injection failure is fixed
The injection failure while reading the configurations of
`StorageLocationSelectorStrategy` is fixed.
#10363
# Upgrading to 0.21.0
Consider the following changes and updates when upgrading from Druid 0.20.0 to
0.21.0. If you're updating from an earlier version than 0.20.0, see the
release notes of the relevant intermediate versions.
### Improved HTTP status codes for query errors
Before this release, Druid returned "internal error (500)" for most query
errors. Now Druid returns different error codes based on their cause. The
following table lists the errors and their corresponding codes that have
changed:
Exception | Description | Old code | New code
---|---|---|---
SqlParseException and ValidationException from Calcite | Query planning failed | 500 | **400**
QueryTimeoutException | Query execution didn't finish within the timeout | 500 | **504**
ResourceLimitExceededException | Query asked for more resources than the configured threshold | 500 | **400**
InsufficientResourceException | Query failed to schedule because of a lack of merge buffers available at the time it was submitted | 500 | **429**, merged into QueryCapacityExceededException
QueryUnsupportedException | Unsupported functionality | 400 | **501**
There is also a new query metric for query timeout errors. See New query
timeout metric for more details.
#10464
#10746
### Query interrupted metric
`query/interrupted/count` no longer counts the queries that timed out. These
queries are counted by `query/timeout/count`.
### `context` dimension in query metrics
`context` is now a default dimension emitted for all query metrics. `context`
is a JSON-formatted string containing the query context for the query that the
emitted metric refers to. The addition of a dimension that was not previously
present alters some metrics emitted by Druid. You should plan to handle this new
`context` dimension in your metrics pipeline. Since the dimension is a JSON-
formatted string, a common solution is to parse the dimension and either
flatten it or extract the bits you want and discard the full JSON-formatted
string blob.
#10578
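For instance, a metrics pipeline written in Python might parse and flatten the new dimension like this (the record shape and field names here are hypothetical, not a documented Druid format):

```python
import json

# Hypothetical emitted metric record with the new JSON-formatted dimension.
metric = {
    "metric": "query/time",
    "value": 118,
    "context": '{"queryId": "abc-123", "priority": 1, "useCache": true}',
}

# Parse the JSON string and flatten it into top-level dimension keys.
ctx = json.loads(metric["context"])
flattened = {f"context.{key}": value for key, value in ctx.items()}
```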
### Deprecated support for Apache ZooKeeper 3.4
As ZooKeeper 3.4 has been end-of-life for a while, support for ZooKeeper 3.4
is deprecated in 0.21.0 and will be removed in the near future.
#10780
### Consistent serialization format and column naming convention for the `sys.segments` table
All columns in the `sys.segments` table are now serialized in the JSON format
to make them consistent with other system tables. Column names now use the
same "snake case" convention.
#10481
# Known issues
### Known security vulnerability in the Thrift library
The Thrift extension can be useful for ingesting files of the Thrift format
into Druid. However, there is a known security vulnerability in the version of
the Thrift library that Druid uses. The vulnerability can be exploited by
ingesting maliciously crafted Thrift files when you use Indexers. We recommend
granting the `DATASOURCE WRITE` permission to only trusted users.
For a full list of open issues, please see Bug .
# Credits
Thanks to everyone who contributed to this release!
@a2l007
@abhishekagarwal87
@asdf2014
@AshishKapoor
@awelsh93
@ayushkul2910
@bananaaggle
@capistrant
@ccaominh
@clintropolis
@cloventt
@FrankChen021
@gianm
@harinirajendran
@himanshug
@jihoonson
@jon-wei
@kroeders
@liran-funaro
@martin-g
@maytasm
@mghosh4
@michaelschiff
@nishantmonu51
@pcarrier
@QingdongZeng3
@sthetland
@suneet-s
@tdt17
@techdocsmith
@valdemar-giosg
@viatcheslavmogilevsky
@viongpanzi
@vogievetsky
@xvrl
@zhangyue19921010
|
@gianm, this is meant as a placeholder, feel free to alter title and content
of this issue
In #6066 we discussed the possibility of specifying the `sortOrder` on
ingestion specs.
Currently, rows in segments are sorted based on `[__time] + dimensions`.
Knowing that Druid's bitmaps use some form of RLE compression, and that LZ4
operates by finding duplication on sliding windows (reminiscent of RLE-
compression), allowing for a custom sort order could allow for significantly
denser segments. Users may want to take the time to specify a sort order that
improves performance based on their data and workloads. This may mean sorting
based on cardinality (lower-cardinality dims first), one-to-many hierarchies
across dims (country->region), or heavily used and/or commonly filtered
columns.
A key item would be to remove the assumption that `__time` is the first item
in the ordering to allow for more RLE-friendly ordering. As @gianm mentioned
in #6066, many places in the codebase build upon this assumption, and we'd
have to make sure that isn't the case anymore as a prerequisite.
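A small sketch (my own illustration, not Druid code) shows why ordering by a low-cardinality dimension first is more RLE-friendly: it reduces the number of value runs in that column.

```python
def run_count(column):
    # Number of value runs in a column; fewer runs compress better under RLE.
    if not column:
        return 0
    return 1 + sum(a != b for a, b in zip(column, column[1:]))

time_ordered = ["US", "DE", "US", "DE", "US", "DE"]  # rows in __time order
country_sorted = sorted(time_ordered)                # country as leading sort key

runs_before = run_count(time_ordered)    # 6: every row starts a new run
runs_after = run_count(country_sorted)   # 2: two long, RLE-friendly runs
```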
| 0 |
I think it should be computed in terms of `zero_one_loss` to avoid code
duplication.
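A minimal sketch of the idea (illustrative only; scikit-learn's actual implementation differs): derive the accuracy-style metric directly from the loss so the comparison logic lives in one place.

```python
def zero_one_loss(y_true, y_pred):
    # Fraction of misclassified samples.
    return sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)

def accuracy_score(y_true, y_pred):
    # Computed in terms of zero_one_loss to avoid duplicating the logic.
    return 1.0 - zero_one_loss(y_true, y_pred)
```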
|
Hi, I would like to report what seems to be a bug in the treatment of the
`copy_X` parameter of the `LassoLarsIC` class. Because it's a simple bug, it's
much easier to see in the code directly than in the execution, so I am not
posting steps to reproduce it.
As you can see here, LassoLarsIC accepts a copy_X parameter.
scikit-learn/sklearn/linear_model/least_angle.py, line 1487 in 7389dba:
self.copy_X = copy_X
However, it also takes a copy_X parameter a few lines below, in the definition
of `fit`.
`def fit(self, X, y, copy_X=True):`
Now there are two values (potentially contradicting each other) for copy_X and
each one is used once. Therefore `fit` can have a mixed behaviour. Even worse,
this can be completely invisible to the user, since copy_X has a default value
of True. Let's assume that I'd like it to be False, and have set it to False
in the initialization, `my_lasso = LassoLarsIC(copy_X=False)`. I then call
`my_lasso.fit(X, y)` and my choice will be silently overwritten.
Ideally I think that copy_X should be removed as an argument in `fit`. No
other estimator seems to have a duplication in class parameters and fit
arguments (I've checked more than ten in the linear models module). However,
this would break existing code. Therefore I propose that `fit` takes a default
value of `None` and only overwrites the existing value if the user has
explicitly passed it as an argument to `fit`. I will submit a PR to that
effect.
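A sketch of the proposed behavior (simplified; not the real estimator): `fit` takes `copy_X=None` and only overrides the constructor value when the caller passes it explicitly.

```python
class LassoLarsICSketch:
    def __init__(self, copy_X=True):
        self.copy_X = copy_X

    def fit(self, X, y, copy_X=None):
        # Respect the constructor setting unless the caller explicitly
        # passes copy_X to fit().
        effective = self.copy_X if copy_X is None else copy_X
        self.used_copy_X_ = effective  # stand-in for the real fitting logic
        return self
```

With this, `LassoLarsICSketch(copy_X=False).fit(X, y)` is no longer silently overridden by a `fit` default of `True`.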
| 0 |
**Janning Vygen** opened **SPR-3449** and commented
the binding of a list does not work in spring 2.0.4. The error is reproducable
with the code below (including pom maven descriptor).
The same code DOES work with spring 2.0.3 (you just need to change the pom if
you use maven)
// imports removed
public class SimpleControllerTest extends TestCase {
private SimpleController controller;
public void testCorrectModel ( ) throws Exception {
controller = new SimpleController();
controller.setCommandClass(ListForm.class);
MockHttpServletRequest req = new MockHttpServletRequest("POST", "/myurl");
MockHttpServletResponse res = new MockHttpServletResponse();
req.addParameter("oks[0].ok", "true");
ModelAndView mav = controller.handleRequest(req, res);
ListForm form = (ListForm) mav.getModelMap().get("command");
Boolean ok = form.getOks().get(0).getOk();
assertNotNull(ok);
}
}
// imports removed
public class SimpleController extends AbstractFormController
{
protected ModelAndView processFormSubmission ( HttpServletRequest req,
HttpServletResponse resp, Object command, BindException err ) throws Exception
{
ModelAndView mav = new ModelAndView();
mav.addObject("command", command);
return mav;
}
@Override
protected ModelAndView showForm ( HttpServletRequest arg0, HttpServletResponse arg1, BindException arg2 ) throws Exception {
return null;
}
}
// imports removed
public class Ok
{
Boolean ok;
public Boolean getOk () {
return ok;
}
public void setOk ( Boolean ok ) {
this.ok = ok;
}
}
// imports removed
public class ListForm
{
private List<Ok> oks = new ArrayList<Ok>();
public ListForm () {
for( int index = 0; index < 5; index++) {
Ok ok = new Ok();
oks.add( ok );
}
}
public List<Ok> getOks ( ) {
return oks;
}
public void setOks ( List<Ok> oks ) {
this.oks = oks;
}
}
<?xml version="1.0" encoding="UTF-8"?>
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>test</groupId>
<artifactId>test</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring</artifactId>
<version>2.0.4</version>
<!--
<version>2.0.3</version>
-->
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-mock</artifactId>
<version>2.0.4</version>
<!--
<version>2.0.3</version>
-->
<scope>test</scope>
</dependency>
</dependencies>
</project>
* * *
**Affects:** 2.0.4
**Issue Links:**
* #8043 Error in BeanWrapperImpl.setPropertyValue for nested paths for primitive wrapper types such as integer ( _ **"duplicates"**_ )
|
**Peter Dettman** opened **SPR-6041** and commented
The Caucho hessian lib has now stabilised its v2.0 protocol and rounded up
several regressions, with the release of v4.0.1
(http://caucho.com/download/hessian-4.0.1.jar).
In 4.0.1, the HessianSkeleton class provides a ready-made invoke method that
Spring's HessianExporter can call (instead of duplicating its code). I am
attaching a patch that modifies HessianExporter.java to do so. The debugging
output part of the code is still duplicated of necessity until some suitable
refactoring of HessianSkeleton is done.
As far as backward compatibility is concerned, I believe users should avoid
using Hessian 2.0 protocol from previous versions (that may be an overly
strict interpretation). Hessian 1.0 protocol should work fine with a v4.0.1
server. I understand that there could be problems if people are using
HessianOutput/Hessian2Output or streaming versions directly.
I recommend Caucho's Hessian 4.0.1 be used for Spring 3.0, a fresh start of
sorts.
See also:
http://maillist.caucho.com/pipermail/hessian-interest/2009-June/000750.html
(and surrounding discussion on that list)
http://jira.springframework.org/browse/SPR-5469
http://bugs.caucho.com/view.php?id=3646
* * *
**Affects:** 3.0 M4
**Attachments:**
* HessianExporter_401.patch ( _2.88 kB_ )
**Issue Links:**
* #10142 Support Hessian 3.2.1
2 votes, 4 watchers
| 0 |
### Bug report
**Bug summary**
Attempting to import matplotlib under Python 3.7.x on Win10Pro results in the
error: "ImportError: DLL load failed: The specified module could not be found".
The DLL name is not given.
However, first importing PyQt5, and only then importing matplotlib works as it
should.
**Code for reproduction**
* install Python 3.7.x (tested both 3.7.2 and 3.7.3) on Win10Pro
* install matplotlib (version 3.1.0) with pip - installs fine, no issues
* install PyQt5 (version 5.12.2) with pip - installs fine, no issues
* start Python
* under Python, enter command: "import matplotlib"
Python 3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 22:22:05) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import matplotlib
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Program Files\Python37\lib\site-packages\matplotlib\__init__.py", line 200, in <module>
_check_versions()
File "C:\Program Files\Python37\lib\site-packages\matplotlib\__init__.py", line 194, in _check_versions
module = importlib.import_module(modname)
File "C:\Program Files\Python37\lib\importlib\__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
ImportError: DLL load failed: The specified module could not be found.
>>>
>>>
# importing matplotlib works fine when PyQt5 imported first
>>> import PyQt5
>>> import matplotlib
>>> import serial
>>>
>>> ^Z
**Actual outcome**
# see above
**Expected outcome**
Using matplotlib should not depend on importing PyQt5 first; but when PyQt5 is
imported first, the code works fine.
**Matplotlib version**
* Operating system: Win10Pro
* Matplotlib version: 3.1.0
* Matplotlib backend (`print(matplotlib.get_backend())`): TkAgg right after importing; in the full code using qt5agg
* Python version: 3.7.2, 3.7.3, 64 bit
* Jupyter version (if applicable): n.a.
* Other libraries:
Python installed from Win installer
matplotlib and PyQt5 installed with pip
|
### Documentation Link
https://matplotlib.org/devdocs/api/_as_gen/mpl_toolkits.mplot3d.axes3d.Axes3D.html#mpl_toolkits.mplot3d.axes3d.Axes3D.scatter
### Problem
Note that both scatter() and scatter3D() appear, although they are strict
aliases of one another, duplicating much content. (I'm not suggesting to get
rid of the alias, just to dedupe them in the docs.) Quite a few other methods
exhibit the same duplication.
### Suggested improvement
_No response_
### Matplotlib Version
3.4.2.post1559+g3a46327c9c
### Matplotlib documentation version
3.4.2.post1559+g3a46327c9c
| 0 |
## Bug Report
* I would like to work on a fix!
**Current Behavior**
I created my project with CRA.
I tried to import React-Navigation into my project. After the import, Babel
started throwing an error about the "@babel/plugin-proposal-export-default-from"
plugin every time I tried to run my code. So I added the plugin and also updated
my Babel configuration, but it still keeps throwing this error. What am I
missing?
SyntaxError: .../node_modules/react-native-gesture-handler/DrawerLayout.js: Support for the experimental syntax 'exportDefaultFrom' isn't currently enabled (30:8):
28 | const SETTLING = 'Settling';
29 |
> 30 | export type PropType = {
| ^
31 | children: any,
32 | drawerBackgroundColor?: string,
33 | drawerPosition: 'left' | 'right',
Add @babel/plugin-proposal-export-default-from (https://git.io/vb4yH) to the 'plugins' section of your Babel config to enable transformation.
**Input Code**
* REPL or Repo link if applicable:
https://github.com/FrediArro/babel-error
**Expected behavior/code**
No errors
**Babel Configuration (babel.config.js, .babelrc, package.json#babel, cli
command, .eslintrc)**
* Filename: `package.json`
"babel": {
"plugins": [
"@babel/plugin-proposal-export-default-from"
]
}
**Environment**
System:
OS: Linux 5.3 Ubuntu 18.04.4 LTS (Bionic Beaver)
Binaries:
Node: 12.16.1 - ~/.nvm/versions/node/v12.16.1/bin/node
Yarn: 1.21.1 - /usr/bin/yarn
npm: 6.13.4 - ~/.nvm/versions/node/v12.16.1/bin/npm
npmPackages:
@babel/plugin-proposal-export-default-from: ^7.8.3 => 7.8.3
@babel/plugin-proposal-export-namespace-from: ^7.8.3 => 7.8.3
**Possible Solution**
**Additional context/Screenshots**
Add any other context about the problem here. If applicable, add screenshots
to help explain.
|
## Bug Report
* I would like to work on a fix!
**Current behavior**
@babel/eslint-parser gives false positive (shows error) for no-unused-vars
rule with TypeScript types:
error 'Rubik' is defined but never used no-unused-vars
**Input Code**
import Rubik from './Rubik';
export default class Arranger {
private readonly rubik: Rubik;
}
**Expected behavior**
When run eslint, no errors/warnings should arise.
**Babel Configuration (babel.config.js, .babelrc, package.json#babel, cli
command, .eslintrc)**
* Filename: `babel.config.js`
{
"presets": [
"@babel/preset-env",
"@babel/preset-typescript"
],
"plugins": [
"@babel/proposal-class-properties"
]
}
* Filename: `.eslintrc.json`
{
"parser": "@babel/eslint-parser",
"rules": {
"no-unused-vars": ["error"]
}
}
**Environment**
System:
OS: Linux 4.19 Alpine Linux
Binaries:
Node: 12.18.3 - /usr/bin/node
Yarn: 1.22.4 - /usr/bin/yarn
npmPackages:
@babel/core: ^7.10.2 => 7.11.4
@babel/eslint-parser: ^7.11.4 => 7.11.4
@babel/eslint-plugin: ^7.11.3 => 7.11.3
@babel/plugin-proposal-class-properties: ^7.8.3 => 7.10.4
@babel/preset-env: ^7.10.2 => 7.11.0
@babel/preset-typescript: ^7.9.0 => 7.10.4
eslint: 7.2.* => 7.2.0
webpack: ^4.43.0 => 4.44.1
**Note**
This issue already existed before switching to Babel monorepo (in babel/babel-
eslint) so the switch is not the cause.
| 0 |
_Original ticket: http://projects.scipy.org/numpy/ticket/1064 on 2009-03-23 by
trac user changimeno, assigned to @charris._
`interp(x, xp, fp, left=None, right=None)`
The "right" parameter option does not work properly.
Example:
In [2]: x = arange(4)*2 + 2
In [3]: y=x**2
In [4]: xx = arange(11)
In [5]: x
Out[5]: array([2, 4, 6, 8])
In [6]: y
Out[6]: array([ 4, 16, 36, 64])
In [7]: xx
Out[7]: array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
In [14]: numpy.interp(xx,x,y)
Out[14]: array([ 4., 4., 4., 10., 16., 26., 36., 50., 64., 64., 64.]) #---->>>> RIGHT!!!!
In [15]: numpy.interp(xx,x,y,left=0.)
Out[15]: array([ 0., 0., 4., 10., 16., 26., 36., 50., 64., 64., 64.]) #---->>>> RIGHT!!!!
In [16]: numpy.interp(xx,x,y,left=0.,right=0.)
Out[16]: array([ 0., 0., 4., 10., 16., 26., 36., 50., 0., 0., 0.]) #---->>>> WRONG!!!!
numpy.interp sets the values of the interpolated array (yy) for "xx[i] >=
x[len(x)-1]" to the given RIGHT value. So, the last GOOD value (xx[i] ==
x[len(x)-1]) is wrongly set to the RIGHT value instead of the proper value
yy[i] = y[len(x)-1].
I edited the file numpy/lib/src/_compiled_base.c and fixed this bug (see
attachment). Now
In [19]: numpy.interp(xx,x,y,0.,0.)
Out[19]: array([ 0., 0., 4., 10., 16., 26., 36., 50., 64., 0., 0.]) #---->>>> RIGHT!!!!
My only worry is that I modified "binary_search" together with "arr_interp"
and I do not know if this function is called by other routines and whether
this modification can alter the results.
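For reference, the expected post-fix behavior on the reporter's own example can be checked directly (this fix has long since landed; current NumPy releases behave this way):

```python
import numpy as np

x = np.arange(4) * 2 + 2   # [2, 4, 6, 8]
y = x ** 2                 # [4, 16, 36, 64]
xx = np.arange(11)

# `right` applies only strictly beyond x[-1]; the exact endpoint xx == 8
# keeps its true value y[-1] == 64 instead of being clobbered by `right`.
yy = np.interp(xx, x, y, left=0.0, right=0.0)
# yy == [0, 0, 4, 10, 16, 26, 36, 50, 64, 0, 0]
```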
diff
> /* sgg */
> else if (dval > dlist [len-1])
> result = len ;
> /* sgg */
448c454,455
< else if (indx >= lenxp - 1)
---
> /*sgg else if (indx >= lenxp - 1) */
> else if (indx > lenxp - 1)
Cheers,
Chan
|
Cross-compiling scipy and other projects that depend on numpy's distutils is a
huge pain right now, because to do it [in addition to lots of other details
that you have to get right] you have to have both a native and cross-compiled
version of numpy installed. It seems pretty unreasonable that I need a native
version of numpy installed to compile scipy. One might ask, why is this
needed?
Well, scipy's setup.py uses numpy's distutils fork for the fortran support (I
think). Because numpy.distutils is a subpackage of numpy, if you're working
with a cross-compiled version of numpy, it eventually tries to import some .so
that wasn't compiled for your host and everything dies -- thus you have to
have a native numpy installed to allow numpy.distutils to import correctly.
As far as I can tell, numpy's distutils fork is pure python and doesn't
actually use anything else in numpy itself, and so is completely separable
from numpy. If it were its own top-level package that scipy et al could use,
then cross-compiling would be significantly less tricky.
The mechanics of this would work like so:
* contents of numpy.distutils would be moved to 'numpy_distutils' package
* importing numpy.distutils would raise a deprecation warning and redirect to numpy_distutils, probably using import hook magic
* scipy and other packages would now utilize numpy_distutils instead of numpy.distutils at build time
Initially, I considered proposing creating a separate pypi package for
numpy_distutils, but I expect that would be more trouble than it's worth. One
could also propose creating a new PEP 517 build backend, or move to cmake, or
other huge changes, but those also seem more trouble than they're worth.
Thanks for your consideration.
| 0 |
#### Description
When there are duplicated input points to KMeans, resulting in the number of
unique points being less than the number of requested clusters, no error is
thrown. Instead,
clustering continues to (seemingly) produce the number of clusters requested,
but some of them are exactly the same, so the cluster labels produced for the
input points do not go all the way to number of requested clusters.
#### Steps/Code to Reproduce
from sklearn.cluster import KMeans
import numpy as np
# some input points here are identical, so that n_total=17, n_unique=9
x2d = np.array([(1086, 348), (1087, 347), (1190, 244), (1190, 244), (1086, 348), (1185, 249), (1193, 241), (1185, 249), (1087, 347), (1188, 247), (1187, 233), (26, 111), (26, 111), (26, 110), (26, 110), (26, 110), (26, 110)])
kmeans = KMeans(n_clusters=10) # n_clusters > n_unique
c_labels = kmeans.fit_predict(x2d)
c_centers = kmeans.cluster_centers_
#### Expected Results
Either an error thrown, or the cluster labels produced should match the unique
clusters only (i.e. no identical cluster centres)
#### Actual Results
>>> c_labels # note there's no entry for cluster 9
array([7, 2, 6, 6, 7, 5, 4, 5, 2, 1, 3, 8, 8, 0, 0, 0, 0], dtype=int32)
>>> c_centers # two of these 10 clusters have identical centers, so only 9 of them are unique
array([[ 26., 110.],
[ 1188., 247.],
[ 1087., 347.],
[ 1187., 233.],
[ 1193., 241.],
[ 1185., 249.],
[ 1190., 244.],
[ 1086., 348.],
[ 26., 111.],
[ 26., 110.]])
#### Versions
Darwin-16.7.0-x86_64-i386-64bit
Python 3.6.1 |Continuum Analytics, Inc.| (default, May 11 2017, 13:04:09)
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)]
NumPy 1.13.1
SciPy 0.19.1
Scikit-Learn 0.18.2
|
### Describe the bug
PCA fit_transform() gives different (and wrong) results than calling fit()
first and then transform() on the same data; doing the two steps separately
yields the correct results.
### Steps/Code to Reproduce
import numpy as np
from sklearn.decomposition import PCA
test_encoding=np.matrix([[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 0., 0., 0., 1.],
[0., 0., 0., 0., 0., 1.]])
# two steps
pca = PCA(n_components=2)
pca.fit(test_encoding)
print(pca.transform(test_encoding)) # correct results
# one step
pca = PCA(n_components=2)
print(pca.fit_transform(test_encoding)) # wrong results, because the first 3 rows should be the same
### Expected Results
array([[-5.65685425e-01, -8.88178420e-17],
[-5.65685425e-01, -8.88178420e-17],
[-5.65685425e-01, -8.88178420e-17],
[ 8.48528137e-01, 1.33226763e-16],
[ 8.48528137e-01, 1.33226763e-16]])
### Actual Results
array([[-5.65685425e-01, 6.23316960e-17],
[-5.65685425e-01, -5.02181828e-17],
[-5.65685425e-01, -5.02181828e-17],
[ 8.48528137e-01, -1.27015565e-17],
[ 8.48528137e-01, -1.27015565e-17]])
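As a diagnostic (my own sketch, not from the report): the two code paths agree in the first component, and the second component is numerically zero in both outputs, which suggests the difference is floating-point noise in a degenerate direction rather than a wrong projection.

```python
import numpy as np
from sklearn.decomposition import PCA

# Same data as in the report, as a plain ndarray (5 samples, 6 features).
X = np.array([[0., 0., 1., 0., 0., 0.]] * 3 + [[0., 0., 0., 0., 0., 1.]] * 2)

a = PCA(n_components=2).fit(X).transform(X)  # two-step path
b = PCA(n_components=2).fit_transform(X)     # one-step path

# First components agree; second components are on the order of 1e-16.
print(np.allclose(a[:, 0], b[:, 0]))
print(np.max(np.abs(a[:, 1])), np.max(np.abs(b[:, 1])))
```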
### Versions
import sklearn
sklearn.__version__ # '0.23.2'
| 0 |
I have a problem installing Atom 0.190.0 using AtomSetup.exe - it runs for a
fraction of a second and exits immediately without any error message.
I managed to determine that Atom is using something called Squirrel to build
the installer (is it mentioned somewhere?), so here is its SquirrelSetup.log:
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Failed to load local releases, starting from scratch: System.IO.DirectoryNotFoundException: Could not find a part of the path 'C:\Users\szx\AppData\Local\atom\packages\RELEASES'.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
at Squirrel.Utility.LoadLocalReleases(String localReleaseFile)
at Squirrel.UpdateManager.CheckForUpdateImpl.<CheckForUpdate>d__28.MoveNext()
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: First run or local directory is corrupt, starting from scratch
ApplyReleasesImpl: Writing files to app directory: C:\Users\szx\AppData\Local\atom\app-0.187.0
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.187.0\atom.exe]
ApplyReleasesImpl: Starting fixPinnedExecutables
ApplyReleasesImpl: fixPinnedExecutables: oldAppDirectories is empty, this is pointless
ApplyReleasesImpl: cleanDeadVersions: for version 0.187.0
ApplyReleasesImpl: cleanDeadVersions: exclude folder app-0.187.0
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: hwhat, local version is greater than remote version
ApplyReleasesImpl: Found partially applied release folder, killing it: C:\Users\szx\AppData\Local\atom\app-0.189.0
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm\node_modules\npm\node_modules\fstream-npm\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm\node_modules\npm\node_modules\fstream-npm: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core\node_modules\output-file-sync\node_modules\mkdirp\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core\node_modules\output-file-sync\node_modules\mkdirp: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core\node_modules\output-file-sync\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core\node_modules\output-file-sync: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules\babel-core: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm\node_modules\npm\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm\node_modules\npm: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\apm: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app\node_modules: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources\app: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0\resources: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
Utility: DeleteDirectory: could not delete - C:\Users\szx\AppData\Local\atom\app-0.189.0: System.IO.IOException: The directory is not empty.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.Directory.DeleteHelper(String fullPath, String userPath, Boolean recursive, Boolean throwOnTopLevelDirectoryNotFound)
at System.IO.Directory.Delete(String fullPath, String userPath, Boolean recursive, Boolean checkHost)
at Squirrel.Utility.<DeleteDirectory>d__3b.MoveNext()
ApplyReleasesImpl: Writing files to app directory: C:\Users\szx\AppData\Local\atom\app-0.189.0
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
ApplyReleasesImpl: Starting fixPinnedExecutables
ApplyReleasesImpl: Processing shortcut 'C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Program Files (x86)\CMake\bin\cmake-gui.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut ''
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Program Files (x86)\foobar2000\foobar2000.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Program Files (x86)\Mozilla Firefox\firefox.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Program Files (x86)\Skype\Phone\Skype.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\devenv.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: Processing shortcut 'C:\Users\szx\AppData\Roaming\uTorrent\uTorrent.exe'
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.187.0', continuing to next directory
ApplyReleasesImpl: Does not match 'C:\Users\szx\AppData\Local\atom\app-0.190.0', continuing to next directory
ApplyReleasesImpl: cleanDeadVersions: for version 0.190.0
ApplyReleasesImpl: cleanDeadVersions: exclude folder app-0.190.0
ApplyReleasesImpl: cleanDeadVersions: exclude folder app-0.190.0
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\szx\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\szx\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\szx\AppData\Local\atom\app-0.190.0\atom.exe]
|
Hi all,
When I try to install Atom on my Windows 8.1 machine I get this error.

## And the log say:
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\Mattia\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\Mattia\AppData\Local\SquirrelTemp
ApplyReleasesImpl: No release to install, running the app
ApplyReleasesImpl: Squirrel Enabled Apps: [C:\Users\Mattia\AppData\Local\atom\app-0.179.0\atom.exe]
Program: Starting Squirrel Updater: --install .
Program: Starting install, writing to C:\Users\Mattia\AppData\Local\SquirrelTemp
CheckForUpdateImpl: Reading RELEASES file from C:\Users\Mattia\AppData\Local\SquirrelTemp
ApplyReleasesImpl: Found partially applied release folder, killing it: C:\Users\Mattia\AppData\Local\atom\app-0.182.0
ApplyReleasesImpl: Writing files to app directory: C:\Users\Mattia\AppData\Local\atom\app-0.182.0
IEnableLogger: Failed to update local releases file: System.IO.IOException: File exists.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.__Error.WinIOError()
at System.IO.Path.InternalGetTempFileName(Boolean checkHost)
at Squirrel.ReleaseEntry.BuildReleasesFile(String releasePackagesDir)
at Squirrel.UpdateManager.ApplyReleasesImpl.<updateLocalReleasesFile>b__ed()
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Squirrel.UpdateManager.ApplyReleasesImpl.<updateLocalReleasesFile>d__ee.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Squirrel.Utility.<LogIfThrows>d__4b`1.MoveNext()
After this error, I found the full Atom app (0.182.0) inside the folder
AppData\Local\atom\app-0.182.0. I tried running Atom.exe inside this folder,
the app started normally, and I linked the .exe file to the desktop. The
directory \bin does not exist in `AppData\Local\atom`.
| 1 |
Challenge Increment a Number with JavaScript has an issue.
User Agent is: `Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3)
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36`.
Please describe how to reproduce this issue, and include links to screenshots
if possible.
Link to page:
https://www.freecodecamp.com/challenges/increment-a-number-with-javascript#?solution=var%20myVar%20%3D%2087%3B%0A%0A%2F%2F%20Only%20change%20code%20below%20this%20line%0AmyVar%2B%2B%3B%0A%0A

|
Challenge Adjusting the Padding of an Element has an issue.
User Agent is: `Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_3)
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.87 Safari/537.36`.
Please describe how to reproduce this issue, and include links to screenshots
if possible.
My code:
<style>
.injected-text {
margin-bottom: -25px;
text-align: center;
}
.box {
border-style: solid;
border-color: black;
border-width: 5px;
text-align: center;
}
.yellow-box {
background-color: yellow;
padding: 10px;
}
.red-box {
background-color: red;
padding: 20px;
}
.green-box {
background-color: green;
padding: 20px;
}
</style>
<h5 class="injected-text">margin</h5>
<div class="box yellow-box">
<h5 class="box red-box">padding</h5>
<h5 class="box green-box">padding</h5>
</div>
Edit by @raisedadead:
#### Camper is unable to advance between challenges due to slow/poor
connectivity.
| 1 |
I'm trying to calculate the NDCG score for binary relevance:
from sklearn import metrics
# test 1
y_true = [[3]]
y_score = [[5]]
metrics.ndcg_score(y_true, y_score)
And I get the error:
`ValueError: Only ('multilabel-indicator', 'continuous-multioutput', 'multiclass-multioutput') formats are supported. Got binary instead`
The code snippet below, which has more than one item, works. Please suggest
why a single-item array does not work.
from sklearn import metrics
# test 1
y_true = [[6 ,5, 4, 3, 2, 1]]
y_score = [[120, -2, -6, -3, 2, 12]]
metrics.ndcg_score(y_true, y_score)
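A possible workaround (my own sketch, not an official API): pad the single ranked item with a dummy item of zero relevance and a lower score, so the input is no longer detected as `binary` and the real item's rank is unchanged. With one relevant item ranked first, NDCG is trivially 1.0.

```python
from sklearn.metrics import ndcg_score

# Hypothetical padding workaround: the appended (0, -1) dummy item has zero
# relevance and the lowest score, so it cannot affect the result.
y_true = [[3, 0]]
y_score = [[5, -1]]
score = ndcg_score(y_true, y_score)
print(score)  # 1.0: the only relevant item is ranked first
```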
|
See this code example:
>>> t = [[1]]
>>> p = [[0]]
>>> metrics.ndcg_score(t, p)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/cbournhonesque/.pyenv/versions/bento/lib/python3.8/site-packages/sklearn/utils/validation.py", line 63, in inner_f
return f(*args, **kwargs)
File "/Users/cbournhonesque/.pyenv/versions/bento/lib/python3.8/site-packages/sklearn/metrics/_ranking.py", line 1567, in ndcg_score
_check_dcg_target_type(y_true)
File "/Users/cbournhonesque/.pyenv/versions/bento/lib/python3.8/site-packages/sklearn/metrics/_ranking.py", line 1307, in _check_dcg_target_type
raise ValueError(
ValueError: Only ('multilabel-indicator', 'continuous-multioutput', 'multiclass-multioutput') formats are supported. Got binary instead
It works correctly when the number of elements is bigger than 1:
https://stackoverflow.com/questions/64303839/how-to-calculate-ndcg-with-binary-relevances-using-sklearn
| 1 |
I started having trouble with a site I have in development that uses Joomla
3.1.5 with Bootstrap (meet gavern template) under IE10 on Win7 Ultimate. So I
tried opening the getbootstrap site under IE10 and it also has big issues
(attached). The home page loads the background, a couple text fields and
nothing else, and when I go to getbootstrap.com/components I get this error in
the console: SCRIPT87: Invalid argument. holder.js, line 119 character 4.
If I go to getbootstrap.com/2.3.2, I get this error: SCRIPT5002: Function
expected
holder.js, line 316 character 2.
IE10 goes into compatibility mode with IE7 Standards Document Mode when the
page loads. Some pages display blank white - even though all the code has
loaded if I call up the view source window - and then if I resize the window
it goes black.
I have an NVIDIA GeForce Display, not one of the displays with known issues
for IE10. There is another person on the Joomla forum who has this issue so I
know it is not just my computer. It seems to be ie10 and Windows Ultimate.

|
If the PC has the "helvetica" font installed, and the CSS says
font-family: "helvetica";
IE 9 & 10 cannot display anything.
So I think it is important that "helvetica" be removed from the default
"font-family".
Thank you.
| 1 |
After I checkout SciPy and run the tests as-is, I'm seeing >50 test failures.
The steps being followed are:
conda create -n scipy python=3
conda install -y mkl numpy setuptools matplotlib cython pytest
git clone https://github.com/scipy/scipy
cd scipy
python setup.py build_ext --inplace
python runtests.py -v
I have also tried:
apt-get install -yq build-essential ca-certificates gcc gfortran git python3 python3-pip
pip3 install cython matplotlib mkl numpy pytest setuptools
git clone https://github.com/scipy/scipy
cd scipy
python setup.py build_ext --inplace
python runtests.py -v
These steps were followed in the continuumio/anaconda3 Docker image. I have
also had similar luck trying to do this on OSX. For what it's worth, the OSX
binaries from Conda output two failures when I run:
python -c 'import scipy; scipy.test("full")'
What is the right way to build SciPy in a way that passes all tests?
|
### Issue
I installed the latest GitHub version of scipy 1.1.0 as of 24.01.2017, latest
commit `beb258e`, and I get an undefined symbol: `_gfortran_stop_numeric_f08`.
It seems that the error comes from scikit-image, but it is due to a scipy
import problem.
The issue is resolved when release version 1.0.0 is installed.
### Reproducing code example:
from skimage import io, transform
### Error message:
ImportError: /anaconda/envs/py35/lib/python3.5/site-packages/scipy/special/_ufuncs.cpython-35m-x86_64-linux-gnu.so: undefined symbol: _gfortran_stop_numeric_f08
Traceback (most recent call last):
File "travers.py", line 15, in <module>
from vis.visualization import visualize_saliency, overlay
File "/anaconda/envs/py35/lib/python3.5/site-packages/vis/visualization/__init__.py", line 4, in <module>
from .activation_maximization import visualize_activation_with_losses
File "/anaconda/envs/py35/lib/python3.5/site-packages/vis/visualization/activation_maximization.py", line 6, in <module>
from ..losses import ActivationMaximization
File "/anaconda/envs/py35/lib/python3.5/site-packages/vis/losses.py", line 4, in <module>
from .utils import utils
File "/anaconda/envs/py35/lib/python3.5/site-packages/vis/utils/utils.py", line 14, in <module>
from skimage import io, transform
File "/anaconda/envs/py35/lib/python3.5/site-packages/skimage/transform/__init__.py", line 1, in <module>
from .hough_transform import (hough_line, hough_line_peaks,
File "/anaconda/envs/py35/lib/python3.5/site-packages/skimage/transform/hough_transform.py", line 2, in <module>
from scipy import ndimage
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/ndimage/__init__.py", line 161, in <module>
from .filters import *
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/ndimage/filters.py", line 37, in <module>
from scipy.misc import doccer
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/misc/__init__.py", line 67, in <module>
from scipy.interpolate._pade import pade as _pade
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/interpolate/__init__.py", line 175, in <module>
from .interpolate import *
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/interpolate/interpolate.py", line 21, in <module>
import scipy.special as spec
File "/anaconda/envs/py35/lib/python3.5/site-packages/scipy/special/__init__.py", line 639, in <module>
from ._ufuncs import *
ImportError: /anaconda/envs/py35/lib/python3.5/site-packages/scipy/special/_ufuncs.cpython-35m-x86_64-linux-gnu.so: undefined symbol: _gfortran_stop_numeric_f08
### Scipy/Numpy/Python version information:
<<Output from 'import sys, scipy, numpy; print(scipy.__version__, numpy.__version__, sys.version_info)'>>
1.1.0.dev0+Unknown 1.14.0 sys.version_info(major=3, minor=5, micro=2,
releaselevel='final', serial=0)
scikit-image (0.13.1)
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: xxx
* Operating System version: xxx
* Java version: xxx
# Http Request Solutions
We are going to export a Dubbo service for front-end requests (e.g. H5 from
outside). We are unsure about the advantages and disadvantages of the two
solutions below; does anyone have good ideas about this?
Generally, there are two ways to support that.
Solution 1: Add a proxy (through GenericService) for the HTTP request (http ->
proxy -> dubbo)
Solution 2: Support both the dubbo & http protocols on the service side (e.g. dubbox/jsonrpc).
## For Solution1:
### Advantage:
1. It's easy for the user, no need to maintain both http & dubbo protocols.
2. No need to maintain two types of configurations (one for dubbo, another for http). The following two situations may confuse the user: first, the xxx config is supported for both dubbo and http, but under different keys; second, the yyy config is supported for dubbo but not for http. Users may be confused by so many configs.
### Disadvantage:
1. The latency is high in proxy mode.
For the dubbo protocol: 1k request/100k response/3000 QPS, the average response
time is 1-2ms, and it can support 7000-8000 QPS on an 8-core server.
For the proxy mode: 1K request/100K response/600 QPS, the average response time
is 14ms, with only 600 QPS.
2. The proxy's CPU usage is very high.
For the dubbo protocol: at 7000-8000 QPS, the service CPU is about 80-90%; at
600 QPS, the service CPU is perhaps below 5-10%.
For proxy mode: at 600 QPS, the service CPU is low, but the proxy's CPU is
almost full (80-90%). That means it needs 4-5 times as many servers to host the requests.
## For Solution2:
### Advantage:
1. The latency is very low, as it connects directly to the service.
### Disadvantage:
1. Users have to maintain both the dubbo & http protocols, as described in Solution 1.
We are struggling to find a better solution. Am I on the wrong track, or does
anyone have good suggestions or solutions for HTTP requests?
Thanks for any suggestions.
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.6.2
* Operating System version: win10
* Java version: 1.8
### Step to reproduce this issue
1. The code in **ThriftProtocol**:
String serviceKey = serviceKey(channel.getLocalAddress().getPort(), serviceName, null, null);
DubboExporter exporter = (DubboExporter) exporterMap.get(serviceKey);
2. Why are serviceVersion and serviceGroup set to null? I do not understand.
com.alibaba.dubbo.remoting.RemotingException: Not found exported service: com.zbss.common.api.config.service.ConfigService$Iface:20080 in [service-config/com.zbss.common.api.config.service.ConfigService$Iface:20080], may be version or group mismatch , channel: consumer: /192.168.3.63:61251 --> provider: /192.168.3.63:20080, message:RpcInvocation [methodName=getVendorConfig, parameterTypes=[class java.lang.String], arguments=[123], attachments={interface=com.zbss.common.api.config.service.ConfigService$Iface}]
at com.alibaba.dubbo.rpc.protocol.thrift.ThriftProtocol$1.reply(ThriftProtocol.java:67)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.handleRequest(HeaderExchangeHandler.java:96)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.received(HeaderExchangeHandler.java:172)
at com.alibaba.dubbo.remoting.transport.DecodeHandler.received(DecodeHandler.java:51)
at com.alibaba.dubbo.remoting.transport.dispatcher.ChannelEventRunnable.run(ChannelEventRunnable.java:80)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
| 0 |
##### System information (version)
* OpenCV => 3.4.1
* Operating System / Platform => windows 10 x64
* Compiler => mingw-w64 8.1.0 and mingw-w64 7.3.0
##### Detailed description
imwrite crashes.
Tested with mingw-w64 8.1.0 and mingw-w64 7.3.0.
With all the same settings, OpenCV 3.4.0 works fine.
##### Steps to reproduce
The C++ code crashes when an inline matrix operation is passed to imwrite:
#include <opencv2/opencv.hpp>
using namespace cv;
int main(int argc, char *argv[]) {
Mat1b img = Mat1b::zeros(300, 300);
imwrite("img.png", img + 50);
return 0;
}
This way it doesn't crash:
#include <opencv2/opencv.hpp>
using namespace cv;
int main(int argc, char *argv[]) {
Mat1b img = Mat1b::zeros(300, 300);
img = img + 50;
imwrite("img.png", img);
return 0;
}
cmake configuration
General configuration for OpenCV 3.4.1 =====================================
Version control: unknown
Platform:
Timestamp: 2018-06-04T06:00:53Z
Host: Windows 10.0.17134 AMD64
CMake: 3.11.2
CMake generator: MinGW Makefiles
CMake build tool: C:/msys64/mingw64/bin/mingw32-make.exe
Configuration: Release
CPU/HW features:
Baseline: SSE SSE2 SSE3
requested: SSE3
Dispatched code generation:
requested: SSE3
C/C++:
Built as dynamic libs?: NO
C++11: YES
C++ Compiler: C:/msys64/mingw64/bin/g++.exe (ver 7.3.0)
C++ flags (Release): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wuninitialized -Winit-self -Wno-narrowing -Wno-delete-non-virtual-dtor -Wno-comment -Wno-implicit-fallthrough -fdiagnostics-show-option -Wno-long-long -fomit-frame-pointer -ffast-math -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fvisibility-inlines-hidden -fopenmp -O3 -DNDEBUG -DNDEBUG
C++ flags (Debug): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wshadow -Wsign-promo -Wuninitialized -Winit-self -Wno-narrowing -Wno-delete-non-virtual-dtor -Wno-comment -Wno-implicit-fallthrough -fdiagnostics-show-option -Wno-long-long -fomit-frame-pointer -ffast-math -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fvisibility-inlines-hidden -fopenmp -g -O0 -DDEBUG -D_DEBUG
C Compiler: C:/msys64/mingw64/bin/gcc.exe
C flags (Release): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wuninitialized -Winit-self -Wno-narrowing -Wno-comment -Wno-implicit-fallthrough -fdiagnostics-show-option -Wno-long-long -fomit-frame-pointer -ffast-math -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fopenmp -O3 -DNDEBUG -DNDEBUG
C flags (Debug): -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wshadow -Wuninitialized -Winit-self -Wno-narrowing -Wno-comment -Wno-implicit-fallthrough -fdiagnostics-show-option -Wno-long-long -fomit-frame-pointer -ffast-math -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fopenmp -g -O0 -DDEBUG -D_DEBUG
Linker flags (Release): -Wl,--gc-sections
Linker flags (Debug): -Wl,--gc-sections
ccache: NO
Precompiled headers: NO
Extra dependencies: avifil32 avicap32 winmm msvfw32 comctl32 gdi32 ole32 setupapi ws2_32 opengl32 glu32
3rdparty dependencies: libprotobuf libjpeg libwebp libpng libtiff libjasper IlmImf zlib
OpenCV modules:
To be built: calib3d core dnn features2d flann highgui imgcodecs imgproc java_bindings_generator ml objdetect photo python_bindings_generator shape stitching superres ts video videoio world
Disabled: js videostab
Disabled by dependency: -
Unavailable: cudaarithm cudabgsegm cudacodec cudafeatures2d cudafilters cudaimgproc cudalegacy cudaobjdetect cudaoptflow cudastereo cudawarping cudev java python2 python3 viz
Applications: tests perf_tests apps
Documentation: NO
Non-free algorithms: NO
Windows RT support: NO
GUI:
Win32 UI: YES
OpenGL support: YES (opengl32 glu32)
VTK support: NO
Media I/O:
ZLib: build (ver 1.2.11)
JPEG: build (ver 90)
WEBP: build (ver encoder: 0x020e)
PNG: build (ver 1.6.34)
TIFF: build (ver 42 - 4.0.9)
JPEG 2000: build (ver 1.900.1)
OpenEXR: build (ver 1.7.1)
Video I/O:
Video for Windows: YES
DC1394: NO
FFMPEG: YES (prebuilt binaries)
avcodec: YES (ver 57.107.100)
avformat: YES (ver 57.83.100)
avutil: YES (ver 55.78.100)
swscale: YES (ver 4.8.100)
avresample: YES (ver 3.7.0)
GStreamer: NO
DirectShow: YES
Parallel framework: OpenMP
Trace: YES (built-in)
Other third-party libraries:
Lapack: NO
Eigen: NO
Custom HAL: NO
Protobuf: build (3.5.1)
NVIDIA CUDA: NO
OpenCL: YES (no extra features)
Include path: D:/software/opencv-3.4.1/3rdparty/include/opencl/1.2
Link libraries: Dynamic load
Python (for build): NO
Java:
ant: NO
JNI: NO
Java wrappers: NO
Java tests: NO
Matlab: NO
Install to: D:/software/cv/install
-----------------------------------------------------------------
|
##### System information (version)
* OpenCV => 3.4.1
* Operating System / Platform => Ubuntu 18.04 64 Bit
* Compiler => gcc 7.3.0
##### Detailed description
imwrite produces a segfault if the provided input array is an expression
involving an operation on a mat, such as 2 * x or x + y. It occurs for me with
both jpg and png output, and with 8U and 32F types. The problem does not occur
on OpenCV 3.0.0, other versions untested.
##### Steps to reproduce
cv::Mat1b m = cv::Mat1b::zeros(10, 10);
cv::imwrite("works.png", cv::Mat(m * 2));
cv::imwrite("fails.png", m * 2);
| 1 |
Calling scipy.stats.hypergeom.pmf with a total of 0 Type I objects (and
therefore a maximal k value of 0) returns NaN instead of 1.0.
From the documentation :
pmf(k, M, n, N) = choose(n, k) * choose(M - n, N - k) / choose(M, N),
Fixing k = n = 0, we have :
pmf(0, M, 0, N) = choose(0, 0) * choose(M - 0, N - 0) / choose(M, N),
= 1 * choose(M, N) / choose(M, N)
= 1 * 1
= 1
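The derivation above can be checked numerically with a small sketch that computes the pmf directly from binomial coefficients (the values of M and N below are arbitrary examples, not from the report):

```python
from math import comb

def hypergeom_pmf(k, M, n, N):
    # pmf(k, M, n, N) = choose(n, k) * choose(M - n, N - k) / choose(M, N)
    return comb(n, k) * comb(M - n, N - k) / comb(M, N)

# With k = n = 0 the pmf collapses to choose(M, N) / choose(M, N) = 1
# for any valid M and N -- the value scipy.stats.hypergeom.pmf should return.
print(hypergeom_pmf(0, 10, 0, 5))  # 1.0
```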
|
_Original ticket http://projects.scipy.org/scipy/ticket/57 on 2006-04-02 by
@rkern, assigned to @rkern._
The function `tmean` in file source:trunk/Lib/stats/stats.py needs review.
Please look over the StatisticsReview guidelines and add your comments below.
| 0 |
The weekly build with nightly wheels from numpy and pandas
has failed. Check the logs for any updates that need to be
made in matplotlib.
https://github.com/matplotlib/matplotlib/actions/runs/2722394505
|
The weekly build with nightly wheels from numpy and pandas
has failed. Check the logs for any updates that need to be
made in matplotlib.
https://github.com/matplotlib/matplotlib/actions/runs/2600273155
| 1 |
Screenie: http://puu.sh/l7m9w/0f05381eb0.png
I finished the lesson, but I'm unable to progress. This happened with every
lesson in the jQuery section, but all the others finally worked after a dozen
clicks or thereabouts. The script runs correctly, and the hinge animation takes
place (I wasn't able to puush fast enough to catch it in action). I'm going to
keep clicking in the hopes that it eventually works, but it is very
discouraging.
Challenge Waypoint: Use jQuery to Modify the Entire Page has an issue.
User Agent is: `Mozilla/5.0 (Windows NT 10.0; WOW64; rv:41.0) Gecko/20100101
Firefox/41.0`.
Please describe how to reproduce this issue, and include links to screenshots
if possible.
My code:
<script>
$(document).ready(function() {
$("#target1").css("color", "red");
$("#target1").prop("disabled", true);
$("#target4").remove();
$("#target2").appendTo("#right-well");
$("#target5").clone().appendTo("#left-well");
$("#target1").parent().css("background-color", "red");
$("#right-well").children().css("color", "green");
$("#left-well").children().css("color", "green");
$(".target:nth-child(2)").addClass("animated bounce");
$(".target:even").addClass("animated shake");
$("body").addClass("animated hinge");
});
</script>
<!-- Only change code above this line. -->
<div class="container-fluid">
<h3 class="text-primary text-center">jQuery Playground</h3>
<div class="row">
<div class="col-xs-6">
<h4>#left-well</h4>
<div class="well" id="left-well">
<button class="btn btn-default target" id="target1">#target1</button>
<button class="btn btn-default target" id="target2">#target2</button>
<button class="btn btn-default target" id="target3">#target3</button>
</div>
</div>
<div class="col-xs-6">
<h4>#right-well</h4>
<div class="well" id="right-well">
<button class="btn btn-default target" id="target4">#target4</button>
<button class="btn btn-default target" id="target5">#target5</button>
<button class="btn btn-default target" id="target6">#target6</button>
</div>
</div>
</div>
</div>
|
Challenge http://www.freecodecamp.com/challenges/waypoint-change-the-css-of-
an-element-using-jquery has an issue. Please describe how to reproduce it, and
include links to screenshots if possible.
Code fails validation ("Your target1 element should have red text." continues
to show a red X even though the condition has been met). Interestingly, if
"Run Code" is pushed several (sometimes >5) times, the code does eventually
pass even though nothing changed.
Screenshot _after_ pressing "Run Code":

| 1 |
When I close an unsaved file, I get a popup saying "foo has changes, do you
want to save them?". The popup has three buttons "Save", "Cancel" and "Don't
Save".
The standard Windows dialog for this allows me to hit n to select Don't Save.
WIBNI Atom would allow me to do the same, at least on Windows?
|
(this is possibly windows only, tested with 0.113.0)
**Steps To Reproduce**
* Open Atom
* Type something into a new editor or change an existing file (doesn't matter which)
* Attempt to close the tab (for example `Ctrl+W`)
* Correctly, a dialogue opens, that will give you a choice how to proceed

**current (faulty) behaviour**
The Keyboard is pretty much useless for this dialogue. You can only Hit
`[Enter]` to save.
**Desired Behaviour**
These interactions all fail, but would be very much needed; all of these are
standard behaviour for any application:
* `[left]` and `[right]` should switch focus between buttons (so we can select a different option with the keyboard only)
* `[esc]` should Cancel
* `[alt]` should bring up shortcut labels (much like it does for the application menu, i.e. _F_ile, which means press `[Alt+F]`)
* `[tab]` should cycle between the buttons, `[shift-tab]` should cycle backwards.
| 1 |
I didn't know this, but it seems that PHP replaces dots with underscores in
parameter names in all global variables ($_GET, $_POST, ...). This PHP bug
is described on Stack Overflow and on php.net.
Since the Request class uses those globals in `createFromGlobals` method, all
values in Request::$query, Request::$attributes, ... are "wrong".
Is this a known and accepted behavior ? There is a Gist with a proposed
solution. Do you think it is worth a PR or do I have to wait for @fabpot's one
on RequestFactory (#8957) ?
|
Symfony/Component/BrowserKit/Client->submit uses $form->getPhpValues(), which
uses parse_str(), which automagically converts `.` to `_`; this breaks
compatibility with forms intended for non-PHP services.
**Background**
I am trying to write a functional test for connecting to an OpenID2
implementation in ASP.NET; however, this requires use of a form (auto-submitted
by JavaScript) with hidden field names like openid.mode.
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.5-SNAPSHOT
* Operating System version: win10
* Java version: 1.8
### Steps to reproduce this issue
A compilation error occurred when cloning the latest code locally, as
follows: cannot resolve com.alibaba.spring:spring-context-support:1.0.4-SNAPSHOT.
Pls. provide [GitHub address] to reproduce this issue.
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.0
* Operating System version: mac
* Java version: 1.8
`org.apache.dubbo.config.AbstractConfig`:
static {
// this is only for compatibility
Runtime.getRuntime().addShutdownHook(DubboShutdownHook.getDubboShutdownHook());
}
The JVM hook is registered by default. When the user uses Spring, the default
hook is not removed. This is the main cause of the graceful-shutdown problem.
Please note:
Spring does not register hooks by default, but users may register their own, and
Spring Boot also registers a hook by default.
`org.apache.dubbo.config.spring.extension.SpringExtensionFactory.ShutdownHookListener`:
private static class ShutdownHookListener implements ApplicationListener {
@Override
public void onApplicationEvent(ApplicationEvent event) {
if (event instanceof ContextClosedEvent) {
// we call it anyway since dubbo shutdown hook make sure its destroyAll() is re-entrant.
// pls. note we should not remove dubbo shutdown hook when spring framework is present, this is because
// its shutdown hook may not be installed.
DubboShutdownHook shutdownHook = DubboShutdownHook.getDubboShutdownHook();
shutdownHook.destroyAll();
}
}
}
| 0 |
I need to show the date in the format 'dd/mmm/yyyy', thanks.
`<Field style={style.field} title="Birthday" name="dob" disabled={sendIn}
component={TextField} validate={Validations.required} type="date"
InputLabelProps={{ shrink: true, }} />`

|
`TextField` with a `label` set to something and `type` set to date but with no
initial value displays the label and `mm/dd/yyyy` on top of each other in the
`TextField`. Once you enter the field, the label moves up as superscript and
the mm/dd/yyyy remains and is readable.
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
When the date picker has no value, if it is going to display the expected date
format in the field, it should treat that as a value and move the label up to
superscript. Alternatively, don't display the date format unless the user
enters the field and there is no value.
## Current Behavior
Currently the label and date format overwrite each other if there is no value
for the TextField.
## Steps to Reproduce (for bugs)
Example on codesandbox.io:
https://codesandbox.io/s/6ljy4wq04k
## Your Environment
Tech | Version
---|---
Material-UI | 1.0.0-beta.9
React | 15.6.1
browser | Chrome 60.03
| 1 |
Rust version:
$ /opt/rust-0.6/bin/rustc -v
/opt/rust-0.6/bin/rustc 0.6
host: x86_64-unknown-linux-gnu
$
Code to reproduce:
struct Test;
pub impl Test {
pub fn new() -> Test {
return Test;
}
}
fn main() {
let test = Test.new();
}
Stack trace:
$ RUST_LOG=rustc=1,::rt::backtrace /opt/rust-0.6/bin/rustc reproduce.ts
error: internal compiler error: calling transform_self_type_for_method on static method
rust: task failed at 'explicit failure', /tmp/rust-0.6/src/libsyntax/diagnostic.rs:99
/opt/rust-0.6/bin/../lib/librustrt.so(_ZN9rust_task13begin_failureEPKcS1_m+0x4b)[0x7f12b4824d1b]
/opt/rust-0.6/bin/../lib/librustrt.so(+0x2c7b9)[0x7f12b48357b9]
/opt/rust-0.6/bin/../lib/librustrt.so(upcall_fail+0x1a0)[0x7f12b4826b00]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x10a4db)[0x7f12b7b404db]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x10a482)[0x7f12b7b40482]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(_ZN3sys12begin_unwind17_61fe198059b9e3fc3_06E+0x71)[0x7f12b7a88551]
/opt/rust-0.6/bin/../lib/libsyntax-84efebcb12c867a2-0.6.so(_ZN10diagnostic14__extensions__9meth_84585fatal15_c79235bb6437b73_06E+0x196)[0x7f12b6fef986]
/opt/rust-0.6/bin/../lib/libsyntax-84efebcb12c867a2-0.6.so(_ZN10diagnostic14__extensions__9meth_84813bug15_c79235bb6437b73_06E+0x6f)[0x7f12b6ff011f]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6driver7session14__extensions__10meth_188433bug17_b5f71376f9f489aa3_06E+0x80)[0x7f12b6350880]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method30transform_self_type_for_method17_51f0b1aaf3e498fc3_06E+0x192)[0x7f12b66dbc52]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method14__extensions__10meth_4855036create_rcvr_ty_and_substs_for_method17_971e65ec50f97c323_06E+0x2a8)[0x7f12b66d99c8]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method14__extensions__10meth_4843725push_candidates_from_impl17_9ac8518896a5ac6c3_06E+0x2d1)[0x7f12b66d7fa1]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x4a120c)[0x7f12b66db20c]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x4a105e)[0x7f12b66db05e]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method14__extensions__10meth_4842338push_inherent_impl_candidates_for_type16_eeee7bdb4b835d03_06E+0x9e)[0x7f12b66d4d2e]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method14__extensions__10meth_4838924push_inherent_candidates16_59939f4aa551e6a3_06E+0x1d3)[0x7f12b66ce403]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method14__extensions__10meth_483439do_lookup17_9ecadfb4b1ffaff13_06E+0x610)[0x7f12b66cc660]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check6method6lookup15_35be5e67e7e9ab3_06E+0x1df)[0x7f12b66ca83f]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check23check_expr_with_unifier17_841417e8b97d6e713_06E+0x2a35)[0x7f12b6726755]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check28check_expr_coercable_to_type17_5acee3342c4cd5a23_06E+0x91)[0x7f12b673baa1]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check22check_decl_initializer17_1547d91555c977933_06E+0xb2)[0x7f12b674eb22]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check16check_decl_local16_ac37e3abcfdd7ff3_06E+0x5ea)[0x7f12b674f20a]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x51709a)[0x7f12b675109a]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check10check_stmt17_e63c562496e2808c3_06E+0x26e)[0x7f12b6750b0e]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x517271)[0x7f12b6751271]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check25check_block_with_expected17_b352eb1d2e7639c03_06E+0x21e)[0x7f12b66ec55e]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check8check_fn17_20cad4a150c275a73_06E+0x1089)[0x7f12b66eada9]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check13check_bare_fn14_4fc3abe889a533_06E+0x17e)[0x7f12b66e9a5e]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check10check_item16_f8c9b58265752963_06E+0x43d)[0x7f12b66e889d]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x4ae2b1)[0x7f12b66e82b1]
/opt/rust-0.6/bin/../lib/libsyntax-84efebcb12c867a2-0.6.so(+0xe5214)[0x7f12b706b214]
/opt/rust-0.6/bin/../lib/libsyntax-84efebcb12c867a2-0.6.so(+0xe4dba)[0x7f12b706adba]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck5check16check_item_types17_b41350aeb9ad50f23_06E+0x430)[0x7f12b66e8080]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x5e4d55)[0x7f12b681ed55]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6middle6typeck11check_crate17_63c941d6da5d83433_06E+0x2d0)[0x7f12b681d970]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6driver6driver12compile_rest17_b2b258b84b35f3533_06E+0x1218)[0x7f12b6ab78e8]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6driver6driver12compile_upto17_12335154b455986e3_06E+0x108)[0x7f12b6abcbf8]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN6driver6driver13compile_input15_bb29edf7a232863_06E+0xca)[0x7f12b6abd03a]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN12run_compiler17_64d52739a36d169c3_06E+0x20aa)[0x7f12b6aec44a]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8bea81)[0x7f12b6af8a81]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8bb2ac)[0x7f12b6af52ac]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0xd2f8e)[0x7f12b7b08f8e]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x152bb4)[0x7f12b7b88bb4]
/opt/rust-0.6/bin/../lib/librustrt.so(_Z18task_start_wrapperP10spawn_args+0x24)[0x7f12b4825564]
rust: task failed at 'explicit failure', /tmp/rust-0.6/src/librustc/rustc.rc:357
/opt/rust-0.6/bin/../lib/librustrt.so(_ZN9rust_task13begin_failureEPKcS1_m+0x4b)[0x7f12b4824d1b]
/opt/rust-0.6/bin/../lib/librustrt.so(+0x2c7b9)[0x7f12b48357b9]
/opt/rust-0.6/bin/../lib/librustrt.so(upcall_fail+0x1a0)[0x7f12b4826b00]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x10a4db)[0x7f12b7b404db]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x10a482)[0x7f12b7b40482]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(_ZN3sys12begin_unwind17_61fe198059b9e3fc3_06E+0x71)[0x7f12b7a88551]
/opt/rust-0.6/bin/../lib/libcore-c3ca5d77d81b46c1-0.6.so(+0x152bb4)[0x7f12b7b88bb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN7monitor17_78935df9ff9e1afa3_06E+0x1577)[0x7f12b6aeea77]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(+0x8c0fb4)[0x7f12b6afafb4]
/opt/rust-0.6/bin/../lib/librustc-c84825241471686d-0.6.so(_ZN4main15_c4de63b748e03d3_06E+0x7e)[0x7f12b6afabce]
/opt/rust-0.6/bin/../lib/librustrt.so(_Z18task_start_wrapperP10spawn_args+0x24)[0x7f12b4825564]
rust: domain main @0xe073e0 root task failed
$
|
This should not compile (AFAIK, but static methods aren't documented), but it
shouldn't ICE either.
## output
error: internal compiler error: calling transform_self_type_for_method on static method
rust: task failed at 'explicit failure', /Users/burg/repos/rust/src/libsyntax/diagnostic.rs:78
rust: task failed at 'explicit failure', /Users/burg/repos/rust/src/rustc/driver/rustc.rs:275
rust: domain main @0x7f8c0a800010 root task failed
## testcase
struct Obj {
member: uint
}
impl Obj {
static pure fn boom() -> bool {
return 1+1 == 2
}
pure fn chirp() {
self.boom();
}
}
fn main() {
let o = Obj { member: 0 };
o.chirp();
1 + 1;
}
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
I uploaded my Next.js/Express app on AWS and it's failing at runtime for an
unknown reason.
I have the following AWS logs:
START RequestId: 27f88393-2541-11e8-af21-a362290178d9 Version: $LATEST
2018-03-11 16:30:42.490 (+01:00) 27f88393-2541-11e8-af21-a362290178d9 req from http://staging.loan-advisor.studylink.fr/
END RequestId: 27f88393-2541-11e8-af21-a362290178d9
REPORT RequestId: 27f88393-2541-11e8-af21-a362290178d9 Duration: 600.55 ms Billed Duration: 700 ms Memory Size: 256 MB Max Memory Used: 40 MB
START RequestId: 293c9c47-2541-11e8-9e47-bb956db55617 Version: $LATEST
2018-03-11 16:30:43.008 (+01:00) 293c9c47-2541-11e8-9e47-bb956db55617 req from http://staging.loan-advisor.studylink.fr/_next/fa74e99a-3209-47d0-b35e-096852884958/page/_error.js
END RequestId: 293c9c47-2541-11e8-9e47-bb956db55617
REPORT RequestId: 293c9c47-2541-11e8-9e47-bb956db55617 Duration: 123.53 ms Billed Duration: 200 ms Memory Size: 256 MB Max Memory Used: 40 MB
What I could deduce from those logs is that a request is handled at
http://staging.loan-advisor.studylink.fr/ by Express and then a new request
from http://staging.loan-
advisor.studylink.fr/_next/fa74e99a-3209-47d0-b35e-096852884958/page/_error.js
is handled.
Next loads correctly IMHO, since it's able to load the http://staging.loan-
advisor.studylink.fr/_next/fa74e99a-3209-47d0-b35e-096852884958/page/_error.js
script, which, I guess, displays a 404?
So, something is wrong when Next.js app tries to display something initially,
but I have nothing to help me debug the issue 😕
Locally, everything works correctly, but I'm concerned about the lack of logs
in the AWS environment, which makes it super hard to debug anything.
I did force `quiet: false` when instantiating my Next.js app but it doesn't
have any effect.
* * *
Edit: I made a simple debug page at https://staging.loan-
advisor.studylink.fr/debug which works fine and gets rendered correctly, so my
first assumption is right, an exception happens in the index page which leads
to throwing a 404 somehow. (instead of a 500 :/)
|
# Feature request
## Is your feature request related to a problem? Please describe.
I see a lot of people using a separate component just for adding an active
class when a route is active. My suggestion is to add function support for
className in the Link component.
To give an example:
import Link from "./link";
// Example using classnames package.
import cx from "classnames";
<Link className={active => cx({ "text-gray-200": active })} href="/">
<a>Home</a>
</Link>;
// Example using pure javascript.
<Link className={active => active && "text-gray-200"} href="/">
<a>Home</a>
</Link>;
## Describe the solution you'd like
My solution is to allow `className` to be both a string and a function; when
the component identifies the `className` as a function, it calls it with a
boolean first argument: true if the route is active, false otherwise.
## Describe alternatives you've considered
An alternative solution is to implement the component yourself, as I have done
so far.
Another way would be creating a component with an `activeClassName` attribute,
but this prevents support for class IntelliSense in editors such as VSCode.
The reason I suggest this is that, as the latest releases show, Next.js is
all about zero-config and avoiding complexity. So why not implement such a
widely used feature in the existing Link component?
## Additional context
My current solution to this request is using a custom component.
Made a gist to my component and a small example here:
https://gist.github.com/ch99q/b417990540d2ef91870e1af99ac6a516
| 0 |
The Git history for this repository is invalid. You'll run into this if you
enable verification of the history in your configuration (which should really
be the default):
[transfer]
fsckobjects = true
[fetch]
fsckobjects = true
[receive]
fsckobjects = true
% git clone https://github.com/kennethreitz/requests.git
Cloning into 'requests'...
remote: Counting objects: 16580, done.
error: object 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8: badTimezone: invalid author/committer line - bad time zone
fatal: Error in object
If you look at the commit with verification disabled, you can see the problem:
commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8
Author: Shrikant Sharat Kandula <shrikantsharat.k@gmail.com>
Date: Thu Sep 8 02:38:50 2011 +51800
Typo in documentation
The kwarg is named `headers`, not `header`. Also, its a dict, not a set.
I don't have a suggestion on how to approach this. It's quite unfortunate. The
history would probably have to be rewritten, and that would be awful. I do
think it's worth noting though... verification may become the default on
various distributions or even upstream.
|
Hi Kenneth,
I'd like to see this feature: allow developers to specify more hooks per
category, and let them be applied in pipe.
I'll be very glad to produce a patch.
| 0 |
##### System information (version)
* OpenCV => Any version:
* Operating System / Platform => Windows 64 Bit and Windows 32 Bit
* Compiler => Visual Studio 2015
##### Detailed description
dist.h compiled without NEON or GNUC will cause an out-of-bounds memory error,
because size * sizeof(pop_t) is larger than the original size.
opencv/modules/flann/include/opencv2/flann/dist.h
Line 468 in b39cd06
| reinterpret_cast<const unsigned char*> (b), size * sizeof(pop_t));
---|---
##### Steps to reproduce
|
##### System information (version)
* OpenCV => : 4.1.2:
* Operating System / Platform => :Windows 64 Bit:
* Compiler => :Visual Studio 2015:
##### Detailed description
##### Steps to reproduce
I converted a Keras model to TensorFlow, but I get this error. Has anyone
encountered this type of error before?
| 0 |
* I tried using the `@types/lodash` package and had problems.
* I tried using the latest stable version of tsc. https://www.npmjs.com/package/typescript
* I have a question that is inappropriate for StackOverflow.
* Authors: @bczengel, @chrootsu, @stepancar
### Ref
lodash/lodash#3475
## Error Details
### Code
import * as _ from "lodash";
const res = _.chain([1, 2, 3]).take(1);
### Error
sh-3.2# tsc
index.ts(3,13): error TS2684: The 'this' context of type 'LoDashExplicitWrapper<number[]>' is not assignable to method's 'this' of type'LoDashExplicitWrapper<ArrayLike<number> | null | undefined>'. Types of property 'push' are incompatible.
Type '<T>(this: _.LoDashExplicitWrapper<ArrayLike<T> | null | undefined>, ...items: T[]) => _.LoDashExp...' is not assignable to typ
e '<T>(this: _.LoDashExplicitWrapper<ArrayLike<T> | null | undefined>, ...items: T[]) => _.LoDashExp...'. Two different types with this
name exist, but they are unrelated.
Type 'LoDashExplicitWrapper<number[]>' is not assignable to type 'LoDashExplicitWrapper<ArrayLike<number> | null | undefined>'.
sh-3.2#
### Environment
{ ts: '1.0.0',
npm: '5.5.1',
ares: '1.10.1-DEV',
cldr: '31.0.1',
http_parser: '2.7.0',
icu: '59.1',
modules: '57',
nghttp2: '1.25.0',
node: '8.9.0',
openssl: '1.0.2l',
tz: '2017b',
unicode: '9.0',
uv: '1.15.0',
v8: '6.1.534.46',
zlib: '1.2.11' }
### Typescript
version : 2.6.1
target: "es6"
module: "commonjs"
types/lodash: 4.14.82
|
_.chain from `@types/lodash` is broken on Typescript 2.6
I have little knowledge of lodash internals, but the project that I'm working
on uses it a lot. Using the new `@ts-ignore` silences the error, but I wanted
to register it here.
### Reproducing:
install typescript ~2.6.0
`tsc -init`
`npm i lodash @types/lodash --save`
index.ts:
import * as _ from 'lodash'
var a = _.chain([1,2,3]).map((x) => { return x + 1 })
console.log(a.value())
run `tsc`
### Result:
index.ts(3,9): error TS2684: The 'this' context of type 'LoDashExplicitWrapper<number[]>' is not assignable to method's 'this' of type 'LoDashExplicitWrapper<ArrayLike<number> | Dictionary<number> | NumericDictionary<number> | null |...'.
Types of property 'push' are incompatible.
Type '<T>(this: _.LoDashExplicitWrapper<ArrayLike<T> | null | undefined>, ...items: T[]) => _.LoDashExp...' is not assignable to type '<T>(this: _.LoDashExplicitWrapper<ArrayLike<T> | null | undefined>, ...items: T[]) => _.LoDashExp...'. Two different types with this name exist, but they are unrelated.
Type 'LoDashExplicitWrapper<number[]>' is not assignable to type 'LoDashExplicitWrapper<ArrayLike<number> | Dictionary<number> | NumericDictionary<number> | null |...'.
index.ts(3,31): error TS7006: Parameter 'x' implicitly has an 'any' type.
### Dependencies:
"@types/lodash": "^4.14.80",
"lodash": "^4.17.4"
@aj-r, @thorn0, @andy-ms any ideas?
| 1 |
**Elasticsearch version** :
5.0
**Plugins installed** : x-pack
**JVM version** :
1.8
**OS version** :
ubuntu 14.05
**Description of the problem including expected versus actual behavior** :
Running `service elasticsearch start` fails after about 10 seconds.
There are no logs in /var/log/elasticsearch.
So I tested /etc/init.d/elasticsearch directly; it seems to time out and exit
immediately.
I have no idea what else to try. What can I do?
Thanks for your support.
|
**Elasticsearch version** : 5.0.0
**Plugins installed** : none
**JVM version** :
openjdk version "1.8.0_91"
OpenJDK Runtime Environment (build 1.8.0_91-8u91-b14-0ubuntu4~14.04-b14)
OpenJDK 64-Bit Server VM (build 25.91-b14, mixed mode)
**OS version** : Ubuntu 14.04.5 LTS
**Description of the problem including expected versus actual behavior** :
Elasticsearch fails to start and doesn't give any logs.
**Steps to reproduce** : I'm using Ansible script to create my infrastructure.
I had the same issue when I tried to upgrade my ES from 2.3 to ES 5.0. To
determine the root cause, I've tried it in a new server with no installations.
- name: Install apt-transport-https
  apt: name=apt-transport-https state=present
- name: Add Java (OpenJDK) Repository
  apt_repository: repo='ppa:openjdk-r/ppa' state=present
- name: Install Java (OpenJDK)
  apt: pkg=openjdk-8-jdk state=installed update_cache=yes
- name: Add Elasticsearch Key
  apt_key:
    url: 'https://artifacts.elastic.co/GPG-KEY-elasticsearch'
- name: Add Elasticsearch Repository
  apt_repository: repo='deb https://artifacts.elastic.co/packages/5.x/apt stable main' state=present
- name: Install Elasticsearch
  apt: pkg=elasticsearch state=installed update_cache=yes
- name: Init Elasticsearch
  command: update-rc.d elasticsearch defaults 95 10
- name: Enable Elasticsearch (Start on boot)
  service: name=elasticsearch enabled=yes
**Provide logs (if relevant)** : I don't see any logs, link to the YML file.
https://gist.github.com/ronakjain90/08ae1c225f806af81e383aba24804a60
**Describe the feature** : Since I'm just trying to install ES with no extra
plugins, it seems weird that it simply fails to start while providing no clue
as to why it is crashing, since there are no logs that I can see. ES 5.0 works
fine on my MacBook, though.
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.4.1
* Operating System version: win10
* Java version: 1.8
### Steps to reproduce this issue
1. The zookeeper client connects to the service asynchronously; when the connection takes a long time, using Stat to listen on a node throws a "not connected" exception.
My attempted fix: https://blog.csdn.net/fate_destiny/article/details/103066932
### Actual Result
Exception in thread "main" java.lang.IllegalStateException: zookeeper not connected
at org.apache.dubbo.remoting.zookeeper.curator.CuratorZookeeperClient.<init>(CuratorZookeeperClient.java:83)
at org.apache.dubbo.remoting.zookeeper.curator.CuratorZookeeperTransporter.createZookeeperClient(CuratorZookeeperTransporter.java:26)
at org.apache.dubbo.remoting.zookeeper.support.AbstractZookeeperTransporter.connect(AbstractZookeeperTransporter.java:68)
at org.apache.dubbo.remoting.zookeeper.ZookeeperTransporter$Adaptive.connect(ZookeeperTransporter$Adaptive.java)
at org.apache.dubbo.configcenter.support.zookeeper.ZookeeperDynamicConfiguration.<init>(ZookeeperDynamicConfiguration.java:62)
at org.apache.dubbo.configcenter.support.zookeeper.ZookeeperDynamicConfigurationFactory.createDynamicConfiguration(ZookeeperDynamicConfigurationFactory.java:37)
at org.apache.dubbo.configcenter.AbstractDynamicConfigurationFactory.getDynamicConfiguration(AbstractDynamicConfigurationFactory.java:33)
at org.apache.dubbo.config.AbstractInterfaceConfig.getDynamicConfiguration(AbstractInterfaceConfig.java:315)
at org.apache.dubbo.config.AbstractInterfaceConfig.prepareEnvironment(AbstractInterfaceConfig.java:290)
at org.apache.dubbo.config.AbstractInterfaceConfig.startConfigCenter(AbstractInterfaceConfig.java:280)
at org.apache.dubbo.config.AbstractInterfaceConfig.lambda$null$7(AbstractInterfaceConfig.java:636)
at java.util.Optional.orElseGet(Optional.java:267)
at org.apache.dubbo.config.AbstractInterfaceConfig.lambda$useRegistryForConfigIfNecessary$8(AbstractInterfaceConfig.java:620)
at java.util.Optional.ifPresent(Optional.java:159)
at org.apache.dubbo.config.AbstractInterfaceConfig.useRegistryForConfigIfNecessary(AbstractInterfaceConfig.java:618)
at org.apache.dubbo.config.AbstractInterfaceConfig.checkRegistry(AbstractInterfaceConfig.java:208)
at org.apache.dubbo.config.ServiceConfig.checkAndUpdateSubConfigs(ServiceConfig.java:303)
at org.apache.dubbo.config.ServiceConfig.export(ServiceConfig.java:370)
at org.apache.dubbo.config.spring.ServiceBean.export(ServiceBean.java:336)
at org.apache.dubbo.config.spring.ServiceBean.onApplicationEvent(ServiceBean.java:114)
at org.apache.dubbo.config.spring.ServiceBean.onApplicationEvent(ServiceBean.java:60)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:393)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:347)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:883)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:546)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:139)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:93)
at Provider.main(Provider.java:5)
Caused by: java.lang.IllegalStateException: zookeeper not connected
at org.apache.dubbo.remoting.zookeeper.curator.CuratorZookeeperClient.<init>(CuratorZookeeperClient.java:80)
... 30 more
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.5
* Operating System version: MAC OS X
* Java version: Hotspot 1.8
### Steps to reproduce this issue
1. Define the method:
void add(Map<Integer, Student> map)
2. Configure:
<dubbo:protocol name="dubbo" port="20880" serialization="fastjson" />
3. Call the method `add`.
### Expected Result
The provider deserializes Map<Integer, Student> correctly; the map's values are of type Student.
### Actual Result
The map's values are of type JSONObject.
This is because in DecodeableRpcInvocation#decode(Channel channel, InputStream input),
`pts = methodDescriptor.getParameterClasses()` yields raw Class objects rather
than ParameterizedType, so in this case fastjson cannot tell that the map's
values should be parsed as Student.
| 0 |
* Electron version: 0.30+
* Operating system: Windows, Mac & Linux
There are plenty of small and large desktop applications made with Electron,
and while the core application logic may be under a few megabytes, the
installer (bundling its own copy of Electron) adds over 60 MB of weight. Once
I have a couple of Electron-based apps installed, I have multiple Electron
binaries installed, each dedicated to a single app.
Now, I'm aware that each such app **may be using a different version of
Electron itself (relying on features & bugs of a particular version to get the
job done)**, but that's the point here: make sure the runtime is served by the
platform and apps carry just their own dependencies (npm modules unrelated to
Electron), thus making the resulting installers far smaller. Look at the JVM
or Adobe AIR, for example.
Obviously there can be special scenarios where Electron is patched for a
particular app (Brave browser for instance, has process sandboxing enabled, a
feature not natively supported in Electron itself), in such case installer can
carry runtime along, but trivial applications don't need it.
Abricotine is one such example: it is a plain Markdown editor (and its source
is a measly **2 MB** ), but the 64-bit Windows installer is **75 MB!**
This problem of redundant installations is only going to add bloat as adoption
of technologies like Electron & NW.js increases to create Desktop applications
(being dead simple to get apps running instead of going a native route).
Thanks.
|
Currently developers using atom-shell have to ship the whole atom-shell
binaries when distributing their apps, but since now we have asar as atom-
shell's app format, we may add runtime mode to atom-shell like Adobe Air or
Chrome Hosted Apps that developers only need to distribute packaged app in
`asar` format as long as end users have the runtime mode of atom-shell
installed.
The runtime mode can just be another app based on atom-shell (let's call it
`atom-runtime` for now), and has deep platform integrations:
* On Windows, we can provide installer and auto updater for atom-runtime, and register the `.asar` file type to be opened with it.
* On Linux, we provide official software sources of atom-runtime for some popular distributions, and also register atom-runtime as handler of `.asar` files.
* On Mac, we need to provide a tool to help users package their apps into Mac bundles. We can reference how Steam creates bundles for downloaded games, or how Parallels creates bundles for applications in the hosted OS.
We can even provide a tool or service to help generating installers for
developers' apps which can automatically download atom-runtime if it is not
installed on user's machine, just like many .NET applications.
| 1 |
Atom crashes every time I try to open it, both from the command line and from
the dock. It crashes immediately after opening; I can't even see the window. I
have used Atom on this computer for a few months without any problem.
When opening Atom from the terminal, it prints the following message every time:
LSOpenURLsWithRole() failed for the application /Applications/Atom.app with error -10810.
And OS X opens "Problem Report for Atom" window.
It contains following information
Process: Atom [27076]
Path: /Applications/Atom.app/Contents/MacOS/Atom
Identifier: com.github.atom
Version: ???
Code Type: X86-64 (Native)
Parent Process: launchd [353]
Responsible: Atom [27076]
User ID: 501
Date/Time: 2015-03-27 23:58:27.061 +0200
OS Version: Mac OS X 10.9.5 (13F1066)
Report Version: 11
...
Crashed Thread: 0
Exception Type: EXC_BREAKPOINT (SIGTRAP)
Exception Codes: 0x0000000000000002, 0x0000000000000000
Application Specific Information:
dyld: launch, loading dependent libraries
Dyld Error Message:
Library not loaded: @rpath/libchromiumcontent.dylib
Referenced from: /Applications/Atom.app/Contents/Frameworks/Atom Framework.framework/Versions/A/Atom Framework
Reason: image not found
Binary Images:
0x7fff6e412000 - 0x7fff6e445817 dyld (239.4) <7AD43B9B-5CEA-3C7E-9836-A06909F9CA56> /usr/lib/dyld
0x7fff855f1000 - 0x7fff855f1fff com.apple.Carbon (154 - 157) <45A9A40A-78FF-3EA0-8FAB-A4F81052FA55> /System/Library/Frameworks/Carbon.framework/Versions/A/Carbon
0x7fff8727d000 - 0x7fff87419ff3 com.apple.QuartzCore (1.8 - 332.3) <72003E51-1287-395B-BCBC-331597D45C5E> /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore
0x7fff8ee6d000 - 0x7fff8eebffff libc++.1.dylib (120) <4F68DFC5-2077-39A8-A449-CAC5FDEE7BDE> /usr/lib/libc++.1.dylib
0x7fff90102000 - 0x7fff90103ff7 libSystem.B.dylib (1197.1.1) <E303F2F8-A8CF-3DF3-84B3-F2D0EE41CCF6> /usr/lib/libSystem.B.dylib
|
I am unable to open Atom from the command line on Mac OS X 10.10 (Beta 4)—I
don't remember when it broke or what caused it, but re-downloading/re-
installing the command line tools didn't fix it. I believe this is different
from #1103 because I'm on Atom `0.119.0`.
I investigated which line of `/usr/local/bin/atom` was causing the error and
discovered that
open -a /Applications/Atom.app
works fine, but
open -a /Applications/Atom.app -n
causes the error.
Full error message:
LSOpenURLsWithRole() failed for the application /Applications/Atom.app with error -10810.
| 1 |
**TypeScript Version:**
1.8.7
compiled: `tsc --noImplicitAny error.ts`
(the behavior is exactly the same with or without `--noImplictAny`)
**Code**
// A self-contained demonstration of the problem follows...
type Greeting = "hello" | "goodbye"
const apply = <T extends string,R>(t: T, f: (t:T) => R): R => f(t)
const f1 = (a: Greeting) => console.log('greetings')
const f2 = (a: number) => console.log(3+a)
apply("hello", f1) /* 1. should be good. is good */
apply("helo", f1) /* 2a. should be error. is good */
apply<Greeting, void>("helo", f1) /* 2b. should be error. is error */
apply("hello", f2) /* 3. should be error. is error */
apply(3, f1) /* 4. should be error. is error */
apply(3, f2) /* 5. should be error. is error */
**Expected behavior:**
2-5 should have an error:
2 should have an error because `"helo"` is not a member of `"hello"|"goodbye"`
3 should have an error because `"hello"` is not a `number`
4 should have an error because `3` is not a member of `"hello"|"goodbye"` and
because `3` is not a member of type which `extends string`
5 should have an error because `3` is not a member of a type which `extends
string`, and because `number` does not `extend string`.
**Actual behavior:**
1, 2b, 3-5 are correct
2a passes typechecking even though it should be equivalent to 2b
|
**TypeScript Version:**
1.8.9
**Code**
Based on Using the Compiler API, to generate an AST, use `createSourceFile`
like this:
// A self-contained demonstration of the problem follows...
ts.createSourceFile(<filename>, <file content>, ts.ScriptTarget.ES6, true)
What gets generated appears to be an untyped AST, i.e. the `.type` field only
gets filled if it's declared explicitly in the source code.
What I'm looking for is to generated the typed AST, i.e. having the sources
being loaded and typed-checked.
How can I do that? Thanks.
| 0 |
import 'dart:async';
import 'package:flutter/widgets.dart';
final GlobalKey _kKey = new GlobalKey();
Future main() async {
runApp(new Block(
children: new List<Widget>.generate(
2,
(_) => new Padding(
padding: const EdgeInsets.all(8.0),
child: new Container(
key: _kKey,
width: 100.0,
height: 100.0,
decoration: new BoxDecoration(
backgroundColor: new Color(0xFFFFFF00)))))));
}
|
If you're creating an app that has native and flutter components there is no
way to share images or fonts between flutter and android/ios. This means that
assets needed on both sides will have to be included in the binary twice and
that could be expensive.
| 0 |
Runas blows up because it's using old nan, but upgrading it to latest doesn't
seem to help, yet when I just run `npm install`, it all works >:| It'd be cool
if we could get on official io.js for building so we could just replace all
references of node.js everywhere in the docs
|
`gyp http GET https://atom.io/download/atom-shell/v0.25.1/iojs-v0.25.1.tar.gz`
`gyp http 403 https://atom.io/download/atom-shell/v0.25.1/iojs-v0.25.1.tar.gz`
It looks already released but can't access. Is it temporary?
https://github.com/atom/electron/releases
| 1 |
When trying to drag the mouse in order to select text, nothing happens and the
cursor stays where the mouse was first clicked. It also prevents dragging and
dropping tabs in the same way.
I had the same bug as soon as I updated my Google Chrome to version 41. I had
to add the flag '--touch-devices=123' when opening it, as mentioned in this
issue tracker:
https://code.google.com/p/chromium/issues/detail?id=456222#c102
How can I add this flag for Atom?
Windows 8.1 Host, Ubuntu 14.04 Guest, VirtualBox 4.3.26
|
There is a known issue with Chrome41 engine running on Linux in VirtualBox. It
actually became a show-stopper for Chrome42. **TL;DR** : in VB Chrome41 engine
_always_ runs in touch-screen mode, even on desktops.
Hotfix in Chrome41 is to pass `--touch-devices=123` to the chrome command
line.
Is it possible to pass parameters to Atom’s command line so that chrome engine
will understand them?
| 1 |
* Electron version: 1.4.3
* Operating system: macOS Sierra 10.12
* Description:
I haven't found anything about this in the issues, so I created one :) Maybe
it is a duplicate — then it can be closed :D
Anyway, it would be great if we could create a separator in the Tray,
like the menu-item type separator.
Current tray implementation (without separator):

(Here I like separators between upload and settings and between settings and
close)
Example of macOS:

|
* Electron Version: 2.0.5
* Operating System (Platform and Version): macOS Mojave beta4
**Expected Behavior**
Calling `browserWindow.setProgressBar(0.5)` sets the progress bar without crashing.
**Actual Behavior**
The app hard crashes, citing an unrecognized selector. This doesn't happen on
High Sierra:
[_NSImageViewSimpleImageView setDoubleValue:]: unrecognized selector sent to instance 0x7fa6b7472040
**To Reproduce**
Clone the repo and press the crash button...
git clone https://github.com/wavebox/electron-quick-start.git -b mojave_set_progressbar_crash
cd electron-quick-start
npm install
npm start
| 0 |
We should add some basic synchronization around multiple processes creating
the node_modules folder at the same time. Basically do whatever `cargo build`
does and have each process wait on the other.
This is necessary because it seems like multiple processes creating junctions
at the same time causes some weird issues with the junctions becoming invalid.
Also, it's duplicate work for the two processes to do this.
Edit: I have the code done for this... just need to write a test.
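The waiting behavior described above can be sketched with an atomic lock file. This is a hedged stdlib sketch, not the project's actual code — the lock-file path and the function name are made up for illustration:

```python
import os
import time

def with_install_lock(work, lock_path="node_modules/.install-lock",
                      timeout=60.0, poll=0.1):
    """Run work() while holding an exclusive cross-process lock.

    O_CREAT | O_EXCL makes lock-file creation atomic: exactly one process
    succeeds, and the others poll until the file disappears, i.e. they
    wait on each other the way concurrent `cargo build` processes do.
    """
    parent = os.path.dirname(lock_path)
    if parent:
        os.makedirs(parent, exist_ok=True)
    deadline = time.monotonic() + timeout
    while True:
        try:
            fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            break  # we own the lock now
        except FileExistsError:
            if time.monotonic() > deadline:
                raise TimeoutError("another process holds the install lock")
            time.sleep(poll)
    try:
        return work()
    finally:
        os.close(fd)
        os.unlink(lock_path)
```

A real implementation would also need to handle stale lock files left behind by crashed processes, e.g. by writing the holder's PID into the file and checking whether that process is still alive.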
|
I am using deno_core version 0.112.0 in my project. But when I make a release
build and run it on another computer without cargo installed I get the
following error:
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: The system cannot find the path specified. (os error 3)', C:\Users\user\.cargo\registry\src\github.com-1ecc6299db9ec823\deno_core-0.112.0\runtime.rs:390:38
It appears to be trying to load `00_primordials.js`, which doesn't exist on my
other system. Deno ships as a single executable, so I figured it should be
possible to do the same with deno_core.
| 0 |
**Do you want to request a _feature_ or report a _bug_?**
Report a bug.
**What is the current behavior?**
Getting the following error while trying to hydrate SSR content.
`Expected server HTML to contain a matching <linearGradient> in <defs>.`
**If the current behavior is a bug, please provide the steps to reproduce and
if possible a minimal demo of the problem via https://jsfiddle.net or similar
(template for React 16: https://jsfiddle.net/Luktwrdm/, template for React 15:
https://jsfiddle.net/hmbg7e9w/).**
The following check returns false
react/packages/react-dom/src/client/ReactDOM.js, lines 453 to 462 in 177cd85:

```js
canHydrateInstance(
  instance: Instance | TextInstance,
  type: string,
  props: Props,
): boolean {
  return (
    instance.nodeType === ELEMENT_NODE &&
    type.toLowerCase() === instance.nodeName.toLowerCase()
  );
},
```
as `type` is `linearGradient` and `instance.nodeName.toLowerCase()` is
`lineargradient`
**What is the expected behavior?**
Node types are compared correctly as SVG uses camelCase for tagName
**Which versions of React, and which browser / OS are affected by this issue?
Did this work in previous versions of React?**
React 16
|
**Do you want to request a _feature_ or report a _bug_?**
Bug
**What is the current behavior?**
I've started recieving this warning about my SVGs which is using
feGaussianBlur:
`Warning: Expected server HTML to contain a matching <feGaussianBlur> in
<filter>.`
**What is the expected behavior?**
Should be no warnings because there are no difference.
**Which versions of React, and which browser / OS are affected by this issue?
Did this work in previous versions of React?**
Such behavior started right after updating to `React 16.0.0-rc.3` and never
happened with `React 15.6.1`
| 1 |
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
EC2_GROUP
##### ANSIBLE VERSION
ansible 2.2.0.0
config file =
configured module search path = Default w/o overrides
##### CONFIGURATION
##### OS / ENVIRONMENT
Amazon Linux 2016.09
##### SUMMARY
"msg": "group referenced_group will be automatically created by rule
{'to_port': 65535, 'from_port': 0, 'proto': 'tcp', 'group_name':
'referenced_group'} and no description was provided"
##### STEPS TO REPRODUCE
pip install ansible
attempt to apply group with a reference to another group
---
- name: new_group
  hosts: localhost
  gather_facts: no
  tasks:
    - name: new_group
      local_action:
        module: ec2_group
        name: new_group
        description: new security group for some service
        vpc_id: "{{ vpc }}"
        region: "{{ region }}"
        profile: "{{ profile }}"
        rules:
          - proto: tcp
            from_port: 0
            to_port: 65535
            group_name: referenced_group
          - proto: tcp
            from_port: 22
            to_port: 22
            cidr_ip: "{{ access_cidr }}"
##### EXPECTED RESULTS
Application of security group(s)
##### ACTUAL RESULTS
fatal: [localhost -> localhost]: FAILED! => {
"changed": false,
"failed": true,
"msg": "group referenced_group will be automatically created by rule {'to_port': 65535, 'from_port': 0, 'proto': 'tcp', 'group_name': 'referenced_group'} and no description was provided"
|
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
ec2_group
##### ANSIBLE VERSION
ansible 2.2.1.0
config file =
configured module search path = Default w/o overrides
install: sudo pip install -U
git+https://github.com/ansible/ansible.git@v2.2.1.0-0.2.rc2 \--upgrade
--ignore-installed six
##### CONFIGURATION
default ansible.cfg
##### OS / ENVIRONMENT
OS X El Capitan 10.11.6
Multi VPC AWS environment. Peers between VPC's w/appropriate routing between
them
##### SUMMARY
ec2_group does not allow group_name to use groups via group name from other
peered VPC's. This worked in v1.9.6 and v2.0.2. It appears to be due to some
additional conditions that were added on 7/27/16. If I revert this change, it
works as expected.
`0419914`
Using group_id with the name of the security group in a different VPC works
but is not idempotent. It will create the rule if it's new but will fail if it
already exists.
##### STEPS TO REPRODUCE
Attempt to use a AWS security group name from another VPC with required
peering present using group_name:
- name: Provision Security Groups
ec2_group:
name: MyGroup
description: MyDesc
region: us-east-1
vpc_id: vpc-xxxxxxxx
rules:
- proto: tcp
from_port: 22
to_port: 22
group_name: <Other_VPC_SG_NAME>
Attempt to use an AWS security group name from another VPC (with the required
peering present) using group_id:

- name: Provision Security Groups
  ec2_group:
    name: MyGroup
    description: MyDesc
    region: us-east-1
    vpc_id: vpc-xxxxxxxx
    rules:
      - proto: tcp
        from_port: 22
        to_port: 22
        group_id: <Other_VPC_SG_NAME>
##### EXPECTED RESULTS
Translate group name to group ID (sg-xxxxxxxx) and update rule with the
security group ID of the other VPC's security group.
##### ACTUAL RESULTS
The group in the peered VPC is not found and an attempt to create a new group
in the current VPC is executed. In my case, this fails due to "no description"
Message using group_name with security group name from a different VPC
"msg": "group <PeeredVPCGroupName> will be automatically created by rule {'to_port': 22, 'from_port': 22, 'group_name': '<PeeredVPCGroupName>', 'proto': 'tcp'} and no description was provided"
Message using group_id with security group name from a different VPC if the
rule exists. If the rule does not exist, it is added as expected.
"module_stderr": "Traceback (most recent call last):\n File \"/var/folders/t8/vdrxm90s1ps41pghp7wxn8n80000gp/T/ansible_pJu0pZ/ansible_module_ec2_group.py\", line 479, in <module>\n main()\n File \"/var/folders/t8/vdrxm90s1ps41pghp7wxn8n80000gp/T/ansible_pJu0pZ/ansible_module_ec2_group.py\", line 374, in main\n group.authorize(rule['proto'], rule['from_port'], rule['to_port'], thisip, grantGroup)\n File \"/Library/Python/2.7/site-packages/boto-2.38.0-py2.7.egg/boto/ec2/securitygroup.py\", line 203, in authorize\n dry_run=dry_run)\n File \"/Library/Python/2.7/site-packages/boto-2.38.0-py2.7.egg/boto/ec2/connection.py\", line 3191, in authorize_security_group\n params, verb='POST')\n File \"/Library/Python/2.7/site-packages/boto-2.38.0-py2.7.egg/boto/connection.py\", line 1227, in get_status\n raise self.ResponseError(response.status, response.reason, body)\nboto.exception.EC2ResponseError: EC2ResponseError: 400 Bad Request\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Response><Errors><Error><Code>InvalidPermission.Duplicate</Code><Message>the specified rule \"peer: sg-xxxxxxxx, TCP, from port: 22, to port: 22, ALLOW\" already exists</Message></Error></Errors><RequestID>b9ec5eee-4a86-49d7-90b8-86bfbf2ba21b</RequestID></Response>\n",
"module_stdout": "",
"msg": "MODULE FAILURE"
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
Dialog will scroll its own content when I make a scroll gesture on the phone
## Current Behavior
Once I open a second fullscreen modal (which is opened previously by another
fullscreen modal), the scroll won't work as expected, since it will scroll
contents that are not visible (the Modal behind)
## Steps to Reproduce (for bugs)
https://codesandbox.io/s/pkjk07o8o0
* Open the link on iOS
* Go to the second Dialog
* Scroll to the very end, then scroll two more times while at the end
* Once at the end of the modal, scroll to the top
* Scrolling will start to feel weird.
Note: it seems that this happens when you scroll while tapping on an input;
not 100% sure.
## Context
I'm trying to go through a list of Cards on a Dialog and click on a specific
one so it will open an edit Modal; trying to add data to the fields on the 2nd
modal feels weird when I try to scroll.
## Your Environment
Tech | Version
---|---
Material-UI | Next
React | 16.2.0
browser | iOS
etc |
|
Tested dialogs in http://www.material-ui.com/#/components/dialog.
On desktop Chrome, background scrolling is disabled when dialogs are shown.
However, it is not disabled in iOS Safari or Chrome.
| 1 |
#### Code Sample
>>> pd.to_datetime('2017-01-01T15:00:00-05:00')
... Timestamp('2017-01-01 20:00:00') # This is a timezone-naive Timestamp, but was converted to UTC
#### Problem description
Given a timezone-aware string (e.g., in ISO8601 format), `pd.to_datetime`
converts the string to UTC time, but doesn't set the `tzinfo` to UTC.
This makes it difficult to differentiate between string that are truly
timezone-naive strings (e.g., '2017-01-01') and timezone-aware strings (e.g.,
'2017-01-01T15:00:00-05:00').
#### Expected Output
Option 1: Convert the string to UTC (current behavior) and also set the
timezone to UTC:
>>> pd.to_datetime('2017-01-01T15:00:00-05:00')
... Timestamp('2017-01-01 20:00:00+0000', tz='UTC')
Option 2: Don't convert to UTC and retain timezone. This is the behavior of
`dateutil.parser.parse`:
>>> pd.to_datetime(parser.parse('2017-01-01T15:00:00-0500'))
... Timestamp('2017-01-01 15:00:00-0500', tz='tzoffset(None, -18000)')
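For contrast, the standard library's parser keeps the offset attached to the parsed value, which is exactly what makes aware and naive inputs distinguishable. A minimal stdlib sketch (plain `datetime`, not pandas itself):

```python
from datetime import datetime, timedelta, timezone

# An offset-aware string keeps its tzinfo after parsing...
aware = datetime.strptime("2017-01-01T15:00:00-0500", "%Y-%m-%dT%H:%M:%S%z")
assert aware.utcoffset() == timedelta(hours=-5)

# ...so converting it to UTC yields an aware value (Option 1's behavior):
as_utc = aware.astimezone(timezone.utc)
assert as_utc.hour == 20 and as_utc.tzinfo is timezone.utc

# A truly naive string stays naive, so the two cases never collide:
naive = datetime.strptime("2017-01-01", "%Y-%m-%d")
assert naive.tzinfo is None
```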
#### Output of `pd.show_versions()`
INSTALLED VERSIONS
------------------
commit: None
python: 3.5.3.final.0
python-bits: 64
OS: Linux
OS-release: 4.1.35-pv-ts1
machine: x86_64
processor:
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.20.3
pytest: 3.1.2
pip: 9.0.1
setuptools: 27.2.0
Cython: 0.25.2
numpy: 1.12.1
scipy: 0.19.1
xarray: None
IPython: 6.1.0
sphinx: 1.6.2
patsy: 0.4.1
dateutil: 2.6.0
pytz: 2017.2
blosc: None
bottleneck: 1.2.1
tables: 3.3.0
numexpr: 2.6.2
feather: None
matplotlib: 2.0.2
openpyxl: 2.4.7
xlrd: 1.0.0
xlwt: None
xlsxwriter: 0.9.6
lxml: None
bs4: 4.6.0
html5lib: 0.999
sqlalchemy: 1.1.11
pymysql: None
psycopg2: None
jinja2: 2.9.6
s3fs: None
pandas_gbq: None
pandas_datareader: None
|
Suppose you have an existing SQL table called `person_age`, where `id` is the
primary key:
    age
id
1    18
2    42
and you also have new data in a `DataFrame` called `extra_data`
    age
id
2    44
3    95
then it would be useful to have an option on `extra_data.to_sql()` that allows
to pass the DataFrame to SQL with an `INSERT` or `UPDATE` option on the rows,
based on the `primary key`.
In this case, the `id=2` row would get updated to `age=44` and the `id=3` row
would get added
#### Expected Output
    age
id
1    18
2    44
3    95
#### (Maybe) helpful code references
* Use `merge` from SQLAlchemy?
* The query: '''INSERT or REPLACE into person_age (id, age) values (?,?,?) ''' in this question
I looked at `pandas` `sql.py` sourcecode to come up with a solution, but I
couldn't follow.
#### Code to replicate the example above
(Apologies for mixing `sqlalchemy` and `sqlite`.)
import pandas as pd
from sqlalchemy import create_engine
import sqlite3
conn = sqlite3.connect('example.db')
c = conn.cursor()
c.execute('''DROP TABLE IF EXISTS person_age;''')
c.execute('''
CREATE TABLE person_age
(id INTEGER PRIMARY KEY ASC, age INTEGER NOT NULL)
''')
conn.commit()
conn.close()
##### Create original table
engine = create_engine("sqlite:///example.db")
sql_df = pd.DataFrame({'id' : [1, 2], 'age' : [18, 42]})
sql_df.to_sql('person_age', engine, if_exists='append', index=False)
#### Extra data to insert/update
extra_data = pd.DataFrame({'id' : [2, 3], 'age' : [44, 95]})
extra_data.set_index('id', inplace=True)
#### extra_data.to_sql() with row update or insert option
expected_df = pd.DataFrame({'id': [1, 2, 3], 'age': [18, 44, 95]})
expected_df.set_index('id', inplace=True)
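The SQL side of the requested behavior can be sketched with the `INSERT OR REPLACE` query referenced above, using only the stdlib `sqlite3` module (no pandas involved; the table and column names mirror the example):

```python
import sqlite3

def upsert_rows(conn, rows):
    # INSERT OR REPLACE resolves primary-key conflicts by replacing the
    # existing row, giving insert-or-update semantics keyed on `id`.
    conn.executemany(
        "INSERT OR REPLACE INTO person_age (id, age) VALUES (?, ?)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE person_age (id INTEGER PRIMARY KEY ASC, age INTEGER NOT NULL)"
)
upsert_rows(conn, [(1, 18), (2, 42)])  # the existing table
upsert_rows(conn, [(2, 44), (3, 95)])  # extra_data: id=2 updated, id=3 added
print(conn.execute("SELECT id, age FROM person_age ORDER BY id").fetchall())
# -> [(1, 18), (2, 44), (3, 95)]
```

A `to_sql()` option could wrap the same idea per dialect; SQLAlchemy's `merge` or MySQL's `INSERT ... ON DUPLICATE KEY UPDATE` would play the same role there.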
| 0 |
## 📝 Provide a description of the new feature
I would love to see a feature allowing us to set any window to be always on
top of other windows. I think a good way to expose the feature would be to
add it to the context menu when you right-click on the title bar of a window.
## _What is the expected behavior of the proposed feature? What is the
scenario this would be used?_
Enable the feature, and the window stays always on top of ALL other windows.
If you'd like to see this feature implemented, add a 👍 reaction to this post.
|
Please make an MSI available as was done in previous versions. It's nice that
the EXE has apparently reached parity with the MSI, but MSI is much preferred
for distribution in an enterprise environment using Configuration Manager.
* * *
If you'd like to see this feature implemented, add a 👍 reaction to this post.
| 0 |
### Preflight Checklist
* I have read the Contributing Guidelines for this project.
* I agree to follow the Code of Conduct that this project adheres to.
* I have searched the issue tracker for a feature request that matches the one I want to file, without success.
### Electron Version
11.2.3
### What operating system are you using?
Windows
### Operating System Version
Windows 10
### What arch are you using?
x64
### Last Known Working Electron version
5.10.0
### Expected Behavior
When using the fse.copy() function to copy a file from one location to
another, it should work.
### Actual Behavior
It seems like, on 11.2.3, fse.copy() will stop working sometimes, without any
error or warning. It's like the app doesn't have the permission to perform the
operation. On 5.10.0, the same code works perfectly.
Here is more information.
1. Not everyone will encounter this bug.
2. For some people, they can temporarily solve this problem by restarting the
app. The bug will show up again afterwards.
3. For some people, even restarting the app will not solve this problem.
4. This bug is only found on Windows OS.
### Testcase Gist URL
_No response_
### Additional Information
Our app is a book reading app. We updated the version of electron from 5.10 to
11.2.3 last month.
We have got a lot of complaints about "can not add new books to the library
sometimes." after the update.
Our team member has encountered the same problem once, but we cannot reproduce
it afterwards.
Right now, we have narrowed the problem down to the following function:
fse.copy().
It seems that, on 11.2.3, fse.copy() will sometimes stop working, without any
error or warning. It's as if the app doesn't have permission to perform the
operation. On 5.10.0, the same code works perfectly.
|
Our app is a book reading app. We updated the version of electron from 5.10 to
11.2.3 last month.
We have got a lot of complaints about "can not add new books to the library
sometimes." after the update.
Our team member has encountered the same problem once, but we cannot reproduce
it afterward.
Right now, we have narrowed the problem down to the following function:
fse.copy().
It seems that, on 11.2.3, fse.copy() will sometimes stop working, without any
error or warning. It's as if the app doesn't have permission to perform the
operation. On 5.x, the same code works perfectly.
Here is more information.
1. Not everyone will encounter this bug.
2. Some people can temporarily solve the problem by restarting the app.
3. For some people, even restarting the app will not solve the problem.
4. This bug is only found on Windows.
Does anyone know why this happens? What should we do?
Thanks.
| 1 |
### Description
At present, the `set` method in XCom uses two steps:
1. delete all XCom with key `(dag_id, execution_date, task_id)`
2. insert new XCom
However, when the XCom is set frequently, errors like this can occur:
sqlalchemy.exc.OperationalError: (MySQLdb._exceptions.OperationalError) (1205, 'Lock wait timeout exceeded; try restarting transaction')
related code lines:
https://github.com/apache/airflow/blob/main/airflow/models/xcom.py#L84
### Use case/motivation
Would using `ON DUPLICATE KEY UPDATE` be better?
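A minimal sketch of the upsert idea, using SQLite's `ON CONFLICT ... DO UPDATE` (the analogue of MySQL's `ON DUPLICATE KEY UPDATE`) and a hypothetical simplified schema — Airflow's real XCom table has more columns:

```python
import sqlite3

# Hypothetical, simplified XCom table; the real Airflow schema differs.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE xcom (
        dag_id TEXT, task_id TEXT, execution_date TEXT, value TEXT,
        PRIMARY KEY (dag_id, task_id, execution_date)
    )
""")

def set_xcom(dag_id, task_id, execution_date, value):
    # One atomic statement instead of delete-then-insert, so concurrent
    # writers never contend over rows that were just deleted.
    conn.execute(
        """
        INSERT INTO xcom (dag_id, task_id, execution_date, value)
        VALUES (?, ?, ?, ?)
        ON CONFLICT (dag_id, task_id, execution_date)
        DO UPDATE SET value = excluded.value
        """,
        (dag_id, task_id, execution_date, value),
    )

set_xcom("my_dag", "my_task", "2021-01-01", "first")
set_xcom("my_dag", "my_task", "2021-01-01", "second")  # updates in place
rows = conn.execute("SELECT value FROM xcom").fetchall()
```

Because the insert and update happen in one statement, there is no window in which the key's rows are deleted but not yet re-inserted, which is where the lock-wait timeout above originates.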
### Related issues
_No response_
### Are you willing to submit a PR?
* Yes I am willing to submit a PR!
### Code of Conduct
* I agree to follow this project's Code of Conduct
|
I have a kind request for all the contributors to the latest provider packages
release.
Could you please help us to test the RC versions of the providers?
Let us know in the comment, whether the issue is addressed.
Those are providers that require testing as there were some substantial
changes introduced:
## Provider amazon: 8.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* take advantage of upcoming major version release to remove deprecated things (#30755): @vandonr-amz
* add a stop operator to emr serverless (#30720): @vandonr-amz
* SqlToS3Operator - Add feature to partition SQL table (#30460): @utkarsharma2
* New AWS sensor — DynamoDBValueSensor (#28338): @mrichman
* Fixed logging issue in `DynamoDBValueSensor` (#30703): @mrichman
* DynamoDBHook - waiter_path() to consider `resource_type` or `client_type` (#30595): @utkarsharma2
* Add ability to override waiter delay in EcsRunTaskOperator (#30586): @vincbeck
* Add support in AWS Batch Operator for multinode jobs (#29522): @vandonr-amz
* AWS logs. Exit fast when 3 consecutive responses are returned from AWS Cloudwatch logs (#30756): @vincbeck
* Remove @poke_mode_only from EmrStepSensor (#30774): @vincbeck
* Organize Amazon provider docs index (#30541): @eladkal
* Remove duplicate param docstring in EksPodOperator (#30634): @jlaneve
## Provider apache.beam: 5.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* Add mechanism to suspend providers (#30422): @potiuk
## Provider apache.kafka: 1.0.0rc1
* Add provider for Apache Kafka (#30175): @dylanbstorey
## Provider cncf.kubernetes: 6.1.0rc1
* Add multiple exit code handling in skip logic for `DockerOperator` and `KubernetesPodOperator` (#30769): @eladkal
* Skip KubernetesPodOperator task when it returns a provided exit code (#29000): @hussein-awala
## Provider databricks: 4.1.0rc1
* Add delete inactive run functionality to databricks provider (#30646): @phanikumv
* Databricks SQL sensor (#30477): @harishkesavarao
## Provider dbt.cloud: 3.1.1rc1
* Merge DbtCloudJobRunAsyncSensor logic to DbtCloudJobRunSensor (#30227): @Lee-W
* Move typing imports behind TYPE_CHECKING in DbtCloudHook (#29989): @josh-fell
## Provider docker: 3.6.0rc1
* Add multiple exit code handling in skip logic for `DockerOperator` and `KubernetesPodOperator` (#30769): @eladkal
* In `DockerOperator`, adding an attribute `tls_verify` to choose whether to validate certificate (#30309) (#30310): @oboki
* Deprecate `skip_exit_code` in `DockerOperator` and `KubernetesPodOperator` (#30733): @eladkal
## Provider google: 10.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* Update Google Campaign Manager360 operators to use API v4 (#30598): @VladaZakharova
* Update DataprocCreateCluster operator to use 'label' parameter properly (#30741): @VladaZakharova
* BigQueryGetDataOperator should use project_id (#30651): @ying-w
* Display Video 360 cleanup v1 API usage (#30577): @lwyszomi
## Provider microsoft.azure: 6.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* Add deferrable mode to `WasbBlobSensor` (#30488): @phanikumv
## Provider mysql: 5.0.0rc1
* Remove mysql-connector-python (#30487): @moiseenkov
## Provider presto: 5.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* Add mechanism to suspend providers (#30422): @potiuk
## Provider snowflake: 4.0.5rc1
* Update documentation for snowflake provider 4.0 breaking change (#30020): @potiuk
## Provider sqlite: 3.3.2rc1
* Use connection URI in SqliteHook (#28721): @feluelle
## Provider trino: 5.0.0rc1
* Remove deprecated "delegate_to" from GCP operators and hooks (#30748): @shahar1
* Add mechanism to suspend providers (#30422): @potiuk
The guidelines on how to test providers can be found in
Verify providers by contributors
All users involved in the PRs:
@feluelle @ying-w @vandonr-amz @eladkal @mrichman @hussein-awala @lwyszomi
@utkarsharma2 @jlaneve @josh-fell @vincbeck @shahar1 @Lee-W @harishkesavarao
@oboki @phanikumv @moiseenkov @potiuk
@VladaZakharova @dylanbstorey
### Committer
* I acknowledge that I am a maintainer/committer of the Apache Airflow project.
| 0 |
# Bug report
**What is the current behavior?**
For dynamic imports that contain an expression, the
`resolve.extensionAlias` configuration option is not used when resolving the
imports.
**If the current behavior is a bug, please provide the steps to reproduce.**
// Config
const config = {
...
resolve: {
extensions: [ '.ts', '.tsx', '.js', '.jsx' ],
extensionAlias: {
'.js': [ '.js', '.ts', '.tsx' ]
}
}
...
};
// Input:
// /index.ts
const x = 'a';
import(`./lazy-${x}.js`);
// /lazy-a.ts
export {}
// Output:
...
/***/ "./. lazy recursive ^\\.\\/lazy\\-.*\\.js$":
/*!*****************************************************!*\
!*** ././ lazy ^\.\/lazy\-.*\.js$ namespace object ***!
\*****************************************************/
/***/ ((module) => {
function webpackEmptyAsyncContext(req) {
// Here Promise.resolve().then() is used instead of new Promise() to prevent
// uncaught exception popping up in devtools
return Promise.resolve().then(() => {
var e = new Error("Cannot find module '" + req + "'");
e.code = 'MODULE_NOT_FOUND';
throw e;
});
}
webpackEmptyAsyncContext.keys = () => ([]);
webpackEmptyAsyncContext.resolve = webpackEmptyAsyncContext;
webpackEmptyAsyncContext.id = "./. lazy recursive ^\\.\\/lazy\\-.*\\.js$";
module.exports = webpackEmptyAsyncContext;
/***/ })
...
(() => {
/*!******************!*\
!*** ./index.ts ***!
\******************/
const x = 'a';
__webpack_require__("./. lazy recursive ^\\.\\/lazy\\-.*\\.js$")(`./lazy-${x}.js`);
})();
...
When we change the code to use the `.ts` extension in the import, like:
...
import(`./lazy-${x}.ts`);
...
Or when we remove the extension from the import expression, like:
...
import(`./lazy-${x}`);
...
The output code is fine and contains the resolution map like:
// Output:
...
/***/ "./. lazy recursive ^\\.\\/lazy\\-.*\\.ts$":
/*!*****************************************************!*\
!*** ././ lazy ^\.\/lazy\-.*\.ts$ namespace object ***!
\*****************************************************/
/***/ ((module, __unused_webpack_exports, __webpack_require__) => {
var map = {
"./lazy-a.ts": [
"./lazy-a.ts",
"lazy-a_ts"
]
};
...
The full gist: https://gist.github.com/majo44/e774abdc4aea027d11df1ee8582e97ee
**What is the expected behavior?**
For ES-module JavaScript/TypeScript code, file extensions are formally
required in import specifiers, and TypeScript's resolution logic also supports
the '.js' extension for '.ts' sources. Webpack handles this perfectly for
static imports and for dynamic imports without expressions, via the provided
`extensionAlias` configuration property. Unfortunately, the resolution of
dynamic imports with expressions is not aligned with this. The expectation is
that dynamic imports with expressions are resolved properly, taking
`extensionAlias` into consideration.
**Other relevant information:**
webpack version: 5.80.0
Node.js version: v18.16.0
Operating System: Windows 11
|
# Bug report
**What is the current behavior?**
* * *
## **The same problem exists in `require.context` and
`import.meta.webpackContext`**, so they need to be fixed too
For example, `highlight.js` has the following in `exports` of `package.json`:
{
"exports": {
"./lib/languages/*": {
"require": "./lib/languages/*.js",
"import": "./es/languages/*.js"
}
}
}
I tried to import a language dynamically using the following code in a Vue 3 /
TS 4 app:
const langName = ref('')
const message = ref('')
const module = await import(
`highlight.js/lib/languages/${langName.value}`
)
message.value = module.default.name
This leads to the following error:
error in ./src/App.vue?vue&type=script&lang=ts
Module not found: Error: Package path ./lib/languages is not exported from package <path_to_proj>/node_modules/highlight.js (see exports field in <path_to_proj>/node_modules/highlight.js/package.json)
**If the current behavior is a bug, please provide the steps to reproduce.**
0. Clone minimum reproduction repo and `npm install`.
1. Run the project with `npm run serve`.
2. See error.
**What is the expected behavior?**
Replacing `${langName.value}` with a literal value works and imports the file
as expected:
const langName = ref('')
const message = ref('')
const module = await import(
`highlight.js/lib/languages/javascript`
)
message.value = module.default.name
Dynamic import should work equally well.
**Workarounds**
The comment highlightjs/highlight.js#3223 (comment) contains possible
workarounds, if that helps to narrow the problem.
**Other relevant information:**
webpack version: 5
Node.js version: 14
Operating System: mac OS
Additional tools: TypeScript 4, Vue 3, Babel 7
| 1 |
The challenge asks the user to make the top margin 20px when the challenge
itself only accepts 40px as the correct answer
|
In the instructions for the margin, you have the pixel setting for "Top" to
20px instead of 40px.
| 1 |
* Electron version: 1.4.14
* Operating system: Mac v10.12.3
### Expected behavior
With the BrowserWindow option frame: false, there is an invisible window zone
at the top of the browser frame. I want CSS cursor: pointer to work in that
zone.
My options code:
mainWindow = new BrowserWindow({
show: false,
frame: false,// look
title: "",
useContentSize: false,
width: mainWindowWidth,
minWidth: mainWindowWidth,
height: mainWindowHeight,
minHeight: mainWindowHeight,
maximizable:false,
fullscreenable:false,
icon: iconPath
});
|
* Electron version: 1.4.1
* Operating system: Mac with retina display
Step to reproduce:
1. create a frameless window on a retina device
2. put an element at the very top, say a div, and set the cursor type to pointer
3. observe that the cursor does not always change to a 'pointer' type
I have a video here for demo purpose: https://youtu.be/yFZayteopgg
The video demonstrated the issue on Mac, but it's also reproducible on
Windows, and on Windows it's worse because not only the cursor type does not
change, the click event won't register either. Thus it's impossible to
correctly implement close/maximize/minimize buttons with CSS.
Notice it's only reproducible on hi dpi devices, everything seems to work
correctly on regular displays.
the demo in the video is made by adapting the frameless window example in the
electron api demo app.
| 1 |
Would it be possible to utilize `APP_ENV` variable when running `yarn encore`?
So instead of `yarn encore dev` or `yarn encore production` I would do `yarn
encore`.
More explicit option would be `yarn encore env` that selects dev or production
based on `APP_ENV`.
This would avoid duplication of install/deploy configurations between
production and development.
The rest of Symfony already does that.
|
**Symfony version(s) affected** : 4.1.1 (at least)
**Description**
I tried to decorate the serializer, but it's not possible; I get an exception:
> Circular reference detected for service "App\Serializer", path:
> "App\Serializer -> App\Serializer.inner -> App\Serializer".
**How to reproduce**
With a fresh website skeleton, add this commit: lyrixx/test@`f6e27f5`
I tried to decorate the Router, in order to ensure the decoration system works
as expected, and it works well. So I think there is something special with the
serializer, but I don't know what :/
Note: Actually, I don't need this at all, I found that while trying to help
someone
| 0 |
I would like to apply a directive conditionally through something like
[class.draggable]="someBool" with '.draggable' as the selector on the
directive, but this is not supported. What then is the recommended way to
conditionally apply a directive on an existing element? Do I use an attribute
for the selector and pass in the boolean as an input (i.e. the directive is
always "active")? I'd prefer not to use ng-ifs with duplicate elements
everywhere. Is there a cleaner way to attach and detach a behavior?
|
The metadata extractor allows this:
export function FACTORY(parentDispatcher: ViewportRuler) {
return parentDispatcher || new ViewportRuler();
};
But not this:
export const FACTORY = (parentDispatcher: ViewportRuler) => {
return parentDispatcher || new ViewportRuler();
};
The latter results in
Error: Error encountered resolving symbol values statically.
Function calls are not supported.
Consider replacing the function or lambda with a reference to an exported function
(position 59:48 in the original .ts file),
resolving symbol VIEWPORT_RULER_PROVIDER_FACTORY in...
Note from @chuckjaz:
The reason we don't today is because we blindly inline local references.
What we should should only conditionally inline (simplify) exported constants.
| 0 |
## Checklist
* [x] I have included the output of `celery -A proj report` in the issue.
software -> celery:4.1.0 (latentcall) kombu:4.1.0 py:3.6.5
billiard:3.5.0.3 redis:2.10.6
platform -> system:Windows arch:64bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:redis results:redis://localhost:6379/0
broker_url: 'redis://127.0.0.1:6379//'
result_backend: 'redis://localhost:6379/0'
include: ['base.tasks']
* [x] I have verified that the issue exists against the `master` branch of Celery.
## Steps to reproduce
`celery -A base.celery worker -P gevent`
## Expected behavior
runs
## Actual behavior
File "c:\programdata\miniconda3\envs\myenv\lib\site-packages\celery\concurrency\gevent.py", line 34, in __init__
from gevent.greenlet import Greenlet, GreenletExit
ImportError: cannot import name 'GreenletExit'
Changing `Timer.__init__` from this:
class Timer(_timer.Timer):
def __init__(self, *args, **kwargs):
from gevent.greenlet import Greenlet, GreenletExit
...to this:
class Timer(_timer.Timer):
def __init__(self, *args, **kwargs):
from gevent.greenlet import Greenlet
from greenlet import GreenletExit
...resolves the problem. After changing the import then `celery -A base.celery
worker -P gevent` runs like it should.
-------------- celery@DESKTOP-7QUDBA4 v4.1.0 (latentcall)
---- **** -----
--- * *** * -- Windows-10-10.0.17134-SP0 2018-05-29 22:02:10
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: base:0x205ee24df60
- ** ---------- .> transport: redis://127.0.0.1:6379//
- ** ---------- .> results: redis://localhost:6379/0
- *** --- * --- .> concurrency: 4 (gevent)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
I'm running gevent==1.3.2 and greenlet==0.4.13
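The two-location import above is an instance of a general fallback pattern for names that move between modules across library versions. A stdlib-only sketch (gevent/greenlet themselves are not imported here; `import_first` is a hypothetical helper name):

```python
def import_first(candidates):
    """Return the first attribute that resolves among (module, name) pairs.

    Useful when a name (like GreenletExit) moves between modules across
    library versions.
    """
    for module_name, attr in candidates:
        try:
            module = __import__(module_name, fromlist=[attr])
            return getattr(module, attr)
        except (ImportError, AttributeError):
            continue
    raise ImportError("none of the candidate locations resolved")

# Demonstrated with stdlib names: the first location does not exist,
# so the helper falls through to collections.OrderedDict.
OrderedDict = import_first([
    ("nonexistent_legacy_module", "OrderedDict"),
    ("collections", "OrderedDict"),
])
```

The same shape covers the Celery fix: try `("gevent.greenlet", "GreenletExit")` first, then fall back to `("greenlet", "GreenletExit")`.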
|
## Checklist
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
## Steps to reproduce
I'm using Python 3.5.2 with celery 4.1.0 and gevent 1.3.0
When launching a worker, it returns an ImportError on GreenletExit
celery worker --app=superset.sql_lab:celery_app --pool=gevent -Ofair
## Expected behavior
The worker should launch.
## Actual behavior
Celery is unable to launch the worker, it return this error:
/home/joao/Superset/lib/python3.5/site-packages/celery/__init__.py:113:
MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported
may lead to errors, including RecursionError on Python 3.6. Please monkey-
patch earlier. See gevent/gevent#1016
monkey.patch_all()
Loaded your LOCAL configuration at [/home/joao/Superset/superset_config.py]
/home/joaog/Superset/lib/python3.5/site-packages/flask_cache/jinja2ext.py:33:
ExtDeprecationWarning: Importing flask.ext.cache is deprecated, use
flask_cache instead.
from flask.ext.cache import make_template_fragment_key
Traceback (most recent call last):
File "/home/joao/superset/bin/celery", line 11, in
load_entry_point('celery==4.1.0', 'console_scripts', 'celery')()
...
...
...
File "/home/joao/Superset/lib/python3.5/site-
packages/celery/concurrency/gevent.py", line 34, in __init__
from gevent.greenlet import Greenlet, GreenletExit
ImportError: cannot import name 'GreenletExit'
It appears the GreenletExit should be imported like this:
from gevent import GreenletExit
| 1 |
> PHP Fatal error: Can't inherit abstract function
> Symfony\Component\HttpKernel\Log\LoggerInterface::alert() (previously
> declared abstract in Psr\Log\LoggerInterface) in
> /home/olaurendeau/git/lafourchette-
> core/vendor/symfony/symfony/src/Symfony/Bridge/Monolog/Logger.php on line 23
Symfony interface LoggerInterface seems to conflict with Psr interface
LoggerInterface .
Between Symfony class Logger
class Logger extends BaseLogger implements LoggerInterface, DebugLoggerInterface
And freshly updated Monolog class Logger
class Logger implements LoggerInterface
|
As of today, we have a `LoggerInterface` declared in the HttpKernel component.
This can be problematic for components that need a logger but do not depend on
HttpKernel.
There are several options here to fix this "issue":
1/ Let the FIG group decide on an interface and rely on the FIG package that
would declare this interface (keep in mind that we need to have something
stable by the end of December for Symfony 2.2);
2/ Create a new Symfony component with just the `LoggerInterface` file;
3/ Remove the interface altogether and rely on Monolog instead;
4/ Create a Symfony Logger component that support both Monolog and the Zend
Logger;
5/ Do nothing.
see #5911 for the start of the discussion.
I tend to prefer option 3 for the many reasons:
* If we had created a Logger component when we decided to create Monolog (switching from the Zend Logger at that time), we wouldn't have this discussion about this today. So, I like to think of Monolog as being the Symfony Logger component;
* There aren't that many other logger libraries in PHP, and this is not the kind of problem that needs the flexibility to be replaced by something else;
* Monolog is flexible enough to cover all the use cases you will ever need;
* If this is a must, we can also add support for the Zend Logger library within Monolog (@Seldaek agreed with this possibility);
* Thanks to Composer, relying on an external library like Monolog is not a problem anymore.
| 1 |
Function `redirect` returns the wrong type according to `mypy`:
# file "redir.py"
from flask import Response, redirect
def something() -> Response:
return redirect("https://wikipedia.org/")
Then mypy complains:
redir.py:3: error: Incompatible return value type (got "werkzeug.wrappers.response.Response", expected "flask.wrappers.Response")
Found 1 error in 1 file (checked 1 source file)
Function `redirect` should return a `flask.wrappers.Response`.
Environment:
* Python version: 3.8.10
* Flask version: 2.1.2
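The complaint is ordinary subtyping: `werkzeug.wrappers.Response` is the base class of `flask.wrappers.Response`, and a function annotated to return the subclass cannot return the base class. A stdlib-only sketch with hypothetical stand-in classes (not Flask's actual types):

```python
class BaseResponse:                 # stands in for werkzeug.wrappers.Response
    def __init__(self, status: int = 200) -> None:
        self.status = status

class FlaskResponse(BaseResponse):  # stands in for flask.wrappers.Response
    pass

def redirect(location: str) -> BaseResponse:
    # Like werkzeug's redirect(): typed as returning the *base* class.
    return BaseResponse(status=302)

# mypy would reject this, exactly as in the report, because a
# BaseResponse is not necessarily a FlaskResponse:
#
#     def something() -> FlaskResponse:
#         return redirect("https://wikipedia.org/")

# Widening the annotation to the base type satisfies the checker:
def something() -> BaseResponse:
    return redirect("https://wikipedia.org/")

resp = something()
```

Until `redirect` is typed to return Flask's subclass, annotating view functions with the werkzeug type is the least invasive workaround.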
|
I am not sure if this is a bug in Sphinx, but stuff like the `root_path` and
`static_url_path` attributes/properties from `_PackageBoundObject` end up not
documented in the API page of the docs. `api.rst` just contains this:
.. autoclass:: Flask
:members:
:inherited-members:
| 0 |
When I try to import the package it shows me this message:
Traceback (most recent call last):
File
"/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/__init__.py", line 44, in
from ._check_build import check_build # noqa
File
"/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_bundle/pydev_import_hook.py",
line 21, in do_import
module = self._system_import(name, *args, **kwargs)
ImportError:
dlopen(/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/_check_build.cpython-37m-darwin.so, 2): Symbol
not found: ____chkstk_darwin
Referenced from:
/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/../.dylibs/libomp.dylib (which was built for
Mac OS X 10.15)
Expected in: /usr/lib/libSystem.B.dylib
in
/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/../.dylibs/libomp.dylib
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "", line 1, in
File
"/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_bundle/pydev_import_hook.py",
line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File
"/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__init__.py", line 81, in
from . import __check_build # noqa: F401
File
"/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_bundle/pydev_import_hook.py",
line 21, in do_import
module = self._system_import(name, *args, **kwargs)
File
"/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/ **init**.py", line 46, in
raise_build_error(e)
File
"/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/ **init**.py", line 41, in raise_build_error
%s""" % (e, local_dir, ''.join(dir_content).strip(), msg))
ImportError:
dlopen(/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/_check_build.cpython-37m-darwin.so, 2): Symbol
not found: ____chkstk_darwin
Referenced from:
/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/../.dylibs/libomp.dylib (which was built for
Mac OS X 10.15)
Expected in: /usr/lib/libSystem.B.dylib
in
/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build/../.dylibs/libomp.dylib
* * *
Contents of
/Users/fabio/Desktop/Utilità/Politecnico/Magistrale/pythonProject1/venv/lib/python3.7/site-
packages/sklearn/__check_build:
__init__.py __pycache__ _check_build.cpython-37m-darwin.so
setup.py
* * *
It seems that scikit-learn has not been built correctly.
If you have installed scikit-learn from source, please do not forget
to build the package before using it: run `python setup.py install` or
`make` in the source directory.
If you have used an installer, please check that it is suited for your
Python version, your operating system and your platform.
I've already tried to fix it by uninstalling and reinstalling, but I still
see this message.
|
#### Describe the bug
Importing `scikit-learn==0.24` fails on macOS 10.13.
I believe this is similar to nodegui/nodegui#391, i.e. the prebuilt `libomp`
binary (from 10.15) isn't compatible with macOS 10.13.
We're also tracking this issue on the project where we encountered it:
spinalcordtoolbox/spinalcordtoolbox#3121.
#### Steps/Code to Reproduce
1. Install `scikit-learn==0.24.0` on macOS 10.13 (for us, it appeared as a successful install)
* Shown in our CI: https://travis-ci.com/github/neuropoly/spinalcordtoolbox/jobs/464176777#L568
2. Import `scikit-learn` (for us, an error was thrown)
* Shown in our CI: https://travis-ci.com/github/neuropoly/spinalcordtoolbox/jobs/464176777#L1074
#### Expected Results
No error is thrown.
#### Actual Results
dlopen(/Users/travis/build/neuropoly/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/sklearn/__check_build/_check_build.cpython-36m-darwin.so, 2): Symbol not found: ____chkstk_darwin
Referenced from: /Users/travis/build/neuropoly/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/sklearn/__check_build/../.dylibs/libomp.dylib (which was built for Mac OS X 10.15)
Expected in: /usr/lib/libSystem.B.dylib
in /Users/travis/build/neuropoly/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/sklearn/__check_build/../.dylibs/libomp.dylib
___________________________________________________________________________
Contents of /Users/travis/build/neuropoly/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/sklearn/__check_build:
__init__.py __pycache__ _check_build.cpython-36m-darwin.so
setup.py
___________________________________________________________________________
It seems that scikit-learn has not been built correctly.
If you have installed scikit-learn from source, please do not forget
to build the package before using it: run `python setup.py install` or
`make` in the source directory.
If you have used an installer, please check that it is suited for your
Python version, your operating system and your platform.
#### Versions
* Python: `python-3.6.12`
* Numpy: `numpy-1.19.4`
* Scipy: `scipy-1.5.4`
* Scikit-learn: `scikit-learn-0.24.0`
| 1 |
**Migrated issue, originally created by Anonymous**
using cast and INTERVAL seems to cause a problem: (note this code is corrupted
by an issue tracker migration some years ago)
import sqlalchemy as sa
import sqlalchemy.dialects.postgresql as sa_pg
s1 = sa.select([sa.cast('1 week ago', sa_pg.DATE)])
print s1
s2 = sa.select([sa.cast('100 seconds', sa_pg.INTERVAL)])
print s2
AttributeError: 'GenericTypeCompiler' object has no attribute 'visit_INTERVAL'
The DATE cast works fine.
The INTERVAL cast fails.
0.6.7 and 0.7b5 (current as of 7606:4d99799ee724070fe0fe7404f655854d223f6e93)
|
**Migrated issue, originally created by Michael Bayer (@zzzeek)**
diff --git a/lib/sqlalchemy/engine/default.py b/lib/sqlalchemy/engine/default.py
index 3e5f339..9f845e7 100644
--- a/lib/sqlalchemy/engine/default.py
+++ b/lib/sqlalchemy/engine/default.py
@@ -474,6 +474,23 @@ class DefaultDialect(interfaces.Dialect):
self.set_isolation_level(dbapi_conn, self.default_isolation_level)
+class StrCompileDialect(DefaultDialect):
+
+ statement_compiler = compiler.StrSQLCompiler
+ ddl_compiler = compiler.DDLCompiler
+ type_compiler = compiler.StrSQLTypeCompiler
+ preparer = compiler.IdentifierPreparer
+
+ supports_sequences = True
+ sequences_optional = True
+ preexecute_autoincrement_sequences = False
+ implicit_returning = False
+
+ supports_native_boolean = True
+
+ supports_simple_order_by_label = True
+
+
class DefaultExecutionContext(interfaces.ExecutionContext):
isinsert = False
isupdate = False
diff --git a/lib/sqlalchemy/sql/compiler.py b/lib/sqlalchemy/sql/compiler.py
index c5f87cc..076ae53 100644
--- a/lib/sqlalchemy/sql/compiler.py
+++ b/lib/sqlalchemy/sql/compiler.py
@@ -2118,6 +2118,30 @@ class SQLCompiler(Compiled):
self.preparer.format_savepoint(savepoint_stmt)
+class StrSQLCompiler(SQLCompiler):
+ """"a compiler subclass with a few non-standard SQL features allowed.
+
+ Used for stringification of SQL statements when a real dialect is not
+ available.
+
+ """
+
+ def visit_getitem_binary(self, binary, operator, **kw):
+ return "%s[%s]" % (
+ self.process(binary.left, **kw),
+ self.process(binary.right, **kw)
+ )
+
+ def returning_clause(self, stmt, returning_cols):
+
+ columns = [
+ self._label_select_column(None, c, True, False, {})
+ for c in elements._select_iterables(returning_cols)
+ ]
+
+ return 'RETURNING ' + ', '.join(columns)
+
+
class DDLCompiler(Compiled):
@util.memoized_property
@@ -2640,6 +2664,17 @@ class GenericTypeCompiler(TypeCompiler):
return type_.get_col_spec(**kw)
+class StrSQLTypeCompiler(GenericTypeCompiler):
+ def __getattr__(self, key):
+ if key.startswith("visit_"):
+ return self._visit_unknown
+ else:
+ raise AttributeError(key)
+
+ def _visit_unknown(self, type_, **kw):
+ return "%s" % type_.__class__.__name__
+
+
class IdentifierPreparer(object):
"""Handle quoting and case-folding of identifiers based on options."""
diff --git a/lib/sqlalchemy/sql/elements.py b/lib/sqlalchemy/sql/elements.py
index de17aab..fe2fecc 100644
--- a/lib/sqlalchemy/sql/elements.py
+++ b/lib/sqlalchemy/sql/elements.py
@@ -429,7 +429,7 @@ class ClauseElement(Visitable):
dialect = self.bind.dialect
bind = self.bind
else:
- dialect = default.DefaultDialect()
+ dialect = default.StrCompileDialect()
return self._compiler(dialect, bind=bind, **kw)
def _compiler(self, dialect, **kw):
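The `__getattr__` fallback in `StrSQLTypeCompiler` above is a general visitor trick: any `visit_*` attribute with no concrete handler resolves to a method that stringifies the type's class name instead of raising `AttributeError` (the failure mode in the original report). A stdlib-only sketch with hypothetical type classes:

```python
class Date:
    pass

class Interval:
    pass

class StrTypeCompiler:
    """Type visitor with a catch-all: unknown visit_* names fall back to
    stringifying the visited type's class name."""

    def visit_DATE(self, type_):
        return "DATE"

    def __getattr__(self, key):
        # Only reached when normal attribute lookup fails, i.e. for
        # visit_* methods that have no concrete implementation.
        if key.startswith("visit_"):
            return self._visit_unknown
        raise AttributeError(key)

    def _visit_unknown(self, type_):
        return type_.__class__.__name__

compiler = StrTypeCompiler()
```

Concrete handlers win when present (`visit_DATE`), while `visit_INTERVAL` falls through to the stringifier rather than crashing.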
| 1 |
https://storage.cloud.google.com/kubernetes-jenkins/pr-logs/pull/27728/node-pull-build-e2e-test/11333/build-log.txt
Summarizing 9 Failures:
[Fail] [k8s.io] Downward API [It] should provide container's limits.cpu/memory and requests.cpu/memory as env vars
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2078
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when starting a container that exits [It] it should run with the expected status [Conformance]
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:114
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when starting a container that exits [It] should report termination message if TerminationMessagePath is set [Conformance]
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:157
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should not be able to pull image from invalid registry
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:270
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should not be able to pull non-existing image from gcr.io
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:270
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should be able to pull image from gcr.io
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:265
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should be able to pull image from docker hub
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:265
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should not be able to pull from private registry without secret
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:270
[Fail] [k8s.io] Container Runtime Conformance Test container runtime conformance blackbox test when running a container with a new image [It] should be able to pull from private registry with secret
/var/lib/jenkins/workspace/node-pull-build-e2e-test/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/test/e2e_node/runtime_conformance_test.go:265
Ran 20 of 23 Specs in 1950.350 seconds
FAIL! -- 11 Passed | 9 Failed | 0 Pending | 3 Skipped --- FAIL: TestE2eNode (1950.35s)
FAIL
Cluster level logging using Elasticsearch
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/es_cluster_logging.go:46
should check that logs from pods on all nodes are ingested into Elasticsearch [It]
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/es_cluster_logging.go:45
Expected error:
<*errors.errorString | 0xc82073bf50>: {
s: "gave up waiting for pod 'synthlogger-1' to be 'success or failure' after 5m0s",
}
gave up waiting for pod 'synthlogger-1' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/es_cluster_logging.go:296
From #20149 (comment):
e2e failed because kubelet wasn't able to pull the ubuntu image in time
(http://kubekins.dls.corp.google.com:8081/job/kubernetes-pull-build-test-e2e-gce/26352/):
failed to "StartContainer" for "synth-logger" with ErrImagePull: "API error (500): unable to ping registry endpoint https://gcr.io/v0/\nv2 ping attempt failed with error: Get https://gcr.io/v2/: dial tcp: lookup gcr.io on 10.240.0.1:53: dial udp 10.240.0.1:53: network is unreachable\n v1 ping attempt failed with error: Get https://gcr.io/v1/_ping: dial tcp: lookup gcr.io on 10.240.0.1:53: dial udp 10.240.0.1:53: network is unreachable\n"
| 0 |
The places I'd expect to get warnings are indicated by the comments in this
example:
fn main() {
let a: u8 = 1000;
let b = 1000u8;
let c = 1000 as u8; // NO WARNING !!!
let one: u8 = 1;
let d = (one * 1000) as u32;
let e = (one * -1) as u32; // NO WARNING !!!
let f = (one * -255) as u32; // NO WARNING !!!
let g = (one * -256) as u32;
println!("{}, {}, {}, {}, {}, {}, {}", a, b, c, d, e, f, g);
}
Here's the compiler output:
warning: literal out of range for its type, #[warn(type_overflow)] on by default
let a: u8 = 1000;
^~~~
warning: literal out of range for its type, #[warn(type_overflow)] on by default
let b = 1000u8;
^~~~~~
warning: literal out of range for its type, #[warn(type_overflow)] on by default
let d = (one * 1000) as u32;
^~~~
warning: literal out of range for its type, #[warn(type_overflow)] on by default
let g = (one * -256) as u32;
^~~
And here's the program output:
`232, 232, 232, 232, 255, 1, 0`
|
The compiler should show a warning message when a negative number is used in an
unsigned integer type, for example:
let n: u8 = -1;
println!("{}", n); // => 255
| 1 |
on master: https://travis-ci.org/angular/angular/jobs/66142657#L650
are those expected ?
/cc @kegluneq
|
Right now, we get the following errors during our build. I think this is
because we have almost empty files inside of `web/e2e_test` that look like
this:
E.g. compiler_perf.dart
library benchmarks.e2e_test.compiler_perf;
main() {}
The error in the build is as follows:
[21:01:16] Starting 'build/pubbuild.dart'...
Loading source assets...
Loading angular2, di/module_transformer, observe and smoke/src/default_transformer transformers... (1.5s)
Loading angular transformers... (1.4s)
Building benchmarks_external...
[Warning from InjectorGenerator on benchmarks_external|web/e2e_test/compiler_perf.dart]:
Unable to resolve injectable annotation angular.core.annotation_src.Decorator
[Warning from InjectorGenerator on benchmarks_external|web/e2e_test/compiler_perf.dart]:
Unable to resolve injectable annotation angular.core.annotation_src.Component
[Warning from InjectorGenerator on benchmarks_external|web/e2e_test/compiler_perf.dart]:
Unable to resolve injectable annotation angular.core.annotation_src.Formatter
[Warning from InjectorGenerator on benchmarks_external|web/e2e_test/compiler_perf.dart]:
Unable to resolve injected type name perf_api.Profiler
Build error:
Transform InjectorGenerator on benchmarks_external|web/e2e_test/compiler_perf.dart threw error: Bad state: No element
dart:collection/iterable.dart 329 IterableBase.last
package:di/transformer/injector_generator.dart 368:30 _Processor._editMain
package:di/transformer/injector_generator.dart 72:5 _Processor.process
package:di/transformer/injector_generator.dart 25:52 InjectorGenerator.applyResolver
package:code_transformers/src/resolvers.dart 111:31 Transformer&ResolverTransformer.applyToEntryPoints.<fn>.<fn>
dart:async/future.dart 118 Future.Future.<fn>
dart:async-patch/timer_patch.dart 16 Timer._createTimer.<fn>
dart:isolate-patch/timer_impl.dart 385 _Timer._runTimers
dart:isolate-patch/timer_impl.dart 411 _Timer._handleMessage
dart:isolate-patch/isolate_patch.dart 142 _RawReceivePortImpl._handleMessage
dart:collection IterableBase.last
package:di/transformer/injector_generator.dart 368:30 _Processor._editMain
package:di/transformer/injector_generator.dart 72:5 _Processor.process
package:di/transformer/injector_generator.dart 25:52 InjectorGenerator.applyResolver
package:code_transformers/src/resolvers.dart 111:31 Transformer&ResolverTransformer.applyToEntryPoints.<fn>.<fn>
dart:isolate _RawReceivePortImpl._handleMessage
| 1 |
Currently, developing a flutter project that includes swift can be a bit of a
pain when including various plugins - in particular Firebase messaging is the
one I've had trouble with.
This is more or less the structure of (the iOS part) of my app:
App
↳ swift source
↳ firebase messaging
↳ objc source
↳ various static libraries included in sub-pods etc
↳ Qr code plugin
↳ objc source (originally swift but had issues with that)
↳ Google MobileVision (objc source)
↳ various static libraries included as sub-pods etc
↳ Swift plugin(s) (source, no libraries)
↳ Objc plugin(s) (source, no libraries)
Because of my usage of swift, `use_frameworks!` is set in my Podfile.
What's happening when I build currently is that the various libraries are each
being built as dynamic frameworks, i.e. Firebase, the qr code plugin, swift
plugin(s), objc plugin(s). The static libraries then appear to be linked
properly when running on the device, but when I do an archive build, upload to
TestFlight, and run on the device, the app fails to run. As far as I can tell,
the static libraries are being linked into the main app rather than into the
dynamic libraries that require them (i.e. firebase), although I may be
misdiagnosing that as I'm not an expert by any means.
I do have an example project but the issue is that to actually reproduce the
issue you have to archive it, upload to app store, and use testflight etc to
run it; I sometimes saw issues with release builds after running `flutter
build ios --release` but not always for some reason.
This is the error though, although not all that detailed as it's hard to debug
an app from testflight:
`Termination Description: DYLD, Symbol not found: _OBJC_CLASS_$_FIRApp |
Referenced from:
/private/var/containers/Bundle/Application/A9C35A12-B9A3-4F0C-A508-C0554CA2843A/Runner.app/Frameworks/firebase_messaging.framework/firebase_messaging
| Expected in: flat namespace | in
/private/var/containers/Bundle/Application/A9C35A12-B9A3-4F0C-A508-C0554CA2843A/Runner.app/Frameworks/firebase_messaging.framework/firebase_messaging`
What I'd like to propose is some way of forcing cocoapods to do what is needed
(i.e. link the libraries into the right place). I tried various things
including manually specifying the name of the dependency libraries in the
podspec of the framework including them, which did link them into the dynamic
framework but then resulted in duplicate definitions of classes and all sorts
of other fun issues. I'm sure it's possible but that I just couldn't figure it
out.
However, cocoapods does have a relatively new option introduced in 1.4.0 -
static_framework. To use it, you simply specify `static_framework = true` in
the podspec of the framework which includes a static library, and the
framework will compile statically, including the dependent libraries (it
should also theoretically remove the requirement for the
`s.pod_target_xcconfig = { 'FRAMEWORK_SEARCH_PATHS' .... 'OTHER_LDFLAGS' ... }`
currently in firebase_messaging.podspec, and I assume in the other plugins as
well).
There are a couple of caveats to this - the first being that CocoaPods
currently has a bug that means public_headers which are symlinked from outside
the project (as flutter has recently started doing) get included as `Project
Headers` instead of `Public Headers` at least for when static_framework is
used (it might pop up other places too). I've submitted a PR to cocoapods to
resolve that ~~but still need to write some unit tests for them (yay, I get to
learn more ruby 😒)~~ _(edit: it has now been accepted into master but no idea
about release date)_. Also, FYI - the trunk of cocoapods seems to not work
with flutter (I don't currently have time to track down why not) so the next
release that includes this fix could theoretically fail on that. It will also
require people to update to cocoapods 1.4.0 if they haven't already.
Sorry for the long-windedness, but I think this is something worth considering
especially now that more developers will hopefully be coming on board since
the beta release - and I'd bet that a lot of them will want to use both swift
and the various plugins flutter provides for firebase etc.
|
10-03 11:29:10.612 29886 29912 E flutter : [ERROR:../../lib/tonic/logging/dart_error.cc(16)] Unhandled exception:
10-03 11:29:10.612 29886 29912 E flutter : The null object does not have a method 'inheritFromWidgetOfExactType'.
10-03 11:29:10.612 29886 29912 E flutter :
10-03 11:29:10.612 29886 29912 E flutter : NoSuchMethodError: method not found: 'inheritFromWidgetOfExactType'
10-03 11:29:10.612 29886 29912 E flutter : Receiver: null
10-03 11:29:10.612 29886 29912 E flutter : Arguments: [Type: class 'FormScope']
10-03 11:29:10.612 29886 29912 E flutter : #0 Object._noSuchMethod (dart:core-patch/object_patch.dart)
10-03 11:29:10.612 29886 29912 E flutter : #1 Object.noSuchMethod (dart:core-patch/object_patch.dart)
10-03 11:29:10.612 29886 29912 E flutter : #2 _FormFieldData.onChanged (package:flutter/src/material/input.dart)
10-03 11:29:10.612 29886 29912 E flutter : #3 RawInputLineState._handleTextUpdated (package:flutter/src/widgets/editable.dart)
10-03 11:29:10.612 29886 29912 E flutter : #4 _KeyboardClientImpl.updateEditingState (package:flutter/src/widgets/editable.dart)
10-03 11:29:10.612 29886 29912 E flutter : #5 _KeyboardClientStubControl.handleMessage (package:sky_services/editing/editing.mojom.dart)
10-03 11:29:10.612 29886 29912 E flutter : #6 StubMessageHandler.handleRead (package:mojo/src/stub.dart)
10-03 11:29:10.612 29886 29912 E flutter : #7 MojoEventHandler._handleEvent (package:mojo/src/event_handler.dart)
10-03 11:29:10.612 29886 29912 E flutter : #8 MojoEventHandler._tryHandleEvent (package:mojo/src/event_handler.dart)
10-03 11:29:10.612 29886 29912 E flutter : #9 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart)
| 0 |
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
* template expansion
##### ANSIBLE VERSION
ansible 2.2.1.0
config file =
configured module search path = Default w/o overrides
##### CONFIGURATION
N/A
##### OS / ENVIRONMENT
Ubuntu 16.04. ansible installed via pip.
$ pip list --format=legacy
ansible (2.2.1.0)
cffi (1.9.1)
configobj (5.0.6)
crit (0.0.1)
cryptography (1.7.1)
dynagen (0.9.0)
enum34 (1.1.6)
idna (2.2)
ipaddr (2.1.11)
ipaddress (1.0.18)
Jinja2 (2.9.4)
MarkupSafe (0.23)
paramiko (2.1.1)
pip (9.0.1)
protobuf (3.1.0.post1)
pyasn1 (0.1.9)
pycparser (2.17)
pycrypto (2.6.1)
python-apt (1.1.0b1)
PyYAML (3.12)
setuptools (33.1.1)
six (1.10.0)
wheel (0.29.0)
##### SUMMARY
When one template _imports_ another one, and then _includes_ another template,
it barfs with "KeyError: 'undefined variable: 0'"
##### STEPS TO REPRODUCE
# test.yml
- hosts: localhost
tasks:
- template:
src: foo
dest: /tmp/foo
# templates/foo
{% import 'qux' as qux with context %}
hello world
{{ qux.wibble }}
{% include 'bar' %}
# templates/bar
Goodbye
# templates/qux
{% set wibble = "WIBBLE" %}
##### EXPECTED RESULTS
/tmp/foo to be created with:
hello world
WIBBLE
Goodbye
##### ACTUAL RESULTS
TASK [template] ****************************************************************
task path: /home/ubuntu/bug/test.yml:3
fatal: [localhost]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_args": {
"dest": "/tmp/foo",
"src": "foo"
},
"module_name": "template"
},
"msg": "KeyError: 'undefined variable: 0'"
}
##### ADDITIONAL INFO
If you change `templates/foo` to this, it works fine:
hello world
{% include 'bar' %}
And so does this:
{% import 'qux' as qux with context %}
hello world
{{ qux.wibble }}
So it appears to be something to do with the _combination_ of import and
include which is causing it to barf.
However if you remove the "with context", like this:
{% import 'qux' as qux %}
hello world
{{ qux.wibble }}
then it barfs in a different way:
TASK [template] ****************************************************************
task path: /home/ubuntu/bug/test.yml:3
fatal: [localhost]: FAILED! => {
"changed": false,
"failed": true,
"invocation": {
"module_args": {
"dest": "/tmp/foo",
"src": "foo"
},
"module_name": "template"
},
"msg": "AttributeError: 'NoneType' object has no attribute 'add_locals'"
}
So there is a potentially separate issue, that 'import ... with context'
works, but 'import' by itself does not, in Ansible.
The difference is that in the latter case, jinja2 will cache the import with
the template (since it doesn't depend on any of the dynamic context in that
particular expansion of the template)
Finally: this appears not to be a bug with jinja2 itself. Using jinja2 API
directly runs the test case with no problem:
#!/usr/bin/python
from jinja2 import Environment, FileSystemLoader
env = Environment(
loader=FileSystemLoader('templates'),
)
template = env.get_template('foo')
print(template.render())
gives:
$ python test.py
hello world
WIBBLE
Goodbye
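The standalone reproduction above can also be made fully self-contained (no
templates/ directory on disk) with jinja2's DictLoader. This is an
illustrative sketch, not part of the original report:

```python
from jinja2 import Environment, DictLoader

# In-memory equivalents of templates/foo, templates/bar and templates/qux
env = Environment(loader=DictLoader({
    "foo": (
        "{% import 'qux' as qux with context %}\n"
        "hello world\n"
        "{{ qux.wibble }}\n"
        "{% include 'bar' %}"
    ),
    "bar": "Goodbye",
    "qux": "{% set wibble = 'WIBBLE' %}",
}))

out = env.get_template("foo").render()
print(out)
```

Rendering succeeds and produces the "hello world / WIBBLE / Goodbye" output,
confirming the combination of import-with-context and include is fine in
jinja2 itself.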
|
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
Template
##### ANSIBLE VERSION
ansible 2.3.0 (devel 0aad46c85b) last updated 2017/01/09 16:11:21 (GMT -400)
config file = /home/jadams/.ansible.cfg
configured module search path = Default w/o overrides
##### CONFIGURATION
Tested on vanilla / no extra configuration
##### OS / ENVIRONMENT
N/A but tested from Fedora 25
##### SUMMARY
When Jinja2 dependency is >= 2.9.0, nested templates inside of loops defined
in a parent template do not receive the local scope. For example:
{% for item in foo %}
{% include "other_template.j2" %}
{% endfor %}
In the above example, other_template.j2 would not have the item variable
defined.
##### STEPS TO REPRODUCE
# Example Playbook
- hosts: localhost
connection: local
vars:
foo:
- bar
- baz
tasks:
- name: Test Template
template: src=parent.j2 dest=/tmp/template.out
# parent.j2
{% for item in foo %}
{% include "subtemplate.j2" %}
{% endfor %}
#subtemplate.j2
{{ item }}
##### EXPECTED RESULTS
Expected the output of the following file:
bar
baz
##### ACTUAL RESULTS
TASK [Test Template] ************************************************************************************************************************************************************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "failed": true, "msg": "AnsibleUndefinedVariable: 'item' is undefined"}
I believe this is because of the Jinja2 Context changes mentioned here in
combination with Ansible overriding default Jinja2 functions. I wasn't able to
track down the root cause in my limited testing.
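For comparison, the same for-loop/include pattern works when rendered with
plain Jinja2 directly, which supports the theory that the regression is in
Ansible's overrides rather than in Jinja2 itself. A minimal sketch, not from
the original report:

```python
from jinja2 import Environment, DictLoader

# In-memory versions of parent.j2 / subtemplate.j2 from the report
env = Environment(loader=DictLoader({
    "parent.j2": "{% for item in foo %}{% include 'subtemplate.j2' %}\n{% endfor %}",
    "subtemplate.j2": "{{ item }}",
}))

out = env.get_template("parent.j2").render(foo=["bar", "baz"])
print(out)
```

Here the included template does see the loop variable `item`, so both "bar"
and "baz" appear in the output.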
| 1 |
In [1]: import pandas as pd
In [2]: df = pd.DataFrame([[1, 0, 0],[0,0,0]])
# non sparse
In [19]: %time df.loc[0]
CPU times: user 367 µs, sys: 14 µs, total: 381 µs
Wall time: 374 µs
Out[19]:
0 1
1 0
2 0
Name: 0, dtype: int64
In [20]: %time df.loc[[0]]
CPU times: user 808 µs, sys: 57 µs, total: 865 µs
Wall time: 827 µs
Out[20]:
0 1 2
0 1 0 0
In [3]: df = df.to_sparse()
# sparse
In [5]: %time df.loc[0]
CPU times: user 429 µs, sys: 14 µs, total: 443 µs
Wall time: 436 µs
Out[5]:
0 1.0
1 0.0
2 0.0
Name: 0, dtype: float64
BlockIndex
Block locations: array([0], dtype=int32)
Block lengths: array([3], dtype=int32)
In [6]: %time df.loc[[0]]
CPU times: user 4.07 ms, sys: 406 µs, total: 4.48 ms
Wall time: 6.16 ms
Out[6]:
0 1 2
0 1.0 0.0 0.0
It seems that with a SparseDataFrame, selecting rows as a DataFrame takes much
longer than it does with a dense DataFrame; I would expect it to run faster
(on my 1m x 500 SparseDataFrame the loc operation takes seconds).
## INSTALLED VERSIONS
commit: None
python: 3.5.1.final.0
python-bits: 64
OS: Darwin
OS-release: 15.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: en_GB.UTF-8
LANG: en_GB.UTF-8
pandas: 0.18.1
nose: None
pip: 8.1.2
setuptools: 24.0.2
Cython: None
numpy: 1.11.0
scipy: 0.17.1
statsmodels: None
xarray: None
IPython: 5.0.0
sphinx: None
patsy: None
dateutil: 2.5.3
pytz: 2016.4
blosc: None
bottleneck: None
tables: None
numexpr: 2.6.1
matplotlib: 1.5.1
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: None
httplib2: None
apiclient: None
sqlalchemy: 1.0.13
pymysql: None
psycopg2: 2.6.1 (dt dec pq3 ext lo64)
jinja2: None
boto: None
pandas_datareader: None
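As a rough sanity check of the row selections above with a current API
(SparseDataFrame and to_sparse were later removed in pandas 1.0 in favor of
sparse dtypes), a hypothetical sketch:

```python
import numpy as np
import pandas as pd

# Dense frame and a sparse-dtype equivalent (modern replacement for to_sparse)
df = pd.DataFrame(np.zeros((1000, 500)))
sdf = df.astype(pd.SparseDtype(float, fill_value=0.0))

row_as_series = sdf.loc[0]     # single row as a Series (like df.loc[0])
row_as_frame = sdf.loc[[0]]    # single row as a DataFrame (like df.loc[[0]])
print(row_as_series.shape, row_as_frame.shape)
```

Both forms of the lookup from the report still work against sparse-dtype
columns, so the same %time comparison can be reproduced on current pandas.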
|
#### Code Sample
import timeit
import numpy as np
import pandas as pd
df = pd.DataFrame(np.zeros((10000, 10000)))
random_fill_df(df, num_elements=20)  # helper that sets a few random cells non-zero
df = df.to_sparse(fill_value=0)
timeit.timeit('df.loc[[23, 45, 65, 67], :]', globals=globals(), number=10)
#### Problem description
The reason why row slicing takes so long is because a sparse dataframe a bunch
of sparse series. Column slicing is several order of magnitude faster but row
slicing is very poor. The sparse dataframe doesn't take advantage of the scipy
sparse matrix library which is even faster (both column and row).
#### Expected Output
In case data is stored as a scipy sparse matrix (as well) inside dataframe
object, the slicing operations can be improved, by several orders of
magnitude.
I propose that data be stored as a sparse matrix as well in the dataframe
object.
#### Output of `pd.show_versions()`
pandas: 0.20.3
pytest: None
pip: 9.0.1
setuptools: 36.2.0
Cython: None
numpy: 1.13.1
scipy: 0.19.1
xarray: None
IPython: None
sphinx: None
patsy: None
dateutil: 2.6.1
pytz: 2017.2
blosc: None
bottleneck: 1.2.1
tables: None
numexpr: 2.6.2
feather: None
matplotlib: 2.0.2
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: 4.6.0
html5lib: 0.999999999
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: None
s3fs: None
pandas_gbq: None
pandas_datareader: None
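The proposal can be illustrated with scipy directly: a CSR matrix supports
cheap row slicing, which is the representation the author suggests backing the
DataFrame with. A sketch assuming scipy is available (the matrix here is
illustrative, not taken from pandas internals):

```python
import scipy.sparse as sp

# A large, very sparse matrix in CSR form: row slicing is cheap because
# each row's non-zero entries are stored contiguously.
mat = sp.random(10000, 10000, density=2e-7, format="csr", random_state=0)
rows = mat[[23, 45, 65, 67], :]   # fancy row indexing on CSR
print(rows.shape)                 # (4, 10000)
```

The same slice on a column-oriented SparseDataFrame has to touch every sparse
column, which is where the orders-of-magnitude gap comes from.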
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.1
* Operating System version: windows7
* Java version: 1.8
* Dubbo Spring boot Starter: 2.7.0
### Steps to reproduce this issue
dubbo:
protocol:
port: 20880
registry:
address: N/A
check: false
consumer:
check: false
AppController:
@Reference(check = false)
private UserService userService;
### exception:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'appController': Injection of @org.apache.dubbo.config.annotation.Reference dependencies is failed; nested exception is java.lang.IllegalStateException: No such any registry to reference com.epipe.ucloud.crm.service.sys.UserService on the consumer 192.168.3.177 use dubbo version 2.7.1, please config <dubbo:registry address="..." /> to your spring config.
`check=false` doesn't take effect
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.4.1
* Operating System version: macos, linux
* Java version: 1.8
### Background
This issue doesn't arise from a normal usage scenario... I'm building a load-testing tool for dubbo and need to reload dubbo interface classes without the JVM exiting.
For example: GreetingService
public interface GreetingService {
String sayHello(String name);
HelloResponse sayHello2(String name);
}
### Observed behavior
On the first call, `HelloResponse` is returned normally; the class loader is pea.app.compiler.ReloadableClassLoader
(this is my custom class loader; a new instance is created for every call).
Subsequent calls then throw an exception:
[error] p.d.a.DubboAction - failure: ==========> java.lang.ClassCastException: pea.example.ext.dubbo.response.HelloResponse cannot be cast to pea.example.ext.dubbo.response.HelloResponse
at org.apache.dubbo.common.bytecode.proxy1.sayHello2(proxy1.java)
at pea.example.dubbo.GreetingSimulation.$anonfun$scn$1(GreetingSimulation.scala:23)
at pea.dubbo.action.DubboAction.$anonfun$execute$1(DubboAction.scala:43)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:658)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
### Question
Roughly how does this problem arise? Why is the first call fine? Is the `HelloResponse` returned by the proxy class cached somewhere, and if so, how can I clear it?
| 0 |
* VSCode Version: 0.10.11
* OS Version: Ubuntu 14.04
Steps to Reproduce:
1. Export a variable in your default shell (`~/.zlogin` for zsh; `~/.bash_login` for bash, for example. You can use `.bashrc` as well).
export FOOBAR="1"
2. Pass different value of the same variable to VSCode.
FOOBAR=2 code .
3. VSCode uses value defined for login shell, not current session:
Help > Toggle Developer Tools. And in Console, `process.env.FOOBAR` gives "1"
when I expect "2".
It looks like VSCode executes default shell as login shell again before
launching the program UI.
Being able to pass different values for existing environment variables greatly
helps writing/testing extensions.
Related discussion: microsoft/vscode-go#220 (comment)
###### strace example:
$ strace -E GOPATH=1 -E FOOBAR=2 -s 1000000 -e verbose=execve -e abbrev=none ./.build/electron/electron .
execve("./.build/electron/electron", ["./.build/electron/electron", "."], [... "GOPATH=1", ...]) # What I expected
...
read(62, "...\nGOPATH=/home/saml/go\n....") # What is this?
...
|
Per microsoft/vscode-go#141, there appears to still be an issue with how env
vars are passed into Code, even after the fix for #560.
Steps to reproduce:
1. have `export FOO=1` in your `.bash_profile`
2. ensure your function to start visual studio code looks like this `code () { VSCODE_CWD="$PWD" open -n -b "com.microsoft.VSCode" --args $* ;}`
3. open a terminal session
4. `export FOO=2` in your terminal session
5. open `code .`
6. open developer tools, and eval `process.env["FOO"]`
Expected: `Foo` should be `2`
Result: `FOO` will be `1`
But the thing that's really odd is that if you skip step (1) (remove this from
you `.bash_profile`), you will see `Foo` set to `2`. So in that case, the
local environment is getting passed in.
It seems like this must be either a problem with how vscode is setting up the
env vars, or how `open` in OS X behaves.
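The expected behavior can be stated precisely with a small process-spawning
sketch (stdlib only, not VSCode-specific): an explicit override in the
launching session should win over whatever the login shell exported.

```python
import os
import subprocess
import sys

# Simulate: the login shell exported FOO=1, the launching session overrides to 2.
env = dict(os.environ)
env["FOO"] = "2"

# A well-behaved launcher passes the session environment straight through,
# so the child sees the overridden value.
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['FOO'])"],
    env=env, capture_output=True, text=True,
).stdout.strip()
print(out)  # 2
```

The bug reports above amount to the launcher re-running the login shell and
letting its values clobber this session-level override.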
| 1 |
I've had a few conversations on IRC about using `#![no_std]` in Rust
libraries. There are comments around (such as here in html5ever) that using
`#![no_std]` should be used if one wants to write a Rust library that is going
to be used from other languages. It's not really clear to me how accurate that
recommendation still is currently / will be at 1.0 with the changes to std
that have happened over the past months; however, assuming it's still true, I
think there are a couple of problems that don't seem to have easy solutions:
* It's really easy to accidentally pull in std by using another crate that links std (html5ever currently has this problem - servo/html5ever#47).
* According to the unsafe docs, a crate using `#![no_std]` needs to define a few lang_items like `stack_exhausted`. If two independently-developed crates both define the lang items, you get duplicate name errors when trying to use both simultaneously.
|
It would be great to be able to do something like this:
`type A = { a: int }; type B = {b: int} herit A;`
So in fact `B = {a: int, b: int};`
It could maybe replace an object model when combined with interfaces.
| 0 |
One of the pages (Bonfire: Seek and Destroy) is completely "frozen" for me. I
mean I can move the mouse, but can't click on anything, can't reset or edit my
code or do anything, and if I close the tab and return later I still see my
old code and the page is frozen.
|
Challenge http://freecodecamp.com/challenges/bonfire-reverse-a-string has an
issue. Please describe how to reproduce it, and include links to screenshots
if possible.
Cannot reset code, page is frozen on me.
| 1 |