text1 stringlengths 2 269k | text2 stringlengths 2 242k | label int64 0 1 |
|---|---|---|
**Migrated issue, originally created by Sean Mars (@seanmars)**
I get "TypeError: not enough arguments for format string" when I execute
a statement compiled with mysql.dialect().
But when I execute the compiled statement without mysql.dialect(), it works fine.
Env:
* MySQL 5.6
* Python 3.5
* PyMySQL==0.7.9
* SQLAlchemy==1.1.6
from sqlalchemy import create_engine
from sqlalchemy import Table, MetaData, text
from sqlalchemy.dialects import mysql
# The score table has just two fields: id (varchar(20)) and value (int)
engine = create_engine('mysql+pymysql://root:root@127.0.0.1/score?charset=utf8mb4')
conn = engine.connect()
meta = MetaData()
table = Table('score', meta, autoload=True, autoload_with=conn)
id = 1
value = 100
ins = table.insert().values(id=id, value=value).compile(bind=conn, dialect=mysql.dialect())
conn.execute(ins)
Error log:
Traceback (most recent call last):
File "/test/db/score.py", line 33, in insert
conn.execute(ins)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 945, in execute
return meth(self, multiparams, params)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/sql/compiler.py", line 227, in _execute_on_connection
return connection._execute_compiled(self, multiparams, params)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1075, in _execute_compiled
compiled, parameters
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
context)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1396, in _handle_dbapi_exception
util.reraise(*exc_info)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/util/compat.py", line 187, in reraise
raise value
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
context)
File "/test/venv/lib/python3.5/site-packages/sqlalchemy/engine/default.py", line 470, in do_execute
cursor.execute(statement, parameters)
File "/test/venv/lib/python3.5/site-packages/pymysql/cursors.py", line 164, in execute
query = self.mogrify(query, args)
File "/test/venv/lib/python3.5/site-packages/pymysql/cursors.py", line 143, in mogrify
query = query % self._escape_args(args, conn)
TypeError: not enough arguments for format string
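For reference, the exception itself comes from plain Python %-interpolation inside PyMySQL's `mogrify()` (the last frame of the trace). A minimal stdlib illustration, with a made-up query string, of why a `%s` placeholder with no matching argument raises exactly this TypeError:

```python
# PyMySQL's cursor.mogrify() renders the query with Python %-interpolation
# (query % args), so placeholders and arguments must line up exactly.
query = "INSERT INTO score (id, value) VALUES (%s, %s)"

try:
    query % ()          # what mogrify effectively does when args are missing
except TypeError as exc:
    print(exc)          # prints: not enough arguments for format string

# With the right number of arguments the interpolation succeeds:
print(query % ("'1'", "100"))
```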
|
**Migrated issue, originally created by Michael Bayer (@zzzeek)**
we detect the `delete()` of one object with the same PK as another which was
just `add()`ed, and convert what would be a DELETE and INSERT into an UPDATE.
The main reason for this was that we issue INSERT statements first, then
DELETEs, and there was no obvious way to get the unit of work to issue the
DELETE first for this one particular record.
The expected behavior of `add()` is INSERT. The table may have triggers and
other defaults that only trigger on INSERT and not UPDATE. Fields which are
undefined on the instance when it is added are expected to be their default
values in the DB, which is usually NULL. The current logic takes none of this
into account. I'm not sure what the current behavior is for related
objects/foreign keys.
The change here must be that the whole "convert INSERT/DELETE to an UPDATE"
must be removed. The unitofwork must issue the DELETE for any row switch
object before it issues the INSERTs for that mapper. The most obvious way to
do this would be to flip the order of DELETEs and INSERTs wholesale for the
UOW. Some experimentation is suggesting that this is much harder than it
sounds.
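To make the required ordering concrete, here is a minimal pure-Python sketch (not SQLAlchemy's actual unit of work; all names are invented) of issuing the DELETE for a row-switch primary key before the INSERTs for the same mapper:

```python
def order_flush(inserts, deletes):
    """Order pending statements so that a DELETE for any primary key that is
    also being INSERTed (a "row switch") runs before that INSERT.

    inserts/deletes are lists of primary-key values; returns a list of
    (operation, pk) tuples in execution order.
    """
    row_switch_pks = set(inserts) & set(deletes)
    ordered = []
    # DELETEs for row-switch rows must precede the INSERTs for that mapper.
    ordered += [("DELETE", pk) for pk in deletes if pk in row_switch_pks]
    ordered += [("INSERT", pk) for pk in inserts]
    ordered += [("DELETE", pk) for pk in deletes if pk not in row_switch_pks]
    return ordered

# A row with pk=1 is deleted and a new object with the same pk is added:
print(order_flush(inserts=[1, 2], deletes=[1, 3]))
# the DELETE for pk=1 comes before the INSERT for pk=1
```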
| 0 |
Challenge http://www.freecodecamp.com/challenges/waypoint-add-rounded-corners-with-a-border-radius has an issue. Please describe how to reproduce it, and include links to screenshots if possible.
|

Challenge http://www.freecodecamp.com/challenges/waypoint-add-different-margins-to-each-side-of-an-element has an issue. Please describe how to reproduce it, and include links to screenshots if possible.
| 1 |
We are experiencing an issue (since we upgraded from Celery 2.5 to 3.0) where
our workers mysteriously stop consuming messages from the queues they are
configured on.
It happens periodically on each worker node independently, and I cannot get to
the root of what is causing it.
`celery inspect -d appserver-i-82df38f8.prod active` always returns empty.
I did find something slightly interesting: the list of reserved tasks seems to
stay static for a long time. Since these tasks do not have an eta set, why
don't they execute? I can't think of any reason why they are not being
executed.
Does the reserved queue have any relationship to the prefetch queue? I.e., if
there are a lot of reserved tasks which do not execute, would that prevent the
node from retrieving more tasks from the server?
When looking at the RabbitMQ management interface I can see this node is
listed as a consumer.
I have collected some info from the node which may shine some light on this,
but am at a loss as to why this could happen. Could you provide some pointers
for things to look into?
Worker machine task inspections
https://gist.github.com/declanshanaghy/5103891
Worker machine report
https://gist.github.com/declanshanaghy/5103838
|
Hi,
I'm encountering a strange issue for which I set up a simple test case.
I'm using djcelery with RabbitMQ (3.0.4) as the broker. Here are the first
lines from `python manage.py celery report`:
software -> celery:3.0.17 (Chiastic Slide) kombu:2.5.9 py:2.7.3
billiard:2.7.3.26 librabbitmq:1.0.1
platform -> system:Linux arch:64bit, ELF imp:CPython
loader -> djcelery.loaders.DjangoLoader
settings -> transport:librabbitmq results:database
This is the code in the tasks.py module:

from celery import task
from celery.task.control import revoke
import memcache

mc = memcache.Client(['127.0.0.1:11211'], debug=0)

@task(name='tasks.homeioTestTask')
def testTask(feedUid):
    print "Within test task for feedUid : %s" % feedUid
    return

@task(name='tasks.homeioProcessEvent')
def processEvent(event):
    print "Got an event from the views"
    feedUid = event.get('feedUid')
    print "Event occurred for feed %s" % feedUid
    # Checking if we have a task id in memcached
    mcKey = str('taskHomeIoTest_feed_%s' % feedUid)
    taskId = mc.get(mcKey)
    if taskId:
        print "We have a task in the cache, revoke it"
        revoke(taskId)
        mc.delete(mcKey)
    else:
        print "Scheduling a task"
        task = testTask.apply_async((feedUid,), countdown=60)
        mc.set(mcKey, task.id, time=0)
    return
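The toggle logic in processEvent (store the scheduled task id, revoke it if a second event arrives) can be sketched without Celery or memcache. In this stand-alone illustration, a plain dict stands in for the memcache client, and the `scheduled`/`revoked` lists stand in for Celery's side effects; all names are invented:

```python
# Stand-alone sketch of the revoke-or-schedule toggle from the report above.
cache = {}       # stands in for memcache
scheduled = []   # stands in for testTask.apply_async(...)
revoked = []     # stands in for revoke(task_id)

def process_event(feed_uid):
    key = 'taskHomeIoTest_feed_%s' % feed_uid
    task_id = cache.get(key)
    if task_id:
        # A task is already pending for this feed: revoke it instead of
        # scheduling another one.
        revoked.append(task_id)
        del cache[key]
    else:
        # No pending task: schedule one and remember its id.
        task_id = 'task-%s' % feed_uid
        scheduled.append(task_id)
        cache[key] = task_id

process_event(42)   # schedules task-42 and caches its id
process_event(42)   # finds the cached id and revokes task-42
```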
This code is being called from my views.py file as follows:

import simplejson
from django.http import HttpResponse
from tasks import processEvent

def event(request):
    event = simplejson.loads(request.raw_post_data)
    processEvent.delay(event)
    return HttpResponse(status=200)
When I run celery using `python manage.py celery worker --loglevel=debug` I get:
https://gist.github.com/franckb/e6e5db51b41b74b3fbce (things go wrong around
line 257)
Looks like the tasks don't get to the workers anymore but there is no
indication that the workers died or that anything is going wrong.
I've tried several things while tracking down the issue, but I'm having a hard
time knowing where to look. Could anyone provide some guidance?
Thanks
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.4.1
* Operating System version: MacOS
* Java version: JDK1.8
### Steps to reproduce this issue
Quoting the earlier issue to continue the question about _**Not friendly with the YAML
config**_.
**You can use any dynamic config center as the Dubbo configuration provider (you
must use another format to write your configuration, such as YAML or XML), then
verify this issue.**
One example of "not friendly with the YAML config" can be found in the class
`org.apache.dubbo.config.AbstractInterfaceConfig` (version 2.7.4.1), lines 303
& 304:
Environment.getInstance().updateExternalConfigurationMap(parseProperties(configContent));
Environment.getInstance().updateAppExternalConfigurationMap(parseProperties(appConfigContent));
You cannot expect everyone to use the "Properties" format to write their
configuration. This code cannot handle other formats such as YAML, so anyone
who uses another format, like me, has to customize their own protocol via SPI
to handle the configuration parsing process.
My suggestion is: do not hard-code the parsing with "parseProperties".
Alternatively, write a document explaining how to handle this situation for
anyone who uses another format, like me.
### Expected Result
Pull all config from the remote config center (such as Nacos or Apollo) and
**parse configurations in other formats successfully**.
### Actual Result
Every configuration is parsed as the "Properties" format, so parsing fails.
The configuration parsing process is unreasonable: the code only accepts the
"Properties" format.
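The suggestion amounts to dispatching on the configuration's format instead of hard-coding a properties parser. A small Python sketch of that idea (the registry, function names, and example key are all invented here, not Dubbo's API):

```python
# Hypothetical format-dispatching parser registry; only the shape matters.
PARSERS = {}

def register(fmt):
    """Register a parser function for a configuration format name."""
    def deco(fn):
        PARSERS[fmt] = fn
        return fn
    return deco

@register("properties")
def parse_properties(content):
    # Minimal key=value parsing, skipping blanks and comments.
    pairs = (line.split("=", 1) for line in content.splitlines()
             if line.strip() and not line.lstrip().startswith("#"))
    return {k.strip(): v.strip() for k, v in pairs}

def parse_config(content, fmt="properties"):
    if fmt not in PARSERS:
        # An SPI hook could plug in YAML/XML parsers here instead of failing.
        raise ValueError("no parser registered for format: %s" % fmt)
    return PARSERS[fmt](content)

print(parse_config("dubbo.registry.address = nacos://127.0.0.1:8848"))
# {'dubbo.registry.address': 'nacos://127.0.0.1:8848'}
```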
|
* [x] I have searched the issues of this repository and believe that this is not a duplicate.
* [x] I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.6
* Operating System version: xxx
* Java version: 1.8
### Steps to reproduce this issue
1. This is my Maven config. It's not a Spring Boot project but Spring MVC;
versions 2.7.4.1 and 2.6.x are OK, and 2.7.6 in a Spring Boot project is OK.
<dependency>
    <groupId>org.apache.dubbo</groupId>
    <artifactId>dubbo</artifactId>
    <version>2.7.6</version>
    <exclusions>
        <exclusion>
            <artifactId>spring</artifactId>
            <groupId>org.springframework</groupId>
        </exclusion>
        <exclusion>
            <artifactId>servlet-api</artifactId>
            <groupId>javax.servlet</groupId>
        </exclusion>
        <exclusion>
            <artifactId>log4j</artifactId>
            <groupId>log4j</groupId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.javassist</groupId>
    <artifactId>javassist</artifactId>
    <version>3.20.0-GA</version>
</dependency>
<!-- dubbo end -->
<!-- https://mvnrepository.com/artifact/org.apache.zookeeper/zookeeper -->
<dependency>
    <groupId>org.apache.zookeeper</groupId>
    <artifactId>zookeeper</artifactId>
    <version>3.4.9</version>
</dependency>
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-framework</artifactId>
    <version>4.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.curator</groupId>
    <artifactId>curator-recipes</artifactId>
    <version>4.2.0</version>
</dependency>
2. I got "java.lang.StackOverflowError" when I use the method config, like this:
<dubbo:reference id="ierpWishProductService"
                 interface="com.mangoerp.product.api.service.wish.IErpWishProductService"
                 timeout="30000" retries="0" version="${dubbo.product.version}">
    <dubbo:method name="batchEditPic" async="true" return="false"/>
</dubbo:reference>
Exception trace:
at java.net.URLClassLoader.findResource(URLClassLoader.java:569)
at java.lang.ClassLoader.getResource(ClassLoader.java:1089)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1242)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1167)
at org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor.postProcessBeforeDestruction(PersistenceAnnotationBeanPostProcessor.java:375)
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:239)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:552)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:528)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:831)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:563)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:528)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:831)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:308)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:466)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:459)
at org.springframework.context.support.AbstractApplicationContext.getBeansOfType(AbstractApplicationContext.java:1065)
at org.springframework.beans.factory.BeanFactoryUtils.beansOfTypeIncludingAncestors(BeanFactoryUtils.java:228)
at org.apache.dubbo.config.spring.ReferenceBean.prepareDubboConfigBeans(ReferenceBean.java:86)
at org.apache.dubbo.config.spring.ReferenceBean.afterPropertiesSet(ReferenceBean.java:104)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1613)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1550)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:466)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:459)
at org.springframework.context.support.AbstractApplicationContext.getBeansOfType(AbstractApplicationContext.java:1065)
at org.springframework.beans.factory.BeanFactoryUtils.beansOfTypeIncludingAncestors(BeanFactoryUtils.java:228)
at org.apache.dubbo.config.spring.ReferenceBean.prepareDubboConfigBeans(ReferenceBean.java:86)
at org.apache.dubbo.config.spring.ReferenceBean.afterPropertiesSet(ReferenceBean.java:104)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1613)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1550)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:276)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:122)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:359)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:157)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1457)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1198)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:537)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
... (the cycle of frames above repeats verbatim until the StackOverflowError)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:276)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:122)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveManagedList(BeanDefinitionValueResolver.java:359)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:157)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1457)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1198)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:537)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
	... (remaining frames omitted: the two recursive cycles above, one re-entering bean creation via AbstractBeanFactory$1.getObject and one via BeanDefinitionValueResolver.resolveInnerBean, repeat verbatim for the rest of the trace)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:157)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1457)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1198)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:537)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
	... (the remaining frames repeat the cycle above verbatim for the rest of the trace)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
at com.alibaba.spring.util.BeanFactoryUtils.getBeans(BeanFactoryUtils.java:80)
at com.alibaba.spring.util.BeanFactoryUtils.getOptionalBean(BeanFactoryUtils.java:59)
at org.apache.dubbo.config.spring.extension.SpringExtensionFactory.getExtension(SpringExtensionFactory.java:69)
at org.apache.dubbo.common.extension.factory.AdaptiveExtensionFactory.getExtension(AdaptiveExtensionFactory.java:47)
at org.apache.dubbo.common.extension.ExtensionLoader.injectExtension(ExtensionLoader.java:657)
at org.apache.dubbo.common.extension.ExtensionLoader.createExtension(ExtensionLoader.java:614)
at org.apache.dubbo.common.extension.ExtensionLoader.getExtension(ExtensionLoader.java:405)
at org.apache.dubbo.rpc.model.ApplicationModel.getConfigManager(ApplicationModel.java:92)
at org.apache.dubbo.config.AbstractConfig.addIntoConfigManager(AbstractConfig.java:579)
at sun.reflect.GeneratedMethodAccessor54.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:349)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:300)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:133)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:407)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getTypeForFactoryBean(AbstractBeanFactory.java:1418)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:788)
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:541)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:387)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:354)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:348)
at org.springframework.context.support.AbstractApplicationContext.getBeanNamesForType(AbstractApplicationContext.java:1053)
at org.springframework.beans.factory.BeanFactoryUtils.beanNamesForTypeIncludingAncestors(BeanFactoryUtils.java:144)
| 0 |
I'm unable to link my GitHub account to FreeCodeCamp
(http://www.freecodecamp.com/fccbd1cb0b2). It gives me the message: Your
GitHub is already associated with another account. You may have accidentally
created a duplicate account. No worries, though. We can fix this real quick.
Please email us with your GitHub username: team@freecodecamp.com.
I sent an email to team@freecodecamp.com last week and have not yet heard
back.
|
CSS Flexbox: Using the Flex-direction Property to Make a Row
http://beta.freecodecamp.com/en/challenges/css-flexbox/using-the-flex-direction-property-to-make-a-row
The challenge passes without any input, i.e. without the correct/requested
solution, **because**
`display: flex;`
is equivalent to
display: flex;
flex-direction: row;
since `row` is the default value of `flex-direction`.
| 0 |
It seems like there is currently no option to use the email-only
authentication with the firebase_auth plugin.
|
It would be great to have an implementation for passwordless sign-in. It's an
option in Firebase but not in the plugin. Can we add a method and documentation
for this?
| 1 |
If I run VSCode as **root** from bash on Ubuntu/Xubuntu/Lubuntu/Kubuntu it
doesn't respond on startup.
This problem occurred on fresh installs, in VMs, on both of my devices.
However running a quick `chown -R root:root /` fixed the problem (It is a
local VM, so whatever).
[Disclaimer: Know what you do, do not copy paste that into your terminal]
Anyone else experienced the same on a Ubuntu based distro lately? On
Debian/Arch it works just fine.
BTW: I couldn't install the OmniSharp plugin for Atom for the exact same
reason.
|
jay@jay-virtual-machine:~/Desktop$ sudo code
bash: cannot set terminal process group (-1): Inappropriate ioctl for device
bash: no job control in this shell
ATTENTION: default value of option force_s3tc_enable overridden by
environment.
[VS Code]: detected unresponsive
[VS Code]: detected unresponsive
jay@jay-virtual-machine:~/Desktop$ events.js:141
throw er; // Unhandled 'error' event
^
Error: channel closed
at process.target.send (internal/child_process.js:509:16)
at
/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:20:9562
at t
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:11:18479)
at Object.callOnRemote
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:11:18413)
at Object.onUnexpectedPluginHostError
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:4:9764)
at e.unexpectedErrorHandler
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:19:17071)
at e.onUnexpectedError
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:7:17878)
at Object.u [as onUnexpectedError]
at process.<anonymous>
(/usr/local/vscode/resources/app/out/vs/workbench/node/pluginHostProcess.js:20:9853)
at emitOne (events.js:82:20)
| 1 |
* Electron Version: 1.8.7, 2.0.2 and 3.0.0-beta.4
* Operating System (Platform and Version): Ubuntu Linux 18.04
* Last known working Electron version: ?
**Expected Behavior**
`cursor: pointer` should work regardless of whether the BrowserWindow was
created with `frame: false`.
**Actual behavior**
`cursor: pointer` stops working at about 76% of the window height when using
`frame: false`.
**To Reproduce**
You can use Electron Fiddle to load the repro:
https://gist.github.com/joaomoreno/2bb9879aee864f22704882b5fac825c3
**Additional Information**
From microsoft/vscode#53870
|
* Electron version: 1.6.2
* Operating system: OSX
### Expected behavior
I should be able to use the `--trace-warnings` flag in node by passing it like
this:
app.commandLine.appendSwitch('trace-warnings')
This enables stack traces for unhandled promise rejections. Right now, node
will print "possibly unhandled promise rejection: " but not print a stack
trace, which is incredibly frustrating.
### Actual behavior
Passing the flag to node enables stack traces for unhandled promise warnings.
### How to reproduce
Run this code within `index.js` of the electron app:
new Promise((resolve, reject) => reject(new Error("foo")))
| 0 |
I am not sure whether supporting IE is a priority, but I have noticed that
every item in the numbered list on the map page shows up as "1" in IE11.

|
I am using Internet Explorer 11, and all the options in the map and the text
summaries of the videos (I have a limited cap from my ISP, so I am unable to
watch these) all have the number 1 next to them. Not sure if this is the way
things were set up.
Sorry if this has been mentioned before.
| 1 |
Hello,
I'm trying to reproduce the first example here, but the first and last rows
are cut in half:

I'm using seaborn 0.10.0.
|
With `seaborn 0.9.0` and `matplotlib 3.1.1`, the topmost and bottommost row of
boxes in a `seaborn` plot are partially cut off:
import seaborn as sns
import numpy as np
np.random.seed(42)
sns.heatmap(np.random.random((10, 10)))

As another example, consider this demo from the gallery:
import matplotlib.pyplot as plt
import seaborn as sns
sns.set()
# Load the example flights dataset and convert to long-form
flights_long = sns.load_dataset("flights")
flights = flights_long.pivot("month", "year", "passengers")
# Draw a heatmap with the numeric values in each cell
f, ax = plt.subplots(figsize=(9, 6))
sns.heatmap(flights, annot=True, fmt="d", linewidths=.5, ax=ax)

This is a pretty major visual issue. Apologies if this has been reported
before, but I couldn't find a record of such in the GH issues.
| 1 |
## Checklist
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
Running from commit `21baef5`. Report cannot be run since nothing is able to
start.
## Steps to reproduce
Create a minimal worker, but use a broker/backend URI that specifies a
failover set, as described in the documentation. The hostnames do not matter,
since no connection attempt will be made:
from celery import Celery
c = Celery(
__name__,
broker="sentinel://host1:8080;sentinel://host2:8080;sentinel://host3:8080",
backend="sentinel://host1:8080;sentinel://host2:8080;sentinel://host3:8080"
)
@c.task
def add(x, y):
return x + y
Run as normal with `celery -A minimal_celery worker`
## Expected behavior
Worker loads and starts as normal
## Actual behavior
Celery banner formatter attempts to parse the string as a single URI and
cannot extract the port number
[2018-10-16 17:47:25,407: CRITICAL/MainProcess] Unrecoverable error: ValueError("invalid literal for int() with base 10: '8080;sentinel:'",)
Traceback (most recent call last):
File "ENV/celery/worker/worker.py", line 205, in start
self.blueprint.start(self)
File "ENV/celery/bootsteps.py", line 115, in start
self.on_start()
File "ENV/celery/apps/worker.py", line 139, in on_start
self.emit_banner()
File "ENV/celery/apps/worker.py", line 154, in emit_banner
' \n', self.startup_info(artlines=not use_image))),
File "ENV/celery/apps/worker.py", line 217, in startup_info
results=self.app.backend.as_uri(),
File "ENV/celery/backends/base.py", line 135, in as_uri
url = maybe_sanitize_url(self.url or '')
File "ENV/kombu/utils/url.py", line 92, in maybe_sanitize_url
return sanitize_url(url, mask)
File "ENV/kombu/utils/url.py", line 85, in sanitize_url
return as_url(*_parse_url(url), sanitize=True, mask=mask)
File "ENV/kombu/utils/url.py", line 52, in url_to_parts
parts.port,
File "/usr/lib64/python3.6/urllib/parse.py", line 169, in port
port = int(port, 10)
ValueError: invalid literal for int() with base 10: '8080;sentinel:'
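The root cause is visible with the standard library alone. A minimal sketch (hostnames hypothetical, matching the report): a `;`-separated failover string is not itself a valid URI, even though each part is.

```python
from urllib.parse import urlparse

# Hypothetical two-host failover string, mirroring the one in the report.
raw = "sentinel://host1:8080;sentinel://host2:8080"

# Parsing the whole string as one URI fails exactly as in the traceback:
# the netloc becomes "host1:8080;sentinel:" and .port cannot be int()-ed.
try:
    urlparse(raw).port
    failed = False
except ValueError:
    failed = True
assert failed

# Splitting on ';' first yields individually valid URIs.
ports = [urlparse(u).port for u in raw.split(";")]
assert ports == [8080, 8080]
```

So the banner code would need to split on `;` before sanitizing, rather than passing the whole string to the URL parser.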
|
# Checklist
* I have verified that the issue exists against the `master` branch of Celery.
* This has already been asked to the discussion group first.
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
* I have checked the pull requests list
for existing proposed fixes.
* I have checked the commit log
to find out if the bug was already fixed in the master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
* I have included the contents of `pip freeze` in the issue.
* I have included all the versions of all the external dependencies required
to reproduce this bug.
## Optional Debugging Information
* I have tried reproducing the issue on more than one Python version
and/or implementation.
* I have tried reproducing the issue on more than one message broker and/or
result backend.
* I have tried reproducing the issue on more than one version of the message
broker and/or result backend.
* I have tried reproducing the issue on more than one operating system.
* I have tried reproducing the issue on more than one workers pool.
* I have tried reproducing the issue with autoscaling, retries,
ETA/Countdown & rate limits disabled.
* I have tried reproducing the issue after downgrading
and/or upgrading Celery and its dependencies.
## Related Issues and Possible Duplicates
#### Related Issues
* None
#### Possible Duplicates
* None
## Environment & Settings
**Celery version** :
**`celery report` Output:**
# Steps to Reproduce
## Required Dependencies
* **Minimal Python Version** : N/A or Unknown
* **Minimal Celery Version** : N/A or Unknown
* **Minimal Kombu Version** : N/A or Unknown
* **Minimal Broker Version** : N/A or Unknown
* **Minimal Result Backend Version** : N/A or Unknown
* **Minimal OS and/or Kernel Version** : N/A or Unknown
* **Minimal Broker Client Version** : N/A or Unknown
* **Minimal Result Backend Client Version** : N/A or Unknown
### Python Packages
**`pip freeze` Output:**
### Other Dependencies
N/A
## Minimally Reproducible Test Case
# Expected Behavior
# Actual Behavior
| 0 |
Received the following error while compiling a test application in Rust. If
this looks like a genuine error, I can provide source, stack traces, etc. as
needed.
/home/uuser/Tools/rust/rust_dev/src/libstd/sync/mutex.rs:177:37: 180:2 error: internal compiler error: debuginfo: Could not find scope info for node NodeExpr(Expr { id: 4974, node: ExprStruct(Path { span: Span { lo: BytePos(4553404), hi: BytePos(4553415), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: StaticMutex#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }, [Field { ident: Spanned { node: lock#0, span: Span { lo: BytePos(1770102), hi: BytePos(1770106), expn_id: ExpnId(4294967295) } }, expr: Expr { id: 4975, node: ExprPath(None, Path { span: Span { lo: BytePos(4553428), hi: BytePos(4553443), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: sys#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }, PathSegment { identifier: MUTEX_INIT#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }), span: Span { lo: BytePos(4553428), hi: BytePos(4553443), expn_id: ExpnId(4294967295) } }, span: Span { lo: BytePos(4553422), hi: BytePos(4553443), expn_id: ExpnId(4294967295) } }, Field { ident: Spanned { node: poison#0, span: Span { lo: BytePos(1770129), hi: BytePos(1770135), expn_id: ExpnId(4294967295) } }, expr: Expr { id: 4976, node: ExprPath(None, Path { span: Span { lo: BytePos(4553457), hi: BytePos(4553474), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: poison#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }, PathSegment { identifier: FLAG_INIT#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }), span: Span { lo: BytePos(4553457), hi: BytePos(4553474), expn_id: ExpnId(4294967295) } }, span: Span { lo: BytePos(4553449), hi: BytePos(4553474), expn_id: 
ExpnId(4294967295) } }], None), span: Span { lo: BytePos(4553404), hi: BytePos(4553477), expn_id: ExpnId(4294967295) } })
note: the compiler unexpectedly panicked. this is a bug.
note: we would appreciate a bug report: https://github.com/rust-lang/rust/blob/master/CONTRIBUTING.md#bug-reports
note: run with `RUST_BACKTRACE=1` for a backtrace
thread 'rustc' panicked at 'Box<Any>', /home/uuser/Tools/rust/rust_dev/src/libsyntax/diagnostic.rs:129
|
$ RUST_BACKTRACE=1 cargo build --verbose
Fresh gcc v0.2.1
Fresh pkg-config v0.2.1
Compiling openssl-sys v0.4.0 (file:///tmp/rust-openssl/openssl-sys)
Running `rustc src/lib.rs --crate-name openssl-sys --crate-type lib -g -C metadata=dd0e973e71d408a3 -C extra-filename=-dd0e973e71d408a3 --out-dir /tmp/rust-openssl/openssl-sys/target --emit=dep-info,link -L dependency=/tmp/rust-openssl/openssl-sys/target -L dependency=/tmp/rust-openssl/openssl-sys/target/deps -L native=/usr/lib64 -L native=/tmp/rust-openssl/openssl-sys/target/build/openssl-sys-dd0e973e71d408a3/out -l ssl -l crypto -l static=old_openssl_shim`
src/lib.rs:1:1: 1:1 error: internal compiler error: debuginfo: Could not find scope info for node NodeExpr(Expr { id: 5326, node: ExprStruct(Path { span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: StaticMutex#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }, [Field { ident: Spanned { node: lock#0, span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }, expr: Expr { id: 5327, node: ExprPath(Path { span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: sys#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }, PathSegment { identifier: MUTEX_INIT#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }), span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }, span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }, Field { ident: Spanned { node: poison#0, span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }, expr: Expr { id: 5328, node: ExprPath(Path { span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) }, global: false, segments: [PathSegment { identifier: poison#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }, PathSegment { identifier: FLAG_INIT#0, parameters: AngleBracketedParameters(AngleBracketedParameterData { lifetimes: [], types: [], bindings: [] }) }] }), span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }, span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } }], None), span: Span { lo: BytePos(0), hi: BytePos(0), expn_id: ExpnId(4294967295) } })
src/lib.rs:1 #![allow(non_camel_case_types, non_upper_case_globals, non_snake_case)]
^
note: the compiler unexpectedly panicked. this is a bug.
note: we would appreciate a bug report: http://doc.rust-lang.org/complement-bugreport.html
note: run with `RUST_BACKTRACE=1` for a backtrace
thread 'rustc' panicked at 'Box<Any>', /mnt/trash1/tmp/portage/dev-lang/rust-9999/work/rust-9999/src/libsyntax/diagnostic.rs:129
stack backtrace:
1: 0x7fa97995e8e0 - sys::backtrace::write::he6b1641fd079577cfmB
2: 0x7fa979984760 - <unknown>
3: 0x7fa9798d60f0 - rt::unwind::begin_unwind_inner::h634afd378febd66detK
4: 0x7fa976e432b0 - <unknown>
5: 0x7fa976e43240 - diagnostic::SpanHandler::span_bug::h189abf680a7a28f874E
6: 0x7fa97793d500 - session::Session::span_bug::h720ba33852c899b7rSs
7: 0x7fa9787a8a10 - <unknown>
8: 0x7fa9786d9700 - <unknown>
9: 0x7fa978698f50 - <unknown>
10: 0x7fa978698f50 - <unknown>
11: 0x7fa9786999b0 - <unknown>
12: 0x7fa978756fc0 - <unknown>
13: 0x7fa97870d900 - <unknown>
14: 0x7fa97873aef0 - <unknown>
15: 0x7fa9786d9d30 - <unknown>
16: 0x7fa97869a200 - <unknown>
17: 0x7fa9786cea00 - <unknown>
18: 0x7fa9786d50f0 - <unknown>
19: 0x7fa9786dbb30 - <unknown>
20: 0x7fa9786db080 - <unknown>
21: 0x7fa97869a200 - <unknown>
22: 0x7fa9786cea00 - <unknown>
23: 0x7fa9786d50f0 - <unknown>
24: 0x7fa9786dbb30 - <unknown>
25: 0x7fa9786db080 - <unknown>
26: 0x7fa97869a200 - <unknown>
27: 0x7fa9786cea00 - <unknown>
28: 0x7fa9786d3a90 - <unknown>
29: 0x7fa9786dbb30 - <unknown>
30: 0x7fa978698f50 - <unknown>
31: 0x7fa978787d40 - <unknown>
32: 0x7fa978698830 - <unknown>
33: 0x7fa9786999b0 - <unknown>
34: 0x7fa9786dbb30 - <unknown>
35: 0x7fa978698f50 - <unknown>
36: 0x7fa9786999b0 - <unknown>
37: 0x7fa978756fc0 - <unknown>
38: 0x7fa97870d900 - <unknown>
39: 0x7fa9786dbb30 - <unknown>
40: 0x7fa9786db080 - <unknown>
41: 0x7fa97869a200 - <unknown>
42: 0x7fa9786cea00 - <unknown>
43: 0x7fa9786d50f0 - <unknown>
44: 0x7fa9786dbb30 - <unknown>
45: 0x7fa978698f50 - <unknown>
46: 0x7fa9786999b0 - <unknown>
47: 0x7fa9786dbb30 - <unknown>
48: 0x7fa978698f50 - <unknown>
49: 0x7fa9786999b0 - <unknown>
50: 0x7fa978756fc0 - <unknown>
51: 0x7fa9786887a0 - <unknown>
52: 0x7fa978684210 - <unknown>
53: 0x7fa97875dd80 - trans::base::trans_crate::ha8556e563b91ec61fQv
54: 0x7fa979f5f250 - driver::phase_4_translate_to_llvm::hc334bfd7f2b08e6d2Oa
55: 0x7fa979f384b0 - driver::compile_input::h6a09d1f5442479b9Eba
56: 0x7fa97a007770 - run_compiler::h940da8caa9037102Bbc
57: 0x7fa97a005d00 - <unknown>
58: 0x7fa97a004be0 - <unknown>
59: 0x7fa9799e1770 - <unknown>
60: 0x7fa9799e1760 - rust_try
61: 0x7fa97a004e90 - <unknown>
62: 0x7fa979971e00 - <unknown>
63: 0x7fa974614f70 - <unknown>
64: 0x7fa97955d389 - clone
65: 0x0 - <unknown>
Could not compile `openssl-sys`.
Caused by:
Process didn't exit successfully: `rustc src/lib.rs --crate-name openssl-sys --crate-type lib -g -C metadata=dd0e973e71d408a3 -C extra-filename=-dd0e973e71d408a3 --out-dir /tmp/rust-openssl/openssl-sys/target --emit=dep-info,link -L dependency=/tmp/rust-openssl/openssl-sys/target -L dependency=/tmp/rust-openssl/openssl-sys/target/deps -L native=/usr/lib64 -L native=/tmp/rust-openssl/openssl-sys/target/build/openssl-sys-dd0e973e71d408a3/out -l ssl -l crypto -l static=old_openssl_shim` (status=101)
Rust version:
rustc 1.0.0-dev (81bce5290 2015-02-16) (built 2015-02-17)
binary: rustc
commit-hash: 81bce5290ff55b9a2eddd83d31b0778180904d7f
commit-date: 2015-02-16
build-date: 2015-02-17
host: x86_64-unknown-linux-gnu
release: 1.0.0-dev
OS: Gentoo Linux AMD64
| 1 |
**Do you want to request a _feature_ or report a _bug_?**
BUG
**What is the current behavior?**
I have the following ES6 module:
export const namedExport1 = 'foo';
export const namedExport2 = 'bar';
const Test = function Test() {
const factory = {
foo: 'bar',
};
factory.sayHello = function sayHello(name) {
alert(`Hello ${name}`);
};
return factory;
};
export default Test;
If I import it as `import myModule from 'path/Test'` and it is located in the
node_modules directory, it gets imported as:
{
default: Test()
namedExport1: 'foo'
namedExport2: 'bar'
}
However, if I load the module from any other directory, it gets imported as
the default function itself:
function Test() {
var factory = {
foo: 'bar'
};
factory.sayHello = function sayHello(name) {
alert('Hello ' + name);
};
return factory;
}
**What is the expected behavior?**
Should always import the default export.
**Please mention other relevant information such as the browser version,
Node.js version, Operating System and programming language.**
Webpack: 2.1.0-beta.27
Node: v7
|
**I'm submitting a bug report**
**Webpack version:**
2.1.0-beta.25
**Please tell us about your environment:**
OSX 10.12 (16A323)
**Current behavior:**
When I use webpack with Babel and the
`babel-plugin-transform-es2015-modules-commonjs` transform, I can no longer
import an ES6 module that is not transpiled by Babel.
Babel handles the module like a CommonJS module because webpack doesn't set
the `__esModule` flag to `true`.
**Expected/desired behavior:**
I should be able to import the module.
* **If the current behavior is a bug, please provide the steps to reproduce and if possible a minimal demo of the problem along with a gist/jsbin of your webpack configuration.**
https://github.com/k15a/webpack2-babel-bug
* **What is the motivation / use case for changing the behavior?**
styled-components/styled-components#115
* **Language:**
ES6/7
| 1 |
Hi, I wanted to look at TypeScript and saw that private members are not
really private in the console :D
I searched your FAQ and found an entry on this, but I think this might be a
solution for private members.
**TypeScript Version:**
1.8.9
**Code**
// you all know it:
class Foo {
private x = 0;
increment(): number {
this.x++;
return this.x;
}
}
**Wished behavior:**
// what about something like this?
var Foo = (function () {
var privates = {};
function Foo() {
privates[ this ] = { x: 0 };
}
Foo.prototype.increment = function () {
privates[ this ].x++;
return privates[ this ].x;
};
return Foo;
})();
**Actual behavior:**
// generates
var Foo = (function () {
var x = 0;
function Foo() {
}
Foo.prototype.increment = function () {
x++;
return x;
};
return Foo;
})();
|
Consider the following code
export class Test {
private x: number;
constructor() {
this.x = 42;
}
f() : number {
return this.x;
}
}
Typescript generates the following JS
var Test = (function () {
function Test() {
this.x = 42;
}
Test.prototype.f = function () {
return this.x;
};
return Test;
})();
exports.Test = Test;
Obviously, one could run something like `new Test().x` from plain JS, and the
`private` invariant would fail. I think this is not the kind of behaviour one
would expect from `private`, at least in `export`ed classes. The fix is quite
simple:
var Test = (function () {
var _private = {};
function Test() {
_private.x = 42;
}
Test.prototype.f = function () {
return _private.x;
};
return Test;
})();
exports.Test = Test;
| 1 |
**I'm submitting a ...** (check one with "x")
[x] bug report => search github for a similar issue or PR before submitting
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://github.com/angular/angular/blob/master/CONTRIBUTING.md#question
**Current behavior**
Given a module using the RouterModule:
@NgModule({
declarations: [
AppComponent,
UsersComponent,
UserComponent,
HomeComponent
],
imports: [
BrowserModule,
FormsModule,
HttpModule,
RouterModule.forRoot([
{ path: '', component: HomeComponent},
{ path: 'users', component: UsersComponent}
])
],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
`UsersComponent` uses `UserComponent` and a `RouterLink` in its template:
<app-user></app-user>
<a routerLink="/">Home</a>
whereas `UserComponent` is just plain HTML for this repro.
When testing a simple component with Angular 2.1.0 and Router ~3.0.0,
for a component using `routerLink` in its template like `UsersComponent`, we
had to add `RouterTestingModule` to the imports of the testing module:
import { TestBed } from '@angular/core/testing';
import { RouterTestingModule } from '@angular/router/testing';
import { AppModule } from '../app.module';
import { UsersComponent } from './users.component';
describe('Component: Users', () => {
beforeEach(() => {
TestBed.configureTestingModule({
imports: [AppModule, RouterTestingModule],
});
});
it('should create the component', () => {
let fixture = TestBed.createComponent(UsersComponent);
let app = fixture.debugElement.componentInstance;
expect(app).toBeTruthy();
});
});
whereas for a "dumb" component like `UserComponent` we did not:
import { TestBed } from '@angular/core/testing';
import { AppModule } from '../app.module';
import { UserComponent } from './user.component';
describe('Component: User', () => {
beforeEach(() => {
TestBed.configureTestingModule({
imports: [AppModule],
});
});
it('should create the component', () => {
let fixture = TestBed.createComponent(UserComponent);
let app = fixture.debugElement.componentInstance;
expect(app).toBeTruthy();
});
});
**Now, using the router 3.1.0, the above test throws:
`No base href set. Please provide a value for the APP_BASE_HREF token or add a
base element to the document.`**
This can be fixed by adding the `RouterTestingModule` in the imports of the
testing module for the dumb component too, but this is a regression/breaking
change compared to the behavior we used to have.
**Expected behavior**
We should be able to test a component that does not rely on `routerLink`
without having to import the `RouterTestingModule` as we used to.
**Minimal reproduction of the problem with instructions**
See above
**Please tell us about your environment:**
MacOSX
* **Angular version:** 2.0.X
Angular 2.1.0, Router 3.1.0
* **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ]
All
* **Language:** [all | TypeScript X.X | ES6/7 | ES5]
All
* **Node (for AoT issues):** `node --version` =
|
* **I'm submitting a ...**
* bug report
**Current behavior**
Angular may be performing unnecessary duplicate work when starting up the app.
I would expect that the _first_ time a component is encountered it is
"compiled" (the template string parsed and the JS code generated), and after
that it is re-used if and when the component is encountered again.
But if you use templateUrl you will see the same template file being requested
over and over which suggests this may not be the case. For example, I have a
price display component which is used in multiple places and the template can
be requested 8+ times or more. Is it being re-parsed and re-built each time?
It seems unnecessary and wasteful.
**Expected/desired behavior**
Any templateUrl should be requested no more than once.
* **What is the expected behavior?**
Still only request the templateUrl once (erm, maybe the issue template doesn't
need this same heading repeated?)
* **What is the motivation / use case for changing the behavior?**
Optimize behavior
* **Angular version:** 2.0.0-beta.X
2.0.0-RC-1
| 0 |
This is not really bugging me, but I would just like to know if there is
anything wrong with the code (I am on Ubuntu 18.04, if that helps).
#### Reproducing code example (`test.py`):
import numpy as np
from scipy.interpolate import LSQBivariateSpline
# generate data
x, y = np.meshgrid(
np.linspace(0, 10, 30),
np.linspace(0, 5, 40)
)
x = x.flatten()
y = y.flatten()
z = np.exp(-(x-5)**2/4) + np.exp(-(y-2)**2/3)
# define the knot positions
tx = [1, 2, 4, 5, 6, 8, 9]
ty = [0.2, 0.8, 1.1, 1.8, 2.5, 3, 3.7, 4.2, 4.9]
# get spline fit
s = LSQBivariateSpline(x, y, z, tx, ty, kx=3, ky=3)
# new evaluation
x2, y2 = np.meshgrid(
np.linspace(0, 10, 50),
np.linspace(0, 5, 60)
)
z_new = s(x2, y2, grid=False)
#### Error message:
Run the code in a terminal as `python -i test.py`; the following error shows
up before the Python terminal quits:
python -i tmp.py
>>>
corrupted size vs. prev_size
Aborted (core dumped)
or
python tmp.py
Segmentation fault (core dumped)
This error message shows up randomly, roughly 1 out of 5 trials will trigger
the error.
#### Scipy/Numpy/Python version information:
1.6.0 1.19.4 sys.version_info(major=3, minor=8, micro=0, releaselevel='final', serial=0)
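One thing worth ruling out with `LSQBivariateSpline` crashes of this kind is bad knot placement. A minimal numpy-only sketch (reusing the values from the snippet above) confirms the interior knots lie strictly inside the data range, so that is unlikely to be the culprit here:

```python
import numpy as np

x, y = np.meshgrid(np.linspace(0, 10, 30), np.linspace(0, 5, 40))
x, y = x.flatten(), y.flatten()
tx = [1, 2, 4, 5, 6, 8, 9]
ty = [0.2, 0.8, 1.1, 1.8, 2.5, 3, 3.7, 4.2, 4.9]

# Interior knots must lie strictly inside the data range for LSQBivariateSpline.
assert x.min() < min(tx) and max(tx) < x.max()
assert y.min() < min(ty) and max(ty) < y.max()
```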
|
_Original ticket http://projects.scipy.org/scipy/ticket/133 on 2006-04-02 by
@rkern, assigned to @rkern._
The function `f_value` in file source:trunk/Lib/stats/stats.py needs review.
Please look over the StatisticsReview guidelines and add your comments below.
| 0 |
Challenge http://www.freecodecamp.com/challenges/waypoint-learn-how-script-tags-and-document-ready-work
has an issue. Please describe how to reproduce it, and include links to
screenshots if possible.
|
Challenge http://www.freecodecamp.com/challenges/waypoint-bring-your-javascript-slot-machine-to-life
has an issue. Please describe how to reproduce it, and include links to
screenshots if possible.
There is like an endless refresh on this, for me.
| 1 |
`DataFrame.to_string()` has a formatters parameter for formatting columns
individually, but `to_csv()` could benefit from that parameter as well. This
is especially useful, for example, when formatting integers with missing
values.
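In the meantime, a common workaround is to apply the per-column formatting before calling `to_csv`. A rough sketch (column name and NA placeholder hypothetical):

```python
import pandas as pd

df = pd.DataFrame({"count": [1.0, None, 3.0]})

# Format the integer column by hand, since to_csv has no formatters= parameter.
formatted = df.assign(
    count=df["count"].map(lambda v: "NA" if pd.isna(v) else "%d" % v)
)
csv = formatted.to_csv(index=False)
assert csv.splitlines() == ["count", "1", "NA", "3"]
```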
|
* date formatting #2583, PR #4313, #6797
* string spacing (justification?) #4158
* float format #2502, #2069
* int format, #6502
* timedelta format #6783
SO question
something like
df.to_csv(format='%10.4f', sep=' ')
| 1 |
If the output of this product is a matrix, should it not be a column vector
matrix rather than a row vector matrix?
>>> matrix([[1, 2], [3, 4]]).dot(array([5, 6]))
matrix([[17, 39]])
This seems like a bug, although I do not have extensive knowledge of the
behavior of `numpy.matrix` and this has probably already been discussed
extensively.
|
_Original ticket http://projects.scipy.org/numpy/ticket/2057 on 2012-02-17 by
@pv, assigned to unknown._
http://permalink.gmane.org/gmane.comp.python.scientific.user/31095
Consider this:
import numpy as np
x = np.arange(5)
I = np.asmatrix(np.identity(5))
print np.dot(I, x).shape
# -> (1, 5)
while `(5, 1)` would be what is expected based on usual linear algebra rules.
It might also be a mistake to return a matrix from mixed matrix--array
computations.
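A minimal side-by-side reproduction of the shape discrepancy (assuming NumPy still ships the `np.matrix` class; it has long been discouraged in favor of plain arrays):

```python
import numpy as np

x = np.arange(5)
I_arr = np.identity(5)       # plain ndarray
I_mat = np.asmatrix(I_arr)   # matrix subclass

# ndarray dotted with a 1-D vector follows the usual rule: the result is 1-D.
print(np.dot(I_arr, x).shape)  # (5,)

# matrix forces 2-D results; the 1-D operand ends up as a row, giving (1, 5)
# rather than the column shape (5, 1) that linear algebra would suggest.
print(np.dot(I_mat, x).shape)  # (1, 5)
```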
| 1 |
using 1.2.4
I created a pod with a small udp echo server running on port 5005
(mendhak/udp-listener) and exposed it as a udp service.
**This only happens when the client and server land on the same node!**
A simple client
nc -4u -w1 $SERVICE_IP 5005 <<< a
nc -4u -w1 $POD_IP 5005 <<< b
Meanwhile, on the server
Listening on UDP port 5005
a
a
b
tcpdump on the node
08:00:38.574388 IP 10.228.1.5.38130 > 10.228.1.3.5005: UDP, length 7
08:00:38.574447 IP 10.228.1.5.38130 > 10.228.1.3.5005: UDP, length 7
08:00:42.022864 IP 10.228.1.5.42386 > 10.228.1.3.5005: UDP, length 7
By the way, my real use case was using logstash with a udp input and my logs
were duplicated sometimes (when the pod that produces logs landed on the same
node as logstash).
another thought, maybe TCP packets get duplicated too but handled?
|
**Is this a request for help?** (If yes, you should use our troubleshooting
guide and community support channels, see
http://kubernetes.io/docs/troubleshooting/.):
**What keywords did you search in Kubernetes issues before filing this one?**
(If you have found any duplicates, you should instead reply there.):
* * *
**Is this a BUG REPORT or FEATURE REQUEST?** (choose one):
**Kubernetes version** (use `kubectl version`):
**Environment** :
* **Cloud provider or hardware configuration** :
* **OS** (e.g. from /etc/os-release):
* **Kernel** (e.g. `uname -a`):
* **Install tools** :
* **Others** :
**What happened** :
**What you expected to happen** :
**How to reproduce it** (as minimally and precisely as possible):
**Anything else do we need to know** :
| 0 |
Ability to collapse on a field. For example, I want the most relevant result
from all different report types. Or similarly, the most recent result of each
report type. Or maybe, I want to de-dup on headline.
So, the sort order would dictate which one from the group is returned. Similar
to what is discussed here:
http://blog.jteam.nl/2009/10/20/result-grouping-field-collapsing-with-solr/
From my understanding, it seems that in order for field collapsing to be
efficient, the result set must be relatively small.
This is also referred to as "Combine" on some other search products.
|
I restarted a node in my cluster but new documents are not indexing. I thought
"yellow" state meant that unassigned replicas are getting initialized but the
cluster is still fully functional. Is this a bug?
=== debug
$ curl -XGET 'es.internal.company.com:9200/_cluster/health?pretty'
{
"cluster_name" : "es_cluster_a",
"status" : "yellow",
"timed_out" : false,
"number_of_nodes" : 9,
"number_of_data_nodes" : 6,
"active_primary_shards" : 1265,
"active_shards" : 3041,
"relocating_shards" : 0,
"initializing_shards" : 12,
"unassigned_shards" : 725,
"number_of_pending_tasks" : 0
}
curl -XGET 'es.internal.company.com:9200/_cat/indices?pretty'|sort -k 3n
yellow open web-logs-2015.05.11 8 2 577100361 0 697.3gb 346.7gb
yellow open web-logs-2015.05.12 8 1 737742512 0 640.3gb 496.3gb
yellow open web-logs-2015.05.13 8 1 632722103 0 557.6gb 402.5gb
yellow open web-logs-2015.05.14 8 2 240925903 0 309.3gb 148.5gb
yellow open web-logs-2015.05.15 8 2 0 0 920b 920b
| 0 |
Apache Druid 25.0.0 contains over 300 new features, bug fixes, performance
enhancements, documentation improvements, and additional test coverage from 51
contributors.
See the complete set of changes for additional details.
# # Highlights
## # MSQ task engine now production ready
The multi-stage query (MSQ) task engine used for SQL-based ingestion is now
production ready. Use it for any supported workloads. For more information,
see the following pages:
* Ingestion
* SQL-based ingestion
## # Simplified Druid deployments
The new `start-druid` script greatly simplifies deploying any combination of
Druid services on a single server. It comes pre-packaged with the required
configs and can be used to launch a fully functional Druid cluster simply by
invoking `./start-druid`. For experienced Druids, it also gives complete
control over the runtime properties and JVM arguments to have a cluster that
exactly fits your needs.
The `start-druid` script deprecates the existing profiles such as `start-
micro-quickstart` and `start-nano-quickstart`. These profiles may be removed
in future releases. For more information, see Single server deployment.
## # String dictionary compression (experimental)
Added support for front coded string dictionaries for smaller string columns,
leading to reduced segment sizes with only minor performance penalties for
most Druid queries.
This can be enabled by setting `IndexSpec.stringDictionaryEncoding` to
`{"type":"frontCoded", "bucketSize": 4}` , where `bucketSize` is any power of
2 less than or equal to 128. Setting this property instructs indexing tasks to
write segments using compressed dictionaries of the specified bucket size.
> Any segment written using string dictionary compression is not readable by
> older versions of Druid.
For more information, see Front coding.
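For example, an ingestion spec might carry the setting like this (a sketch based on the property described above; the exact location of `indexSpec` depends on your ingestion method's tuning config):

```json
{
  "indexSpec": {
    "stringDictionaryEncoding": {
      "type": "frontCoded",
      "bucketSize": 4
    }
  }
}
```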
#12277
## # Kubernetes-native tasks
Druid can now use Kubernetes to launch and manage tasks, eliminating the need
for middle managers.
To use this feature, enable the druid-kubernetes-overlord-extensions in the
extensions load list for your Overlord process.
#13156
## # Hadoop-3 compatible binary
Druid now comes packaged as a dedicated binary for Hadoop-3 users, which
contains Hadoop-3 compatible jars. If you do not use Hadoop-3 with your Druid
cluster, you may continue using the classic binary.
# # Multi-stage query (MSQ) task engine
## # MSQ enabled for Docker
MSQ task query engine is now enabled for Docker by default.
#13069
## # Query history
Multi-stage queries no longer show up in the Query history dialog. They are
still available in the **Recent query tasks** panel.
## # Limit on CLUSTERED BY columns
When using the MSQ task engine to ingest data, the number of columns that can
be passed in the CLUSTERED BY clause is now limited to 1500.
#13352
## # Support for string dictionary compression
The MSQ task engine supports the front-coding of String dictionaries for
better compression. This can be enabled for INSERT or REPLACE statements by
setting `indexSpec` to a valid json string in the query context.
#13275
## # Sketch merging mode
Workers can now gather key statistics, used to generate partition boundaries,
either sequentially or in parallel. Set `clusterStatisticsMergeMode` to
`PARALLEL`, `SEQUENTIAL` or `AUTO` in the query context to use the
corresponding sketch merging mode. For more information, see Sketch merging
mode.
#13205
## # Performance and operational improvements
* **Error messages** : For disallowed MSQ warnings of certain types, the warning is now surfaced as the error. #13198
* **Secrets** : For tasks containing SQL with sensitive keys, Druid now masks the keys while logging with the help of regular expressions. #13231
* **Downsampling accuracy** : MSQ task engine now uses the number of bytes instead of number of keys when downsampling data. #12998
* **Memory usage** : When determining partition boundaries, the heap footprint of internal sketches used by MSQ is now capped at 10% of available memory or 300 MB, whichever is lower. Previously, the cap was strictly 300 MB. #13274
* **Task reports** : Added fields `pendingTasks` and `runningTasks` to the worker report. See Query task status information for related web console changes. #13263
# # Querying
## # Async reads for JDBC
Prevented JDBC timeouts on long queries by returning empty batches when a
batch fetch takes too long. Uses an async model to run the result fetch
concurrently with JDBC requests.
#13196
## # Improved algorithm to check values of an IN filter
To accommodate large value sets arising from large IN filters or from joins
pushed down as IN filters, Druid now uses a sorted merge algorithm for merging
the set and dictionary for larger values.
#13133
## # Enhanced query context security
Added the following configuration properties that refine the query context
security model controlled by `druid.auth.authorizeQueryContextParams`:
* `druid.auth.unsecuredContextKeys`: A JSON list of query context keys that do not require a security check.
* `druid.auth.securedContextKeys`: A JSON list of query context keys that do require a security check.
If both are set, `unsecuredContextKeys` acts as exceptions to
`securedContextKeys`.
#13071
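The interaction between the two lists can be sketched as follows (a hedged reading of the description above, not Druid source code; `None` stands for "property not configured"):

```python
def requires_security_check(key, secured=None, unsecured=None):
    """Decide whether a query context key needs an authorization check.

    secured / unsecured mirror druid.auth.securedContextKeys and
    druid.auth.unsecuredContextKeys.
    """
    if unsecured is not None and key in unsecured:
        return False                # explicitly exempt from checks
    if secured is not None:
        return key in secured       # only listed keys are checked
    return True                     # default: every context key is checked

# If both are set, unsecuredContextKeys acts as exceptions to securedContextKeys.
print(requires_security_check("priority",
                              secured={"priority", "timeout"},
                              unsecured={"timeout"}))  # True
```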
## # HTTP response headers
The HTTP response for a SQL query now correctly sets response headers, same as
a native query.
#13052
# # Metrics
## # New metrics
The following metrics have been newly added. For more details, see the
complete list of Druid metrics.
### # Batched segment allocation
These metrics pertain to batched segment allocation.
Metric | Description | Dimensions
---|---|---
`task/action/batch/runTime` | Milliseconds taken to execute a batch of task
actions. Currently only being emitted for batched `segmentAllocate` actions |
`dataSource`, `taskActionType=segmentAllocate`
`task/action/batch/queueTime` | Milliseconds spent by a batch of task actions
in queue. Currently only being emitted for batched `segmentAllocate` actions |
`dataSource`, `taskActionType=segmentAllocate`
`task/action/batch/size` | Number of task actions in a batch that was executed
during the emission period. Currently only being emitted for batched
`segmentAllocate` actions | `dataSource`, `taskActionType=segmentAllocate`
`task/action/batch/attempts` | Number of execution attempts for a single batch
of task actions. Currently only being emitted for batched `segmentAllocate`
actions | `dataSource`, `taskActionType=segmentAllocate`
`task/action/success/count` | Number of task actions that were executed
successfully during the emission period. Currently only being emitted for
batched `segmentAllocate` actions | `dataSource`, `taskId`, `taskType`,
`taskActionType=segmentAllocate`
`task/action/failed/count` | Number of task actions that failed during the
emission period. Currently only being emitted for batched `segmentAllocate`
actions | `dataSource`, `taskId`, `taskType`, `taskActionType=segmentAllocate`
### # Streaming ingestion
Metric | Description | Dimensions
---|---|---
`ingest/kafka/partitionLag` | Partition-wise lag between the offsets consumed
by the Kafka indexing tasks and latest offsets in Kafka brokers. Minimum
emission period for this metric is a minute. | `dataSource`, `stream`,
`partition`
`ingest/kinesis/partitionLag/time` | Partition-wise lag time in milliseconds
between the current message sequence number consumed by the Kinesis indexing
tasks and latest sequence number in Kinesis. Minimum emission period for this
metric is a minute. | `dataSource`, `stream`, `partition`
`ingest/pause/time` | Milliseconds spent by a task in a paused state without
ingesting. | `dataSource`, `taskId`, `taskType`
`ingest/handoff/time` | Total time taken in milliseconds for handing off a
given set of published segments. | `dataSource`, `taskId`, `taskType`
#13238
#13331
#13313
## # Other improvements
* New dimension `taskActionType` which may take values such as `segmentAllocate`, `segmentTransactionalInsert`, etc. This dimension is reported for `task/action/run/time` and the new batched segment allocation metrics. #13333
* Metric `namespace/cache/heapSizeInBytes` for global cached lookups now accounts for the `String` object overhead of 40 bytes. #13219
* `jvm/gc/cpu` has been fixed to report nanoseconds instead of milliseconds. #13383
# # Nested columns
## # Nested columns performance improvement
Improved `NestedDataColumnSerializer` to no longer explicitly write null
values to the field writers for the missing values of every row. Instead,
passing the row counter is moved to the field writers so that they can
backfill null values in bulk.
#13101
## # Support for more formats
Druid nested columns and the associated JSON transform functions now support
Avro, ORC, and Parquet.
#13325
#13375
## # Refactored a datasource before unnest
When data requires "flattening" during processing, the operator now takes in
an array and then flattens the array into N (N=number of elements in the
array) rows where each row has one of the values from the array.
#13085
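The flattening described above can be sketched in a few lines (illustrative only; Druid's unnest operator is implemented inside the query engine, and the row/column names here are hypothetical):

```python
def unnest(rows, column):
    """Expand each row's array-valued `column` into N rows, one per element."""
    for row in rows:
        for element in row[column]:
            yield {**row, column: element}

rows = [{"id": 1, "tags": ["a", "b"]}, {"id": 2, "tags": ["c"]}]
print(list(unnest(rows, "tags")))
# [{'id': 1, 'tags': 'a'}, {'id': 1, 'tags': 'b'}, {'id': 2, 'tags': 'c'}]
```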
# # Ingestion
## # Improved filtering for cloud objects
You can now stop at arbitrary subfolders using glob syntax in the
`ioConfig.inputSource.filter` field for native batch ingestion from cloud
storage, such as S3.
#13027
## # Async task client for streaming ingestion
You can now enable asynchronous communication between the stream supervisor
and indexing tasks by setting `chatAsync` to true in the `tuningConfig`. The
async task client uses its own internal thread pool and thus ignores the
`chatThreads` property.
#13354
## # Improved handling of JSON data with streaming ingestion
You can now better control how Druid reads JSON data for streaming ingestion
by setting the following fields in the input format specification:
* `assumedNewlineDelimited` to parse lines of JSON independently.
* `useJsonNodeReader` to retain valid JSON events when parsing multi-line JSON events when a parsing exception occurs.
The web console has been updated to include these options.
#13089
## # Ingesting from an idle Kafka stream
When a Kafka stream becomes inactive, the supervisor ingesting from it can be
configured to stop creating new indexing tasks. The supervisor automatically
resumes creation of new indexing tasks once the stream becomes active again.
Set the property `dataSchema.ioConfig.idleConfig.enabled` to true in the
respective supervisor spec or set `druid.supervisor.idleConfig.enabled` on the
overlord to enable this behaviour. Please see the following for details:
* Overlord configuration
* Supervisor spec
#13144
## # Kafka Consumer improvement
You can now configure the Kafka Consumer's custom deserializer after its
instantiation.
#13097
## # Kafka supervisor logging
Kafka supervisor logs are now less noisy. The supervisors now log events at
the DEBUG level instead of INFO.
#13392
## # Fixed Overlord leader election
Fixed a problem where Overlord leader election failed due to lock
reacquisition issues. Druid now fails these tasks and clears all locks so that
the Overlord leader election isn't blocked.
#13172
## # Support for inline protobuf descriptor
Added a new `inline` type `protoBytesDecoder` that allows a user to pass
inline the contents of a Protobuf descriptor file, encoded as a Base64 string.
#13192
## # Duplicate notices
For streaming ingestion, notices that are the same as one already in queue
won't be enqueued. This will help reduce notice queue size.
#13334
## # Sampling from stream input now respects the configured timeout
Fixed a problem where sampling from a stream input, such as Kafka or Kinesis,
failed to respect the configured timeout when the stream had no records
available. You can now set the maximum amount of time in which the entry
iterator will return results.
#13296
## # Streaming tasks resume on Overlord switch
Fixed a problem where streaming ingestion tasks continued to run until their
duration elapsed after the Overlord leader had issued a pause to the tasks.
Now, when the Overlord switch occurs right after it has issued a pause to the
task, the task remains in a paused state even after the Overlord re-election.
#13223
## # Fixed Parquet list conversion
Fixed an issue with Parquet list conversion, where lists of complex objects
could unexpectedly be wrapped in an extra object, appearing as
`[{"element":<actual_list_element>},{"element":<another_one>}...]` instead of
the direct list. This changes the behavior of the parquet reader for lists of
structured objects to be consistent with other parquet logical list
conversions. The data is now fetched directly, more closely matching its
expected structure.
#13294
## # Introduced a tree type to flattenSpec
Introduced a `tree` type to `flattenSpec`. In the event that a simple
hierarchical lookup is required, the `tree` type allows for faster JSON
parsing than `jq` and `path` parsing types.
#12177
# # Operations
## # Compaction
Compaction behavior has changed to reduce both the time and the disk space
that compaction takes:
* When segments need to be fetched, download them one at a time and delete them when Druid is done with them. This still takes time but minimizes the required disk space.
* Don't fetch segments on the main compact task when they aren't needed. If the user provides a full `granularitySpec`, `dimensionsSpec`, and `metricsSpec`, Druid skips fetching segments.
For more information, see the documentation on Compaction and Automatic
compaction.
#13280
## # Idle configs for the Supervisor
You can now set the Supervisor to idle, which is useful for freeing up task
slots so that autoscaling can be more effective.
To configure the idle behavior, use the following properties:
Property | Description | Default
---|---|---
`druid.supervisor.idleConfig.enabled` | (Cluster wide) If `true`, supervisor
can become idle if there is no data on input stream/topic for some time. |
false
`druid.supervisor.idleConfig.inactiveAfterMillis` | (Cluster wide) Supervisor
is marked as idle if all existing data has been read from input topic and no
new data has been published for `inactiveAfterMillis` milliseconds. |
`600_000`
`inactiveAfterMillis` | (Individual Supervisor) Supervisor is marked as idle
if all existing data has been read from input topic and no new data has been
published for `inactiveAfterMillis` milliseconds. | no (default == `600_000`)
#13311
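For example, to enable cluster-wide idling on the Overlord (values sketched from the table above):

```properties
druid.supervisor.idleConfig.enabled=true
# Mark a supervisor idle after 10 minutes without new data on the input topic.
druid.supervisor.idleConfig.inactiveAfterMillis=600000
```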
## # Improved supervisor termination
Fixed issues with delayed supervisor termination during certain transient
states.
#13072
## # Backoff for HttpPostEmitter
The `HttpPostEmitter` option now has a backoff. This means that there should
be less noise in the logs and lower CPU usage if you use this option for
logging.
#12102
## # DumpSegment tool for nested columns
The DumpSegment tool can now be used on nested columns with the `--dump
nested` option.
For more information, see dump-segment tool.
#13356
## # Segment loading and balancing
### # Batched segment allocation
Segment allocation on the Overlord can take some time to finish, which can
cause ingestion lag while a task waits for segments to be allocated.
Performing segment allocation in batches can help improve performance.
There are two new properties that affect how Druid performs segment
allocation:
Property | Description | Default
---|---|---
`druid.indexer.tasklock.batchSegmentAllocation` | If set to true, Druid
performs segment allocate actions in batches to improve throughput and reduce
the average `task/action/run/time`. See batching `segmentAllocate` actions for
details. | false
`druid.indexer.tasklock.batchAllocationWaitTime` | Number of milliseconds
after Druid adds the first segment allocate action to a batch, until it
executes the batch. Allows the batch to add more requests and improve the
average segment allocation run time. This configuration takes effect only if
`batchSegmentAllocation` is enabled. | 500
In addition to these properties, there are new metrics to track batch segment
allocation. For more information, see New metrics for segment allocation.
For more information, see the following:
* Overlord operations
* Task actions and Batching `segmentAllocate` actions
#13369
#13503
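A minimal Overlord configuration enabling batching might look like this (values sketched from the defaults in the table above):

```properties
druid.indexer.tasklock.batchSegmentAllocation=true
# Wait up to 500 ms for more segmentAllocate actions to join a batch.
druid.indexer.tasklock.batchAllocationWaitTime=500
```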
### # Improved cachingCost balancer strategy
The `cachingCost` balancer strategy now behaves more similarly to cost
strategy. When computing the cost of moving a segment to a server, the
following calculations are performed:
* Subtract the self cost of a segment if it is being served by the target server
* Subtract the cost of segments that are marked to be dropped
#13321
### # Faster segment assignment
You can now use a round-robin segment strategy to speed up initial segment
assignments. Set `useRoundRobinSegmentAssignment` to `true` in the Coordinator
dynamic config to enable this feature.
#13367
### # Default to batch sampling for balancing segments
Batch sampling is now the default method for sampling segments during
balancing as it performs significantly better than the alternative when there
is a large number of used segments in the cluster.
As part of this change, the following have been deprecated and will be removed
in future releases:
* coordinator dynamic config `useBatchedSegmentSampler`
* coordinator dynamic config `percentOfSegmentsToConsiderPerMove`
* old non-batch method of sampling segments
### # Remove unused property
The unused coordinator property `druid.coordinator.loadqueuepeon.repeatDelay`
has been removed. Use only `druid.coordinator.loadqueuepeon.http.repeatDelay`
to configure repeat delay for the HTTP-based segment loading queue.
#13391
### # Avoid segment over-replication
Improved the process of checking server inventory to prevent over-replication
of segments during segment balancing.
#13114
## # Provided service specific log4j overrides in containerized deployments
Provided an option to override log4j configs setup at the service level
directories so that it works with Druid-operator based deployments.
#13020
## # Various Docker improvements
* Updated Docker to run with JRE 11 by default.
* Updated Docker to use `gcr.io/distroless/java11-debian11` image as base by default.
* Enabled Docker buildkit cache to speed up building.
* Downloaded `bash-static` to the Docker image so that scripts that require bash can be executed.
* Bumped builder image from `3.8.4-jdk-11-slim` to `3.8.6-jdk-11-slim`.
* Switched busybox from `amd64/busybox:1.30.0-glibc` to `busybox:1.35.0-glibc`.
* Added support to build arm64-based image.
#13059
## # Enabled cleaner JSON for various input sources and formats
Added `JsonInclude` to various properties, to avoid population of default
values in serialized JSON.
#13064
## # Improved direct memory check on startup
Improved direct memory check on startup by providing better support for Java
9+ in `RuntimeInfo`, and clearer log messages where validation fails.
#13207
## # Improved the run time of the MarkAsUnusedOvershadowedSegments duty
Improved the run time of the `MarkAsUnusedOvershadowedSegments` duty by
iterating over all overshadowed segments and marking segments as unused in
batches.
#13287
# # Web console
## # Delete an interval
You can now pick an interval to delete from a dropdown in the kill task
dialog.
#13431
## # Removed the old query view
The old query view is removed. Use the new query view with tabs.
For more information, see Web console.
#13169
## # Filter column values in query results
The web console now allows you to add to existing filters for a selected
column.
#13169
## # Support for Kafka lookups in the web-console
Added support for Kafka-based lookups rendering and input in the web console.
#13098
## # Query task status information
The web console now exposes a textual indication about running and pending
tasks when a query is stuck due to lack of task slots.
#13291
# # Extensions
## # Extension optimization
Optimized the `compareTo` function in `CompressedBigDecimal`.
#13086
## # CompressedBigDecimal cleanup and extension
Removed unnecessary generic type from CompressedBigDecimal, added support for
number input types, added support for reading aggregator input types directly
(uningested data), and fixed scaling bug in buffer aggregator.
#13048
## # Support for Kubernetes discovery
Added `POD_NAME` and `POD_NAMESPACE` env variables to all Kubernetes
Deployments and StatefulSets.
Helm deployment is now compatible with `druid-kubernetes-extension`.
#13262
# # Docs
## # Jupyter Notebook tutorials
We released our first Jupyter Notebook-based tutorial to learn the basics of
the Druid API. Download the notebook and follow along with the tutorial to
learn how to get basic cluster information, ingest data, and query data.
For more information, see Jupyter Notebook tutorials.
#13342
#13345
# # Dependency updates
## # Updated Kafka version
Updated the Apache Kafka core dependency to version 3.3.1.
#13176
## # Docker improvements
Updated dependencies for the Druid image for Docker, including JRE 11. Docker
BuildKit cache is enabled to speed up building.
#13059
# # Upgrading to 25.0.0
Consider the following changes and updates when upgrading from Druid 24.0.x to
25.0.0. If you're updating from an earlier version, see the release notes of
the relevant intermediate versions.
## # Default HTTP-based segment discovery and task management
The default segment discovery method now uses HTTP instead of ZooKeeper.
This update changes the defaults for the following properties:
Property | New default | Previous default
---|---|---
`druid.serverview.type` for segment management | http | batch
`druid.coordinator.loadqueuepeon.type` for segment management | http | curator
`druid.indexer.runner.type` for the Overlord | httpRemote | local
To use ZooKeeper instead of HTTP, change the values for the properties back to
the previous defaults. ZooKeeper-based implementations for these properties
are deprecated and will be removed in a subsequent release.
#13092
## # Finalizing HLL and quantiles sketch aggregates
The aggregation functions for HLL and quantiles sketches used to return either
sketches or numbers when finalized, depending on where they were in the native
query plan.
Druid no longer finalizes aggregators in the following two cases:
* aggregators appear in the outer level of a query
* aggregators are used as input to an expression or finalizing-field-access post-aggregator
This change aligns the behavior of HLL and quantiles sketches with theta
sketches.
To restore old behaviour, you can set `sqlFinalizeOuterSketches=true` in the
query context.
#13247
## # Kill tasks mark segments as unused only if specified
When you issue a kill task, Druid marks the underlying segments as unused only
if explicitly specified. For more information, see the API reference.
#13104
## # Incompatible changes
### # Upgrade curator to 5.3.0
Apache Curator upgraded to the latest version, 5.3.0. This version drops
support for ZooKeeper 3.4 but Druid has already officially dropped support in
0.22. In 5.3.0, Curator has removed support for Exhibitor so all related
configurations and tests have been removed.
#12939
### # Fixed Parquet list conversion
The behavior of the parquet reader for lists of structured objects has been
changed to be consistent with other parquet logical list conversions. The data
is now fetched directly, more closely matching its expected structure. See
parquet list conversion for more details.
#13294
# # Credits
Thanks to everyone who contributed to this release!
@317brian
@599166320
@a2l007
@abhagraw
@abhishekagarwal87
@adarshsanjeev
@adelcast
@AlexanderSaydakov
@amaechler
@AmatyaAvadhanula
@ApoorvGuptaAi
@arvindanugula
@asdf2014
@churromorales
@clintropolis
@cloventt
@cristian-popa
@cryptoe
@dampcake
@dependabot[bot]
@didip
@ektravel
@eshengit
@findingrish
@FrankChen021
@gianm
@hnakamor
@hosswald
@imply-cheddar
@jasonk000
@jon-wei
@Junge-401
@kfaraz
@LakshSingla
@mcbrewster
@paul-rogers
@petermarshallio
@rash67
@rohangarg
@sachidananda007
@santosh-d3vpl3x
@senthilkv
@somu-imply
@techdocsmith
@tejaswini-imply
@vogievetsky
@vtlim
@wcc526
@writer-jill
@xvrl
@zachjsh
|
Apache Druid 24.0.0 contains over 300 new features, bug fixes, performance
enhancements, documentation improvements, and additional test coverage from 67
contributors. See the complete set of changes for additional details.
# # New Features
## # Multi-stage query task engine
SQL-based ingestion for Apache Druid uses a distributed multi-stage query
architecture, which includes a query engine called the multi-stage query task
engine (MSQ task engine). The MSQ task engine extends Druid's query
capabilities, so you can write queries that reference external data as well as
perform ingestion with SQL INSERT and REPLACE. Essentially, you can perform
SQL-based ingestion instead of using JSON ingestion specs that Druid's native
ingestion uses.
SQL-based ingestion using the multi-stage query task engine is the recommended
solution starting in Druid 24.0.0. Alternative ingestion solutions such as
native batch and Hadoop-based ingestion systems will still be supported. We
recommend you read all known issues and test the feature in a development
environment before rolling out in production. Using the multi-stage query task
engine with `SELECT` statements that do not write to a datasource is
experimental.
The extension for it (druid-multi-stage-query) is loaded by default. If you're
upgrading from an earlier version of Druid, you'll need to add the extension
to druid.extensions.loadlist in your common.runtime.properties file.
For more information, see the overview for the multi-stage query architecture.
#12524
#12386
#12523
#12589
## # Nested columns
Druid now supports directly storing nested data structures in a newly added
`COMPLEX<json>` column type. `COMPLEX<json>` columns store a copy of the
structured data in JSON format as well as specialized internal columns and
indexes for nested literal values—`STRING`, `LONG`, and `DOUBLE` types. An
optimized virtual column allows Druid to read and filter these values at
speeds consistent with standard Druid `LONG`, `DOUBLE`, and `STRING` columns.
Newly added Druid SQL functions, native JSON functions, and a virtual column
allow you to extract, transform, and create `COMPLEX<json>` values at query
time. You
can also use the JSON functions in `INSERT` and `REPLACE` statements in SQL-
based ingestion, or in a `transformSpec` in native ingestion as an alternative
to using a `flattenSpec` object to "flatten" nested data for ingestion.
See SQL JSON functions, native JSON functions, Nested columns, virtual
columns, and the feature summary for more detail.
#12753
#12714
#12753
#12920
## # Updated Java support
Java 11 is fully supported and is no longer experimental. Java 17 support is
improved.
#12839
# # Query engine updates
### # Updated column indexes and query processing of filters
Reworked column indexes to be extraordinarily flexible, which will eventually
allow us to model a wide range of index types. Added machinery to build the
filters that use the updated indexes, while also allowing other column
implementations to provide the built-in index types as adapters so that the
current set of filters that Druid provides can make use of indexing.
#12388
### # Time filter operator
You can now use the Druid SQL operator TIME_IN_INTERVAL to filter query
results based on time. Prefer TIME_IN_INTERVAL over the SQL BETWEEN operator
to filter on time. For more information, see Date and time functions.
#12662
### # Null values and the "in" filter
If a `values` array contains `null`, the "in" filter matches null values. This
differs from the SQL IN filter, which does not match null values.
For more information, see Query filters and SQL data types.
#12863
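The difference can be sketched as follows (illustrative semantics only, not Druid source; `None` stands in for SQL NULL):

```python
def native_in_matches(value, values):
    """Druid native "in" filter: a null in `values` matches null rows."""
    return value in values

def sql_in_matches(value, values):
    """SQL IN: NULL never matches, even when NULL appears in the list."""
    if value is None:
        return False
    return value in [v for v in values if v is not None]

print(native_in_matches(None, ["a", None]))  # True
print(sql_in_matches(None, ["a", None]))     # False
```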
### # Virtual columns in search queries
Previously, a search query could only search on dimensions that existed in the
data source. Search queries now support virtual columns as a parameter in the
query.
#12720
### # Optimize simple MIN / MAX SQL queries on __time
Simple queries like `select max(__time) from ds` now run as `timeBoundary`
queries to take advantage of the time dimension sorting in a segment. You can
set a feature flag to enable this feature.
#12472
#12491
### # String aggregation results
The first/last string aggregator now only compares based on values.
Previously, the first/last string aggregator's values were compared based on
the `__time` column first and then on values.
If you have existing queries and want to continue using both the `__time`
column and values, update your queries to use ORDER BY MAX(timeCol).
#12773
### # Reduced allocations due to Jackson serialization
Introduced and implemented new helper functions in `JacksonUtils` to enable
reuse of
`SerializerProvider` objects.
Additionally, disabled backwards compatibility for map-based rows in the
`GroupByQueryToolChest` by default, which eliminates the need to copy the
heavyweight `ObjectMapper`. Introduced a configuration option to allow
administrators to explicitly enable backwards compatibility.
#12468
### # Updated IPAddress Java library
Added a new IPAddress Java library dependency to handle IP addresses. The
library includes IPv6 support. Additionally, migrated IPv4 functions to use
the new library.
#11634
### # Query performance improvements
Optimized SQL operations and functions as follows:
* Vectorized numeric latest aggregators (#12439)
* Optimized `isEmpty()` and `equals()` on RangeSets (#12477)
* Optimized reuse of Yielder objects (#12475)
* Operations on numeric columns with indexes are now faster (#12830)
* Optimized GroupBy by reducing allocations. Reduced allocations by reusing entry and key holders (#12474)
* Added a vectorized version of string last aggregator (#12493)
* Added Direct UTF-8 access for IN filters (#12517)
* Enabled virtual columns to cache their outputs in case Druid calls them multiple times on the same underlying row (#12577)
* Druid now rewrites a join as a filter when possible in IN joins (#12225)
* Added automatic sizing for GroupBy dictionaries (#12763)
* Druid now distributes JDBC connections more evenly amongst brokers (#12817)
## # Streaming ingestion
### # Kafka consumers
Previously, consumer groups that were registered for ingestion persisted until
Kafka deleted them, even though they were only used to verify that an entire
topic was consumed. Consumer groups no longer linger.
#12842
### # Kinesis ingestion
You can now perform Kinesis ingestion even if there are empty shards.
Previously, all shards had to have at least one record.
#12792
## # Batch ingestion
### # Batch ingestion from S3
You can now ingest data from endpoints that are different from your default S3
endpoint and signing region.
For more information, see S3 config.
#11798
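For instance, the S3 runtime properties might look like the following (the endpoint URL and region are placeholders for your deployment):

```
druid.s3.endpoint.url=https://s3.us-west-1.example.com
druid.s3.endpoint.signingRegion=us-west-1
```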
## # Improvements to ingestion in general
This release includes the following improvements for ingestion in general.
### # Increased robustness for task management
Added `setNumProcessorsPerTask` to prevent various automatically-sized thread
pools from becoming unreasonably large. It isn't ideal for each task to size
its pools as if it is the only process on the entire machine. On large
machines, this solves a common cause of `OutOfMemoryError` due to "unable to
create native thread".
#12592
### # Avatica JDBC driver
The JDBC driver now follows the JDBC standard and uses two kinds of
statements, Statement and PreparedStatement.
#12709
### # Eight hour granularity
Druid now accepts the `EIGHT_HOUR` granularity. You can segment incoming data
to `EIGHT_HOUR` buckets as well as group query results by eight hour
granularity.
#12717
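As a sketch, the granularity section of an ingestion spec could use the new value like this (a fragment of a typical `granularitySpec`, not a complete spec):

```json
"granularitySpec": {
  "type": "uniform",
  "segmentGranularity": "EIGHT_HOUR",
  "queryGranularity": "none"
}
```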
## # Ingestion general
### # Updated Avro extension
The previous Avro extension leaked objects from the parser. If these objects
leaked into your ingestion, they were stored as string columns whose value was
the object's `.toString()` output. These string columns remain after you
upgrade but will return `Map.toString()` instead of `GenericRecord.toString()`.
If you relied on the previous behavior, you can use the Avro extension from an
earlier release.
#12828
### # Sampler API
The sampler API has additional limits: `maxBytesInMemory` and
`maxClientResponseBytes`. These options augment the existing options `numRows`
and `timeoutMs`. `maxBytesInMemory` can be used to control the memory usage on
the Overlord while sampling. `maxClientResponseBytes` can be used by clients
to specify the maximum size of response they would prefer to handle.
#12947
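As an illustrative fragment, a sampler request body might combine the existing and new limits like this (the specific values are placeholders):

```json
"samplerConfig": {
  "numRows": 500,
  "timeoutMs": 10000,
  "maxBytesInMemory": 100000000,
  "maxClientResponseBytes": 10000000
}
```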
## # SQL
### # Column order
The `DruidSchema` and `SegmentMetadataQuery` properties now preserve column
order instead of ordering columns alphabetically. This means that query order
better matches ingestion order.
#12754
### # Converting JOINs to filter
You can improve performance by pushing JOINs partially or fully to the base
table as a filter at runtime by setting the `enableRewriteJoinToFilter`
context parameter to `true` for a query.
Druid now pushes down join filters when the query computing the join
references columns from the right side of the join.
#12749
#12868
### # Add is_active to sys.segments
Added `is_active` as shorthand for `(is_published = 1 AND is_overshadowed = 0)
OR is_realtime = 1`. This represents "all the segments that should be
queryable, whether or not they actually are right now".
#11550
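For example, instead of spelling out the full predicate, you can now write:

```sql
-- Count queryable segments per datasource using the new shorthand column.
SELECT datasource, COUNT(*) AS active_segments
FROM sys.segments
WHERE is_active = 1
GROUP BY datasource
```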
### # `useNativeQueryExplain` now defaults to true
The `useNativeQueryExplain` property now defaults to `true`. This means that
EXPLAIN PLAN FOR returns the explain plan as a JSON representation of
equivalent native query(s) by default. For more information, see Broker
Generated Query Configuration Supplementation.
#12936
### # Running queries with inline data using druid query engine
Some queries that do not refer to any table, such as `select 1`, are now
always translated to a native Druid query with `InlineDataSource` before
execution. If translation is not possible, as for queries such as `SELECT (1,
2)`, an error occurs. In earlier versions, such a query would still run.
#12897
## # Coordinator/Overlord
### # You can configure the Coordinator to kill segments in the future
You can now set `druid.coordinator.kill.durationToRetain` to a negative period
to configure the Druid cluster to kill segments whose `interval_end` is a date
in the future. For example, PT-24H would allow segments to be killed if their
interval_end date was 24 hours or less into the future at the time that the
kill task is generated by the system.
A cluster operator can also disregard the
`druid.coordinator.kill.durationToRetain` entirely by setting a new
configuration, `druid.coordinator.kill.ignoreDurationToRetain=true`. This
ignores `interval_end` date when looking for segments to kill, and can instead
kill any segment marked unused. This new configuration is turned off by
default, and a cluster operator should fully understand and accept the risks
before enabling it.
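The effect of a negative `durationToRetain` can be sketched as follows (an illustrative model of the cutoff arithmetic, not Druid's implementation):

```python
from datetime import datetime, timedelta, timezone

def may_kill(interval_end: datetime, now: datetime,
             duration_to_retain: timedelta) -> bool:
    """An unused segment is eligible for a kill task when its interval_end
    is at or before (now - durationToRetain). A negative retention moves
    the cutoff into the future, so future-dated segments become eligible."""
    return interval_end <= now - duration_to_retain

now = datetime(2022, 9, 1, tzinfo=timezone.utc)

# PT-24H: the cutoff is 24 hours in the future, so a segment ending
# 12 hours from now is already eligible to be killed.
print(may_kill(now + timedelta(hours=12), now, timedelta(hours=-24)))  # True
# With a positive retention such as P90D, it is not.
print(may_kill(now + timedelta(hours=12), now, timedelta(days=90)))    # False
```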
### # Improved Overlord stability
Reduced contention between the management thread and the reception of status
updates from the cluster. This improves the stability of Overlord and all
tasks in a cluster when there are large (1000+) task counts.
#12099
### # Improved Coordinator segment logging
Updated Coordinator load rule logging to include current replication levels.
Added missing segment ID and tier information from some of the log messages.
#12511
### # Optimized overlord GET tasks memory usage
Addressed the significant memory overhead caused by the web-console indirectly
calling the Overlord’s GET tasks API. This could cause unresponsiveness or
Overlord failure when the ingestion tab was opened multiple times.
#12404
### # Reduced time to create intervals
To reduce segment cost computation time, Druid now stores the segment interval
instead of recreating it each time from primitives, and reduces the memory
overhead of storing intervals by interning them. The set of intervals for
segments is low in cardinality.
#12670
## # Brokers/Overlord
Brokers now have a default limit of 25 MB of queued bytes per query.
Previously, there was no default limit. Depending on your use case, you may need to increase the
value, especially if you have large result sets or large amounts of
intermediate data. To adjust the maximum memory available, use the
`druid.broker.http.maxQueuedBytes` property.
For more information, see Configuration reference.
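For example, to raise the limit to 50 MB (the value is in bytes and is illustrative):

```
druid.broker.http.maxQueuedBytes=52428800
```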
## # Web console
> Prepare to have your Web Console experience elevated! - @vogievetsky
### # New query view (WorkbenchView) with tabs and long running query support

You can use the new query view to execute multi-stage, task-based queries with
the /druid/v2/sql/task and /druid/indexer/v1/task/* APIs, as well as native
and SQL-native queries, just like the old Query view. A key point of the
task-based multi-stage queries is that they may run for a long time. This
inspired many UX changes, including but not limited to the following:
#### # Tabs
You can now have many queries stored and running at the same time,
significantly improving the query view UX.

You can open several tabs, duplicate them, and copy them as text to paste into
any console and reopen there.
#### # Progress reports (counter reports)
Queries run with the multi-stage query task engine show detailed progress
reports in the summary progress bar and in the detailed execution table, which
provides summaries of the counters for every step.

#### # Error and warning reports
Queries run with the multi-stage query task engine present user friendly
warnings and errors should anything go wrong.
The new query view has components to visualize these with their full detail
including a stack-trace.

#### # Recent query tasks panel
Queries run with the multi-stage query task engine are tasks. This makes it
possible to show queries that are executing currently and that have executed
in the recent past.

For any query in the Recent query tasks panel you can view the execution
details for it and you can also attach it as a new tab and continue iterating
on the query. It is also possible to download the "query detail archive", a
JSON file containing all the important details for a given query to use for
troubleshooting.
#### # Connect external data flow
The Connect external data flow lets you use the sampler to sample your source
data, determine its schema, and generate a fully formed SQL query that you can
edit to fit your use case before you launch your ingestion job. This point-
and-click flow saves you a lot of typing.

#### # Preview button
The Preview button appears when you type in an INSERT or REPLACE SQL query.
Click the button to remove the INSERT or REPLACE clause and execute your query
as an "inline" query with a limit. This gives you a sense of the shape of your
data after Druid applies all the transformations from your SQL query.
#### # Results table
The query results table has been improved in style and function. It now shows
you type icons for the column types and supports the ability to manipulate
nested columns with ease.
#### # Helper queries
The Web Console now has some UI affordances for notebook and CTE users. You
can reference helper queries, collapsible elements that hold a query, from the
main query just as if they were defined with a WITH statement. When you are
composing a complicated query, it is helpful to break it down into multiple
queries to preview the parts individually.
#### # Additional Web Console tools
More tools are available from the ... menu:
* Explain query - show the query plan for sql-native and multi-stage query task engine queries.
* Convert ingestion spec to SQL - Helps you migrate your native batch and Hadoop based specs to the SQL-based format.
* Open query detail archive - lets you open a query detail archive downloaded earlier.
* Load demo queries - lets you load a set of pre-made queries to play around with multi-stage query task engine functionality.
### # New SQL-based data loader
The data loader exists as a GUI wizard to help users craft a JSON ingestion
spec using point and click and quick previews. The SQL data loader is the SQL-
based ingestion analog of that.
Like the native based data loader, the SQL-based data loader stores all the
state in the SQL query itself. You can opt to manipulate the query directly at
any stage. See (#12919) for more information about how the data loader differs
from the **Connect external data** workflow.
### # Other changes and improvements
* The query view has so much new functionality that it has moved to the far left as the first view available in the header.
* You can now click on a datasource or segment to see a preview of the data within.
* The task table now explicitly shows if a task has been canceled in a different color than a failed task.
* The user experience when you view a JSON payload in the Druid console has been improved. There’s now syntax highlighting and a search.
* The Druid console can now use the column order returned by a scan query to determine the column order for reindexing data.
* The way errors are displayed in the Druid console has been improved. Errors no longer appear as a single long line.
See (#12919) for more details and other improvements
## # Metrics
### # Sysmonitor stats for Peons
Sysmonitor stats, like memory or swap, are no longer reported for Peons since
Peons always run on the same host as MiddleManagers. This means that duplicate
stats are no longer reported.
#12802
### # Prometheus
You can now include the host and service as labels for Prometheus by setting
the following properties to true:
* `druid.emitter.prometheus.addHostAsLabel`
* `druid.emitter.prometheus.addServiceAsLabel`
#12769
### # Rows per segment
(Experimental) You can now see the average number of rows in a segment and the
distribution of segments in predefined buckets with the following metrics:
`segment/rowCount/avg` and `segment/rowCount/range/count`.
Enable the metrics by adding the following monitor to the
`druid.monitoring.monitors` property:
`org.apache.druid.server.metrics.SegmentStatsMonitor`
#12730
### # New `sqlQuery/planningTimeMs` metric
There’s a new `sqlQuery/planningTimeMs` metric for SQL queries that computes
the time it takes to build a native query from a SQL query.
#12923
### # StatsD metrics reporter
The StatsD metrics reporter extension now includes the following metrics:
* coordinator/time
* coordinator/global/time
* tier/required/capacity
* tier/total/capacity
* tier/replication/factor
* tier/historical/count
* compact/task/count
* compactTask/maxSlot/count
* compactTask/availableSlot/count
* segment/waitCompact/bytes
* segment/waitCompact/count
* interval/waitCompact/count
* segment/skipCompact/bytes
* segment/skipCompact/count
* interval/skipCompact/count
* segment/compacted/bytes
* segment/compacted/count
* interval/compacted/count
#12762
### # New worker level task metrics
Added a new monitor, `WorkerTaskCountStatsMonitor`, that allows each
MiddleManager worker to report metrics for successful and failed tasks, and
for task slot usage.
#12446
### # Improvements to the JvmMonitor
The JvmMonitor can now handle more generation and collector scenarios. The
monitor is more robust and works properly for ZGC on both Java 11 and 15.
#12469
### # Garbage collection
Garbage collection metrics now use MXBeans.
#12481
### # Metric for task duration in the pending queue
Introduced the metric `task/pending/time` to measure how long a task stays in
the pending queue.
#12492
### # Emit metrics object for Scan, Timeseries, and GroupBy queries during cursor creation
Added a vectorized metric for Scan, Timeseries, and GroupBy queries.
#12484
### # Emit state of replace and append for native batch tasks
Druid now emits metrics so you can monitor and assess the use of different
types of batch ingestion, in particular replace and tombstone creation.
#12488
#12840
### # KafkaEmitter emits `queryType`
The KafkaEmitter now properly emits the `queryType` property for native
queries.
#12915
## # Security
You can now hide properties that are sensitive in the API response from
`/status/properties`, such as S3 access keys. Use the
`druid.server.hiddenProperties` property in `common.runtime.properties` to
specify the properties (case insensitive) you want to hide.
#12950
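For example, in `common.runtime.properties` (the property names shown are typical secrets; adjust the list to your deployment):

```
druid.server.hiddenProperties=["druid.s3.accessKey","druid.s3.secretKey","druid.metadata.storage.connector.password"]
```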
## # Other changes
* You can now configure the retention period for request logs stored on disk with the `druid.request.logging.durationToRetain` property. Set the retention period to be longer than `P1D` (#12559)
* You can now specify liveness and readiness probe delays for the historical StatefulSet in your values.yaml file. The default is 60 seconds (#12805)
* Improved exception message for native binary operators (#12335)
* Improved error messages when URI points to a file that doesn't exist (#12490)
* Improved build performance of modules (#12486)
* Improved lookups made using the druid-kafka-extraction-namespace extension to handle records that have been deleted from a kafka topic (#12819)
* Updated core Apache Kafka dependencies to 3.2.0 (#12538)
* Updated ORC to 1.7.5 (#12667)
* Updated Jetty to 9.4.41.v20210516 (#12629)
* Added `Zstandard` compression library to `CompressionStrategy` (#12408)
* Updated the default gzip buffer size to 8 KB for improved performance (#12579)
* Updated the default `inputSegmentSizeBytes` in Compaction configuration to 100,000,000,000,000 (~100TB)
# # Bug fixes
Druid 24.0 contains over 68 bug fixes. You can find the complete list here
# # Upgrading to 24.0
## # Permissions for multi-stage query engine
To read external data using the multi-stage query task engine, you must have
READ permissions for the EXTERNAL resource type. Users without the correct
permission encounter a 403 error when trying to run SQL queries that include
EXTERN.
The way you assign the permission depends on your authorizer. For example,
with [basic security](/docs/development/extensions-core/druid-basic-security.md)
in Druid, add the `EXTERNAL READ` permission by sending a `POST`
request to the roles API.
The example adds permissions for users with the `admin` role using a basic
authorizer named `MyBasicMetadataAuthorizer`. The following permissions are
granted:
* DATASOURCE READ
* DATASOURCE WRITE
* CONFIG READ
* CONFIG WRITE
* STATE READ
* STATE WRITE
* EXTERNAL READ
curl --location --request POST 'http://localhost:8081/druid-ext/basic-security/authorization/db/MyBasicMetadataAuthorizer/roles/admin/permissions' \
--header 'Content-Type: application/json' \
--data-raw '[
{
"resource": {
"name": ".*",
"type": "DATASOURCE"
},
"action": "READ"
},
{
"resource": {
"name": ".*",
"type": "DATASOURCE"
},
"action": "WRITE"
},
{
"resource": {
"name": ".*",
"type": "CONFIG"
},
"action": "READ"
},
{
"resource": {
"name": ".*",
"type": "CONFIG"
},
"action": "WRITE"
},
{
"resource": {
"name": ".*",
"type": "STATE"
},
"action": "READ"
},
{
"resource": {
"name": ".*",
"type": "STATE"
},
"action": "WRITE"
},
{
"resource": {
"name": "EXTERNAL",
"type": "EXTERNAL"
},
"action": "READ"
}
]'
## # Behavior for unused segments
Druid now automatically retains any segments marked as unused. Previously,
Druid permanently deleted unused segments from the metadata store and deep
storage after their retention duration passed. This reverts a behavior
introduced in `0.23.0`.
#12693
## # Default for `druid.processing.fifo`
The default for `druid.processing.fifo` is now true. This means that tasks of
equal priority are treated in a FIFO manner. For most use cases, this change
can improve performance on heavily loaded clusters.
#12571
## # Update to JDBC statement closure
In previous releases, Druid automatically closed the JDBC Statement when the
ResultSet was closed, closed the ResultSet on EOF, and closed the statement on
any exception. This behavior is, however, non-standard.
In this release, Druid's JDBC driver follows the JDBC standards more closely:
The ResultSet closes automatically on EOF, but does not close the Statement or
PreparedStatement. Your code must close these statements, perhaps by using a
try-with-resources block.
The PreparedStatement can now be used multiple times with different
parameters. (Previously this was not true since closing the ResultSet closed
the PreparedStatement.)
If any call to a Statement or PreparedStatement raises an error, the client
code must still explicitly close the statement. According to the JDBC
standards, statements are not closed automatically on errors. This allows you
to obtain information about a failed statement before closing it.
If you have code that depended on the old behavior, you may have to change
your code to add the required close statement.
#12709
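A minimal sketch of the new pattern with the Avatica JDBC driver follows (the connection URL, datasource, and column names are placeholders for your Broker and data; this assumes a reachable Druid cluster):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class DruidJdbcExample {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/";
        // try-with-resources closes the Connection, PreparedStatement, and
        // ResultSet explicitly; the driver no longer closes the statement
        // for you when the ResultSet hits EOF or when an error occurs.
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT COUNT(*) FROM wikipedia WHERE channel = ?")) {
            stmt.setString(1, "#en.wikipedia");
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1));
                }
            }
            // The PreparedStatement can now be reused with new parameters,
            // because closing the ResultSet no longer closes the statement.
            stmt.setString(1, "#fr.wikipedia");
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong(1));
                }
            }
        }
    }
}
```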
## # Known issues
## # Credits
@2bethere
@317brian
@a2l007
@abhagraw
@abhishekagarwal87
@abhishekrb19
@adarshsanjeev
@aggarwalakshay
@AmatyaAvadhanula
@BartMiki
@capistrant
@chenrui333
@churromorales
@clintropolis
@cloventt
@CodingParsley
@cryptoe
@dampcake
@dependabot[bot]
@dherg
@didip
@dongjoon-hyun
@ektravel
@EsoragotoSpirit
@exherb
@FrankChen021
@gianm
@hellmarbecker
@hwball
@iandr413
@imply-cheddar
@jarnoux
@jasonk000
@jihoonson
@jon-wei
@kfaraz
@LakshSingla
@liujianhuanzz
@liuxiaohui1221
@lmsurpre
@loquisgon
@machine424
@maytasm
@MC-JY
@Mihaylov93
@nishantmonu51
@paul-rogers
@petermarshallio
@pjfanning
@rockc2020
@rohangarg
@somu-imply
@suneet-s
@superivaj
@techdocsmith
@tejaswini-imply
@TSFenwick
@vimil-saju
@vogievetsky
@vtlim
@williamhyun
@wiquan
@writer-jill
@xvrl
@yuanlihan
@zachjsh
@zemin-piao
| 0 |
# Checklist
* I have verified that the issue exists against the `master` branch of Celery.
* This has already been asked to the discussion group first.
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
* I have checked the pull requests list
for existing proposed fixes.
* I have checked the commit log
to find out if the bug was already fixed in the master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
* I have included the contents of `pip freeze` in the issue.
* I have included all the versions of all the external dependencies required
to reproduce this bug
#### Related Issues
#4116
#4292
# Environment & Settings
**Celery version**: 4.3.0 (rhubarb)
**`celery report` Output:**
software -> celery:4.3.0 (rhubarb) kombu:4.5.0 py:3.8.2
billiard:3.6.3.0 py-amqp:2.5.2
platform -> system:Linux arch:64bit, ELF
kernel version:4.15.0-88-generic imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:amqp results:amqp://guest:**@localhost//
ABSOLUTE_URL_OVERRIDES: {
}
ACCOUNT_ADAPTER: 'lancium_box.users.adapters.AccountAdapter'
ACCOUNT_ALLOW_REGISTRATION: True
ACCOUNT_AUTHENTICATION_METHOD: 'username'
ACCOUNT_EMAIL_REQUIRED: True
ACCOUNT_EMAIL_VERIFICATION: 'mandatory'
ACCOUNT_FORMS: {
'login': 'lancium_box.users.forms.CustomLoginForm'}
ADMINS: [('Vitor de Miranda Henrique',
'vitor.henrique@lancium.com')]
ADMIN_URL: 'admin/'
ALLOWED_HOSTS: []
APPEND_SLASH: True
APPS_DIR: Path:/home/skywalker/lancium_box/lancium_box
AUTHENTICATION_BACKENDS:
['lancium_box.users.auth_backends.CustomAuthenticationBackend',
'django.contrib.auth.backends.ModelBackend',
'allauth.account.auth_backends.AuthenticationBackend']
AUTH_PASSWORD_VALIDATORS: ' **'
AUTH_USER_MODEL: 'users.User'
CACHES: {
'default': { 'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': 'redis://127.0.0.1:6379/1',
'OPTIONS': { 'CLIENT_CLASS': 'django_redis.client.DefaultClient',
'IGNORE_EXCEPTIONS': True}}}
CACHE_MIDDLEWARE_ALIAS: 'default'
CACHE_MIDDLEWARE_KEY_PREFIX: '**'
CACHE_MIDDLEWARE_SECONDS: 600
CELERY_ACCEPT_CONTENT: ['json']
CELERY_BROKER_URL: 'amqp://guest: **@localhost:5672//'
CELERY_IMPORTS: ['lancium_box.monitoring.tasks']
CELERY_RESULT_BACKEND: 'amqp://guest:**@localhost//'
CELERY_RESULT_SERIALIZER: 'json'
CELERY_TASK_SERIALIZER: 'json'
CELERY_TASK_SOFT_TIME_LIMIT: 60
CELERY_TASK_TIME_LIMIT: 300
CELERY_TIMEZONE: 'UTC'
COMPRESSORS: {
'css': 'compressor.css.CssCompressor', 'js': 'compressor.js.JsCompressor'}
COMPRESS_CACHEABLE_PRECOMPILERS:
()
COMPRESS_CACHE_BACKEND: 'default'
COMPRESS_CACHE_KEY_FUNCTION: ' **'
COMPRESS_CLEAN_CSS_ARGUMENTS: ''
COMPRESS_CLEAN_CSS_BINARY: 'cleancss'
COMPRESS_CLOSURE_COMPILER_ARGUMENTS: ''
COMPRESS_CLOSURE_COMPILER_BINARY: 'java -jar compiler.jar'
COMPRESS_CSS_HASHING_METHOD: 'mtime'
COMPRESS_DATA_URI_MAX_SIZE: 1024
COMPRESS_DEBUG_TOGGLE: None
COMPRESS_ENABLED: True
COMPRESS_FILTERS: {
'css': ['compressor.filters.css_default.CssAbsoluteFilter'],
'js': ['compressor.filters.jsmin.JSMinFilter']}
COMPRESS_JINJA2_GET_ENVIRONMENT: <function
CompressorConf.JINJA2_GET_ENVIRONMENT at 0x7f3a8e76fee0>
COMPRESS_MINT_DELAY: 30
COMPRESS_MTIME_DELAY: 10
COMPRESS_OFFLINE: True
COMPRESS_OFFLINE_CONTEXT: {
'STATIC_URL': '/static/'}
COMPRESS_OFFLINE_MANIFEST: 'manifest.json'
COMPRESS_OFFLINE_TIMEOUT: 31536000
COMPRESS_OUTPUT_DIR: 'CACHE'
COMPRESS_PARSER: 'compressor.parser.AutoSelectParser'
COMPRESS_PRECOMPILERS:
()
COMPRESS_REBUILD_TIMEOUT: 2592000
COMPRESS_ROOT: '/home/skywalker/lancium_box/staticfiles'
COMPRESS_STORAGE: 'compressor.storage.GzipCompressorFileStorage'
COMPRESS_TEMPLATE_FILTER_CONTEXT: {
'STATIC_URL': '/static/'}
COMPRESS_URL: '/static/'
COMPRESS_URL_PLACEHOLDER: '/compressor_url_placeholder/'
COMPRESS_VERBOSE: False
COMPRESS_YUGLIFY_BINARY: 'yuglify'
COMPRESS_YUGLIFY_CSS_ARGUMENTS: '--terminal'
COMPRESS_YUGLIFY_JS_ARGUMENTS: '--terminal'
COMPRESS_YUI_BINARY: 'java -jar yuicompressor.jar'
COMPRESS_YUI_CSS_ARGUMENTS: ''
COMPRESS_YUI_JS_ARGUMENTS: ''
CRISPY_TEMPLATE_PACK: 'bootstrap4'
CSRF_COOKIE_AGE: 31449600
CSRF_COOKIE_DOMAIN: None
CSRF_COOKIE_HTTPONLY: True
CSRF_COOKIE_NAME: 'csrftoken'
CSRF_COOKIE_PATH: '/'
CSRF_COOKIE_SAMESITE: 'Lax'
CSRF_COOKIE_SECURE: False
CSRF_FAILURE_VIEW: 'django.views.csrf.csrf_failure'
CSRF_HEADER_NAME: 'HTTP_X_CSRFTOKEN'
CSRF_TRUSTED_ORIGINS: []
CSRF_USE_SESSIONS: False
DATABASES: {
'default': { 'ATOMIC_REQUESTS': True,
'AUTOCOMMIT': True,
'CONN_MAX_AGE': None,
'ENGINE': 'django.db.backends.postgresql',
'HOST': '10.2.40.205',
'NAME': 'lanciumdb',
'OPTIONS': {},
'PASSWORD': '**',
'PORT': 5432,
'TEST': { 'CHARSET': None,
'COLLATION': None,
'MIRROR': None,
'NAME': None},
'TIME_ZONE': None,
'USER': 'lancium_user'}}
DATABASE_ROUTERS: ' **'
DATACENTER_ROOM: None
DATA_UPLOAD_MAX_MEMORY_SIZE: 2621440
DATA_UPLOAD_MAX_NUMBER_FIELDS: 1000
DATETIME_FORMAT: 'N j, Y, P'
DATETIME_INPUT_FORMATS: ['%Y-%m-%d %H:%M:%S',
'%Y-%m-%d %H:%M:%S.%f',
'%Y-%m-%d %H:%M',
'%Y-%m-%d',
'%m/%d/%Y %H:%M:%S',
'%m/%d/%Y %H:%M:%S.%f',
'%m/%d/%Y %H:%M',
'%m/%d/%Y',
'%m/%d/%y %H:%M:%S',
'%m/%d/%y %H:%M:%S.%f',
'%m/%d/%y %H:%M',
'%m/%d/%y']
DATE_FORMAT: 'N j, Y'
DATE_INPUT_FORMATS: ['%Y-%m-%d',
'%m/%d/%Y',
'%m/%d/%y',
'%b %d %Y',
'%b %d, %Y',
'%d %b %Y',
'%d %b, %Y',
'%B %d %Y',
'%B %d, %Y',
'%d %B %Y',
'%d %B, %Y']
DEBUG: False
DEBUG_PROPAGATE_EXCEPTIONS: False
DECIMAL_SEPARATOR: '.'
DEFAULT_CHARSET: 'utf-8'
DEFAULT_CONTENT_TYPE: 'text/html'
DEFAULT_EXCEPTION_REPORTER_FILTER:
'django.views.debug.SafeExceptionReporterFilter'
DEFAULT_FILE_STORAGE: 'django.core.files.storage.FileSystemStorage'
DEFAULT_FROM_EMAIL: 'Smart Response noreply@lancium.com'
DEFAULT_INDEX_TABLESPACE: ''
DEFAULT_TABLESPACE: ''
DISABLE_DASHBOARD: False
DISALLOWED_USER_AGENTS: []
DJANGO_APPS: ['django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.sites',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.humanize',
'django.contrib.admin']
EMAIL_BACKEND: 'django.core.mail.backends.smtp.EmailBackend'
EMAIL_HOST: 'smtp.gmail.com'
EMAIL_PORT: 587
EMAIL_SSL_CERTFILE: None
EMAIL_SSL_KEYFILE: '**'
EMAIL_SUBJECT_PREFIX: '[Smart Response]'
EMAIL_TIMEOUT: 10
EMAIL_USE_LOCALTIME: False
EMAIL_USE_SSL: False
EMAIL_USE_TLS: True
FILE_CHARSET: 'utf-8'
FILE_UPLOAD_DIRECTORY_PERMISSIONS: None
FILE_UPLOAD_HANDLERS:
['django.core.files.uploadhandler.MemoryFileUploadHandler',
'django.core.files.uploadhandler.TemporaryFileUploadHandler']
FILE_UPLOAD_MAX_MEMORY_SIZE: 2621440
FILE_UPLOAD_PERMISSIONS: None
FILE_UPLOAD_TEMP_DIR: None
FIRST_DAY_OF_WEEK: 0
FIXTURE_DIRS:
('/home/skywalker/lancium_box/lancium_box/fixtures',)
FORCE_SCRIPT_NAME: None
FORMAT_MODULE_PATH: None
FORM_RENDERER: 'django.forms.renderers.DjangoTemplates'
IGNORABLE_404_URLS: []
INSTALLED_APPS: ['django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.sites',
'django.contrib.messages',
'django.contrib.staticfiles',
'django.contrib.humanize',
'django.contrib.admin',
'crispy_forms',
'allauth',
'allauth.account',
'allauth.socialaccount',
'rest_framework',
'rest_framework.authtoken',
'django_celery_beat',
'sekizai',
'lancium_box.users.apps.UsersConfig',
'lancium_box.datacenter.apps.DatacenterConfig',
'lancium_box.monitoring.apps.MonitoringConfig',
'lancium_box.cryptocurrency.apps.CryptocurrencyConfig',
'lancium_box.dashboard.apps.DashboardConfig',
'lancium_box.remotebox.apps.RemoteboxConfig',
'lancium_box.api.apps.ApiConfig',
'lancium_box.event.apps.EventConfig',
'lancium_box.control.apps.ControlConfig',
'lancium_box.demand_response.apps.DemandResponseConfig',
'lancium_box.inventory.apps.InventoryConfig',
'compressor']
INTERNAL_IPS: []
LANGUAGES: [('af', 'Afrikaans'),
('ar', 'Arabic'),
('ast', 'Asturian'),
('az', 'Azerbaijani'),
('bg', 'Bulgarian'),
('be', 'Belarusian'),
('bn', 'Bengali'),
('br', 'Breton'),
('bs', 'Bosnian'),
('ca', 'Catalan'),
('cs', 'Czech'),
('cy', 'Welsh'),
('da', 'Danish'),
('de', 'German'),
('dsb', 'Lower Sorbian'),
('el', 'Greek'),
('en', 'English'),
('en-au', 'Australian English'),
('en-gb', 'British English'),
('eo', 'Esperanto'),
('es', 'Spanish'),
('es-ar', 'Argentinian Spanish'),
('es-co', 'Colombian Spanish'),
('es-mx', 'Mexican Spanish'),
('es-ni', 'Nicaraguan Spanish'),
('es-ve', 'Venezuelan Spanish'),
('et', 'Estonian'),
('eu', 'Basque'),
('fa', 'Persian'),
('fi', 'Finnish'),
('fr', 'French'),
('fy', 'Frisian'),
('ga', 'Irish'),
('gd', 'Scottish Gaelic'),
('gl', 'Galician'),
('he', 'Hebrew'),
('hi', 'Hindi'),
('hr', 'Croatian'),
('hsb', 'Upper Sorbian'),
('hu', 'Hungarian'),
('hy', 'Armenian'),
('ia', 'Interlingua'),
('id', 'Indonesian'),
('io', 'Ido'),
('is', 'Icelandic'),
('it', 'Italian'),
('ja', 'Japanese'),
('ka', 'Georgian'),
('kab', 'Kabyle'),
('kk', 'Kazakh'),
('km', 'Khmer'),
('kn', 'Kannada'),
('ko', 'Korean'),
('lb', 'Luxembourgish'),
('lt', 'Lithuanian'),
('lv', 'Latvian'),
('mk', 'Macedonian'),
('ml', 'Malayalam'),
('mn', 'Mongolian'),
('mr', 'Marathi'),
('my', 'Burmese'),
('nb', 'Norwegian Bokmål'),
('ne', 'Nepali'),
('nl', 'Dutch'),
('nn', 'Norwegian Nynorsk'),
('os', 'Ossetic'),
('pa', 'Punjabi'),
('pl', 'Polish'),
('pt', 'Portuguese'),
('pt-br', 'Brazilian Portuguese'),
('ro', 'Romanian'),
('ru', 'Russian'),
('sk', 'Slovak'),
('sl', 'Slovenian'),
('sq', 'Albanian'),
('sr', 'Serbian'),
('sr-latn', 'Serbian Latin'),
('sv', 'Swedish'),
('sw', 'Swahili'),
('ta', 'Tamil'),
('te', 'Telugu'),
('th', 'Thai'),
('tr', 'Turkish'),
('tt', 'Tatar'),
('udm', 'Udmurt'),
('uk', 'Ukrainian'),
('ur', 'Urdu'),
('vi', 'Vietnamese'),
('zh-hans', 'Simplified Chinese'),
('zh-hant', 'Traditional Chinese')]
LANGUAGES_BIDI: ['he', 'ar', 'fa', 'ur']
LANGUAGE_CODE: 'en-us'
LANGUAGE_COOKIE_AGE: None
LANGUAGE_COOKIE_DOMAIN: None
LANGUAGE_COOKIE_NAME: 'django_language'
LANGUAGE_COOKIE_PATH: '/'
LOCALE_PATHS: [Path:/home/skywalker/lancium_box/locale]
LOCAL_APPS: ['lancium_box.users.apps.UsersConfig',
'lancium_box.datacenter.apps.DatacenterConfig',
'lancium_box.monitoring.apps.MonitoringConfig',
'lancium_box.cryptocurrency.apps.CryptocurrencyConfig',
'lancium_box.dashboard.apps.DashboardConfig',
'lancium_box.remotebox.apps.RemoteboxConfig',
'lancium_box.api.apps.ApiConfig',
'lancium_box.event.apps.EventConfig',
'lancium_box.control.apps.ControlConfig',
'lancium_box.demand_response.apps.DemandResponseConfig',
'lancium_box.inventory.apps.InventoryConfig']
LOGGING: {
'disable_existing_loggers': False,
'filters': { 'require_debug_false': { '()':
'django.utils.log.RequireDebugFalse'}},
'formatters': { 'verbose': { 'format': '%(levelname)s %(asctime)s '
'%(module)s %(process)d '
'%(thread)d %(message)s'}},
'handlers': { 'console': { 'class': 'logging.StreamHandler',
'formatter': 'verbose',
'level': 'DEBUG'},
'mail_admins': { 'class': 'django.utils.log.AdminEmailHandler',
'filters': ['require_debug_false'],
'level': 'ERROR'}},
'loggers': { 'django.request': { 'handlers': ['mail_admins'],
'level': 'ERROR',
'propagate': True},
'django.security.DisallowedHost': { 'handlers': [ 'console',
'mail_admins'],
'level': 'ERROR',
'propagate': True}},
'root': {'handlers': ['console'], 'level': 'INFO'},
'version': 1}
LOGGING_CONFIG: 'logging.config.dictConfig'
LOGIN_REDIRECT_URL: 'users:redirect'
LOGIN_URL: 'account_login'
LOGOUT_REDIRECT_URL: None
MANAGERS: [('Vitor de Miranda Henrique',
'vitor.henrique@lancium.com')]
MEDIA_ROOT: '/home/skywalker/lancium_box/lancium_box/media'
MEDIA_URL: '/media/'
MESSAGE_STORAGE: 'django.contrib.messages.storage.fallback.FallbackStorage'
MIDDLEWARE: ['whitenoise.middleware.WhiteNoiseMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.locale.LocaleMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware']
MIGRATION_MODULES: {
'sites': 'lancium_box.contrib.sites.migrations'}
MONTH_DAY_FORMAT: 'F j'
NUMBER_GROUPING: 0
PASSWORD_HASHERS: ' **'
PASSWORD_RESET_TIMEOUT_DAYS: '**'
PREPEND_WWW: False
READ_DOT_ENV_FILE: False
REST_FRAMEWORK: {
'DEFAULT_AUTHENTICATION_CLASSES': [
'rest_framework.authentication.TokenAuthentication']}
ROOT_DIR: Path:/home/skywalker/lancium_box
ROOT_URLCONF: 'config.urls'
SECRET_KEY: '********'
SECURE_BROWSER_XSS_FILTER: True
SECURE_CONTENT_TYPE_NOSNIFF: False
SECURE_HSTS_INCLUDE_SUBDOMAINS: False
SECURE_HSTS_PRELOAD: False
SECURE_HSTS_SECONDS: 0
SECURE_PROXY_SSL_HEADER: None
SECURE_REDIRECT_EXEMPT: []
SECURE_SSL_HOST: None
SECURE_SSL_REDIRECT: False
SERVER_EMAIL: 'Smart Response noreply@lancium.com'
SESSION_CACHE_ALIAS: 'default'
SESSION_COOKIE_AGE: 1209600
SESSION_COOKIE_DOMAIN: None
SESSION_COOKIE_HTTPONLY: True
SESSION_COOKIE_NAME: 'sessionid'
SESSION_COOKIE_PATH: '/'
SESSION_COOKIE_SAMESITE: 'Lax'
SESSION_COOKIE_SECURE: False
SESSION_ENGINE: 'django.contrib.sessions.backends.db'
SESSION_EXPIRE_AT_BROWSER_CLOSE: False
SESSION_FILE_PATH: None
SESSION_SAVE_EVERY_REQUEST: False
SESSION_SERIALIZER: 'django.contrib.sessions.serializers.JSONSerializer'
SETTINGS_MODULE: 'config.settings.production'
SHORT_DATETIME_FORMAT: 'm/d/Y P'
SHORT_DATE_FORMAT: 'm/d/Y'
SIGNING_BACKEND: 'django.core.signing.TimestampSigner'
SILENCED_SYSTEM_CHECKS: []
SITE_ID: 1
SOCIALACCOUNT_ADAPTER: 'lancium_box.users.adapters.SocialAccountAdapter'
STATICFILES_DIRS: ['/home/skywalker/lancium_box/lancium_box/static']
STATICFILES_FINDERS: ['django.contrib.staticfiles.finders.FileSystemFinder',
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
'compressor.finders.CompressorFinder']
STATICFILES_STORAGE: 'whitenoise.storage.CompressedManifestStaticFilesStorage'
STATIC_ROOT: '/home/skywalker/lancium_box/staticfiles'
STATIC_URL: '/static/'
TEMPLATES: [{'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': ['/home/skywalker/lancium_box/lancium_box/templates'],
'OPTIONS': {'context_processors': ['django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.template.context_processors.i18n',
'django.template.context_processors.media',
'django.template.context_processors.static',
'django.template.context_processors.tz',
'django.contrib.messages.context_processors.messages',
'lancium_box.utils.context_processors.settings_context',
'sekizai.context_processors.sekizai'],
'loaders': [('django.template.loaders.cached.Loader',
['django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader'])]}}]
TEST_NON_SERIALIZED_APPS: []
TEST_RUNNER: 'django.test.runner.DiscoverRunner'
THIRD_PARTY_APPS: ['crispy_forms',
'allauth',
'allauth.account',
'allauth.socialaccount',
'rest_framework',
'rest_framework.authtoken',
'django_celery_beat',
'sekizai']
THOUSAND_SEPARATOR: ','
TIME_FORMAT: 'P'
TIME_INPUT_FORMATS: ['%H:%M:%S', '%H:%M:%S.%f', '%H:%M']
TIME_ZONE: 'UTC'
USE_I18N: True
USE_L10N: True
USE_THOUSAND_SEPARATOR: False
USE_TZ: True
USE_X_FORWARDED_HOST: False
USE_X_FORWARDED_PORT: False
WSGI_APPLICATION: 'config.wsgi.application'
X_FRAME_OPTIONS: 'DENY'
YEAR_MONTH_FORMAT: 'F Y'
is_overridden: <bound method Settings.is_overridden of <Settings
"config.settings.production">>
# Steps to Reproduce
Hello everyone,
On my project I have 5 workers running that periodically check or insert
data into the database (Postgres) every 5 seconds. My problem is that Celery
is opening too many connections to the database. I can see at least 20
connections for the same tasks hitting the database.
Is there a way to improve this with Django? I tried setting `CONN_MAX_AGE =
None` in Django, to have the database persist the same connection, but that
didn't help.
I also saw a pull request and some comments about reusing the database
connection in previously mentioned issues, but they were reverted because
they caused regressions.
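For context on what `CONN_MAX_AGE` governs: Django keeps one connection per worker process and, at each request/task boundary, closes it if it is older than the configured age. The sketch below only illustrates that reuse decision; the class and method names are hypothetical, not Django internals.

```python
class PersistentConnection:
    """Illustrative sketch of CONN_MAX_AGE-style connection reuse.

    max_age=None means "reuse forever"; max_age=0 means "close after
    every unit of work"; any other value is a lifetime in seconds.
    """

    def __init__(self, max_age=None):
        self.max_age = max_age
        self.opened_at = None
        self.open_count = 0  # how many real connects were performed

    def acquire(self, now):
        expired = (
            self.opened_at is not None
            and self.max_age is not None
            and now - self.opened_at >= self.max_age
        )
        if self.opened_at is None or expired:
            self.opened_at = now  # (re)connect
            self.open_count += 1
        return self.open_count

if __name__ == "__main__":
    conn = PersistentConnection(max_age=300)
    conn.acquire(now=0)    # first use: opens a connection
    conn.acquire(now=100)  # reused, still younger than 300s
    conn.acquire(now=400)  # stale: reopened
    print(conn.open_count)  # 2
```

Note that even with perfect reuse, each worker process holds its own connection, so 5 workers with prefork concurrency can legitimately show 20+ connections; an external pooler such as pgbouncer is the usual remedy.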
### Python Packages
**`pip freeze` Output:**
amqp==2.5.2
argon2-cffi==19.2.0
bcrypt==3.1.7
beautifulsoup4==4.7.1
billiard==3.6.3.0
blockchain==1.4.4
boto3==1.12.47
botocore==1.15.47
celery==4.3.0
certifi==2020.4.5.1
cffi==1.14.0
chardet==3.0.4
coreapi==2.3.3
coreschema==0.0.4
cryptography==2.9.2
defusedxml==0.6.0
Django==2.2.8
django-allauth==0.40.0
django-anymail==7.0.0
django-appconf==1.0.4
django-celery-beat==1.5.0
django-classy-tags==0.8.0
django-compressor==2.3
django-crispy-forms==1.8.1
django-environ==0.4.5
django-model-utils==3.2.0
django-pandas==0.6.0
django-redis==4.10.0
django-sekizai==1.0.0
django-storages==1.9.1
django-timezone-field==4.0
djangorestframework==3.10.3
docutils==0.15.2
enum-compat==0.0.3
et-xmlfile==1.0.1
future==0.18.2
gevent==20.4.0
greenlet==0.4.15
gunicorn==20.0.4
idna==2.8
itypes==1.2.0
jdcal==1.4.1
Jinja2==2.11.2
jmespath==0.9.5
kombu==4.5.0
MarkupSafe==1.1.1
numpy==1.16.2
oauthlib==3.1.0
openpyxl==2.6.3
pandas==0.24.2
parallel-ssh==1.9.1
paramiko==2.7.1
Pillow==6.2.1
psycopg2-binary==2.8.4
pycparser==2.20
pyModbusTCP==0.1.8
PyNaCl==1.3.0
pysftp==0.2.9
python-crontab==2.4.2
python-dateutil==2.8.0
python-slugify==4.0.0
python3-openid==3.1.0
pytz==2019.3
rcssmin==1.0.6
redis==3.3.11
requests==2.21.0
requests-oauthlib==1.3.0
rjsmin==1.1.0
s3transfer==0.3.3
six==1.14.0
soupsieve==2.0
sqlparse==0.3.1
ssh2-python==0.18.0.post1
text-unidecode==1.3
uritemplate==3.0.1
urllib3==1.24.3
vine==1.3.0
whitenoise==4.1.4
|
# Checklist
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
## Optional Debugging Information
* I have tried reproducing the issue on more than one Python version
and/or implementation.
* I have tried reproducing the issue on more than one version of the message
broker and/or result backend.
* I have tried reproducing the issue after downgrading
and/or upgrading Celery and its dependencies.
## Related Issues and Possible Duplicates
#### Related Issues
* #5597
## Environment & Settings
**Celery version** : 4.3.0 (rhubarb)
**`celery report` Output:**
software -> celery:4.3.0 (rhubarb) kombu:4.4.0 py:3.6.7
billiard:3.6.0.0 py-amqp:2.4.1
platform -> system:Linux arch:64bit
kernel version:4.18.0-25-generic imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:amqp results:disabled
task_queues: [<unbound Queue test -> <unbound Exchange test(direct)> -> test>]
# Steps to Reproduce
## Required Dependencies
* **Minimal Python Version** : 3.6.7
* **Minimal Celery Version** : 4.3.0
* **Minimal Kombu Version** : 4.4.0
## Minimally Reproducible Test Case
import celery
from kombu import Exchange
from kombu import Queue

class Config:
    task_queues = [
        Queue('test', Exchange('test'), routing_key='test', queue_arguments={'x-max-priority': 3})
    ]  # yapf: disable

app = celery.Celery('test')
app.config_from_object(Config)

@app.task(bind=True)
def task(self):
    print(self.request.delivery_info['priority'])
    self.retry(countdown=1)

if __name__ == '__main__':
    task.s().apply_async(priority=1, queue='test')
# Expected Behavior
Expect the task to keep priority 1 after self.retry().
Expected output:
[tasks]
. test.task
[2019-07-24 22:31:50,810: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2019-07-24 22:31:50,823: INFO/MainProcess] mingle: searching for neighbors
[2019-07-24 22:31:51,855: INFO/MainProcess] mingle: all alone
[2019-07-24 22:31:51,869: INFO/MainProcess] celery@ArtSobes-Home-Ubuntu-18 ready.
[2019-07-24 22:31:51,870: INFO/MainProcess] Received task: test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2]
[2019-07-24 22:31:51,973: WARNING/ForkPoolWorker-16] 1
[2019-07-24 22:31:51,995: INFO/MainProcess] Received task: test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] ETA:[2019-07-24 19:31:52.974357+00:00]
[2019-07-24 22:31:51,995: INFO/ForkPoolWorker-16] Task test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] retry: Retry in 1s
[2019-07-24 22:31:54,828: WARNING/ForkPoolWorker-2] 1
[2019-07-24 22:31:54,850: INFO/MainProcess] Received task: test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] ETA:[2019-07-24 19:31:55.829945+00:00]
[2019-07-24 22:31:54,850: INFO/ForkPoolWorker-2] Task test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] retry: Retry in 1s
[2019-07-24 22:31:56,831: WARNING/ForkPoolWorker-4] 1
[2019-07-24 22:31:56,853: INFO/MainProcess] Received task: test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] ETA:[2019-07-24 19:31:57.832523+00:00]
[2019-07-24 22:31:56,853: INFO/ForkPoolWorker-4] Task test.task[df27b1cc-6a9e-4fdb-aa46-dd02d15e4df2] retry: Retry in 1s
[2019-07-24 22:31:58,833: WARNING/ForkPoolWorker-6] 1
# Actual Behavior
On the first call, the task priority is 1, but after a retry it is None.
[tasks]
. test.task
[2019-07-24 22:30:51,901: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2019-07-24 22:30:51,913: INFO/MainProcess] mingle: searching for neighbors
[2019-07-24 22:30:52,940: INFO/MainProcess] mingle: all alone
[2019-07-24 22:30:52,956: INFO/MainProcess] celery@ArtSobes-Home-Ubuntu-18 ready.
[2019-07-24 22:30:52,957: INFO/MainProcess] Received task: test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e]
[2019-07-24 22:30:53,060: WARNING/ForkPoolWorker-16] 1
[2019-07-24 22:30:53,083: INFO/MainProcess] Received task: test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] ETA:[2019-07-24 19:30:54.062046+00:00]
[2019-07-24 22:30:53,083: INFO/ForkPoolWorker-16] Task test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] retry: Retry in 1s
[2019-07-24 22:30:54,958: WARNING/ForkPoolWorker-2] None
[2019-07-24 22:30:54,979: INFO/MainProcess] Received task: test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] ETA:[2019-07-24 19:30:55.959481+00:00]
[2019-07-24 22:30:54,979: INFO/ForkPoolWorker-2] Task test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] retry: Retry in 1s
[2019-07-24 22:30:56,960: WARNING/ForkPoolWorker-4] None
[2019-07-24 22:30:56,982: INFO/MainProcess] Received task: test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] ETA:[2019-07-24 19:30:57.962134+00:00]
[2019-07-24 22:30:56,983: INFO/ForkPoolWorker-4] Task test.task[bc93cbd9-9a20-41cb-aaef-55c71245038e] retry: Retry in 1s
[2019-07-24 22:30:58,963: WARNING/ForkPoolWorker-6] None
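A possible workaround (untested against this Celery version; treating the `priority` keyword pass-through to `apply_async` as an assumption) is to forward the original priority explicitly when retrying. The helper below only shows the option-building logic, independent of Celery:

```python
def retry_options(delivery_info, countdown=1):
    """Build apply_async options that carry the original task priority
    forward on retry instead of letting it fall back to None.

    delivery_info is expected to look like self.request.delivery_info.
    """
    options = {"countdown": countdown}
    priority = (delivery_info or {}).get("priority")
    if priority is not None:
        options["priority"] = priority
    return options

# Hypothetical usage inside the task:
#   self.retry(**retry_options(self.request.delivery_info))
```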
| 0 |
There should be a way to add typings for a specific version of TypeScript
(e.g. `2.1`). The tests should run on TS 2.1, and packages should require
TypeScript 2.1 in their package.json.
So what is the best way to do it?
|
It makes sense to not want to break existing users with a "^" dependency on
something like `@types/react`, but it hinders the very users and organizations
that help the TS ecosystem. The introduction of `Partial<>` and co in
particular is a big game changer, and many users would benefit from updates to
react and lodash in particular.
The current policy has been stated as waiting 30 days before allowing new
features into DefinitelyTyped, to avoid breaking people. While I disagree with
the policy, I think we as a community can do better to support those on the
bleeding edge.
Considering the situation, I'd like to propose a few options.
**Option A**, do nothing. There are workarounds if you want to start moving
fast, such as hacking `paths` and `typeRoots` in your tsconfig. I have
documented one such solution here:
https://medium.com/@ericlanderson/using-custom-typescript-definitions-with-ts-2-x-3121db84015d#.tvqw3rmth
While this works, it's a hassle.
**Option B**: Someone (maybe me) forks DefinitelyTyped and allows changes for
the latest and greatest right away. I could publish this as something like
"types-edge" on npm and provide documentation for how to configure your
tsconfig to make it work.
While this works, it half-forks the community and creates a burden on me
to keep up to date with DefinitelyTyped. It also hinders discoverability. I
think this hurts the ecosystem, but it's better than everyone manually forking
type definitions.
**Option C**: DefinitelyTyped and Microsoft support the development of these
edge versions on a branch within DefinitelyTyped (let's call it "edge", or even
use "master"). Microsoft could then register the org @types-edge and publish
the edge branch. The DefinitelyTyped and Microsoft people could then, as a
group, keep merging types-2.0 (which should really be called "stable" in this
world) into edge, while allowing the community to grow and move quickly. If
desired, there could also be a "nightly" branch and a corresponding
@types-nightly (I personally would have liked this: the second `Partial<>` and
`keyof` got merged, I started using them immediately). This would also mean the
types are already updated and ready to go when the next TS release
happens.
There could also be a release of TypeScript 2.1.1 fairly quickly that adds
"./node_modules/@types-edge" to the default list for `paths` and
`typeRoots` in tsconfig.
I think this is the best short-term solution.
**Option D**: Add macros to TS. (Yes, this is dirty.) But the macros could
allow things like `#if typescriptVersion >= 2.1.0` and allow additional
definitions to be placed into the type files. This would still require the
30-day grace period before it could be introduced (maybe in TS 2.2?), but it
would future-proof this problem and allow just a single npm org for publishing
types. It's also not backwards compatible, but it would be the easiest to
maintain in DefinitelyTyped.
**Option E**: Add additional metadata support to package.json to declare
types with a minimum version. This could look something like:
{
  "...": "...",
  "types2": {
    ">=2.1.0": "index-2.1.0.d.ts"
  }
}
Newer versions of TypeScript can look at this instead of `types` and choose
the highest version that matches.
This would be a PITA to support, IMHO. You could do it with branches in DT,
but then we have to support forward-merging changes from types-2.0 to types-2.1
and then types-2.2, etc. Or you have to keep the different versions in the
same branch and remember to change them all when adding new functionality.
Another way to do this (I have no idea whether it would work, given that future
changes to TS are unknown) is to write scripts that convert the latest typings
into older versions. For example, pretty much every typing that would want
`Partial<T>` could just be mutated at publish time to replace
it with `T`. `Pick` is probably `any` in the old world. `ReadonlyArray`
support wouldn't have been broken because it could just be `Array`. This would
require custom scripts for each new release, but it might not be awful.
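The `types2` lookup in Option E amounts to "pick the entry with the highest minimum version that the running compiler satisfies, else fall back to the plain `types` file". A sketch of that resolution logic (the field name and its semantics are this proposal's, not an existing TypeScript feature):

```python
def resolve_types(types2, ts_version, fallback="index.d.ts"):
    """Pick the .d.ts file whose ">=X.Y.Z" bound is the highest one
    still satisfied by the running TypeScript version."""
    def parse(v):
        return tuple(int(p) for p in v.split("."))

    current = parse(ts_version)
    best_bound, best_file = None, fallback
    for spec, filename in types2.items():
        bound = parse(spec.lstrip(">="))  # ">=2.1.0" -> (2, 1, 0)
        if current >= bound and (best_bound is None or bound > best_bound):
            best_bound, best_file = bound, filename
    return best_file

if __name__ == "__main__":
    mapping = {">=2.1.0": "index-2.1.0.d.ts"}
    print(resolve_types(mapping, "2.1.4"))  # index-2.1.0.d.ts
    print(resolve_types(mapping, "2.0.3"))  # index.d.ts
```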
There are probably other ways to do it, but these are the ones that came to me
today. I would really love to hear the opinions of a bunch of people:
@andy-ms, @ahejlsberg, @pspeter3, @vsiao, @johnnyreilly, @jkillian, and more.
Thanks
| 1 |
Does Babel emit code that will conditionally run native generators instead of
always paying the performance cost of regenerator (when using
`transform-regenerator`)?
E.g., something similar to this (examples copy-pasted from the regenerator site):
if (supportsGenerators()) {
    function *range(max, step) {
        var count = 0;
        step = step || 1;
        for (var i = 0; i < max; i += step) {
            count++;
            yield i;
        }
        return count;
    }
    var gen = range(20, 3), info;
    while (!(info = gen.next()).done) {
        console.log(info.value);
    }
    console.log("steps taken: " + info.value);
}
else {
    var marked0$0 = [range].map(regeneratorRuntime.mark);
    function range(max, step) {
        var count, i;
        return regeneratorRuntime.wrap(function range$(context$1$0) {
            while (1) switch (context$1$0.prev = context$1$0.next) {
                case 0:
                    count = 0;
                    step = step || 1;
                    i = 0;
                case 3:
                    if (!(i < max)) {
                        context$1$0.next = 10;
                        break;
                    }
                    count++;
                    context$1$0.next = 7;
                    return i;
                case 7:
                    i += step;
                    context$1$0.next = 3;
                    break;
                case 10:
                    return context$1$0.abrupt("return", count);
                case 11:
                case "end":
                    return context$1$0.stop();
            }
        }, marked0$0[0], this);
    }
    var gen = range(20, 3), info;
    while (!(info = gen.next()).done) {
        console.log(info.value);
    }
    console.log("steps taken: " + info.value);
}
|
> Issue originally made by @Cellule
### Bug information
* **Babel version:** 6.8.0
* **Node version:** 5.5.0
* **npm version:** 3.7.2
### Options
{
"plugins": ["transform-es2015-modules-umd"]
}
### Input code
require("module").test;
### Description
Compiling `require("module").test;` with plugin `transform-es2015-modules-umd`
throws the following error:
TypeError: test.js: Property object of MemberExpression expected node to be of a type ["Expression"] but instead got null
at Object.validate (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-types\lib\definitions\index.js:115:13)
at Object.validate (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-types\lib\index.js:552:9)
at NodePath._replaceWith (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\replacement.js:214:7)
at NodePath._remove (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\removal.js:60:10)
at NodePath.remove (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\removal.js:31:8)
at PluginPass.CallExpression (E:\Projects\chakra-runner\node_modules\babel-plugin-transform-es2015-modules-amd\lib\index.js:42:12)
at NodePath._call (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\context.js:78:18)
at NodePath.call (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\context.js:49:17)
at NodePath.visit (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\path\context.js:108:12)
at TraversalContext.visitQueue (E:\Projects\chakra-runner\node_modules\babel-cli\node_modules\babel-traverse\lib\context.js:174:16)
| 0 |
From @slash3g on 2016-04-10T16:37:22Z
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
lxc_container
##### ANSIBLE VERSION
ansible 2.0.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
(bug still present in devel branch)
##### CONFIGURATION
Default Ansible configuration.
##### OS / ENVIRONMENT
N/A
##### SUMMARY
lxc_container handles container configuration via the container_config
parameter. Unfortunately, the logic behind the configuration update is broken:
it duplicates the key and never replaces old values. Moreover, when the
configuration file contains duplicated keys, the module can fail to make any
changes.
##### STEPS TO REPRODUCE
Exec the following playbook
---
- hosts: myhost
  tasks:
    - name: first config update
      lxc_container:
        name: mycontainer
        container_config:
          - "lxc.start.auto = 0"
    - name: second config update
      lxc_container:
        name: mycontainer
        container_config:
          - "lxc.start.auto = 1"
##### EXPECTED RESULTS
$ grep start mycontainer/config
lxc.start.auto = 1
##### ACTUAL RESULTS
$ grep start mycontainer/config
lxc.start.auto = 0
lxc.start.auto = 1
Copied from original issue: ansible/ansible-modules-extras#1998
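The fix the report is asking for amounts to replace-or-append semantics per key: an update should overwrite the existing value (and collapse any duplicates) rather than append a new line. A minimal standalone sketch of that logic, not the actual module code:

```python
def update_config(lines, entry):
    """Replace the value of an existing 'key = value' line, or append it.
    Duplicated occurrences of the key are collapsed to the single new entry."""
    key = entry.split("=", 1)[0].strip()
    out, replaced = [], False
    for line in lines:
        if line.split("=", 1)[0].strip() == key:
            if not replaced:
                out.append(entry)  # overwrite first occurrence
                replaced = True
            # silently drop further duplicates of the same key
        else:
            out.append(line)
    if not replaced:
        out.append(entry)
    return out

if __name__ == "__main__":
    config = ["lxc.start.auto = 0"]
    print(update_config(config, "lxc.start.auto = 1"))  # ['lxc.start.auto = 1']
```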
|
##### ISSUE TYPE
* Bug Report
##### COMPONENT NAME
Conditionals
##### ANSIBLE VERSION
ansible 2.3.1.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
python version = 2.7.13 (default, Jan 19 2017, 14:48:08) [GCC 6.3.0 20170118]
##### CONFIGURATION
##### OS / ENVIRONMENT
Have discovered this with Ubuntu 17.04 managing Centos 7.3.
Reproduced for this report with Ubuntu 17.04 targeting itself.
##### SUMMARY
`is defined` works as expected when the variable it is used against is not
defined.
However, when the variable exists but depends on an undefined variable, the
check fails instead of returning false.
##### STEPS TO REPRODUCE
Will upload a minimal test case role.
Run:
ansible-playbook -i inventory playbook.yml
##### EXPECTED RESULTS
The role def has 2 conditional steps. Since the variables are (directly or
indirectly) not defined, they should both be skipped.
##### ACTUAL RESULTS
The 2nd task makes the playbook fail.
TASK [def : Problematic step.] **********************************************************************************************************************************************************************
fatal: [thismachine]: FAILED! => {"failed": true, "msg": "The conditional check 'dependent is defined' failed. The error was: error while evaluating conditional (dependent is defined): I use {{ variable }}.: 'variable' is undefined\n\nThe error appears to have been in '/home/xalkina/tmp/defined/roles/def/tasks/main.yml': line 9, column 3, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n- name: Problematic step.\n ^ here\n"}
to retry, use: --limit @/home/xalkina/tmp/defined/playbook.retry
| 0 |
* [ x ] I tried using the `@types/jquery` package and had problems.
* [ x ] I tried using the latest stable version of tsc. https://www.npmjs.com/package/typescript (3.1.6)
* [ x ] I have a question that is inappropriate for StackOverflow. (Please ask any appropriate questions there).
* [ x ] Mention the authors (see `Definitions by:` in `index.d.ts`) so they can respond.
* Authors: @leonard-thieu @borisyankov @Steve-Fenton @Diullei @tasoili @jasonswearingen @seanski @Guuz @ksummerlin @basarat @nwolverson @derekcicerone @AndrewGaspar @seikichi @benjaminjackman @JoshStrobl @johnnyreilly @DickvdBrink @King2500 @terrymun
I'm getting multiple errors when building a project that depends on
@types/jquery@3.3.22, using TypeScript 3.1.6:
node_modules/@types/jquery/JQuery.d.ts:6356:141 - error TS2344: Type 'TReturn' does not satisfy the constraint 'Node'.
6356 map<TReturn>(callback: (this: TElement, index: number, domElement: TElement) => JQuery.TypeOrArray<TReturn> | null | undefined): JQuery<TReturn>;
~~~~~~~
node_modules/@types/jquery/JQueryStatic.d.ts:172:27 - error TS2344: Type 'T' does not satisfy the constraint 'Node'.
172 <T>(selection: JQuery<T>): JQuery<T>;
~
node_modules/@types/jquery/JQueryStatic.d.ts:172:39 - error TS2344: Type 'T' does not satisfy the constraint 'Node'.
172 <T>(selection: JQuery<T>): JQuery<T>;
~
node_modules/@types/jquery/JQueryStatic.d.ts:192:93 - error TS2344: Type 'TElement' does not satisfy the constraint 'Node'.
192 <TElement = HTMLElement>(callback: ((this: Document, $: JQueryStatic) => void)): JQuery<TElement>;
~~~~~~~
node_modules/@types/jquery/JQueryStatic.d.ts:199:55 - error TS2344: Type 'T' does not satisfy the constraint 'Node'.
Type 'PlainObject<any>' is not assignable to type 'Node'.
Property 'baseURI' is missing in type 'PlainObject<any>'.
199 <T extends JQuery.PlainObject>(object: T): JQuery<T>;
~
node_modules/@types/jquery/JQueryStatic.d.ts:206:40 - error TS2344: Type 'TElement' does not satisfy the constraint 'Node'.
206 <TElement = HTMLElement>(): JQuery<TElement>;
~~~~~~~~
| ERROR: type should be string, got "\n\nhttps://github.com/borisyankov/DefinitelyTyped/blob/master/pickadate/pickadate.d.ts \nhttps://github.com/borisyankov/DefinitelyTyped/blob/master/jquery.pickadate/jquery.pickadate.d.ts\n\n" | 0 |
Currently, GLTFParser and the constants in GLTFLoader are scoped to the loader
and cannot be used in user-land.
However, they are sometimes needed.
Use cases:
* glTF-ish file may need special parsing using modified(extended) GLTFParser.
* glTF models may be supplied in JSON or JSONP format from a WebApp API rather than files and need GLTFParser directly.
Also, `WEBGL_CONSTANTS` and other constants would be useful if available under
`THREE` or somewhere similar.
What I would like to have is:
* `THREE.GLTFParser`
and maybe:
* `gltfLoaderInstance.gltfParser`
* `THREE.WEBGL_CONSTANTS.FLOAT = 5126` and so on
|
##### Description of the problem
If you have multiple three.js objects on a page that each use a LoadingManager,
only one of the objects will fire the LoadingManager onLoad callback.
The problem is in FileLoader's load call. In the broken case, all
FileLoader.load does is:
if ( loading[ url ] !== undefined ) {
loading[ url ].push( {
onLoad: onLoad,
onProgress: onProgress,
onError: onError
} );
return;
}
The above code returns without ever calling itemStart on the other managers.
Not exactly sure how you want to fix this, but the code needs to
identify whether different load requests come from different managers; if it is
the first time for a particular manager, trigger its itemStart, and when the
file is done, also trigger itemEnd on all the different managers.
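One way to structure such a fix, sketched in Python rather than the actual three.js source (all names here are illustrative): keep the in-flight deduplication per URL, but record the manager alongside each queued callback, fire itemStart once per distinct manager when it first joins a request, and fire itemEnd on every distinct manager on completion.

```python
class Manager:
    """Stub standing in for THREE.LoadingManager."""
    def __init__(self):
        self.events = []
    def item_start(self, url):
        self.events.append(("start", url))
    def item_end(self, url):
        self.events.append(("end", url))

class DedupLoader:
    """Sketch: one fetch per URL, but per-manager start/end bookkeeping."""

    def __init__(self):
        self.loading = {}  # url -> list of (manager, on_load)

    def load(self, url, manager, on_load):
        if url in self.loading:
            # piggyback on the in-flight request, but still notify this
            # manager that one of its items has started
            if manager not in [m for m, _ in self.loading[url]]:
                manager.item_start(url)
            self.loading[url].append((manager, on_load))
            return
        self.loading[url] = [(manager, on_load)]
        manager.item_start(url)

    def finish(self, url, data):
        callbacks = self.loading.pop(url)
        for _, cb in callbacks:
            cb(data)
        for mgr in {m for m, _ in callbacks}:  # each distinct manager once
            mgr.item_end(url)

if __name__ == "__main__":
    loader, m1, m2 = DedupLoader(), Manager(), Manager()
    loader.load("font.json", m1, lambda d: None)
    loader.load("font.json", m2, lambda d: None)  # dedup: no second fetch
    loader.finish("font.json", "DATA")
    print(m1.events, m2.events)  # both managers saw start and end
```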
##### Three.js version
* Dev
* r103
* ...
##### Browser
* All of them
* Chrome
* Firefox
* Internet Explorer
##### OS
* All of them
* Windows
* macOS
* Linux
* Android
* iOS
##### Hardware Requirements (graphics card, VR Device, ...)
## example html to reproduce problem
<html>
<head>
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/103/three.js"></script>
<style>
#test1 {
width: 200px;
height: 200px;
}
#test2 {
width: 200px;
height: 200px;
}
</style>
</head>
<body>
<div id="test1"></div>
<div id="test2"></div>
</body>
<script>
class word {
constructor(target) {
this.scene = new THREE.Scene();
this.camera = new THREE.PerspectiveCamera( 75,
target.offsetWidth/target.offsetHeight, 0.1, 1000 );
this.camera.position.z = 100;
this.renderer = new THREE.WebGLRenderer({antialias:true});
this.renderer.setSize( target.offsetWidth, target.offsetHeight );
target.appendChild( this.renderer.domElement );
this.scene.background = new THREE.Color(0xFFFFFF);
var light = new THREE.AmbientLight(0xFFFFFF);
this.scene.add(light);
var manager = new THREE.LoadingManager();
manager.onLoad = this.draw.bind(this);
new THREE.FontLoader(manager).load("https://threejs.org/examples/fonts/optimer_bold.typeface.json",this.savefont.bind(this));
}
savefont(font) {
this.font = font;
}
draw() {
var geometry = new THREE.TextGeometry("Test",{
font: this.font,
size: 50,
height: 1,
curveSegments: 4
});
geometry.center();
var material = new THREE.MeshLambertMaterial( { color: 0x0000EE } );
var word = new THREE.Mesh( geometry, material );
this.scene.add( word );
this.render();
}
render() {
requestAnimationFrame( this.render.bind(this) );
this.renderer.render(this.scene,this.camera);
}
}
new word(document.getElementById("test1"));
new word(document.getElementById("test2"));
</script>
</html>
| 0 |
**Dmitry V. Zemnitskiy** opened **SPR-3849** and commented
Copied from http://forum.springframework.org/showthread.php?p=140845
Also see http://forum.springframework.org/showthread.php?t=31531
I just checked with the latest available Spring (2.0.6), Hibernate (3.2.5),
Hibernate EntityManager, and an MS SQL database (both MSDE and MS SQL Server
2000) with the jTDS driver.
As I see it, the problem is still not solved. That is, with declarative
transactions and exception translation, having a business manager class
annotated with `@Repository` and a business method annotated with
`@Transactional` (no nested transactions), example configuration below:
Code:
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalEntityManagerFactoryBean">
<property name="persistenceUnitName" value="${jpa.persistence.unit}" />
<property name="jpaVendorAdapter" ref="hibernateAdapter" />
</bean>
<bean name="hibernateAdapter"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
<bean id="transactionManager"
class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory"
ref="entityManagerFactory" />
</bean>
<bean
class="org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor" />
<bean
class="org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor" />
<bean id="entityManager"
class="org.springframework.orm.jpa.support.SharedEntityManagerBean">
<property name="entityManagerFactory"
ref="entityManagerFactory" />
</bean>
<tx:annotation-driven transaction-manager="transactionManager" />
I get UnexpectedRollbackException caused by an integrity violation exception,
because the actual database operation and constraint verification seem to be
deferred to commit() in MS SQL (much like in Oracle, I think).
Here's the exception stack trace:
Code:
org.springframework.transaction.UnexpectedRollbackException: JPA transaction
unexpectedly rolled back (maybe marked rollback-only after a failed
operation); nested exception is javax.persistence.RollbackException: Error
while commiting the transaction
Caused by:
javax.persistence.RollbackException: Error while commiting the transaction
at org.hibernate.ejb.TransactionImpl.commit(TransactionImpl.java:71)
at
org.springframework.orm.jpa.JpaTransactionManager.doCommit(JpaTransactionManager.java:433)
at
org.springframework.transaction.support.AbstractPlatformTransactionManager.processCommit(AbstractPlatformTransactionManager.java:662)
at
org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java:632)
at
org.springframework.transaction.interceptor.TransactionAspectSupport.commitTransactionAfterReturning(TransactionAspectSupport.java:314)
at
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:117)
at
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:166)
at
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at $Proxy34.handleChangeMsisdn(Unknown Source)
at
com.vyke.mobile.server.io.ConnectionHandlerImpl.packetReceived(ConnectionHandlerImpl.java:254)
... removed application code calls ...
at java.lang.Thread.run(Unknown Source)
Caused by: org.hibernate.exception.ConstraintViolationException: could not
update: [com.vyke.mobile.server.domain.Client#4]
at
org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:71)
at
org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:43)
at
org.hibernate.persister.entity.AbstractEntityPersister.update(AbstractEntityPersister.java:2425)
at
org.hibernate.persister.entity.AbstractEntityPersister.updateOrInsert(AbstractEntityPersister.java:2307)
at
org.hibernate.persister.entity.AbstractEntityPersister.update(AbstractEntityPersister.java:2607)
at org.hibernate.action.EntityUpdateAction.execute(EntityUpdateAction.java:92)
at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:250)
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:234)
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:142)
at
org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:298)
at
org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:27)
at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1000)
at org.hibernate.impl.SessionImpl.managedFlush(SessionImpl.java:338)
at org.hibernate.transaction.JDBCTransaction.commit(JDBCTransaction.java:106)
at org.hibernate.ejb.TransactionImpl.commit(TransactionImpl.java:54)
... 29 more
Caused by: java.sql.SQLException: Violation of UNIQUE KEY constraint
'UQ__Client__72C60C4A'. Cannot insert duplicate key in object 'Client'.
at
net.sourceforge.jtds.jdbc.SQLDiagnostic.addDiagnostic(SQLDiagnostic.java:365)
at net.sourceforge.jtds.jdbc.TdsCore.tdsErrorToken(TdsCore.java:2781)
at net.sourceforge.jtds.jdbc.TdsCore.nextToken(TdsCore.java:2224)
at net.sourceforge.jtds.jdbc.TdsCore.getMoreResults(TdsCore.java:628)
at
net.sourceforge.jtds.jdbc.JtdsStatement.processResults(JtdsStatement.java:525)
at net.sourceforge.jtds.jdbc.JtdsStatement.executeSQL(JtdsStatement.java:487)
at
net.sourceforge.jtds.jdbc.JtdsPreparedStatement.executeUpdate(JtdsPreparedStatement.java:421)
at
org.hibernate.jdbc.NonBatchingBatcher.addToBatch(NonBatchingBatcher.java:23)
at
org.hibernate.persister.entity.AbstractEntityPersister.update(AbstractEntityPersister.java:2403)
... 41 more
I found a very similar problem description here:
http://forum.springframework.org/showthread.php?t=31531
It is dated last year.
I was able to solve the problem only by checking, during the business method
invocation, whether a conflicting record already exists in the database, using
an extra select statement.
IMO this is a very rough solution and quite a bad problem in Spring, as the
situation is quite common and it voids all the effort made to improve
persistence exception handling in Spring.
Regards,
Dima
* * *
**Affects:** 2.0.5, 2.0.6, 2.1 M3
|
**Dmitriy Kopylenko** opened **SPR-163** and commented
There is a notion of an Errors <<interface>>, so during validation some
property values might be rejected. We treat Errors as critical: e.g. in
Controllers, after validation we check the Errors collection and if it is not
empty we **DO NOT** proceed (invoke the middle tier). There are some use
cases, however, that state that some validations should be treated as
**WARNINGS**: if the rule is not satisfied, just display a warning message to
the user, but still proceed with the workflow.
I'm not sure if that's been solved already, but in any case it would be nice
to create a Warnings <<interface>> to distinguish between those two concepts
(Errors -> do not proceed, Warnings -> still proceed)
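The requested contract is easy to sketch; a hedged Python illustration of the proposed distinction (names are illustrative, not Spring's API): errors block the workflow, warnings are displayed but do not.

```python
class ValidationResult:
    """Illustrative sketch: Errors stop processing, Warnings do not."""

    def __init__(self):
        self.errors, self.warnings = [], []

    def reject(self, field, message):
        # Errors-style rejection: blocks invocation of the middle tier
        self.errors.append((field, message))

    def warn(self, field, message):
        # Proposed Warnings-style rejection: shown to the user, never blocks
        self.warnings.append((field, message))

    def may_proceed(self):
        # Only errors stop the workflow; warnings alone never do
        return not self.errors


result = ValidationResult()
result.warn("email", "unusual domain")
assert result.may_proceed()            # warning only: display it, still proceed
result.reject("age", "must be positive")
assert not result.may_proceed()        # error: do not invoke the middle tier
```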
* Dmitriy.
* * *
**Affects:** 1.1 RC1
**Issue Links:**
* #6123 Expand Errors object to be a Messages object ( _ **"is duplicated by"**_ )
22 votes, 15 watchers
| 0 |
When using CircularProgressIndicator as a loading screen, the Google Play
Console pre-launch report complains: "This item may not have a label readable
by screen readers."
It's not clear from the documentation how to force Semantics on
CircularProgressIndicator, or how to add a hidden label to pass such
accessibility tests.
## Steps to Reproduce
return new Scaffold(
  body: new Center(
    child: new CircularProgressIndicator(),
  ),
);
## Result

[✓] Flutter (Channel beta, v0.9.4, on Linux, locale en_US.UTF-8)
• Flutter version 0.9.4 at /home/flutter/flutter_v0.9.4-beta
• Framework revision f37c235c32 (4 weeks ago), 2018-09-25 17:45:40 -0400
• Engine revision 74625aed32
• Dart version 2.1.0-dev.5.0.flutter-a2eb050044
[✓] Android toolchain - develop for Android devices (Android SDK 27.0.3)
• Android SDK at /home/android/sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-27, build-tools 27.0.3
• ANDROID_HOME = /home/android/sdk
• Java binary at: /home/android/src/android-studio/jre/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b01)
• All Android licenses accepted.
[✓] Android Studio (version 3.1)
• Android Studio at /home/android/src/android-studio
• Flutter plugin version 24.2.1
• Dart plugin version 173.4700
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b01)
[✓] Connected devices (1 available)
• Android SDK built for x86 • emulator-5554 • android-x86 • Android 8.1.0 (API 27) (emulator)
• No issues found!
|
Render invalid in Thai

| 0 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.6.2
* Operating System version: docker
* Java version: 18
### Steps to reproduce this issue
The provider has validate=true enabled, and the dependency has also been
added, but the class `org.hibernate.validator.engine.ConfigurationImpl`
cannot be initialized; the error about this class is raised when the service
is invoked.
Link already consulted:
https://blog.csdn.net/hengyunabc/article/details/71513509
1. validate=true
Failed to invoke remote method: stopNotifyJob, provider: dubbo://10.4.3.136:30001/com.raycloud.notify.api.service.NotifyJobRequest?application=demo-docker&application.version=1.0.0ValidationAutoConfiguration&check=false&dubbo=2.6.2&interface=com.xxxxx.xxxx.xxx.service.xxxxx&methods=xxxx,xxx,xxx&pid=56&register.ip=10.0.0.72&revision=2.0.1&side=consumer&timeout=20000&timestamp=1574320167228&version=2.0.1-vpc, cause: com.alibaba.dubbo.rpc.RpcException: Could not initialize class org.hibernate.validator.engine.ConfigurationImpl
com.alibaba.dubbo.rpc.RpcException: Could not initialize class org.hibernate.validator.engine.ConfigurationImpl
at com.alibaba.dubbo.validation.filter.ValidationFilter.invoke(ValidationFilter.java:54)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.ExceptionFilter.invoke(ExceptionFilter.java:64)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.monitor.support.MonitorFilter.invoke(MonitorFilter.java:75)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.TimeoutFilter.invoke(TimeoutFilter.java:42)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.protocol.dubbo.filter.TraceFilter.invoke(TraceFilter.java:78)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.raycloud.dubbo.rpc.filter.DubboConcurrentMonitor.invoke(DubboConcurrentMonitor.java:58)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.raycloud.eagle.trace.dubbo.DubboInvokeMonitorFilter.invoke(DubboInvokeMonitorFilter.java:78)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.ContextFilter.invoke(ContextFilter.java:60)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.GenericFilter.invoke(GenericFilter.java:112)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.ClassLoaderFilter.invoke(ClassLoaderFilter.java:38)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.EchoFilter.invoke(EchoFilter.java:38)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol$1.reply(DubboProtocol.java:108)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.handleRequest(HeaderExchangeHandler.java:84)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.received(HeaderExchangeHandler.java:170)
at com.alibaba.dubbo.remoting.transport.DecodeHandler.received(DecodeHandler.java:52)
at com.alibaba.dubbo.remoting.transport.dispather.ChannelEventRunnable.run(ChannelEventRunnable.java:82)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.hibernate.validator.engine.ConfigurationImpl
at org.hibernate.validator.HibernateValidator.createGenericConfiguration(HibernateValidator.java:41)
at javax.validation.Validation$GenericBootstrapImpl.configure(Validation.java:269)
at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:111)
at com.alibaba.dubbo.validation.support.jvalidation.JValidator.<init>(JValidator.java:86)
at com.alibaba.dubbo.validation.support.jvalidation.JValidation.createValidator(JValidation.java:31)
at com.alibaba.dubbo.validation.support.AbstractValidation.getValidator(AbstractValidation.java:38)
at com.alibaba.dubbo.validation.Validation$Adpative.getValidator(Validation$Adpative.java)
at com.alibaba.dubbo.validation.filter.ValidationFilter.invoke(ValidationFilter.java:47)
... 29 more
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.3
* Operating System version: macOS 10.15.1, any Linux distribution
* Java version: 1.8
See #3784: Version.parseInt does not re-check the validity of the string and
does no exception handling; that bug is still not fixed. The version is taken
from the file name, so when the file name is dubbo-client.jar, parsing yields
version = "client", and the guard block in
isSupportResponseAttachment (Version.java:102),
if (StringUtils.isEmpty(version)) {
return false;
}
no longer helps. The subsequent Version.parseInt neither validates the string
nor handles the exception, so the following error is reported:
11/28 15:03:45.504 WARN org.apache.dubbo.remoting.exchange.codec.ExchangeCodec
[NettyServerWorker-6-7] [DUBBO] Fail to encode response: Response [id=0,
version=client, status=20, event=false, error=null, result=AppResponse
[value=cn.pengh.core.rpc.RpcResponse@49995603, exception=null]], send
bad_response info instead, cause: For input string: "", dubbo version: 2.7.3,
current host: 192.168.6.46
java.lang.NumberFormatException: For input string: ""
at
java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
~[?:1.8.0_201]
at java.lang.Integer.parseInt(Integer.java:592) ~[?:1.8.0_201]
at java.lang.Integer.parseInt(Integer.java:615) ~[?:1.8.0_201]
at org.apache.dubbo.common.Version.parseInt(Version.java:133)
~[dubbo-2.7.3.jar:2.7.3]
at org.apache.dubbo.common.Version.getIntVersion(Version.java:118)
~[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.common.Version.isSupportResponseAttachment(Version.java:102)
~[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.rpc.protocol.dubbo.DubboCodec.encodeResponseData(DubboCodec.java:195)
~[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.remoting.exchange.codec.ExchangeCodec.encodeResponse(ExchangeCodec.java:283)
[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.remoting.exchange.codec.ExchangeCodec.encode(ExchangeCodec.java:71)
[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.rpc.protocol.dubbo.DubboCountCodec.encode(DubboCountCodec.java:40)
[dubbo-2.7.3.jar:2.7.3]
at
org.apache.dubbo.remoting.transport.netty4.NettyCodecAdapter$InternalEncoder.encode(NettyCodecAdapter.java:70)
[dubbo-2.7.3.jar:2.7.3]
at
io.netty.handler.codec.MessageToByteEncoder.write(MessageToByteEncoder.java:107)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:716)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:708)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:791)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:701)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at io.netty.handler.timeout.IdleStateHandler.write(IdleStateHandler.java:303)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:716)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:708)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:791)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:701)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at io.netty.channel.ChannelDuplexHandler.write(ChannelDuplexHandler.java:115)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
org.apache.dubbo.remoting.transport.netty4.NettyServerHandler.write(NettyServerHandler.java:103)
[dubbo-2.7.3.jar:2.7.3]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite0(AbstractChannelHandlerContext.java:716)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:708)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext.access$1700(AbstractChannelHandlerContext.java:56)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:1102)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:1149)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1073)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518) [netty-
all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1050)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at
io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[netty-all-4.1.43.Final.jar:4.1.43.Final]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_201]
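The root cause above can be illustrated outside of Java; a minimal, hedged Python sketch (the function names mirror Dubbo's Version helpers but are illustrative, not Dubbo's actual API):

```python
# Illustrative sketch of the reported failure: the version string is derived
# from the jar file name, so "dubbo-client.jar" yields version = "client",
# and the empty-string guard in isSupportResponseAttachment never fires.
def get_version_from_jar(jar_name):
    # take the token after the last dash and drop the ".jar" extension
    return jar_name.rsplit("-", 1)[-1].removesuffix(".jar")

def parse_int_naive(component):
    # the reported behaviour: no validity check, no exception handling
    return int(component)

def parse_int_defensive(component, default=0):
    # the kind of guard the issue asks for
    return int(component) if component.isdigit() else default


version = get_version_from_jar("dubbo-client.jar")
assert version == "client" and version != ""   # empty-string guard never fires

failed = False
try:
    parse_int_naive(version)                   # NumberFormatException analogue
except ValueError:
    failed = True
assert failed
assert parse_int_defensive(version) == 0       # degrades gracefully instead
```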
| 0 |
**Bug summary**
Saving matplotlib polar plots on Windows 7 and Windows 10, with MiKTeX
2.9.7.000, fails for some formats (e.g., pdf, svg) and works for others
(e.g., png, jpg).
**Code for reproduction**
import matplotlib.pyplot as plt
import matplotlib as mpl
mpl.rcParams['text.usetex'] = True
plt.figure()
ax = plt.subplot(111, polar=True)
plt.savefig('foo.png') # Works
plt.savefig('foo.jpg') # Works
plt.savefig('foo.tif') # Works
# plt.savefig('foo.pdf') # Fails
# plt.savefig('foo.svg') # Fails
plt.show()
**Actual outcome**
"Error saving .... " window pops up with error code b'tcss1000' for pdf, and
b'tcss3583' for svg. (The numbers change)
**Matplotlib version**
* Operating system: Windows 7 & 10
* Matplotlib version: 2.2.2 or 3.0.3
* Matplotlib backend (`print(matplotlib.get_backend())`): Qt5Agg
* Python version: 3.6
* Jupyter version (if applicable): N/A
* Other libraries: N/A
Installed via Anaconda3
|
Saving vector graphic figures with certain characters in LaTeX will fail. Here
I tried to use the `\textmu` character. Could it be related to #8068?
**Code for reproduction**
Here I'm saving to PostScript. It gives an error that I think is the most
informative and points to some problems with ghostscript. Outputs for SVG and
PDF below.
import matplotlib
matplotlib.rcParams['text.usetex'] = True
import matplotlib.pyplot as plt
fig = plt.figure()
ax = fig.add_subplot(111)
ax.set_ylabel(r'\textmu')
plt.savefig('fig.ps')
**Output - PS**
Traceback (most recent call last):
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_ps.py", line 1520, in gs_distill
report = subprocess.check_output(command, stderr=subprocess.STDOUT)
File "/usr/lib/python3.6/subprocess.py", line 336, in check_output
**kwargs).stdout
File "/usr/lib/python3.6/subprocess.py", line 418, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['gs', '-dBATCH', '-dNOPAUSE', '-r6000', '-sDEVICE=ps2write', '-sPAPERSIZE=letter', '-sOutputFile=/tmp/tmpt8w23igm.ps', '/tmp/tmpt8w23igm']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "Untitled.py", line 8, in <module>
plt.savefig('fig.ps')
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/pyplot.py", line 701, in savefig
res = fig.savefig(*args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/figure.py", line 1834, in savefig
self.canvas.print_figure(fname, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backend_bases.py", line 2267, in print_figure
**kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_ps.py", line 910, in print_ps
return self._print_ps(outfile, 'ps', *args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_ps.py", line 937, in _print_ps
**kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_ps.py", line 1359, in _print_figure_tex
rotated=psfrag_rotated)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_ps.py", line 1525, in gs_distill
'\n\n' % exc.output.decode("utf-8")))
RuntimeError: ghostscript was not able to process your image.
Here is the full report generated by ghostscript:
GPL Ghostscript 9.22 (2017-10-04)
Copyright (C) 2017 Artifex Software, Inc. All rights reserved.
This software comes with NO WARRANTY: see the file PUBLIC for details.
Error: /undefined in --get--
Operand stack:
--nostringval-- --dict:9/18(ro)(L)-- 112 --dict:13/13(L)-- --dict:13/13(L)-- base
Execution stack:
%interp_exit .runexec2 --nostringval-- --nostringval-- --nostringval-- 2 %stopped_push --nostringval-- --nostringval-- --nostringval-- false 1 %stopped_push 2015 1 3 %oparray_pop 2014 1 3 %oparray_pop 1998 1 3 %oparray_pop 1884 1 3 %oparray_pop --nostringval-- %errorexec_pop .runexec2 --nostringval-- --nostringval-- --nostringval-- 2 %stopped_push --nostringval-- --nostringval-- --nostringval-- %finish_stringwidth --nostringval-- --nostringval-- 14 8 0 --nostringval-- (pdf_text_enum_t) %op_show_continue --nostringval--
Dictionary stack:
--dict:986/1684(ro)(G)-- --dict:1/20(G)-- --dict:82/200(L)-- --dict:5/6(ro)(L)-- --dict:180/300(L)-- --dict:44/200(L)-- --dict:8/17(L)-- --dict:51/90(L)--
Current allocation mode is local
Last OS error: No such file or directory
Current file position is 58919
GPL Ghostscript 9.22: Unrecoverable error, exit code 1
**Output - SVG, PDF**
Traceback (most recent call last):
File "Untitled.py", line 8, in <module>
plt.savefig('fig.svg')
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/pyplot.py", line 701, in savefig
res = fig.savefig(*args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/figure.py", line 1834, in savefig
self.canvas.print_figure(fname, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backend_bases.py", line 2267, in print_figure
**kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 1193, in print_svg
return self._print_svg(filename, svgwriter, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 1248, in _print_svg
self.figure.draw(renderer)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/artist.py", line 55, in draw_wrapper
return draw(artist, renderer, *args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/figure.py", line 1299, in draw
renderer, self, artists, self.suppressComposite)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/image.py", line 138, in _draw_list_compositing_images
a.draw(renderer)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/artist.py", line 55, in draw_wrapper
return draw(artist, renderer, *args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/axes/_base.py", line 2437, in draw
mimage._draw_list_compositing_images(renderer, self, artists)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/image.py", line 138, in _draw_list_compositing_images
a.draw(renderer)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/artist.py", line 55, in draw_wrapper
return draw(artist, renderer, *args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/axis.py", line 1147, in draw
self.label.draw(renderer)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/artist.py", line 55, in draw_wrapper
return draw(artist, renderer, *args, **kwargs)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/text.py", line 762, in draw
mtext=mtext)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 1150, in draw_tex
self._draw_text_as_path(gc, x, y, s, prop, angle, ismath="TeX")
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/backends/backend_svg.py", line 950, in _draw_text_as_path
return_new_glyphs_only=True)
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/textpath.py", line 335, in get_glyphs_tex
font_bunch = self.tex_font_map[dvifont.texname]
File "/home/mirek/.local/lib/python3.6/site-packages/matplotlib/dviread.py", line 850, in __getitem__
result = self._font[texname]
KeyError: b'tcss3583'
For PDF it's nearly identical; the last line reads:
KeyError: b'tcss1000'
In case of PS and SVG, no file is produced. In case of PDF, a corrupted file
results. Saving to raster formats and inline previews in jupyter notebook work
fine.
**Matplotlib version**
* Operating system: Debian 10
* Matplotlib version: 2.1.1
* Python version: 3.6.4
Matplotlib from pip. Both Matplotlib and LaTeX are installed in userspace
| 1 |
### Bug report
**Bug summary**
Artifacts appear when plotting periodic data with discontinuities, i.e.
something like x mod 1.0.
The bug appears when the render resolution is relatively small compared to
the real resolution of the data (a downsampling problem?).
**Code for reproduction**
import numpy as np
import matplotlib.pyplot as plt
xx, yy = np.meshgrid(np.linspace(-5, 5, 201), np.linspace(-5, 5, 201))
psi = xx % 1.0
plt.figure()
plt.imshow(psi, cmap="twilight")
plt.savefig("small.png", dpi=100)
plt.savefig("large.png", dpi=200)
**Outcome**


(large.png, small.png)
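A plausible mechanism (an assumption, not a confirmed diagnosis): when 201 data columns are squeezed into fewer output pixels, samples from both sides of the `x % 1.0` jump are averaged together, producing mid-range values that occur in neither neighborhood; with a cyclic colormap like twilight those values map to visually unrelated colors. A pure-Python sketch of the averaging effect:

```python
# Averaging neighbouring samples of a sawtooth (x % 1.0) across its
# discontinuity yields a mid-range value present in neither neighbourhood --
# the candidate "artifact" pixel value after downsampling.
samples = [0.8, 0.9, 0.0, 0.1]        # sawtooth crossing the wrap
pair_means = [(a + b) / 2 for a, b in zip(samples, samples[1:])]
assert pair_means[1] == 0.45          # mean of 0.9 and 0.0: far from both
```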
**Matplotlib version**
The issue is repeatable locally and in CoLab:
* Operating system: Linux
* Matplotlib version: 3.3.3, 3.2.2
* Matplotlib backend (): TkAgg, ipykernel.pylab.backend_inline
* Python version: 3.8.5, 3.7
|
After updating matplotlib from 1.3.1 to 2.0.2, when I use plot_trisurf to
generate a TIN from 3D points, I get an incomprehensible result.
I just want to know the difference in plot_trisurf between matplotlib 2.0.2
and matplotlib 1.3.1, and how I can get a similar result.
The details about this issue are referred to in
https://stackoverflow.com/questions/44244297/plot-trisurf-of-matplotlib-2-0-2
| 0 |
Something like `gofmt` is really useful.
|
It would be really nice to have the pretty printer be good enough to use it
as a code formatter, à la gofmt. It still has some issues (I've been working
on it), but it is reasonably close to being usable in that way.
But the command-line invocation is kind of unwieldy (and feels redundant). It
seems like --pretty should just pretty-print, and other stuff should be
handled secondarily, i.e. with:
--pretty-expanded
--pretty-typed
--pretty-expanded-identified
--pretty-identified
Alternatively, perhaps it would make more sense to actually split it into a
separate binary, called rustfmt (or, in longer-keyword land, rustformat).
Thoughts?
| 1 |
**Is this a request for help?** (If yes, you should use our troubleshooting
guide and community support channels, see
http://kubernetes.io/docs/troubleshooting/.):
NO, it is not a request of help, this is a BUG REPORT
**What keywords did you search in Kubernetes issues before filing this one?**
(If you have found any duplicates, you should instead reply there.):
HPA not down-scaling deployment
HPA not able to get load info
* * *
**Is this a BUG REPORT or FEATURE REQUEST?** (choose one):
BUG REPORT
**Kubernetes version** (use `kubectl version`):
Tested to repeat on 1.3.5 and on 1.3.6
**Environment** :
* **Cloud provider or hardware configuration** : GCE/GKE managed
* **OS** (e.g. from /etc/os-release): PRETTY_NAME="Debian GNU/Linux 7 (wheezy)"
* **Kernel** (e.g. `uname -a`): Linux gke-k8s-test1-default-pool-72fe9503-8tvm 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt25-2 (2016-04-08) x86_64 GNU/Linux
* **Install tools** : gcloud
* **Others** : N/A
**What happened** :
We have a development cluster with one of the node pools set to
--min-nodes=2 and --max-nodes=3.
Normally the cluster operates on 2 nodes in this pool.
The deployment in question specifically has a nodeSelector set so that its
pods are scheduled only onto this pool.
The deployment in question has an HPA configured with a maximum pod count of 20.
We observed a spike in the load, where HPA decided to allocate all 20 pods.
Due to CPU resource constraint, it triggered an additional node allocation:
everything was as-expected at this time.
After additional node allocation, it was STILL NOT ENOUGH CPU to start all
pods, but now limit on number of nodes was reached and pods were stuck in
"Pending" state: as-expected.
What went wrong: after the load was removed, the HPA would not scale back to
the 1-2 pods we normally see running.
We found that the HPA cannot get CPU consumption info for some pods:
4m 4m 2 {horizontal-pod-autoscaler } Warning FailedGetMetrics failed to get CPU consumption and request: metrics obtained for 20/22 of pods
4m 4m 2 {horizontal-pod-autoscaler } Warning FailedComputeReplicas failed to get CPU utilization: failed to get CPU consumption and request: metrics obtained for 20/22 of pods
It was suspicious that it reported more pods than we see with "kubectl get
pod -l ...", and we found the missing ones to be "stuck", waiting to be reaped:
kubectl get pod -a -l run=cm-worker
NAME READY STATUS RESTARTS AGE
cm-worker-2850697223-1wtmo 1/1 Running 0 11m
cm-worker-2850697223-9evqk 1/1 Running 0 11m
cm-worker-2850697223-dfhiu 0/1 OutOfCPU 0 5h
cm-worker-2850697223-f6h9o 1/1 Running 0 11m
[...]
cm-worker-2850697223-wogui 1/1 Running 0 11m
cm-worker-2850697223-zdxjg 0/1 OutOfCPU 0 4h
At this point, we upgraded master from 1.3.5 to 1.3.6, but it did not resolve
the situation.
Upon manually killing these pods, the cluster and the HPA started to operate
appropriately: the HPA scaled down, and the cluster scaled down the node it did not need.
**What you expected to happen** :
1. It was expected that Kubernetes would properly discover and reap failed pods.
2. It was expected that HPA would not consider dead pods for query of current load.
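Both expectations boil down to the same fix in the utilization math. A hedged pure-Python sketch (not the actual HPA code; names are illustrative) of why counting terminal pods blocks the computation, and why excluding them restores scaling:

```python
# Sketch of an HPA-style average-utilization computation. The reported
# controller bails out when metrics are missing for some counted pods;
# excluding terminal pods (OutOfCPU, Evicted, ...) from the count fixes it.
def average_utilization(pods, metrics, skip_terminal=True):
    # pods: {name: phase}; metrics: {name: cpu_fraction} for pods heapster knows
    counted = [name for name, phase in pods.items()
               if phase == "Running" or not skip_terminal]
    missing = [name for name in counted if name not in metrics]
    if missing:
        raise RuntimeError("metrics obtained for %d/%d of pods"
                           % (len(counted) - len(missing), len(counted)))
    return sum(metrics[name] for name in counted) / len(counted)


pods = {"w-1": "Running", "w-2": "Running", "w-3": "OutOfCPU"}
metrics = {"w-1": 0.10, "w-2": 0.20}          # no metrics for the dead pod

# Counting the dead pod reproduces the reported failure mode:
try:
    average_utilization(pods, metrics, skip_terminal=False)
except RuntimeError as exc:
    assert "2/3" in str(exc)

# Skipping terminal pods lets the computation (and scale-down) proceed:
assert abs(average_utilization(pods, metrics) - 0.15) < 1e-9
```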
**How to reproduce it** (as minimally and precisely as possible):
We are not sure we can reproduce it, so we investigated down to the root
cause in place.
**Anything else do we need to know** :
N/A
|
**Environment** :
* **Kubernetes version** : 1.3.5
* **Cloud provider or hardware configuration** : Google Container Engine
* **Kernel** : Linux 3.16.0-4-amd64 #1 SMP Debian 3.16.7-ckt25-2 (2016-04-08) x86_64 GNU/Linux
**What happened** :
We're using a HorizontalPodAutoscaler in our cluster to scale pods based on
CPU load. While this is working for some deployments, one deployment seems to
be in a broken state where the HPA tries to get cpu consumption from heapster
for pods that do not exist.
The HPA was working for said deployment but started to misbehave after a
scaling event.
Result from `kubectl get pods`
…
gosu-main-api-v1-3735438065-5n8t7 2/2 Running 1 2h
gosu-main-api-v1-3735438065-j027w 2/2 Running 1 2h
gosu-main-api-v1-3735438065-rydr6 2/2 Running 1 2h
gosu-main-api-v1-3735438065-vf59h 2/2 Running 1 2h
gosu-main-api-v1-3735438065-w4hzm 2/2 Running 1 2h
…
Result from `kubectl get hpa`
NAME REFERENCE TARGET CURRENT MINPODS MAXPODS AGE
…
gosu-main-api-v1 Deployment/gosu-main-api-v1 75% <waiting> 3 20 28m
…
Result from `kubectl describe hpa gosu-main-api-v1`
Name: gosu-main-api-v1
Namespace: default
Labels: <none>
Annotations: <none>
CreationTimestamp: Sat, 27 Aug 2016 17:17:31 -0700
Reference: Deployment/gosu-main-api-v1
Target CPU utilization: 75%
Current CPU utilization: <unset>
Min replicas: 3
Max replicas: 20
Events:
FirstSeen LastSeen Count From SubobjectPath Type Reason Message
--------- -------- ----- ---- ------------- -------- ------ -------
18m 16m 4 {horizontal-pod-autoscaler } Warning FailedGetMetrics failed to get CPU consumption and request: metrics obtained for 0/12 of pods
18m 16m 4 {horizontal-pod-autoscaler } Warning FailedComputeReplicas failed to get CPU utilization: failed to get CPU consumption and request: metrics obtained for 0/12 of pods
30m 30s 58 {horizontal-pod-autoscaler } Warning FailedGetMetrics failed to get CPU consumption and request: metrics obtained for 5/12 of pods
30m 30s 58 {horizontal-pod-autoscaler } Warning FailedComputeReplicas failed to get CPU utilization: failed to get CPU consumption and request: metrics obtained for 5/12 of pods
Excerpt from heapster logs
I0828 00:47:34.525902 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-wkcbo
I0828 00:47:34.526057 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-wkcbo
I0828 00:47:34.526107 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-vylz6
I0828 00:47:34.526165 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-vylz6
I0828 00:47:34.526249 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-ipr93
I0828 00:47:34.526304 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-ipr93
I0828 00:47:34.526387 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-qscb9
I0828 00:47:34.526444 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-qscb9
I0828 00:47:34.526493 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-x19pc
I0828 00:47:34.526548 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-x19pc
I0828 00:47:34.526591 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-3b1q3
I0828 00:47:34.526671 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-3b1q3
I0828 00:47:34.526754 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-02y6n
I0828 00:47:34.526822 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-02y6n
I0828 00:48:04.584791 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-x19pc
I0828 00:48:04.584816 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-x19pc
I0828 00:48:04.584837 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-ipr93
I0828 00:48:04.584866 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-ipr93
I0828 00:48:04.584893 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-02y6n
I0828 00:48:04.584961 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-02y6n
I0828 00:48:04.584986 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-wkcbo
I0828 00:48:04.585005 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-wkcbo
I0828 00:48:04.585048 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-1499485845-vylz6
I0828 00:48:04.585066 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-1499485845-vylz6
I0828 00:48:04.585097 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-3b1q3
I0828 00:48:04.585186 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-3b1q3
I0828 00:48:04.585201 1 handlers.go:242] No metrics for container gosu-main-api in pod default/gosu-main-api-v1-786519656-qscb9
I0828 00:48:04.585206 1 handlers.go:178] No metrics for pod default/gosu-main-api-v1-786519656-qscb9
**What you expected to happen** :
The deployment is currently set to 5 replicas, so I would expect the HPA to
get the resource consumption for the 5 pods that are scheduled.
**How to reproduce it** :
Cannot reliably bring other deployments into this broken state. Had the same
issue with another deployment, but it started to work again without outside
influence (maybe after we updated the deployment)
| 1 |
**Daniel Fernández** opened **SPR-8928** and commented
org.springframework.context.support.PropertySourcesPlaceholderConfigurer is
overriding the "postProcessBeanFactory" method defined in
org.springframework.beans.factory.config.PropertyResourceConfigurer in order
to register the required chain of PropertySource objects that will be used for
resolving properties. This makes sense and is OK.
But the problem is that the PropertySource object for the local properties
(those coming from the "location" attribute and also from the "properties"
property) is being created like this:
PropertySource<?> localPropertySource =
new PropertiesPropertySource(LOCAL_PROPERTIES_PROPERTY_SOURCE_NAME,
this.mergeProperties());
...whereas in the original "postProcessBeanFactory" method in
PropertyResourceConfigurer these merged properties are post-processed by
executing the "convertProperties" method:
Properties mergedProps = mergeProperties();
// Convert the merged properties, if necessary.
convertProperties(mergedProps);
// Let the subclass process the properties.
processProperties(beanFactory, mergedProps);
This means that the new PropertySourcesPlaceholderConfigurer class never calls
"convertProperties", and so disables one of the extension mechanisms of the
old pre-3.1 PropertyPlaceholderConfigurer class: overriding the
"convertProperty(...)" and "convertPropertyValue(...)" methods.
I am the author of jasypt (http://www.jasypt.org) and I am creating a
Spring 3.1-compatible EncryptedPropertySourcesPlaceholderConfigurer, but this
bug doesn't allow me to transparently apply property decryption in the
"convertProperty" method :-(
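The skipped step can be pictured as a small pipeline. This is an illustrative Python sketch of the merge/convert/process order described above (not Spring code; the class and method names only loosely mirror the Java ones, and the `ENC(...)` wrapper is a hypothetical encryption marker):

```python
# Illustrative sketch of the pre-3.1 PropertyResourceConfigurer flow:
# merge -> convert (the extension hook) -> process.
class PropertyResourceConfigurer:
    def merge_properties(self):
        return {"db.password": "ENC(secret)"}

    def convert_property(self, name, value):
        # Extension hook: subclasses (e.g. a jasypt-style configurer)
        # override this to decrypt or otherwise transform each value.
        return value

    def post_process(self):
        merged = self.merge_properties()
        return {k: self.convert_property(k, v) for k, v in merged.items()}


class DecryptingConfigurer(PropertyResourceConfigurer):
    def convert_property(self, name, value):
        # Hypothetical decryption: strip the ENC(...) wrapper.
        if value.startswith("ENC(") and value.endswith(")"):
            return value[4:-1]
        return value


# The bug described above: PropertySourcesPlaceholderConfigurer builds its
# PropertySource straight from merge_properties(), so convert_property is
# never called and the encrypted value leaks through unconverted.
```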
* * *
**Affects:** 3.1 GA
**Issue Links:**
* #18574 convertPropertyValue (for reading encrypted values) not working ( _ **"is duplicated by"**_ )
* #13603 Allow the use of custom PropertySource annotations in `@Configuration` classes
* SEC-3123 Encrypted property value support
* #15294 Add encryption support for PropertyPlaceholderConfigurer
* #17236 Backport encrypted property functionality from spring-cloud-config environment work
8 votes, 18 watchers
|
**Marten Deinum** opened **SPR-4238** and commented
The prop tag doesn't support a value attribute. It would be nice if it did, so
that we could use the shorthand notation. There was some discussion about this
in this thread.
http://forum.springframework.org/showthread.php?p=155337#post155337
* * *
**Issue Links:**
* #8917 less verbose xml binding of java.lang.Properties ( _ **"duplicates"**_ )
| 0 |
Currently, when exporting markdown to html, all html special characters are
escaped:
using Markdown
a = Markdown.parse("This is *important* text with <i>html</i> in it");
# parsed as
Markdown.Paragraph(Any["This is ", Markdown.Italic(Any["important"]), " text with <i>html</i> in it"])
# then exporting to html
Markdown.html(a)
# output below
"<p>This is <em>important</em> text with <i>html</i> in it</p>\n"
and there is no way of getting such an output:
<p>This is <em>important</em> text with <i>html</i> in it</p>
So, this issue is a feature request.
It could even be the default behavior for all markdown blocks except code
block.
(similar to #34042)
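For comparison, the two behaviors (escaping HTML vs. passing it through) can be sketched with Python's stdlib `html` module (illustrative only; `Base.Markdown` is Julia code and this is not its API):

```python
import html

text = "This is *important* text with <i>html</i> in it"

# Current Julia behavior: special characters are escaped on export,
# so <i> comes out as &lt;i&gt; in the generated HTML.
escaped = html.escape(text)

# Requested behavior: leave raw HTML untouched so the <i> tag survives.
raw = text
```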
|
I noticed `Markdown.parse` doesn't parse HTML tags properly while writing docs
with `Documenter.jl` (xref: JuliaDocs/Documenter.jl#176).
Most Markdown parsers support this feature, so I think `Base.Markdown` should
do as well.
For example, two consecutive hyphens are recognized as an em dash as follows:
julia> Markdown.parse("<!-- comment -->")
<!– comment –>
CC: @MichaelHatherly
* * *
julia> versioninfo()
Julia Version 0.5.0-rc1+0
Commit cede539* (2016-08-04 08:48 UTC)
Platform Info:
System: Darwin (x86_64-apple-darwin14.5.0)
CPU: Intel(R) Core(TM) i5-4288U CPU @ 2.60GHz
WORD_SIZE: 64
BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Haswell)
LAPACK: libopenblas64_
LIBM: libopenlibm
LLVM: libLLVM-3.7.1 (ORCJIT, haswell)
| 1 |
Describe what you were doing when the bug occurred:
1.
2.
3.
* * *
## Please do not remove the text below this line
DevTools version: 4.2.1-3816ae7c3
Call stack: at chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:40:157108
at Map.forEach ()
at commitIndex (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:40:157054)
at e.getRankedChartData (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:40:157577)
at vl (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:40:314907)
at gi (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:32:59907)
at jl (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:32:107381)
at Lc (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:32:92715)
at Pc (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:32:92640)
at wc (chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/main.js:32:89544)
Component stack: in vl
in div
in div
in div
in wo
in Unknown
in n
in Unknown
in div
in div
in Li
in $e
in dn
in Ca
in Pc
|
I get three warnings when React DevTools is installed in Chrome. Other people
have reported this but I have not seen a good solution. (I don't use an ad
blocker)
DevTools failed to load SourceMap: Could not load content for chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/injectGlobalHook.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
DevTools failed to load SourceMap: Could not load content for chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/react_devtools_backend.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
DevTools failed to load SourceMap: Could not load content for chrome-extension://fmkadmapgofadopljbjfkapdkoienihi/build/contentScript.js.map: HTTP error: status code 404, net::ERR_UNKNOWN_URL_SCHEME
Google Chrome Version 86.0.4240.111 (Official Build) (64-bit)
React version:
"react": "^16.13.1",
"react-dom": "^16.13.1",
"react-router-dom": "^5.2.0",
"react-scripts": "3.4.3"
Also using Apollo Client
## The current behavior
The above warnings are generated on every render for every page.
## The expected behavior
No warnings
## Files
\----index.js file ----
import React from "react";
import { render } from "react-dom";
import { BrowserRouter } from "react-router-dom";
import {
ApolloClient,
InMemoryCache,
createHttpLink,
ApolloProvider,
} from "@apollo/client";
// Top level view. Does the setup of the Apollo Provider and renders the App
import App from "./App";
const link = createHttpLink({
uri: "http://localhost:4000",
headers: { authorization: localStorage.getItem("token") },
});
const cache = new InMemoryCache();
const client = new ApolloClient({
link: link,
cache: cache,
});
render(
<ApolloProvider client={client}>
<BrowserRouter>
<div> <App /> </div>
</BrowserRouter>
</ApolloProvider>,
document.getElementById("root")
);
| 0 |
Source:
https://discord.com/channels/684898665143206084/684898665151594506/1022408024725405698
The following code throws an error in Deno versions 1.17 and upwards:
const { instance, module } = await WebAssembly.instantiateStreaming(
fetch("https://wpt.live/wasm/incrementer.wasm"),
);
const increment = instance.exports.increment as (input: number) => number;
console.log(increment(41));
deno run -A wasm.ts
Deno 1.16:
42
Deno 1.17:
error: Uncaught (in promise) Error: request or response body error: error reading a body from connection: unexpected end of file
at async Object.pull (deno:ext/fetch/26_fetch.js:104:24)
Deno 1.25:
error: Uncaught (in promise) Error: request or response body error: error reading a body from connection: unexpected end of file
at async Object.pull (deno:ext/fetch/26_fetch.js:119:24)
|
If I install a program
deno install catj https://deno.land/std/examples/catjson.ts --allow-read
I want to be able to use flags like `--reload` or `-L=info` with the newly
installed program:
catj --reload
^-- does not work as intended.
| 0 |
Calling `.popover('hide')` on an element does nothing in IE8, and works fine
in other browsers. Here's a demo: http://jsfiddle.net/QaHkV/1/
This also didn't work when I tried it locally with all the shims/hacks
suggested for IE8.
|
Using toggle, the popover is hidden and everything is OK. When using the hide
command, the popover is hidden, but the button inside the popover is still
triggering mouse-over events.
http://jsfiddle.net/conx/n8FYQ/4/
| 1 |
* I tried using the `@types/superagent@3.5.5` package and had problems.
* I tried using the latest stable version of tsc. https://www.npmjs.com/package/typescript
* I have a question that is inappropriate for StackOverflow. (Please ask any appropriate questions there).
* Mention the authors (see `Definitions by:` in `index.d.ts`) so they can respond.
* Authors: @NicoZelaya @mxl @paplorinc
I'm using `superagent` in the browser environment (not Node). Importing
`superagent` loads type definitions `@types/superagent` which unconditionally
imports `@types/node` which injects global type definitions for Node
environment. After that, the TypeScript compiler tells me that `setTimeout`
returns `NodeJS.Timer` while I expect it to return `number` because there
should be no `NodeJS` in the browser.
I'm not using the latest `tsc` because I'm building a React+TypeScript app via
https://github.com/zhenwenc/create-react-app
(https://www.npmjs.com/package/zc-react-scripts);
`typescript@2.4.2` is bundled with the latest version `zc-react-scripts@1.1.1`
at the time of writing.
|
If you know how to fix the issue, make a pull request instead.
* I tried using the `@types/xxxx` package and had problems.
* I tried using the latest stable version of tsc. https://www.npmjs.com/package/typescript
* I have a question that is inappropriate for StackOverflow. (Please ask any appropriate questions there).
* Mention the authors (see `Definitions by:` in `index.d.ts`) so they can respond.
* Authors: @GlenCFL
Since different emitters can have different 'emissions', having a global
interface, especially one defining 'catch-all' string index is not helpful _at
all_. Adding `Emissions` as a generic parameter to `Emitter` itself would work
_much_ better. Example:
/**
* Utility class to be used when implementing event-based APIs that allows
* for handlers registered via ::on to be invoked with calls to ::emit.
*/
export class Emitter<Emissions = { [key: string]: any }> implements DisposableLike {
/** Construct an emitter. */
constructor();
/** Clear out any existing subscribers. */
clear(): void;
/** Unsubscribe all handlers. */
dispose(): boolean;
// Event Subscription
/** Registers a handler to be invoked whenever the given event is emitted. */
on<T extends keyof Emissions>(eventName: T, handler: (value?: Emissions[T]) => void):
Disposable;
/**
* Register the given handler function to be invoked the next time an event
* with the given name is emitted via ::emit.
*/
once<T extends keyof Emissions>(eventName: T, handler: (value?: Emissions[T]) => void):
Disposable;
/**
* Register the given handler function to be invoked before all other
* handlers existing at the time of subscription whenever events by the
* given name are emitted via ::emit.
*/
preempt<T extends keyof Emissions>(eventName: T, handler: (value?: Emissions[T]) => void):
Disposable;
// Event Emission
/** Invoke the handlers registered via ::on for the given event name. */
emit<T extends keyof Emissions>(eventName: T, value?: Emissions[T]): void;
}
| 0 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.3
* Operating System version: MacOs
* Java version: 1.8
### Steps to reproduce this issue
This is not exactly an issue, but a question about the referenceCount logic.
Originally, I didn't know why we have to check the referenceCount before
closing the connection.
final class ReferenceCountExchangeClient implements ExchangeClient {
@Override
public void close(int timeout) {
if (referenceCount.decrementAndGet() <= 0) {
if (timeout == 0) {
client.close();
} else {
client.close(timeout);
}
replaceWithLazyClient();
}
}
}
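For context, the reference count exists because several invokers can share one underlying exchange client; closing the connection on the first `close()` call would break the remaining holders. A minimal illustrative sketch of the idea (Python, not Dubbo code; all names here are hypothetical):

```python
class SharedClient:
    """The real underlying connection, shared by several references."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


class ReferenceCountedClient:
    """Counts holders; only the last close() touches the shared client."""
    def __init__(self, client):
        self.client = client
        self.count = 0

    def acquire(self):
        self.count += 1
        return self

    def close(self):
        # Mirrors referenceCount.decrementAndGet() <= 0 in the Java above:
        # closing early would break every other holder of the connection.
        self.count -= 1
        if self.count <= 0:
            self.client.close()


shared = ReferenceCountedClient(SharedClient())
a = shared.acquire()
b = shared.acquire()
a.close()  # the connection stays open: b still holds a reference
b.close()  # last reference released: now the connection really closes
```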
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.2
* Operating System version: Mac
* Java version: 8
### Steps to reproduce this issue
#### Setting the property values
* Provider
> dubbo.provider.filter=metrics
> dubbo.provider.metrics.protocol=dubbo
> dubbo.provider.metrics.port=20880
* Consumer
> dubbo.consumer.filter=metrics
> dubbo.consumer.metrics.protocol=dubbo
> dubbo.consumer.metrics.port=20880
`dubbo.provider` is mapped to the `metrics` field in `ProviderConfig`, and
`dubbo.consumer` to the `metrics` field in `ConsumerConfig`. But when
`AbstractConfig` parses them, the method parameter type can only be primitive,
so the values inside cannot be put into the URL parameter list. As a result,
dubbo-admin cannot obtain the protocol and port.
Pls. provide [GitHub address] to reproduce this issue.
### Expected Result
>
> Why not provide `dubbo.metrics`, like the XML `<dubbo:metrics` tag, and then
> map it in `DubboConfigConfiguration`?
@EnableDubboConfigBindings({
@EnableDubboConfigBinding(prefix = "dubbo.application", type = ApplicationConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.module", type = ModuleConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.registry", type = RegistryConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.protocol", type = ProtocolConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.monitor", type = MonitorConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.provider", type = ProviderConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.consumer", type = ConsumerConfig.class),
@EnableDubboConfigBinding(prefix = "dubbo.config-center", type = ConfigCenterBean.class),
@EnableDubboConfigBinding(prefix = "dubbo.metadata-report", type = MetadataReportConfig.class)
})
| 0 |
1. Open directory in Atom. Directory is under git.
2. Open file1.coffee in editor tab.
3. In Linux console checkout another git branch where file1.coffee is absent.
4. Now in atom you should see error message
**Atom Version** : 0.194.0
**System** : linux 3.19.0-12-generic
**Thrown From** : Atom Core
### Stack Trace
Uncaught Error: ENOENT: no such file or directory, open
'/my/file/path/file1.coffee'
At events.js:141
Error: ENOENT: no such file or directory, open '/my/file/path/file1.coffee'
at Error (native)
### Config
{
"core": {
"excludeVcsIgnoredPaths": false,
"disabledPackages": [
"select-scope",
"language-ruby-on-rails",
"git-diff-details"
]
},
"editor": {
"fontSize": 21,
"showInvisibles": true,
"showIndentGuide": true,
"preferredLineLength": 120,
"tabLength": 4,
"invisibles": {}
}
}
### Installed Packages
# User
atom-grails, v0.1.0
autocomplete-plus, v2.12.0
editorconfig, v1.0.0
highlight-selected, v0.9.2
language-hjson, v0.2.0
linter, v0.12.1
linter-coffeelint, v0.2.1
linter-htmlhint, v0.0.13
linter-jshint, v0.1.2
word-jumper, v0.2.0
# Dev
No dev packages
|
Some repro steps that I gathered from the existing issues:
1. Have a file open in branch `something` that doesn't exist in branch `atom`
2. `git checkout atom`
3. ENOENT 💥
OR
4. Have a file open in Atom that will be affected by `git rebase`
5. `git rebase`
6. ENOENT 👎
OR
Just simply rename a file outside of Atom that's currently open inside of
Atom.
| 1 |
body {
background-color: #000000;
}
</style> I am using the correct code but I am not able to get past this
waypoint. I see others have had this issue, but I have tried what resolved
other users' issues with no result. How can I get past this?
|
Challenge Waypoint: Use Hex Code for Specific Colors has an issue.
User Agent is: `Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36
(KHTML, like Gecko) Chrome/46.0.2486.0 Safari/537.36 Edge/13.10586`.
Please describe how to reproduce this issue, and include links to screenshots
if possible.
My answer is correct but wrongly flagged as erroneous.
My code:
<style>
body {
background-color: #000000;
}
</style>
| 1 |
I tried running Flutter for the first time from Android Studio and I got a
Gradle error. I connected my phone to my computer and it was recognized
alright. I tried to run it again and it gave me the same error message,
like this:
Launching lib\main.dart on SZ622YHRZW in debug mode...
Initializing gradle...
Finished with error: Exit code 1 from: C:\Users\Andi\jet_flutter_app\android\gradlew.bat -v:
Downloading https://services.gradle.org/distributions/gradle-3.3-all.zip
................................................................................
Unzipping C:\Users\Andi\.gradle\wrapper\dists\gradle-3.3-all\55gk2rcmfc6p2dg9u9ohc3hw9\gradle-3.3-all.zip to C:\Users\Andi\.gradle\wrapper\dists\gradle-3.3-all\55gk2rcmfc6p2dg9u9ohc3hw9
Exception in thread "main" java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:219)
at java.util.zip.ZipFile.<init>(ZipFile.java:149)
at java.util.zip.ZipFile.<init>(ZipFile.java:163)
at org.gradle.wrapper.Install.unzip(Install.java:214)
at org.gradle.wrapper.Install.access$600(Install.java:27)
at org.gradle.wrapper.Install$1.call(Install.java:74)
at org.gradle.wrapper.Install$1.call(Install.java:48)
at org.gradle.wrapper.ExclusiveFileAccessManager.access(ExclusiveFileAccessManager.java:65)
at org.gradle.wrapper.Install.createDist(Install.java:48)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:128)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
Please check my command line feedback on the error from the attached txt file.
Flutter doctor runs alright.
|
I don't see any in https://flutter.io/ui-performance/, but maybe they're
somewhere else?
I'm attempting to reply to: #7053 (comment) and request a trace.
| 0 |
**Glide Version** : 4.13.2
**Integration libraries** : None
**Device/Android Version** : Emulator (Pixel 2 - API 30 - Android 11) and
Huawei P30 lite
**Issue details / Repro steps / Use case background** :
I'm trying to load images from the web, delivered by an API. Those images
sometimes contain EXIF data with rotation information and Glide should
automatically rotate those as requested. But for example in the case of the
attached image (see below) the rotation isn't correctly applied.
* If a URL (as string) or a file is used to request the image via Glide, it doesn't get rotated. 🛑
* If I use the path of the file the rotation values are honored. ✅
**Glide load line /`GlideModule` (if any) / list Adapter code (if any)**:
I just created a default Android example app, added three image views and
loaded the same image in the three mentioned ways.
In onViewCreated() in FirstFragment
val url = "https://url-hosting-some-images.com/your-image.HEIC"
val result = runBlocking(Dispatchers.IO) {
val file = Glide.with(this@FirstFragment).downloadOnly().load(url).submit()
file.get()
}
firstImage(url)
secondImage(result)
thirdImage(result.absolutePath)
The actual methods in FirstFragment
// rotation is missing
private fun firstImage(url: String) {
Glide
.with(this)
.load(url)
.into(binding.imageFirst);
}
// rotation is missing
private fun secondImage(file: File) {
Glide
.with(this)
.load(file)
.into(binding.imageSecond);
}
// rotation is correctly displayed
private fun thirdImage(path: String) {
Glide
.with(this)
.load(path)
.into(binding.imageThird);
}
**Layout XML** :
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".FirstFragment">
<TextView
android:id="@+id/textview_first"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/hello_first_fragment"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<ImageView
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@id/textview_first"
tools:src="@drawable/ic_launcher_foreground"
android:id="@+id/image_first"
android:contentDescription="first"
android:layout_width="0dp"
android:layout_height="150dp" />
<ImageView
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@id/image_first"
tools:src="@drawable/ic_launcher_foreground"
android:id="@+id/image_second"
android:contentDescription="second"
android:layout_width="0dp"
android:layout_height="150dp" />
<ImageView
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@id/image_second"
tools:src="@drawable/ic_launcher_foreground"
android:id="@+id/image_third"
android:contentDescription="third"
android:layout_width="0dp"
android:layout_height="150dp" />
<Button
android:id="@+id/button_first"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/next"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@id/image_third" />
</androidx.constraintlayout.widget.ConstraintLayout>
**Stack trace / LogCat** : No errors
**Screenshot** :

**Example image** :
As GitHub doesn't support HEIC attachments I had to zip it.
example_image.zip
|
Invalid image: ExifInterface got an unsupported image format
file(ExifInterface supports JPEG and some RAW image formats only) or a
corrupted JPEG file to ExifInterface
W/ExifInterface: Invalid image: ExifInterface got an unsupported image format
file(ExifInterface supports JPEG and some RAW image formats only) or a
corrupted JPEG file to ExifInterface.
java.io.IOException: Invalid byte order: 4646
at
androidx.exifinterface.media.ExifInterface.readByteOrder(ExifInterface.java:6581)
at
androidx.exifinterface.media.ExifInterface.parseTiffHeaders(ExifInterface.java:6588)
at
androidx.exifinterface.media.ExifInterface.getRawAttributes(ExifInterface.java:5617)
at
androidx.exifinterface.media.ExifInterface.loadAttributes(ExifInterface.java:4583)
at androidx.exifinterface.media.ExifInterface.<init>(ExifInterface.java:4032)
at androidx.exifinterface.media.ExifInterface.<init>(ExifInterface.java:3983)
at
com.bumptech.glide.load.resource.bitmap.ExifInterfaceImageHeaderParser.getOrientation(ExifInterfaceImageHeaderParser.java:40)
at
com.bumptech.glide.load.resource.bitmap.ExifInterfaceImageHeaderParser.getOrientation(ExifInterfaceImageHeaderParser.java:53)
at
com.bumptech.glide.load.ImageHeaderParserUtils$4.getOrientation(ImageHeaderParserUtils.java:147)
at
com.bumptech.glide.load.ImageHeaderParserUtils.getOrientationInternal(ImageHeaderParserUtils.java:222)
at
com.bumptech.glide.load.ImageHeaderParserUtils.getOrientation(ImageHeaderParserUtils.java:142)
at
com.bumptech.glide.load.resource.bitmap.ImageReader$ByteBufferReader.getImageOrientation(ImageReader.java:166)
at
com.bumptech.glide.load.resource.bitmap.Downsampler.decodeFromWrappedStreams(Downsampler.java:330)
at
com.bumptech.glide.load.resource.bitmap.Downsampler.decode(Downsampler.java:285)
at
com.bumptech.glide.load.resource.bitmap.Downsampler.decode(Downsampler.java:187)
at
com.bumptech.glide.load.resource.bitmap.ByteBufferBitmapDecoder.decode(ByteBufferBitmapDecoder.java:28)
at
com.bumptech.glide.load.resource.bitmap.ByteBufferBitmapDecoder.decode(ByteBufferBitmapDecoder.java:12)
at
com.bumptech.glide.load.engine.DecodePath.decodeResourceWithList(DecodePath.java:92)
at
com.bumptech.glide.load.engine.DecodePath.decodeResource(DecodePath.java:70)
at com.bumptech.glide.load.engine.DecodePath.decode(DecodePath.java:59)
at
com.bumptech.glide.load.engine.LoadPath.loadWithExceptionList(LoadPath.java:76)
at com.bumptech.glide.load.engine.LoadPath.load(LoadPath.java:57)
at com.bumptech.glide.load.engine.DecodeJob.runLoadPath(DecodeJob.java:535)
at
com.bumptech.glide.load.engine.DecodeJob.decodeFromFetcher(DecodeJob.java:499)
at com.bumptech.glide.load.engine.DecodeJob.decodeFromData(DecodeJob.java:485)
at
com.bumptech.glide.load.engine.DecodeJob.decodeFromRetrievedData(DecodeJob.java:430)
at
com.bumptech.glide.load.engine.DecodeJob.onDataFetcherReady(DecodeJob.java:394)
at
com.bumptech.glide.load.engine.DataCacheGenerator.onDataReady(DataCacheGenerator.java:100)
at
com.bumptech.glide.load.model.ByteBufferFileLoader$ByteBufferFetcher.loadData(ByteBufferFileLoader.java:62)
at
com.bumptech.glide.load.engine.DataCacheGenerator.startNext(DataCacheGenerator.java:77)
at com.bumptech.glide.load.engine.DecodeJob.runGenerators(DecodeJob.java:311)
at com.bumptech.glide.load.engine.DecodeJob.runWrapped(DecodeJob.java:277)
at com.bumptech.glide.load.engine.DecodeJob.run(DecodeJob.java:235)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1162)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:636)
at
com.bumptech.glide.load.engine.executor.GlideExecutor$DefaultThreadFactory$1.run(GlideExecutor.java:413)
at java.lang.Thread.run(Thread.java:764)
at
com.bumptech.glide.load.engine.executor.GlideExecutor$DefaultPriorityThreadFactory$1.run(GlideExecutor.java:372)
Actually, this is not a bug in the Android ExifInterface, but a bug in
Glide, and it happens with WebP images.
1, Why did the exception happen?
Before the exception happened, I debugged the code and found that the image
MIME type check result is "unknown".
I found that the ExifInterfaceImageHeaderParser gets wrong header data, which
looks like this: " [70, 70, -86, 8, 0, 0, 87, 69, 66, 80, 86, ......." .
A WebP file starts with "RIFFxxxxWEBPxxxx"; it must begin with the four
characters "RIFF". But here the data begins with "FF", and "RI" is lost.
2, Why "FF" and not "RIFF"?
First, I checked the cache file, and its header starts with "RIFF", so it is
not a problem with the image file.
Then, checking the code of the getOrientation function, we see that two
image header parsers do the work:
DefaultImageHeaderParser
ExifInterfaceImageHeaderParser
The two parsers share one image file input stream object.
The first parser, DefaultImageHeaderParser, calls the imageReader object to
getUInt16 and checks the magic number.
Because it is a WebP file, it is not DefaultImageHeaderParser's job, so it
returns unknown and the next parser continues the work.
But when ExifInterfaceImageHeaderParser runs, it does not reset the shared
stream object.
Because the first parser read one UInt16 (2 bytes), the second parser reads
the header buffer from position = 2, not 0.
So the first 2 characters are lost.
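The two-byte offset described here is easy to reproduce with a plain byte stream. This is an illustrative Python sketch (Glide's parsers are Java; the stream contents are made-up bytes matching the WebP layout described above):

```python
import io

# Fake WebP bytes: "RIFF", a 4-byte little-endian size, then "WEBP".
stream = io.BytesIO(b"RIFF\xaa\x08\x00\x00WEBPVP8 ")

# The first parser (DefaultImageHeaderParser) reads a UInt16 magic number...
magic = stream.read(2)          # consumes b"RI"

# ...so the second parser (ExifInterfaceImageHeaderParser) starts reading
# the shared stream at position 2, and "RI" is gone.
broken_header = stream.read(4)  # b"FF\xaa\x08" is not a RIFF header

# The suggested fix: reset the shared stream before the second parser runs.
stream.seek(0)
fixed_header = stream.read(4)   # b"RIFF"
```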
Based on the WebP header definition:
**RIFF Header - 32 bits representing the ASCII characters ‘R’ ‘I’ ‘F’ ‘F’
File Size - 32 bits (uint32) representing the size of the file in bytes
starting at offset 8. The maximum value of this field is 2^32 minus 10 bytes
and thus the size of the whole file is at most 4GiB minus 2 bytes.
‘WEBP’ - 32 bits representing the ASCII characters ‘W’ ‘E’ ‘B’ ‘P’**
We know that for the WebP file type check, the first four characters are the
key point. So the MIME type check failed.
Then the EXIF info check runs into an unknown file type, parses the header as
JPEG, and throws an exception like:
"W/ExifInterface: Invalid image: ExifInterface got an unsupported image format
file(ExifInterface supports JPEG and some RAW image formats only) or a
corrupted JPEG file to ExifInterface. "
The root cause is that Glide provides the wrong stream byte data; it is not a
problem in ExifInterface.
Of course, this is based on ExifInterface version 1.2.0: ExifInterface seems
to support WebP header parsing from 1.2.0 (I found the WebP parsing code in
1.2.0 and did not check earlier versions).
So the suggestion is to add reset code for the input stream before
ExifInterfaceImageHeaderParser does any parsing action, in a new version.
I did this check with Glide 4.13.1 and ExifInterface 1.2.0.
_Originally posted by@aaronKueng in #4451 (comment)_
| 1 |
# Environment
Windows build number: 10.0.18362.86
Windows Terminal version (if applicable): 0.1.1361.0
Any other software?
# Steps to reproduce
Open a PowerShell tab in Windows Terminal. Attempt to use these PSReadLine key
bindings (use the Get-PSReadLineKeyHandler command to see a list of all key
bindings):
* Ctrl+Backspace (BackwardKillWord)
* Shift+Ctrl+Enter (InsertLineBelow)
* Ctrl+Space (MenuComplete)
* Shift+Tab (TabCompletePrevious)
* Alt+0 (DigitArgument) (and other digits)
* PageDown (ScrollDisplayDown)
* Ctrl+PageDown (ScrollDisplayDownLine)
* PageUp (ScrollDisplayUp)
* Ctrl+PageUp (ScrollDisplayUpLine)
* Ctrl+Alt+? (ShowKeyBindings)
* Alt+? (WhatIsKey)
# Expected behavior
Key executes the specified PSReadLine function.
# Actual behavior
Nothing happens (or the current tab changes in the case of Alt+digit)
|
* Your Windows build number: (Type `ver` at a Windows Command Prompt)
Microsoft Windows [Version 10.0.18362.86]
* What you're doing and what's happening: (Copy & paste specific commands and their output, or include screen shots)
Trying to add an empty line in a multi-line input by pressing `Shift` +
`Enter` with PSReadline in PowerShell
* What's wrong / what should be happening instead:
`Shift+Enter` is treated as `Enter`, and it tries to run my incomplete command
line
| 1 |
For reasons, I have a file called BUILD in my ios/ directory, which conflicts
with Xcode's default build output directory ('build/'), since macOS has a
case-insensitive file system by default.
I can get around that by setting SYMROOT in project.pbxproj, but the flutter
tools still insist on using 'build/'. It would be nice if the build directory
was configurable with a command-line parameter, and the xcode_backend.sh
script passed $SYMROOT in there.
## Steps to Reproduce
1. Create an empty (non-directory) file named 'build' in ios/
2. flutter run // <-- doesn't work, fails building iOS project and flutter
3. Change SYMROOT in the Xcode project file
4. flutter run // <-- still doesn't work, but only flutter fails now
## Flutter Doctor
[✓] Flutter (on Mac OS, channel master)
• Flutter at /Users/jakobr/src/flutter
• Framework revision `5f44b2d` (8 hours ago), 2016-08-22 17:14:30
• Engine revision `0949efe`
• Tools Dart version 1.19.0-dev.5.0
[x] Android toolchain - develop for Android devices
x Android Studio / Android SDK not found. Download from
https://developer.android.com/sdk/
(or visit https://flutter.io/setup/#android-setup for detailed instructions).
[✓] iOS toolchain - develop for iOS devices (Xcode 7.3.1)
• XCode at /Applications/Xcode.app/Contents/Developer
• Xcode 7.3.1, Build version 7D1014
[✓] Atom - a lightweight development environment for Flutter
• flutter plugin version 0.2.4
• dartlang plugin version 0.6.35
## Logs and Crash Reports
Flutter crash report; please file at
https://github.com/flutter/flutter/issues.
## command
flutter --suppress-analytics build flx --target=lib/main.dart --output-
file=example/ios/Flutter/app.flx
## exception
FileSystemException: Creation failed, path = 'build' (OS Error: File exists,
errno = 17)
dart:io _Directory.createSync
package:file/src/sync/local/local_directory.dart 13 _LocalDirectory.create
package:flutter_tools/src/base/file_system.dart 65 ensureDirectoryExists
package:flutter_tools/src/flx.dart 86 build.<async>
===== asynchronous gap ===========================
dart:async _Completer.completeError
package:flutter_tools/src/flx.dart 114 build.<async>
===== asynchronous gap ===========================
dart:async Future.Future.microtask
package:flutter_tools/src/flx.dart build
package:flutter_tools/src/commands/build_flx.dart 44 BuildFlxCommand.runInProject.<async>
dart:async _SyncCompleter.complete
package:flutter_tools/src/commands/build.dart 56 BuildSubCommand.runInProject.<async>
===== asynchronous gap ===========================
dart:async _asyncThenWrapperHelper
package:flutter_tools/src/commands/build_flx.dart BuildFlxCommand.runInProject
package:flutter_tools/src/runner/flutter_command.dart 186 FlutterCommand._run.<async>
dart:async _SyncCompleter.complete
package:flutter_tools/src/dart/pub.dart 60 pubGet.<async>
dart:async _SyncCompleter.complete
package:flutter_tools/src/base/process.dart 77 runCommandAndStreamOutput.<async>
===== asynchronous gap ===========================
dart:async _asyncThenWrapperHelper
package:flutter_tools/src/runner/flutter_command.dart FlutterCommand._run
package:flutter_tools/src/runner/flutter_command.dart 116 FlutterCommand.run
package:args/command_runner.dart 177 CommandRunner.runCommand.<fn>
dart:async Future.Future.sync
package:args/command_runner.dart 130 CommandRunner.runCommand
package:flutter_tools/src/runner/flutter_command_runner.dart 177 FlutterCommandRunner.runCommand.<async>
===== asynchronous gap ===========================
dart:async Future.Future.microtask
package:flutter_tools/src/runner/flutter_command_runner.dart FlutterCommandRunner.runCommand
package:args/command_runner.dart 104 CommandRunner.run.<fn>
dart:async Future.Future.sync
package:args/command_runner.dart 104 CommandRunner.run
package:flutter_tools/src/runner/flutter_command_runner.dart 123 FlutterCommandRunner.run
package:flutter_tools/executable.dart 88 main.<async>.<fn>.<async>
===== asynchronous gap ===========================
package:stack_trace Chain.capture
package:flutter_tools/executable.dart 82 main.<async>
## flutter doctor
|
These two files have duplicated code:
packages/flutter/test/rendering/rendering_tester.dart
packages/flutter_test/lib/src/widget_tester.dart
We should consolidate to simplify maintenance.
| 0 |
trait Bar<T> { fn dummy(&self); }
trait Car<T> { fn dummy(&self); }
trait Foo {
type A;
type B: Bar<Self::A>;
type C: Car<Self::A>;
fn get_b(&self) -> &Self::B;
}
fn test_bar<A, B: Bar<A>>(_: &B) {}
fn test<A, F: Foo<A=A>>(f: &F) {
test_bar(f.get_b());
}
Gives me:
<anon>:15:16: 15:23 error: the trait `Bar<A>` is not implemented for the type `<F as Foo>::B` [E0277]
<anon>:15 test_bar(f.get_b());
^~~~~~~
<anon>:15:16: 15:23 error: the trait `Car<A>` is not implemented for the type `<F as Foo>::C` [E0277]
<anon>:15 test_bar(f.get_b());
^~~~~~~
error: aborting due to 2 previous errors
playpen: application terminated with error code 101
There is a workaround:
fn test<A, B: Bar<A>, C: Car<A>, F: Foo<A=A, B=B, C=C>>(f: &F) {
test_bar(f.get_b());
}
But it's ugly and should not be necessary.
We hit this problem a lot with gfx::Device, and I'd like to see cleaner use of
it without explicit `CommandBuffer` bounds that we use as a workaround.
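For what it's worth, a less noisy workaround is to restate only the bound that is actually needed in a `where` clause, instead of threading every associated type through extra type parameters. A self-contained sketch (simplified to a single associated type; `MyBar`/`MyFoo` are made-up types for illustration):

```rust
trait Bar<T> { fn val(&self) -> i32; }

trait Foo {
    type A;
    type B: Bar<Self::A>;
    fn get_b(&self) -> &Self::B;
}

fn test_bar<A, B: Bar<A>>(b: &B) -> i32 { b.val() }

// Restate only the bound the body needs, keeping the signature small.
fn test<A, F: Foo<A = A>>(f: &F) -> i32
where
    F::B: Bar<A>,
{
    test_bar(f.get_b())
}

// Minimal concrete types to exercise the function.
struct MyBar;
impl Bar<u8> for MyBar { fn val(&self) -> i32 { 42 } }

struct MyFoo { b: MyBar }
impl Foo for MyFoo {
    type A = u8;
    type B = MyBar;
    fn get_b(&self) -> &MyBar { &self.b }
}

fn main() {
    println!("{}", test(&MyFoo { b: MyBar })); // prints 42
}
```

This keeps callers unchanged while avoiding the extra `B`/`C` parameters in the generic list.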
|
I've been running into a linker error `undefined reference to
'panicking::panic::h98aad983367da335F7E'` while trying to build with
`#![no_std]`. While trying to debug, I ended up with some code that looked
like this:
https://gist.github.com/anonymous/88c01ee6ee33e4bad656
When you try to compile, it fails saying that "panic" is a duplicate
definition. This makes sense because it's defined in `libcore`, but the error
explanation suggests that this _is_ possible, and that the correct way to do
it is to use `#![no_std]`, which I'm already doing.
Seems like either a problem with the library or with the error explanation?
| 0 |
I was looking at the ThreeJS materials. There are four materials that are
nearly the same thing. Basic, Lambert, Phong and NormalMap.
NormalMap seems to have all the features of Lambert and Phong and even has the
ability to turn off things like Specular (to create a Lambert material.)
I think that Basic is a material where color is set into emissive and it isn't
affected by lights.
Thus maybe four different materials could be combined into one.
I am wondering if it is worth keeping all of these materials around on the
shader side of things.
I think that with proper use of DEFINES (as it is already done) we can have
for the most part a single vertex / fragment shader pair for all standard
materials. The variations can be completely controlled via the DEFINES rather
than constructing new shaders via including snippets. I think that with proper
use of DEFINES in the more complex shader you can still get the speed you
would get from a simple Lambert shaders.
One full shader is a lot easier to maintain than three (or four) very similar
shaders. (I also struggle trying to follow the code because of the way the
snippets are combined -- I wonder if there is an alternative design possible
here that is easier to grok.)
Even if we combined things, we could still keep the existing material classes
around for creating specialized materials for those that do not want to deal
with all the options explicitly. Or one can create a factory class for
creating common sub-types of StandardMaterials. But underneath these would all
refer to the same shader, they would just provide simplified parameters to
users.
BTW I am open to other design suggestions regarding the shaders. I do not have
that much experience with dynamic shader generation designs.
(I push this in part because the more we use a single shader for everything,
the more we can group objects together into batches... There are some
additional complications -- specifically, one needs to be able to control the
variations without defines to do full batches -- but that can come later as
an option, I think.)
|
**Note:** I don't want to bury the lede for this request, so I moved a brief
summary to the top of the issue. Please refer to the second and third headings
for a more detailed background and rationale!
##### Proposal
When loading models with `GLTFLoader`, it would be nice if we could retain
associations between glTF scene graph elements and the Three.js objects that
correspond to them. I briefly discussed this problem with @donmccurdy and he
suggested that one of the following approaches might be acceptable:
1. Decorate Three.js objects via `userData` e.g., `material.userData.gltfMeta.index = 3`
2. Expose a lookup table somewhere e.g., `parser.getDefinition( material: THREE.Material ): GLTF.Material`
What do others think about this problem? Would you be willing to consider a
change that makes either of the suggested enhancements?
##### Background
Three.js' `GLTFLoader` currently produces an artifact that contains a Three.js
scene graph as well as a `GLTFParser` instance that holds the original,
deserialized glTF.
The `GLTFParser` holds associations between the elements in the original glTF
and the Three.js scene graph. These associations are expressed both by
`GLTFParser.prototype.getDependencies` and the `cache` object on the
`GLTFParser` instance.
In some cases, the relationship between the original glTF element and the
associated Three.js objects is obscured and cannot be trivially re-
established. For example, materials associated with a skinned mesh are cloned,
and the resulting clones are cached against a key that cannot be reconstructed
easily:
three.js/examples/jsm/loaders/GLTFLoader.js, lines 2118 to 2127 (at 648c4bb):
    cachedMaterial = material.clone();

    if ( useSkinning ) cachedMaterial.skinning = true;
    if ( useVertexTangents ) cachedMaterial.vertexTangents = true;
    if ( useVertexColors ) cachedMaterial.vertexColors = true;
    if ( useFlatShading ) cachedMaterial.flatShading = true;
    if ( useMorphTargets ) cachedMaterial.morphTargets = true;
    if ( useMorphNormals ) cachedMaterial.morphNormals = true;

    this.cache.add( cacheKey, cachedMaterial );
##### Description of the problem
Consider a model editor that allows a user to:
1. Load a glTF with any shape
2. Interactively change the value of any field of any element of the glTF
3. See the result of the change rendered in real time
4. Export a new glTF
In an ideal world, the editor should faithfully re-export the model with its
original hierarchy and shape. The only meaningful change in the exported file
should be the one made interactively by the user.
In the pursuit of this goal, it is very useful (and probably required) that an
association between the source glTF scene graph and the Three.js scene graph
be made, so that for any change we make to a glTF element, we can reflect that
change in the rendered Three.js scene in real-time without completely re-
loading the glTF.
Some of the information required to correctly correlate the glTF scene graph
with the Three.js scene graph is lost or obscured, and this makes the above
described editor scenario very difficult (perhaps impossible) to achieve.
##### Three.js version
* Dev
* r115
* ...
##### Browser
* All of them
* Chrome
* Firefox
* Internet Explorer
##### OS
* All of them
* Windows
* macOS
* Linux
* Android
* iOS
##### Hardware Requirements (graphics card, VR Device, ...)
| 0 |
Hello,
I am experiencing a similar issue to #7049.
I raised segmentio/nightmare#894 as well, but I think it has nothing to do
with Nightmare, since after installing the 32-bit version of Electron the
issue seems to have disappeared. It's not an emergency for me, since I have
no problem using the 32-bit version; I just thought you guys might want to
know about it.
segmentio/nightmare#894 has more description; I don't want to duplicate it.
If you need more information, let me know. I might also mention that I didn't
have this issue on Debian 8 -- it appeared only after I installed Linux Mint
18, so maybe it is something with the Linux kernel, since Mint uses a newer
version?
Cheers
|
Recently, our company has been using Electron to build an application on
Windows.
During testing, we found that every time the application was launched, it
would start not 1 Windows process but 2 or 3. After the application was
closed, only 1 of those processes exited, leaving another 1 or 2 zombie
processes still "running".
Is this normal?
The application was built with https://github.com/maxogden/electron-packager
Tested Electron versions: 0.26.1, 0.27.0, 0.27.0
Tested platforms: Windows 7 64-bit, Windows 8 32-bit
| 0 |
When a project imports properties from a separate MSBuild file via `<Import
Project="build.props" />`, MSBuild uses the defined properties, but on saving
a TS file they are ignored.
This worked with VS2013/VS2015 RC and TypeScript 1.5 beta, but after the
release it is broken.
I can provide a ZIP file with an example solution, but I cannot attach it to
a GitHub issue.
|
As reported by @licinioamendes in #2152 (comment):
> Would be nice allow VS macros for build commands.
>
> The result would be something like `$(ProjectDir)outputDirectoryPath` on
> textbox.
Also note that this is working in VS2013 but not in VS2015.
| 1 |
**Chris Beams** opened **SPR-4975** and commented
Today the `@MVC` programming model requires manual instantiation and
invocation of Validator classes:
public String onSubmit(@ModelAttribute Customer customer, BindingResult result) {
    new CustomerValidator().validate(customer, result);
    if (result.hasErrors()) {
        return "editAccount";
    }
    return "...";
}
It would be preferable to have validation be an integrated part of the `@MVC`
lifecycle.
For instance, Validator instances could be annotated as `@Component` and be
subject to component scanning. At any rate, whether declared via bean
definitions or component scanned, registered Validator instances should be
iterated through, and any returning true from Validator#supports() should be
used to validate candidate objects.
Additionally, I believe the interface-based approach is preferable here, vs a
`@Validator` approach, because the strong contract of supports()->validate()
is a useful and expressive one. It would be sufficient to simply allow
Validator instances to be component scanned. We just need to add iterating
through and invoking registered validators to the lifecycle.
* * *
**Issue Links:**
* #4803 Support for declarative validation (Hibernate Validator, anticipating JSR 303) ( _ **"duplicates"**_ )
|
**Rob Winch** opened **SPR-7128** and commented
Create a HandlerExceptionResolver that handles BindExceptions so that
boilerplate code can be removed from controllers using `@Valid`. See the
Spring forum reference for details.
PS: I would be more than happy to contribute this
* * *
**Reference URL:** http://forum.springsource.org/showthread.php?t=87408
**Attachments:**
* SPR-7128.patch ( _28.86 kB_ )
**Issue Links:**
* #12196 Allow standard form "retry on error" logic to be handled by framework code instead of duplicated across `@Controller` code
| 0 |
My first thought was extending `MarkdownWidget` and clipping the received
children, but if I'm not mistaken, longer text sections without an explicit
line break will be packed into a single widget.
Any thoughts?
|
## Steps to Reproduce
Set FontWeight.bold to Text's style. Emoji are not displayed.
void main() {
runApp(new MaterialApp(
home: new Scaffold(
appBar: new AppBar(
title: const Text('Emoji'),
),
body: new Container(
padding: const EdgeInsets.all(20.0),
child: new Column(
children: <Widget>[
new Text('normal 😀 💪 emoji'),
new Text('bold 😀 💪 emoji',
style: const TextStyle(fontWeight: FontWeight.bold)),
],
),
),
),
));
}
## Flutter Doctor
[✓] Flutter (on Mac OS X 10.12.6 16G29, locale ja-JP, channel master)
• Flutter at /Applications/flutter
• Framework revision 79fbf8bb03 (4 hours ago), 2017-10-18 20:55:00 -0700
• Engine revision 7c4142808c
• Tools Dart version 1.25.0-dev.11.0
[✓] Android toolchain - develop for Android devices (Android SDK 26.0.1)
• Android SDK at /Users/najeira/Library/Android/sdk
• Platform android-26, build-tools 26.0.1
• ANDROID_HOME = /Users/najeira/Library/Android/sdk
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] iOS toolchain - develop for iOS devices (Xcode 9.0)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 9.0, Build version 9A235
• ios-deploy 1.9.2
• CocoaPods version 1.2.1
[✓] Android Studio (version 2.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] IntelliJ IDEA Ultimate Edition (version 2017.2.1)
• Flutter plugin version 16.0
• Dart plugin version 172.3544.34
| 0 |
It would be extremely handy if the fancy zones showed their dimensions for the
OCD amongst us. This could be implemented by vertical & horizontal arrows
within each zone, showing the size.
|
https://www.reddit.com/r/Windows10/comments/ez4xpz/lets_fix_this_monitor_positioning_issue_after/
Reddit user vanarebane had a great idea here. When the mouse crosses a
monitor boundary, we extrapolate where the mouse should be mapped to on the
next monitor as if they had a 1:1 side-by-side map.
(image is from the Reddit link)
As someone with exactly the setup the image shows, this would be very nice to
have.
| 0 |
**TypeScript Version:**
1.8.2
**Code**
function whatever (idA, idB) {
    [idA, idB] = [idB, idA];
    return true;
}
**Expected behavior:**
function whatever(idA, idB) {
    var _a;
    _a = [idB, idA], idA = _a[0], idB = _a[1];
    return true;
}
**Actual behavior:**
function whatever(idA, idB) {
    _a = [idB, idA], idA = _a[0], idB = _a[1];
    return true;
    var _a;
}
Thanks to variable hoisting, there is no functional problem here. However, I
consider the compiled code unintuitive.
In addition, using Istanbul for code coverage highlights the `var _a;`
declaration as not covered.
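For context, a standalone sketch (plain JavaScript, not actual compiler output) of why the two emitted forms behave identically: `var` declarations are hoisted to the top of the function at runtime, so a trailing `var _a;` still makes `_a` function-local:

```javascript
function swapEnd(idA, idB) {
  // This assignment runs before the textual `var` below, but hoisting
  // makes `_a` a local variable for the whole function body.
  _a = [idB, idA], idA = _a[0], idB = _a[1];
  return [idA, idB];
  var _a; // unreachable at runtime, yet still hoisted
}

console.log(swapEnd(1, 2)); // prints [ 2, 1 ]
```

The placement only affects readability and coverage tooling, not behavior.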
|
_From @admmasters on April 5, 2016 13:34_
* VSCode Version: Version 0.10.14-insider (0.10.14-insider)
* OS Version: OSX 10.11.4
Steps to Reproduce:
1. Cmd+Click on a Component in JSX (also JS).
2. VSCode jumps to the import and not the file itself.
Expected:
VSCode should just jump to the file referenced in the import as its smart
enough to know the specific import.
_Copied from original issue: microsoft/vscode#4964_
| 0 |
If you scale the errors fed into curve_fit, the error returned does not
change. This is incorrect behavior. The error (variance and covariance) for a
final fit parameter should increase given an increased uncertainty in the
actual measured data.
I created a gaussian curve fitting routine. I then show one version
functioning as expected, and another version with error increased by a factor
of 1e10. The increase in 1e10 should be reflected in the variance/covariance
for all fitted parameters. It remains constant:
#!/usr/bin/env python
import numpy as np
import matplotlib.pyplot as plt
import collections as col
import scipy as sp
import scipy.optimize as spopt

plt.close('all')


def _gaussvar(x, amp, mu, variance, bg=0):
    return amp*np.exp(-(x-mu)**2/(2*variance)) + bg


def iter(factor):
    # Generate random numbers with gaussian
    # distribution, mu=0, sigma = 3
    x = np.random.randn(10000)*3
    # Bin the counts
    bins = 50
    h, edge = np.histogram(x, bins=bins)
    # Find the midpoints of the bins
    mids = edge + (edge[1]-edge[0])/2
    mids = mids[:-1]
    # Expected error in counts is sqrt(counts)
    sigma = np.sqrt(h)*1
    # Error of zero counts isn't zero, but less than one.
    # Use 0.5 as a guess.
    sigma[sigma == 0] = 0.5
    # Factor scales error, which should cause
    # the pcov matrix to change
    sigma = sigma*factor
    # Fit the histogram to a gaussian
    # popt, pcov, red_chisq = mt.gaussfit(mids, h, sigma_y=sigma, plot=True, variance_bool=True)
    # Find initial guesses
    y = h
    x = mids
    amp = max(y)
    mu = sum(x*y)/sum(y)
    variance = sum(x**2 * y)/sum(y)
    bg = 0
    p0 = np.array((amp, mu, variance, bg))
    # Do actual curve fit
    func = _gaussvar
    popt, pcov = spopt.curve_fit(func, x, y, sigma=sigma, p0=p0)
    output = col.namedtuple('iterout', ['popt', 'pcov'])
    out = output(popt, pcov)
    return out


num_samples = 1000
variances_regular = np.ones(num_samples)
variances_large = np.ones(num_samples)
means_regular = np.ones(num_samples)
means_large = np.ones(num_samples)

for i in range(num_samples):
    out = iter(1)
    means_regular[i] = out.popt[2]
    variances_regular[i] = out.pcov[2, 2]
    out = iter(1e10)
    means_large[i] = out.popt[2]
    variances_large[i] = out.pcov[2, 2]

plt.hist(variances_regular, bins=20)
plt.hist(variances_large, bins=20)
plt.show()
See also:

https://github.com/joelfrederico/GaussFitDemo
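For reference, `scipy.optimize.curve_fit` also takes an `absolute_sigma` flag: with the default `absolute_sigma=False`, `sigma` is treated only as relative weights and `pcov` is rescaled from the residual variance, which matches the behavior described above. A minimal sketch (a simple linear fit, not the reporter's script) showing that the covariance does scale with the errors when `absolute_sigma=True`:

```python
import numpy as np
from scipy.optimize import curve_fit


def line(x, a, b):
    return a * x + b


rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)
sigma = np.full_like(x, 0.1)

# absolute_sigma=True: sigma is taken as real 1-sigma measurement errors,
# so pcov is computed from the weighted Jacobian and scales with sigma**2.
_, pcov1 = curve_fit(line, x, y, sigma=sigma, absolute_sigma=True)
_, pcov2 = curve_fit(line, x, y, sigma=sigma * 10, absolute_sigma=True)

# Variances grow by the scale factor squared (10**2 = 100).
print(pcov2[0, 0] / pcov1[0, 0])
```

With `absolute_sigma=False` the same ratio comes out as 1, reproducing the issue.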
|
In the context of #14360 & #15040, we cleaned up a lot of our module API and
deprecated those that are supposed to be private, for removal that's scheduled
for SciPy 2.0
This was primarily (only?) done for the first-level modules though, not
further down the tree. The added test infrastructure contains:
# The PRIVATE_BUT_PRESENT_MODULES list contains modules that look public (lack
# of underscores) but should not be used. For many of those modules the
# current status is fine. For others it may make sense to work on making them
# private, to clean up our public API and avoid confusion.
# These private modules support will be removed in SciPy v2.0.0
PRIVATE_BUT_PRESENT_MODULES = [
...
IMO, without tracking what needs to be done (deprecate and make private, or
keep public) and then doing it, we won't be in a position to actually execute
this for SciPy 2.0, hence I'm opening this issue. Ideally we decide per module
what needs to be done, and then do it, so we can actually remove what we'd
like to remove when the time comes.
* `scipy.constants.codata`
* `scipy.constants.constants`
* `scipy.fftpack.basic`
* `scipy.fftpack.convolve`
* `scipy.fftpack.helper`
* `scipy.fftpack.pseudo_diffs`
* `scipy.fftpack.realtransforms`
* `scipy.integrate.odepack`
* `scipy.integrate.quadpack`
* `scipy.integrate.dop`
* `scipy.integrate.lsoda`
* `scipy.integrate.vode`
* `scipy.interpolate.dfitpack`
* `scipy.interpolate.fitpack`
* `scipy.interpolate.fitpack2`
* `scipy.interpolate.interpnd`
* `scipy.interpolate.interpolate`
* `scipy.interpolate.ndgriddata`
* `scipy.interpolate.polyint`
* `scipy.interpolate.rbf`
* `scipy.io.arff.arffread`
* `scipy.io.harwell_boeing`
* `scipy.io.idl`
* `scipy.io.mmio`
* `scipy.io.netcdf`
* `scipy.io.matlab.byteordercodes`
* `scipy.io.matlab.mio`
* `scipy.io.matlab.mio4`
* `scipy.io.matlab.mio5`
* `scipy.io.matlab.mio5_params`
* `scipy.io.matlab.mio5_utils`
* `scipy.io.matlab.mio_utils`
* `scipy.io.matlab.miobase`
* `scipy.io.matlab.streams`
* `scipy.linalg.basic`
* `scipy.linalg.decomp`
* `scipy.linalg.decomp_cholesky`
* `scipy.linalg.decomp_lu`
* `scipy.linalg.decomp_qr`
* `scipy.linalg.decomp_schur`
* `scipy.linalg.decomp_svd`
* `scipy.linalg.flinalg`
* `scipy.linalg.matfuncs`
* `scipy.linalg.misc`
* `scipy.linalg.special_matrices`
* `scipy.misc.common`
* `scipy.misc.doccer`
* `scipy.ndimage.filters`
* `scipy.ndimage.fourier`
* `scipy.ndimage.interpolation`
* `scipy.ndimage.measurements`
* `scipy.ndimage.morphology`
* `scipy.odr.models`
* `scipy.odr.odrpack`
* `scipy.optimize.cobyla`
* `scipy.optimize.cython_optimize`
* `scipy.optimize.lbfgsb`
* `scipy.optimize.linesearch`
* `scipy.optimize.minpack`
* `scipy.optimize.minpack2`
* `scipy.optimize.moduleTNC`
* `scipy.optimize.nonlin`
* `scipy.optimize.optimize`
* `scipy.optimize.slsqp`
* `scipy.optimize.tnc`
* `scipy.optimize.zeros`
* `scipy.signal.bsplines`
* `scipy.signal.filter_design`
* `scipy.signal.fir_filter_design`
* `scipy.signal.lti_conversion`
* `scipy.signal.ltisys`
* `scipy.signal.signaltools`
* `scipy.signal.spectral`
* `scipy.signal.spline`
* `scipy.signal.waveforms`
* `scipy.signal.wavelets`
* `scipy.signal.windows.windows`
* `scipy.sparse.base`
* `scipy.sparse.bsr`
* `scipy.sparse.compressed`
* `scipy.sparse.construct`
* `scipy.sparse.coo`
* `scipy.sparse.csc`
* `scipy.sparse.csr`
* `scipy.sparse.data`
* `scipy.sparse.dia`
* `scipy.sparse.dok`
* `scipy.sparse.extract`
* `scipy.sparse.lil`
* `scipy.sparse.linalg.dsolve`
* `scipy.sparse.linalg.eigen`
* `scipy.sparse.linalg.interface`
* `scipy.sparse.linalg.isolve`
* `scipy.sparse.linalg.matfuncs`
* `scipy.sparse.sparsetools`
* `scipy.sparse.spfuncs`
* `scipy.sparse.sputils`
* `scipy.spatial.ckdtree`
* `scipy.spatial.kdtree`
* `scipy.spatial.qhull`
* `scipy.spatial.transform.rotation`
* `scipy.special.add_newdocs`
* `scipy.special.basic`
* `scipy.special.cython_special`
* `scipy.special.orthogonal`
* `scipy.special.sf_error`
* `scipy.special.specfun`
* `scipy.special.spfun_stats`
* `scipy.stats.biasedurn`
* `scipy.stats.kde`
* `scipy.stats.morestats`
* `scipy.stats.mstats_basic`
* `scipy.stats.mstats_extras`
* `scipy.stats.mvn`
* `scipy.stats.statlib`
* `scipy.stats.stats`
| 0 |
**FoX** opened **SPR-4111** and commented
All ApplicationListeners will be instantiated when the ApplicationContext
registers its listeners.
The consequence of this operation for our application is that nearly all beans
are initialized at startup time even though lazy-init is set to true in all
contexts.
This is especially annoying in our integration tests, because the
initialization takes a lot of time.
Isn't there a possibility to register these listeners as a proxy, allowing
lazy initialization of these listeners until an application event occurs?
Or, maybe it would also be a valid solution to not put the listeners
themselves in the map, but the bean names (not that nice, as it would break
observer/observable pattern)
It seems that the getBeansOfType() method in the DefaultListableBeanFactory
doesn't take the allowEagerInit boolean into account when retrieving the
beans:
public Map getBeansOfType(Class type, boolean includePrototypes, boolean allowEagerInit)
        throws BeansException {
    String[] beanNames = getBeanNamesForType(type, includePrototypes, allowEagerInit);
    Map result = CollectionFactory.createLinkedMapIfPossible(beanNames.length);
    for (int i = 0; i < beanNames.length; i++) {
        String beanName = beanNames[i];
        try {
            // -----------------------> getBean(beanName) will always instantiate
            // the bean, eager init or not?
            result.put(beanName, getBean(beanName));
        }
        ...
    }
    ...
}
* * *
**Affects:** 2.0.7
**Issue Links:**
* #8733 ApplicationListener beans eagerly instantiated even when marked as lazy-init ( _ **"duplicates"**_ )
|
**Patras Vlad Sebastian** opened **SPR-7984** and commented
Allow mapping the same method to multiple URL's that differ in respect to what
path variables they use.
For example we do not want to transfer the id of the logged in user in the URL
for security reasons and simplicity, but there are pages that can either show
data for the logged in user or an other user, like a profile page.
I would like to be able to map a method like this:
@RequestMapping(value = {"/profile", "/profile/{userId}"})
public String showProfile(@PathVariable("userId") Integer userId) {
    if (userId == null) {
        userId = <get id of logged in user from session>;
    }
    // do stuff
}
The first path in the request mapping does not have a "userId" path variable,
and it would be nice if userId (the method parameter) were null in this case,
rather than throwing IllegalStateException.
* * *
**Affects:** 3.0.5
**Issue Links:**
* #13049 A `@PathVariable` provided in the method parameter, but not in the `@RequestMapping`, will throw an IllegalStateException. ( _ **"is duplicated by"**_ )
1 votes, 4 watchers
| 0 |
On SF 2.7 the same schemaLocation is used as before:
http://symfony.com/schema/dic/services/services-1.0.xsd
But that is an old schema; the new one exists only in the DependencyInjection
component.
I think it would be a good idea to create a new version, services-1.1.xsd,
and publish it under symfony.com.
|
http://symfony.com/schema/dic/services/services-1.0.xsd
is not in sync with
https://github.com/symfony/symfony/blob/master/src/Symfony/Component/DependencyInjection/Loader/schema/dic/services/services-1.0.xsd
(expression as valid value for argument_type missing)
| 1 |
### Preflight Checklist
* I have read the Contributing Guidelines for this project.
* I agree to follow the Code of Conduct that this project adheres to.
* I have searched the issue tracker for a feature request that matches the one I want to file, without success.
### Electron Version
16.0.0-beta.4
### What operating system are you using?
Ubuntu
### Operating System Version
20.04
### What arch are you using?
x64
### Last Known Working Electron version
15.3.0
### Expected Behavior
On Linux, `app.getPath('crashDumps')` should return something like:
~/.config/{{appName}}/Crash Reports/
### Actual Behavior
However, with v16 beta it returns:
~/.config/Electron/Crash Reports/
### Testcase Gist URL
_No response_
|
### Preflight Checklist
* I have read the Contributing Guidelines for this project.
* I agree to follow the Code of Conduct that this project adheres to.
* I have searched the issue tracker for a feature request that matches the one I want to file, without success.
### Electron Version
16.0.0-beta.3
### What operating system are you using?
Other Linux
### Operating System Version
Linux pop-os 5.13.0-7614-generic #14~1631647151~21.04~930e87c-Ubuntu
### What arch are you using?
x64
### Last Known Working Electron version
15.3.0
### Expected Behavior
From the documentation for `app.getPath("userData")`
(https://www.electronjs.org/docs/latest/api/app#appgetpathname):
> The directory for storing your app's configuration files, which by default
> it is the appData directory appended with your app's name.
So `app.getPath("userData")` in the linked gist should be `~/.config/test-
electron`.
### Actual Behavior
`app.getName()` returns the correct name from package.json but
`app.getPath("userData")` returns `~/.config/Electron`.
### Testcase Gist URL
https://gist.github.com/weedz/ad42b854df4e5bc411d18e48ae1c91df
### Additional Information
Found this issue which might be relevant #30112.
Same result in the following devcontainer
https://gist.github.com/weedz/c962d69078391115635772b279264c02.
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
Mac os
jdk1.8
Nacos 1.1.4 cluster
dubbo 2.7.6
spring boot 2.0.8
### Steps to reproduce this issue
alibaba/nacos#2928
### Expected Result
What do you expected from the above steps?
### Actual Result
What actually happens?
If there is an exception, please attach the exception trace:
Just put your stack trace here!
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.0
* Operating System version: xxx
* Java version: xxx
### Steps to reproduce this issue
Start the dubbo-demo consumer. By default its configuration uses ZooKeeper,
whereas the user will by default not use ZooKeeper but go with multicast, as
the provider is running on multicast.
1. xxx
2. xxx
3. xxx
Pls. provide [GitHub address] to reproduce this issue.
### Expected Result
What do you expected from the above steps?
### Actual Result
What actually happens?
If there is an exception, please attach the exception trace:
Just put your stack trace here!
| 0 |
This is a duplicate of #7540. Sorry.
|
Hi guys,
Due to the `model_selection` refactor, there are a lot of dead links out
there. Would it be an option to redirect the old pages such as
http://scikit-
learn.org/stable/modules/generated/sklearn.grid_search.GridSearchCV.html
to
http://scikit-
learn.org/stable/modules/generated/sklearn.model_selection.GridSearchCV.html
I know you're using an automated doc generator, so I can imagine this to be a
bit of a ball ache. But as an end user, I'm sure that this would be much
appreciated.
Thanks!
-Kris
| 1 |
# Checklist
* I have verified that the issue exists against the `master` branch of Celery.
* This has already been asked to the discussion group first.
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
* I have checked the pull requests list
for existing proposed fixes.
* I have checked the commit log
to find out if the bug was already fixed in the master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
* I have included the contents of `pip freeze` in the issue.
* I have included all the versions of all the external dependencies required
to reproduce this bug.
## Optional Debugging Information
* I have tried reproducing the issue on more than one Python version
and/or implementation.
* I have tried reproducing the issue on more than one message broker and/or
result backend.
* I have tried reproducing the issue on more than one version of the message
broker and/or result backend.
* I have tried reproducing the issue on more than one operating system.
* I have tried reproducing the issue on more than one workers pool.
* I have tried reproducing the issue with autoscaling, retries,
ETA/Countdown & rate limits disabled.
* I have tried reproducing the issue after downgrading
and/or upgrading Celery and its dependencies.
## Related Issues and Possible Duplicates
#### Related Issues
* #5627
A similar issue to the one that PR solved for args exists with kwargs.
#### Possible Duplicates
* None
## Environment & Settings
**Celery version** : 4.4.0
**`celery report` Output:**
# Steps to Reproduce
Just run celery beat.
## Required Dependencies
* **Minimal Python Version** : N/A or Unknown
* **Minimal Celery Version** : N/A or Unknown
* **Minimal Kombu Version** : N/A or Unknown
* **Minimal Broker Version** : N/A or Unknown
* **Minimal Result Backend Version** : N/A or Unknown
* **Minimal OS and/or Kernel Version** : N/A or Unknown
* **Minimal Broker Client Version** : N/A or Unknown
* **Minimal Result Backend Client Version** : N/A or Unknown
### Python Packages
**`pip freeze` Output:**
amqp==2.5.2
apipkg==1.5
apiritif==0.9.0
appdirs==1.4.3
Appium-Python-Client==0.48
appnope==0.1.0
argcomplete==1.10.0
arrow==0.10.0
asn1crypto==1.2.0
astunparse==1.6.3
atomicwrites==1.3.0
attrs==19.3.0
autopep8==1.4.4
Babel==2.7.0
backports.csv==1.0.7
bandit==1.6.2
bcrypt==3.1.7
beautifulsoup4==4.8.0
billiard==3.6.1.0
bleach==3.1.0
boto==2.49.0
bs4==0.0.1
bzt==1.14.0
cached-property==1.5.1
cachetools==3.1.1
cairocffi==1.1.0
CairoSVG==2.4.2
celery==4.4.0
certifi==2019.11.28
cffi==1.13.2
chardet==3.0.4
Click==7.0
click-completion==0.5.2
colorama==0.4.3
colorlog==4.0.2
ConfigArgParse==0.15.1
contextlib2==0.5.5
coreapi==2.3.3
coreschema==0.0.4
coverage==5.0.1
crayons==0.3.0
cryptography==2.8
cssselect==1.1.0
cssselect2==0.2.2
cssutils==1.0.2
ddt==1.2.1
decorator==4.4.1
defusedxml==0.6.0
diff-match-patch==20181111
Django==1.11.27
django-activity-stream==0.8.0
django-appconf==1.0.3
django-autofixture==0.12.1
django-celery-beat==1.5.0
django-celery-results==1.1.2
django-compressor==2.3
django-constance==2.0.0
django-countries==5.5
django-crispy-forms==1.8.1
django-debug-toolbar==2.1
django-debug-toolbar-request-history==0.1.0
django-environ==0.4.5
django-extensions==2.2.5
django-extra-fields==1.1.0
django-extra-views==0.10.0
django-fernet-fields==0.6
django-filter==2.2.0
django-fsm==2.7.0
django-guardian==1.5.1
django-haystack==2.8.1
django-import-export==1.2.0
django-jarc==1.0.15
django-jsonfield-compat==0.4.4
django-libsass==0.7
django-media-fixtures==0.0.3
django-memcache-status==2.1
django-memoize==2.2.1
django-modeltranslation==0.12.1
django-oscar==1.5.4
django-oscar-api==1.0.10.post1
django-phonenumber-field==1.3.0
django-picklefield==1.0.0
django-private-storage==2.2.1
django-ratelimit==2.0.0
django-recaptcha2==1.4.1
django-rest-hooks==1.5.0
django-rest-swagger==2.1.2
django-reversion==2.0.8
django-ses==0.8.13
django-silk==3.0.4
django-simple-history==2.8.0
django-tables2==1.16.0
django-timezone-field==3.1
django-transaction-hooks==0.2
django-treebeard==4.3
django-webtest==1.9.7
django-widget-tweaks==1.4.5
djangorestframework==3.9.4
djangorestframework-jwt==1.11.0
djangorestframework-simplejwt==4.3.0
djangorestframework-xml==1.4.0
dnspython==1.15.0
docx2txt==0.8
dparse==0.4.1
EasyProcess==0.2.8
EbookLib==0.17.1
email-validator==1.0.2
entrypoints==0.3
ephem==3.7.6.0
et-xmlfile==1.0.1
execnet==1.7.1
extract-msg==0.23.1
factory-boy==2.12.0
fake-factory==0.7.2
Faker==3.0.0
fintech==4.3.5
flake8==3.7.9
flake8-bugbear==19.8.0
flake8-isort==2.8.0
flake8-polyfill==1.0.2
Flask==1.1.1
flower==0.9.3
fpdf==1.7.2
freezegun==0.3.12
future==0.18.2
fuzzyset==0.0.19
gevent==1.5a2
geventhttpclient-wheels==1.3.1.dev2
gitdb==0.6.4
gitdb2==2.0.6
GitPython==3.0.5
google-api-python-client==1.7.11
google-auth==1.10.0
google-auth-httplib2==0.0.3
google-auth-oauthlib==0.2.0
gprof2dot==2019.11.30
graypy==2.1.0
greenlet==0.4.15
hdrpy==0.3.3
html2text==2019.9.26
html5lib==1.0.1
httplib2==0.14.0
idna==2.8
IMAPClient==2.1.0
importlib-metadata==1.3.0
inflection==0.3.1
ipaddress==1.0.23
ipcalc==1.99.0
ipython==5.3.0
ipython-genutils==0.1.0
isodate==0.6.0
isort==4.3.21
itsdangerous==1.1.0
itypes==1.1.0
jdcal==1.4.1
Jinja2==2.10.3
jsonfield==2.0.2
jsonpath-rw==1.4.0
kombu==4.6.7
libsass==0.19.4
line-profiler==2.0
locustio==0.13.5
lxml==4.4.2
M2Crypto==0.35.2
manhole==1.6.0
MarkupPy==1.14
MarkupSafe==1.1.1
mccabe==0.6.1
mock==2.0.0
mollie-api-python==1.4.5
more-itertools==8.0.2
msgpack==0.6.2
msgpack-python==0.5.6
mysqlclient==1.3.10
newrelic==5.4.0.132
nose==1.3.7
numpy==1.18.0
oauth2client==4.1.3
oauthlib==3.1.0
odfpy==1.4.0
olefile==0.46
openapi-codec==1.3.2
openpyxl==2.6.4
packaging==19.2
paramiko==2.7.1
pathlib2==2.3.5
pbr==5.4.4
pdf417gen==0.7.0
pdfminer.six==20181108
pdfrw==0.4
pep8==1.7.1
pexpect==4.2.1
phonenumbers==8.11.1
pickleshare==0.7.4
Pillow==6.2.1
pip-licenses==1.17.0
pip-review==1.0
pipdeptree==0.12.1
pkg-resources==0.0.0
pkgconfig==1.5.1
pluggy==0.13.1
ply==3.11
premailer==3.6.1
progressbar33==2.4
prompt-toolkit==1.0.13
psutil==5.6.7
psycopg2-binary==2.8.4
PTable==0.9.2
ptyprocess==0.5.1
purl==1.5
py==1.8.0
pyasn1==0.4.8
pyasn1-modules==0.2.7
pyBarcode==0.8b1
pycodestyle==2.5.0
pycountry==19.8.18
pycparser==2.19
pycryptodome==3.9.4
pycycle==0.0.8
pyflakes==2.1.1
Pygments==2.5.2
PyJWT==1.7.1
Pympler==0.8
PyNaCl==1.3.0
pyparsing==2.4.5
PyPDF2==1.26.0
Pyphen==0.9.5
pysftp==0.2.9
pytest==5.3.2
pytest-cache==1.0
pytest-cov==2.8.1
pytest-django==3.7.0
pytest-factoryboy==2.0.3
pytest-forked==1.1.3
pytest-html==1.19.0
pytest-metadata==1.7.0
pytest-mock==1.13.0
pytest-pep8==1.0.6
pytest-sftpserver==1.3.0
pytest-xdist==1.30.0
python-barcode==0.10.0
python-crontab==2.4.0
python-dateutil==2.8.1
python-gnupg==0.4.5
python-Levenshtein==0.12.0
python-memcached==1.59
python-pptx==0.6.18
python3-openid==3.1.0
python3-saml==1.9.0
pytz==2019.3
PyVirtualDisplay==0.2.5
PyYAML==5.2
pyzmq==18.1.1
qrcode==6.1
raven==6.10.0
rcssmin==1.0.6
realbrowserlocusts==0.4
requests==2.22.0
requests-mock==1.7.0
requests-oauthlib==1.3.0
requests-toolbelt==0.9.1
rest-condition==1.0.3
restnavigator==1.0.1
retrying==1.3.3
rjsmin==1.1.0
rsa==4.0
safety==1.8.5
selenium==3.141.0
sepa-generator==0.1.4
shellingham==1.3.1
simplegeneric==0.8.1
simplejson==3.17.0
six==1.12.0
smmap==0.9.0
smmap2==2.0.5
snippetinjector==1.0.0
social-auth-app-django==3.1.0
social-auth-core==3.2.0
sorl-thumbnail==12.5.0
sortedcontainers==2.1.0
soupsieve==1.9.5
SpeechRecognition==3.8.1
sqlparse==0.3.0
stevedore==1.31.0
suds-jurko==0.6
tablib==0.14.0
tabulate==0.8.6
terminaltables==3.1.0
testfixtures==6.10.3
text-unidecode==1.3
textract==1.6.3
texttable==1.6.2
tinycss==0.4
tinycss2==1.0.2
toml==0.9.2
tornado==5.1.1
tqdm==4.40.2
traitlets==4.3.2
typing==3.6.4
tzlocal==1.5.1
unicodecsv==0.14.1
Unidecode==0.4.21
uritemplate==3.0.1
urllib3==1.25.7
urwid==2.0.1
uWSGI==2.0.18
uwsgitop==0.10
vine==1.3.0
vulture==1.2
waitress==1.4.0
wcwidth==0.1.7
WeasyPrint==50
webencodings==0.5.1
WebOb==1.8.5
WebTest==2.0.33
Werkzeug==0.16.0
xlrd==1.2.0
XlsxWriter==1.2.7
xlwt==1.3.0
xmlsec==1.3.3
zeep==3.4.0
zipp==0.6.0
### Other Dependencies
N/A
## Minimally Reproducible Test Case
Not sure how to reproduce at this moment apart from just running celery beat.
# Expected Behavior
Celery beat runs without errors.
# Actual Behavior
Celery beat log shows following errors:
Traceback (most recent call last):
File "/home/ah/virtualenv/lib/python3.5/site-packages/celery/beat.py", line 273, in apply_entry
result = self.apply_async(entry, producer=producer, advance=False)
File "/home/ah/virtualenv/lib/python3.5/site-packages/celery/beat.py", line 401, in apply_async
entry, exc=exc)), sys.exc_info()[2])
File "/home/ah/virtualenv/lib/python3.5/site-packages/vine/five.py", line 194, in reraise
raise value.with_traceback(tb)
File "/home/ah/virtualenv/lib/python3.5/site-packages/celery/beat.py", line 388, in apply_async
entry_kwargs = {k: v() if isinstance(v, BeatLazyFunc) else v for k, v in entry.kwargs.items()}
celery.beat.SchedulingError: Couldn't apply scheduled task sync_products_for_source_40023: 'NoneType' object has no attribute 'items'
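The failing comprehension in beat.py assumes `entry.kwargs` is a dict, but the traceback shows it can evidently be `None` (e.g. a periodic task stored without kwargs). Below is a minimal sketch of a tolerant version, using a simplified stand-in for `BeatLazyFunc` rather than Celery's actual class:

```python
class BeatLazyFunc:
    """Simplified stand-in for celery.beat's lazy-argument wrapper."""
    def __init__(self, fn):
        self._fn = fn

    def __call__(self):
        return self._fn()


def resolve_kwargs(kwargs):
    """Like the comprehension in beat.apply_async, but tolerant of None."""
    return {k: (v() if isinstance(v, BeatLazyFunc) else v)
            for k, v in (kwargs or {}).items()}


print(resolve_kwargs(None))                                    # {}
print(resolve_kwargs({"n": BeatLazyFunc(lambda: 3), "m": 1}))  # {'n': 3, 'm': 1}
```

With the `(kwargs or {})` guard, a `None` kwargs simply resolves to an empty dict instead of raising the AttributeError seen above.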
|
# Checklist
* I have verified that the issue exists against the `master` branch of Celery.
* This has already been asked to the discussion group first.
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
* I have checked the pull requests list
for existing proposed fixes.
* I have checked the commit log
to find out if the bug was already fixed in the master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
* I have included the contents of `pip freeze` in the issue.
* I have included all the versions of all the external dependencies required
to reproduce this bug.
## Optional Debugging Information
* I have tried reproducing the issue on more than one Python version
and/or implementation.
* I have tried reproducing the issue on more than one message broker and/or
result backend.
* I have tried reproducing the issue on more than one version of the message
broker and/or result backend.
* I have tried reproducing the issue on more than one operating system.
* I have tried reproducing the issue on more than one workers pool.
* I have tried reproducing the issue with autoscaling, retries,
ETA/Countdown & rate limits disabled.
* I have tried reproducing the issue after downgrading
and/or upgrading Celery and its dependencies.
## Related Issues and Possible Duplicates
#### Related Issues
* None
#### Possible Duplicates
* None
## Environment & Settings
**Celery version** : 4.4.0
**`celery report` Output:**
# Steps to Reproduce
## Required Dependencies
* **Minimal Python Version** : 3.7.4 or 2.7
* **Minimal Celery Version** : 4.4.0
* **Minimal Kombu Version** : N/A or Unknown
* **Minimal Broker Version** : N/A or Unknown
* **Minimal Result Backend Version** : N/A or Unknown
* **Minimal OS and/or Kernel Version** : N/A or Unknown
* **Minimal Broker Client Version** : N/A or Unknown
* **Minimal Result Backend Client Version** : N/A or Unknown
### Python Packages
**`pip freeze` Output:**
alembic==1.3.2
aliyun-python-sdk-core==2.13.13
aliyun-python-sdk-core-v3==2.13.11
aliyun-python-sdk-kms==2.9.0
amqp==2.5.2
billiard==3.6.1.0
celery==4.4.0
certifi==2019.11.28
chardet==3.0.4
Click==7.0
configparser==4.0.2
crcmod==1.7
Flask==1.1.1
Flask-Caching==1.8.0
Flask-Migrate==2.5.2
Flask-Script==2.0.6
Flask-SQLAlchemy==2.4.1
gevent==1.4.0
gitlab==1.0.2
greenlet==0.4.15
gunicorn==20.0.4
idna==2.8
importlib-metadata==1.4.0
itsdangerous==1.1.0
Jinja2==2.10.3
jmespath==0.9.4
kombu==4.6.7
lxml==4.4.2
Mako==1.1.0
MarkupSafe==1.1.1
more-itertools==8.1.0
oss2==2.9.1
pycryptodome==3.9.4
pymongo==3.10.0
PyMySQL==0.9.3
python-dateutil==2.8.1
python-editor==1.0.4
pytz==2019.3
redis==3.3.11
requests==2.22.0
six==1.13.0
SQLAlchemy==1.3.12
urllib3==1.25.7
vine==1.3.0
Werkzeug==0.16.0
zipp==2.0.0
### Other Dependencies
N/A
## Minimally Reproducible Test Case
celery -A tasks worker --loglevel=info --pool=solo
from tasks import celery_app
task_id = "some_task_id"
logger.info("stopping task: %s" % task_id)
celery_app.control.revoke(task_id, terminate=True, signal='SIGKILL')
# Expected Behavior
task should be terminated
# Actual Behavior
The task can't be terminated.
When using "celery -A tasks worker --loglevel=info" (without `--pool=solo`) to
start celery, there is no problem.
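A plausible explanation (my assumption, sketched as a toy model below, not Celery source): `revoke(terminate=True)` works by signalling the OS process that runs the task. With prefork there is a child process to signal; with the solo pool the task runs inside the worker's own main process, so there is no separate process whose death would stop just the task:

```python
def terminate_target(pool, worker_pid, child_pid):
    """Toy model: which PID would a terminate signal have to hit?"""
    if pool == "prefork":
        # Safe: only the child running the task receives SIGKILL.
        return child_pid
    if pool == "solo":
        # The task runs in the worker process itself; killing that PID
        # would take the whole worker down, so there is no clean target.
        return worker_pid
    raise ValueError("unknown pool: %s" % pool)


print(terminate_target("prefork", 100, 101))  # 101
print(terminate_target("solo", 100, 101))     # 100
```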
| 0 |
* VSCode Version:1.1.0-alpha
* OS Version:Windows 10
Steps to Reproduce:
1. Run code-alpha --help
The uninstall option is shown as "--list-extensions"; it should be "--uninstall-extension":
--list-extensions List the installed extensions.
--install-extension <extension>
Installs an extension.
--list-extensions <extension>
Uninstalls an extension.
See `b455db9`#diff-e255a28340d4f20061d11494e99bd60b
|
Using mainly the default generated files from `generator-code`, a significant
number of devDependencies (mostly recursive) are pulled in - mainly because of
Mocha. For development this is fine (I love Mocha). But why blindly package
them all with `vsce package`? Perhaps create an output directory of everything
`vsce ls` would find but _node_modules_ , then run `npm install --production`
and package that directory. Would certainly reduce the size of VSCode
extensions. Mine - with only the addition of 1 small module (with no
dependencies itself) is over 5MB.
| 0 |
* [x] I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
I should be able to run a nextjs app on multiple server replicas without
errors occurring.
## Current Behavior
Trying to run next 4.1.4 in a kubernetes cluster and about half of the
requests to the server result in "An unexpected error has occurred." If the
request is made client-side, the only error in the console is `500 - Internal
Server Error. undefined`. There is no server log with more error details.
This does not appear to be happening with a 3.x app run in a similar context.
## Steps to Reproduce (for bugs)
1. Create next.js 4 app
2. Run in kubernetes cluster
3. Load page repeatedly
## Context
I've already implemented the build-stats hack here #2978 (comment) to have
consistent build ids across the replicas. That resolved that set of errors but
now stuck with undebuggable errors.
ETA: This appears to still be build id related. The error that's thrown has
`buildIdMismatched: true` even though both replicas have identical
`.next/build-stats.json` files.
## Your Environment
Tech | Version
---|---
next | 4.1.4
node | 8
OS | CoreOS
browser | Chrome/Firefox
etc |
|
* [x] I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
I am using Docker on Digital Ocean for a Next.js based project, the main page
is pretty simple and should just show nothing more than a Facebook login
button.
## Current Behavior
If i try to access it via Browser, the first time it works well and the Login
button shows up - if i refresh the page, i get the following error on my
screen:
`Error: Unknown system error -116: Unknown system error -116, stat
'/usr/src/app/pages/overview.js'`
## Steps to Reproduce (for bugs)
The index page is very simple:
import React from 'react';
import PropTypes from 'prop-types';
import {Button} from 'antd';
import {bindActionCreators} from 'redux';
import {initStore} from '../store';
import withRedux from 'next-redux-wrapper';
import * as userActionCreators from '../actions/user';
class Index extends React.Component {
constructor(props) {
super(props);
this.state = {
};
}
componentDidMount() {
const {checkLogin} = this.props.userActions;
//checkLogin(true); //disabled for now, but it makes no difference
}
onLoginClick = () => {
const {doLogin} = this.props.userActions;
doLogin();
};
render() {
const {isRequestingLoginStatus} = this.props.user;
return (
<div>
<Button className="loginBtn" onClick={this.onLoginClick} type="primary"
loading={isRequestingLoginStatus}>Login</Button>
<style jsx>{`
.loginBtn {
}
`}</style>
</div>
);
}
}
Index.propTypes = {
userActions: PropTypes.object,
user: PropTypes.object
};
const mapStateToProps = (state) => {
return {
user: state.user
};
};
function mapDispatchToProps(dispatch) {
return {
userActions: bindActionCreators(userActionCreators, dispatch)
};
}
export default withRedux(initStore, mapStateToProps, mapDispatchToProps)(Index);
I installed Sentry and it showed the following additional error information
(the "touch" package is used by Next.js):
Error: Unknown system error -116: Unknown system error -116, open '/usr/src/app/pages/index.js'
File "fs.js", line 667, in Object.fs.openSync
File "/usr/src/app/node_modules/touch/index.js", line 176, in TouchSync.open
this.onopen(null, fs.openSync(this.path, this.oflags))
File "/usr/src/app/node_modules/touch/index.js", line 82, in new Touch
this.open()
File "/usr/src/app/node_modules/touch/index.js", line 173, in new TouchSync
class TouchSync extends Touch {
File "/usr/src/app/node_modules/touch/index.js", line 20, in Function.module.exports.sync.module.exports.touchSync
(new TouchSync(validOpts(options, f, null)), undefined)
...
(4 additional frame(s) were not displayed)
Screenshot of the Error, including Dev Tools:

My Dockerfile for Next.js:
FROM node:latest
MAINTAINER Andreas Teufel <a.teufel@limesoda.com>
WORKDIR /usr/src/app
COPY package*.json ./
COPY yarn.lock ./
RUN yarn install
COPY . .
RUN yarn global add pm2
EXPOSE 3000
CMD ["pm2-runtime", "./docker/node/process.yml"]
## Your Environment
Tech | Version
---|---
next | 5.0.0
node | 8.x
OS | Ubuntu 17.10 on Digital Ocean
browser | latest Firefox/Chrome
docker | 17.12.0-ce-win47
| 0 |
I have a live website with the problem here: www.studiotaroccomanta.com
If the main menu is collapsed and closed, when I click the button to open it
everything works fine for the first level entries.
But when I open the submenu "ATTIVITA'" on mobile, leave it open, and then
try to click one of the other FIRST level menu entries (like "CONTATTI"), the
submenu simply closes and I stay on the current page, i.e. without being
redirected to the correct page.
Any ideas?
|
On mobile devices, particularly Android and Safari, when a textbox is focused
the screen moves upwards and the textbox goes outside the screen.
| 0 |
* Electron Version: 2.0.0
* Operating System (Platform and Version): Windows 10
* Last known working Electron version: -
**Expected Behavior**
A region should not be draggable if a div with `position: fixed;` and
`-webkit-app-region: no-drag;` is above a div with `-webkit-app-region: drag;`
**Actual behavior**
The region becomes draggable if its position is fixed and the div below it has
`-webkit-app-region: drag;`
**To Reproduce**
Make a frameless window, make a non-fixed div which has dragging enabled, put
a fixed div above it and disable dragging on it. It is still draggable.
|
* Electron version: 1.8.1
* Operating system: Windows 7 x64
### Expected behavior
mouse events should work normally on fixed/absolute-positioned elements.
### Actual behavior
-webkit-app-region: drag overrides mouse events on elements positioned above and z-index doesn't have any effect
### How to reproduce
HTML:
<nav></nav>
<header></header>
CSS:
nav{
position: absolute;
z-index: 100;
right: 0;
top: 0;
width: 200px;
height: 40px;
background: blue;
-webkit-app-region: no-drag;
}
nav:hover{
background: green;
}
header{
position: relative;
z-index: 5;
width: 100%;
height: 40px;
background: red;
-webkit-app-region: drag;
}
Moving the mouse over the blue nav doesn't turn it green because the header
drag is stealing all mouse events...
| 1 |
There are 7 tests in tests/linalg_test.py that fail on Mac with SciPy version
1.4.1 installed.
The tests are:
'tests/linalg_test.py::NumpyLinalgTest::testEigvals_shape=complex64[50,50]'
'tests/linalg_test.py::NumpyLinalgTest::testPinv_shape=complex64[7,10000]'
'tests/linalg_test.py::ScipyLinalgTest::testLuFactor_n=complex64[200,200]'
'tests/linalg_test.py::ScipyLinalgTest::testExpm_n=complex64[50,50]'
'tests/linalg_test.py::NumpyLinalgTest::testInv_shape=float32[200,200]'
'tests/linalg_test.py::NumpyLinalgTest::testPinv_shape=float32[7,10000]'
'tests/linalg_test.py::ScipyLinalgTest::testExpm_n=float32[50,50]'
The failure is
worker 'gw2' crashed while running 'tests/linalg_test.py::NumpyLinalgTest::testEigvals_shape=complex64[50,50]'
A longer stack trace is:
Fatal Python error: Bus error
Thread 0x000070000693a000 (most recent call first):
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 400 in read
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 432 in from_io
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 967 in _thread_receiver
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 220 in run
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 285 in _perform_spawn
Thread 0x0000000106b4d5c0 (most recent call first):
File "/Users/necula/Source/jax/jax/interpreters/xla.py", line 731 in _value
File "/Users/necula/Source/jax/jax/interpreters/xla.py", line 826 in __array__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/numpy/core/_asarray.py", line 85 in asarray
File "/Users/necula/Source/jax/jax/test_util.py", line 682 in assertAllClose
File "/Users/necula/Source/jax/jax/test_util.py", line 727 in _CompileAndCheck
File "/Users/necula/Source/jax/tests/linalg_test.py", line 994 in testExpm
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/absl/testing/parameterized.py", line 263 in bound_param_test
File "/Users/necula/homebrew/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py", line 628 in run
File "/Users/necula/homebrew/Cellar/python/3.7.4/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py", line 676 in __call__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/unittest.py", line 207 in runtest
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 117 in pytest_runtest_call
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 81 in <lambda>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 87 in _hookexec
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/hooks.py", line 289 in __call__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 192 in <lambda>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 220 in from_call
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 192 in call_runtest_hook
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 167 in call_and_report
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 87 in runtestprotocol
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/runner.py", line 72 in pytest_runtest_protocol
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 81 in <lambda>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 87 in _hookexec
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/hooks.py", line 289 in __call__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/xdist/remote.py", line 85 in run_one_test
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/xdist/remote.py", line 71 in pytest_runtestloop
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 81 in <lambda>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 87 in _hookexec
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/hooks.py", line 289 in __call__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/main.py", line 235 in _main
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/main.py", line 191 in wrap_session
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/_pytest/main.py", line 228 in pytest_cmdline_main
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/callers.py", line 187 in _multicall
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 81 in <lambda>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/manager.py", line 87 in _hookexec
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/pluggy/hooks.py", line 289 in __call__
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/xdist/remote.py", line 250 in <module>
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 1084 in executetask
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 220 in run
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 285 in _perform_spawn
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 267 in integrate_as_primary_thread
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 1060 in serve
File "/Users/necula/.virtualenvs/jax/lib/python3.7/site-packages/execnet/gateway_base.py", line 1554 in serve
File "<string>", line 8 in <module>
File "<string>", line 1 in <module>
[gw0] node down: Not properly terminated
f
replacing crashed worker gw0
I will disable the tests for now.
|
I'm not exactly sure _why_ this happens, being unfamiliar with the internal
architecture, but on MacOS with Python 3.6.8, the following code segfaults if
scipy 1.2.1 is installed (the version that comes by default when you `pip
install jax jaxlib`):
import jax.random as random
import jax.scipy.linalg as linalg
key = random.PRNGKey(42)
# For some reason, matrices smaller than (50, 50) or so do not trigger segfaults
X = random.normal(key, (500, 500))
A = X @ X.T # Drawn from standard Wishart distribution
linalg.cholesky(A)
print("Success!")
Output:
$ python -W ignore test.py
zsh: bus error python -W ignore test.py
If I roll back to Scipy 1.1.0, everything works:
$ python -W ignore test.py
Success!
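Until the root cause is pinned down, a version guard can at least fail loudly instead of with a bus error; a sketch that parses `scipy.__version__` by hand to avoid extra dependencies (the 1.2 threshold is taken from the rollback above, and the function names are made up):

```python
def version_tuple(v):
    """Parse a dotted version string like '1.2.1' into a comparable tuple."""
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())


def check_scipy(version):
    # 1.1.x works per the report above; 1.2+ segfaults on this setup.
    if version_tuple(version) >= (1, 2):
        raise RuntimeError(
            "scipy %s may crash jax.scipy.linalg.cholesky here; "
            "consider pinning scipy==1.1.0" % version)


check_scipy("1.1.0")           # passes silently
print(version_tuple("1.2.1"))  # (1, 2, 1)
```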
This is a great project by the way--thanks for working on it!
Edit: after further digging, I found the following in the SciPy 1.2 release
notes:
> scipy.linalg.lapack now exposes the LAPACK routines using the Rectangular
> Full Packed storage (RFP) for upper triangular, lower triangular, symmetric,
> or Hermitian matrices; the upper trapezoidal fat matrix RZ decomposition
> routines are now available as well.
Perhaps this has something to do with it?
Even more edits: yet more digging has revealed scipy/scipy#9751, which hints
that this might be caused by a specific (old) version of XCode. I will report
back once XCode is upgraded.
| 1 |
I'm developing these features for a Flutter app: loading an image from the
gallery or from the camera, and allowing the user to rotate it.
Those operations are expensive, so I'm computing them in another thread.
The code works fine in debug mode, but in release or profile mode the app
crashes and stops working, and only when I try to rotate the image.
I have other operations working in another thread with no problem there; the
problem is only with this operation.
Run `flutter doctor -v`
[✓] Flutter (Channel beta, v0.9.4, on Linux, locale es_ES.UTF-8)
• Flutter version 0.9.4 at /home/nadia/flutter
• Framework revision f37c235c32 (hace 3 semanas), 2018-09-25 17:45:40 -0400
• Engine revision 74625aed32
• Dart version 2.1.0-dev.5.0.flutter-a2eb050044
[✓] Android toolchain - develop for Android devices (Android SDK 27.0.3)
• Android SDK at /home/nadia/Android/Sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-27, build-tools 27.0.3
• Java binary at: /home/nadia/android-studio/jre/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1136-b06)
• All Android licenses accepted.
[✓] Android Studio (version 3.2)
• Android Studio at /home/nadia/android-studio
• Flutter plugin version 29.1.1
• Dart plugin version 181.5656
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1136-b06)
[✓] Connected devices (1 available)
• NX591J • d2ab515a • android-arm64 • Android 7.1.2 (API 25)
This is my `pubspec.yaml`
dependencies:
flutter:
sdk: flutter
cupertino_icons: ^0.1.2
image_picker: ^0.4.10
path_provider: ^0.4.1
path: any
image_picker_saver: ^0.1.0
uuid: ^1.0.3
simple_permissions: ^0.1.8
image: ^2.0.4
And here is my code. I have the code divided into three files. First my
`main.dart`
import 'dart:io';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:image_picker/image_picker.dart' as imPick;
import 'package:flutter/services.dart';
import 'package:image/image.dart' as im;
import 'package:path_provider/path_provider.dart';
import 'package:simple_permissions/simple_permissions.dart';
import 'imageUtils.dart';
import 'toCompute.dart';
void main() => runApp(new MyApp());
class MyApp extends StatelessWidget {
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return new MaterialApp(
/*theme: new ThemeData(
primarySwatch: Colors.white,
),
*/
showPerformanceOverlay: true,
debugShowCheckedModeBanner: false,
home: CameraUpload(),
);
}
}
class _AkkaMark extends State<CameraUpload> {
File _imageFile;
String _path;
List<Widget> _buttons;
im.Image _finalImage;
Widget _widgetForBody;
Widget _imageForBody;
Widget _loading;
double _widthS;
double _heightS;
String _temporalDir = "/Pictures/AKKAmarkTemporal";
final _key = new GlobalKey<ScaffoldState>();
List<String> _savedImages = new List();
void _setWidgets(){
_imageForBody =
new Stack(
fit: StackFit.expand,
children: <Widget>[
_setImage(),
new Container(
alignment: Alignment.bottomCenter,
child: new Stack(
alignment: Alignment.bottomCenter,
overflow: Overflow.visible,
children: _buttons,
),
)
] ,
);
_loading =
new Container(
child: new Stack(
fit: StackFit.expand,
children: <Widget>[
_imageFile == null?
new Center(child: new Text('Pick image or upload from files', textAlign: TextAlign.center, )):
new Image.file(_imageFile, fit: BoxFit.contain,),
new Container(
alignment: Alignment.bottomCenter,
child: new Stack(
alignment: Alignment.bottomCenter,
overflow: Overflow.visible,
children: _buttons,
),
),
new Container(
alignment: AlignmentDirectional.center,
decoration: new BoxDecoration(
color: Colors.white70,
),
child: new Container(
decoration: new BoxDecoration(
color: Colors.blue[200],
borderRadius: new BorderRadius.circular(10.0)
),
width: 300.0,
height: 200.0,
alignment: AlignmentDirectional.center,
child: new Column(
crossAxisAlignment: CrossAxisAlignment.center,
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
new Center(
child: new SizedBox(
height: 50.0,
width: 50.0,
child: new CircularProgressIndicator(
value: null,
strokeWidth: 7.0,
),
),
),
new Container(
margin: const EdgeInsets.only(top: 25.0),
child: new Center(
child: new Text(
"Loading.. wait...",
style: new TextStyle(
color: Colors.white
),
),
),
),
],
),
),
),
],
),
);
}
@override
void initState() {
super.initState();
initData();
_widgetForBody = _imageForBody;
}
void initData() async {
await SimplePermissions.requestPermission(Permission.WriteExternalStorage);
getExternalStorageDirectory().then((Directory dir){
compute(deleteTemporal, dir.path + _temporalDir);
});
_setWidgets();
getImageCamera();
}
void _setStateImage(){
setState(() {
_setWidgets();
_widgetForBody = _imageForBody;
});
}
void _setStateLoading(){
setState(() {
_widgetForBody = _loading;
});
}
/* Update the main screen:
* - If there is no image selected, show text
* - If there is an image: preview this and show new buttons*/
Widget _setImage() {
if (_imageFile == null) {
_setInitialButtons();
return new Center(child: new Text('Pick image or upload from files', textAlign: TextAlign.center, ));
}
else {
_showExtraButtons();
return new Image.file(_imageFile, fit: BoxFit.contain,);
}
}
//Open the camera and save the image in a variable
getImageCamera() async {
File image = await imPick.ImagePicker.pickImage(source: imPick.ImageSource.camera, maxWidth: 2048.0);
_loadNewImage(image);
}
//Open the gallery and save the chosen image in a variable
getImageFile() async {
File image = await imPick.ImagePicker.pickImage(source: imPick.ImageSource.gallery, maxWidth: 2048.0);
_loadNewImage(image);
}
//NEW
_loadNewImage(File image){
_setStateLoading();
compute(deleteImages, _savedImages);
_savedImages.clear();
_imageFile = image == null && _imageFile != null? _imageFile: image;
if(_imageFile != null ){
_savedImages.add(_imageFile.path);
compute(decodeImage, _imageFile).then((im.Image imageDecode){
_finalImage = imageDecode;
_setStateImage();
});
}
else{
_setStateImage();
}
}
/* Init the variable with the default buttons */
_setInitialButtons() {
//Initially only two buttons, centered in the screen
double iniPosW = _widthS/2;
double iniPosH = _heightS/1.2;
_buttons = [
createButton(getImageCamera, 'Pick image', new Icon(Icons.add_a_photo), new EdgeInsets.fromLTRB(iniPosW, iniPosH, 0.0, 0.0) ),
createButton(getImageFile, 'Upload from files', new Icon(Icons.file_upload), new EdgeInsets.fromLTRB(iniPosW - (_widthS * 0.18), iniPosH, 0.0, 0.0) ),
];
}
//Append the button for select the position
_showExtraButtons() {
double iniPosW = _widthS * 0.05;
double iniPosH = _heightS/1.12;
_buttons = [
//Pick image from camera
createButton(getImageCamera, 'Pick image', new Icon(Icons.add_a_photo), new EdgeInsets.fromLTRB( iniPosW, iniPosH, 0.0, 0.0) ),
//Upload image from files
createButton(getImageFile, 'Upload from files', new Icon(Icons.file_upload),
new EdgeInsets.fromLTRB( iniPosW + ( _widthS * 0.11), iniPosH - ( _heightS *0.16), 0.0, 0.0 ) ),
//To rotate image
createButton( () { _computeRotate(); },
'Rotate right', new Icon(Icons.rotate_right),
new EdgeInsets.fromLTRB( iniPosW + ( _widthS * 0.62), iniPosH - ( _heightS *0.16), 0.0, 0.0 ) ),
//To save image
createButton( () {
new ImageUtils().saveImage(_imageFile).then((String path){
_path = path;
compute(deleteImages, _savedImages);
_savedImages.clear();
_savedImages.add(_path);
});
},
'Save image',
new Icon(Icons.save), new EdgeInsets.fromLTRB(iniPosW + ( _widthS * 0.75), iniPosH, 0.0, 0.0 )),
];
//Back
if( _savedImages.length > 1)
_buttons.add(createButton(_back, 'Back', new Icon(Icons.reply), new EdgeInsets.fromLTRB( iniPosW + ( _widthS * 0.78), 0.0, 0.0, _heightS * 0.84) ) );
}
void _back(){
_setStateLoading();
int length = _savedImages.length;
File image = new File( _savedImages[ length - 2] );
print(image);
List<String> last = new List();
last.add(_savedImages[length - 1]);
compute(deleteImages, last);
_savedImages.removeLast();
_imageFile = image;
compute(decodeImage, _imageFile).then((im.Image imageDecode){
_finalImage = imageDecode;
_setStateImage();
});
}
void _computeRotate(){
_setStateLoading();
new ImageUtils().saveImageTemporal(_imageFile, _temporalDir).then((File file){
_path = file.path;
_savedImages.add(_path);
List<Object> arg = new List();
arg.add(_finalImage);
arg.add(_path);
print("Rotating...");
compute(rotateImage, arg).then((List<Object> objects){
_finalImage = objects[0];
_imageFile = objects[1];
_setStateImage();
});
});
}
Positioned createButton(Function onPressed, tooltip, Widget icon, EdgeInsets padding){
return new Positioned(
top: padding.top,
bottom: padding.bottom,
left: padding.left,
//right: padding.right,
child: new CircleAvatar(
radius: 26.0,
child:
FloatingActionButton(
elevation: 6.0,
onPressed: onPressed,
tooltip: tooltip,
child: icon,
),
),
);
}
@override
Widget build(BuildContext context) {
_heightS = MediaQuery.of(context).size.height;
_widthS = MediaQuery.of(context).size.width;
return Scaffold(
key: _key,
body: _widgetForBody,
backgroundColor: Color.fromRGBO(239, 239, 240, 0.9),
);
}
}
class CameraUpload extends StatefulWidget {
@override
_AkkaMark createState() => new _AkkaMark();
}
And the other two files. This one has all the code that I will run in an isolate:
`toCompute.dart`
import 'dart:io';
import 'package:image/image.dart' as im;
List<Object> rotateImage(List<Object> arg) {
im.Image finalImage = arg[0];
String path = arg[1];
finalImage = im.copyRotate(finalImage, 90);
List<int> jpg = im.encodeJpg(finalImage);
File imageFileReturn = new File(path)..writeAsBytesSync(jpg);
List<Object> returns = new List();
returns.add(finalImage);
returns.add(imageFileReturn);
return returns;
}
im.Image decodeImage(File file){
return im.decodeImage(file.readAsBytesSync());
}
bool deleteImages(List<String> imagesPath){
print("Deleting...");
for( int i = 1; i < imagesPath.length; i++ ){
new File(imagesPath[i]).delete();
}
return true;
}
bool deleteTemporal(String temporalDir){
new Directory( temporalDir ).exists().then((bool exists){
if ( exists )
new Directory( temporalDir ).deleteSync(recursive: true);
});
return true;
}
And other one that I am using to save the images, `imageUtils.dart`
import 'dart:async';
import 'dart:io';
import 'package:path_provider/path_provider.dart';
import 'package:image_picker_saver/image_picker_saver.dart';
import 'package:uuid/uuid.dart';
class ImageUtils {
Future<String> saveImage(File image) async {
var a = await ImagePickerSaver.saveFile(fileData: image.readAsBytesSync());
return a;
}
Future<File> saveImageTemporal(File image, String temporalDir) async {
temporalDir = await _createTemporalPath(temporalDir);
var file = File('$temporalDir/'+generateUuidJpg().toString())..writeAsBytesSync(image.readAsBytesSync());
File('$temporalDir/'+'.nomedia')..writeAsStringSync('');
print('saving... '+file.path);
return file;
}
Future<String> _createTemporalPath(String temporalDir) async {
Directory dirFinal;
var sdCard = await getExternalStorageDirectory();
dirFinal = await new Directory(sdCard.path + temporalDir).create(recursive: true);
return dirFinal.path;
}
String generateUuidJpg(){
Uuid _uuid = new Uuid();
return _uuid.v1().toString()+".jpg";
}
}
Flutter throws an error, which is the same one I can see in my logcat.
Here is the Flutter error:
`Fatal signal 11 (SIGSEGV), code 1, fault addr 0x7f7bdffb90 in tid 21655
(m.akka.akkamark)`
And here is my logcat
Fatal signal 11 (SIGSEGV), code 1, fault addr 0x7f7e9feca0 in tid 23568 (1.ui)
10-15 13:54:54.493 361 361 W : debuggerd: handling request: pid=22783 uid=10204 gid=10204 tid=23568
10-15 13:54:54.494 23570 23570 I debuggerd64: type=1400 audit(0.0:10486): avc: denied { read } for name="isolate_snapshot_instr" dev="mmcblk0p48" ino=1532120 scontext=u:r:debuggerd:s0 tcontext=u:object_r:app_data_file:s0:c512,c768 tclass=file permissive=1
10-15 13:54:54.494 23570 23570 I debuggerd64: type=1400 audit(0.0:10487): avc: denied { open } for path="/data/data/com.akka.akkamark/app_flutter/isolate_snapshot_instr" dev="mmcblk0p48" ino=1532120 scontext=u:r:debuggerd:s0 tcontext=u:object_r:app_data_file:s0:c512,c768 tclass=file permissive=1
10-15 13:54:54.494 23570 23570 I debuggerd64: type=1400 audit(0.0:10488): avc: denied { getattr } for path="/data/data/com.akka.akkamark/app_flutter/isolate_snapshot_instr" dev="mmcblk0p48" ino=1532120 scontext=u:r:debuggerd:s0 tcontext=u:object_r:app_data_file:s0:c512,c768 tclass=file permissive=1
10-15 13:54:54.566 23570 23570 F DEBUG : LineageOS Version: '14.1-20180810-UNOFFICIAL-nx591j'
10-15 13:54:54.566 23570 23570 F DEBUG : Build fingerprint: 'nubia/NX591J/NX591J:7.1.1/NMF26F/eng.nubia.20170905.150740:user/release-keys'
10-15 13:54:54.566 23570 23570 F DEBUG : Revision: '0'
10-15 13:54:54.566 23570 23570 F DEBUG : ABI: 'arm64'
10-15 13:54:54.566 23570 23570 F DEBUG : pid: 22783, tid: 23568, name: 1.ui >>> com.akka.akkamark <<<
10-15 13:54:54.566 23570 23570 F DEBUG : signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x7f7e9feca0
10-15 13:54:54.566 23570 23570 F DEBUG : x0 0000007fa1042a71 x1 0000000000aaa000 x2 0000000000237800 x3 000000000046f000
10-15 13:54:54.566 23570 23570 F DEBUG : x4 0000007f7e120c91 x5 0000000000237800 x6 000000000000046f x7 00000000000008de
10-15 13:54:54.566 23570 23570 F DEBUG : x8 0000007f7e38d9d1 x9 0000000000000000 x10 0000000000000001 x11 0000000000000180
10-15 13:54:54.566 23570 23570 F DEBUG : x12 0000000000000004 x13 0000000500000000 x14 0000000000000020 x15 0000007f7df7e268
10-15 13:54:54.566 23570 23570 F DEBUG : x16 0000007f7e9fec91 x17 0000007f7e4192c1 x18 0000000000000116 x19 0000007f84288c30
10-15 13:54:54.566 23570 23570 F DEBUG : x20 0000000000000000 x21 0000007f7df7f0c0 x22 0000007f7df7f0d0 x23 0000007f7df7d270
10-15 13:54:54.566 23570 23570 F DEBUG : x24 0000007f7e5b0321 x25 0000007f7df7d1c0 x26 0000007f82ec3200 x27 0000007f7e4192c0
10-15 13:54:54.567 23570 23570 F DEBUG : x28 0000000000000004 x29 0000007f7df7e278 x30 0000007f844531b0
10-15 13:54:54.567 23570 23570 F DEBUG : sp 0000007f7df7d250 pc 0000007f8445326c pstate 0000000080000000
10-15 13:54:54.567 23570 23570 F DEBUG :
10-15 13:54:54.567 23570 23570 F DEBUG : backtrace:
10-15 13:54:54.567 23570 23570 F DEBUG : #00 pc 00000000002ae26c /data/data/com.akka.akkamark/app_flutter/isolate_snapshot_instr
10-15 13:54:54.567 23570 23570 F DEBUG : #01 pc 00000000002ae1ac /data/data/com.akka.akkamark/app_flutter/isolate_snapshot_instr
I've tested with several devices, from Android 6.0 to Android 8.1, and I have
the same problem. This logcat is from a device with Android 7.1.2.
I am not sure if it is a Flutter bug or maybe a problem with the `image`
plugin, and I am also not sure if I am doing something wrong; I followed all
the steps on the Flutter web page to generate the release APK. Any clue?
Thanks in advance!
|
I'm developing these features for a flutter app: loading an image from
gallery, resize and save it.
The resizing part is a CPU-intensive operation, so I followed the approach
suggested here using an isolate to gain a better user experience.
When I run the code in debug mode I have no issue, but when I try the same
code in release mode my image is saved in a wrong way:
Here is the code to replicate the problem. This is a new flutter project with
the following files:
In `pubspec.yaml` I add this section:
dependencies:
path_provider: ^0.4.1
image_picker: ^0.4.10
image: ^2.0.4
flutter:
sdk: flutter
All code is in `lib/main.dart`:
import 'dart:async';
import 'dart:io';
import 'dart:isolate';
import 'package:flutter/material.dart';
import 'package:image/image.dart';
import 'package:image_picker/image_picker.dart';
import 'package:path_provider/path_provider.dart';
import 'package:path/path.dart' as p;
void main() => runApp(new MyApp());
class MyApp extends StatelessWidget {
@override
Widget build(BuildContext context) {
return new MaterialApp(
home: new MyHomePage(),
);
}
}
class MyHomePage extends StatefulWidget {
MyHomePage({Key key}) : super(key: key);
@override
_MyHomePageState createState() => new _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
int _counter = 0;
File _file;
Future<File> _getImage() async {
File image = await ImagePicker.pickImage(source: ImageSource.gallery);
if (image != null) {
return image;
}
return null;
}
static decode(DecodeParam param) async {
var p = await param.file.readAsBytes();
var image = decodeImage(p);
var thumbnail = copyResize(image, 120);
param.sendPort.send(thumbnail);
}
void _displayImage() async {
setState(() {
_file = null;
});
File file = await _getImage();
ReceivePort receivePort = new ReceivePort();
await Isolate.spawn(decode, new DecodeParam(file, receivePort.sendPort));
var image = await receivePort.first;
Directory tempDir = await getTemporaryDirectory();
String tempPath = tempDir.path;
File profilePictureFile =
File(p.join(tempPath, 'thumbnail' + _counter.toString() + '.png'))
..writeAsBytesSync(encodePng(image));
setState(() {
_counter++;
_file = profilePictureFile;
});
}
@override
Widget build(BuildContext context) {
return new Scaffold(
body: new Center(
child: new Column(
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
_file != null
? Container(
height: 200.0,
width: 200.0,
decoration: BoxDecoration(
shape: BoxShape.circle,
image: DecorationImage(
fit: BoxFit.fitWidth, image: FileImage(_file))))
: Container(),
],
),
),
floatingActionButton: new FloatingActionButton(
onPressed: _displayImage,
child: new Icon(Icons.add),
),
);
}
}
class DecodeParam {
final File file;
final SendPort sendPort;
DecodeParam(this.file, this.sendPort);
}
`flutter doctor -v`:
[√] Flutter (Channel master, v0.9.7-pre.61, on Microsoft Windows [Versione 10.0.15063], locale it-IT)
• Flutter version 0.9.7-pre.61 at C:\src\flutter
• Framework revision 2d81adf74c (2 days ago), 2018-10-05 22:29:37 -0700
• Engine revision 572fa5646a
• Dart version 2.1.0-dev.6.0.flutter-c6254163dc
[√] Android toolchain - develop for Android devices (Android SDK 28.0.1)
• Android SDK at d:\Profiles\alarosa\AppData\Local\Android\sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-28, build-tools 28.0.1
• Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02)
• All Android licenses accepted.
[√] Android Studio (version 3.1)
• Android Studio at C:\Program Files\Android\Android Studio
• Flutter plugin version 26.0.1
• Dart plugin version 173.4700
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1024-b02)
[!] IntelliJ IDEA Community Edition (version 2018.1)
• IntelliJ at C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2018.1
X Flutter plugin not installed; this adds Flutter specific functionality.
X Dart plugin not installed; this adds Dart specific functionality.
• For information about installing plugins, see
https://flutter.io/intellij-setup/#installing-the-plugins
[√] VS Code (version 1.27.2)
• VS Code at d:\Profiles\alarosa\AppData\Local\Programs\Microsoft VS Code
• Flutter extension version 2.19.0
[√] VS Code, 64-bit edition (version 1.27.2)
• VS Code at C:\Program Files\Microsoft VS Code
• Flutter extension version 2.19.0
[√] Connected device (1 available)
• HUAWEI VNS L31 • 4TE0216A14001341 • android-arm64 • Android 7.0 (API 24)
! Doctor found issues in 1 category.
Thank you.
| 1 |
**repro:**
1. clean checkout at master (`2c9e051`)
2. `make`
**expected:**
successful build
**actual:**
...
LibCURL ────────── 0.361519 seconds
Downloads ──────── 0.605942 seconds
Pkg ────────────── 3.760104 seconds
LazyArtifacts ──── 0.002615 seconds
Stdlibs total ──── 30.568599 seconds
Sysimage built. Summary:
Total ─────── 51.765127 seconds
Base: ─────── 21.193995 seconds 40.9426%
Stdlibs: ──── 30.568599 seconds 59.0525%
JULIA usr/lib/julia/sys-o.a
/bin/sh: line 1: 62349 Segmentation fault: 11 JULIA_BINDIR=/Users/vilterp/code/j-test/julia/usr/bin WINEPATH="/Users/vilterp/code/j-test/julia/usr/bin;$WINEPATH" /Users/vilterp/code/j-test/julia/usr/bin/julia -O3 -C "native" --output-o /Users/vilterp/code/j-test/julia/usr/lib/julia/sys-o.a.tmp --startup-file=no --warn-overwrite=yes --sysimage /Users/vilterp/code/j-test/julia/usr/lib/julia/sys.ji /Users/vilterp/code/j-test/julia/contrib/generate_precompile.jl 1
*** This error is usually fixed by running `make clean`. If the error persists, try `make cleanall`. ***
make[1]: *** [/Users/vilterp/code/j-test/julia/usr/lib/julia/sys-o.a] Error 1
make: *** [julia-sysimg-release] Error 2
**context:**
* macOS: 11.5.2. intel x64
* clang:
clang -v
Apple clang version 13.0.0 (clang-1300.0.29.3)
Target: x86_64-apple-darwin20.6.0
Thread model: posix
InstalledDir: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin
|
I've addressed this feature multiple times, but never really got feedback on
it, so I decided to open an issue for it, as I keep coming back to it as a
very elegant solution to many problems.
I hope you can at least answer how feasible this is, what needs to be done
for it (so that I can maybe implement it myself, if no one has the time), or
explain to me why this is a silly idea.
##### What I want:
# Introducing a new type, in a typealias like fashion.
# first syntax idea
hardtypealias RGB{T <: FloatingPoint} NTuple{3, T}
# second syntax idea
immutable Meter{T <: Real} <: T
# third syntax idea
concrete FixedMatrix{T, M, N} <: NTuple{NTuple{M, T}, N} # in contrast to abstract
Intended semantic: the first type (e.g. RGB) is either a subtype of the second
(e.g. NTuple), or has the type Union(RGB, NTuple).
I'm not sure which makes more sense. But it should solve the following problem:
> There is an already existing type and one wants to add functions to it,
> without adding them to the original type.
This behavior is wanted in a myriad of cases and I've seen workarounds for
this situation in many packages (including mine).
### Examples
A few examples that can be elegantly implemented with this feature, all of
which assume that NTuple implements common, SIMD-accelerated vector operations:
#### Adding functions to an existing type:
# under the assumption, that NTuple implements common vector operations
immutable RGB{T} <: NTuple{3, T}
# under the assumption, that getfield can be extended
Base.getfield(c::RGB, ::Field{:r}) = c[1]
Base.getfield(c::RGB, ::Field{:g}) = c[2]
Base.getfield(c::RGB, ::Field{:b}) = c[3]
# which gives you the access to RGB which you'd expect from it.
# could also be implemented with a macro: @accessors RGB :r => 1, :g => 2, :b => 3
# So this works:
a = RGB(0.1,0.2,0.3)
a+a # is implemented by NTuple
a.r # implemented by RGB
#Should not work:
(1,2,3).r
To further dwell on the point that we can have a diversity of SIMD-accelerated
vector types with minimal re-implementation of common functionality, here is
another example:
#### Adding Semantic to generic Types
immutable Point{T, N} <: NTuple{N, T}
immutable RGB{T, N} <: NTuple{N, T}
immutable Red{T <: Real} <: T
immutable Green{T <: Real} <: T
immutable Blue{T <: Real} <: T
red = Red(1f0) # -> this will behave exactly like a normal Float32 value
# making it unnecessary, to re-implement all the operations on it
immutable X{T <: Real} <: T
immutable Y{T <: Real} <: T
immutable Z{T <: Real} <: T
Base.getfield(c::RGB, field::Type{Red}) = Red(c[1])
...
Base.getfield(p::Point, field::Type{X}) = X(p[1])
...
Base.getfield(p::Matrix{RGB}, field::Type{Red}) = ... # -> Matrix{Red}
Base.getfield(p::Matrix{Point}, field::Type{X}) = ... # -> Matrix{X}
...
image = [RGB(1f0, 0f0,0f0) for i=1:512, j=1:512]
points = [Point(1f0, 0f0,0f0) for i=1:512, j=1:512]
redchannel = image.Red
zplane = image.Z
###### Now the big magic, won by more type information
visualize(image) # -> rgb image display
visualize(points) # -> point cloud
visualize(redchannel) # I can actually infer what the user wants to see: a red intensity image
visualize(zplane) # Again, it's relatively clear what is expected here: a heightfield
This would be pretty much the paradise for any visualization library. Also,
visual debugging is made a lot easier, as the debugger doesn't need any magic
to infer how it needs to visualize some random variable.
But this matters for more than that, as it becomes easier to do the correct
thing with any value.
If you want to see something else, for example the pixels of the image in RGB
space, you can simply reinterpret the RGB values as Points (which is O(1)), which
can be useful to make out color clusters.
#### Let function bodies only do, what is implied by their function name
Compare these two implementations:
#First implementation of matrix multiplication
function (*)(a::Matrix, b::Matrix)
@assert size(a, 2) == size(b, 1) #This isn't part of the multiplication code
#matrix multiplication code
end
# This is probably a matter of taste, but I like this implementation more:
FixedSizeArray{SZ, T, N} <: Array{T,N}
Base.call{T,N}(::Type{FixedSizeArray}, x::Array{T,N}) = FixedSizeArray{size(x), T,N}(x)
function (*)(a::Matrix, b::Matrix)
return (FixedSizeArray(a) * FixedSizeArray(b))
end
# Like this, in the function body is only, what is really part of the matrix multiplication:
function (*)(a::FixedMatrix{T, N, M}, b::FixedMatrix{T, M, P})
# code for multiplication of matrices with well suited dimensions
end
function (*)(a::FixedMatrix, b::FixedMatrix)
# code for multiplication of matrices with arbitrary dimensions
# which usually throws an error
end
This is all mostly graphics related, which isn't a big surprise considering my
background. But I'm pretty sure there are good use cases in other
fields ;)
But I'd better not enumerate more use cases here, as this would turn even more
into a novel.
Hope we can realize this in some way!
Best,
Simon
| 0 |
* I have searched the issues of this repository and believe that this is not a duplicate.
When I use ListItem with rootRef I get a warning message in the console in Chrome:
Warning: React does not recognize the `rootRef` prop on a DOM element. If you
intentionally want it to appear in the DOM as a custom attribute, spell it as
lowercase `rootref` instead. If you accidentally passed it from a parent
component, remove it from the DOM element.
(I need rootRef because I want to show a popover on the ListItem, and the
ListItem should be the anchorElement)
"material-ui": "1.0.0-beta.23"
"react": "16.2.0"
Chrome: Version 62.0.3202.94
|
* I have searched the issues of this repository and believe that this is not a duplicate.
## Expected Behavior
The same as ListItemSecondaryAction
## Current Behavior
No (simple) ability to implement a secondary action
## Context
Some quick-access-like actions from expansion panel's head.
## Your Environment
Tech | Version
---|---
Material-UI | v1.0.0-beta.30
React | 16
| 0 |
### Is there an existing issue for this?
* I have searched the existing issues
### Current Behavior
Since NPM v7:
When installing a new dependency from a public repository (`github:foo/bar`),
this creates a `git+ssh://git@github.com/foo/bar` entry in the `package-lock.json`.
This breaks when running `npm ci` in a GitHub Action because it doesn't have
the SSH agent configured. And it shouldn't need to, because the dependency is
public.
### Expected Behavior
Don't put `git+ssh` there but `github:`.
### Steps To Reproduce
`npm i github:foo/bar`
### Environment
* OS: macOS Big Sur
* Node 16
* npm v7.12.1
|
### Current Behavior:
When I use a git repository via an HTTP link NPM "takes liberties" with it,
which breaks my build:
$ npm init -y
Wrote to /Users/eugene.lazutkin/Work/temp/package.json:
{
"name": "temp",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC"
}
$ npm i --save https://github.com/uhop/stream-chain.git
added 1 package, and audited 2 packages in 3s
found 0 vulnerabilities
It produces `package-lock.json`:
{
"name": "temp",
"version": "1.0.0",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"version": "1.0.0",
"license": "ISC",
"dependencies": {
"stream-chain": "github:uhop/stream-chain"
}
},
"node_modules/stream-chain": {
"version": "2.2.4",
"resolved": "git+ssh://git@github.com/uhop/stream-chain.git#459f5a1708c138b6e0abaae4cf103c3488e1e78e",
"license": "BSD-3-Clause"
}
},
"dependencies": {
"stream-chain": {
"version": "git+ssh://git@github.com/uhop/stream-chain.git#459f5a1708c138b6e0abaae4cf103c3488e1e78e",
"from": "stream-chain@github:uhop/stream-chain"
}
}
}
Note that `https://github.com/uhop/stream-chain.git` was replaced with
`github:uhop/stream-chain`, which is probably OK in this case. But other two
links (?) are rewritten from `https://github.com/uhop/stream-chain.git` to
`git+ssh://git@github.com/uhop/stream-chain.git`, which is clearly bad.
The problem is that a build bot we use in similar situations can access
private git repositories using HTTP, but not SSH for security reasons. It
fails on authentication. Rewriting `https://github.com/uhop/stream-chain.git`
to `git+ssh://git@github.com/uhop/stream-chain.git` is not
acceptable for that reason.
The fix is relatively minor yet unpleasant: we have to replace `npm ci` with
`npm i`, which takes more time and introduces instabilities with other
dependencies.
### Expected Behavior:
When running `npm ci` it should use the original URL with the HTTP
authentication instead of SSH.
### Steps To Reproduce:
See the description and do the same steps using git repositories (github
only?) as dependencies.
### Environment:
OS: Mac
Node: 15.7.0
NPM: 7.4.3
| 1 |
**Symfony version(s) affected** : 4.1.8
**Description**
Symfony crashes php server without providing any error log.
The server log provides me with the following information:
[OK] Server listening on http://127.0.0.1:8000
// Quit the server with CONTROL-C.
PHP 7.2.12 Development Server started at Wed Nov 28 15:20:14 2018
Listening on http://127.0.0.1:8000
Document root is /Users/christianbieg/Documents/Websites/symfony-test/public
Press Ctrl-C to quit.
[ERROR] The process has been signaled with signal "11".
**How to reproduce**
`composer create-project symfony/website-skeleton symfony-test && cd symfony-test`
`php bin/console server:run`
and then open the website on http://localhost:8000
|
**Symfony version(s) affected** : 4.1.8
**Description**
`(in built in server)[ERROR] The process has been signaled with signal "11".`
**How to reproduce**
In any new project, if you clone `symfony/website-skeleton` run built-in
server
**Additional context**
I think it has a relationship with latest [Routing] fixes `@Route`
| 1 |
#6497 (identified by bisection) led to pick events reporting incorrect indices
for step plots.
MWE:
from pylab import *
plot(rand(100), picker=5, drawstyle="steps-pre") # 5 points tolerance
def on_pick(event):
print(event.ind)
cid = gcf().canvas.mpl_connect('pick_event', on_pick)
show()
On 1.5.1, clicking on the "right" part of the plot reports an index close to
100 (that is, the number of points). On master, the index is instead close to
200 (because each point is "duplicated" in the path).
While the previous behavior could "easily" be restored, it may be a good time
to revisit what the "index" returned by `Line2D.contains` actually means.
Specifically:
* For non-step plots, I think the indices should actually be floats, that indicate where between which two points the projection of the click onto the line falls (for each line for which the projection is close enough), and how close to each extremity of the segment. (This would be only a minor backwards incompatibility if the returned index is used, well, to index the data: recent numpys would emit a warning about indexing with a float.)
* For step plots, it is less clear what the correct solution is. Perhaps keeping the status quo and returning the index of the preceding point(s) would be enough.
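For the non-step case, the float-index idea in the first bullet could be computed by projecting the click onto each segment and keeping the closest one. A rough sketch (the function name and signature are mine, not part of Matplotlib's API; coordinates are assumed to already be in a comparable space such as pixels):

```python
import numpy as np

def fractional_index(xs, ys, px, py):
    """Return the float 'index' of the point on the polyline closest to (px, py).

    For a click projecting t of the way along segment [i, i+1], returns i + t.
    """
    best = (np.inf, 0.0)  # (squared distance, fractional index)
    for i in range(len(xs) - 1):
        dx, dy = xs[i + 1] - xs[i], ys[i + 1] - ys[i]
        denom = dx * dx + dy * dy
        if denom == 0:
            t = 0.0  # degenerate (repeated) point: snap to the vertex
        else:
            # Projection parameter of the click onto the segment, clamped to [0, 1].
            t = min(1.0, max(0.0, ((px - xs[i]) * dx + (py - ys[i]) * dy) / denom))
        qx, qy = xs[i] + t * dx, ys[i] + t * dy
        d2 = (px - qx) ** 2 + (py - qy) ** 2
        if d2 < best[0]:
            best = (d2, i + t)
    return best[1]
```

A click near the middle of the first segment yields an index near 0.5; consumers that really want an integer could round or floor it, which is where the minor backwards incompatibility mentioned above comes in.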
|
The weekly build with nightly wheels from numpy and pandas
has failed. Check the logs for any updates that need to be
made in matplotlib.
https://github.com/matplotlib/matplotlib/actions/runs/2175468878
| 0 |
Hi there!
I am using an Anaconda 4.3.18 64-bit installation on Windows 7. The version
of Python is 3.6.1 and the NumPy version is 1.12.1.
I imported an ascii file, and ran these commands:
import numpy as np
# read the ascii file into attribute variable
sorted_items= np.sort(attribute, axis=0)
sorted_ids= np.argsort(attribute,axis=0)
Then, I exported sorted_items and sorted_ids to an ASCII file.
Thereafter, I imported the same file in Octave version 4.2.0 64-bit and ran
this command:
[sorted_items,sorted_ids]= sort(attribute)
Also, I exported the outputs to an ASCII file.
When I compared these outputs, I noticed the following:
* The sorted values in Python match those in Octave
* The sorted indices show issues. The indices reported by NumPy differ from those reported by Octave in many places (see the Excel spreadsheet, rows with a blue background)
* The sorted indices show some good matches. A good index match is when the Octave index is one higher than the Python one (because Octave starts counting from 1, instead of 0 for Python)
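One plausible explanation for the mismatched indices (my assumption; only the attached data can confirm it) is ties: when the array contains equal values, any ordering of their indices is a valid sort, and NumPy's default `argsort` is not stable. A minimal sketch:

```python
import numpy as np

a = np.array([3.0, 1.0, 1.0, 2.0])
# The two 1.0 entries are ties: either order of their indices (1, 2) is a
# valid sort, so NumPy and Octave may legitimately disagree there.
default_ids = np.argsort(a)                    # default kind: tie order unspecified
stable_ids = np.argsort(a, kind='mergesort')   # mergesort is stable: ties keep input order

assert np.array_equal(a[default_ids], a[stable_ids])   # sorted values always agree
assert np.array_equal(stable_ids, [1, 2, 3, 0])
```

If that is what is happening, `kind='mergesort'` (available in NumPy 1.12) gives a deterministic tie order, though it still need not match Octave's choice.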
I attached two files to this issue: the Excel spreadsheet, and the text file
containing the data used to report this issue.
Many thanks,
Ivan
Data.zip
|
### Describe the issue:
From the attached CSV file, input_data.csv, I import the, say, `test_list`.
Then, I create a `pd.Dataframe` to store my data in `test_df`.
from scipy import fft
import numpy as np
import pandas as pd
test_df = pd.DataFrame(index=range(len(test_list)), columns=['test'])
test_df['test'] = test_list
I want to calculate the FFT transform of the signal in `test_df['test']`.
However, I get different results when I pass as input different types of
arguments to the `scipy.fft.fft()`.
Once I pass the signal as a nympy.array I get the plot #1.

Once I pass the signal as a pd.Dataframe I get the plot #2.

Both inputs, np.array and pd.DataFrame contain the same information.
From the problem definition the plot#2 seems to have a better physical
interpretation.
However, it is not clear why I get different results.
May be it is a bug?
The code I use is below:
from scipy import fft
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
test_df = pd.DataFrame(index=range(len(test_list)), columns=['test'])
test_df['test'] = test_list
########## PLOT 1: from Numpy ##############
to_np = test_df['test'].to_numpy()
print(to_np)
fft_output = fft.fft(to_np)
power = np.abs(fft_output)
freq = np.fft.fftfreq(len(to_np))
mask = freq >= 0
freq = freq[mask]
power = power[mask]
plt.plot(freq, power)
plt.show()
########## PLOT 2: from pd.DataFrame ##############
to_df = test_df['test'].to_frame()
print(to_df)
fft_output = np.fft.fft(to_df)
power = np.abs(fft_output)
freq = fft.fftfreq(len(to_df))
mask = freq >= 0
freq = freq[mask]
power = power[mask]
plt.plot(freq, power)
plt.show()
### Reproduce the code example:
from scipy import fft
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
test_df = pd.DataFrame(index=range(len(test_list)), columns=['test'])
test_df['test'] = test_list
########## PLOT 1: from Numpy ##############
to_np = test_df['test'].to_numpy()
print(to_np)
fft_output = fft.fft(to_np)
power = np.abs(fft_output)
freq = np.fft.fftfreq(len(to_np))
mask = freq >= 0
freq = freq[mask]
power = power[mask]
plt.plot(freq, power)
plt.show()
########## PLOT 2: from pd.DataFrame ##############
to_df = test_df['test'].to_frame()
print(to_df)
fft_output = np.fft.fft(to_df)
power = np.abs(fft_output)
freq = fft.fftfreq(len(to_df))
mask = freq >= 0
freq = freq[mask]
power = power[mask]
plt.plot(freq, power)
plt.show()
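A likely source of the difference (an assumption on my part, based only on NumPy's documented axis convention, not anything stated above): `np.fft.fft` transforms along the last axis by default, and a one-column DataFrame converts to an `(n, 1)` array, so each "FFT" is taken over a single sample and returns it unchanged:

```python
import numpy as np

sig = np.sin(np.linspace(0, 8 * np.pi, 64))
col = sig.reshape(-1, 1)     # the shape a one-column DataFrame converts to: (64, 1)

flat_fft = np.fft.fft(sig)   # one 64-point FFT over the whole signal
col_fft = np.fft.fft(col)    # FFT along the LAST axis: 64 separate length-1 FFTs

assert col_fft.shape == (64, 1)
# A length-1 DFT returns its single sample unchanged, so np.abs(col_fft)
# is just the rectified signal, not a spectrum.
assert np.allclose(col_fft[:, 0], sig)
```

If that is what is happening here, passing `axis=0` or flattening with `.to_numpy().ravel()` should make both code paths agree.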
### Error message:
_No response_
### Runtime information:
1.23.5
3.9.16 | packaged by conda-forge | (main, Feb 1 2023, 21:38:11)
[Clang 14.0.6 ]
### Context for the issue:
_No response_
| 0 |
**I'm submitting a ...** (check one with "x")
[x] bug report => search github for a similar issue or PR before submitting
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://github.com/angular/angular/blob/master/CONTRIBUTING.md#question
**Current behavior**
`TypeError: values.map is not a function at
SelectMultipleControlValueAccessor.writeValue (forms.umd.min.js:13) at
setUpControl (forms.umd.min.js:13) at FormControlDirective.ngOnChanges
(forms.umd.min.js:15) at
DebugAppView._View_ApiAccessEditorComponent0.detectChangesInternal
(ApiAccessEditorComponent.ngfactory.js:472) at
DebugAppView.AppView.detectChanges (core.umd.min.js:38) at
DebugAppView.detectChanges (core.umd.min.js:38) at
DebugAppView.AppView.detectViewChildrenChanges (core.umd.min.js:38) at
DebugAppView._View_ApiAccessComponent6.detectChangesInternal
(ApiAccessComponent.ngfactory.js:754) at DebugAppView.AppView.detectChanges
(core.umd.min.js:38) at DebugAppView.detectChanges (core.umd.min.js:38)`
**Expected behavior**
Select the Multiple option as a array is provided in the model
**Reproduction**
`<select multiple role="listbox"
[formControl]="apiAccessForm.controls['roleName']"
[(ngModel)]="objSelectedRole" title="select question" class="form-control">
<option *ngFor="let obj of objRoleList"
[value]="obj.roleName">{{obj.roleName}}</option> </select>`
**It works if I remove both the ngModel and formControl directives; here
objSelectedRole is data from the DB**
* **Angular version:** 2.0.1
* **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ]
* **Language:** [all | TypeScript X.X | ES6/7 | ES5]
* **Node (for AoT issues):** `node --version` =
|
**I'm submitting a ...** (check one with "x")
[X] bug report => search github for a similar issue or PR before submitting
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://github.com/angular/angular/blob/master/CONTRIBUTING.md#question
**Current behavior**
When using the basic example of the Dynamic Form, I try to add
[multiple]="true" to the select:
<select [id]="question.key" *ngSwitchCase="'dropdown'" [formControlName]="question.key" [multiple]="true">
<option *ngFor="let opt of question.options" [value]="opt.key">{{opt.value}}</option>
</select>
And I get a values.map does not exist error:
TypeError: values.map is not a function
at SelectMultipleControlValueAccessor.writeValue (forms.umd.js:1317)
at setUpControl (forms.umd.js:1470)
at FormGroupDirective.addControl (forms.umd.js:3889)
at FormControlName._setUpControl (forms.umd.js:4318)
at FormControlName.ngOnChanges (forms.umd.js:4264)
at DebugAppView._View_DynamicFormQuestionComponent22.detectChangesInternal
(DynamicFormQuestionComponent2.ngfactory.js:514)
at DebugAppView.AppView.detectChanges (core.umd.js:9566)
at DebugAppView.detectChanges (core.umd.js:9671)
at DebugAppView.AppView.detectContentChildrenChanges (core.umd.js:9584)
at DebugAppView._View_DynamicFormQuestionComponent20.detectChangesInternal
(DynamicFormQuestionComponent2.ngfactory.js:200)
**Expected behavior**
No error should be thrown when using [multiple]="true"
**Reproduction of the problem**
Run through the Dynamic Form documentation and just add [multiple]="true" to
the select
**What is the motivation / use case for changing the behavior?**
Broken
**Please tell us about your environment:**
Surface Book w/ Win10 in visual studio using npm/gulp with SystemJS
* **Angular version:** 2.0.X
2.0.0
* **Browser:** [all | Chrome XX | Firefox XX | IE XX | Safari XX | Mobile Chrome XX | Android X.X Web Browser | iOS XX Safari | iOS XX UIWebView | iOS XX WKWebView ]
Everything tried... Chrome, edge, firefox
* **Language:** [all | TypeScript X.X | ES6/7 | ES5]
Typescript
* **Node (for AoT issues):** `node --version` =
| 1 |
The scope of a block of PHP code that is embedded inside a separate
language (i.e. within a JS block in a page) is not detected properly if the
enclosing PHP tags are both on the same line. If the tags are not separated
onto different lines, Atom thinks the code is just part of the enclosing
code block.
For example, this bit of code is incorrectly detected as part of the
JavaScript block:
<html>
<head>
<script type="text/javascript">
<?php echo 'test'; ?>
</script>
</head>
</html>

While this code is properly detected as being a PHP block:
<html>
<head>
<script type="text/javascript">
<?php
echo 'test';
?>
</script>
</head>
</html>

* * *
This issue is present on a completely fresh Atom installation, as well as one
with a multitude of plugins.
Atom Version: 1.0.0
Operating Systems:
* Ubuntu 15.04 x64
* Windows 8.1 x64
|
A thread has come up on the forum that points out that `<?php ... ?>` doesn't
get matched as PHP code when on the same line. I did a little bit of poking
around and found that the embedded whitespace pattern has a negative lookahead
`(?![^?]*\\?>)` that, when removed, allows single-line PHP comments to
display. The thing is, I can't figure out what that negative lookahead was put
there to check against. My fork of this package seems to work just fine.
Does anybody know what the negative lookahead was meant to guard against?
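A simplified stand-in for that rule (a sketch, not the grammar's actual engine) shows what the lookahead does: it vetoes a match whenever a closing `?>` still lies ahead in the same string, which is exactly the single-line case:

```python
import re

# Simplified version of the embedded-whitespace rule: the negative
# lookahead (?![^?]*\?>) blocks the match when "?>" is still ahead.
pattern = re.compile(r"(?![^?]*\?>)\s+")

single_line = " echo 'test'; ?>"  # body of a one-line <?php ... ?> block
multi_line = "\necho 'test';"     # body when the tags sit on their own lines

print(bool(pattern.match(single_line)))  # False: lookahead vetoes the match
print(bool(pattern.match(multi_line)))   # True
```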
| 1 |
## Bug Report
**Current Behavior**
Some const assertions result in errors. Seems like the check introduced in
this PR is too strict? Because I didn't see any errors in VSCode till I
started webpack dev server.
**Input Code**
function foo() {
const a = getObjectAWithLongAssType();
const b = getObjectBWithLongAssType();
return [a, b] as const; // no errors in tsc or vscode
// Previously: return [a,b] as [typeof a, typeof b]
}
**Expected behavior/code**
VSCode outputs the expected readonly tuple type, so I don't expect any errors
in babel.
**Babel Configuration (.babelrc, package.json, cli command)**
"@babel/core": "^7.4.0",
"@babel/plugin-proposal-class-properties": "^7.4.0",
"@babel/polyfill": "^7.4.0",
"@babel/preset-env": "^7.4.2",
"@babel/preset-react": "^7.0.0",
"@babel/preset-typescript": "^7.3.3",
**Possible Solution**
@tanhauhau is there a need for `tsCheckLiteralForConstantContext` to happen in
babel? To me it seems like it's enough to parse `as const` expressions as
assertions and move on, since TypeScript is actually in charge of ensuring the
assertions are valid.
|
> Issue originally made by @s-panferov
### Bug information
* **Babel version:** any
* **Node version:** >= 4
### Input code
import * as crypto from 'crypto';
### Description
`_interopRequireWildcard` iterates through export object properties and this
fires deprecation warning because of deprecated property access.
| 0 |
This program:
pub fn main() {
let s: &str = "foo";
fail!(s);
}
generates this error:
<std-macros>:37:13: 37:36 error: instantiating a type parameter with an incompatible type `&str`, which does not fulfill `Send`
<std-macros>:37 ::std::rt::begin_unwind($msg, file!(), line!())
^~~~~~~~~~~~~~~~~~~~~~~
<std-macros>:32:5: 42:6 note: in expansion of fail!
badfail.rs:3:5: 3:14 note: expansion site
error: aborting due to previous error
It does point you to the right place if you read it closely, but I missed it
myself the first few times.
|
Followup to #1970.
One of the two tests in struct-return.rs doesn't work on x86. Works on x64
though.
The double/byte/double one winds up feeding garbage memory to the callee.
Valgrind gets upset. Program crashes.
| 0 |
maintenance/1.9.x:
In [8]: ap = np.array([0, 2, 4, 6, 8, 10])
# Passing an empty 'k' array to partition is a no-op:
In [9]: ap.partition(np.array([], dtype=np.int64))
In [10]: ap
Out[10]: array([ 0, 2, 4, 6, 8, 10])
1.10.1:
In [2]: ap = np.array([0, 2, 4, 6, 8, 10])
In [3]: ap.partition(np.array([], dtype=np.int64))
---------------------------------------------------------------------------
MemoryError Traceback (most recent call last)
<ipython-input-3-c708b3f943a6> in <module>()
----> 1 ap.partition(np.array([], dtype=np.int64))
MemoryError:
Noticed this because it's breaking some tests in patsy.
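For reference, the 1.9.x behavior the tests rely on (and, I assume, the behavior after a fix) can be stated as a small invariant: partitioning with an empty `kth` array should leave the data untouched:

```python
import numpy as np

ap = np.array([0, 2, 4, 6, 8, 10])
before = ap.copy()

# With an empty 'kth' array there is nothing to partition around, so the
# call should be a no-op rather than raise MemoryError as on 1.10.1.
ap.partition(np.array([], dtype=np.int64))

print(np.array_equal(ap, before))
```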
|
If np.unique is used with `return_index=True` and the input is empty, the
resulting array has dtype `bool` instead of `int`.
In [1]: import numpy as np
In [2]: np.unique([], return_index=True)
Out[2]: (array([], dtype=float64), array([], dtype=bool))
In [3]: np.unique([1,], return_index=True)
Out[3]: (array([1]), array([0]))
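The practical requirement (my framing) is that the index array stays integer-typed even when empty, so it can always be used to index back into the input; a length-0 bool array would instead be read as a mask of the wrong length:

```python
import numpy as np

vals, idx = np.unique(np.array([], dtype=np.float64), return_index=True)

# An integer index dtype keeps arr[idx] valid even for empty results.
print(idx.dtype.kind)             # 'i' on fixed versions
print(np.array([1.0, 2.0])[idx])  # empty selection, no error
```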
| 0 |
The demo uses a simple Quiz application (just the routing part).
Steps to reproduce -
1. Start the quiz
2. Navigate forward
3. Click the back button and notice that the component does not render properly
See Plunker - http://plnkr.co/edit/8LMo0VLCLIuM06itCifn?p=preview
**Current behavior**
* usage of the browser's back button does not cause component to re-render properly
**Expected/desired behavior**
* usage of the browser's back button _does_ cause the component to re-render properly (see Google Chrome behavior for expected behavior)
**Other information**
I also posted a question to Stack Overflow about this for additional context
|
I am using Angular 2.0.0-beta.11 on OS X El Capitan. When Safari's browser
history back function is used, `OnInit` does not get fired. It seems to work
fine on Chrome.
Steps to reproduce:
1. Fire up the official Tour Of Heroes Demo
2. Click on any hero in the dashboard
3. Use the browser back button or click the "back" button in the demo. The dashboard doesn't populate.
Related: #4809
| 1 |
## Issue description
ResNet-50 (resolution of 128, batch size 8) crashes on PyTorch 0.4.1, CUDA 9.2
with fp16. It works with CUDA 9.0 and fp32.
## Code example
https://github.com/ddkang/fai-imagenet/tree/imagenet/imagenet_nv
Starting program: /home/daniel_d_kang/anaconda3/envs/dawnbench/bin/python main.py -a resnet50 --lr 0.40 --epochs 45 --small -j 8 --fp16 -b 8 --loss-scale 512 /mnt/disks/dawnbench/imagenet
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
~~epoch hours top1Accuracy
[New Thread 0x7fffaf723700 (LWP 12826)]
[New Thread 0x7fffaef22700 (LWP 12827)]
0
[New Thread 0x7fff44a3f700 (LWP 13143)]
[New Thread 0x7fff426fe700 (LWP 13155)]
[New Thread 0x7fff4173f700 (LWP 13156)]
[New Thread 0x7fff40f3e700 (LWP 13157)]
[New Thread 0x7fff4073d700 (LWP 13158)]
[New Thread 0x7fff3ff3c700 (LWP 13159)]
[New Thread 0x7fff3f73b700 (LWP 13162)]
[New Thread 0x7fff3e77f700 (LWP 13164)]
[New Thread 0x7fff3df7e700 (LWP 13165)]
zero_grad
tensor(3648., device='cuda:0', dtype=torch.float16, grad_fn=<MulBackward>)
torch.Size([])
torch.cuda.HalfTensor
[New Thread 0x7fff3d77d700 (LWP 13216)]
[New Thread 0x7fff3cf7c700 (LWP 13217)]
Thread 14 "python" received signal SIGFPE, Arithmetic exception.
[Switching to Thread 0x7fff3cf7c700 (LWP 13217)]
0x00007fffc3151984 in cudnn::gemm::conv2d(cudnnContext*, void const*, cudnnTensor4dStruct*, void const*, cudnnFilter4dStruct*, void const*, cudnnConvolutionStruct*, cudnnConvWorkingStruct const*, void*, unsigned long, void const*, cudnnTensor4dStruct*, void*, cudnn::gemm::Conv2dType_t, cudnn::gemm::Conv2dConfig&, bool, void const*, cudnnActivationStruct*, void*) () from /home/daniel_d_kang/anaconda3/envs/dawnbench/lib/python3.6/site-packages/torch/lib/libcaffe2_gpu.so
## System Info
Collecting environment information...
PyTorch version: 0.4.1
Is debug build: No
CUDA used to build PyTorch: 9.2.148
OS: Ubuntu 16.04.5 LTS
GCC version: (Ubuntu 5.4.0-6ubuntu1~16.04.10) 5.4.0 20160609
CMake version: Could not collect
Python version: 3.6
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration: GPU 0: Tesla V100-SXM2-16GB
Nvidia driver version: 396.44
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.7.2.1
/usr/lib/x86_64-linux-gnu/libcudnn_static_v7.a
Versions of relevant libraries:
[pip] numpy (1.14.3)
[pip] numpydoc (0.8.0)
[pip] torch (0.4.1)
[pip] torchvision (0.2.1)
[conda] cuda92 1.0 0 pytorch
[conda] pytorch 0.4.1 py36_cuda9.2.148_cudnn7.1.4_1 [cuda92] pytorch
[conda] torchvision 0.2.1 py36_1 pytorch
|
## Issue description
The following code produces a `Floating point exception (core dumped)` on
Volta series with fp16:
import torch
from torch.nn import Conv2d
conv=torch.nn.Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False).cuda().half()
x=torch.rand(2,256,196,196, requires_grad=True).cuda().half()
y=conv(x)
loss=y.min()
loss.backward()
By changing the convolution parameters by +1/-1 the code works.
I suspect this to be a cudnn bug.
## System Info
Collecting environment information...
PyTorch version: 0.4.0a0+3749c58
Is debug build: No
CUDA used to build PyTorch: 9.2.88
OS: Ubuntu 16.04.4 LTS
GCC version: (Ubuntu 5.4.0-6ubuntu1~16.04.10) 5.4.0 20160609
CMake version: version 3.5.1
Python version: 3.5
Is CUDA available: Yes
CUDA runtime version: 9.2.88
GPU models and configuration:
GPU 0: TITAN V
GPU 1: TITAN V
GPU 2: TITAN V
GPU 3: TITAN V
Nvidia driver version: 396.26
cuDNN version: Probably one of the following:
/usr/local/cuda-9.2/cuda/lib64/libcudnn.so.7.1.4
/usr/local/cuda-9.2/cuda/lib64/libcudnn_static.a
/usr/local/cuda-9.2/targets/x86_64-linux/lib/libcudnn.so.7.1.4
/usr/local/cuda-9.2/targets/x86_64-linux/lib/libcudnn_static.a
Versions of relevant libraries:
[pip3] numpy (1.14.5)
[pip3] torch (0.4.0a0+3749c58)
[pip3] torchvision (0.2.1)
[conda] Could not collect
Reproducible also on V100.
| 1 |
Currently we have "something" that shows up in the overlay onDrag that will
then follow your finger and the original Draggable doesn't move or change. I
want it to look like the original is following my finger.
|
What I want to be able to do:
Lets say I have a vertical list of five draggables. Dragging one from the
middle should make it appear it is being/has been removed from the list during
the drag.
Draggable's API doesn't support this use case atm.
| 1 |
I was running Julia 1.6.1, and typed `gc` into the REPL. This led to the
errors shown below.
I also have the current master branch of Julia installed (which identifies as
`Julia 1.8`).
Note that the first error mentions the file name
`/Users/eschnett/.julia/environments/v1.8/Manifest.toml`, which is the
manifest for Julia 1.8.
I think each version of Julia should either be able to read other versions'
manifests, or should handle such incompatibilities gracefully.
(@v1.6) pkg> gc
┌ Error: Could not parse entry for `deps`
└ @ Pkg.Types ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/manifest.jl:150
┌ Warning: Reading manifest file at /Users/eschnett/.julia/environments/v1.8/Manifest.toml failed with error
│ exception =
│ MethodError: no method matching get(::Pair{String, Any}, ::String, ::Nothing)
│ Closest candidates are:
│ get(::DataStructures.Accumulator, ::Any, ::Any) at /Users/eschnett/.julia/packages/DataStructures/ixwFs/src/accumulator.jl:47
│ get(::Test.GenericDict, ::Any, ::Any) at /Users/eschnett/src/julia-1.6/usr/share/julia/stdlib/v1.6/Test/src/Test.jl:1663
│ get(::DataStructures.RobinDict{K, V}, ::Any, ::Any) where {K, V} at /Users/eschnett/.julia/packages/DataStructures/ixwFs/src/robin_dict.jl:384
│ ...
└ @ Pkg.API ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/API.jl:504
┌ Error: Could not parse entry for `deps`
└ @ Pkg.Types ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/manifest.jl:150
┌ Warning: Reading manifest file at /Users/eschnett/.julia/environments/v1.7/Manifest.toml failed with error
│ exception =
│ MethodError: no method matching get(::Pair{String, Any}, ::String, ::Nothing)
│ Closest candidates are:
│ get(::DataStructures.Accumulator, ::Any, ::Any) at /Users/eschnett/.julia/packages/DataStructures/ixwFs/src/accumulator.jl:47
│ get(::Test.GenericDict, ::Any, ::Any) at /Users/eschnett/src/julia-1.6/usr/share/julia/stdlib/v1.6/Test/src/Test.jl:1663
│ get(::DataStructures.RobinDict{K, V}, ::Any, ::Any) where {K, V} at /Users/eschnett/.julia/packages/DataStructures/ixwFs/src/robin_dict.jl:384
│ ...
└ @ Pkg.API ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/API.jl:504
Active manifest files: 54 found
Active artifact files: 208 found
┌ Error: Could not parse entry for `deps`
└ @ Pkg.Types ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/manifest.jl:150
┌ Error: Could not parse entry for `deps`
└ @ Pkg.Types ~/src/julia-1.6/usr/share/julia/stdlib/v1.6/Pkg/src/manifest.jl:150
Active scratchspaces: 18 found
Deleted 1 artifact installation (4.654 MiB)
|
This looks like valid syntax but is not accepted.
julia> is_array(::T) :: Bool where {T<:AbstractArray} = true
ERROR: UndefVarError: T not defined
Stacktrace:
[1] top-level scope at none:0
Using the same syntax, but with `function` keyword is accepted.
julia> function is_array(::T) :: Bool where {T<:AbstractArray}
return true
end
is_array (generic function with 1 method)
Removing the return type is also accepted.
julia> is_array_(::T) where {T<:AbstractArray} = true
is_array_ (generic function with 1 method)
Somewhat related issues: #31378, #31542
| 0 |
Basically, every time `setState` is called and a rerender happens, I get this
error:
Invariant Violation: dangerouslyRenderMarkup(...): Cannot render markup in a worker thread.
Make sure `window` and `document` are available globally before requiring React
when unit testing or use React.renderToString for server rendering.
I've created a very bare repo where you can recreate the issue:
https://github.com/dmatteo/setStateBoom
The component + test suite is so small that I can copy/paste it here:
import React, {Component} from 'react/addons';
import jsdomify from 'jsdomify';
import expect from 'unexpected';
class MyComponent extends Component {
constructor(props) {
super(props);
this.displayName = 'MyComponent';
this.state = {
bananas: 'bananas'
};
}
render() {
return <div>{this.state.bananas}</div>
}
}
describe('setState test', () => {
before(() => {
// this leaks document and window to global
jsdomify.create();
});
it('should render', () => {
let instance = React.addons.TestUtils.renderIntoDocument(<MyComponent></MyComponent>);
expect(instance, 'to be defined');
});
it('should not throw', () => {
let instance = React.addons.TestUtils.renderIntoDocument(<MyComponent></MyComponent>);
let foo = () => {
instance.setState({bananas: 'apples'});
};
expect(foo, 'not to throw');
})
});
Do you have any idea what is happening and what I can do about it? (if
anything)
|
Currently React relies on a global `window` (and `document` and `navigator`).
This works in a browser and it works in other environments if one sets a
global `window`.
However, relying on globals prevents multiple independent React engines from
running together.
Testing, for example, benefits strongly from isolated environments. It would
be nice if one could run separate tests with `jsdom` without manually cleaning
up the `window` every time.
Another use case is for programatic behaviour inspection. It's interesting to
load multiple windows with `jsdom` and compare the behavior of one's library
with different calls. If React is involved, however, this is impossible.
Avoiding relying on globals can be added in a backwards-compatible fashion. If
there is a global `window` and `document`, keep the existing behavior. If
there isn't, instead of exposing an API, expose a factory taking a `window`
and returning an API. For example, see how jQuery did it.
A decision to avoid globals means refactoring existing code base to pass
`window` explicitly everywhere, and maintaining that attitude in the future.
What is your opinion on such a feature?
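The factory idea, sketched in language-agnostic terms (a Python analogy, not React's actual API): instead of reading a global at import time, expose a constructor that takes the environment explicitly, so independent instances can coexist:

```python
# Shared global environment, as with a global `window`.
_global_window = {"title": "app"}


def make_renderer(window=None):
    """Factory: bind a renderer to an explicit environment.

    Falls back to the global only when no window is given, which keeps
    the old behavior while allowing isolated instances for testing.
    """
    env = window if window is not None else _global_window

    def render(text):
        return f"[{env['title']}] {text}"

    return render


a = make_renderer({"title": "test-1"})
b = make_renderer({"title": "test-2"})
print(a("hello"))  # [test-1] hello
print(b("hello"))  # [test-2] hello
```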
| 1 |
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.4
* Java version: 1.8
### Scenario
We have an interface provided in two IDC clusters. We call them `BJ` / `HZ`.
The target interface we call it `AService`.
In project `AProject` we want to subscribe to AService separately from both
BJ and HZ in order to register different beans.
In the XML age we could easily achieve that with:
<dubbo:registry id="ARegistry" address="xxx" />
<dubbo:registry id="BRegistry" address="zzz" register="false" />
<dubbo:consumer registry="ARegistry" .... />
<dubbo:consumer interface="AService" id="beanA" />
<dubbo:consumer interface="AService" id="beanB" registry="BRegistry" />
But in dubbo.properties, it's hard and the correct way is:
# main registry
dubbo.registry.id=registryMain
dubbo.registry.address=xxxx
dubbo.registry.file=logs/dubbo-registry-main.properties
dubbo.registry.protocol=zookeeper
# second registry
dubbo.registries.registryOther.id=registryOther
dubbo.registries.registryOther.address=zzzz
dubbo.registries.registryOther.file=logs/dubbo-registry-other.properties
dubbo.registries.registryOther.protocol=zookeeper
dubbo.registries.registryOther.register=false
dubbo.provider.timeout=30000
dubbo.provider.registryIds=registryMain
dubbo.consumer.id=consumerMain
dubbo.consumer.registryIds=registryMain
dubbo.consumer.timeout=30000
dubbo.consumer.retries=0
dubbo.consumer.default=true ########## important
dubbo.consumers.consumerOther.id=consumerOther
dubbo.consumers.consumerOther.registryIds=registryOther
dubbo.consumers.consumerOther.timeout=30000
dubbo.consumers.consumerOther.retries=0
dubbo.consumers.consumerOther.default=false ####### important
dubbo.monitor.protocol=registry
And when reference:
@Reference
private AService beanA;
@Reference(consumer = "consumerOther", injvm = false, id = "beanB")
private AService beanB;
The following will not achieve it:
# main registry
dubbo.registry.id=registryMain
dubbo.registry.address=xxxx
dubbo.registry.file=logs/dubbo-registry-main.properties
dubbo.registry.protocol=zookeeper
dubbo.registry.default=true #########
# second registry
dubbo.registries.registryOther.id=registryOther
dubbo.registries.registryOther.address=zzzz
dubbo.registries.registryOther.file=logs/dubbo-registry-other.properties
dubbo.registries.registryOther.protocol=zookeeper
dubbo.registries.registryOther.register=false
dubbo.registries.registryOther.default=false
dubbo.provider.timeout=30000
dubbo.provider.registryIds=registryMain
dubbo.consumer.id=consumerMain
dubbo.consumer.registryIds=registryMain
dubbo.consumer.timeout=30000
dubbo.consumer.retries=0
dubbo.monitor.protocol=registry
@Reference
private AService beanA;
@Reference(registry = "registryOther", injvm = false, id = "beanB")
private AService beanB;
This will cause an unexpected subscription and a wrong configuration for
beanB. And setting both register=false and subscribe=false on registryOther
causes its initialization to be skipped.
The same applies to other such configuration combinations.
Is that an issue?
|
* I have searched the issues of this repository and believe that this is not a duplicate.
* I have checked the FAQ of this repository and believe that this is not a duplicate.
### Environment
* Dubbo version: 2.7.3
* Operating System version: win7
* Java version: jdk8
### Steps to reproduce this issue
1. Write a Google proto file containing a bytes list or bytes map:
syntax = "proto2";
package org.apache.dubbo.metadata.definition.protobuf.model;
message PBRequestType {
    optional bytes msg = 7;
    repeated bytes bytesList = 10;
    map<string, bytes> bytesMap = 11;
}
2. Generate the Google protobuf entity and use it as a service parameter.
3. add dependency
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>dubbo-metadata-definition-protobuf</artifactId>
</dependency>
4. The service logs an exception when generating the service definition.
See org.apache.dubbo.metadata.definition.protobuf.ProtobufTypeBuilderTest
in the repository.
5. Some compatibility logic could be added for those who want to use the
serviceDefinition: a base64-encoded String value could be used to represent
a bytes property.
### Expected Result
serviceDefinition should be generated successfully.
### Actual Result
If there is an exception, please attach the exception trace:
java.lang.IllegalArgumentException: Map protobuf property bytesMapof Type java.util.Map<java.lang.String, com.google.protobuf.ByteString> can't be parsed.Map with unString key or ByteString value is not supported.
at org.apache.dubbo.metadata.definition.protobuf.ProtobufTypeBuilder.validateMapType(ProtobufTypeBuilder.java:164)
at org.apache.dubbo.metadata.definition.protobuf.ProtobufTypeBuilder.buildProtobufTypeDefinition(ProtobufTypeBuilder.java:103)
at org.apache.dubbo.metadata.definition.protobuf.ProtobufTypeBuilder.build(ProtobufTypeBuilder.java:72)
at org.apache.dubbo.metadata.definition.TypeDefinitionBuilder.build(TypeDefinitionBuilder.java:52)
at org.apache.dubbo.metadata.definition.TypeDefinitionBuilder.build(TypeDefinitionBuilder.java:81)
at org.apache.dubbo.metadata.definition.ServiceDefinitionBuilder.build(ServiceDefinitionBuilder.java:77)
at org.apache.dubbo.metadata.definition.ServiceDefinitionBuilder.buildFullDefinition(ServiceDefinitionBuilder.java:50)
at org.apache.dubbo.metadata.definition.protobuf.ProtobufTypeBuilderTest.testProtobufBuilder(ProtobufTypeBuilderTest.java:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:628)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:117)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:184)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:180)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:127)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:68)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:135)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
at java.util.ArrayList.forEach(ArrayList.java:1249)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
at java.util.ArrayList.forEach(ArrayList.java:1249)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:38)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$5(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$7(NodeTestTask.java:125)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:135)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:123)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:122)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:80)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:32)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:51)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:229)
at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$6(DefaultLauncher.java:197)
at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:211)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:191)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:128)
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:74)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
| 0 |
# Summary of the new feature/enhancement
By default, a PowerShell tab opens, and even when only a single CMD tab is
open, new tabs still default to PowerShell.
# Proposed technical implementation details
A user who only uses CMD should be able to select the default tab type (CMD
or PowerShell).
|
# Description of the new feature/enhancement
Option to let terminal be always on top of all windows, just like the new
calculator app. This would be useful for people learning how to do stuff, as
it can be on top of the tutorial window while not suffering from text going on
to the next line during resizing.
| 0 |
I just updated from beta7 to beta8. After compiling my js file I am seeing the
following error in my console `Uncaught SyntaxError: Unexpected token )`.
Looking at the line where the error occurred I can see that there is an extra
closing parenthesis.
Below is the generated code. (I've indented it so I could read it better)
var _default = /* harmony import */__WEBPACK_IMPORTED_MODULE_1_redux_actions__["handleActions"].call(
undefined,
_handleActions = {},
_defineProperty(
_handleActions,
/* harmony import */__WEBPACK_IMPORTED_MODULE_0__constants_properties__["a"] + '_GET_SUCCESS',
function undefined(state, _ref) {
var payload = _ref.payload;
return /* harmony import */__WEBPACK_IMPORTED_MODULE_2_lodash___default.a.assign(
{},
state,
{
propertyIDs: /* harmony import */__WEBPACK_IMPORTED_MODULE_2_lodash___default.a.union(state.propertyIDs, payload.result)
}
);
}
),
_defineProperty(
_handleActions,
/* harmony import */__WEBPACK_IMPORTED_MODULE_0__constants_properties__["b"] + '_GET_SUCCESS',
function undefined(state, _ref2) {
var payload = _ref2.payload;
return /* harmony import */__WEBPACK_IMPORTED_MODULE_2_lodash___default.a.assign(
{},
state,
{
propertyIDs: /* harmony import */__WEBPACK_IMPORTED_MODULE_2_lodash___default.a.union(state.propertyIDs, payload.result)
}
);
}
),
_handleActions), // <- extra paren
initialState);
Looking over the commits, it looks like it might have been caused by commit
`75b93a1`, though I could be wrong.
|
**Do you want to request a _feature_ or report a _bug_?**
Bug
**What is the current behavior?**
A few large dependencies (slate / react-slate / immutable) shared by two pages
are not put in a shared chunk (a large number of dependencies are the same
including components). See visualization below:

**If the current behavior is a bug, please provide the steps to reproduce.**
splitChunks: {
chunks: 'all',
name: false,
}
**What is the expected behavior?**
In my understanding this should not happen by default, so I'd like to see if
this is incorrect behaviour. If this is the case I'll happily provide more
information to investigate.
**Please mention other relevant information such as the browser version,
Node.js version, webpack version, and Operating System.**
webpack version: 4.5.0.
| 0 |
2020-08-06.txt
Version: 1.0.0
OS Version: Microsoft Windows NT 10.0.18363.0
IntPtr Length: 8
x64: True
Date: 08/06/2020 21:54:23
Exception:
System.ObjectDisposedException: Cannot access a disposed object.
Object name: 'Timer'.
at System.Timers.Timer.set_Enabled(Boolean value)
at System.Timers.Timer.Start()
at PowerLauncher.MainWindow.OnVisibilityChanged(Object sender,
DependencyPropertyChangedEventArgs e)
at System.Windows.UIElement.RaiseDependencyPropertyChanged(EventPrivateKey
key, DependencyPropertyChangedEventArgs args)
at System.Windows.UIElement.OnIsVisibleChanged(DependencyObject d,
DependencyPropertyChangedEventArgs e)
at
System.Windows.DependencyObject.OnPropertyChanged(DependencyPropertyChangedEventArgs
e)
at
System.Windows.FrameworkElement.OnPropertyChanged(DependencyPropertyChangedEventArgs
e)
at
System.Windows.DependencyObject.NotifyPropertyChange(DependencyPropertyChangedEventArgs
args)
at System.Windows.UIElement.UpdateIsVisibleCache()
at System.Windows.PresentationSource.RootChanged(Visual oldRoot, Visual
newRoot)
at System.Windows.Interop.HwndSource.set_RootVisualInternal(Visual value)
at System.Windows.Interop.HwndSource.set_RootVisual(Visual value)
at System.Windows.Window.SetRootVisual()
at System.Windows.Window.SetRootVisualAndUpdateSTC()
at System.Windows.Window.SetupInitialState(Double requestedTop, Double
requestedLeft, Double requestedWidth, Double requestedHeight)
at System.Windows.Window.CreateSourceWindow(Boolean duringShow)
at System.Windows.Window.CreateSourceWindowDuringShow()
at System.Windows.Window.SafeCreateWindowDuringShow()
at System.Windows.Window.ShowHelper(Object booleanBox)
at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate
callback, Object args, Int32 numArgs)
at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source,
Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
|
Popup tells me to give y'all this.
2020-07-31.txt
Version: 1.0.0
OS Version: Microsoft Windows NT 10.0.19041.0
IntPtr Length: 8
x64: True
Date: 07/31/2020 17:29:59
Exception:
System.ObjectDisposedException: Cannot access a disposed object.
Object name: 'Timer'.
at System.Timers.Timer.set_Enabled(Boolean value)
at System.Timers.Timer.Start()
at PowerLauncher.MainWindow.OnVisibilityChanged(Object sender,
DependencyPropertyChangedEventArgs e)
at System.Windows.UIElement.RaiseDependencyPropertyChanged(EventPrivateKey
key, DependencyPropertyChangedEventArgs args)
at System.Windows.UIElement.OnIsVisibleChanged(DependencyObject d,
DependencyPropertyChangedEventArgs e)
at
System.Windows.DependencyObject.OnPropertyChanged(DependencyPropertyChangedEventArgs
e)
at
System.Windows.FrameworkElement.OnPropertyChanged(DependencyPropertyChangedEventArgs
e)
at
System.Windows.DependencyObject.NotifyPropertyChange(DependencyPropertyChangedEventArgs
args)
at System.Windows.UIElement.UpdateIsVisibleCache()
at System.Windows.PresentationSource.RootChanged(Visual oldRoot, Visual
newRoot)
at System.Windows.Interop.HwndSource.set_RootVisualInternal(Visual value)
at System.Windows.Interop.HwndSource.set_RootVisual(Visual value)
at System.Windows.Window.SetRootVisual()
at System.Windows.Window.SetRootVisualAndUpdateSTC()
at System.Windows.Window.SetupInitialState(Double requestedTop, Double
requestedLeft, Double requestedWidth, Double requestedHeight)
at System.Windows.Window.CreateSourceWindow(Boolean duringShow)
at System.Windows.Window.CreateSourceWindowDuringShow()
at System.Windows.Window.SafeCreateWindowDuringShow()
at System.Windows.Window.ShowHelper(Object booleanBox)
at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate
callback, Object args, Int32 numArgs)
at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source,
Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
| 1 |
# Checklist
* I have checked the issues list
for similar or identical feature requests.
* I have checked the pull requests list
for existing proposed implementations of this feature.
* I have checked the commit log
to find out if the same feature was already implemented in the
master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Related Issues and Possible Duplicates
#### Related Issues
* None
#### Possible Duplicates
* None
# Brief Summary
The main idea is to allow setting a flag that specifies the number of tasks a
worker should execute before exiting.
The reason is to be able to scale Celery workers on Kubernetes in a
reasonable manner.
You would create a pod with a worker set to consume one message and then
exit; in Kubernetes you would create a job for each message in the broker
queue that starts that pod.
The current way to do something like this is ugly, since the Kubernetes
autoscaler might decide to scale down a worker that is currently running,
which is problematic for long-running tasks.
# Design
## Architectural Considerations
None
## Proposed Behavior
## Proposed UI/UX
## Diagrams
N/A
## Alternatives
None
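The counting logic such a flag would implement can be sketched in plain
Python. This is illustrative only: `TaskBudget` and its methods are
hypothetical names, not an existing Celery API.

```python
# Minimal sketch of the proposed behaviour, independent of Celery itself.
# A worker would consult this counter after each completed task and stop
# consuming once the budget is exhausted (all names here are hypothetical).
class TaskBudget:
    def __init__(self, max_tasks):
        self.max_tasks = max_tasks
        self.done = 0

    def task_finished(self):
        """Record one completed task; return True while the worker may continue."""
        self.done += 1
        return self.done < self.max_tasks
```

With `max_tasks=1`, the first call to `task_finished()` returns `False`, i.e.
the worker consumes a single message and then exits, matching the
job-per-message Kubernetes pattern described above.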
|
# Checklist
* I have verified that the issue exists against the `master` branch of Celery.
* This has already been asked to the discussion group first.
* I have read the relevant section in the
contribution guide
on reporting bugs.
* I have checked the issues list
for similar or identical bug reports.
* I have checked the pull requests list
for existing proposed fixes.
* I have checked the commit log
to find out if the bug was already fixed in the master branch.
* I have included all related issues and possible duplicate issues
in this issue (If there are none, check this box anyway).
## Mandatory Debugging Information
* I have included the output of `celery -A proj report` in the issue.
(if you are not able to do this, then at least specify the Celery
version affected).
* I have verified that the issue exists against the `master` branch of Celery.
* I have included the contents of `pip freeze` in the issue.
* I have included all the versions of all the external dependencies required
to reproduce this bug.
## Optional Debugging Information
* I have tried reproducing the issue on more than one Python version
and/or implementation.
* I have tried reproducing the issue on more than one message broker and/or
result backend.
* I have tried reproducing the issue on more than one version of the message
broker and/or result backend.
* I have tried reproducing the issue on more than one operating system.
* I have tried reproducing the issue on more than one workers pool.
* I have tried reproducing the issue with autoscaling, retries,
ETA/Countdown & rate limits disabled.
* I have tried reproducing the issue after downgrading
and/or upgrading Celery and its dependencies.
## Related Issues and Possible Duplicates
#### Related Issues
* None
#### Possible Duplicates
* None
## Environment & Settings
**Celery version** :
**`celery report` Output:**
# Steps to Reproduce
## Required Dependencies
* **Minimal Python Version** : N/A or Unknown
* **Minimal Celery Version** : N/A or Unknown
* **Minimal Kombu Version** : N/A or Unknown
* **Minimal Broker Version** : N/A or Unknown
* **Minimal Result Backend Version** : N/A or Unknown
* **Minimal OS and/or Kernel Version** : N/A or Unknown
* **Minimal Broker Client Version** : N/A or Unknown
* **Minimal Result Backend Client Version** : N/A or Unknown
### Python Packages
**`pip freeze` Output:**
### Other Dependencies
N/A
## Minimally Reproducible Test Case
# Expected Behavior
# Actual Behavior
Hi,
We have an issue when we create a Python exe file for our app. The app runs
one Celery worker, but the exe spawns an infinite number of workers.
It's a blocking point for us.
| 0 |
I think it would be nice to define `isa(T)` to return a curried version,
because it's consistent with similar functions like `in` and is often useful
in `any` or `all`. It's a builtin function, so this would probably require
some small changes, but it's already possible to shadow `isa` locally, so I
believe this should be doable.
|
It would be nice to have a single argument `isa` for cleaner filtering code.
filter(x -> !isa(x, AbstractFloat), [1, 2.0])
# could become
filter(!isa(AbstractFloat), [1, 2.0])
This would provide some consistency with other functions, such as `isequal`,
but since `isa` is a builtin function, defining other variants doesn't seem
trivial.
julia> Core.isa(::Type{T}) where {T} = (x) -> isa(x, T)
ERROR: cannot add methods to a builtin function
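For comparison, the curried predicate is trivial to express in a dynamic
language; here is a rough Python analogue of the proposed single-argument
form (illustrative only — the name `isa` just mirrors the Julia proposal):

```python
# Python analogue of a curried `isa`: partially apply the type and get back
# a one-argument predicate that composes with filter/any/all.
def isa(T):
    return lambda x: isinstance(x, T)

# Filtering out floats, mirroring the Julia example above.
floats_removed = [x for x in [1, 2.0] if not isa(float)(x)]
```

The same predicate slots directly into `filter`, e.g.
`filter(isa(float), values)`, which is the ergonomics the issue asks for.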
| 1 |
### Issue summary
When multiple 3d quiver arrows are plotted, using different colours, the arrow
heads and tails have different colours. The two ticks of the arrow heads also
differ in colour.
### Sample code
import numpy as np
import matplotlib.pyplot as plt
%matplotlib notebook
# Three unit vectors
x = [1,0,0]
y = [0,1,0]
z = [0,0,1]
ax = plt.figure().add_subplot(projection='3d')
# The documentation of quiver says that additional kwargs are delegated to LineCollection
# colors parameter of LineCollection, as per documentation, takes a sequence of RGBA tuples.
ax.quiver(0,0,0,x,y,z,colors=[(1,0,0,1),(0,1,0,1),(0,0,1,1)]) # red, green and blue, with alpha=1.0
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('z')
ax.set_xlim(-2,2)
ax.set_ylim(-2,2)
ax.set_zlim(-2,2)
# ax.quiver(0,0,0,x,y,z,colors=['r','g','b']) shows similar results
### Original output

### Expected output
Each arrow, head to tail, should be of a single colour.
### Versions
Matplotlib: 3.4.1
Python: 3.8.5
Platform: Jupyter Notebook version 6.1.4
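A commonly reported workaround is to expand the colour list so the head
segments reuse the shaft colour. This relies on an assumption about mplot3d
internals (each 3D arrow is drawn as one shaft plus two head ticks, with all
shafts listed before the head segments), which is not documented API and may
vary between matplotlib versions:

```python
# Hedged workaround sketch: repeat each colour for the two head-tick
# segments of its arrow.  The shafts-first segment ordering is an
# assumption about mplot3d internals, not documented behaviour.
colors = [(1, 0, 0, 1), (0, 1, 0, 1), (0, 0, 1, 1)]  # red, green, blue
expanded = colors + [c for c in colors for _ in range(2)]
# ax.quiver(0, 0, 0, x, y, z, colors=expanded)
```

For three arrows this yields nine RGBA entries (3 shafts followed by 6 head
ticks), so every segment of a given arrow gets the same colour.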
|
### Bug report
**Bug summary**
The color of the 3D arrow head does not match that of the arrow body. (In
fact, the two segments of head itself don't even match.)
Not sure if it is related to #11746, so I posted it separately just to make
things clearer.
**Code for reproduction**
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
x = np.zeros(10)
y = np.zeros(10)
z = np.arange(10.)
dx = np.zeros(10)
dy = np.arange(10.)
dz = np.zeros(10)
fig = plt.figure()
ax = fig.gca(projection='3d')
arrow_color = plt.cm.Reds(dy/dy.max())
ax.quiver(x, y, z, dx, dy, dz, colors=arrow_color)
ax.set_ylim(0,10)
plt.show()
**Actual outcome**

**Expected outcome**
The entire arrow should have a single color.
**Matplotlib version**
* Operating system: macOS 10.13.6
* Matplotlib version: 2.2.2
* Matplotlib backend (`print(matplotlib.get_backend())`): MacOSX
* Python version: 2.7.15
* Jupyter version (if applicable): 5.6.0
* Other libraries:
matplotlib and Python were installed with Anaconda.
| 1 |
For previous discussion, see: https://groups.google.com/forum/#!topic/julia-
users/FBmU-mxQ0k4
My request is to have functionality to initialize a vector of size `N` and
type `T` using the `Vector(T, N)` command. This would be equivalent to the
current `Array(T, N)`. The reason is that if I have a function taking a
`Vector` argument, it feels nicer to create the vector I pass in using a
`Vector` constructor. The same goes for matrices.
I tried to implement it myself like this (and in some other ways), but it is
not working:
Base.call{T}(::Vector{T}, size::Int) = Array(T, size)
|
Both Dict and Array are parametric types. Constructing a Dict and an Array of
variable size differs markedly, though:
* `Dict{TypeA,TypeB}()`
* `Array(Type)` while `Array{Type}()` doesn't work
This creates some confusion for people new to the language; I also think it
is an inconsistent design.
| 1 |
**John Thoms** opened **SPR-7404** and commented
org.springframework.jms.support.converter.JsonMessageConverter would handle
marshalling of JMS payloads similar to MappingJacksonHttpMessageConverter for
HTTP and
spring-amqp/org.springframework.amqp.support.converter.JsonMessageConverter
* * *
**Reference URL:** http://forum.springsource.org/showthread.php?p=311501
**Issue Links:**
* #11909 Add JsonMessageConvertor to Spring JMS from Spring Extensions project SE-AMQP ( _ **"is duplicated by"**_ )
**Referenced from:** commits `1adf825`, `7ec9292`
1 votes, 1 watchers
|
**Michael Isvy** opened **SPR-6944** and commented
Currently, if I need to write a form that links to my web application, I
usually write it that way:
<c:url value="/client" var="form_url"/>
<form:form action="${form_url}">
...
</form:form>
Generated HTML is as follows:
<form id="command" action="/myWebApp/client" method="post">
Inside form
</form>
It would be easier if the form tag could include the context path out of the
box (for any path starting with "/" for instance). In that way, I would not
need to use the <c:url /> tag anymore.
* * *
**Issue Links:**
* #13326 Form tag should prepend the contextPath and servletPath if not present ( _ **"is duplicated by"**_ )
| 0 |
I tried to get help from the mailing list, but nobody responded. See
https://groups.google.com/forum/?fromgroups#!topic/symfony2/Ib7pMtVpV3Q for
a textual explanation.
I wrote a simple test controller plus a test case to check this issue, and it
is failing.
See biozshock/symfony-standard@`0bcd739`
Functionality was added in `1099858`, but I think the tests do not fully
cover it. Or maybe I'm doing something the wrong way?
|
While trying out the new form PATCH feature, I ran into a bug:
Say, for instance, you have a form with two children, firstName and lastName,
and you PATCH with this data: `array('firstName' => 'Hank')`.
Now the "submit" method is never called on the child "lastName". This is a
problem, because Form::isValid() only returns `true` if `$this->submitted` is
`true`; however, this is `false` for the "lastName" child. A form with
children, in turn, will only return `true` if all of its children do, which,
now, is not the case.
| 1 |
Considering this code:
use Symfony\Component\OptionsResolver\Options;
use Symfony\Component\OptionsResolver\OptionsResolver;
$resolver = new OptionsResolver();
$resolver->setDefaults([
'foo' => 1,
'bar' => null,
]);
$options = $resolver->resolve([
'bar' => function (Options $options) {
return $options['foo'];
}
]);
var_dump($options);
Result **before** 2.6:
array (size=2)
'foo' => int 1
'bar' => int 1
Result **after** 2.6:
array (size=2)
'foo' => int 1
'bar' =>
object(Closure)[2242]
I know Closure resolution is made for default options depending on others but
sometimes it could be useful to reuse a default option.
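The pre-2.6 behaviour the snippet relies on amounts to resolving callables in
user-supplied values against the merged options. A rough Python sketch of
that semantics (not the actual Symfony code; `resolve` is a hypothetical
stand-in for OptionsResolver):

```python
# Rough sketch of "closures in user-supplied values get resolved too",
# mirroring the pre-2.6 output shown above (not real Symfony internals).
def resolve(defaults, options):
    merged = {**defaults, **options}
    # Call any callable value with the merged options, like the old
    # lazy-option resolution did for user-supplied closures.
    return {k: (v(merged) if callable(v) else v) for k, v in merged.items()}

resolve({"foo": 1, "bar": None}, {"bar": lambda opts: opts["foo"]})
# → {"foo": 1, "bar": 1}, matching the pre-2.6 result
```

Since 2.6 only *default* closures are resolved lazily, which is why the
user-supplied closure comes back unevaluated in the second dump.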
|
I'll provide a link to my Stack Overflow question if anyone is interested in
more details. To sum it up, I'm using Postman to send POST requests to
**app.php/myroute/login**, which are converted into GET requests. If I post
to just **/myroute/login**, everything works fine. If I post to the original
URL but move the .htaccess file, xdebug shows the method as POST. I don't
think many people will access the application through app.php, so I would say
this is very low priority, but it gave me a bit of trouble: I was accessing
the app through app_dev.php and decided to remove _dev, which led me to this.
People may come across this in the future, so I figured it would be good to
report it.
My Stack Overflow question
| 0 |
After updating Atom to 0.189 I have several issues with the clipboard.
1. If I copy something into the clipboard from the `atom` editor, I can't paste it into any program except Atom, for example into `gedit`.
2. Even worse: if I try to paste it into a text field of any webpage open in `Google Chrome 41`, the Chrome tab freezes for a long time (several minutes) and then crashes. Even copying text from a webpage in `Chrome` while the clipboard already contains text copied from `atom` makes `Chrome` freeze (if the clipboard contains anything not copied from `atom`, all is fine).
|
I'm using Atom ver. 0.189.0 on Ubuntu 14.04.2 LTS. When I copy a piece of
text in Atom and try to paste it, the right-click menu either won't show up
or shows up without giving me the option to paste. The paste shortcut doesn't
work either, and it creates issues even when I try to copy and paste text
from other programs (after I have copied something from Atom). I have
disabled all the extra plugins, so it isn't a plugin issue. I didn't have
this issue with the previous version (I am using the webupd8 PPA). If I
right-click in a place where there isn't a text entry (meaning a place that
doesn't support pasting anyway), the right-click menu appears fine.
| 1 |
by **eric.atienza@mydoceapower.com** :
What steps will reproduce the problem?
@see http://play.golang.org/p/unO74If5mD
What is the expected output?
In the expression "a.b.c.Check()", a is not nil, but b is.
I expect a run-time panic to occur:
"If x is of pointer or interface type and has the value nil, assigning to,
evaluating, or calling x.f causes a run-time panic." (spec extract)
What do you see instead?
The method Check() is called instead.
Which compiler are you using? 6g
Which operating system are you using? Linux 3.5.0-18-generic
Which version are you using? (run 'go version')1.0.3
|
### What version of Go are you using (go version)?
`go version go1.5beta2 darwin/amd64`
### What operating system and processor architecture are you using?
OS X 10.10.4 (64-bit)
### What did you do?
go get -u github.com/shurcooL/go-get-issue-cgo
### What did you expect to see?
No output, successful go get (fetch, build and install).
### What did you see instead?
package github.com/shurcooL/go-get-issue-cgo
imports C: unrecognized import path "C"
## Likely Cause
I haven't verified it, but I am quite certain (it's very likely) this is a
regression caused by the code change in https://go-
review.googlesource.com/#/c/12192/.
After writing up this report, I've realized this issue is a duplicate of
#11738 and can be closed right away (filing just in case some additional
information here is helpful).
| 0 |
Importing numpy raises an AttributeError when using the latest version, 1.16.0.
### Reproducing code example:
import numpy as np
## Gives error traceback:
* * *
AttributeError Traceback (most recent call last)
in
\----> 1 from sklearn.datasets import make_classification
2 from sklearn.preprocessing import StandardScaler,label_binarize
3 from sklearn.svm import SVC, LinearSVC
4 from sklearn.multiclass import OneVsRestClassifier
5 from sklearn.pipeline import make_pipeline
~/.local/lib/python3.6/site-packages/sklearn/ **init**.py in
62 else:
63 from . import __check_build
\---> 64 from .base import clone
65 from .utils._show_versions import show_versions
66
~/.local/lib/python3.6/site-packages/sklearn/base.py in
8 from collections import defaultdict
9
\---> 10 import numpy as np
11 from scipy import sparse
12 from .externals import six
~/.local/lib/python3.6/site-packages/numpy/ **init**.py in
140 from . import _distributor_init
141
\--> 142 from . import core
143 from .core import *
144 from . import compat
~/.local/lib/python3.6/site-packages/numpy/core/ **init**.py in
57 from . import numerictypes as nt
58 multiarray.set_typeDict(nt.sctypeDict)
\---> 59 from . import numeric
60 from .numeric import *
61 from . import fromnumeric
~/.local/lib/python3.6/site-packages/numpy/core/numeric.py in
3091 from .umath import *
3092 from .numerictypes import *
-> 3093 from . import fromnumeric
3094 from .fromnumeric import *
3095 from . import arrayprint
~/.local/lib/python3.6/site-packages/numpy/core/fromnumeric.py in
15 from . import numerictypes as nt
16 from .numeric import asarray, array, asanyarray, concatenate
\---> 17 from . import _methods
18
19 _dt_ = nt.sctype2char
~/.local/lib/python3.6/site-packages/numpy/core/_methods.py in
156 )
157
\--> 158 _NDARRAY_ARRAY_FUNCTION = mu.ndarray. **array_function**
159
160 def _array_function(self, func, types, args, kwargs):
AttributeError: type object 'numpy.ndarray' has no attribute '
**array_function** '
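This error pattern usually indicates a stale or mixed numpy install (for
example an old copy under `~/.local` shadowing a newer one, as the traceback
paths hint). A quick diagnostic, independent of the failing attribute, is to
check which numpy actually gets imported:

```python
# Diagnostic sketch: locate the numpy that Python actually imports.
# A mismatch between the printed path/version and the freshly installed
# one suggests a shadowed or partial install rather than a numpy bug.
import numpy as np

print(np.__version__)  # should match the version you just installed
print(np.__file__)     # e.g. a ~/.local path can shadow site-packages
```

If the paths disagree, uninstalling numpy until no copy remains and then
reinstalling once is the usual remedy.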
|
I pip installed the latest numpy and get the below error. The error goes away
if I roll back to v1.15.4.
The most closely related issue I could find is #12028.
### Reproducing code example:
import numpy as np
### Error message:
Traceback (most recent call last):
File "/home/trey/script.py", line 6, in <module>
import numpy as np
File "/home/trey/.local/lib/python3.6/site-packages/numpy/__init__.py", line 142, in <module>
from . import core
File "/home/trey/.local/lib/python3.6/site-packages/numpy/core/__init__.py", line 59, in <module>
from . import numeric
File "/home/trey/.local/lib/python3.6/site-packages/numpy/core/numeric.py", line 3093, in <module>
from . import fromnumeric
File "/home/trey/.local/lib/python3.6/site-packages/numpy/core/fromnumeric.py", line 17, in <module>
from . import _methods
File "/home/trey/.local/lib/python3.6/site-packages/numpy/core/_methods.py", line 158, in <module>
_NDARRAY_ARRAY_FUNCTION = mu.ndarray.__array_function__
AttributeError: type object 'numpy.ndarray' has no attribute '__array_function__'
### Numpy/Python version information:
1.16.0
3.6.7 (default, Oct 22 2018, 11:32:17)
[GCC 8.2.0]
| 1 |
It'd be awesome if the navigation tree did basically the same thing as
GitHub, letting you easily skip through multiple levels of nested trees.
(This turns out to be a super common thing with Java code.)
Cheers! ❤️
|
I occasionally have to work on Java projects for work. Java essentially
requires project hierarchies like the following:
\project
\src
\com
\example
\project
\app
source1.java
source2.java
source3.java
\library-foo
source1.java
source2.java
source3.java
Java IDEs have a feature in their tree-view controls to collapse directories
that only have one entry in them underneath `src`. So the above would look
like this:
\project
\src
\com.example.project
\app
source1.java
source2.java
source3.java
\library-foo
source1.java
source2.java
source3.java
I'd like to see Atom tree-view support this feature as well on an opt-in
basis.
| 1 |