{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
     "Celery in Python: A Detailed Guide (Part 1)\n",
     "Preface\n",
     "A while ago I needed RabbitMQ as a write cache and had been using the pika + RabbitMQ combination. Although pika lets you operate RabbitMQ quite directly, the official examples are too simple, and without a deep understanding of its internals I hit many pitfalls, especially having to write my own connection-pool and channel-pool management. I had also used Celery before, but only the celery + redis combination, and only superficially; now I plan to dig deeper into using celery + redis + rabbitmq.\n",
    "\n",
     "Celery + RabbitMQ: First Steps\n",
     "For now we will use Celery on its own, rather than integrating it into a framework such as Flask or Django.\n",
     "\n",
     "A Brief Introduction\n",
     "Celery is an asynchronous task queue. A Celery setup has three core components:\n",
     "\n",
     "The Celery client: used to publish background jobs. When working with Flask, the client runs alongside the Flask application.\n",
     "\n",
     "Celery workers: the processes that run the background jobs. Celery supports both local and remote workers; you can start a worker on the local server, or on a remote server (which requires copying the code there).\n",
     "\n",
     "The message broker: the client communicates with the workers through a message queue, and Celery supports several queue implementations. The most commonly used brokers are RabbitMQ and Redis.\n",
    "\n",
     "Installing RabbitMQ and Redis\n",
     "For RabbitMQ installation and configuration, see the author's RabbitMQ setup post.\n",
     "\n",
     "For Redis installation and configuration, see the author's Redis setup post.\n",
     "\n",
     "Installing redis-py:\n",
     "\n",
     "sudo pip install redis\n",
     "For operating Redis through redis-py, see the author's post on using Redis from Python.\n",
     "For better performance, the official docs recommend librabbitmq, a C client library for RabbitMQ:\n",
     "\n",
     "# choose the broker client, serializers, and concurrency pool\n",
     "sudo pip install celery[librabbitmq,redis,msgpack,gevent]\n",
     "First Use\n",
     "Typically we use Redis as the result store and RabbitMQ as the task queue.\n",
    "\n",
     "Step 1: Create and Send an Asynchronous Task\n",
     "# initialization\n",
     "# tasks.py\n",
     "from celery import Celery\n",
     "app = Celery('tasks', broker='amqp://username:passwd@ip:port/varhost', backend='redis://username:passwd@ip:6390/db')\n",
     "\n",
     "@app.task\n",
     "def add(x, y):\n",
     "    return x + y\n",
     "\n",
     "if __name__ == '__main__':\n",
     "    result = add.delay(30, 42)\n",
     "\n",
     "# broker: the middleman holding the task queue\n",
     "# backend: storage for task execution results\n",
     "What just happened?\n",
     "\n",
     "The app.task decorator turns the add function into a Task instance. add.delay serializes the task and sends it to RabbitMQ via the librabbitmq library.\n",
     "\n",
     "This process creates an exchange named celery of type direct, and a queue named celery; the queue is bound to the exchange with the routing key celery.\n",
     "\n",
     "Open the RabbitMQ management console and you will see one message sitting in the celery queue.\n",
     "\n",
     "Remember: when a function has multiple decorators, app.task must be the outermost one.\n",
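     "The decorator-order rule can be illustrated with a toy registry standing in for app.task (task, registry, and double_result below are hypothetical stand-ins, not Celery APIs):\n",
     "\n",
     "```python\n",
     "import functools\n",
     "\n",
     "registry = {}\n",
     "\n",
     "def task(fn):\n",
     "    # stand-in for app.task: registers exactly the object it receives\n",
     "    registry[fn.__name__] = fn\n",
     "    return fn\n",
     "\n",
     "def double_result(fn):\n",
     "    @functools.wraps(fn)\n",
     "    def wrapper(*args, **kwargs):\n",
     "        return 2 * fn(*args, **kwargs)\n",
     "    return wrapper\n",
     "\n",
     "@task             # outermost: the fully wrapped function is what gets registered\n",
     "@double_result\n",
     "def add(x, y):\n",
     "    return x + y\n",
     "\n",
     "print(registry['add'](1, 2))  # 6: the registered task includes the wrapping\n",
     "```\n",
     "If task were placed below double_result, the bare add would be registered and the wrapping lost; the same applies to app.task.\n",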
    "\n",
     "Going further\n",
     "\n",
     "If Redis is used as the broker instead, two keys appear in Redis: celery and _kombu.binding.celery. _kombu.binding.celery indicates that a task queue named celery exists (the Celery default), while the key celery holds the list of tasks in that default queue; it is a Redis list, so you can inspect the task data that has been pushed into it.\n",
    "\n",
     "Step 2: Start a Worker to Execute Tasks\n",
     "In the project directory, run:\n",
     "\n",
     "celery -A app.celery_tasks.celery worker -Q queue --loglevel=info\n",
     "\n",
     "# -A points at the Celery app instance: app.celery_tasks.celery means the celery instance inside the celery_tasks.py module of the app package. It must be the initialized instance; the worker argument makes this process a task executor.\n",
     "# -Q makes the worker consume only the named queue, so that different queues carrying different tasks can be handled independently; if omitted, the worker consumes every queue.\n",
     "# -l (--loglevel) sets the worker's log level.\n",
     "After a task finishes, its result is stored in Redis. Inspecting Redis, you will find a string key/value pair:\n",
     "\n",
     "celery-task-meta-064e4262-e1ba-4e87-b4a1-52dd1418188f:data\n",
     "This key expires after 24 hours by default.\n",
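     "The shape of that result entry can be sketched in plain Python (the payload fields below are illustrative of Celery's result meta format, not read from a live backend):\n",
     "\n",
     "```python\n",
     "import json\n",
     "\n",
     "task_id = '064e4262-e1ba-4e87-b4a1-52dd1418188f'\n",
     "key = 'celery-task-meta-' + task_id   # how the result key is composed\n",
     "\n",
     "# illustrative payload resembling what a JSON result serializer would store\n",
     "payload = json.dumps({'status': 'SUCCESS', 'result': 72, 'task_id': task_id})\n",
     "meta = json.loads(payload)\n",
     "print(key, meta['result'])\n",
     "```\n",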
    "\n",
     "Analyzing the Serialized Message\n",
     "add.delay serializes the Task instance and sends it to RabbitMQ. What does that serialization look like?\n",
     "\n",
     "Below is the message data as it appears in the RabbitMQ task queue; the body portion is serialized with the pickle module:\n",
     "\n",
    "{\"body\": \"gAJ9cQAoWAQAAAB0YXNrcQFYGAAAAHRlc3RfY2VsZXJ5LmFkZF90b2dldGhlcnECWAIAAABpZHEDWCQAAAA2NmQ1YTg2Yi0xZDM5LTRjODgtYmM5OC0yYzE4YjJjOThhMjFxBFgEAAAAYXJnc3EFSwlLKoZxBlgGAAAAa3dhcmdzcQd9cQhYBwAAAHJldHJpZXNxCUsAWAMAAABldGFxCk5YBwAAAGV4cGlyZXNxC05YAwAAAHV0Y3EMiFgJAAAAY2FsbGJhY2tzcQ1OWAgAAABlcnJiYWNrc3EOTlgJAAAAdGltZWxpbWl0cQ9OToZxEFgHAAAAdGFza3NldHERTlgFAAAAY2hvcmRxEk51Lg==\",  \n",
     "# body is the base64-encoded serialized payload: the concrete task data, including the function to execute, its arguments, and basic task metadata\n",
     "\"content-encoding\": \"binary\", # encoding of the serialized data\n",
     "\"content-type\": \"application/x-python-serialize\",  # serialization format of the task data; the default is Python's built-in pickle module\n",
     "\"headers\": {}, \n",
     "\"properties\": \n",
     "        {\"reply_to\": \"b7580727-07e5-307b-b1d0-4b731a796652\",       # unique id for the result\n",
     "        \"correlation_id\": \"66d5a86b-1d39-4c88-bc98-2c18b2c98a21\",  # unique id of the task\n",
     "        \"delivery_mode\": 2, \n",
     "        \"delivery_info\": {\"priority\": 0, \"exchange\": \"celery\", \"routing_key\": \"celery\"},  # exchange name, routing key, and related properties\n",
     "        \"body_encoding\": \"base64\", # encoding of the body\n",
     "        \"delivery_tag\": \"bfcfe35d-b65b-4088-bcb5-7a1bb8c9afd9\"}}\n",
     "Deserializing the Message\n",
    "import pickle\n",
    "import base64\n",
    "result = base64.b64decode('gAJ9cQAoWAQAAAB0YXNrcQFYGAAAAHRlc3RfY2VsZXJ5LmFkZF90b2dldGhlcnECWAIAAABpZHEDWCQAAAA2NmQ1YTg2Yi0xZDM5LTRjODgtYmM5OC0yYzE4YjJjOThhMjFxBFgEAAAAYXJnc3EFSwlLKoZxBlgGAAAAa3dhcmdzcQd9cQhYBwAAAHJldHJpZXNxCUsAWAMAAABldGFxCk5YBwAAAGV4cGlyZXNxC05YAwAAAHV0Y3EMiFgJAAAAY2FsbGJhY2tzcQ1OWAgAAABlcnJiYWNrc3EOTlgJAAAAdGltZWxpbWl0cQ9OToZxEFgHAAAAdGFza3NldHERTlgFAAAAY2hvcmRxEk51Lg==')\n",
    "print(pickle.loads(result))\n",
    "\n",
     "# result\n",
     "{\n",
     "    'task': 'test_celery.add_together',  # the task to execute\n",
     "    'id': '66d5a86b-1d39-4c88-bc98-2c18b2c98a21',  # unique id of the task\n",
     "    'args': (9, 42),   # the task's arguments\n",
     "    'kwargs': {},      \n",
     "    'retries': 0, \n",
     "    'eta': None, \n",
     "    'expires': None, # task expiry time\n",
     "    'utc': True, \n",
     "    'callbacks': None, # callbacks to run on completion\n",
     "    'errbacks': None,  # callbacks to run on task failure\n",
     "    'timelimit': (None, None), # time limits\n",
     "    'taskset': None, \n",
     "    'chord': None\n",
     "}\n",
     "As you can see, the body contains everything needed to execute the function. When a Celery worker receives the message, it deserializes the body and executes the corresponding method.\n",
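     "The same round trip can be reproduced end to end; the message dict below is a hand-built stand-in for what add.delay produces:\n",
     "\n",
     "```python\n",
     "import base64\n",
     "import pickle\n",
     "\n",
     "# hand-built message resembling the deserialized payload above\n",
     "message = {\n",
     "    'task': 'test_celery.add_together',\n",
     "    'id': '66d5a86b-1d39-4c88-bc98-2c18b2c98a21',\n",
     "    'args': (9, 42),\n",
     "    'kwargs': {},\n",
     "}\n",
     "\n",
     "# serialize the way the broker message body is built: pickle, then base64\n",
     "body = base64.b64encode(pickle.dumps(message)).decode('ascii')\n",
     "\n",
     "# a worker-side consumer reverses the two steps\n",
     "restored = pickle.loads(base64.b64decode(body))\n",
     "print(restored['args'])  # (9, 42)\n",
     "```\n",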
    "\n",
     "Common Serialization Formats\n",
     "binary: binary serialization; the default output of Python's pickle.\n",
     "json: supported by many languages, so it works for cross-language setups, but it does not appear to support custom class instances.\n",
     "XML: a tag-based markup language.\n",
     "msgpack: a binary, JSON-like format with a smaller footprint and faster encoding than JSON.\n",
     "yaml: more expressive than JSON and supports more data types, but the Python client's performance is worse than JSON's.\n",
     "On balance, for cross-language compatibility and speed, msgpack or json is the way to go.\n",
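     "The json limitation mentioned above is easy to demonstrate with the standard library (msgpack behaves like json in this respect, while pickle is Python-only and unsafe with untrusted input):\n",
     "\n",
     "```python\n",
     "import json\n",
     "import pickle\n",
     "\n",
     "payload = {'task': 'tasks.add', 'args': [30, 42], 'kwargs': {}}\n",
     "\n",
     "# plain dicts/lists round-trip through json fine\n",
     "assert json.loads(json.dumps(payload)) == payload\n",
     "\n",
     "class Point:\n",
     "    def __init__(self, x, y):\n",
     "        self.x, self.y = x, y\n",
     "\n",
     "# json refuses arbitrary class instances...\n",
     "try:\n",
     "    json.dumps(Point(1, 2))\n",
     "    json_ok = True\n",
     "except TypeError:\n",
     "    json_ok = False\n",
     "\n",
     "# ...while pickle serializes them without complaint\n",
     "p = pickle.loads(pickle.dumps(Point(1, 2)))\n",
     "print(json_ok, p.x)  # False 1\n",
     "```\n",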
    "\n",
     "Celery Configuration\n",
     "Celery's performance depends on many factors, such as the serialization format, how it connects to RabbitMQ, and whether it runs multiple processes or a single thread; all of these can be configured.\n",
     "\n",
     "Basic configuration options\n",
     "CELERY_DEFAULT_QUEUE: the default queue\n",
     "BROKER_URL: the broker, i.e. the RabbitMQ address\n",
     "CELERY_RESULT_BACKEND: where results are stored\n",
     "CELERY_TASK_SERIALIZER: task serialization format\n",
     "CELERY_RESULT_SERIALIZER: result serialization format\n",
     "CELERY_TASK_RESULT_EXPIRES: task result expiry time\n",
     "CELERY_ACCEPT_CONTENT: a list of the content (serialization) types the worker will accept\n",
     "Loading the configuration\n",
     "# main.py\n",
     "from celery import Celery\n",
     "import celeryconfig\n",
     "app = Celery(__name__, include=[\"task\"])\n",
     "# load the configuration module\n",
     "app.config_from_object(celeryconfig)\n",
     "\n",
     "if __name__ == '__main__':\n",
     "    from task import add  # import here so the task module loads after app is configured\n",
     "    result = add.delay(30, 42)\n",
     "\n",
     "# task.py\n",
     "from main import app\n",
     "@app.task\n",
     "def add(x, y):\n",
     "    return x + y\n",
     "\n",
     "# celeryconfig.py\n",
     "BROKER_URL = 'amqp://username:password@localhost:5672/yourvhost'\n",
     "CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'\n",
     "CELERY_TASK_SERIALIZER = 'msgpack'\n",
     "CELERY_RESULT_SERIALIZER = 'msgpack'\n",
     "CELERY_TASK_RESULT_EXPIRES = 60 * 60 * 24   # task result expiry time\n",
     "CELERY_ACCEPT_CONTENT = [\"msgpack\"]            # accepted content (serialization) types\n",
     "The configuration can also be set directly:\n",
     "\n",
     "from celery import Celery\n",
     "app = Celery(__name__, include=[\"task\"])\n",
     "app.conf.update(\n",
     "        task_serializer='json',\n",
     "        accept_content=['json'],\n",
     "        result_serializer='json',\n",
     "        timezone='Europe/Oslo',\n",
     "        enable_utc=True,\n",
     "    )\n",
     "There are two more ways to load configuration, though application code rarely calls them directly:\n",
     "\n",
     "app.config_from_envvar() # load from an environment variable\n",
     "app.config_from_cmdline() # load from the command line\n",
     "A commonly used configuration file\n",
     "# Note: since Celery 4, these uppercase names are deprecated in favor of lowercase equivalents (see the mapping table at the end)\n",
     "BROKER_URL = 'amqp://username:passwd@host:port/vhost'\n",
     "# where results are delivered\n",
     "CELERY_RESULT_BACKEND = 'redis://username:passwd@host:port/db'\n",
     "# task serialization format\n",
     "CELERY_TASK_SERIALIZER = 'msgpack' \n",
     "# result serialization format\n",
     "CELERY_RESULT_SERIALIZER = 'msgpack'\n",
     "# task result expiry: how long execution results are kept\n",
     "CELERY_TASK_RESULT_EXPIRES = 60 * 20   \n",
     "# accepted serialization types\n",
     "CELERY_ACCEPT_CONTENT = [\"msgpack\"]   \n",
     "# acknowledge messages after the task has executed instead of just before; this slightly affects performance\n",
     "CELERY_ACKS_LATE = True  \n",
     "# message compression scheme: zlib or bzip2; by default messages are sent uncompressed\n",
     "CELERY_MESSAGE_COMPRESSION = 'zlib' \n",
     "# hard time limit for tasks\n",
     "CELERYD_TASK_TIME_LIMIT = 5  # a task must finish within 5s, otherwise the worker process running it is killed and replaced by the parent\n",
     "# worker concurrency; defaults to the number of CPU cores, same as the -c command-line option\n",
     "CELERYD_CONCURRENCY = 4 \n",
     "# how many tasks a worker prefetches from RabbitMQ at a time\n",
     "CELERYD_PREFETCH_MULTIPLIER = 4 \n",
     "# recycle a worker process after it has executed this many tasks; unlimited by default\n",
     "CELERYD_MAX_TASKS_PER_CHILD = 40 \n",
     "# default queue name: messages matching no other queue land here; with no other configuration, everything goes to the default queue\n",
     "CELERY_DEFAULT_QUEUE = \"default\" \n",
     "# detailed queue definitions\n",
     "CELERY_QUEUES = {\n",
     "    \"default\": { # the default queue specified above\n",
     "        \"exchange\": \"default\",\n",
     "        \"exchange_type\": \"direct\",\n",
     "        \"routing_key\": \"default\"\n",
     "    },\n",
     "    \"topicqueue\": { # a topic queue: any routing key matching topic.# is delivered here\n",
     "        \"routing_key\": \"topic.#\",\n",
     "        \"exchange\": \"topic_exchange\",\n",
     "        \"exchange_type\": \"topic\",\n",
     "    },\n",
     "    \"task_eeg\": { # a fanout exchange\n",
     "        \"exchange\": \"tasks\",\n",
     "        \"exchange_type\": \"fanout\",\n",
     "        \"binding_key\": \"tasks\",\n",
     "    },\n",
     "}\n",
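     "How a topic binding like topic.# matches routing keys can be sketched with a simplified matcher (a toy function, not a RabbitMQ API; it ignores the zero-word case of '#'):\n",
     "\n",
     "```python\n",
     "import re\n",
     "\n",
     "def topic_matches(pattern, routing_key):\n",
     "    # AMQP topic semantics (simplified): '*' matches exactly one\n",
     "    # dot-separated word, '#' matches one or more words\n",
     "    regex = re.escape(pattern).replace(r'\\#', '.+').replace(r'\\*', '[^.]+')\n",
     "    return re.fullmatch(regex, routing_key) is not None\n",
     "\n",
     "print(topic_matches('topic.#', 'topic.test.a'))  # True\n",
     "print(topic_matches('topic.#', 'other.test'))    # False\n",
     "```\n",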
     "Since Celery 4.0, the configuration keys are lowercase. The replacements for 4.0 and later:\n",
     "\n",
     "pre-4.0 setting          4.0+ setting\n",
    "CELERY_ACCEPT_CONTENT   accept_content\n",
    "CELERY_ENABLE_UTC   enable_utc\n",
    "CELERY_IMPORTS  imports\n",
    "CELERY_INCLUDE  include\n",
    "CELERY_TIMEZONE timezone\n",
    "CELERYBEAT_MAX_LOOP_INTERVAL    beat_max_loop_interval\n",
    "CELERYBEAT_SCHEDULE beat_schedule\n",
    "CELERYBEAT_SCHEDULER    beat_scheduler\n",
    "CELERYBEAT_SCHEDULE_FILENAME    beat_schedule_filename\n",
    "CELERYBEAT_SYNC_EVERY   beat_sync_every\n",
    "BROKER_URL  broker_url\n",
    "BROKER_TRANSPORT    broker_transport\n",
    "BROKER_TRANSPORT_OPTIONS    broker_transport_options\n",
    "BROKER_CONNECTION_TIMEOUT   broker_connection_timeout\n",
    "BROKER_CONNECTION_RETRY broker_connection_retry\n",
    "BROKER_CONNECTION_MAX_RETRIES   broker_connection_max_retries\n",
    "BROKER_FAILOVER_STRATEGY    broker_failover_strategy\n",
    "BROKER_HEARTBEAT    broker_heartbeat\n",
    "BROKER_LOGIN_METHOD broker_login_method\n",
    "BROKER_POOL_LIMIT   broker_pool_limit\n",
    "BROKER_USE_SSL  broker_use_ssl\n",
    "CELERY_CACHE_BACKEND    cache_backend\n",
    "CELERY_CACHE_BACKEND_OPTIONS    cache_backend_options\n",
    "CASSANDRA_COLUMN_FAMILY cassandra_table\n",
    "CASSANDRA_ENTRY_TTL cassandra_entry_ttl\n",
    "CASSANDRA_KEYSPACE  cassandra_keyspace\n",
    "CASSANDRA_PORT  cassandra_port\n",
    "CASSANDRA_READ_CONSISTENCY  cassandra_read_consistency\n",
    "CASSANDRA_SERVERS   cassandra_servers\n",
    "CASSANDRA_WRITE_CONSISTENCY cassandra_write_consistency\n",
    "CASSANDRA_OPTIONS   cassandra_options\n",
    "CELERY_COUCHBASE_BACKEND_SETTINGS   couchbase_backend_settings\n",
    "CELERY_MONGODB_BACKEND_SETTINGS mongodb_backend_settings\n",
    "CELERY_EVENT_QUEUE_EXPIRES  event_queue_expires\n",
    "CELERY_EVENT_QUEUE_TTL  event_queue_ttl\n",
    "CELERY_EVENT_QUEUE_PREFIX   event_queue_prefix\n",
    "CELERY_EVENT_SERIALIZER event_serializer\n",
    "CELERY_REDIS_DB redis_db\n",
    "CELERY_REDIS_HOST   redis_host\n",
    "CELERY_REDIS_MAX_CONNECTIONS    redis_max_connections\n",
    "CELERY_REDIS_PASSWORD   redis_password\n",
    "CELERY_REDIS_PORT   redis_port\n",
    "CELERY_RESULT_BACKEND   result_backend\n",
    "CELERY_MAX_CACHED_RESULTS   result_cache_max\n",
    "CELERY_MESSAGE_COMPRESSION  result_compression\n",
    "CELERY_RESULT_EXCHANGE  result_exchange\n",
    "CELERY_RESULT_EXCHANGE_TYPE result_exchange_type\n",
    "CELERY_TASK_RESULT_EXPIRES  result_expires\n",
    "CELERY_RESULT_PERSISTENT    result_persistent\n",
    "CELERY_RESULT_SERIALIZER    result_serializer\n",
     "CELERY_RESULT_DBURI use result_backend instead\n",
    "CELERY_RESULT_ENGINE_OPTIONS    database_engine_options\n",
    "[...]_DB_SHORT_LIVED_SESSIONS   database_short_lived_sessions\n",
    "CELERY_RESULT_DB_TABLE_NAMES    database_db_names\n",
    "CELERY_SECURITY_CERTIFICATE security_certificate\n",
    "CELERY_SECURITY_CERT_STORE  security_cert_store\n",
    "CELERY_SECURITY_KEY security_key\n",
    "CELERY_ACKS_LATE    task_acks_late\n",
    "CELERY_TASK_ALWAYS_EAGER    task_always_eager\n",
    "CELERY_TASK_ANNOTATIONS task_annotations\n",
    "CELERY_TASK_COMPRESSION task_compression\n",
    "CELERY_TASK_CREATE_MISSING_QUEUES   task_create_missing_queues\n",
    "CELERY_TASK_DEFAULT_DELIVERY_MODE   task_default_delivery_mode\n",
    "CELERY_TASK_DEFAULT_EXCHANGE    task_default_exchange\n",
    "CELERY_TASK_DEFAULT_EXCHANGE_TYPE   task_default_exchange_type\n",
    "CELERY_TASK_DEFAULT_QUEUE   task_default_queue\n",
    "CELERY_TASK_DEFAULT_RATE_LIMIT  task_default_rate_limit\n",
    "CELERY_TASK_DEFAULT_ROUTING_KEY task_default_routing_key\n",
    "CELERY_TASK_EAGER_PROPAGATES    task_eager_propagates\n",
    "CELERY_TASK_IGNORE_RESULT   task_ignore_result\n",
    "CELERY_TASK_PUBLISH_RETRY   task_publish_retry\n",
    "CELERY_TASK_PUBLISH_RETRY_POLICY    task_publish_retry_policy\n",
    "CELERY_QUEUES   task_queues\n",
    "CELERY_ROUTES   task_routes\n",
    "CELERY_TASK_SEND_SENT_EVENT task_send_sent_event\n",
    "CELERY_TASK_SERIALIZER  task_serializer\n",
    "CELERYD_TASK_SOFT_TIME_LIMIT    task_soft_time_limit\n",
    "CELERYD_TASK_TIME_LIMIT task_time_limit\n",
    "CELERY_TRACK_STARTED    task_track_started\n",
    "CELERYD_AGENT   worker_agent\n",
    "CELERYD_AUTOSCALER  worker_autoscaler\n",
    "CELERYD_CONCURRENCY worker_concurrency\n",
    "CELERYD_CONSUMER    worker_consumer\n",
    "CELERY_WORKER_DIRECT    worker_direct\n",
    "CELERY_DISABLE_RATE_LIMITS  worker_disable_rate_limits\n",
    "CELERY_ENABLE_REMOTE_CONTROL    worker_enable_remote_control\n",
    "CELERYD_HIJACK_ROOT_LOGGER  worker_hijack_root_logger\n",
    "CELERYD_LOG_COLOR   worker_log_color\n",
    "CELERYD_LOG_FORMAT  worker_log_format\n",
    "CELERYD_WORKER_LOST_WAIT    worker_lost_wait\n",
    "CELERYD_MAX_TASKS_PER_CHILD worker_max_tasks_per_child\n",
    "CELERYD_POOL    worker_pool\n",
    "CELERYD_POOL_PUTLOCKS   worker_pool_putlocks\n",
    "CELERYD_POOL_RESTARTS   worker_pool_restarts\n",
    "CELERYD_PREFETCH_MULTIPLIER worker_prefetch_multiplier\n",
    "CELERYD_REDIRECT_STDOUTS    worker_redirect_stdouts\n",
    "CELERYD_REDIRECT_STDOUTS_LEVEL  worker_redirect_stdouts_level\n",
    "CELERYD_SEND_EVENTS worker_send_task_events\n",
    "CELERYD_STATE_DB    worker_state_db\n",
    "CELERYD_TASK_LOG_FORMAT worker_task_log_format\n",
    "CELERYD_TIMER   worker_timer\n",
    "CELERYD_TIMER_PRECISION worker_timer_precision\n",
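     "A small helper can apply these renames programmatically when migrating an old config (the dict below is a subset transcribed from the table above; modernize is a hypothetical helper, not a Celery API):\n",
     "\n",
     "```python\n",
     "# subset of the renames from the table above\n",
     "RENAMES = {\n",
     "    'BROKER_URL': 'broker_url',\n",
     "    'CELERY_RESULT_BACKEND': 'result_backend',\n",
     "    'CELERY_TASK_SERIALIZER': 'task_serializer',\n",
     "    'CELERY_ACKS_LATE': 'task_acks_late',\n",
     "    'CELERYD_CONCURRENCY': 'worker_concurrency',\n",
     "}\n",
     "\n",
     "def modernize(old_conf):\n",
     "    # translate known old-style keys, leave unknown keys untouched\n",
     "    return {RENAMES.get(key, key): value for key, value in old_conf.items()}\n",
     "\n",
     "new_conf = modernize({'BROKER_URL': 'amqp://localhost//', 'CELERYD_CONCURRENCY': 4})\n",
     "print(new_conf)  # {'broker_url': 'amqp://localhost//', 'worker_concurrency': 4}\n",
     "```\n",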
     "Summary\n",
     "Next we will analyze Celery usage in more detail: see python之celery使用详解(二).\n",
    "\n",
     "References\n",
    "http://docs.celeryproject.org/en/latest/userguide/tasks.html#task-options\n",
    "\n",
    "http://docs.jinkan.org/docs/celery/getting-started/first-steps-with-celery.html\n",
    "\n",
    "http://www.pythondoc.com/flask-celery/first.html\n",
    "\n",
    "https://blog.csdn.net/kk123a/article/details/74549117\n",
    "\n",
    "https://blog.csdn.net/preyta/article/details/54288870\n",
    "\n",
     "posted @ 2018-04-09 14:40 by 天宇之游\n",
     "Copyright © 2019 天宇之游"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
