Running asynchronous tasks with Django + djcelery + Redis

2023-02-18 13:37:16

Install Redis

  • Mac or CentOS
  • Start Redis on boot on Mac
  • Windows

Install django-redis

pip install django-redis

pip uninstall redis

pip install redis==2.10.6 (pinning redis works around the celery startup error: AttributeError: 'str' object has no attribute 'items')

Install django-celery

pip install django-celery

Configure settings.py

import djcelery


# Register the djcelery app
INSTALLED_APPS = [
    # ...
    'djcelery',  # django-celery
]

# Django cache backed by Redis
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
        "OPTIONS": {
            "CLIENT_CLASS": "django_redis.client.DefaultClient",
        }
    }
}


# celery configuration
djcelery.setup_loader()  # load djcelery's Django-aware loader
# Note: this setup uses celery 3.x (djcelery); celery 4+ renamed BROKER_URL to CELERY_BROKER_URL
BROKER_URL = 'redis://127.0.0.1:6379/0'  # use Redis logical database 0 as the broker (the cache above uses database 1, so their keys never collide)
# CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # periodic-task scheduler, run with: python manage.py celery beat
CELERYD_MAX_TASKS_PER_CHILD = 3  # recycle each worker process after 3 tasks, to guard against memory leaks
# CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'  # store task results so they can be tracked later

# message serialization settings
CELERY_ACCEPT_CONTENT = ['application/json', ]
CELERY_TASK_SERIALIZER = 'json'
# CELERY_RESULT_SERIALIZER = 'json'
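The broker URL and the cache LOCATION above point at different logical databases (/0 and /1) of the same Redis server, which keeps their keyspaces separate. A small stdlib-only sketch of how the database index is encoded in a redis:// URL:

```python
from urllib.parse import urlparse

def redis_db_index(url):
    """Return the logical-database index encoded in a redis:// URL path."""
    path = urlparse(url).path.lstrip("/")
    return int(path) if path else 0

# The cache (database 1) and the broker (database 0) share one Redis
# server but never see each other's keys.
print(redis_db_index("redis://127.0.0.1:6379/1"))  # 1 (cache)
print(redis_db_index("redis://127.0.0.1:6379/0"))  # 0 (broker)
```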

Generate the celery-related tables

python manage.py migrate

Add celery.py to the project root

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

"""
@author: yinzhuoqun
@site: http://zhuoqun.info/
@email: yin@zhuoqun.info
@time: 2019/12/14 17:21
"""
from __future__ import absolute_import, unicode_literals
from celery import Celery
from django.conf import settings
import os

# The current directory name doubles as the Django project name
project_name = os.path.split(os.path.abspath('.'))[-1]
project_settings = '%s.settings' % project_name

# Point celery at the project's settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', project_settings)

# Instantiate the Celery app
app = Celery(project_name)

# Configure celery from the Django settings file
app.config_from_object('django.conf:settings')

# Auto-discover tasks.py modules in all installed apps
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
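With `python manage.py celery worker`, djcelery's loader reads the settings directly, so nothing imports celery.py. If you instead start the worker with the plain `celery -A <project> worker` command, the project package usually needs to expose the app from its `__init__.py` (a hedged sketch, assuming the default project layout):

```python
# <project>/__init__.py -- only needed when starting via `celery -A <project> worker`
from __future__ import absolute_import, unicode_literals

# Ensure the Celery app defined in celery.py is loaded when Django starts.
from .celery import app as celery_app

__all__ = ('celery_app',)
```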

Add tasks.py to the app directory

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

"""
@author: yinzhuoqun
@site: http://zhuoqun.info/
@email: yin@zhuoqun.info
@time: 2019/12/15 12:34 AM
"""

import json
import requests
from celery import task

from django.core.mail import send_mail


@task
def task_send_dd_text(url, msg, atMoblies, atAll="false"):
    """
    Send a DingTalk text notification.
    :param url: DingTalk robot webhook URL
    :param msg: message text to send
    :param atMoblies: list of mobile numbers to @-mention
    :param atAll: "true" to @ everyone, otherwise "false"
    :return:
    """
    body = {
        "msgtype": "text",
        "text": {
            "content": msg
        },
        "at": {
            "atMobiles": atMoblies,
            "isAtAll": atAll
        }
    }
    headers = {'content-type': 'application/json',
               'User-Agent': 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:22.0) Gecko/20100101 Firefox/22.0'}
    r = requests.post(url, headers=headers, data=json.dumps(body))
    # print(r.text)
    # print(r.text)


@task
def task_send_mail(*args, **kwargs):
    """
    Django's send_mail; supports HTML via html_message="<html content>".
    :param args:
    :param kwargs:
    :return:
    """
    send_mail(*args, **kwargs)
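To see what task_send_dd_text actually posts, the payload assembly can be factored out and checked without a broker or network. A stdlib-only sketch (build_dd_text_body is a hypothetical helper for illustration, not part of the project):

```python
import json

def build_dd_text_body(msg, at_mobiles, at_all="false"):
    """Assemble the DingTalk text-message payload used by task_send_dd_text."""
    return {
        "msgtype": "text",
        "text": {"content": msg},
        "at": {"atMobiles": at_mobiles, "isAtAll": at_all},
    }

body = build_dd_text_body("deploy finished", ["18612345678"])
print(json.dumps(body, sort_keys=True))
```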

Call the tasks from views.py

# assuming the URL is routed to /test

from django.conf import settings
from django.http import HttpResponse


def test(request):
    # import the task
    from .tasks import task_send_dd_text
    # enqueue it
    task_send_dd_text.delay(settings.DD_NOTICE_URL, "async task invoked successfully", atMoblies=["18612345678"], atAll="false")
    return HttpResponse("test")
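When developing or unit-testing this view you can skip the broker and worker entirely: celery 3.x has an eager mode that runs `.delay()` calls synchronously in-process. A settings fragment (enable for development/tests only):

```python
# settings.py (development/tests only)
CELERY_ALWAYS_EAGER = True                 # .delay()/.apply_async() execute inline, no worker needed
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True  # surface task errors instead of swallowing them
```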

Start the celery worker

python manage.py celery worker --loglevel=info

Daemonize the celery worker on CentOS 7

CentOS 7: daemonize Celery with Supervisor

Hit the async-task view

http://127.0.0.1/test

Worker log

(joyoo) yinzhuoqundeMacBook-Pro:joyoo yinzhuoqun$ python manage.py celery worker --loglevel=info
raven.contrib.django.client.DjangoClient: 2019-12-15 16:39:56,155 /Users/yinzhuoqun/.pyenv/joyoo/lib/python3.6/site-packages/raven/base.py [line:213] INFO Raven is not configured (logging is disabled). Please see the documentation for more information.
 
 -------------- celery@yinzhuoqundeMacBook-Pro.local v3.1.26.post2 (Cipater)
---- **** ----- 
--- * ***  * -- Darwin-18.6.0-x86_64-i386-64bit
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         default:0x107752e48 (djcelery.loaders.DjangoLoader)
- ** ---------- .> transport:   redis://127.0.0.1:6379/0
- ** ---------- .> results:     
- *** --- * --- .> concurrency: 12 (prefork)
-- ******* ---- 
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery
                

[tasks]
  . blog.tasks.task_send_dd_text
  . blog.tasks.task_send_mail

[2019-12-15 16:39:56,821: INFO/MainProcess] Connected to redis://127.0.0.1:6379/0
[2019-12-15 16:39:56,831: INFO/MainProcess] mingle: searching for neighbors
[2019-12-15 16:39:57,838: INFO/MainProcess] mingle: all alone
[2019-12-15 16:39:57,847: WARNING/MainProcess] /Users/yinzhuoqun/.pyenv/joyoo/lib/python3.6/site-packages/djcelery/loaders.py:133: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warn('Using settings.DEBUG leads to a memory leak, never '
[2019-12-15 16:39:57,847: WARNING/MainProcess] celery@yinzhuoqundeMacBook-Pro.local ready.
[2019-12-15 16:42:22,962: INFO/MainProcess] Received task: blog.tasks.task_send_dd_text[3a888439-29b2-41ce-aeb2-7199590d999d]
[2019-12-15 16:42:22,964: INFO/MainProcess] Received task: blog.tasks.task_send_mail[f721f916-2e96-477c-862c-2f54677e1269]
[2019-12-15 16:42:22,966: INFO/MainProcess] Received task: blog.tasks.task_send_mail[91f75bb6-17a4-4e64-a0c6-2f5f7423db2d]
[2019-12-15 16:42:24,054: INFO/MainProcess] Task blog.tasks.task_send_dd_text[3a888439-29b2-41ce-aeb2-7199590d999d] succeeded in 1.091003522000392s: None
[2019-12-15 16:42:31,866: INFO/MainProcess] Task blog.tasks.task_send_mail[91f75bb6-17a4-4e64-a0c6-2f5f7423db2d] succeeded in 8.898680990001594s: None
[2019-12-15 16:42:32,735: INFO/MainProcess] Task blog.tasks.task_send_mail[f721f916-2e96-477c-862c-2f54677e1269] succeeded in 9.769848276999255s: None
