[Python] Create an asynchronous task execution environment + monitoring environment

What's this?

Sometimes you want to push tasks onto a queue and have a worker process them: computational jobs, mass deployment of something, and so on. When building that kind of mechanism in Python, Celery is convenient.

Goal

After reading this, you should be able to set up an asynchronous processing mechanism in Python easily, and also monitor the processing status.

Preparation

apt-get install redis-server
pip install celery
pip install flower

Minimal components

main.py

import tasks

print('<first task>')
# Start the task here (run task)
worker = tasks.run.delay()
# Poll until the task has finished
while not worker.ready():
    pass
# Print the return value
print(worker.result)

print('<second task>')
# Start the task here (calc task)
worker = tasks.calc.delay(100, 200)
# Poll until the task has finished
while not worker.ready():
    pass
# Print the return value
print(worker.result)
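Instead of the polling loop above, the AsyncResult returned by delay() can also be waited on with get(), which blocks until the task finishes and re-raises any exception from the task. A minimal sketch; the 15-second timeout is an arbitrary value for illustration:

import tasks

worker = tasks.calc.delay(100, 200)
# Block until the result arrives, or raise celery.exceptions.TimeoutError after 15 seconds
print(worker.get(timeout=15))  # -> 300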

tasks.py Put the work you want to process asynchronously into functions and add the @task decorator, and they are ready to be called through Celery. Celery's serializer takes care of passing arguments and return values. Note that instances of your own classes cannot be serialized.

import time
from celery.decorators import task

@task
def run():
    time.sleep(10)  # simulate a long-running job
    print('Processing finished')
    return "I'm done"


@task
def calc(a, b):
    return a+b
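Because the JSON serializer is used (see celeryconfig.py below), task arguments and return values must be JSON-compatible: numbers, strings, lists, dicts and so on. If your data lives in an instance of your own class, unpack it into plain data before calling the task. A minimal sketch using the same @task decorator as above; the Point class and the move task are hypothetical examples, not part of the setup in this article:

from celery.decorators import task


class Point(object):
    # an instance of this class cannot be passed to a task when using the JSON serializer
    def __init__(self, x, y):
        self.x = x
        self.y = y


@task
def move(point_dict, dx, dy):
    # receive a plain dict instead of a Point instance
    return {'x': point_dict['x'] + dx, 'y': point_dict['y'] + dy}


# Caller side: unpack the object into JSON-compatible data first
# p = Point(1, 2)
# result = move.delay({'x': p.x, 'y': p.y}, 10, 20)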

celeryconfig.py The configuration file for running Celery. Since I want data exchanged with the workers to be JSON, "json" is specified as the serializer for both task and result delivery. The broker (BROKER) here is Redis, but RabbitMQ can also be used (I'll leave that to you). In the example below the worker loads tasks.py; list every script containing functions to be processed asynchronously in CELERY_IMPORTS. With CELERYD_LOG_LEVEL set to INFO, the standard output of each task is also written to the log (celeryd.log). In production it may be better to set this to ERROR.

Since CELERYD_CONCURRENCY = 1, the queued tasks are handled one at a time. It is better to adjust this to the number of CPUs; see the sketch after the config below.

BROKER_URL = 'redis://localhost/0'
CELERYD_CONCURRENCY = 1
CELERY_RESULT_BACKEND = 'redis'
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
CELERYD_LOG_FILE = "./celeryd.log"
CELERYD_LOG_LEVEL = "INFO"
CELERY_IMPORTS = ("tasks", )
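If you would rather have the concurrency follow the machine than hard-code it, one option (my own addition, not part of the original setup) is to compute it in celeryconfig.py:

import multiprocessing

# one worker process per CPU core instead of the fixed value above
CELERYD_CONCURRENCY = multiprocessing.cpu_count()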

How to run it

Start redis-server

First, let's start redis-server (required). Skip this step if the service is already running.

$ redis-server

Start celery worker

Start the worker (in the same directory as celeryconfig.py and tasks.py) and it is ready to process the queue.

(env) docker@1824542bb286:~/workspace$ celery worker
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
    The 'CELERYD_LOG_LEVEL' setting is scheduled for deprecation in     version 2.4 and removal in version v4.0.     Use the --loglevel argument instead

  alternative='Use the {0.alt} instead'.format(opt))
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
    The 'CELERYD_LOG_FILE' setting is scheduled for deprecation in     version 2.4 and removal in version v4.0.     Use the --logfile argument instead

  alternative='Use the {0.alt} instead'.format(opt))

 -------------- celery@1824542bb286 v3.1.23 (Cipater)
---- **** -----
--- * ***  * -- Linux-3.13.0-24-generic-x86_64-with-Ubuntu-14.04-trusty
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         default:0x7f068383f610 (.default.Loader)
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.run
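As the deprecation warnings above hint, the log level and log file can also be passed on the command line instead of being set in celeryconfig.py. Something along these lines should be equivalent (check celery worker --help for the exact flags in your version):

$ celery worker --loglevel=INFO --logfile=./celeryd.log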

Actually put tasks on the queue

Run main.py and the tasks are processed asynchronously by the worker:

docker@1824542bb286:~/workspace$ python main.py
<first task>
I'm done
<second task>
300

Task monitoring by flower

Start flower

(env2) docker@1824542bb286:~/workspace$ celery flower
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
    The 'CELERYD_LOG_LEVEL' setting is scheduled for deprecation in     version 2.4 and removal in version v4.0.     Use the --loglevel argument instead

  alternative='Use the {0.alt} instead'.format(opt))
/home/docker/.virtualenvs/env2/local/lib/python2.7/site-packages/celery/app/defaults.py:251: CPendingDeprecationWarning:
    The 'CELERYD_LOG_FILE' setting is scheduled for deprecation in     version 2.4 and removal in version v4.0.     Use the --logfile argument instead

  alternative='Use the {0.alt} instead'.format(opt))
[I 160617 13:02:20 command:136] Visit me at http://localhost:5555
[I 160617 13:02:20 command:141] Broker: redis://localhost:6379/0
[I 160617 13:02:20 command:144] Registered tasks:
    ['celery.backend_cleanup',
     'celery.chain',
     'celery.chord',
     'celery.chord_unlock',
     'celery.chunks',
     'celery.group',
     'celery.map',
     'celery.starmap',
     'tasks.run']
[I 160617 13:02:20 mixins:231] Connected to redis://localhost:6379/0

Access the interface

By default, the flower monitoring interface is served at http://localhost:5555. Besides monitoring, it is convenient because you can also adjust the number of worker processes from the browser.
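If the default address does not suit you (for example when flower runs inside a container and you access it from the host), it can be told where to listen. The flags below are an assumption based on flower's options, so check celery flower --help for your version:

$ celery flower --address=0.0.0.0 --port=5555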

(Screenshot: the flower dashboard)
