I use django-celery with RabbitMQ at my current job, but since our infrastructure runs on AWS, I wondered whether it could be hooked up to Amazon SQS instead, so I gave it a try.
The source code can be found on GitHub.
Install the required libraries with pip.
$ pip install django
$ pip install django-celery
$ pip install boto
$ pip install django-dotenv
$ pip freeze > requirements.txt
$ django-admin.py startproject demo .
$ python manage.py startapp items
Before editing settings.py, prepare the AWS access key and secret key, since we will be using Amazon SQS. You could write the access key and so on directly in settings.py, but I didn't want them tracked in git, so I used django-dotenv.
The .env file looks like this:
.env
AWS_SQS_ACCESS_KEY=XXXXX
AWS_SQS_SECRET_ACCESS_KEY=YYYYY
AWS_SQS_REGION=ap-northeast-1
Then edit settings.py.
demo/settings.py
# imports added
import os
import urllib

# ...

# DB is sqlite3
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'demo.db',
    }
}

# ...

# Change the timezone while we are at it
TIME_ZONE = 'Asia/Tokyo'

# ...

INSTALLED_APPS = (
    # ...
    # Add the following
    'djcelery',
    'items',
)

# ...

# Add the following at the end.
# I use urllib.quote because the access key may contain characters that require URL encoding.
import djcelery
djcelery.setup_loader()
BROKER_URL = 'sqs://%s:%s@' % (urllib.quote(os.environ['AWS_SQS_ACCESS_KEY'], ''),
                               urllib.quote(os.environ['AWS_SQS_SECRET_ACCESS_KEY'], ''))
BROKER_TRANSPORT_OPTIONS = {'region': os.environ['AWS_SQS_REGION'],
                            'queue_name_prefix': 'celery_sqs_demo-'}
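To see why the quoting matters, here is a quick sketch of what the resulting broker URL looks like. The credentials below are made-up placeholders; note how passing '' as the "safe" argument makes quote() escape '/' as well, which would otherwise break URL parsing:

```python
# Python 2/3-compatible import (the article targets Python 2's urllib.quote)
try:
    from urllib import quote  # Python 2
except ImportError:
    from urllib.parse import quote  # Python 3

# Placeholder credentials for illustration only -- not real keys.
access_key = 'AKIAEXAMPLE'
secret_key = 'abc/def+ghi'  # contains '/' and '+', which must be escaped in a URL

# safe='' forces quote() to escape '/' too (by default '/' is left as-is).
broker_url = 'sqs://%s:%s@' % (quote(access_key, ''), quote(secret_key, ''))
print(broker_url)  # sqs://AKIAEXAMPLE:abc%2Fdef%2Bghi@
```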
Then add the following to manage.py:
manage.py
import dotenv
dotenv.read_dotenv()
The .env file is now loaded.
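Conceptually, read_dotenv() just loads KEY=VALUE lines into the process environment. A simplified, stdlib-only sketch of that behavior (not django-dotenv's actual implementation) looks like this:

```python
import os


def read_env(text):
    """Minimal sketch of .env parsing: one KEY=VALUE per line, '#' starts a comment.

    Like django-dotenv, existing environment variables are not overridden.
    """
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue
        key, _, value = line.partition('=')
        os.environ.setdefault(key.strip(), value.strip())


read_env("AWS_SQS_ACCESS_KEY=XXXXX\nAWS_SQS_REGION=ap-northeast-1")
print(os.environ['AWS_SQS_REGION'])  # ap-northeast-1
```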
The model is kept minimal.
items/models.py
from django.db import models


class Item(models.Model):
    uuid = models.CharField(max_length=200)
    created_at = models.DateTimeField('created at')
Create a task to run asynchronously. It seems this file needs to be named **tasks.py**.
items/tasks.py
import time

from celery import task

from items.models import Item


@task
def add_item(uuid, created_at):
    time.sleep(3)
    item = Item(uuid=uuid, created_at=created_at)
    item.save()
The view is also kept minimal.
items/views.py
import uuid

from django.http import HttpResponse
from django.utils import timezone

from items.tasks import add_item


def index(request):
    add_item.delay(uuid.uuid4(), timezone.now())
    return HttpResponse('success')
Also edit urls.py.
demo/urls.py
from django.conf.urls import patterns, include, url

# Uncomment the next two lines to enable the admin:
# from django.contrib import admin
# admin.autodiscover()

urlpatterns = patterns('',
    url(r'^items/$', 'items.views.index'),
)
Before running anything, set up the DB. The celery-related tables are created as well.
$ python manage.py syncdb
Start celeryd.
Add -l info to check the logs.
$ python manage.py celeryd
At first, celeryd did not start and the following error occurred.
ValueError: invalid literal for int() with base 10: 'XXXXXXXXXX'
As noted in the settings.py comment above, the cause is that AWS access keys and secret keys sometimes contain characters that require URL encoding.
Once celeryd starts successfully, start the Django application.
$ python manage.py runserver
Once started, try accessing http://localhost:8000/items/.
If you check the celeryd log, you can see entries like the following and confirm that the message was picked up.
[2014-01-25 15:38:27,668: INFO/MainProcess] Received task: items.tasks.add_item[XXXXXXXXXX]
[2014-01-25 15:38:30,702: INFO/MainProcess] Task items.tasks.add_item[XXXXXXXXXX] succeeded in 3.031301742s: None
If you check the DB as well, the record is created properly.
$ sqlite3 demo.db
sqlite> select * from items_item;
1|c23bd4f4-720f-4488-a6b9-dc26ed495c71|2014-01-25 06:38:26.908489
When I checked SQS, the following two queues were created.
* celery_sqs_demo-celery
* celery_sqs_demo-celery_{hostname}-celery-pidbox
The celery_sqs_demo- part is the prefix set in BROKER_TRANSPORT_OPTIONS in settings.py.
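With the settings above, the first queue name is derived roughly like this (a simplified sketch of the observed prefixing behavior, not Celery's actual internals):

```python
# Sketch of how the SQS queue name is derived from the broker transport options.
BROKER_TRANSPORT_OPTIONS = {'region': 'ap-northeast-1',
                            'queue_name_prefix': 'celery_sqs_demo-'}

default_queue = 'celery'  # Celery's default queue name
sqs_queue_name = BROKER_TRANSPORT_OPTIONS['queue_name_prefix'] + default_queue
print(sqs_queue_name)  # celery_sqs_demo-celery
```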
I'm not sure what the celery_{hostname}-celery-pidbox queue is for, so I'll look into it.
# Summary
I expected it to take longer, but I was able to confirm it works without hitting any major stumbling blocks.
However, since SQS support is still marked Experimental, I'm looking forward to it becoming Stable.
Next time, I would like to send messages from multiple hosts and try to see if there are duplicate tasks.
# References
* [First steps with Django - Celery 3.1.8 documentation](http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html#using-celery-with-django)
* [Using Amazon SQS - Celery 3.1.8 documentation](http://docs.celeryproject.org/en/latest/getting-started/brokers/sqs.html)
* [Asynchronous processing quick start guide with django-celery --hirokiky's blog](http://blog.hirokiky.org/2013/03/23/quick_start_guid_about_django_celery.html)