Trying Out AWS with the Serverless Framework and Python

Introduction

December has flown by and we are already in the middle of the month. Personally, I'm very happy because today is my birthday, and I wrote this article in that mood. Recently I've been working with the Serverless Framework and AWS, so I tried to summarize what I've learned in my own way. The services I used are Lambda, S3, SQS, and API Gateway, so that's what this article covers.

The code I used this time can be found here: https://github.com/masaemon/aws-serverless

In addition, I referred to the following articles when writing this one:

Thoroughly serverless ①: Introduction to Serverless Framework
The story of making a super-simple voting system with Serverless Framework and S3
I tried using Lambda with Python 3.6 with Serverless Framework (https://dev.classmethod.jp/cloud/aws/serverless-framework-with-python-3-6/)
Summary of how to use Serverless Framework

Preparation / environment

You need a Node.js and Python environment, as well as an AWS account. Node.js is required to use the Serverless Framework, and Python is required to use the AWS CLI.
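As a quick sanity check (not part of the original setup steps), you can confirm that both runtimes are available from the command line; any reasonably recent versions should be fine.

$ node -v
$ python3 --version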

1. Install Serverless

Install with npm.

$ npm install -g serverless
$ sls -v
Framework Core: 1.54.0
Plugin: 3.1.2
SDK: 2.1.2
Components Core: 1.1.1
Components CLI: 1.4.0

If this is displayed, the installation is complete.

2. Create an IAM user and set AWS Credentials

You need to create an IAM user so that the Serverless Framework can access AWS.

Creating an IAM user

You can create one from the IAM dashboard by opening Users and clicking the create-user button. Basically, all you have to do is check Programmatic access and attach the permissions you need for this tutorial. Also note that the credential keys (access key ID and secret access key) can only be obtained at this point, so be sure to save them.
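As a rough illustration of "the permissions you need", a policy along the following lines covers the services used in this article plus the CloudFormation, IAM, and CloudWatch Logs access that the Serverless Framework uses when deploying. This is only a permissive sketch for experimenting, not a hardened production policy; attaching a broad managed policy instead also works.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cloudformation:*",
        "s3:*",
        "lambda:*",
        "apigateway:*",
        "sqs:*",
        "iam:*",
        "logs:*"
      ],
      "Resource": "*"
    }
  ]
}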

AWS Credential Settings

You need to install the AWS CLI in order to set up your AWS Credentials. You can install it with the following command.

$ pip install awscli
$ aws --version
aws-cli/1.16.276 Python/3.7.3 Darwin/18.7.0 botocore/1.13.12

If this is displayed, the installation is complete. Configure your AWS credentials as follows.

$ aws configure
AWS Access Key ID [None]: <access key ID of the IAM user you created>
AWS Secret Access Key [None]: <secret access key of the IAM user you created>
Default region name [None]: ap-northeast-1
Default output format [None]: (just press Enter)

Quote: Thoroughly serverless ①: Introduction to Serverless Framework

Now you are ready to use the Serverless Framework.
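As an optional sanity check, the following STS call simply returns the account and IAM user the CLI is acting as, which confirms that the credentials were picked up correctly.

$ aws sts get-caller-identity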

3. Hello World with Serverless Framework

First, let's create a Serverless project. You can create it by typing the following command.

$ sls create -t aws-python3 -p hogehoge

If the following is displayed, it is successful.

Serverless: Generating boilerplate...
Serverless: Generating boilerplate in "/..."
 _______                             __
|   _   .-----.----.--.--.-----.----|  .-----.-----.-----.
|   |___|  -__|   _|  |  |  -__|   _|  |  -__|__ --|__ --|
|____   |_____|__|  \___/|_____|__| |__|_____|_____|_____|
|   |   |             The Serverless Application Framework
|       |                           serverless.com, v1.54.0
 -------'

Serverless: Successfully generated boilerplate for template: "aws-python3"

Next, edit handler.py in the project directory as follows.

handler.py


import json

def hello(event, context):
    body = {
        "message": "Hello, world",
    }

    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }

    return response

Also, set the deployment destination region to Tokyo.

serverless.yml


service: hogehoge

provider:
  name: aws
  runtime: python3.7
  region: ap-northeast-1 #Deploy to Tokyo region

functions:
  hello:
    handler: handler.hello

Then deploy.

$ sls deploy

Then run the function.

$ sls invoke -f hello

If the following is displayed, it is successful.

{
    "statusCode": 200,
    "body": "{\"message\": \"Hello, world\"}"
}

The deployed function also shows up on the AWS Lambda dashboard, so you can try running it from there as well.

4. Invoke Lambda from API Gateway

First, let's execute the hello function we deployed earlier through API Gateway. Modify `serverless.yml` as follows.

serverless.yml


service: hogehoge

provider:
 ...

functions:
  hello:
    handler: handler.hello
    #Add from here
    events:
      - http:
          path: hello
          method: get
          cors: true

Quote: https://serverless.com/framework/docs/providers/aws/events/apigateway/

If you run `sls deploy` after making this change, the output will include `endpoints` as shown below.

Service Information
...
endpoints:
  GET - https://hogehoge.amazonaws.com/dev/hello
functions:
  hello: hogehoge-dev-hello
layers:
  None

If you hit this endpoint with curl or a similar tool, Hello World is returned.
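For example, with curl it looks something like the following (replace the URL with the endpoint printed by your own deploy; the hogehoge URL above is only a placeholder).

$ curl https://hogehoge.amazonaws.com/dev/hello
{"message": "Hello, world"}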

5. Save a file to S3 and read it

Creating a function to save a file in S3

Create a new `sthreeput.py` and write the following. Here we create a function that saves a text file containing 10 random numbers up to 100, using the current date and time as the file name.

sthreeput.py


import boto3 #AWS SDK for Python

import datetime
import random

s3 = boto3.resource('s3')
bucketName = "hogehogebucket" # your own S3 bucket name

def index(event, context):
    dt = datetime.datetime.now()
    text = ""
    for i in range(10):
        num = str(random.randint(0, 100))
        text += num + " "

    key = "{0:%Y-%m-%d-%H-%M-%S}.txt".format(dt) # file name to save under

    obj = s3.Object(bucketName, key) 
    obj.put(Body=text.encode())

    return {
        "statusCode": 200,
        "body": text
    }

serverless.yml settings

Create an S3 bucket to store the files, and register the function that adds files to S3. Note that an S3 bucket name cannot be used if it already exists (even in someone else's account), because bucket names are globally unique.

serverless.yml



functions:
  hello:
  ....
  # function name
  sthree-put:
    handler: sthreeput.index # invoke the index function in sthreeput.py

resources:
  Resources:
    SthreeBucket: #Create an S3 bucket
      Type: AWS::S3::Bucket
      Properties:
        BucketName: "hogehogebucket" # S3 bucket name (cannot reuse a name that already exists)

Now run `sls deploy` and invoke the function. If all goes well, a file is stored in the bucket you created.
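For example, you can invoke the function and then list the bucket with the AWS CLI to confirm that an object was written (the bucket name here is the placeholder used above).

$ sls invoke -f sthree-put
$ aws s3 ls s3://hogehogebucket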

Read the file when it is saved to S3

Create a new `sthreereceive.py` as follows.

sthreereceive.py


import boto3

s3 = boto3.client('s3')

def index(event, context):
    bucket = event["Records"][0]["s3"]["bucket"]["name"] #Bucket name
    key = event["Records"][0]["s3"]["object"]["key"] #The name of the saved file
    response = s3.get_object(Bucket=bucket, Key=key) # get the object from the bucket
    body = response['Body'].read().decode("utf-8") # read the file contents and decode as UTF-8
    print(body)
    return body

When the Lambda function is triggered by S3, JSON like the following is passed as the first argument, event.

{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "awsRegion": "ap-northeast-1",
      "eventTime": "1970-01-01T00:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "EXAMPLE"
      },
      "requestParameters": {
        "sourceIPAddress": "127.0.0.1"
      },
      "responseElements": {
        "x-amz-request-id": "EXAMPLE123456789",
        "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "testConfigRule",
        "bucket": {
          "name": "example-bucket",
          "ownerIdentity": {
            "principalId": "EXAMPLE"
          },
          "arn": "arn:aws:s3:::example-bucket"
        },
        "object": {
          "key": "test/key", 
          "size": 1024,
          "eTag": "0123456789abcdef0123456789abcdef",
          "sequencer": "0A1B2C3D4E5F678901"
        }
      }
    }
  ]
}

From this event, we get the name of the S3 bucket and the object key, which is the file name.

serverless.yml settings

Register the function and its S3 event.

serverless.yml


...
functions:
  sthree-put:
    handler: sthreeput.index # invoke the index function in sthreeput.py

  sthree-receive:
    handler: sthreereceive.index
    events:
    - s3:
        bucket: "hogehogebucket" # bucket name
        event: s3:ObjectCreated:* # fire when an object is created in S3
        existing: true # attach to the bucket we already created ourselves

Without `existing: true` at the end, Serverless tries to create the bucket itself, and the deploy fails because a bucket with the same name already exists. We set this option because we created the bucket ourselves earlier. Alternatively, you can deploy without this option by removing the Resources section from serverless.yml and running `sls remove` first to start from a clean state.

Now, if you run `sls deploy` and invoke the sthree-put function, you can watch the Lambda dashboard and see that the file is saved to the bucket and the sthree-receive function is triggered.
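If you prefer the command line to the dashboard, the Serverless Framework can also tail a function's CloudWatch logs, so something like the following should show the printed file contents (-t keeps streaming new log lines).

$ sls logs -f sthree-receive -t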

Also, this time I read the file using the S3 save event as a trigger, but you can also read a file directly by specifying the bucket name and the key of the file you want to check, as in the sketch below.
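Here is a minimal sketch of that direct approach; the file name directread.py, the bucket name, and the object key are all just placeholders matching the ones used above.

directread.py


import boto3

s3 = boto3.client('s3')

def read_file(bucket, key):
    # fetch the object directly by bucket name and key
    response = s3.get_object(Bucket=bucket, Key=key)
    return response['Body'].read().decode("utf-8")

# example: read one of the files written by the sthree-put function
print(read_file("hogehogebucket", "2019-12-15-12-00-00.txt"))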

6. Send a message to SQS and receive it

Creating a function to send a message to SQS

Create a new `sendmessage.py` as follows.

sendmessage.py


import boto3
import random
import json

sqs = boto3.resource('sqs')
queueName = "hogehoge-queue"


def index(event, context):
    queue = sqs.get_queue_by_name(QueueName=queueName)  #Get queue

    message = ""  #Message to send

    for i in range(10):
        num = str(random.randint(0, 100))
        message += num + " "
        
    print(message)

    queue.send_message(MessageBody=json.dumps({ "message": message })) #Send message
    
    return message

serverless.yml settings

Create a queue to send messages to. Unlike S3 bucket names, SQS queue names only need to be unique within your own account, so they can be the same as a queue name in someone else's account.

serverless.yml


provider:
  ...
  iamRoleStatements:
  - Effect: "Allow"
    Action:
    - "s3:PutObject"
    - "s3:GetObject"
    - "s3:ListBucket"
    - "sqs:CreateQueue"
    - "sqs:DeleteQueue"
    - "sqs:SendMessage"
    - "sqs:GetQueueUrl"
    Resource: "*"
...
functions:
  sthree-put:
  ...
  sendmessage:
    handler: sendmessage.index

resources:
  Resources:
    SthreeBucket: 
    ...
    hogehogeQueue:
      Type: AWS::SQS::Queue #SQS queue creation
      Properties:
        QueueName: "hogehoge-queue"

Now run `sls deploy` and invoke the sendmessage function, and you should see the message added to the queue you created.
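To check from the CLI instead of the SQS console, you can invoke the function and then look at the queue, roughly as follows (the queue URL placeholder is whatever get-queue-url returns).

$ sls invoke -f sendmessage
$ aws sqs get-queue-url --queue-name hogehoge-queue
$ aws sqs get-queue-attributes --queue-url <queue-url-from-above> --attribute-names ApproximateNumberOfMessages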

Receive the message triggered by sending a message to SQS

Next, let's receive the message we sent earlier. Create `receivemessage.py` and write the following.

receivemessage.py


import json

def index(event, context):
    body = json.loads(event["Records"][0]["body"]) 
    message = body["message"]
    print(message)
    return message

When the Lambda function is triggered by SQS, JSON like the following is passed as the first argument, event.

{
  "Records": [
    {
      "messageId": "19dd0b57-b21e-4ac1-bd88-01bbb068cb78",
      "receiptHandle": "MessageReceiptHandle",
      "body": "Hello from SQS!", 
      "attributes": {
        "ApproximateReceiveCount": "1",
        "SentTimestamp": "1523232000000",
        "SenderId": "123456789012",
        "ApproximateFirstReceiveTimestamp": "1523232000001"
      },
      "messageAttributes": {},
      "md5OfBody": "7b270e59b47ff90a553787216d55d91d",
      "eventSource": "aws:sqs",
      "eventSourceARN": "arn:aws:sqs:ap-northeast-1:123456789012:MyQueue",
      "awsRegion": "ap-northeast-1"
    }
  ]
}

You can get the sent message by reading the body field from this event.

serverless.yml settings

serverless.yml


...
functions:
  receivemessage:
    handler: receivemessage.index
    events:
      - sqs:
          # resolves to arn:aws:sqs:<region>:<account-id>:hogehoge-queue
          arn:
            Fn::Join:
            - ':'
            - - arn
              - aws
              - sqs
              - Ref: AWS::Region
              - Ref: AWS::AccountId
              - "hogehoge-queue"

Now when you run `sls deploy` and invoke the sendmessage function, the receivemessage function is triggered. Note that if the receivemessage function throws an error, the message returns to the queue and the function keeps getting invoked. In that case, you can stop it by clearing the messages in the queue, as shown below.
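Clearing the queue can be done from the SQS console, or with the AWS CLI roughly as follows (the queue URL placeholder is the one returned by get-queue-url above).

$ aws sqs purge-queue --queue-url <your-queue-url>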

In conclusion

Delete the resources if you no longer need them. The following command cleans them up.

$ sls remove

The bucket sometimes doesn't get deleted cleanly; in that case, empty the bucket or delete the CloudFormation stack directly.
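For reference, emptying and then removing the bucket from the CLI looks roughly like this (using the placeholder bucket name from above; note that this deletes everything in the bucket).

$ aws s3 rm s3://hogehogebucket --recursive
$ aws s3 rb s3://hogehogebucket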

AWS has many other services as well, so I'd like to keep trying them out.
