This is a continuation of the previous article. Last time I built the lower half, so this time I'll build the upper half.

npm (6.14.8): any reasonably recent version should be fine
serverless (2.8.0): any 2.x.x release
python (3.8.2): any 3.8 release
The explanation below assumes some familiarity with the Serverless Framework and Python, so details are omitted; please treat it as reference material.
functions/layers/serverless.yml  #Configuration file
functions/layers/package.json   #Package related
functions/layers/requirements.txt #Package related
functions/layers/python/util.py  #Common function
functions/main/serverless.yml  #Configuration file
functions/main/handler.py        #lambda main
Creating a layer is convenient when you depend on many packages. Reference: AWS Lambda Layers
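The directory listing above includes functions/layers/python/util.py but its contents are never shown. As a hypothetical sketch (the helper below is my own example, not from the article): anything placed under a python/ folder in a layer is unpacked to /opt/python in Lambda, which is on PYTHONPATH, so the handler side can simply `import util`.

```python
# functions/layers/python/util.py (hypothetical example content)
# Because the layer is packaged with path "./" and contains a python/
# folder, this module lands in /opt/python and is directly importable
# from the function as `import util`.

def build_object_key(prefix: str, name: str) -> str:
    """Join a prefix and a file name into an S3 object key."""
    return "/".join(p.strip("/") for p in (prefix, name) if p)
```

On the function side this would be used as `import util; util.build_object_key("in", "a.csv")`.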
functions/layers/serverless.yml
service: goole-test-layer
frameworkVersion: "2"
plugins:
  - serverless-python-requirements
custom:
  defaultStage: dev
  pythonRequirements:
    dockerizePip: true
    layer: true
provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, self:custom.defaultStage}
  region: ap-northeast-1
  environment:
    TZ: Asia/Tokyo
package:
  exclude:
    - ./node_modules/** # keep node_modules out of the deployment package
layers:
  LayersCommon:
    path: "./"  # modules under a folder named python/ can be imported from the Lambda side as common functions
    compatibleRuntimes:
      - python3.8
resources:
  Outputs:
    PythonRequirementsLambdaLayerExport:
      Value:
        Ref: PythonRequirementsLambdaLayer ##Used in the settings on the function side
    LayersCommonLambdaLayerExport:
      Value:
        Ref: LayersCommonLambdaLayer ##Used in the settings on the function side
functions/layers/package.json
{
  "name": "sample",
  "description": "",
  "version": "0.1.0",
  "dependencies": {},
  "devDependencies": {
    "serverless-python-requirements": "^5.1.0"
  }
}
functions/layers/requirements.txt
boto3
botocore
gspread
oauth2client
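The requirements.txt above is unpinned, so the layer's contents can drift between builds. Pinning versions keeps deployments reproducible; the numbers below are only illustrative, so use whatever `pip freeze` reports in your own environment:

```text
boto3==1.16.9
botocore==1.19.9
gspread==3.6.0
oauth2client==4.1.3
```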
functions/main/serverless.yml
service: goole-test
frameworkVersion: "2"
custom:
  defaultStage: dev
  sampleS3BucketName:
    Fn::Join:
      - ""
      - - ${self:service}-
        - ${self:provider.stage}-
        - Ref: AWS::AccountId
  ##layer settings [package]
  requirements_service: goole-test-layer
  requirements_export: PythonRequirementsLambdaLayerExport
  requirements_layer: ${cf:${self:custom.requirements_service}-${self:provider.stage}.${self:custom.requirements_export}}
  ##layer settings[common]
  layers_common_service: goole-test-layer
  layers_common_export: LayersCommonLambdaLayerExport
  layers_common: ${cf:${self:custom.layers_common_service}-${self:provider.stage}.${self:custom.layers_common_export}}
provider:
  name: aws
  runtime: python3.8
  stage: ${opt:stage, self:custom.defaultStage}
  region: ap-northeast-1
  logRetentionInDays: 30
  environment:
    KEYNAME : "/google/access_key" # where the key created in the previous article is stored
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - "s3:ListBucket"
        - "s3:GetObject"
        - "s3:PutObject"
      Resource:
        - Fn::Join: ["", ["arn:aws:s3:::", { "Ref": "S3Bucket" }]]
        - Fn::Join: ["", ["arn:aws:s3:::", { "Ref": "S3Bucket" }, "/*"]]
    - Effect: Allow
      Action:
        - secretsmanager:GetSecretValue
      Resource:
        - "*" #Permission control is possible by specifying arn of secrets manager
functions:
  google_test:
    handler: handler.google_test
    memorySize: 512
    timeout: 900
    layers:
      - ${self:custom.requirements_layer}
      - ${self:custom.layers_common}
    events:
      - s3:                 # trigger on S3 object creation, a commonly used event
          bucket:
            Ref: S3Bucket
          event: s3:ObjectCreated:*
          existing: true
          rules:
            - suffix: .csv
resources:
  Resources:
    S3Bucket:                           #Create S3
      Type: AWS::S3::Bucket
      Properties:
        BucketName: ${self:custom.sampleS3BucketName}
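The Fn::Join under custom.sampleS3BucketName simply concatenates the service name, stage, and AWS account ID with hyphens. In plain Python terms (a sketch; the account ID is a placeholder):

```python
def bucket_name(service: str, stage: str, account_id: str) -> str:
    # Mirrors Fn::Join ["", ["<service>-", "<stage>-", AccountId]]
    return f"{service}-{stage}-{account_id}"

# With the values from this article's serverless.yml:
# bucket_name("goole-test", "dev", "123456789012")
#   -> "goole-test-dev-123456789012"
```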
functions/main/handler.py (being lazy, I've written everything in one module, but please split the file per function in real code...)
import json
import os
import boto3
from botocore.exceptions import ClientError
import base64
import gspread
from oauth2client.service_account import ServiceAccountCredentials
def get_secret():
    #This is almost exactly the sample code when creating Secrets Manager
    try:
        secret = None
        decoded_binary_secret = None
        secret_name = os.environ['KEYNAME']
        region_name = "ap-northeast-1"
        # Create a Secrets Manager client
        session = boto3.session.Session()
        client = session.client(
            service_name='secretsmanager',
            region_name=region_name
        )
        get_secret_value_response = client.get_secret_value(
            SecretId=secret_name
        )
    except ClientError as e:
        if e.response['Error']['Code'] == 'DecryptionFailureException':
            raise e
        elif e.response['Error']['Code'] == 'InternalServiceErrorException':
            raise e
        elif e.response['Error']['Code'] == 'InvalidParameterException':
            raise e
        elif e.response['Error']['Code'] == 'InvalidRequestException':
            raise e
        elif e.response['Error']['Code'] == 'ResourceNotFoundException':
            raise e
        else:
            raise e  # re-raise unknown error codes too, instead of falling through
    else:
        if 'SecretString' in get_secret_value_response:
            secret = get_secret_value_response['SecretString']
        else:
            decoded_binary_secret = base64.b64decode(
                get_secret_value_response['SecretBinary'])
    # Return whichever form the secret was stored in
    return secret if secret is not None else decoded_binary_secret.decode()
def connect_gspread(jsonf, key):
    scope = ['https://spreadsheets.google.com/feeds',
             'https://www.googleapis.com/auth/drive']
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        jsonf, scope)
    gc = gspread.authorize(credentials)
    SPREADSHEET_KEY = key
    worksheet = gc.open_by_key(SPREADSHEET_KEY).sheet1
    return worksheet
def google_test(event, context):
    # gspread needs the credentials as a file, so write them out to /tmp
    jsonf = "/tmp/google-access.json"
    with open(jsonf, mode='w') as f:
        f.write(get_secret())
    spread_sheet_key = '1o3xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'
    ws = connect_gspread(jsonf, spread_sheet_key)
    #Put hoge1 in cell A1
    ws.update_cell(1, 1, "hoge1")
    body = {
        "message": "{} !".format("finished ."),
        "input": event
    }
    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }
    return response
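The handler above ignores the triggering event, but since the function fires on .csv uploads, you will often want the bucket and key of the uploaded object. A sketch of parsing the standard S3 event shape (the helper name is my own; object keys arrive URL-encoded, hence unquote_plus):

```python
from urllib.parse import unquote_plus

def extract_s3_objects(event):
    """Return (bucket, key) pairs from an S3 ObjectCreated event."""
    pairs = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = unquote_plus(s3.get("object", {}).get("key", ""))
        if bucket and key:
            pairs.append((bucket, key))
    return pairs
```

Calling `extract_s3_objects(event)` at the top of google_test would give you the uploaded CSV's location, e.g. for a key of `dir/sample+file.csv` you get `("...bucket...", "dir/sample file.csv")`.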
##Deploy the layers first
cd functions/layers
npm install
pip install -r requirements.txt
sls deploy
##Deploy the main function
cd functions/main
sls deploy
Run it!
hoge1 landed in cell A1 without a hitch!

So I was able to update a spreadsheet from AWS Lambda. You could fairly object, "why not just write this in GAS?", but if you ever need to do something like this, I hope this article serves as a useful reference.
Have a good AWS life!