[Python] Notes on using the Aurora Serverless Data API

What is this article?

Here are some things I noticed while developing with the Aurora Serverless Data API. The Data API only became available in the Tokyo region last year (2019), so there is still little information about it in Japanese. I hope this article will be useful to those who use it in the future.

RDS usage environment

- Region: ap-northeast-1
- Role: Serverless
- Engine: Aurora PostgreSQL

Notes on using Aurora Serverless (PostgreSQL engine)

:one: Only the MySQL engine can set the minimum capacity to 1 Aurora Capacity Unit (ACU).

The minimum capacity of the PostgreSQL engine is 2 ACU. After seeing the announcement "Amazon Aurora Serverless Starts Supporting 1 Unit Capacity and New Scaling Options", I had assumed it could be set to 1 ACU regardless of which engine was selected.

You can now set the minimum capacity of a MySQL-compatible Aurora Serverless DB cluster to 1 Aurora Capacity Unit (ACU).

So the 1 ACU minimum applies only to MySQL-compatible clusters :innocent:
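For reference, the scaling configuration can be changed with boto3 as in the minimal sketch below. The cluster identifier and capacity values are placeholders of my own; the point is that for the PostgreSQL engine, MinCapacity cannot go below 2.

import boto3

# Minimal sketch (assumptions: an existing Aurora Serverless PostgreSQL
# cluster named 'my-serverless-pg'): adjust its scaling configuration.
rds = boto3.client('rds', region_name='ap-northeast-1')

rds.modify_db_cluster(
    DBClusterIdentifier='my-serverless-pg',  # hypothetical cluster name
    ScalingConfiguration={
        'MinCapacity': 2,   # PostgreSQL engine: 2 ACU is the lowest allowed
        'MaxCapacity': 4,
        'AutoPause': True,
        'SecondsUntilAutoPause': 300,
    },
)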

:two: Only a limited set of data types can be passed to placeholders. The ARRAY and JSON types are not supported as of May 2020.

Reading the official documentation for the ExecuteStatement action, the request syntax appears to support the ARRAY type at first glance. However, there is the following note.

:information_source: Note Array parameters are not supported.

Out of curiosity, I used ExecuteStatement from boto3 to pass an ARRAY value to a placeholder, and got an "Array parameters are not supported" error. Looking into it further, this is also tracked as an open issue on GitHub, and **an AWS engineer commented as follows, so I expect ARRAY support to be added in the future**.

The SDK team has already things set up, its up-to the service team the time they take to implement it. Will update if they implement the feature.

It is only a workaround, but **you can pass a value to an ARRAY-type column by writing the following**.

import boto3

tags = ['tag1', 'tag2']

rds_client = boto3.client('rds-data')
parameters = {
    'secretArn': 'your_secret_arn',
    'resourceArn': 'your_resource_arn',
    # Cast the string parameter to text[] on the PostgreSQL side
    'sql': 'INSERT INTO sample_tbl (id, tags) VALUES (:id, :tags::text[])',
    'parameters': [
        {'name': 'id', 'value': {'stringValue': '001'}},
        # Pass the array as a PostgreSQL array literal string, e.g. '{tag1,tag2}'
        {'name': 'tags', 'value': {'stringValue': '{' + ','.join(tags) + '}'}}
    ],
    'database': 'your_database_name',
}

response = rds_client.execute_statement(**parameters)
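The same cast trick should also work for JSON columns, which are likewise unsupported as placeholder types. The sketch below is my own example (the table sample_tbl2 and its jsonb column attrs are hypothetical): serialize the value to a string and cast it on the PostgreSQL side.

import json

import boto3

rds_client = boto3.client('rds-data')

# Sketch: passing JSON to a jsonb column by sending it as a string
# and casting with ::jsonb. Table and column names are placeholders.
attrs = {'color': 'red', 'size': 'M'}

response = rds_client.execute_statement(
    secretArn='your_secret_arn',
    resourceArn='your_resource_arn',
    database='your_database_name',
    sql='INSERT INTO sample_tbl2 (id, attrs) VALUES (:id, :attrs::jsonb)',
    parameters=[
        {'name': 'id', 'value': {'stringValue': '001'}},
        {'name': 'attrs', 'value': {'stringValue': json.dumps(attrs)}},
    ],
)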

:three: The transaction timeout is 3 minutes.

According to the official documentation for the BeginTransaction action (https://docs.aws.amazon.com/rdsdataservice/latest/APIReference/API_BeginTransaction.html), **a transaction times out if no call uses the issued transaction ID for 3 minutes**. If it times out before being committed, it is rolled back automatically. Also, **a transaction can run for a maximum of 24 hours**; after 24 hours it is automatically terminated and rolled back.

:warning: Important A transaction can run for a maximum of 24 hours. A transaction is terminated and rolled back automatically after 24 hours. A transaction times out if no calls use its transaction ID in three minutes. If a transaction times out before it's committed, it's rolled back automatically.
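To make this concrete, the transaction flow with the Data API looks roughly like the sketch below (ARNs, database, and SQL are placeholders): every execute_statement call that carries the transactionId must happen within 3 minutes of the previous one, and commit_transaction has to run within the 24-hour lifetime.

import boto3

rds_client = boto3.client('rds-data')

SECRET_ARN = 'your_secret_arn'      # placeholder
RESOURCE_ARN = 'your_resource_arn'  # placeholder
DATABASE = 'your_database_name'     # placeholder

# Start a transaction and receive its ID.
tx = rds_client.begin_transaction(
    secretArn=SECRET_ARN,
    resourceArn=RESOURCE_ARN,
    database=DATABASE,
)
tx_id = tx['transactionId']

try:
    # Statements that belong to the transaction pass transactionId.
    # If more than 3 minutes pass without any call using this ID,
    # the transaction times out and is rolled back automatically.
    rds_client.execute_statement(
        secretArn=SECRET_ARN,
        resourceArn=RESOURCE_ARN,
        database=DATABASE,
        transactionId=tx_id,
        sql='INSERT INTO sample_tbl (id) VALUES (:id)',
        parameters=[{'name': 'id', 'value': {'stringValue': '002'}}],
    )
    rds_client.commit_transaction(
        secretArn=SECRET_ARN,
        resourceArn=RESOURCE_ARN,
        transactionId=tx_id,
    )
except Exception:
    # Roll back explicitly on failure instead of waiting for the timeout.
    rds_client.rollback_transaction(
        secretArn=SECRET_ARN,
        resourceArn=RESOURCE_ARN,
        transactionId=tx_id,
    )
    raise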
