Although the documentation says that re:dash can use Python as a data source, there is not much information on how, so here is a simple working example.
Basically, if you fill a dictionary variable named result with values in the format specified at http://docs.redash.io/en/latest/dev/results_format.html, it will be recognized automatically.
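For reference, the shape described on that page looks roughly like this (a sketch; see the linked docs for the authoritative definition):

```python3
# Rough shape of the "result" dictionary (sketch based on the linked docs).
result = {
    'columns': [
        {'name': 'name', 'friendly_name': '', 'type': 'string'},
        {'name': 'count', 'friendly_name': '', 'type': 'integer'},
    ],
    'rows': [
        {'name': 'hoge', 'count': 5},
    ],
}
```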
Go to the Add Data Source screen, set the type to Python, and enter the modules that are allowed to be imported, separated by commas. I enter boto3 and datetime here for later use.
Set the data source of the query to the data source you added earlier, and enter the following query.
```python3
result = {}
add_result_row(result, {'name': 'hoge', 'count': 5, 'countf': 6.3})
add_result_row(result, {'name': 'bar', 'count': 11, 'countf': 3.14159})
add_result_row(result, {'name': 'foo', 'count': 0, 'countf': 99.9999})
add_result_column(result, 'name', '', 'string')
add_result_column(result, 'count', '', 'integer')
add_result_column(result, 'countf', '', 'float')
```
The helper methods add_result_row and add_result_column are provided by default, and they are used for the row and column definitions.

add_result_row adds a row. Pass a dictionary of column-name/value pairs as the second argument and it will be appended to result appropriately.

add_result_column adds a column definition. The second argument is the column name (the key used in the dictionaries passed to add_result_row), the third argument is the display name, and the fourth argument is the type (string, integer, float, boolean, date, or datetime).
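Conceptually, these helpers just build up the result dictionary shown earlier. A rough sketch of what they do (not Redash's actual implementation) might look like:

```python3
def add_result_row(result, row):
    # Append one row: a dict mapping column names to values.
    result.setdefault('rows', []).append(row)

def add_result_column(result, name, friendly_name, column_type):
    # Append one column definition: internal name, display name, and type.
    result.setdefault('columns', []).append({
        'name': name,
        'friendly_name': friendly_name,
        'type': column_type,
    })
```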
When executed, the following results will be obtained.
In the previous query the data was hard-coded in the source, which is not very interesting, so let's write a query that fetches external data.
This time, let's get the CPU usage of an instance from Amazon CloudWatch.

The OS is the officially distributed Ubuntu AMI.
Install boto3 in advance with `sudo pip install boto3`. Also, allow access to CloudWatch with an IAM role.
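Before writing the query itself, it can help to confirm that boto3 and the IAM role work. A quick sanity check like the following will do (the region name is an assumption; use the region your instances run in):

```python3
import boto3

# Quick check that the IAM role can reach CloudWatch.
# region_name is an assumption; adjust to your environment.
cloud_watch = boto3.client('cloudwatch', region_name='ap-northeast-1')
metrics = cloud_watch.list_metrics(Namespace='AWS/EC2', MetricName='CPUUtilization')
print(len(metrics['Metrics']))  # should be > 0 if metrics are visible
```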
Create a new query and write the following code.
In place of `target-instanceId`, write the ID of the instance you want to get metrics for.

Also, since re:dash graphs cannot be displayed in units finer than the date, `Period=3600` is set so that the average is taken per hour.

The return value of cloud_watch.get_metric_statistics is looped over, adding the CPU usage and timestamp to each row.
For the detailed return value of cloud_watch.get_metric_statistics, please refer to [Class Method's Blog](http://dev.classmethod.jp/cloud/aws/get_value_of_cloudwatch_using_boto3/).
```python3
import boto3
import datetime
result = {}
cloud_watch = boto3.client('cloudwatch', region_name='ap-northeast-1')
get_metric_statistics = cloud_watch.get_metric_statistics(
    Namespace='AWS/EC2',
    MetricName='CPUUtilization',
    Dimensions=[
        {
            'Name': 'InstanceId',
            'Value': 'target-instanceId'
        }
    ],
    StartTime=datetime.datetime.now() - datetime.timedelta(days=1),
    EndTime=datetime.datetime.now(),
    Period=3600,
    Statistics=['Average'])

# Add one row per datapoint: average CPU usage and its timestamp.
for d in get_metric_statistics['Datapoints']:
    add_result_row(result, {'average': d['Average'], 'datetime': d['Timestamp'].isoformat()})

# Second argument is the display name shown in the table header.
add_result_column(result, 'average', 'aaaaa', 'float')
add_result_column(result, 'datetime', '', 'datetime')
```
If you do this, you will get the following results:
Also, if you make a line graph, it will look like this.
You can also use it like a monitoring server by getting the latest data and setting alerts.
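For example, an alerting query might fetch only the most recent datapoint, along the lines of this sketch (again, `target-instanceId` is a placeholder and the region name is an assumption):

```python3
import boto3
import datetime

result = {}
cloud_watch = boto3.client('cloudwatch', region_name='ap-northeast-1')
stats = cloud_watch.get_metric_statistics(
    Namespace='AWS/EC2',
    MetricName='CPUUtilization',
    Dimensions=[{'Name': 'InstanceId', 'Value': 'target-instanceId'}],
    StartTime=datetime.datetime.now() - datetime.timedelta(hours=1),
    EndTime=datetime.datetime.now(),
    Period=300,
    Statistics=['Average'])

# Keep only the newest datapoint so an alert can watch a single value.
datapoints = sorted(stats['Datapoints'], key=lambda d: d['Timestamp'])
if datapoints:
    add_result_row(result, {'average': datapoints[-1]['Average']})
add_result_column(result, 'average', '', 'float')
```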