[Python] I tried executing SQL from my local environment using the Looker SDK

What I did

- Sent SQL from my local environment using the Looker API and received the results.
- Downloading a CSV every time I analyze data is a bit tedious, so I wanted to eliminate that step before it turns into a recurring chore.
- With the SDK, it might become even more convenient.

What to prepare


- Looker API SDK official description: https://docs.looker.com/reference/api-and-integration/api-sdk
- GitHub: https://github.com/looker-open-source/sdk-codegen/tree/master/python

Install it for now

```
pip install looker_sdk
```


- How to get an API authentication key: link
- Types of environment variables: link

Environment variable settings

```python
import looker_sdk

sdk = looker_sdk.init31()  # initialize an API 3.1 client
my_user = sdk.me()         # check authentication by fetching the current user
```

There also seems to be a way to read the credentials from an ini file, but I went with environment variables, and it worked: the SDK picks up the authentication information from them.
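For reference, these are the environment variables the SDK looks for at init time, to the best of my knowledge (names may vary between SDK versions; all values below are placeholders):

```shell
# Looker SDK configuration via environment variables (values are placeholders)
export LOOKERSDK_BASE_URL="https://xxxx.looker.com:19999"
export LOOKERSDK_API_VERSION="3.1"
export LOOKERSDK_CLIENT_ID="your_client_id"
export LOOKERSDK_CLIENT_SECRET="your_client_secret"
```

The ini-file alternative uses the same settings under a `[Looker]` section in a `looker.ini` file.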

Run SQL via Looker from your local machine and get the results!

Where do I even find this? Let's take a look at the API reference for now...

Oh! There is something called SQL Runner in the API reference. That looks like the one!


Apparently it is still in BETA (as of December 23, 2020), so it may change later, but let's charge ahead for now!

So, how do we use it?

The usable endpoints appear to be the ones listed under SQL Runner in the reference.

- There is no sample code, so it is a bit painful.
- Looking at the reference, the workflow for this part seems to be:
  - Register a query
  - Get a query
  - Run a query
- It appears to work by queuing queries for execution. You can see the queue at:
  - https://xxxx.looker.com/admin/queries

- As far as I can find online, the SDK provides methods corresponding to the API URLs.
- These are in the reference.
- What should the arguments be?
  - Apparently you build the request body with SqlQueryCreate:
    - model_name: I was not sure what to pass here, but when I put in the model name as a string it started working.
    - sql: self-explanatory, so explanation omitted.
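Under the hood, SqlQueryCreate just describes the JSON body that gets POSTed to the SQL Runner endpoint. A minimal sketch of what that body looks like; the model name and SQL here are made-up placeholders:

```python
import json

# The request body that models31.SqlQueryCreate represents, as plain JSON.
# "my_model" and the SQL string are hypothetical placeholders.
body = {
    "model_name": "my_model",
    "sql": "select * from public.my_table limit 10",
}

payload = json.dumps(body)
print(payload)
```

Seeing it as plain JSON makes it easier to compare against the request/response examples in the API reference.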

Query registration

```python
from looker_sdk import models31

# Build the API arguments
SQL = "select * from public.xxxxx limit 10"
query = models31.SqlQueryCreate(model_name="xxxxxxxx", sql=SQL)

# Register the query; the returned slug identifies it for later execution
slug = sdk.create_sql_query(body=query).slug
```

Apparently looker_sdk contains modules corresponding to each API version. This time we are on 3.1, so pick the ones with the 31 suffix (methods31, models31).

Executing a query


- Check here if you throw SQL at it and an empty result comes back.

The result comes back as binary, so save it in binary mode and you can use it:

```python
# "slug" is the value returned when registering the query with create_sql_query
result = sdk.run_sql_query(slug=slug, result_format="csv")
with open('result.csv', 'wb') as fw:
    fw.write(result)
```
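Incidentally, once you have the CSV bytes you do not have to round-trip through a file; you can parse them in memory with the standard library. A sketch using made-up sample bytes standing in for a real query result:

```python
import csv
import io

# Made-up bytes standing in for what a CSV-format query result would
# look like; the real content depends on your SQL.
raw = b"id,name\n1,alice\n2,bob\n"

reader = csv.reader(io.StringIO(raw.decode("utf-8")))
rows = list(reader)
header, data = rows[0], rows[1:]
print(header)  # ['id', 'name']
print(data)    # [['1', 'alice'], ['2', 'bob']]
```

From here the rows can go straight into whatever analysis tooling you use.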

In conclusion

To be honest, I wondered whether there is much point in having Looker execute SQL from a local machine. Most people just look at the dashboards, and if you want raw data you can connect pandas to BigQuery directly, so the approach I investigated here does not offer many obvious advantages as a way to use the SDK. That said, I think it is meaningful when you want to leverage Looker's existing assets: for queries that are too complicated, or change too frequently, to maintain in Looker itself, you can fetch the results over SQL via the SDK like this and build a mechanism that runs the analysis automatically.



