Until now, I had been acquiring Sentinel satellite observation images from the Copernicus hub and analyzing them, as described in my earlier articles:

How to get the latest satellite imagery for free.
I tried to automatically acquire the observation image of the artificial satellite (Sentinel-2) using API.

In those articles I compared the latest observation images with past ones and made GIF animations of the time-series changes, but the past images available through that API only go back about a year; to see a longer-term trend, I would have had to contact the Copernicus operators separately and have the older data prepared. I had thought being able to get the latest observation images was enough, but an acquaintance told me, "I want to know the long-term trend," and I was about to start doing it by hand. Around that time I came across the following article and learned that ** Google Earth Engine can be used to acquire and analyze past Sentinel observation images online **.
I already knew about Google Earth Engine (hereinafter GEE), but I had avoided it because I assumed it could only be used inside the GEE environment and that its language was not Python. From the article above, however, I learned that it can be used from Google Colaboratory, and that you no longer need to post-process the Sentinel observation images after downloading them, as I did before. It turned out to be relatively easy, so I will introduce it here.
An example of the result is the following graph: ** trends in parking lot usage at Tokyo Disneyland **.
This article explains that graph and how it was made. The code introduced here is also available on GitHub, so if you would like to try it yourself, please use it. ** Google Colaboratory is used **, so you can run it on any PC as long as you have a network connection.
For example, the following article may be helpful for estimating the parking lot usage status from satellite images.
When are you free !? Disneyland Congestion Prediction Challenge with Free Satellite Data (Part 1)
From that article you can see that observation images from SAR satellites, which carry radar sensors, are also used for this kind of task. To identify cars in an optical image, you need a high resolution of ** 50 cm or better **, and ** the target area must be clear and cloud-free **. Cloud coverage in the Asian region is said to be about 30%, and depending on the location an area can stay covered by clouds for many days, making it invisible in optical satellite images. In a SAR image, by contrast, it is ** difficult to identify the type of vehicle, such as car versus truck **, but you can evaluate qualitatively whether many vehicles are parked in a lot, and because SAR is unaffected by clouds, the evaluation is stable.

This time, unlike estimating the distribution of the number of cars from aerial photographs, we look for a relative trend: how usage changes compared with the usage at a reference date and time. For that purpose, SAR images are sufficient. Resolution still differs between SAR satellites, and a high-resolution SAR image would be preferable, but here I use ** the observation images of the Sentinel-1 satellites, which are free and observe regularly **. The density of structures such as cars in a parking lot can be estimated to some extent from a SAR image. Since I am no good at drawing, I will refer to a related article for the illustration.
The basics of satellite data ~ types, frequency, resolution, and application examples ~
A SAR satellite irradiates the ground with radio waves and images the intensity of the reflected signal. The radio waves are emitted ** obliquely with respect to the ground surface **, so if the surface is flat, as in the figure, most of the energy is reflected away from the incident direction, that is, away from the satellite that receives the returned signal. Therefore, ** if the target surface is flat, the received signal is weak and the image appears dark; water surfaces such as lakes and the sea are extremely flat, so they appear darker still **. On the other hand, where there are structures such as buildings or cars, their unevenness reflects more of the radio waves back toward the incident direction, so the image appears bright. For example, a radar satellite observation image of Tokyo Disneyland looks like the following.
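As a toy illustration of this brightness difference, the following NumPy sketch compares the mean signal of a flat, empty lot with the same lot after adding bright point returns for parked cars. The numbers are synthetic, invented purely for illustration; this is not real SAR data or calibration.

```python
import numpy as np

# Synthetic illustration only, not real SAR data: a flat, empty lot
# returns little signal toward the sensor, while parked cars act as
# bright reflectors that raise the region's mean brightness.
rng = np.random.default_rng(0)

empty_lot = rng.normal(loc=0.05, scale=0.01, size=(50, 50))  # dark, flat asphalt
full_lot = empty_lot.copy()
rows = rng.integers(0, 50, size=200)
cols = rng.integers(0, 50, size=200)
full_lot[rows, cols] += 0.5  # bright returns where cars are parked

print(f"empty lot mean signal: {empty_lot.mean():.3f}")
print(f"full lot mean signal:  {full_lot.mean():.3f}")
```

The mean over the "full" lot comes out clearly higher, which is exactly the quantity summed per image later in this article.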
You can clearly see the Disneyland facilities. As anyone who has been there knows, the park is surrounded by a parking lot, which is wide and flat and therefore appears dark. When cars are parked in it, it appears bright, so this time's evaluation method estimates parking lot usage by evaluating the brightness of the parking lot area. The satellites used are Sentinel-1 and Sentinel-2, developed and operated by the European Space Agency. For details on each, see the following article.
How to get the latest satellite imagery for free.
The images analyzed and evaluated here are from Sentinel-1, a SAR satellite. Sentinel-1 is a constellation of two satellites; ** each has a 12-day repeat cycle ** (meaning it passes over the same area under the same conditions every 12 days), but because there are two satellites, many areas are observed ** once every 6 days **. This ** regular observation ** is a major feature of the Sentinel satellites and an advantage for users. [Sentinel-1 Observation Scenario](https://sentinel.esa.int/web/sentinel/missions/sentinel-1/observation-scenario)
In addition, the observation time of these satellites is almost fixed: ** around 10:30 am local time for optical satellites, and around 6:00 am and 6:00 pm for SAR satellites **. Can you see people from artificial satellites? ~ Ground resolution and local observation time by satellite ~
For Sentinel-1, introduced earlier, the descending passes observe around 6:00 am and the ascending passes around 6:00 pm. Few people are likely to be in the parking lot that early in the morning, so ** this time I used only the evening observation images **. Beyond this parking lot evaluation, anyone using Sentinel-1 for time-series evaluation is advised to use images from only one pass direction and one satellite. The observation direction differs between passes, so for a tall structure such as a building, a different radar illumination direction changes the shadows, and a naive before/after comparison then picks up differences unrelated to time. Next, we acquire the satellite images from GEE and evaluate them.
For details on how to acquire satellite images with GEE, see the following article. Artificial satellite image analysis by Google Earth Engine and Google Colab - Satellite image analysis starting for free (Introduction) - ** It is free and easy to use if you have a Google account **. It has become an amazing world. Acquired data such as satellite images are saved to Google Drive. Code and data are deleted every time a Google Colaboratory session ends, so it is convenient to keep the data in your own Google Drive. Note, however, that the free tier of Google Drive is 15 GB, which runs out quickly when you download large satellite images, so delete files as appropriate. Now, let me introduce the code.
import ee
import numpy as np
import matplotlib.pyplot as plt
ee.Authenticate()
ee.Initialize()
First, run this cell to authenticate the GEE connection. A link is returned; click it, complete the authentication procedure, then copy the access code and paste it in.
Next, authenticate the connection to Google Drive. The flow is the same as the GEE authentication.
from google.colab import drive
drive.mount('/content/drive')
Next, we install and import the modules needed to view the acquired satellite images and to digitize and analyze them.
# Package installation & import
!pip install rasterio

import numpy as np
import pandas as pd  # used later to handle the polygon coordinates
import matplotlib.pyplot as plt
import rasterio
import json
import os
import glob
import time
from datetime import datetime
from dateutil.parser import parse
Frequently used modules are preinstalled on Google Colaboratory, so usually no extra work is needed, but this time we handle GeoTIFF files, images with map information attached, so we ** install Rasterio ** for the image processing.
Next, install a module called ** folium ** to check the set target area on the map.
!pip install folium
import folium
Now that the environment is ready, let's acquire satellite images from GEE.
To acquire satellite images from GEE, you need to enter the latitude and longitude of the target area you are interested in. I vaguely remember the latitude and longitude of Japan from geography class and globes, but the latitude and longitude of my own home? You could find it after some searching, but it is tedious (of course, I don't remember it either). So I prepared the following, so that the latitude and longitude of an area of interest can be looked up easily.
#Acquisition of polygon information of the area of interest.
from IPython.display import HTML
HTML(r'<iframe width="1000" height="580" src="https://gispolygon.herokuapp.com/" frameborder="0"></iframe>')
When you execute this, the following screen will be displayed.
After zooming in on your area of interest, select the square polygon tool from the icons on the left and draw a polygon over the area. Click ** Show Features ** to display the polygon's geographic information in the right-hand window, then click ** Copy ** at the bottom to copy it. For the Tokyo Disneyland parking lot targeted this time, it looks like the following.
Paste the copied geographic information below.
A = {"type":"FeatureCollection","features":[{"properties":{"note":"","distance":"1127.16 m","drawtype":"rectangle","area":"11.71 ha"},"type":"Feature","geometry":{"type":"Polygon","coordinates":[[[139.87487004138532,35.63225823758155],[139.87487004138532,35.6360949926148],[139.87790093757215,35.6360949926148],[139.87790093757215,35.63225823758155],[139.87487004138532,35.63225823758155]]]}}]}
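As an optional sanity check before handing the polygon to GEE, you can compute its bounding box in plain Python. The snippet below works on a trimmed copy of the polygon above (the `properties` block is omitted), stored under a separate name so it does not interfere with the `A` variable used by the rest of the notebook.

```python
# Trimmed copy of the polygon pasted above (properties omitted), kept
# under a separate name so it does not overwrite A.
sample_geojson = {"type": "FeatureCollection", "features": [{"type": "Feature",
    "geometry": {"type": "Polygon", "coordinates": [[
        [139.87487004138532, 35.63225823758155],
        [139.87487004138532, 35.6360949926148],
        [139.87790093757215, 35.6360949926148],
        [139.87790093757215, 35.63225823758155],
        [139.87487004138532, 35.63225823758155]]]}}]}

coords = sample_geojson["features"][0]["geometry"]["coordinates"][0]
lons = [lon for lon, lat in coords]
lats = [lat for lon, lat in coords]
bbox = (min(lons), min(lats), max(lons), max(lats))
print(bbox)  # (min lon, min lat, max lon, max lat)
```

If the printed bounding box is obviously wrong (e.g. latitude and longitude swapped), the polygon was copied incorrectly.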
This geographic information is processed into the input format for GEE and for display in Folium.
#Set any file name to be used in the future. For example, the name of the area.
object_name = 'Tokyo_TDL2'
with open(str(object_name) + '_2.geojson', 'w') as f:
    json.dump(A, f)

json_file = open(str(object_name) + '_2.geojson')
json_object = json.load(json_file)
#Extract only the latitude / longitude information of the area of interest from json.
AREA = json_object["features"][0]["geometry"]['coordinates'][0]
area = pd.DataFrame(AREA, columns=['longtitude', 'latitude'])
area_d = [[area['longtitude'].min(), area['latitude'].max()],
          [area['longtitude'].max(), area['latitude'].max()],
          [area['longtitude'].max(), area['latitude'].min()],
          [area['longtitude'].min(), area['latitude'].min()],
          [area['longtitude'].min(), area['latitude'].max()]]
AREA = area_d
Now, let's check the set area of interest.
m = folium.Map([(AREA[0][1]+AREA[len(AREA)-2][1])/2,(AREA[0][0]+AREA[len(AREA)-3][0])/2], zoom_start=15)
folium.GeoJson(str(object_name) +'_2.geojson').add_to(m)
m
GEE already hosts a large number of satellite images and pre-analyzed datasets. For details, see the Data Catalog. Sentinel-1 and 2 are as follows.
Sentinel-1 SAR GRD: C-band Synthetic Aperture Radar Ground Range Detected, log scaling Sentinel-2 MSI: MultiSpectral Instrument, Level-1C
According to these pages, ** Sentinel-1 observation images are available from October 3, 2014 **, and ** Sentinel-2 data from June 23, 2015 **. For Sentinel-2, atmospherically corrected Level-2A images with the atmospheric haze removed are also provided, but only from March 28, 2017, when that processing became standard. If you want to use older observation images, use the Level-1C product above.
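Based on the availability dates above, choosing a Sentinel-2 collection ID could be sketched as below. This helper is my own illustration, not part of the original workflow; the IDs `'COPERNICUS/S2'` (Level-1C) and `'COPERNICUS/S2_SR'` (Level-2A) are the ones listed in the GEE Data Catalog.

```python
from datetime import date

# Illustrative helper (not from the original article): pick a Sentinel-2
# collection ID from the availability dates quoted above.
S2_L1C_START = date(2015, 6, 23)  # Level-1C (top-of-atmosphere) from this date
S2_L2A_START = date(2017, 3, 28)  # Level-2A (atmospherically corrected) from this date

def pick_s2_collection(start: date) -> str:
    if start >= S2_L2A_START:
        return 'COPERNICUS/S2_SR'
    if start >= S2_L1C_START:
        return 'COPERNICUS/S2'
    raise ValueError('No Sentinel-2 data before 2015-06-23')

print(pick_s2_collection(date(2019, 1, 1)))  # -> COPERNICUS/S2_SR
print(pick_s2_collection(date(2016, 1, 1)))  # -> COPERNICUS/S2
```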
Now, let's acquire the Sentinel-1 and Sentinel-2 images from GEE and save them to Google Drive.
First, prepare the format of the geographic information to be set in GEE.
region = ee.Geometry.Rectangle(area['longtitude'].min(), area['latitude'].min(), area['longtitude'].max(), area['latitude'].max())
Next, set the acquisition parameters. This time we specify the period of the images to acquire and the folders to save them in.
#Specify the period
from_date='2019-01-01'
to_date='2020-08-31'
#Folder name to save
dir_name_s1 = 'GEE_Sentinel1_' + object_name
dir_name_s2 = 'GEE_Sentinel2_' + object_name
Now, let's set the image conditions for Sentinel-1 and 2.
def cloudMasking(image):
    qa = image.select('QA60')
    cloudBitMask = 1 << 10
    cirrusBitMask = 1 << 11
    mask = qa.bitwiseAnd(cloudBitMask).eq(0).And(qa.bitwiseAnd(cirrusBitMask).eq(0))
    return image.updateMask(mask).divide(10000)

def ImageExport(image, description, folder, region, scale):
    task = ee.batch.Export.image.toDrive(image=image, description=description, folder=folder, region=region, scale=scale)
    task.start()

Sentinel1 = ee.ImageCollection('COPERNICUS/S1_GRD').filterBounds(region).filterDate(parse(from_date), parse(to_date)).filter(ee.Filter.eq('orbitProperties_pass', 'ASCENDING')).select(['VV'])
Sentinel2 = ee.ImageCollection('COPERNICUS/S2').filterBounds(region).filterDate(parse(from_date), parse(to_date)).filterMetadata('CLOUDY_PIXEL_PERCENTAGE', 'less_than', 20).map(cloudMasking).select(['B4', 'B3', 'B2'])

imageList_s1 = Sentinel1.toList(300)
imageList_s2 = Sentinel2.toList(300)
The settings above are based on this article: [6. Local GEE execution by Python](https://sites.google.com/site/mizuochipublic/%E5%AE%9F%E8%B7%B5%E3%82%B3%E3%83%B3%E3%83%86%E3%83%B3%E3%83%84-remote-sensing-tutorials/google-earth-engine%E5%85%A5%E9%96%80/6-python%E3%81%AB%E3%82%88%E3%82%8B%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E3%81%8B%E3%82%89%E3%81%AEgee%E5%AE%9F%E8%A1%8C). On top of it, I added the Sentinel-1 image settings and, for Sentinel-2, the removal of cloudy images.
Here, since we use only the 6 pm observations of Sentinel-1, 'ASCENDING' is selected for 'orbitProperties_pass'. If you set it to 'DESCENDING', you get the 6 am observations instead.
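The bit test inside `cloudMasking` above can also be illustrated in plain Python. In the Sentinel-2 QA60 band, bit 10 flags opaque clouds and bit 11 flags cirrus, and a pixel is kept only when both bits are zero:

```python
# Plain-Python version of the QA60 bit test used in cloudMasking:
# bit 10 = opaque cloud, bit 11 = cirrus; keep pixels with both zero.
CLOUD_BIT = 1 << 10
CIRRUS_BIT = 1 << 11

def is_clear(qa60: int) -> bool:
    return (qa60 & CLOUD_BIT) == 0 and (qa60 & CIRRUS_BIT) == 0

print(is_clear(0))                       # True: clear pixel
print(is_clear(CLOUD_BIT))               # False: opaque cloud
print(is_clear(CLOUD_BIT | CIRRUS_BIT))  # False: cloud and cirrus
```

The GEE version does the same test server-side with `bitwiseAnd(...).eq(0)` on every pixel of the image.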
Now, with the acquisition conditions above, we acquire the Sentinel-1 and Sentinel-2 images of the area of interest.
for i in range(imageList_s1.size().getInfo()):
    image = ee.Image(imageList_s1.get(i))
    ImageExport(image.reproject(crs='EPSG:4326', scale=10), image.get('system:index').getInfo(), dir_name_s1, region, 10)

for i in range(imageList_s2.size().getInfo()):
    image = ee.Image(imageList_s2.get(i))
    ImageExport(image.reproject(crs='EPSG:4326', scale=10), image.get('system:index').getInfo(), dir_name_s2, region, 10)
Let's display and check the acquired satellite images. They are saved in the directories (folders) set under My Drive on Google Drive, so we load them from there and display them. First, the Sentinel-2 observation images.
# Visualization in chronological order
s2_path = '/content/drive/My Drive/' + dir_name_s2 + '/'
files = os.listdir(s2_path)
files.sort()

plt.figure(figsize=(25, 25))
j = 0
v = len(files) // 5 + 1
for i in range(len(files)):
    # Load and visualize one scene at a time
    with rasterio.open(s2_path + files[i]) as src:
        arr = src.read()
    j += 1  # shift the subplot position for each image
    plt.subplot(v, 5, j)
    arrayImg = np.asarray(arr).transpose(1, 2, 0).astype(np.float32) * 2  # double the values as a simple brightness correction
    plt.imshow(arrayImg)
    plt.title(files[i][0:8])  # date taken from the file name
#plt.tight_layout()
** The resolution of the Sentinel-2 observation images is 10 m **, so individual car models cannot be identified. The Sentinel-2 images are not used for the evaluation this time, but they give an overview of what the parking lot looked like when each Sentinel-1 image was acquired. I have not visited Tokyo Disneyland for several years, so I do not know the current state, but a multi-storey car park or some other building (upper left) is being built in the parking lot, and construction appears to have been underway in January 2019, when the observation images were acquired. Next, the Sentinel-1 observation images are displayed.
# Visualization in chronological order
s1_path = '/content/drive/My Drive/' + dir_name_s1 + '/'
files = os.listdir(s1_path)
files.sort()

plt.figure(figsize=(20, 40))
j = 0
v = len(files) // 5 + 1
for i in range(len(files)):
    # Load and visualize one scene at a time
    with rasterio.open(s1_path + files[i]) as src:
        arr = src.read()
    j += 1  # shift the subplot position for each image
    plt.subplot(v, 5, j)
    plt.imshow(arr[0], cmap='gray')
    plt.title(files[i][33:41])  # date taken from the file name
plt.tight_layout()
SAR satellite images are grayscale, and they are hard to interpret on their own; the Sentinel-2 images checked earlier are useful for understanding them.
Next, the integrated value of the signal (brightness) of each observed image of Sentinel-1 is obtained, and the trend is acquired.
# Time-series graph of the total reflection intensity in the area
sum_signal = []
label_signal = []
for i in range(len(files)):
    # Load one scene at a time and accumulate its total signal
    with rasterio.open(s1_path + files[i]) as src:
        arr = src.read()
    sum_signal.append(arr.sum())
    label_signal.append(files[i][33:41])

# Visualization
fig, ax = plt.subplots(figsize=(15, 6))
plt.plot(sum_signal, marker='o')
ax.set_xticks(np.arange(0, len(files)))
ax.set_xticklabels(label_signal, rotation=90)
plt.title('Trend in parking lot usage at TDL.')
plt.xlabel('date')
plt.show()
The horizontal axis is the observation date and the vertical axis is the integrated signal of each observed image (the sum over the whole image). The absolute value has no physical meaning; the series is evaluated relative to the value on January 18, 2019. The value increases from January to March 2019. Looking at the Sentinel-2 images, a building (a multi-storey car park?) was constructed in the parking lot area during this period, and this structure presumably increased the SAR signal (a higher density of structures in the target area). After that, until the end of January 2020, there was no major change apart from small fluctuations, but by February 2020 the value dropped sharply and stayed low. This is considered to be ** the effect of the spread of the novel coronavirus **. In fact, ** Tokyo Disneyland suspended operation on February 29, 2020 **, and resumed on July 1. With the resumption the value increased, but it decreased again in August, which I think reflects the renewed spread of infection. Many visitors were presumably expected during the summer vacation, but the difficult situation seems to continue. I hope the theme park can return to being a fun place while working to prevent the spread of infection.
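The relative evaluation described above can be sketched as follows. The numbers here are invented for illustration; in the notebook you would divide the `sum_signal` list computed earlier by its first entry (the January 18, 2019 baseline).

```python
import numpy as np

# Invented values standing in for the summed SAR signals; the first
# entry plays the role of the 2019-01-18 baseline observation.
demo_signal = np.array([1.00e6, 1.05e6, 1.20e6, 1.18e6, 0.70e6, 0.72e6])
relative = demo_signal / demo_signal[0]  # 1.0 at the baseline date
print(np.round(relative, 2))
```

Plotting `relative` instead of the raw sums makes the graph read directly as "usage compared with the baseline date", which is the only quantity the raw sums can support.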
We introduced an example of satellite image acquisition and analysis using Google Earth Engine. This time we targeted parking lot usage at Disneyland, but ** the SAR signal increases as the number of structures in the target area increases **, so the same approach can also be applied to trends in urban development beyond parking lots. I hope this gives more people an interest in satellite imagery. If you have any comments or questions, please feel free to leave them.
How to get the latest satellite imagery for free.
I tried to automatically acquire the observation image of the artificial satellite (Sentinel-2) using API.
Create an estimated distribution map of cars from artificial satellite images by PyTorch.
[Sequel] Artificial satellite image analysis by Google Earth Engine and Google Colab - Satellite image analysis starting for free (practice) -
Artificial satellite image analysis by Google Earth Engine and Google Colab - Satellite image analysis starting for free (Introduction) -
[6. Local GEE execution by Python](https://sites.google.com/site/mizuochipublic/%E5%AE%9F%E8%B7%B5%E3%82%B3%E3%83%B3%E3%83%86%E3%83%B3%E3%83%84-remote-sensing-tutorials/google-earth-engine%E5%85%A5%E9%96%80/6-python%E3%81%AB%E3%82%88%E3%82%8B%E3%83%AD%E3%83%BC%E3%82%AB%E3%83%AB%E3%81%8B%E3%82%89%E3%81%AEgee%E5%AE%9F%E8%A1%8C)
The basics of satellite data ~ types, frequency, resolution, and application examples ~
Can you see people from artificial satellites? ~ Ground resolution and local observation time by satellite ~
Sentinel-1 SAR GRD: C-band Synthetic Aperture Radar Ground Range Detected, log scaling Sentinel-2 MSI: MultiSpectral Instrument, Level-1C