Last time, in [Let's draw a "weather map-like front" by machine learning based on weather data (2)](https://qiita.com/m-taque/items/988b08185097dca5f5b5), I summarized the visualization of the numerical data in GPV format. This time, I summarize how the frontal data used as teacher data is extracted.
The aim is to have the model look at the several types of weather-data images visualized last time and generate a front image whose fronts lie in the same locations as those on the preliminary weather map.
Specifically, we build a neural network that takes the meteorological data images for a given time (6 types, each 256x256 pixels) as input and outputs a single 256x256 image.
The teacher image for this output is a front image created from the preliminary weather map by converting cold-front elements to "blue", warm and occluded fronts to "red", and everything else to "white".
For each pixel of the output image, the network is trained with cross-entropy loss over these three colors, and the pixel is colored with the most probable class.
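As a rough illustration (not the author's actual code), the sketch below shows how the per-pixel class probabilities could be turned into the tricolor output. Here `probs` and `probs_to_front_image` are assumed names: `probs` is a (256, 256, 3) array of softmax probabilities for the classes red, blue, and white, and each pixel is colored with its most probable class.

```python
import numpy as np

# Minimal sketch: map per-pixel class probabilities to the tricolor image.
# `probs` is assumed to be a (256, 256, 3) array of softmax outputs for the
# classes red (warm/occluded front), blue (cold front) and white (no front),
# trained with categorical cross-entropy.
palette = np.array([[255,   0,   0],    # red   : warm / occluded front
                    [  0,   0, 255],    # blue  : cold front
                    [255, 255, 255]],   # white : everything else
                   dtype=np.uint8)

def probs_to_front_image(probs):
    """Color each pixel with its most probable class."""
    class_idx = probs.argmax(axis=-1)    # (256, 256) class index per pixel
    return palette[class_idx]            # (256, 256, 3) RGB front image
```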
Why generate only the fronts rather than the weather map itself?
Actually, at first I also tried to generate the weather map itself, but the result seemed to be dominated by the generation of the isobars, and it did not turn out the way I wanted.
The front elements on the preliminary weather map are cold fronts (blue), warm fronts (red), occluded fronts (pink), and stationary fronts (alternating blue and red).
This map (12:00 UTC, June 23, 2019) is rather lively: a stationary front extends from southern China through the East China Sea to the waters south of Japan, and the low-pressure system off Sanriku (moving at 30 km/h) is occluding, with an occluded front, a warm front, and a cold front extending from it.
From this map, only the front elements are cut out based on color.
Topographic features, latitude/longitude lines, and isobars are not needed, so black and green are excluded. The occluded front, rather than keeping its slight pink tinge, is converted to red. Symbols such as "low" and "high" still remain, but I turn a blind eye to that.
In the end, from the preliminary weather map I create an image in which cold-front elements are converted to "blue", warm and occluded fronts to "red", and everything else to "white", and use that as the teacher image.
I used PIL to cut the front elements out of the weather map based on color, referring to the following article.
Image pixel manipulation in Python-Qiita
```python:cripping.py
from PIL import Image
from keras.preprocessing.image import load_img, img_to_array, array_to_img

# t_imgfile: path to the preliminary weather-map image (defined beforehand)
t_img = load_img(t_imgfile)             # Load the weather map image
t_data = img_to_array(t_img) / 255      # Convert to a numeric array

# --- Mask front lines
m_data = 255 * t_data                   # Make an array of the same size, scaled back to 0-255
m_img = array_to_img(m_data)            # Convert back to an image
img_size = m_img.size

mask1 = Image.new('RGB', img_size)      # Generate an RGB image for the mask
for x in range(img_size[0]):
    for y in range(img_size[1]):
        r, g, b = m_img.getpixel((x, y))    # Read the RGB values of a pixel
        if r > g + 25:
            if r > b - 40:
                # Judged to be red (255, 0, 0)
                r, g, b = 255, 0, 0
            else:
                # Not red, so "white"
                r, g, b = 255, 255, 255
        elif b > r:
            if b > g + 25:
                # Judged to be blue (0, 0, 255)
                r, g, b = 0, 0, 255
            else:
                # Not blue, so "white"
                r, g, b = 255, 255, 255
        else:
            # Everything else is "white"
            r, g, b = 255, 255, 255
        mask1.putpixel((x, y), (r, g, b))   # Write the value into the mask image
```
The color discrimination includes a margin of 25 or 40 on the RGB values. When the front lines on the preliminary weather map are magnified, even a pixel that is nominally red is mixed at its boundary with the color of the surrounding sea or the green of the longitude lines. If these boundary pixels are not picked up when making the teacher image, the front becomes a thin, jagged line, so the judgment range includes some play. I settled on the margins by trial and error while reading RGB values with the "Digital Color Meter" utility that comes standard with macOS.
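For reference, a minimal usage sketch of the result (the output file name is a placeholder, not the author's actual path): the finished tricolor mask is saved and used as the teacher image for the corresponding time.

```python
# Save the tricolor mask as the teacher image (file name is a placeholder)
mask1.save('front_teacher.png')
```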
This post has summarized how the front elements, which serve as teacher data for the machine learning model that draws fronts automatically, are extracted from the weather map.
In fact, I do not have many color weather maps; going back in time, most of them are black and white, so it became necessary to create teacher data from the black-and-white versions as well.
Therefore, I first created a CNN that converts a black-and-white weather map (right) into a color weather map (left), as shown below, and then cut out the front elements with the cropping program explained in this post.
Next time, I will write about colorizing the black-and-white weather maps, which I did to increase the amount of teacher data.
[Next time: Let's draw a "weather map-like front" by machine learning based on weather data (4)](https://qiita.com/m-taque/items/80ba51b74167b2aa669e)