[PYTHON] Before the coronavirus, I first tried SARS analysis

Introduction

Staying at home because of the recent coronavirus epidemic is boring, and I want to get out as soon as possible. So, to get a sense of how the coronavirus might trend in the future, I decided to start by analyzing SARS, a virus of the same family.

Future outlook

  1. Because of the coronavirus epidemic, analyze SARS, a virus in the same family
  2. After that, analyze the coronavirus
  3. Compare the two (a rough sketch of what that comparison could look like follows this list)
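Step 3 is still in the future, but as a minimal sketch of the eventual comparison, the SARS data created below could be overlaid with a coronavirus CSV that has the same column layout. The coronavirus filename here is only a placeholder, not an actual file from this article.


import pandas as pd
import matplotlib.pyplot as plt

sars = pd.read_csv('analytics_SARS.csv')
corona = pd.read_csv('analytics_corona.csv')  # hypothetical file with the same columns

plt.plot(sars['day'], sars['total'], label='SARS')
plt.plot(corona['day'], corona['total'], label='coronavirus')
plt.xlabel('day')
plt.ylabel('cumulative infections')
plt.legend()
plt.show()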

First of all

I had no data to analyze, so I started by searching for some. It turned out that no good CSV data exists, so I searched for SARS data on the WHO site and created a CSV file myself.

Actually created CSV

(Screenshot: the created CSV, showing the data up to day 10.) Since I created the file myself, there are very few missing values, which is a good start.
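Because the file was assembled by hand, a quick sanity check on the columns and missing values is worth doing. This is only a minimal sketch: the column names are taken from the analysis below, and the exact header of the original file is an assumption.


import pandas as pd

df1 = pd.read_csv('analytics_SARS.csv')
expected = ['day', 'total', 'total-change', 'death', 'death-change',
            'recovery', 'recovery_change']
print(set(expected) - set(df1.columns))  # expected columns that are missing, if any
print(df1.isnull().sum())                # missing values per column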

Actually analyze

Now I would like to move on to the actual analysis.

1. Library

SARS.ipynb


import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

2. Load and display the CSV

SARS.ipynb


df1 = pd.read_csv('analytics_SARS.csv')  # load the hand-made SARS data
df1.head()                               # show the first five rows
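Beyond head(), a couple of one-liners give a feel for the size and spread of the data. A small optional check, continuing in the same notebook where df1 is already defined:


print(df1.shape)  # number of rows (days) and columns
df1.describe()    # summary statistics for each numeric column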

3. See the increase in the number of infected people

SARS.ipynb


x_num = df1['day']
y_num = df1['total']    # cumulative number of infected people
plt.plot(x_num, y_num)  # show the graph
(Screenshot: graph of the cumulative number of infected people.) There is an upward trend until about day 60, after which the increase in the number of infected people slows down significantly.
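To put a rough number on that slowdown, the day-over-day growth rate of the cumulative total can be checked. A small sketch, continuing in the same notebook:


growth_rate = df1['total'].pct_change()  # relative day-over-day growth of the cumulative total
print(growth_rate.tail(20))              # the rate shrinks toward zero in the later days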

4. Let's see the increase and decrease in the number of infected people per day

SARS.ipynb


x_num = df1['day']
y_num = df1['total-change']  # daily change in the number of infected people
plt.plot(x_num, y_num)       # show the graph
(Screenshot: graph of the daily change in the number of infected people.) The fluctuations in the early period are probably due to infections being reported gradually from newly affected countries.
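Out of curiosity, the day with the largest single-day jump can be pulled out directly. A small sketch, continuing in the same notebook:


peak = df1.loc[df1['total-change'].idxmax()]  # row with the largest daily increase
print(peak['day'], peak['total-change'])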

5. See the increase in deaths

SARS.ipynb


x_num = df1['day']
y_num = df1['death']    # cumulative number of deaths
plt.plot(x_num, y_num)  # show the graph
(Screenshot: graph of the cumulative number of deaths.) As many as 800 people died in about 100 days. In other words, by a simple calculation, roughly eight people died per day.
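That back-of-the-envelope figure can be checked directly from the data. A small sketch, continuing in the same notebook and assuming 'death' is a cumulative count:


deaths_per_day = df1['death'].iloc[-1] / df1['day'].iloc[-1]  # total deaths divided by the number of days
print(deaths_per_day)                                         # roughly 800 / 100 = 8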

6. See the increase and decrease in deaths per day

SARS.ipynb


x_num = df1['day']
y_num = df1['death-change']  # daily change in the number of deaths
plt.plot(x_num, y_num)       # show the graph
(Screenshot: graph of the daily change in the number of deaths.) The number of deaths per day swings up and down. This seems to be due to the cycle in which reports of deaths arrive.
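One way to look past that reporting cycle is to smooth the daily series with a rolling mean. A small sketch, continuing in the same notebook; the 7-day window is an arbitrary choice:


smoothed = df1['death-change'].rolling(window=7).mean()  # 7-day rolling average of daily deaths
plt.plot(df1['day'], smoothed)                           # show the smoothed graph
plt.show()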

Mortality

SARS.ipynb


df1['death'].sum() / df1['total'].sum()  # ratio of the summed death column to the summed total column

It turns out that the case fatality rate is 0.08121591227553301, or about 8%.
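Note that both columns are cumulative, so summing them weights every day of the outbreak. An alternative reading, not the calculation used in this article, is to take only the final day's cumulative values; a minimal sketch:


df1['death'].iloc[-1] / df1['total'].iloc[-1]  # deaths versus infections on the last recorded day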

7. Look at the number of recovered people

SARS.ipynb


x_num = df1['day']
y_num = df1['recovery']  # cumulative number of recovered people
plt.plot(x_num, y_num)   # show the graph
(Screenshot: graph of the cumulative number of recovered people.) Recovered people start to appear gradually from around day 20, and from there the number of recoveries increases significantly.
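To see the recoveries catching up with the infections, the two cumulative curves can be drawn on the same axes. A small sketch, continuing in the same notebook:


plt.plot(df1['day'], df1['total'], label='total')        # cumulative infections
plt.plot(df1['day'], df1['recovery'], label='recovery')  # cumulative recoveries
plt.legend()
plt.show()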

8. Let's see the increase and decrease in the number of recovered people per day

SARS.ipynb


x_num = df1['day']
y_num = df1['recovery_change']  # daily change in the number of recovered people
plt.plot(x_num, y_num)          # show the graph
(Screenshot: graph of the daily change in the number of recovered people.) A sharp rise can be seen around day 20. This seems to be related to the number of recoveries each country reported, but on average the number of recoveries per day looks stable.

Summary

This time I deliberately kept the analysis simple. Even with simple code and simple calculations, I think the data became much easier to understand just by graphing it. In the future, I would like to do more advanced analysis, as well as the coronavirus analysis itself.
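As a small appendix to that point about graphing, the four main series can also be placed side by side in a single figure. A sketch, continuing in the same notebook:


fig, axes = plt.subplots(2, 2, figsize=(10, 6))
axes[0, 0].plot(df1['day'], df1['total'])
axes[0, 0].set_title('total')
axes[0, 1].plot(df1['day'], df1['total-change'])
axes[0, 1].set_title('total-change')
axes[1, 0].plot(df1['day'], df1['death'])
axes[1, 0].set_title('death')
axes[1, 1].plot(df1['day'], df1['recovery'])
axes[1, 1].set_title('recovery')
plt.tight_layout()
plt.show()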
