Hyogo Prefecture's coronavirus information site is now open to the public: https://stop-covid19-hyogo.org/ At the moment it runs independently of the GraphQL API I have published, but I would like to introduce it here because it is very helpful.
hyogo.covid19-api GraphQL Playground
** API breakdown **
readInfectedPeoples Query that returns the list of infected-person records. https://web.pref.hyogo.lg.jp/kk03/corona_kanjyajyokyo.html Every 30 minutes, the container downloads the Excel file of infected-person information posted on the page above, reads it, and inserts the data into PostgreSQL. The GraphQL server, built with gqlgen, then uses gorm to fetch the data from PostgreSQL and serve it.
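The 30-minute refresh cycle described above can be sketched as a simple polling loop. This is a minimal Python sketch, not the actual container code; `run_periodic` and the stand-in task are illustrative names:

```python
import time

def run_periodic(task, interval_sec, iterations, sleep_fn=time.sleep):
    """Call task() every interval_sec seconds, iterations times.

    The real container would loop forever; a finite iteration count
    keeps the sketch easy to demonstrate. sleep_fn is injectable so
    the loop can be exercised without actually waiting.
    """
    for _ in range(iterations):
        task()  # e.g. download the Excel file and load it into PostgreSQL
        sleep_fn(interval_sec)

# Hypothetical usage: a fake task and a no-op sleeper stand in for
# the real download-and-load job and the 30-minute wait.
calls = []
run_periodic(lambda: calls.append("tick"),
             interval_sec=30 * 60,
             iterations=3,
             sleep_fn=lambda s: None)
```

Injecting the sleep function is just a convenience for testing the schedule; in production the default `time.sleep` would be used.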
** Relationship between GraphQL fields and Excel files **
The columns after "Certified Children's Garden" in the Excel file can be read through a sub-query called cluster_places.
Number | Confirmed date | Age group | Sex | Jurisdiction | Residence | Occupation | Onset date | Travel history | Remarks |
---|---|---|---|---|---|---|---|---|---|
no | confirmed_date | age_group | sex | jurisdiction | residence | occupation | onset_date | travel_history | remarks |
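The column-to-field correspondence in the table above can be expressed as a lookup. This is a Python sketch, not the server's actual code; the keys are the English headings used in this article, and `row_to_record` is a hypothetical helper:

```python
# Excel column heading (English, per the table above) -> GraphQL field name.
HEADER_TO_FIELD = {
    "Number": "no",
    "Confirmed date": "confirmed_date",
    "Age group": "age_group",
    "Sex": "sex",
    "Jurisdiction": "jurisdiction",
    "Residence": "residence",
    "Occupation": "occupation",
    "Onset date": "onset_date",
    "Travel history": "travel_history",
    "Remarks": "remarks",
}

def row_to_record(headers, row):
    """Convert one Excel row (a list of cells) into a dict keyed by
    GraphQL field names, skipping any columns that are not mapped."""
    return {HEADER_TO_FIELD[h]: v
            for h, v in zip(headers, row)
            if h in HEADER_TO_FIELD}

# Hypothetical usage with a two-column row.
record = row_to_record(["Number", "Age group"], [1, "40s"])
```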
** Acquisition example **
** Example of a query that gets all fields ** There is also a UUID id field, but it is so long that it clutters the Playground output, so the query below omits it.
query{
  readInfectedPeoples{
    no
    confirmed_date
    age_group
    sex
    jurisdiction
    residence
    occupation
    onset_date
    travel_history
    remarks
    cluster_places{
      no
      label
      is_relation
    }
  }
}
readExamOverviews https://web.pref.hyogo.lg.jp/kk03/corona_hasseijyokyo.html Every 30 minutes, the container downloads the PDF file just below "Occurrence" on the page above, reads it, and inserts the data into PostgreSQL. The GraphQL server built with gqlgen uses gorm to fetch the data from PostgreSQL and serve it.
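The reading step described above could look something like the following once plain text has been extracted from the PDF (the extraction itself, e.g. with a PDF library, is out of scope here). This is a stdlib-only Python sketch; the "label count" line format is an assumption for illustration, not the PDF's actual layout:

```python
import re

def parse_overview_text(text):
    """Extract {no, label, count} records from lines like 'PCR tests 1234'.

    `text` stands in for the plain text pulled out of the downloaded
    PDF; real lines may need different patterns.
    """
    records = []
    for line in text.strip().splitlines():
        m = re.match(r"(.+?)\s+(\d+)$", line.strip())
        if m:
            records.append({
                "no": len(records) + 1,
                "label": m.group(1),
                "count": int(m.group(2)),
            })
    return records

# Hypothetical sample of extracted text.
sample = "PCR tests 1234\nPositive 56"
rows = parse_overview_text(sample)
```

The resulting dicts mirror the no / label / count fields that readExamOverviews returns.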
** Related data **
** Acquisition example **
** Example of query to get all fields **
query{
  readExamOverviews{
    no
    label
    count
  }
}
readExamDetails https://web.pref.hyogo.lg.jp/kf16/singatakoronakensa.html Every 30 minutes, the table tag below the bar chart on the page above is scraped and the extracted data is inserted into PostgreSQL. The GraphQL server built with gqlgen uses gorm to fetch the data from PostgreSQL and serve it.
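Scraping a table tag like the one described above can be sketched with nothing but the standard library. This is an illustrative Python sketch, not the actual scraper, and the sample markup is an assumption about the page's structure:

```python
import re

def parse_exam_table(html):
    """Pull {no, date, exam_count, positive_count} rows out of a <table>.

    A regex-based sketch: rows without exactly three <td> cells
    (e.g. the <th> header row) are skipped.
    """
    records = []
    for row in re.findall(r"<tr>(.*?)</tr>", html, re.S):
        cells = re.findall(r"<td>(.*?)</td>", row, re.S)
        if len(cells) == 3:
            records.append({
                "no": len(records) + 1,
                "date": cells[0].strip(),
                "exam_count": int(cells[1]),
                "positive_count": int(cells[2]),
            })
    return records

# Hypothetical sample markup for one header row and one data row.
sample = ("<table><tr><th>Date</th><th>Tests</th><th>Positive</th></tr>"
          "<tr><td>2020-03-23</td><td>40</td><td>2</td></tr></table>")
rows = parse_exam_table(sample)
```

A production scraper would typically use a real HTML parser instead of regexes, since the prefecture's markup can change.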
** Related data **
** Acquisition example **
** Example of query to get all fields **
query{
  readExamDetails{
    no
    date
    exam_count
    positive_count
  }
}
readPCROverviews An improved version of readExamOverviews. Until 2020/3/13, it accumulated the PDF information from https://web.pref.hyogo.lg.jp/kk03/corona_hasseijyokyo.html ; however, because that PDF is replaced every day, it was combined with the cumulative information at https://web.pref.hyogo.lg.jp/kf16/singatakoronakensa.html and some data was borrowed from https://covid-hyogo.now.sh (entered manually lol). Data accumulation started on March 23, 2020. Per the rules posted on the Hyogo Prefecture site, the date field holds the previous day's date, stored in ISO 8601 (RFC 3339) format. Note that until the data on the site is updated, the same data as announced the previous day is returned.
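The previous-day date rule can be made concrete with a small helper. This is a Python sketch under stated assumptions: `overview_date` is a hypothetical name, and rendering midnight JST is my assumption, since the article only says the previous day's date is stored in RFC 3339 format:

```python
from datetime import date, datetime, timedelta, timezone

# Japan Standard Time (UTC+9); an assumption about the API's timezone.
JST = timezone(timedelta(hours=9))

def overview_date(announced_on: date) -> str:
    """Return the `date` field value for a figure announced on
    `announced_on`: the previous day, as an RFC 3339 timestamp."""
    prev = announced_on - timedelta(days=1)
    return datetime(prev.year, prev.month, prev.day, tzinfo=JST).isoformat()

# A figure announced on 2020-03-23 carries the previous day's date.
stamp = overview_date(date(2020, 3, 23))
```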
** Related data **
** Acquisition example **
** Example of query to get all fields **
query{
  readPCROverviews{
    id
    date
    pcr_total
    pcr_positive_count
    hospitalized_count
    not_serious_count
    serious_count
    death_count
    discharge_count
  }
}
Added on 2020/3/23: Next I would like to try a GraphQL subscription. I have also finally become interested in CI on the operations side.