[PYTHON] I tried using jpholidayp over a proxy to run cron jobs only on weekdays

There is a convenient tool called jpholidayp for running cron jobs only on non-holiday weekdays.

For usage and details, the following article was easy to follow, so please refer to it: emasaka/jpholidayp was great when I tried to judge holidays with cron

However, jpholidayp fetches an external holiday calendar at run time, and in my environment outbound connections only work through a proxy, so I tweaked the code a little.

Add four lines between lines 49 and 50:


        else:
            # --- patch starts here ---
            proxy = {"http": "hoge-proxy:80"}
            proxy_handler = urllib2.ProxyHandler(proxy)
            opener = urllib2.build_opener(proxy_handler)
            urllib2.install_opener(opener)
            # --- patch ends here ---
            res = urllib2.urlopen(self.URL)
            dat = yaml.load(res)
            cache.set({"holiday_jp": dat})
        self.holiday_jp = dat
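jpholidayp is Python 2 code, hence `urllib2`. For reference, here is a minimal sketch of the same proxy setup in Python 3, where `urllib.request` replaces `urllib2` (the `hoge-proxy:80` address is the same placeholder as in the patch above):

```python
# Sketch only, assuming Python 3. install_opener() makes every later
# urlopen() call in the process go through the given proxy.
import urllib.request

def install_proxy(proxy_url):
    """Route all subsequent urlopen() calls through proxy_url."""
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    opener = urllib.request.build_opener(handler)
    urllib.request.install_opener(opener)

# "hoge-proxy:80" is the placeholder proxy from the patch above.
install_proxy("http://hoge-proxy:80")
```

After this runs, a plain `urllib.request.urlopen(url)` anywhere in the process goes through the proxy, which is the same global-opener trick the patch uses.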

reference: [Notes on regularly running a crawler built with Python's urllib2 under a proxy environment with cron - moguranosenshi](http://moguranosenshi.hatenablog.com/entry/2013/10/16/%E3%83%97%E3%83%AD%E3%82%AD%E3%82%B7%E7%92%B0%E5%A2%83%E4%B8%8B%E3%81%A7python%E3%81%AEurllib2%E3%81%A7%E4%BD%9C%E3%81%A3%E3%81%9F%E3%82%AF%E3%83%AD%E3%83%BC%E3%83%A9%E3%82%92cron%E3%81%A7)

Now register it in cron as is:


0 09 * * 1-5 /home/hoge/jpholidayp || /home/hoge/batch

and the batch runs only on weekdays (Monday to Friday) that are not holidays. Happy.
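This works because of the shell's `||` operator: the right-hand command runs only when the left-hand one exits with a nonzero status. Per the referenced article, jpholidayp exits 0 on holidays, so the batch is skipped on those days. A quick sketch with `true`/`false` standing in for jpholidayp:

```shell
# `cmd1 || cmd2` runs cmd2 only when cmd1 fails (nonzero exit status).
# `false` stands in for jpholidayp on a working day, `true` for a holiday.
false || echo "working day: batch runs"      # prints the message
true  || echo "holiday: batch is skipped"    # prints nothing
```

The `1-5` field in the crontab entry handles weekends, and the `||` handles holidays, so the two filters together give "non-holiday weekdays only".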

(But of course this has no way of knowing company-, school-, or organization-specific holidays. It would be nice if each organization provided its own holiday API.)
