R&D life with IPython Notebook

This is a memo on doing R&D from home or from Starbucks using IPython Notebook, a web-based interface for Python. I will update it whenever I feel like it.

## Allow access to notebooks on a remote server

You can start your R&D from Starbucks at any time by launching the notebook server on your workstation at the office. See here for setup instructions. `c.IPKernelApp.pylab = 'inline'` is especially important: it makes figures drawn with matplotlib appear inline in the notebook.
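The linked setup steps are not reproduced in the post; as a rough sketch, the relevant profile configuration (an IPython 1.x/2.x-era `ipython_notebook_config.py`) might look like the following. The port and password hash are placeholders; adapt the keys to your IPython version.

```python
# ipython_notebook_config.py -- sketch of a remote-access setup
c = get_config()

c.NotebookApp.ip = '*'                # listen on all interfaces, not just localhost
c.NotebookApp.port = 8888             # port to serve the notebook on
c.NotebookApp.open_browser = False    # don't open a browser on the server
c.NotebookApp.password = u'sha1:...'  # hash generated with IPython.lib.passwd()
c.IPKernelApp.pylab = 'inline'        # show matplotlib figures inline
```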

## Run the program

Create a new notebook for each experiment so you don't get confused. The following magic commands are useful for debugging.

`%time` — measure execution time

`%prun` — run the profiler

`%debug` — start the debugger

After a program stops with an error, run `%debug` in a cell to start `ipdb` at the point of failure.
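As a sketch, a debugging session might look like this in notebook cells (these are IPython magics, so they only work inside IPython, not in plain Python):

```
%time result = sum(range(10**6))              # time a single statement

%prun sorted(range(10**6), key=lambda x: -x)  # profile a call

%debug                                        # after an exception, open ipdb at the failing frame
```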

## Calculate in parallel

There are several ways to do this. I think this page is also a helpful reference.

### 1. multiprocessing.Pool

See here for reference.

```python
from multiprocessing import Pool

p = Pool(n_cores)        # n_cores: number of worker processes
p.map(func, arg_list)    # apply func to each element of arg_list in parallel
```

You can calculate in parallel like this. Note that if you press the interrupt or restart button (![kobito.1408940334.347506.png](https://qiita-image-store.s3.amazonaws.com/0/69/1461469f-0ba1-0986-0431-7a8f0e745f23.png "kobito.1408940334.347506.png")) while a parallel computation is running, the workers will usually turn into zombies. When that happens, I kill everything with something like `pkill python` and then start the notebook again.
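A minimal runnable sketch of the pattern above — the function and argument list are illustrative stand-ins:

```python
from multiprocessing import Pool

def square(x):
    """Work to be done in each worker process."""
    return x ** 2

if __name__ == '__main__':
    # Pool(4): four worker processes; map distributes the arguments across them
    with Pool(4) as p:
        results = p.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Note that the function passed to `map` must be picklable (defined at module top level), which is why a plain `def` is used rather than a lambda.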

### 2. IPython.parallel

IPython itself has a framework for parallel computing, so using it is another option. First launch IPython Notebook, select the Clusters tab on the dashboard that appears first (screenshot: kobito.1408941346.713972.png), enter the desired number of workers in "# of engines", and press Start. In your program, write the following.

```python
from IPython.parallel import Client

def test(x):
    """Function to be computed in parallel on each engine."""
    return x ** 2

cli = Client()                 # connect to the running cluster
dv = cli[:]                    # a DirectView over all engines
x = dv.map(test, range(100))   # distribute the calls across the engines
result = x.get()               # block until all results are ready
```

### 3. Run the program itself in parallel

If you want to run the same function many times in parallel with different arguments, methods 1 and 2 work well, but if you want to run a large program in parallel, it is easier to parallelize by launching background jobs from a shell script. In other words, execute a shell script containing several lines of the form `python program.py arg1 arg2 ... &`. You can also run a bash script in a notebook cell with the `%%bash` cell magic. You might object, "All this talk about IPython Notebook, and in the end it's just the shell!" For example (screenshot: kobito.1408943333.067115.png):

Changing this a little (screenshot: kobito.1408943371.767258.png):
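The original screenshots are missing, so here is a rough sketch of such a cell. Put the loop after a `%%bash` line in a notebook cell, or save it as a script; the inline `python3 -c` call stands in for the `python program.py arg1 ... &` invocations from the text:

```shell
# launch one background run of the "program" per argument
for arg in 1 2 3; do
    python3 -c "import sys; print(int(sys.argv[1]) ** 2)" "$arg" &
done
wait  # block until all background jobs have finished
```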

You can parallelize easily (?) like this. If you want to use command-line arguments in Python, create an `if __name__ == '__main__':` block and read them from `sys.argv`.
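A minimal sketch of what such a `program.py` might look like — the function name and argument handling are illustrative:

```python
# program.py -- sketch: read command-line arguments via sys.argv
import sys

def main(args):
    """Stand-in for the real work; here we just join the arguments."""
    return ' '.join(args)

if __name__ == '__main__':
    # sys.argv[0] is the script name; the real arguments start at index 1
    print(main(sys.argv[1:]))
```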

## Use SSH from Chrome

A small program can be written entirely in IPython Notebook, but once a program gets messy it is better to move it into a .py file, so inevitably you will need to SSH into the workstation. I use Secure Shell (link).

## Manage Python packages

I use Canopy. It is convenient: OpenCV, for example, can be installed with one click. You can use the `enpkg` command to manage packages from the terminal ([link](https://support.enthought.com/entries/22415022-Using-enpkg-to-update-Canopy-EPD-packages)). Mac users may run into keychain issues, so see [here](http://stackoverflow.com/questions/14719731/keychain-issue-when-trying-to-set-up-enthought-enpkg-on-mac-os-x) as well.
