[PYTHON] Deep Learning from scratch

I executed the code while reading Deep Learning from Scratch, and I will write down the places where I got stuck.

Environment

- Windows 10
- Ubuntu on Windows
- Python 3

Building a Python environment

numpy was already included, but matplotlib was not, so I installed it with a command: $ sudo apt-get install python3-matplotlib
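As a quick sanity check (just my own snippet, not from the book), both libraries should now import under python3:

```python
# Confirm that NumPy and Matplotlib are importable after the install
import numpy as np
import matplotlib

print(np.__version__, matplotlib.__version__)
```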

1.6.1 Drawing a simple graph

show() could not be used; I needed to install Xming. http://qiita.com/makky0620/items/e31edc90f22340d791ff

Download Xming from the following site: https://ja.osdn.net/projects/sfnet_xming/

I ran the two files, Xming-6-9-0-31-setup.exe and Xming-fonts-7-7-0-10-setup.exe, and installed them.

When you hover the mouse over the Xming icon in the taskbar at the lower right, "Xming server: 0.0" is displayed (as on the reference site). Set this in the environment variable: $ export DISPLAY=localhost:0.0

Running $ xclock displayed the clock successfully.

Graphs are now displayed by show().
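For reference, a minimal plotting script in the spirit of this section (my own sketch of the book's example) now opens a window through Xming:

```python
import numpy as np
import matplotlib.pyplot as plt

# Generate data from 0 to 6 in steps of 0.1 and draw a sine curve
x = np.arange(0, 6, 0.1)
y = np.sin(x)

plt.plot(x, y)
plt.show()  # pops up an Xming window now that DISPLAY is set
```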

1.6.2 legend() seems to be a function that adds a legend to a plot.
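Roughly what this section's example looks like (a sketch from memory, so details may differ from the book):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(0, 6, 0.1)
y1 = np.sin(x)
y2 = np.cos(x)

plt.plot(x, y1, label="sin")
plt.plot(x, y2, linestyle="--", label="cos")  # draw cos with a dashed line
plt.xlabel("x")
plt.ylabel("y")
plt.legend()  # legend() picks up the label= strings given to plot()
plt.show()
```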

For some reason the keyboard layout was US, probably because I installed Xming.

1.6.3 An error occurred when loading and displaying the image

Traceback (most recent call last):
  File "img.py", line 5, in <module>
    plt.imshow(img)
  File "/usr/lib/python3/dist-packages/matplotlib/pyplot.py", line 2881, in imshow
    ax = gca()
  File "/usr/lib/python3/dist-packages/matplotlib/pyplot.py", line 803, in gca
    ax =  gcf().gca(**kwargs)
  File "/usr/lib/python3/dist-packages/matplotlib/pyplot.py", line 450, in gcf
    return figure()
  File "/usr/lib/python3/dist-packages/matplotlib/pyplot.py", line 423, in figure
    **kwargs)
  File "/usr/lib/python3/dist-packages/matplotlib/backends/backend_tkagg.py", line 79, in new_figure_manager
    return new_figure_manager_given_figure(num, figure)
  File "/usr/lib/python3/dist-packages/matplotlib/backends/backend_tkagg.py", line 87, in new_figure_manager_given_figure
    window = Tk.Tk()
  File "/usr/lib/python3.4/tkinter/__init__.py", line 1854, in __init__
    self.tk = _tkinter.create(screenName, baseName, className, interactive, wantobjects, useTk, sync, use)
_tkinter.TclError: no display name and no $DISPLAY environment variable

I thought something was wrong, but I had simply forgotten `export DISPLAY=localhost:0.0`. The setting is lost every time the terminal is restarted, so I wrote it in .bashrc.
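A small check I find handy (my own sketch, not from the book): confirm from Python that DISPLAY is visible before calling show().

```python
import os

# plt.show() needs DISPLAY to point at the Xming server
display = os.environ.get("DISPLAY")
print(display)  # expect "localhost:0.0"; None means the export in .bashrc was not applied
```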

3.6.1 The book says that mnist.py is in the dataset directory, but what is the dataset directory? It says to use the Anaconda distribution; is the problem that I did not install it? There was a link to GitHub from the book's introduction page on O'Reilly's site, https://github.com/oreilly-japan/deep-learning-from-scratch, so I downloaded it: $ git clone https://github.com/oreilly-japan/deep-learning-from-scratch.git
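After cloning, the dataset directory sits at the top of the repository, so a script placed in one of the chapter directories (e.g. ch03) can import it the way the book describes. A sketch, assuming that layout:

```python
import sys, os
sys.path.append(os.pardir)  # make the repository's dataset/ package importable from a chXX directory
from dataset.mnist import load_mnist

# The first call downloads MNIST and caches it with pickle; later calls are fast
(x_train, t_train), (x_test, t_test) = load_mnist(flatten=True, normalize=False)
print(x_train.shape)  # (60000, 784)
print(t_train.shape)  # (60000,)
```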

Images cannot be displayed using PIL in this environment. You can save to a file with pil_img.save("output.png"). It seems images can be displayed by installing something called ImageMagick. http://d.hatena.ne.jp/kimihito/20120508/1336461904 $ sudo apt-get install imagemagick
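What I ended up doing, roughly (the book's mnist_show.py with show() replaced by save(), so treat it as a sketch):

```python
import sys, os
sys.path.append(os.pardir)
import numpy as np
from PIL import Image
from dataset.mnist import load_mnist

(x_train, t_train), _ = load_mnist(flatten=True, normalize=False)
img = x_train[0].reshape(28, 28)      # restore the flattened vector to a 28x28 image

pil_img = Image.fromarray(np.uint8(img))
pil_img.save("output.png")            # then view it with ImageMagick: $ display output.png
```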

Because I had copied only part of the code, an error occurred saying that pickle was undefined. Add `import pickle`.

4.4.1 Hyperparameters are different from the neural network's parameters and seem to be adjusted manually. I wonder whether these can also be found automatically.
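One common answer (not from this chapter; a hedged sketch using a hypothetical train_and_evaluate helper) is to search over candidate values and keep whichever scores best on validation data:

```python
# Naive hyperparameter search: try several learning rates and keep the best one.
# train_and_evaluate() is a hypothetical stand-in for one full training run that
# returns accuracy on validation data.
candidates = [0.001, 0.01, 0.1, 1.0]

best_lr, best_acc = None, 0.0
for lr in candidates:
    acc = train_and_evaluate(learning_rate=lr)  # hypothetical helper
    if acc > best_acc:
        best_lr, best_acc = lr, acc

print(best_lr, best_acc)
```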

4.4.2 I found a (trivial) mistake in the text. I confirmed that it is present in both the 1st and 6th printings.

P. 111


```python
>>> net = simpleNet()
>>> print(net.W)  # Weight parameter
[[ 0.47355232  0.9977393   0.84668094]
 [ 0.85557411  0.03563661  0.69422093]]
>>>
>>> x = np.array([0.6, 0.9])
>>> p = net.predict(x)
>>> print(p)
[ 1.13282549  0.66052348  1.20919114]
```
The above interactive session is given in the book, but recomputing net.predict(x) from the net.W shown there actually gives `[1.05414809 0.63071653 1.1328074]`.
Since net.W is generated from random numbers, my guess is that the code was run once and net.predict(x) was copied into the manuscript, then net.W was regenerated and copied in separately.
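For reference, simpleNet draws W from a Gaussian every time it is constructed (roughly as in the repository's ch04/gradient_simplenet.py), which is exactly why the printed values change from run to run:

```python
import sys, os
sys.path.append(os.pardir)
import numpy as np
from common.functions import softmax, cross_entropy_error

class simpleNet:
    def __init__(self):
        self.W = np.random.randn(2, 3)  # weights initialized with Gaussian random numbers

    def predict(self, x):
        return np.dot(x, self.W)

    def loss(self, x, t):
        z = self.predict(x)
        y = softmax(z)
        return cross_entropy_error(y, t)
```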


lambda is a syntax for creating anonymous functions.
`f = lambda w: net.loss(x, t)`
is probably equivalent to:

```python
def f(w):
    return net.loss(x, t)
```
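For context, the book then passes this f to numerical_gradient to get the gradient of the loss with respect to the weights. A usage sketch, assuming common/gradient.py from the cloned repository is importable:

```python
import sys, os
sys.path.append(os.pardir)
import numpy as np
from common.gradient import numerical_gradient

x = np.array([0.6, 0.9])
t = np.array([0, 0, 1])             # dummy one-hot label
net = simpleNet()

f = lambda w: net.loss(x, t)        # w is ignored; numerical_gradient perturbs net.W in place
dW = numerical_gradient(f, net.W)
print(dW)                           # same shape as net.W: the gradient for each weight
```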

