Python vs Ruby "Deep Learning from scratch" Chapter 2 Logic circuit by Perceptron

Overview

Referring to the code in Chapter 2 of the book "Deep Learning from scratch: The theory and implementation of deep learning learned with Python", this article implements logic circuits with perceptrons (AND, NAND, OR, and XOR gates) in both Python and Ruby.

External libraries are used for the calculations: NumPy for Python and Numo::NArray for Ruby.
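
The only array operation these gates need is an element-wise product followed by a sum, i.e. a dot product. A minimal NumPy illustration of my own (the Ruby code below does the same thing with (w*x).sum):

import numpy as np

x = np.array([1, 0])      # inputs
w = np.array([0.5, 0.5])  # weights
print(w*x)                # element-wise product: [0.5 0. ]
print(np.sum(w*x))        # weighted sum: 0.5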

If you need to set up an environment, see: Python vs Ruby "Deep Learning from scratch" Chapter 1 Graph of sin and cos functions http://qiita.com/niwasawa/items/6d9aba43f3cdba5ca725

AND gate

The weights and bias are values derived by hand rather than learned.
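
As a quick check of those values (just the arithmetic, not from the book): the perceptron outputs 1 only when w1*x1 + w2*x2 + b > 0. With w1 = w2 = 0.5 and b = -0.7, the only input that clears the threshold is (1, 1), since 0.5 + 0.5 - 0.7 = 0.3 > 0; the inputs (1, 0) and (0, 1) give 0.5 - 0.7 = -0.2 and (0, 0) gives -0.7, all of which output 0.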

Python

and_gate.py

import numpy as np

def AND(x1, x2):
  x = np.array([x1, x2])
  w = np.array([0.5, 0.5]) # weight
  b = -0.7 # bias
  tmp = np.sum(w*x) + b
  if tmp <= 0:
    return 0
  else:
    return 1

if __name__ == '__main__':
  for xs in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    y = AND(xs[0], xs[1])
    print(str(xs) + " -> " + str(y))

Ruby

and_gate.rb

require 'numo/narray'

def AND(x1, x2)
  x = Numo::DFloat[x1, x2]
  w = Numo::DFloat[0.5, 0.5] # weight
  b = -0.7 # bias
  tmp = (w*x).sum + b
  if tmp <= 0
    0
  else
    1
  end
end

if __FILE__ == $0
  for xs in [[0, 0], [1, 0], [0, 1], [1, 1]]
    y = AND(xs[0], xs[1])
    puts "#{xs} -> #{y}"
  end
end

Execution result

Comparing the execution results of Python and Ruby shows that the same inputs produce the same outputs.

$ python and_gate.py
(0, 0) -> 0
(1, 0) -> 0
(0, 1) -> 0
(1, 1) -> 1

$ ruby and_gate.rb
[0, 0] -> 0
[1, 0] -> 0
[0, 1] -> 0
[1, 1] -> 1

NAND gate

The NAND gate implementation differs from the AND gate only in its weight and bias values.
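
Because only the parameters change, the shared structure can be made explicit with a single helper that takes the weights and bias as arguments. This is a minimal sketch of my own; the gate helper below is not part of the book's code:

import numpy as np

def gate(x1, x2, w, b):
  # Shared perceptron body: weighted sum plus bias, thresholded at 0
  x = np.array([x1, x2])
  return 0 if np.sum(w*x) + b <= 0 else 1

def NAND(x1, x2):
  return gate(x1, x2, np.array([-0.5, -0.5]), 0.7)

AND and OR would then differ only in the arguments passed to gate.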

Python

nand_gate.py

import numpy as np

def NAND(x1, x2):
  x = np.array([x1, x2])
  w = np.array([-0.5, -0.5])
  b = 0.7
  tmp = np.sum(w*x) + b
  if tmp <= 0:
    return 0
  else:
    return 1

if __name__ == '__main__':
  for xs in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    y = NAND(xs[0], xs[1])
    print(str(xs) + " -> " + str(y))

Ruby

nand_gate.rb

require 'numo/narray'

def NAND(x1, x2)
  x = Numo::DFloat[x1, x2]
  w = Numo::DFloat[-0.5, -0.5]
  b = 0.7
  tmp = (w*x).sum + b
  if tmp <= 0
    0
  else
    1
  end
end

if __FILE__ == $0
  for xs in [[0, 0], [1, 0], [0, 1], [1, 1]]
    y = NAND(xs[0], xs[1])
    puts "#{xs} -> #{y}"
  end
end

Execution result

$ python nand_gate.py
(0, 0) -> 1
(1, 0) -> 1
(0, 1) -> 1
(1, 1) -> 0

$ ruby nand_gate.rb
[0, 0] -> 1
[1, 0] -> 1
[0, 1] -> 1
[1, 1] -> 0

OR gate

The OR gate implementation also differs only in weight and bias values.
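
Again just checking the arithmetic: with w = (0.5, 0.5) and b = -0.2, a single active input is already enough, since 0.5 - 0.2 = 0.3 > 0, so every input except (0, 0) (which gives -0.2 <= 0) outputs 1.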

Python

or_gate.py

import numpy as np

def OR(x1, x2):
  x = np.array([x1, x2])
  w = np.array([0.5, 0.5])
  b = -0.2
  tmp = np.sum(w*x) + b
  if tmp <= 0:
    return 0
  else:
    return 1

if __name__ == '__main__':
  for xs in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    y = OR(xs[0], xs[1])
    print(str(xs) + " -> " + str(y))

Ruby

or_gate.rb

require 'numo/narray'

def OR(x1, x2)
  x = Numo::DFloat[x1, x2]
  w = Numo::DFloat[0.5, 0.5]
  b = -0.2
  tmp = (w*x).sum + b
  if tmp <= 0
    0
  else
    1
  end
end

if __FILE__ == $0
  for xs in [[0, 0], [1, 0], [0, 1], [1, 1]]
    y = OR(xs[0], xs[1])
    puts "#{xs} -> #{y}"
  end
end

Execution result

$ python or_gate.py
(0, 0) -> 0
(1, 0) -> 1
(0, 1) -> 1
(1, 1) -> 1

$ ruby or_gate.rb
[0, 0] -> 0
[1, 0] -> 1
[0, 1] -> 1
[1, 1] -> 1

XOR gate

XOR is not linearly separable, so a single perceptron cannot represent it; the XOR gate is therefore implemented as a multi-layer perceptron that stacks the NAND, OR, and AND gates above.
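
To see the two layers at work, the following trace (a supplementary check of my own, assuming and_gate.py, or_gate.py, and nand_gate.py from above are in the same directory) also prints the intermediate signals s1 and s2:

from and_gate  import AND
from or_gate   import OR
from nand_gate import NAND

for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
  s1 = NAND(x1, x2)  # first layer
  s2 = OR(x1, x2)    # first layer
  y  = AND(s1, s2)   # second layer
  print((x1, x2), "-> s1:", s1, "s2:", s2, "y:", y)

Only when exactly one input is 1 are both s1 and s2 equal to 1, which is why the final AND outputs 1 for (1, 0) and (0, 1) and 0 otherwise.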

Python

xor_gate.py

from and_gate  import AND
from or_gate   import OR
from nand_gate import NAND

def XOR(x1, x2):
  s1 = NAND(x1, x2)
  s2 = OR(x1, x2)
  y  = AND(s1, s2)
  return y

if __name__ == '__main__':
  for xs in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    y = XOR(xs[0], xs[1])
    print(str(xs) + " -> " + str(y))

Ruby

xor_gate.rb

require './and_gate'
require './or_gate'
require './nand_gate'

def XOR(x1, x2)
  s1 = NAND(x1, x2)
  s2 = OR(x1, x2)
  y  = AND(s1, s2)
  return y
end

if __FILE__ == $0
  for xs in [[0, 0], [1, 0], [0, 1], [1, 1]]
    y = XOR(xs[0], xs[1])
    puts "#{xs} -> #{y}"
  end
end

Execution result

$ python xor_gate.py
(0, 0) -> 0
(1, 0) -> 1
(0, 1) -> 1
(1, 1) -> 0

$ ruby xor_gate.rb
[0, 0] -> 0
[1, 0] -> 1
[0, 1] -> 1
[1, 1] -> 0

Reference material

Python vs Ruby "Deep Learning from scratch" Summary (Qiita) http://qiita.com/niwasawa/items/b8191f13d6dafbc2fede
