University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the task (15)

Previous post: University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (14)
https://github.com/legacyworld/sklearn-basic

Challenge 6.6 Logistic Regression and Log Likelihood

The explanation on YouTube is around the 20-minute mark of the 8th lecture (1). The task itself was easy, so I did a bit of extra programming on my own to better understand logistic regression. The task is to compute $E(w)$ over the training samples ($x_{1i}, x_{2i}$, $i = 1,2,\cdots,10$) for given weight vectors $w$. $E(w)$ is expressed as follows.

E(w) = -\sum_{n=1}^{N}\left\{t_n \ln \hat{t}_n + (1-t_n) \ln (1-\hat{t}_n)\right\}

In this example, $N = 10$ and $t_n = (1,0,0,1,1,1,0,1,0,0)$.
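Here $\hat{t}_n$ is the predicted probability for sample $n$, computed with the same sigmoid function used in the script below (the bias is included by appending a constant 1 to each $x_n$):

\hat{t}_n = \sigma(w \cdot x_n) = \frac{1}{1 + e^{-w \cdot x_n}}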

After computing $E(w)$ for the given weight vectors, I also fit a logistic regression to find $w$. The source code is below.

python:Homework_6.6.py


import numpy as np
from sklearn.linear_model import LogisticRegression

#Sigmoid function
def sigmoid(w,x):
    return 1/(1+np.exp(-np.dot(w,x)))

#Cross entropy loss
def cross_entropy_loss(w,x,y):
    y_sig = sigmoid(w,x)
    return -np.sum(y*np.log(y_sig)+(1-y)*np.log(1-y_sig),axis=1)

# Training samples: two features x1, x2, plus a column of ones so the bias is the last element of w
X = np.array([[1.5,-0.5],[-0.5,-1.0],[1.0,-2.5],[1.5,-1.0],[0.5,0.0],[1.5,-2.0],[-0.5,-0.5],[1.0,-1.0],[0.0,-1.0],[0.0,0.5]])
X = np.concatenate([X,np.ones(10).reshape(-1,1)],1)
# Target labels t_n
y = np.array([1,0,0,1,1,1,0,1,0,0])
# Three candidate weight vectors w1, w2, w3 (last element is the bias)
w = np.array([[6,3,-2],[4.6,1,-2.2],[1,-1,-2]])
loss = cross_entropy_loss(w,X.T,y)
print(f"E(w1) = {loss[0]:.3f} E(w2) = {loss[1]:.3f} E(w3) = {loss[2]:.3f}")

# Vary the inverse regularization strength C from 0.01 to 100
for c_value in [10**(a-2) for a in range(5)]:
    clf = LogisticRegression(C=c_value).fit(X,y)
    # Combine the learned coefficients for x1, x2 with the intercept
    w = np.array([[clf.coef_[0][0],clf.coef_[0][1],clf.intercept_[0]]])
    print(f"C = {c_value} w = {w} E(w) = {cross_entropy_loss(w,X.T,y)}")

Execution result

E(w1) = 1.474 E(w2) = 1.832 E(w3) = 6.185
C = 0.01 w = [[ 0.02956523  0.00018875 -0.01756914]] E(w) = [6.84341713]
C = 0.1 w = [[ 0.26242317  0.01451582 -0.1445077 ]] E(w) = [6.19257501]
C = 1 w = [[ 1.38391039  0.32530732 -0.55198479]] E(w) = [3.91381807]
C = 10 w = [[ 3.9100986   1.36910424 -1.28870173]] E(w) = [1.77721899]
C = 100 w = [[ 9.40098848  3.40849535 -3.23672119]] E(w) = [0.57516562]

In the logistic regression part, the L2 regularization parameter C is varied from 0.01 to 100, and the cross-entropy loss is displayed for each fitted $w$. You can see that where regularization has little effect (large C), the absolute values of $w$ naturally grow and the cross-entropy loss becomes smaller.
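For reference, scikit-learn's L2-regularized logistic regression minimizes an objective of roughly this form (as described in its documentation), which explains the behavior: C multiplies the data term, so the larger C is, the less the penalty $\frac{1}{2}w^Tw$ restrains $w$.

\min_{w,b}\ \frac{1}{2} w^T w + C \sum_{n=1}^{N} \ln\!\left(1 + e^{-y_n (w \cdot x_n + b)}\right), \quad y_n \in \{-1, +1\}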

By the way, if regularization is not applied, the result is as follows:

Homework_6.6.py:13: RuntimeWarning: divide by zero encountered in log
  return -np.sum(y*np.log(y_sig)+(1-y)*np.log(1-y_sig),axis=1)
Homework_6.6.py:13: RuntimeWarning: invalid value encountered in multiply
  return -np.sum(y*np.log(y_sig)+(1-y)*np.log(1-y_sig),axis=1)
No regularization w = [[57.89037518 20.53048228 -9.91476711]] E(w) = [nan]
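The script above doesn't include the code for this run; a minimal sketch of how it might be done, assuming a scikit-learn version that accepts `penalty='none'` (newer versions use `penalty=None`), reusing the `X`, `y`, and `cross_entropy_loss` defined in Homework_6.6.py:

# Sketch: fit without regularization (reuses X, y, cross_entropy_loss from above)
clf = LogisticRegression(penalty='none').fit(X, y)  # penalty=None on recent scikit-learn
w = np.array([[clf.coef_[0][0], clf.coef_[0][1], clf.intercept_[0]]])
print(f"No regularization w = {w} E(w) = {cross_entropy_loss(w,X.T,y)}")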

The absolute values of $w$ become so large that the sigmoid returns exactly 1 (or 0) in floating point, so the corresponding log cannot be computed and the loss becomes `nan`.
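A common workaround (not part of the original script) is to clip the sigmoid output away from 0 and 1 before taking the log; a minimal sketch, reusing `sigmoid` and `numpy` from the script above:

# Sketch: numerically safer cross-entropy that clips predictions away from 0 and 1
def cross_entropy_loss_safe(w, x, y, eps=1e-15):
    y_sig = sigmoid(w, x)
    y_sig = np.clip(y_sig, eps, 1 - eps)  # avoid log(0)
    return -np.sum(y * np.log(y_sig) + (1 - y) * np.log(1 - y_sig), axis=1)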

Past posts

University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (1)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (2)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (3)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (4)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (5)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (6)
University of Tsukuba Machine Learning Course: Study sklearn while making the Python script part of the task (7) Make your own steepest descent method
University of Tsukuba Machine Learning Course: Study sklearn while making the Python script part of the task (8) Make your own stochastic steepest descent method
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (9)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (10)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (11)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (12)
University of Tsukuba Machine Learning Course: Study sklearn while creating the Python script part of the assignment (13)
https://github.com/legacyworld/sklearn-basic
https://ocw.tsukuba.ac.jp/course/systeminformation/machine_learning/
