[Python] Understanding Logistic Regression (1): Odds and the Logit Transformation

Introduction

This article summarizes odds and the logit transformation, two prerequisites for understanding logistic regression.

References

I referred to the following while learning about odds and the logit transformation.

- Logistic Regression Analysis (5) ─ Inverse transformation of the logistic transformation
- Understanding Statistical Analysis ─ Overview of Logistic Regression Analysis

Logistic regression

Logistic regression overview

Logistic regression is an algorithm **often used to estimate the probability that a piece of data belongs to a particular class**.

An ordinary linear regression model is used when the objective variable is quantitative (such as a store's sales for a month), but it cannot be applied directly when the objective variable is qualitative (whether or not an email is spam, whether or not a blood type is type A). **Even when the objective variable is qualitative, the generalized linear model (GLM) extends the idea of linear regression so that it can still be applied, and logistic regression is one such model**.

Logistic regression itself is expressed by the following formula ($\beta$ are the parameters and $p$ is the output probability).


\log(\frac{p}{1-p})=\beta_{0} + \beta_{1}x_{1} + \beta_{2}x_{2} + ...+ \beta_{n}x_{n}

The following sections summarize odds and the logit transformation, which are prerequisites for understanding this formula.

Odds

Overview of odds

A concept called "odds" comes up when working with logistic regression. The odds are the ratio of the probability that an event occurs to the probability that it does not. If the probability that the event occurs is $p$, the odds are given by the following formula.


\frac{p}{1-p}

As a concrete example, suppose your team plays five baseball games and wins one (the opponent wins four). The odds of your team winning can then be found as below.


\frac{\frac{1}{5}}{1-\frac{1}{5}}=0.25
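This calculation is easy to check in code. A minimal sketch (the helper name `odds` is my own, not from the original):

```python
def odds(p):
    """Return the odds p / (1 - p) for a probability p."""
    return p / (1 - p)

# 1 win out of 5 games -> p = 1/5
print(odds(1 / 5))  # → 0.25
```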

Taking the logarithm of the odds

The logarithm of the above odds is often taken and used in calculations. Here is what taking the logarithm of the odds lets us represent.

To see the significance of log odds, consider two patterns: when your team is very weak and when it is very strong.

When your team is very weak

When your team is very weak, the odds get **closer and closer to 0**. **However, the odds can never be mathematically less than 0**.

・ When 1 win and 8 losses


\frac{\frac{1}{9}}{1-\frac{1}{9}}=0.125

・ When 1 win and 16 losses


\frac{\frac{1}{17}}{1-\frac{1}{17}}=0.0625

When your team is very strong

When your team is very strong, the odds grow **from 1 toward infinity**.

・ When 8 wins and 1 loss


\frac{\frac{8}{9}}{1-\frac{8}{9}}=8

・ When 16 wins and 1 loss


\frac{\frac{16}{17}}{1-\frac{16}{17}}=16
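All four records above can be computed with a small helper (the function name `odds_from_record` is my own). Note that for a record of $w$ wins and $l$ losses, $p = w/(w+l)$, so the odds $p/(1-p)$ reduce to $w/l$:

```python
def odds_from_record(wins, losses):
    """Odds of winning given a win/loss record: p/(1-p), which reduces to wins/losses."""
    p = wins / (wins + losses)
    return p / (1 - p)

for w, l in [(1, 8), (1, 16), (8, 1), (16, 1)]:
    print(f"{w} wins, {l} losses -> odds = {odds_from_record(w, l):.4f}")
# Weak records give odds between 0 and 1; strong records give odds above 1.
```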

Comparing the two examples, **the odds when your team is weak fall between 0 and 1, while the odds when it is strong range from 1 to ∞, so the two cannot be compared on the same scale**. The odds for 1 win and 8 losses are $0.125$, while the odds for 8 wins and 1 loss are $8$.

The logarithm aligns the scale

In fact, this scale problem can be solved by taking the logarithm.

・ Logarithm of the odds for 1 win and 8 losses


\log(0.125)\fallingdotseq-2.079

・ Logarithm of the odds for 8 wins and 1 loss


\log(8)\fallingdotseq2.079

Aligning the scale by taking the logarithm is very important in logistic regression.
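The symmetry above is no accident: the odds of mirror-image records are reciprocals ($0.125 = 1/8$), and $\log(1/x) = -\log(x)$, so the log odds land at equal distances on either side of 0. A quick numerical check:

```python
import numpy as np

# Log odds of mirror-image records sit symmetrically around 0.
log_odds_weak = np.log(0.125)   # 1 win, 8 losses
log_odds_strong = np.log(8)     # 8 wins, 1 loss
print(log_odds_weak, log_odds_strong)  # one is the negative of the other
```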

Logit transformation

Like an ordinary linear regression model, logistic regression computes a weighted sum of the explanatory variables (with a bias term added), but **this sum models the logit-transformed objective variable rather than the objective variable itself**. And the **logit transformation** of the objective variable is equal to **taking the logarithm of its odds**, as explained earlier.


\log(\frac{p}{1-p})

Logistic regression treats a probability as the objective variable, but since probabilities fall between 0 and 1, they are inconvenient to fit with an ordinary linear regression. **The logit transformation maps these values onto the range -∞ to ∞, which eliminates this inconvenience.**

The formula for logistic regression can be expressed as follows.


\log(\frac{p}{1-p})=\beta_{0} + \beta_{1}x_{1} + \beta_{2}x_{2} + ...+ \beta_{n}x_{n}
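This relationship can be verified on a fitted model. A sketch using scikit-learn (not referenced in the original article; the toy data here is made up): for a binary `LogisticRegression`, `decision_function` returns the linear part $\beta_0 + \beta_1 x_1 + \dots$, and it should equal the logit of the predicted probability.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Made-up toy data: two features, binary label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]        # estimated probabilities
linear = model.decision_function(X)     # beta0 + beta1*x1 + beta2*x2
ok = np.allclose(np.log(p / (1 - p)), linear)
print(ok)  # the fitted model satisfies log(p/(1-p)) = linear sum
```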

The following graph plots the probability ($p$) on the y-axis against the logit-transformed value (logit) on the x-axis.

import numpy as np
import matplotlib.pyplot as plt

def logit(p):
    """Logit transformation: the log of the odds p / (1 - p)."""
    return np.log(p / (1 - p))

p = np.arange(0.001, 1, 0.001)  # probabilities (avoiding exactly 0 and 1)
y = logit(p)

plt.xlabel("logit")
plt.ylabel("p")
plt.plot(y, p)
plt.show()

(Figure: the S-shaped curve of $p$ plotted against the logit.)

When the logit is 0, the probability $p$ is 0.5. The change in $p$ is largest around a logit of 0 and becomes more gradual as the logit moves away from 0. No matter how large the logit is, $p$ never exceeds 1, and no matter how small the logit is, $p$ never falls below 0.

Inverse logit transformation

Conversely, a logit value can be converted back to a probability with the following formula.


p = \frac{\exp(logit)}{1+\exp(logit)}

The above equation can be derived as follows.

{\begin{eqnarray*}

logit &=& \log(\frac{p}{1-p})\\
\exp(logit) &=& \frac{p}{1-p}\\
(1-p)\exp(logit) &=& p\\
\exp(logit) &=& p(1+\exp(logit))\\
p &=& \frac{\exp(logit)}{1+\exp(logit)}\\

\end{eqnarray*}}

The predicted value obtained from logistic regression can be converted into a probability using this inverse logit transformation.
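The derivation above says that the inverse logit (also known as the sigmoid function) undoes the logit exactly. A minimal round-trip check (function names are my own):

```python
import numpy as np

def logit(p):
    """Logit transformation: log of the odds."""
    return np.log(p / (1 - p))

def inv_logit(z):
    """Inverse logit (sigmoid): maps any real z back to a probability in (0, 1)."""
    return np.exp(z) / (1 + np.exp(z))

probs = np.array([0.1, 0.25, 0.5, 0.9])
recovered = inv_logit(logit(probs))
print(recovered)  # recovers the original probabilities
```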

Next

This time, we summarized the prerequisite knowledge for understanding logistic regression. Next time, we will summarize how to find the optimal parameters (coefficients) for logistic regression.
