# Explaining Python's `functools.partial` (partial application)

## Motivation

The teaching materials our company uses to prepare for the JDLA E qualification exam use `functools.partial` without ever explaining it, and one examinee after another asks:

"What is this...?"

So I decided to write an explanatory article.

## What we want to do

Suppose you are implementing a simple linear regression problem with steepest descent (gradient descent) or the like.

I implemented the linear regression hypothesis function (`hypothesis`) like this:

```python
import numpy as np

def hypothesis(X, W, b):
    """Hypothesis function.

    Computes the linear regression hypothesis (WX + b) and returns the result.

    Args:
        X (numpy.ndarray): training data, shape (n_samples, n_features)
        W (numpy.ndarray): weights, shape (n_features,)
        b (numpy.ndarray): bias, shape (1,)

    Returns:
        numpy.ndarray: hypothesis values, shape (n_samples,)
    """
    # Append a bias-term column (all ones) to X so that WX + b
    # can be computed as a single matrix product
    X_with_bias = np.c_[X, np.ones((X.shape[0], 1))]
    # Multiply the bias-augmented X by W and b stacked along the row direction
    return X_with_bias.dot(np.r_[W, b])
```

For example, use it like this.

```python
X = np.array([[7, 5], [8, 10], [8.5, 1]])  # training data (example)
W = np.random.randn(X.shape[1])            # weight initialization
b = np.zeros(1)                            # bias initialization

y = hypothesis(X, W, b)
```

Suppose training has converged and we have obtained weights and a bias that predict with high accuracy.

```python
learned_W = np.array([0.8, -0.05])  # learned weights (example)
learned_b = np.array([-0.8])        # learned bias (example)
```

Now that training is over, we want to actually make predictions with this model. Call the input we want to predict on `X_to_predict`:

```python
X_to_predict = np.array([[7, 10], [10, 1]])  # input X to predict on (example)
```

The predicted values (`prediction`) can be obtained like this, using the `hypothesis` function together with the trained `learned_W` and `learned_b`:

```python
prediction = hypothesis(X_to_predict, learned_W, learned_b)
```

However, `learned_W` and `learned_b` are learned weights and bias, so they are now **fixed values**; it is tedious to pass them as arguments every time we make a prediction. Therefore, I would like a `predict` function that returns the predicted values when given only the input `X`, **without passing `learned_W` or `learned_b` as arguments**.

```python
# I want a function like this!
prediction = predict(X_to_predict)
```

## Unexpectedly difficult to implement

Of course, an implementation like this is **no good**:

```python
def predict(X):
    """Prediction function.

    Returns predictions using the learned weights (learned_W)
    and learned bias (learned_b).

    Args:
        X (numpy.ndarray): data to predict, shape (n_samples, n_features)

    Returns:
        numpy.ndarray: predictions, shape (n_samples,)
    """
    # Append a bias-term column (all ones) to X so that WX + b
    # can be computed as a single matrix product
    X_with_bias = np.c_[X, np.ones((X.shape[0], 1))]
    # Multiply the bias-augmented X by learned_W and learned_b stacked row-wise
    return X_with_bias.dot(np.r_[learned_W, learned_b])
```

It clearly violates the DRY principle, so I'd like to do a little better.

At the very least, we can reuse the `hypothesis` function implemented earlier:

```python
def predict(X):
    """Prediction function.

    Returns predictions using the learned weights (learned_W)
    and learned bias (learned_b).

    Args:
        X (numpy.ndarray): data to predict, shape (n_samples, n_features)

    Returns:
        numpy.ndarray: predictions, shape (n_samples,)
    """
    # Call the hypothesis function with learned_W and learned_b
    return hypothesis(X, learned_W, learned_b)
```

Just as I was thinking, "that's a fairly clean implementation," a scary senior engineer appears behind me:

"What do you mean, relying on global variables!? Do you know the concept of coupling? How do you intend to unit test this function? Huh!?"

...I see. So how do we solve it? Putting `learned_W` and `learned_b` back into the arguments of `predict` would defeat the whole purpose...

That's it! Python supports functional programming in a broad sense, so let's solve this with a functional technique: create a closure to separate the scope of `learned_W` and `learned_b`.

```python
def create_predict_func(learned_W, learned_b):
    """Create a prediction function.

    Applies the learned weights (learned_W) and learned bias (learned_b)
    to the hypothesis function and returns a predict function.

    Args:
        learned_W (numpy.ndarray): learned weights, shape (n_features,)
        learned_b (numpy.ndarray): learned bias, shape (1,)

    Returns:
        func: prediction function
            Args:
                X: data to predict, shape (n_samples, n_features)
            Returns:
                numpy.ndarray: predictions, shape (n_samples,)
    """
    # Return a function that takes X and returns the result of hypothesis
    return lambda X: hypothesis(X, learned_W, learned_b)

    # (Reference) Another solution that does not use a lambda expression:
    # def predict(X):
    #     return hypothesis(X, learned_W, learned_b)
    #
    # return predict
```

You use it like this:

```python
# Create a predict function by fixing learned_W and learned_b
predict = create_predict_func(learned_W, learned_b)
# Once created, predict can be called any number of times with only X
# (no need to specify learned_W and learned_b each time)
prediction = predict(X_to_predict)

# (Reference)
# For explanatory purposes I deliberately made a create_predict_func function,
# but the following also works:
#
# predict = lambda X: hypothesis(X, learned_W, learned_b)
# prediction = predict(X_to_predict)
```

"No good — `hypothesis` is a global, too!"

...All right, all right. Then `hypothesis` becomes an argument as well:

```python
def create_predict_func(hypothesis, learned_W, learned_b):
    """Create a prediction function.

    Applies the learned weights (learned_W) and learned bias (learned_b)
    to the hypothesis function and returns a predict function.

    Args:
        hypothesis (func): hypothesis function to apply the weights and bias to
        learned_W (numpy.ndarray): learned weights, shape (n_features,)
        learned_b (numpy.ndarray): learned bias, shape (1,)

    Returns:
        func: prediction function
            Args:
                X: data to predict, shape (n_samples, n_features)
            Returns:
                numpy.ndarray: predictions, shape (n_samples,)
    """
    # Return a function that takes X and returns the result of hypothesis
    return lambda X: hypothesis(X, learned_W, learned_b)

# Apply learned_W and learned_b to hypothesis to create a predict function
predict = create_predict_func(hypothesis, learned_W, learned_b)
# Once created, predict can be called any number of times with only X
# (no need to specify learned_W and learned_b each time)
prediction = predict(X_to_predict)
```

"No good — this implementation depends on the argument order of `hypothesis` being `(X, W, b)`! What will you do when it becomes `(W, b, X)`?"

...I've had enough. Won't somebody please **prepare a function** that creates a new function with some of the arguments of an arbitrary function already fixed...?

## Enter: partial application

So,

**creating a new function by fixing some of the arguments of an existing function**

is called **partial application**, and Python provides a function for exactly this: `functools.partial`.
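Before returning to the regression example, here is a minimal, self-contained illustration; the `power` and `square` names are made up for this sketch and are not from the original code:

```python
from functools import partial

def power(base, exponent):
    return base ** exponent

# Fix exponent=2 to create a new "squaring" function from power
square = partial(power, exponent=2)

print(square(3))  # 9
print(square(5))  # 25
```

`square` is an ordinary callable; calling `square(3)` simply calls `power(3, exponent=2)` under the hood.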

The `predict` function we struggled with above can be written as follows using `functools.partial`:

```python
from functools import partial

# Partially apply learned_W and learned_b to the W and b arguments
# of the hypothesis function to create a predict function
predict = partial(hypothesis, W=learned_W, b=learned_b)
# Once created, predict can be called with only X as an argument
prediction = predict(X_to_predict)
```

Conveniently, `functools.partial` lets you specify the arguments to partially apply by keyword, as in `W=learned_W, b=learned_b`, regardless of their position, so it also satisfies the scary senior's earlier demand.
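One subtlety is worth noting: fixing arguments by keyword survives a reordered signature, but the position of the *remaining* argument still matters. A sketch with a hypothetical reordered variant (the function and its toy computation are invented for illustration):

```python
from functools import partial

# Hypothetical variant of hypothesis with a different parameter order
def hypothesis_reordered(W, b, X):
    return W * X + b  # toy scalar computation for illustration

# Keyword-based partial application works regardless of position
predict = partial(hypothesis_reordered, W=2, b=1)

# But here X must also be passed by keyword: the first positional
# slot belongs to W, which is already bound, so predict(10) would
# raise "got multiple values for argument 'W'"
print(predict(X=10))  # 21
```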

## Lightweight scope management with functional techniques

To be honest, even before introducing `functools.partial`, we were already doing **partial application** of `hypothesis` the moment we created the closure: we used a functional technique to avoid referencing global variables and to manage scope.
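The closure and `functools.partial` produce equivalent callables. A toy sketch of that equivalence (the `scale` and `make_doubler` names are illustrative, not from the original code):

```python
from functools import partial

def scale(x, factor):
    return x * factor

# Closure version: factor is captured from the enclosing scope
def make_doubler():
    factor = 2
    return lambda x: scale(x, factor)

double_closure = make_doubler()
# partial version: factor is fixed by keyword
double_partial = partial(scale, factor=2)

print(double_closure(21))  # 42
print(double_partial(21))  # 42
```

Both callables behave identically; `partial` just saves you from writing the factory function by hand.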

Then how would we satisfy this requirement without knowing about, or without using, **partial application**? Of course, **DRY** and **scope management** still need to be handled properly.

If you have an **object-oriented** mindset, you might do something like this:

```python
class Predictor:
    def __init__(self, hypothesis, learned_W, learned_b):
        self._hypothesis = hypothesis
        self._W = learned_W
        self._b = learned_b

    def predict(self, X):
        return self._hypothesis(X, self._W, self._b)

predictor = Predictor(hypothesis, learned_W, learned_b)

prediction = predictor.predict(X_to_predict)
```

Of course, this approach itself is perfectly fine. But the drawback of the object-oriented solution is: **do we really have to declare a class for this small amount of scope management?**

If you have developed in `Java 6` or the legacy Java that preceded it, you may sympathize: **if you try to do serious scope management and encapsulation with object orientation alone, you end up having to declare a huge number of classes and interfaces**.

Most of today's major languages, including `Python` and recent versions of `Java`, are more or less **hybrids of object-oriented and functional styles**, so if you can manage scope in the way appropriate to each case, you can write code that is both productive and safe. That is surely what the scary senior wanted to convey.

Even so, I think it's unreasonable for the E qualification teaching materials, which many non-programmers study, to suddenly drop in **partial application** without any explanation. To be honest, plenty of professional programmers don't know it either...
