Recipe 17
Linear regression is A * x = b...
Here, x does not mean the input values the way the earlier x_vals did :corn: The inputs go into A, and x holds the slope of the line and the y-intercept.
※2 [Translation]
...First row of design matrix A (all 1)...
So it's the first row that's all 1's, right? :spaghetti:
5. Output the coefficients () from the solution...
...
print('y'_intercept: ' + str(y_intercept))
There's one `'` too many (a typo?) :cookie:
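The matrix-inverse method this recipe describes can be sketched in plain NumPy (a minimal sketch with made-up data, not the book's TensorFlow code):

```python
import numpy as np

# Made-up data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x_vals = np.linspace(0, 10, 50)
y_vals = 2 * x_vals + 1 + rng.normal(scale=0.5, size=50)

# Design matrix A: a column of x values and a column of all 1's
A = np.column_stack([x_vals, np.ones_like(x_vals)])
b = y_vals.reshape(-1, 1)

# Least-squares solution of A * x = b via the normal equations:
# x = (A^T A)^{-1} A^T b
solution = np.linalg.inv(A.T @ A) @ A.T @ b
slope, y_intercept = solution.ravel()
print('slope: ' + str(slope))
print('y_intercept: ' + str(y_intercept))
```

With this data the recovered slope lands near 2 and the intercept near 1.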
Recipe 18
Decompose matrix A and perform matrix operations on the decomposed matrix
...
Here, let's solve the simultaneous equations A * x = b as L * L' * x = b.
First solve L * y = b, then solving L' * x = y gives the coefficient matrix x
I wanted the text to say "put y = L' * x" at the beginning :oden: At this point I assumed A itself would be Cholesky-decomposed, but in the book's code it is A.T * A that gets Cholesky-decomposed (.T means transpose). Or rather, Cholesky decomposition is for Hermitian matrices, so A cannot be Cholesky-decomposed in the first place (probably).
In other words, what we solve here is the equation obtained by multiplying the above simultaneous equations by A.T from the left in advance.
Here, let's solve the simultaneous equations A.T * A * x = A.T * b as L * L' * x = A.T * b.
First solve L * y = A.T * b, then solving L' * x = y gives the coefficient matrix x
is how I wish it had been written (note to self)
2. Then perform the Cholesky decomposition of the square matrix.
...
# Solve L * y = t(A) * b
tA_b = tf.matmul(tf.transpose(A_tensor), b)
Why use b here instead of b_tensor? :fried_shrimp: I wondered, but it turned out b_tensor works just as well
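The two triangular solves above can be sketched in NumPy (a sketch of the same idea with made-up data; `np.linalg.solve` stands in for a dedicated triangular solver):

```python
import numpy as np

# Made-up data: y = 2x + 1 plus noise
rng = np.random.default_rng(1)
x_vals = np.linspace(0, 10, 50)
y_vals = 2 * x_vals + 1 + rng.normal(scale=0.5, size=50)

A = np.column_stack([x_vals, np.ones_like(x_vals)])
b = y_vals.reshape(-1, 1)

# A is not square, so decompose A^T A instead
tA_A = A.T @ A                     # square, symmetric positive definite
L = np.linalg.cholesky(tA_A)       # tA_A = L @ L.T (L is lower triangular)

# Solve L * y = A^T * b, then L^T * x = y
tA_b = A.T @ b
y = np.linalg.solve(L, tA_b)
solution = np.linalg.solve(L.T, y)
slope, y_intercept = solution.ravel()
print(slope, y_intercept)
```

This gives the same least-squares answer as the matrix-inverse method, but the Cholesky route is more numerically stable.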
Recipe 19
Specifically, x_vals is the petal width and y_vals is the petal length...
To be correct, x is the petal width and y is the sepal length (a typo?) :cherries: There were also some misspellings of "petal" in the code.
plt.plot(loss_vec, 'k-')
Aren't there too few arguments? I thought, but there was no problem :candy: loss_vec supplies the y values and its indices are used as x
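That one-argument behavior can be checked directly: given a single data argument, matplotlib takes the values as y and fills in x from the indices (a small sketch with a made-up loss history):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, no display needed
import matplotlib.pyplot as plt

loss_vec = [5.0, 3.2, 2.1, 1.5, 1.2]  # made-up loss history
line, = plt.plot(loss_vec, 'k-')      # y values only, no explicit x

# x data is filled in from the indices 0 .. len(loss_vec) - 1
print(list(line.get_xdata()))
```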
Recipe 20
In Figure 3-5, the shape seems to change considerably depending on the random numbers.
Also, plt.ylabel was odd
Recipe 21
I see, so Deming regression is for when x also has errors :birthday: However, the number of training iterations is large, and training doesn't always converge: some runs worked and some looked like accidents
Vocabulary:
- opt (in my_opt): short for optimizer, the optimization function
- subtract (in tf.subtract): subtraction
- indices: plural of index
- numerator (in demming_numerator): numerator
- denominator (in demming_denominator): denominator
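The loss that demming_numerator and demming_denominator compute is the perpendicular distance from each point to the line (a NumPy sketch of the formula; the function name and data here are made up, with the variable names kept from the book):

```python
import numpy as np

def deming_loss(slope, intercept, x, y):
    # Perpendicular distance from each (x, y) to the line y = slope * x + intercept
    demming_numerator = np.abs(y - (slope * x + intercept))
    demming_denominator = np.sqrt(slope ** 2 + 1)
    return np.mean(demming_numerator / demming_denominator)

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0])        # points exactly on y = 2x + 1
print(deming_loss(2.0, 1.0, x, y))   # distance is 0 on a perfect fit
```

Unlike ordinary least squares, which penalizes only the vertical error in y, this distance also accounts for error in x, which is the point of the recipe.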