Generate n correlated pseudo-random numbers (with Python sample)

A previous article covered generating $2$ correlated pseudo-random numbers. However, as I experimented further, I found myself wanting $n$ correlated random numbers, so this article explains how to generate them.

This time we jump straight into creating $n$ random numbers, so it will probably read more smoothly if you look at the previous article first.

PS: If you just want to create correlated random numbers and aren't interested in the background, you can simply use numpy.random.multivariate_normal.
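For example, here is a minimal sketch (the correlation matrix values and sample size are just examples I chose):

import numpy as np

# Target correlation matrix (it equals the covariance matrix, since all variances are 1)
r = np.array([
    [1.0, 0.2, 0.8],
    [0.2, 1.0, 0.6],
    [0.8, 0.6, 1.0]
    ])

# Draw 10000 samples of 3 correlated standard normal variables
z = np.random.multivariate_normal(mean=np.zeros(3), cov=r, size=10000)
print(np.corrcoef(z.T))  # should come out close to r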

Overview

When there are $n$ data series, the symmetric matrix [^2] whose $ij$ component is the correlation coefficient between series $i$ and $j$ is called the **correlation matrix**. We consider how to use this correlation matrix as an input to generate $n$ random numbers that satisfy the given correlations.

Prepare $n$ independent random numbers with zero mean and equal variance, and combine them into a vector $X = (X_1 \ \ X_2 \ \ \dots \ \ X_n)^T$ [^1]. Then $n$ random numbers $Z = (Z_1 \ \ Z_2 \ \ \dots \ \ Z_n)^T$, each with the same variance as the $X_i$ and correlated according to the correlation matrix $R$, are obtained as

Z = L X \tag{1}

Here, the correlation matrix has the correlation coefficient $\rho_{ij}$ of $Z_i$ and $Z_j$ as its $ij$ component:

R = \left( \begin{array}{cccc}
\rho_{11} & \rho_{12} & \dots & \rho_{1n}
\\
\rho_{21} & & &
\\
\vdots & & \ddots & \vdots
\\
\rho_{n1} & & \dots & \rho_{nn}
\end{array} \right) \tag{2}

And $L$ is the lower triangular matrix obtained from the Cholesky decomposition of $R$ (a matrix whose entries above the diagonal are all zero).

…… Yes, there may be a lot of unfamiliar words here, but rest assured, I will explain each one. Lol

So in this article, I will write about why multiplying by this $L$ produces random numbers with the given correlation matrix, and then verify it with Python.

I will also touch on what the Cholesky decomposition is, but I will not explain how to actually compute it.

Theory

The general flow of the theory is:

  1. Think about how to express $\{Z_i\} = Z_1, Z_2, \dots, Z_n$ using $\{X_i\}$
  2. Associate the obtained expression with the correlation coefficient
  3. Confirm the expression

Let's go.

Ideas for constructing Z

The independent random numbers $\{X_i\}$ have zero mean and the same variance $\sigma^2$. Since they are independent, the covariance $\mathrm{Cov}[X_i, X_j]$ and the correlation coefficient $\mathrm{Corr}[X_i, X_j]$ are zero for $i \neq j$.

We want to build $\{Z_i\}$ out of this $\{X_i\}$, and for now we try the following form.

\begin{align}
Z_1 &= a_{11} X_1 \tag{3.1}
\\
Z_2 &= a_{21} X_1 + a_{22} X_2 \tag{3.2}
\\
Z_3 &= a_{31} X_1 + a_{32} X_2 + a_{33} X_3 \tag{3.3}
\\
&\vdots
\\
Z_n &= a_{n1} X_1 + a_{n2} X_2 + \dots + a_{nn} X_n \tag{3.n}
\end{align}

The idea is to first create a reference $Z_1$, then create $Z_2$ with a specified correlation to it, then $Z_3$ with specified correlations to each of $Z_1$ and $Z_2$, and repeat the process up to $Z_n$. Understanding this part is very important. It is no exaggeration to say that it accounts for more than half of the theory.

To make the picture easier to grasp, consider $n = 3$. If you feel the picture is already clear, skip this part.

Correlation is a relationship between two things, and we can't discuss a relationship without a reference point, so we start with

Z_1 = a_{11} X_1 \tag{4}

I don't think this part is a problem [^3]. Then, using $Z_1$ and another random number (we choose $X_2$), set

Z_2 = c_1 Z_1 + a_{22} X_2 \tag{5}

Then, by choosing the coefficients appropriately, you can create a $Z_2$ whose correlation with $Z_1$ is $\rho_{12}$. If you have read the previous article, this should pose no problem. Rewriting the right-hand side in terms of $\{X_i\}$ only,

\begin{align}
Z_2 &= c_1 \, a_{11} X_1 + a_{22} X_2 & (\because \text{equation (4)})
\\
&= a_{21} X_1 + a_{22} X_2 & (\text{defining } a_{21} = c_1 \, a_{11}) \tag{6}
\end{align}

Now we have the formula for the second row. Finally, $Z_3$: since two random numbers already exist, there are several possible situations, such as:

  1. $Z_3$ is strongly correlated with $Z_1$
  2. $Z_3$ is strongly correlated with $Z_2$
  3. $Z_3$ is correlated with neither $Z_1$ nor $Z_2$

To cover a wide range of these situations,

Z_3 = c_1 Z_1 + c_2 Z_2 + a_{33} X_3
\tag{7}

it seems that we need to superimpose $Z_1$, $Z_2$, and a new random number $X_3$. Indeed, taking $c_1$ large expresses situation 1, taking $c_2$ large expresses situation 2, and setting $c_1 = c_2 = 0$ with only $a_{33}$ nonzero expresses situation 3. The right-hand side of this equation can again be expressed in terms of $\{X_i\}$ only, and redefining the coefficients appropriately yields the equation for the third row.

In this way, we see that $Z_{i+1}$ can be created by superimposing the already created $Z_1, Z_2, \dots, Z_i$ and a new random number $X_{i+1}$.
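As a small illustration of equations (3.1)-(3.n) in code, the sketch below builds $\{Z_i\}$ from $\{X_i\}$ with a loop. The coefficient values are arbitrary placeholders; how to choose them so that $Z$ hits a target correlation matrix is exactly what the rest of this section derives.

import numpy as np

# Arbitrary example coefficients a_ij (lower triangular)
a = np.array([
    [1.0, 0.0, 0.0],
    [0.5, 0.8, 0.0],
    [0.3, 0.4, 0.6]
    ])
n, size = 3, 5

x = np.random.randn(n, size)  # independent random numbers X_1 .. X_n
z = np.zeros((n, size))
for i in range(n):
    for j in range(i + 1):
        z[i] += a[i, j] * x[j]  # Z_i = a_i1 X_1 + ... + a_ii X_i
print(z)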

Associating with the correlation coefficients

Now, expressing the earlier formulas for $\{Z_i\}$ as a matrix equation, we can write

Z = 
\left(
 \begin{array}{ccccc}
    a_{11} & 0 & 0 & \dots & 0
    \\
    a_{21} & a_{22} & 0 & \dots & 0
    \\
    a_{31} & a_{32} & a_{33} & & \vdots
    \\
    \vdots & \vdots & & \ddots & 0
     \\
    a_{n1} & a_{n2} & & \dots & a_{nn}
 \end{array}
\right)X
\tag{8}

This is exactly the "lower triangular matrix" mentioned briefly earlier: a matrix with values on and below the diagonal, and zeros everywhere above it.

This matrix corresponds to the $L$ that appears in $Z = L X$. Once you know that, all that's left is to determine the coefficients $\{a_{ij}\}$ so that the given correlations are satisfied. In other words, what we need is the relationship between $\{a_{ij}\}$ and $\{\rho_{ij}\}$.

Given the desired correlation $\rho_{ij}$ between $Z_i$ and $Z_j$, the equation to satisfy is

\mathrm{Corr}[Z_i, Z_j] = \rho_{ij}
\tag{9}

which is just the definition, of course. This alone doesn't tell us anything, so let's transform the left-hand side.

\mathrm{LHS} = \frac{\mathrm{Cov}[Z_i, Z_j]}{\sqrt{\mathrm{Var}[Z_i]} \sqrt{\mathrm{Var}[Z_j]}} \quad (\because \text{definition of } \mathrm{Corr})
\tag{10}

Now, anticipating the result, let's accept that $\mathrm{Var}[Z_k] = \sigma^2 \ (\mathrm{for\ all}\ k)$: we assume the variance of $\{Z_k\}$ comes out the same as the variance of $\{X_k\}$ (otherwise the calculation would get messy). We will verify this numerically later.

So the above formula is

\frac{\mathrm{Cov}[Z_i, Z_j]}{\sigma^2}
\tag{11}

Next, we expand the numerator in terms of $\{X_i\}$. Assuming $i \geq j$,

\begin{align}
\mathrm{Cov}[Z_i, Z_j] &= \mathrm{Cov}[a_{i1} X_1 + a_{i2} X_2 + \dots + a_{ii} X_i, \ 
a_{j1} X_1 + a_{j2} X_2 + \dots + a_{jj} X_j]
\tag{12}
\end{align}

Here, the covariance is bilinear:

\mathrm{Cov}[a A + b B, c C + d D] = ac \mathrm{Cov}[A, C] + ad \mathrm{Cov}[A, D] + bc \mathrm{Cov}[B, C] + bd \mathrm{Cov}[B, D]
\tag{13}

(lowercase letters are constants, uppercase letters are random variables), so the expression above can be expanded in the same way. Moreover, since $\mathrm{Cov}[X_k, X_k] = \sigma^2$ and $\mathrm{Cov}[X_k, X_l] = 0 \ (k \neq l)$, we get

\begin{align}
\mathrm{Cov}[&a_{i1} X_1 + a_{i2} X_2 + \dots + a_{ii} X_i, \ 
a_{j1} X_1 + a_{j2} X_2 + \dots + a_{jj} X_j]
\\
&= a_{i1} a_{j1} \mathrm{Cov}[X_1, X_1] + a_{i2} a_{j2} \mathrm{Cov}[X_2, X_2] + \dots + a_{ij} a_{jj} \mathrm{Cov}[X_j, X_j]
\\
&\ \ \ \ + a_{i(j+1)} \cdot 0  \cdot \mathrm{Cov}[X_{j+1}, X_{j+1}] + \dots
\\
&= a_{i1} a_{j1} \sigma^2 + a_{i2} a_{j2} \sigma^2 + \dots + a_{ij} a_{jj} \sigma^2
\\
&= (a_{i1} a_{j1} + a_{i2} a_{j2} + \dots + a_{ij} a_{jj}) \ \sigma^2
\tag{14}
\end{align}

Here $a_{j(j+1)} = a_{j(j+2)} = \dots = a_{ji} = 0$, so the sum terminates at the $j$-th term.

Finally, the correlation coefficient is

\rho_{ij} = \mathrm{Corr}[Z_i, Z_j] = a_{i1} a_{j1} + a_{i2} a_{j2} + \dots + a_{ij} a_{jj}
\tag{15}

Since the calculation assumed $i \geq j$, this gives the lower triangular part of the correlation matrix; but since a correlation matrix is always symmetric,

\rho_{ji} = a_{i1} a_{j1} + a_{i2} a_{j2} + \dots + a_{ij} a_{jj}
\tag{16}

also holds (the subscripts on the left-hand side have been quietly swapped).
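Incidentally, equation (14) with $j = i$ reads

\mathrm{Var}[Z_i] = \mathrm{Cov}[Z_i, Z_i] = (a_{i1}^2 + a_{i2}^2 + \dots + a_{ii}^2) \ \sigma^2

and since the diagonal of the correlation matrix forces $a_{i1}^2 + \dots + a_{ii}^2 = \rho_{ii} = 1$, the assumption $\mathrm{Var}[Z_k] = \sigma^2$ made earlier is self-consistent.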

"Solve" the matrix

Summarizing the discussion so far in matrix form, the correlation matrix $R$ can be expressed using $\{a_{ij}\}$ as

R = \left( \begin{array}{cccc}
a_{11} a_{11} & a_{21} a_{11} & \cdots & a_{n1} a_{11}
\\
a_{21} a_{11} & a_{21} a_{21} + a_{22} a_{22} & &
\\
\vdots & & \ddots & \vdots
\\
a_{n1} a_{11} & a_{n1} a_{21} + a_{n2} a_{22} & \dots &
\end{array} \right)
\tag{17}

There are thus two expressions for $R$: one in terms of $\{\rho_{ij}\}$ and one in terms of $\{a_{ij}\}$, and equating them ties the two together. All that remains is to solve for $\{a_{ij}\}$.

As a check, let's work through $n = 2$. The correspondence between the correlation matrix and the coefficients is

\left( \begin{array}{cc}
a_{11} a_{11} & a_{21} a_{11}
\\
a_{21} a_{11} & a_{21} a_{21} + a_{22} a_{22}
\end{array} \right)
=
\left( \begin{array}{cc}
\rho_{11} & \rho_{12}
\\
\rho_{21} & \rho_{22}
\end{array} \right)
\tag{18}

Since it is a correlation matrix, $\rho_{11} = \rho_{22} = 1$ and $\rho_{21} = \rho_{12}$.

Extracting the meaningful equations from this,

\begin{align}
a_{11}^2 &= 1
\tag{19.1}
\\
a_{21} a_{11} &= \rho_{21}
\tag{19.2}
\\
a_{21}^2 + a_{22}^2 &= 1
\tag{19.3}
\end{align}

we get these three equations. Let's take $a_{kk} > 0 \ (\mathrm{for\ all}\ k)$. Solving them gives

\begin{align}
a_{11} &= 1
\tag{20.1}
\\
a_{21} &= \rho_{21}
\tag{20.2}
\\
a_{22} &= \sqrt{1 - \rho_{21}^2}
\tag{20.3}
\end{align}

This is consistent with the result obtained last time. Looks good.
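As a quick numerical sanity check of (20.1)-(20.3) (a sketch; $\rho = 0.7$ is just an example value):

import numpy as np

rho = 0.7
l = np.array([
    [1.0, 0.0],
    [rho, np.sqrt(1 - rho**2)]
    ])
print(l @ l.T)  # should reproduce [[1, rho], [rho, 1]]

x = np.random.randn(2, 100000)  # two independent standard normal series
z = l @ x
print(np.corrcoef(z))  # off-diagonal entries should be close to 0.7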

So what about a general $n$-dimensional matrix? Let's look at the matrix again, this time writing out a few more of its entries.

R = \left( \begin{array}{ccccc}
a_{11} a_{11} & a_{21} a_{11} & a_{31} a_{11} & \cdots & a_{n1} a_{11}
\\
a_{21} a_{11} & a_{21} a_{21} + a_{22} a_{22} & a_{31} a_{21} + a_{32} a_{22} & &
\\
a_{31} a_{11} & a_{31} a_{21} + a_{32} a_{22} & a_{31} a_{31} + a_{32} a_{32} + a_{33} a_{33} & &
\\
\vdots & \vdots & & \ddots & \vdots
\\
a_{n1} a_{11} & a_{n1} a_{21} + a_{n2} a_{22} & & \dots &
\end{array} \right)
\tag{21}

It has a pretty beautiful structure, doesn't it? In fact, this

R = \left(
 \begin{array}{ccccc}
    a_{11} & 0 & 0 & \dots & 0
    \\
    a_{21} & a_{22} & 0 & \dots & 0
    \\
    a_{31} & a_{32} & a_{33} & & \vdots
    \\
    \vdots & \vdots & & \ddots & 0
     \\
    a_{n1} & a_{n2} & & \dots & a_{nn}
 \end{array}
\right)
\left(
 \begin{array}{ccccc}
    a_{11} & a_{21} & a_{31}  & \dots & a_{n1}
    \\
    0 & a_{22} & a_{32} & \dots & a_{n2}
    \\
    0 & 0 & a_{33} & & 
    \\
    \vdots & \vdots & & \ddots & \vdots
     \\
    0 & 0 & & \dots & a_{nn}
 \end{array}
\right)
\tag{22}

can be decomposed like this, into the product of a lower triangular matrix and its transpose. Surprising, isn't it? But it is true, and this is where the mathematics gets beautiful.

Let's check it for $n = 3$. Manual calculation is tedious, so I will use SymPy here. SymPy is a Python library for symbolic mathematics, in the spirit of Mathematica.

Let's try multiplying the decomposed matrix and see if it returns to the original.

import sympy

# Define symbols
a_11, a_22, a_33 = sympy.symbols('a_11 a_22 a_33', real=True, positive=True)
a_21, a_31, a_32 = sympy.symbols('a_21 a_31 a_32', real=True)

# Define lower and upper triangular matrices
L = sympy.Matrix([
    [a_11, 0, 0], 
    [a_21, a_22, 0], 
    [a_31, a_32, a_33]
    ])
U = L.transpose()

# Calculate product of triangular matrices
R = L * U
print("R = {}".format(R))

Here is the output, reformatted for readability [^4].

output


R = Matrix([
[a_11**2,     a_11 * a_21,               a_11 * a_31                ],
[a_11 * a_21, a_21**2 + a_22**2,         a_21 * a_31 + a_22 * a_32  ],
[a_11 * a_31, a_21 * a_31 + a_22 * a_32, a_31**2 + a_32**2 + a_33**2]
])

Look at that! Exactly the matrix we expected!

Put the other way around, the operation of decomposing a matrix into the product of a lower triangular matrix and its transpose (an upper triangular matrix) is the Cholesky decomposition. Let's try that direction too.

import sympy

# Define symbols
a_11, a_22, a_33 = sympy.symbols('a_11 a_22 a_33', real=True, positive=True)
a_21, a_31, a_32 = sympy.symbols('a_21 a_31 a_32', real=True)

# Define correlation matrix
R = sympy.Matrix([
    [a_11**2,     a_11 * a_21,               a_11 * a_31                ],
    [a_11 * a_21, a_21**2 + a_22**2,         a_21 * a_31 + a_22 * a_32  ],
    [a_11 * a_31, a_21 * a_31 + a_22 * a_32, a_31**2 + a_32**2 + a_33**2]
    ])

# Perform cholesky decomposition
L = R.cholesky()
print("L = {}".format(L))

And here is the output.

output


L = Matrix([
[a_11, 0,    0   ], 
[a_21, a_22, 0   ], 
[a_31, a_32, a_33]
])

Ooh, neat! The coefficients pop right out via the Cholesky decomposition!

We have now confirmed that the Cholesky factor of the correlation matrix $R$ is exactly the matrix that generates correlated random numbers!

Summary of theory

It's getting longer, so I'll summarize it here.

How to create $n$ correlated random numbers:

  1. Prepare the correlation matrix $R$.
  2. Cholesky decompose $R$ to obtain the lower triangular matrix $L$.
  3. Left-multiply the uncorrelated random number column vector $X$ by $L$, and you're done!

The procedure itself hasn't changed from the overview, but by now you should have a good idea of why it works.
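Collapsed into code, the recipe might look like the following minimal sketch (the function name and the example correlation matrix are my own, not from any library):

import numpy as np

def correlated_normals(r, size):
    """Generate len(r) correlated standard normal series of length size."""
    l = np.linalg.cholesky(np.asarray(r))  # step 2: Cholesky factor
    x = np.random.randn(l.shape[0], size)  # uncorrelated X
    return l @ x                           # step 3: Z = LX

z = correlated_normals([[1.0, 0.5], [0.5, 1.0]], 10000)
print(np.corrcoef(z))  # off-diagonal should be close to 0.5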

Again, I won't cover how to compute the Cholesky decomposition here; just treat it as a black box that factors a matrix into triangular matrices. If you look up the computation, you will find plenty of methods, so please refer to those.

Verification

That was long, but we can finally verify everything with Python 3. Based on the lesson from last time, we use normally distributed random numbers from the start.

import numpy as np
import numpy.random as rand
import matplotlib.pyplot as plt

# Set parameters
n = 3 # The number of random numbers
size = int(1e4) # Size of the vector
r_in = np.array([
    [1, 0.2, 0.8],
    [0.2, 1, 0.6],
    [0.8, 0.6, 1]
    ]) # Correlation matrix

# Generate correlated random numbers
l = np.linalg.cholesky(r_in)
x = rand.randn(n, size)
z = l @ x

# Calculate stats
cov = np.cov(z)
r_out = np.corrcoef(z)
print("covariance matrix:\n{}\n".format(cov))
print("correlation matrix:\n{}\n".format(r_out))

# Plot results
fig, ax = plt.subplots()
ax.scatter(z[0, :], z[1, :], s=1, color='red', label='Z_2')
ax.scatter(z[0, :], z[2, :], s=1, color='blue', label='Z_3')
ax.set_xlabel('Z_1')
ax.set_ylabel('Z_2, Z_3')
ax.legend()
plt.show()

This is the output.

output


covariance matrix:
[[ 1.00545504  0.19297604  0.79517079]
 [ 0.19297604  1.00391907  0.5943957 ]
 [ 0.79517079  0.5943957   0.99000159]]

correlation matrix:
[[ 1.          0.19207582  0.79700517]
 [ 0.19207582  1.          0.5962225 ]
 [ 0.79700517  0.5962225   1.        ]]

The diagonal components of the covariance matrix are the variances. They are close to $1$, the variance of the original normal random numbers, so the assumption in the theory section that the variance of $\{Z_i\}$ equals that of $\{X_i\}$ seems to hold up.

Also, the correlation matrix shows numbers close to the values we fed in.

Here is the plot of $Z_2$ and $Z_3$ against $Z_1$ (figure: rho12_0p2_rho13_0p8_rho23_0p6.png). The strong correlation we put into $\rho_{13}$ is clearly reflected!

Now, what about a correlation matrix like the following? We would like to see negative correlations too.

r_in = np.array([
    [1, -0.2, -0.8],
    [-0.2, 1, -0.6],
    [-0.8, -0.6, 1]
    ]) # Correlation matrix

When you run ...

output


Traceback (most recent call last):
  File "spike.py", line 15, in <module>
    l = np.linalg.cholesky(r_in)
  File "/Users/horiem/.pyenv/versions/anaconda3-4.0.0/lib/python3.5/site-packages/numpy/linalg/linalg.py", line 612, in cholesky
    r = gufunc(a, signature=signature, extobj=extobj)
  File "/Users/horiem/.pyenv/versions/anaconda3-4.0.0/lib/python3.5/site-packages/numpy/linalg/linalg.py", line 93, in _raise_linalgerror_nonposdef
    raise LinAlgError("Matrix is not positive definite")
numpy.linalg.linalg.LinAlgError: Matrix is not positive definite

shell returned 1

Oops, an error (and now you know I use pyenv and anaconda). Apparently the Cholesky decomposition failed: it complains that "Matrix is not positive definite".

Yes, in fact, for the Cholesky decomposition to be possible, the target matrix must be **positive definite**.

What is a positive definite matrix? It is a symmetric matrix $M$ such that, for any non-zero vector $\boldsymbol{z}$,

\boldsymbol{z}^T M \boldsymbol{z} > 0

holds [^5].

If checking that for every vector sounds impossible, there is a better way: "the matrix $M$ is positive definite" and "all eigenvalues of $M$ are positive" are equivalent. So let's check the signs of the eigenvalues.

r_in = np.array([
    [1, -0.2, -0.8],
    [-0.2, 1, -0.6],
    [-0.8, -0.6, 1]
    ]) # Correlation matrix

w, v = np.linalg.eig(r_in)
print(w)

output


[-0.10192577  1.19135241  1.91057336]

Sure enough, there is a negative eigenvalue. This matrix cannot be Cholesky decomposed.

So does any negative correlation make the Cholesky decomposition impossible? No, not at all.

In fact, any genuine correlation matrix is [positive semi-definite (non-negative definite)](http://mathtrain.jp/correlationmatrix). Positive semi-definite means zero is allowed in the definiteness condition; in other words, the eigenvalues are zero or positive. An example with a zero eigenvalue is

\left( \begin{array}{ccc}
1 & 1 & -1
\\
1 & 1 & -1
\\
-1 & -1 & 1
\end{array} \right)

or the like. Since this is positive semi-definite, it qualifies as a correlation matrix, but because it is not positive definite, it cannot be Cholesky decomposed; a rare edge case (and hardly one you would need in practice, right?).
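A quick check of its eigenvalues (a sketch; the eigenvalue order in the output may differ):

import numpy as np

m = np.array([
    [1, 1, -1],
    [1, 1, -1],
    [-1, -1, 1]
    ], dtype=float)
print(np.linalg.eigvals(m))  # approximately [3, 0, 0]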

On the other hand, consider a matrix like this:

\left( \begin{array}{ccc}
1 & -1 & -1
\\
-1 & 1 & -1
\\
-1 & -1 & 1
\end{array} \right)

This also looks like a correlation matrix at first glance, but it is not positive semi-definite, so it does not qualify as a correlation matrix. And it makes sense when you think about it: $Z_1$ and $Z_2$ would be inversely correlated, $Z_2$ and $Z_3$ inversely correlated, and $Z_3$ and $Z_1$ inversely correlated. What this says is

\begin{align}
Z_1 = - c_2 Z_2
\\
Z_2 = - c_3 Z_3
\\
Z_3 = -c_1 Z_1
\end{align}

where $c_1, c_2, c_3 > 0$. Substituting these into one another gives $Z_1 = (\text{negative constant}) \times Z_1$, which is a contradiction. Inconsistent data like this is rejected by the mathematics itself. Amazing, isn't it?
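And indeed, this matrix has a negative eigenvalue (a sketch; output order may vary):

import numpy as np

m = np.array([
    [1, -1, -1],
    [-1, 1, -1],
    [-1, -1, 1]
    ], dtype=float)
print(np.linalg.eigvals(m))  # approximately [-1, 2, 2]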

The example that raised the error earlier also looked like a correlation matrix at first glance, but it had a negative eigenvalue and so was not positive semi-definite. In other words, it was nonsense data that cannot actually be called a correlation matrix.

So, when adding negative correlations, we have to make them a little milder. For example:

r_in = np.array([
    [1, 0.2, -0.4],
    [0.2, 1, 0.6],
    [-0.4, 0.6, 1]
    ]) # Correlation matrix

Something like this. If you check the eigenvalues ...

output


[ 0.17738188  1.18223576  1.64038236]

All positive. Good. Running the script again ...

output


covariance matrix:
[[ 1.0117086   0.17839112 -0.42687826]
 [ 0.17839112  0.98871838  0.60641387]
 [-0.42687826  0.60641387  1.03022947]]

correlation matrix:
[[ 1.          0.17836482 -0.41812808]
 [ 0.17836482  1.          0.60084968]
 [-0.41812808  0.60084968  1.        ]]

(Figure: rho12_0p2_rho13_-0p4_rho23_0p6.png, the plot of $Z_2$ and $Z_3$ against $Z_1$.)

The negative correlation shows up nicely! Success!

Summary

We have seen how to create $n$ correlated random numbers, and why the method works. Articles on this topic invariably say "use the Cholesky decomposition", but few of them touch on why that works. The calculation involves a lot of terms (a bit tedious to write out), but basic linear algebra and statistics are enough to follow it to the end. After all, once you start working with $n$ variables, linear algebra always shows up.

In the verification, we found that some matrices look like correlation matrices but actually are not. It is not enough for every component to lie between $-1$ and $1$.
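If you want to screen a candidate matrix up front, a check along the lines discussed here might look like this sketch (the function name and tolerance are my own):

import numpy as np

def looks_like_correlation_matrix(r, tol=1e-10):
    """Symmetric, unit diagonal, and positive semi-definite?"""
    r = np.asarray(r, dtype=float)
    symmetric = np.allclose(r, r.T)
    unit_diag = np.allclose(np.diag(r), 1.0)
    psd = np.all(np.linalg.eigvalsh(r) >= -tol)  # all eigenvalues >= 0
    return bool(symmetric and unit_diag and psd)

print(looks_like_correlation_matrix([[1, -1, -1], [-1, 1, -1], [-1, -1, 1]]))  # False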

It's fun to discover things by moving your own hands. Correlated random number generation is a nicely sized playground, so please enjoy your statistics!

[^1]: $X$ is a "column vector" with its components arranged vertically, but since that wastes space, it is written as the transpose of a row vector.
[^2]: Because the correlation between $i$ and $j$ equals the correlation between $j$ and $i$.
[^3]: Since $\{X_i\}$ and $\{Z_i\}$ have the same variance, you can see immediately that $a_{11} = 1$, but we leave it as a symbol here.
[^4]: In interactive mode it prints in a matrix-like layout, but it's not very readable, so I ran it as a script.
[^5]: So, strictly speaking, the condition for the Cholesky decomposition is that the matrix be symmetric and positive definite.
