The inner product plays an important role in deep learning. The relationships among the inner product, the outer product, the dot product, and NumPy's dot function are a little tangled, so I am writing this article to sort them out.
The conclusions I want to draw here are:

- NumPy's dot function includes behavior that differs slightly from the mathematical dot product.
- If we deliberately restrict ourselves to a narrow sense, the dot function, the dot product, and the inner product can all refer to the same thing. (Note that NumPy's dot function also has other meanings and behaviors.)
- The cross product is a broader topic, so I think it has no direct bearing on deep learning.

How important the inner product is in deep learning is outside the scope of this article.
For now, the Wikipedia descriptions below show the relationship between the inner product, the outer product, and the dot product.

Quoted from: https://ja.wikipedia.org/wiki/%E5%86%85%E7%A9%8D
In linear algebra, the inner product is a non-degenerate canonical sesquilinear form (a symmetric bilinear form in the case of real coefficients) defined on a (real or complex) vector space. Since it is a binary operation that assigns a single number (a scalar) to two vectors, it is also called the scalar product.
The word "inner" is paired with "outer", but the outer product should be thought of in a slightly broader context (rather than as an exact opposite).
Quoted from below. https://ja.wikipedia.org/wiki/%E3%82%AF%E3%83%AD%E3%82%B9%E7%A9%8D
In vector analysis, the vector product (English: vector product) is a binary operation that produces a new vector from two vectors defined in a three-dimensional oriented inner product space. The vector product of two vectors a and b (vectors are shown in bold below) is written a × b or [a, b]. Because of the operation symbol, it is also called the cross product. In contrast to the inner product, which is a binary operation that yields a scalar from two vectors, it is also called the outer product; note, however, that in English "outer product" usually means the direct (tensor) product.
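As a quick check of the definition above, here is a minimal sketch using NumPy's ``np.cross`` (assuming NumPy is installed). It shows that the cross product of two 3-D vectors is a new vector perpendicular to both, and that it is anti-symmetric:

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])  # unit vector along x
b = np.array([0.0, 1.0, 0.0])  # unit vector along y

# The cross product gives a new vector perpendicular to both
# inputs (right-hand rule): x cross y = z.
c = np.cross(a, b)
print(c)  # [0. 0. 1.]

# Anti-symmetry: b x a = -(a x b).
print(np.cross(b, a))  # [ 0.  0. -1.]
```

Note that the result is a vector, not a scalar, which is why the cross product sits apart from the inner/dot product discussed in this article.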
Quoted from below. https://ja.wikipedia.org/wiki/%E3%83%89%E3%83%83%E3%83%88%E7%A9%8D
In mathematics and physics, the dot product (also called the point product) is a type of vector operation that returns a single numerical value from two sequences of numbers of the same length. It can be defined both algebraically and geometrically. The geometric definition yields the dot product that is standardly defined on Euclidean space Rn (with Cartesian coordinates).
From the above, the dot product is what is usually meant by the "standard inner product".
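To make the "defined algebraically and geometrically" point concrete, here is a minimal sketch (assuming NumPy is installed) that computes the same dot product both ways: algebraically as the sum of element-wise products, and geometrically as |a| |b| cos(theta):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Algebraic definition: sum of element-wise products.
algebraic = np.sum(a * b)  # 1*4 + 2*5 + 3*6 = 32

# Geometric definition: |a| |b| cos(theta), where theta is the
# angle between the vectors (recovered here via the norms).
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
geometric = np.linalg.norm(a) * np.linalg.norm(b) * cos_theta

print(algebraic)                          # 32.0
print(np.isclose(algebraic, geometric))   # True
```

Both definitions agree, which is why "dot product" and "standard inner product" can be used interchangeably on Rn.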
From the NumPy help (``help(numpy.dot)``)::

    dot(a, b, out=None)

    Dot product of two arrays. Specifically,

    - If both `a` and `b` are 1-D arrays, it is inner product of vectors
      (without complex conjugation).

    - If both `a` and `b` are 2-D arrays, it is matrix multiplication,
      but using :func:`matmul` or ``a @ b`` is preferred.

    - If either `a` or `b` is 0-D (scalar), it is equivalent to :func:`multiply`
      and using ``numpy.multiply(a, b)`` or ``a * b`` is preferred.

    - If `a` is an N-D array and `b` is a 1-D array, it is a sum product over
      the last axis of `a` and `b`.

    - If `a` is an N-D array and `b` is an M-D array (where ``M>=2``), it is a
      sum product over the last axis of `a` and the second-to-last axis of `b`::

        dot(a, b)[i,j,k,m] = sum(a[i,j,:] * b[k,:,m])
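The cases in the help text can be checked directly. Here is a minimal sketch (assuming NumPy is installed) running ``np.dot`` through the 1-D, 2-D, scalar, and N-D x 1-D cases:

```python
import numpy as np

# Case 1: 1-D x 1-D -> inner product of vectors (a scalar).
v = np.array([1, 2, 3])
w = np.array([4, 5, 6])
print(np.dot(v, w))  # 32

# Case 2: 2-D x 2-D -> matrix multiplication (matmul / @ is preferred).
A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(np.array_equal(np.dot(A, B), A @ B))  # True

# Case 3: scalar x array -> equivalent to multiply (a * b is preferred).
print(np.dot(2, v))  # [2 4 6]

# Case 4: N-D x 1-D -> sum product over the last axis of the first argument.
M = np.arange(6).reshape(2, 3)  # [[0, 1, 2], [3, 4, 5]]
print(np.dot(M, v))             # [ 8 26]
```

Only the 1-D case is the mathematical inner product; the other cases are the "different meanings and functions" mentioned in the conclusion above.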
I plan to write a separate article on the importance of the inner product in deep learning. If you have any comments, please let me know.