If a matrix has a repeated eigenvalue, the eigenvectors that share that repeated eigenvalue form an eigenspace. For example, the identity matrix has the eigenvalue one repeated; any nonzero vector in the column space of the identity matrix is an eigenvector, so the whole column space is the eigenspace. This is why we say a symmetric matrix *can be chosen* with orthogonal eigenvectors rather than simply *has* orthogonal eigenvectors: when a symmetric matrix has repeated eigenvalues, the eigenvectors of a repeated eigenvalue do not have to be orthonormal, but an orthonormal basis of that eigenspace can always be chosen.
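A minimal NumPy sketch of this (the matrices here are illustrative): every nonzero vector is an eigenvector of the identity, and for a symmetric matrix with a repeated eigenvalue, `np.linalg.eigh` returns one orthonormal choice among many.

```python
import numpy as np

# The 3x3 identity: eigenvalue 1 repeated three times,
# so ANY nonzero vector is an eigenvector.
I = np.eye(3)
v = np.array([2.0, -1.0, 5.0])
print(np.allclose(I @ v, 1.0 * v))          # True: Iv = 1*v

# Symmetric matrix with eigenvalue 2 repeated twice:
# eigh returns one orthonormal choice of eigenvectors,
# but any orthonormal basis of that eigenspace works.
S = np.diag([2.0, 2.0, 5.0])
eigvals, Q = np.linalg.eigh(S)
print(np.allclose(Q.T @ Q, np.eye(3)))      # True: orthonormal
```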
A Gaussian kernel matrix can be factorized as \((\Phi \textbf{X})^\textbf{H} \Phi \textbf{X} =\textbf{X}^\textbf{H} \Phi^\textbf{H} \Phi \textbf{X} = \textbf{X}^\textbf{H}\textbf{X}\), where \(\Phi\) is the Gaussian kernel basis matrix and \(\textbf{X}\) is the coefficient matrix of the reproducing kernel Hilbert space \(K(\cdot,x) \in \mathcal{H}_K\) (see https://www.jkangpathology.com/post/reproducing-kernel-hilbert-space/).
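A short sketch of why such a kernel matrix is symmetric positive semidefinite, assuming NumPy; the data points and the bandwidth `sigma` are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 2))                  # 6 hypothetical points in R^2
sigma = 1.0

# K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
K = np.exp(-sq_dists / (2 * sigma ** 2))

print(np.allclose(K, K.T))                    # symmetric
print(np.linalg.eigvalsh(K).min() >= -1e-10)  # eigenvalues >= 0 up to roundoff
```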
A matrix is a system. A system takes an input and gives an output. A matrix is a linear system; differentiation and integration are linear systems too. The Fourier transformation matches the input basis to the basis of the differentiation operator: the exponentials \(e^{i\omega x}\) are eigenfunctions of \(d/dx\), so differentiation becomes diagonal on that basis.
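As a sketch of "differentiation is a linear system" (assuming NumPy), \(d/dx\) restricted to cubic polynomials is literally a matrix on the coefficient basis \(\{1, x, x^2, x^3\}\):

```python
import numpy as np

# d/dx on the basis {1, x, x^2, x^3} as a 4x4 matrix.
D = np.array([[0., 1., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.],
              [0., 0., 0., 0.]])

p = np.array([5.0, 4.0, 3.0, 2.0])   # 5 + 4x + 3x^2 + 2x^3
print(D @ p)                          # [4. 6. 6. 0.] = 4 + 6x + 6x^2
```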
The Fourier series represents a periodic function as a discrete vector. The Fourier transformation turns a non-periodic time-domain function into a continuous frequency-domain function. Both change the single time basis \(t\) into the infinite frequency basis \(e^{inx}\) or \(e^{i\omega x}\). A function on this infinite basis domain can be represented by a vector \(v_{n}\) or a function \(f(\omega)\) of the basis domain; these are the coefficients of the Fourier series or of the Fourier transformation.
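A small sketch of the periodic case, assuming NumPy (the test function is made up): the FFT of one sampled period gives the discrete vector of coefficients \(v_n\), nonzero only at the frequencies present in the function.

```python
import numpy as np

N = 64
t = 2 * np.pi * np.arange(N) / N
f = np.cos(2 * t) + 0.5 * np.sin(3 * t)      # periodic test function

v = np.fft.fft(f) / N                        # Fourier coefficients v_n
print(np.nonzero(np.abs(v) > 1e-12)[0])      # [ 2  3 61 62], i.e. n = ±2, ±3
```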
Convolution is an operation on two vectors:
\[ c * d = d * c, \qquad (c*d)_n = \sum_{i+j=n} c_i d_j = \sum_i c_i d_{n-i}. \]
This is multiplying polynomials: the coefficients of the product polynomial are the convolution of the coefficients of the two factors. The Fourier transformation expands the \(x\) basis into the infinite exponential basis \(e^{i\omega k}\), and multiplication in \(x\) (time) space becomes convolution in \(k\) (frequency) space.
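A sketch of both facts with NumPy (the two small polynomials are arbitrary): `np.convolve` multiplies polynomials, and the FFT turns that convolution into pointwise multiplication in the other domain.

```python
import numpy as np

c = np.array([1.0, 2.0, 3.0])                # 1 + 2x + 3x^2
d = np.array([4.0, 5.0])                     # 4 + 5x

direct = np.convolve(c, d)                   # product's coefficients

n = len(c) + len(d) - 1                      # zero-pad to the product length
via_fft = np.fft.ifft(np.fft.fft(c, n) * np.fft.fft(d, n)).real

print(direct)                                # [ 4. 13. 22. 15.]
print(np.allclose(direct, via_fft))          # True
```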
If the time domain is periodic, its Fourier transform is discrete, i.e. a Fourier series.
Bases are the central idea of linear algebra. A square matrix has eigenvalues and eigenvectors. A symmetric matrix has orthogonal eigenvectors and real eigenvalues; when the eigenvalues are also non-negative, the matrix is positive semidefinite. Every matrix has two sets of singular vectors, the left and right singular vectors, \(A=U\Sigma V^{T}\).
When we view the matrix \(A=U\Sigma V^{T}\) as a data table whose rows are data points, the right singular vectors \(V\) build the bases, the singular values \(\Sigma\) are the magnitudes of those bases, and the left singular vectors \(U\) become the new data points on the new bases.
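A sketch of this reading with NumPy (the data table `A` is random, just for illustration): \(U\Sigma\) holds the coordinates of each row on the bases in \(V\).

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(5, 3))                  # 5 data points, 3 features

U, s, Vt = np.linalg.svd(A, full_matrices=False)

coords = U * s                               # new data points on the V bases
print(np.allclose(A, coords @ Vt))           # True: A = (U Sigma) V^T
```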
This is a note for Part III of *Linear Algebra and Learning from Data* by Gilbert Strang.
The main themes are sparsity (low rank), information theory (compression), and of course linear transformation.
A full-rank matrix is inefficient to compute with. Finding a low-rank matrix that is close to the original matrix can save computation.
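A sketch of the saving, assuming NumPy and a random matrix: keep only the \(k\) largest singular values; by the Eckart–Young theorem the truncated SVD is the closest rank-\(k\) matrix to \(A\).

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(8, 6))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]  # best rank-2 approximation

print(np.linalg.matrix_rank(A_k))            # 2
# Frobenius error is the square root of the sum of the dropped s_i^2.
print(np.isclose(np.linalg.norm(A - A_k), np.sqrt((s[k:] ** 2).sum())))  # True
```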
The rank-one matrix \(uv^{T}\) is the building block of matrices. A full-rank matrix can be decomposed into a sum of rank-one matrices, i.e. \(A = \sum_{i} \sigma_i u_i v_i^{T}\).
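A last sketch (NumPy again, random \(A\)): rebuilding \(A\) as the sum \(\sum_i \sigma_i u_i v_i^{T}\) of its rank-one pieces from the SVD.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(4, 4))
U, s, Vt = np.linalg.svd(A)

# Sum of rank-one matrices sigma_i * u_i * v_i^T recovers A exactly.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A, A_sum))                 # True
```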