Math

Repeated eigenvalues

If a matrix has repeated eigenvalues, the eigenvectors that correspond to a repeated eigenvalue form an eigenspace rather than a single direction. For example, the identity matrix has the eigenvalue one repeated in every row; any vector in the whole column space of the identity matrix is an eigenvector, so the whole column space is the eigenspace. A symmetric matrix can always be given orthogonal eigenvectors: when a symmetric matrix has repeated eigenvalues, the eigenvectors that correspond to the repeated eigenvalue do not have to be orthonormal, but an orthonormal set can always be chosen from the eigenspace.
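A minimal numerical sketch of this, assuming NumPy; the particular symmetric matrix and its eigenvalues (2, 2, 5) are arbitrary choices for illustration.

```python
# Eigenspaces of repeated eigenvalues, and an orthonormal choice of eigenvectors.
import numpy as np

# 3x3 identity: eigenvalue 1 is repeated three times, every vector is an
# eigenvector, so the eigenspace is the whole column space R^3.
I = np.eye(3)
w, V = np.linalg.eigh(I)
print(w)                                 # [1. 1. 1.]

# A symmetric matrix with a repeated eigenvalue: diag(2, 2, 5) rotated by Q.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([2.0, 2.0, 5.0]) @ Q.T

w, V = np.linalg.eigh(A)
# Any basis of the 2-dimensional eigenspace for eigenvalue 2 would do; eigh
# returns one particular orthonormal choice, so V.T @ V is the identity.
print(np.round(w, 6))                    # [2. 2. 5.]
print(np.allclose(V.T @ V, np.eye(3)))   # True
```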

Spectral decomposition

A Gaussian kernel matrix can be factorized as $(\Phi X)^H \Phi X = X^H \Phi^H \Phi X = X^H X$, where $\Phi$ is the Gaussian kernel basis matrix and $X$ is the coefficient matrix of the reproducing kernel Hilbert space $K(\cdot, x) \in H_K$ (https://www.jkangpathology.com/post/reproducing-kernel-hilbert-space/). A matrix is a system: a system takes an input and gives an output, and a matrix is a linear system. Differentiation and integration are also linear systems. The Fourier transformation matches the input basis to the basis of the operator (differentiation), i.e., the exponential basis diagonalizes differentiation.
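As a small illustration of the spectral decomposition itself, the sketch below (assuming NumPy) builds a Gaussian kernel matrix on arbitrary sample points with an assumed bandwidth sigma and eigendecomposes it; the sample points and sigma are not from the text.

```python
# Spectral decomposition of a Gaussian kernel matrix.
import numpy as np

x = np.linspace(-1, 1, 50)
sigma = 0.3                       # assumed bandwidth, illustration only
# Gaussian kernel matrix K_ij = exp(-(x_i - x_j)^2 / (2 sigma^2))
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))

w, V = np.linalg.eigh(K)          # K = V diag(w) V^T
print(w.min() > -1e-10)                        # True: K is positive semidefinite
print(np.allclose(V @ np.diag(w) @ V.T, K))    # True: the decomposition reproduces K
```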

Cauchy's integral formula and Taylor series

Cauchy's integral formula evaluates an analytic function by a path integral whose denominator is translated to the evaluation point, $\frac{1}{z-a}$:

$$f(a) = \frac{1}{2\pi i} \oint_{\gamma} \frac{f(z)}{z-a}\, dz, \qquad f^{(n)}(a) = \frac{n!}{2\pi i} \oint_{\gamma} \frac{f(z)}{(z-a)^{n+1}}\, dz$$

Cauchy's integral formula can be taken as a limit of paths, $\lim_{r \to 0}$ with $\gamma : |z - z_0| = r$. The Taylor series evaluates an analytic function by approximation on an open disc $D(z_0, r)$:

$$f(x) = \sum_{n=0}^{\infty} a_n (x-b)^n, \qquad a_n = \frac{f^{(n)}(b)}{n!}$$
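The formula can be checked numerically. The sketch below (assuming NumPy) uses $f(z) = e^z$, the unit circle as the path $\gamma$, and an arbitrary interior point $a$, all chosen only for illustration.

```python
# Numerical check of Cauchy's integral formula for f(z) = exp(z).
import numpy as np

f = np.exp
a = 0.3 + 0.2j                     # arbitrary point inside the unit circle

n = 2000
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
z = np.exp(1j * theta)             # unit circle, parametrized by theta
dz = 1j * z                        # dz/dtheta

# (1 / 2*pi*i) * integral of f(z)/(z - a) dz, as a Riemann sum over theta
value = np.sum(f(z) / (z - a) * dz) * (2 * np.pi / n) / (2j * np.pi)

print(value)                       # very close to exp(0.3 + 0.2j)
print(f(a))
```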

Primitive function

The existence of a primitive function is a strong condition: it makes a function analytic on a disc $D(a, R)$. The meaning of "the existence of a primitive function" was confusing to me at first. If a real function is integrable, its integral and a primitive function can be determined, but in complex analysis this is not the case. In real analysis the integration interval $[a, b]$ is unique, but in complex analysis the integral must be taken along a chosen path $\Gamma = g(x)$.
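To illustrate why the path matters, the sketch below (assuming NumPy) integrates $1/z$, which has no single-valued primitive on the punctured plane, from 1 to -1 along the upper and the lower unit semicircle; the function and the paths are chosen only for illustration.

```python
# Path dependence of a complex line integral when no primitive exists.
import numpy as np

def line_integral(f, g, dg, t):
    """Midpoint-rule approximation of the integral of f along the path z = g(t)."""
    return np.sum(f(g(t)) * dg(t)) * (t[1] - t[0])

n = 200000
t = (np.arange(n) + 0.5) * np.pi / n          # midpoints of [0, pi]

f = lambda z: 1.0 / z
upper = line_integral(f, lambda s: np.exp(1j * s),  lambda s: 1j * np.exp(1j * s),  t)
lower = line_integral(f, lambda s: np.exp(-1j * s), lambda s: -1j * np.exp(-1j * s), t)

print(upper)   # ~ +i*pi : from 1 to -1 along the upper semicircle
print(lower)   # ~ -i*pi : from 1 to -1 along the lower semicircle
```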

Linear transformation

A norm of a linear transformation $\Lambda : X \to Y$ is defined by $\|\Lambda\| = \sup\{\|\Lambda x\| : x \in X, \|x\| \le 1\}$. We can give a norm to a space or a set. A norm determines the size of a vector in the function space. The way of measuring the size of a vector gives the space important properties such as boundedness, completeness, or orthogonality.
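A small sketch of this definition for a matrix acting between Euclidean spaces, assuming NumPy; the random matrix and the sampling of the unit ball are arbitrary illustration choices.

```python
# Operator norm of a matrix: sup of ||A x|| over the unit ball.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))

operator_norm = np.linalg.norm(A, 2)          # equals the largest singular value

# Approximate the supremum by sampling random unit vectors.
x = rng.standard_normal((5, 100000))
x /= np.linalg.norm(x, axis=0)
sampled_sup = np.linalg.norm(A @ x, axis=0).max()

print(operator_norm, sampled_sup)             # sampled_sup <= operator_norm, and close to it
```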

Laplace transformation

The Fourier series represents a periodic function as a discrete vector of coefficients. The Fourier transformation turns a non-periodic time-domain function into a continuous frequency-domain function. The Fourier series and transformation change the single time base $t$ into the infinite frequency basis $e^{inx}$ or $e^{i\omega x}$. A function on this infinite basis domain can be represented by a vector $v_n$ or a function $f(\omega)$ of the basis domain. These are the coefficients of the Fourier series or the Fourier transformation.
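As a rough illustration (assuming NumPy), the sketch below samples one period of a square wave and reads off the discrete coefficient vector $v_n$ with the FFT; the square wave and the sample count are arbitrary choices.

```python
# Fourier series coefficients of a periodic function via the FFT.
import numpy as np

N = 256
x = 2 * np.pi * np.arange(N) / N
f = np.sign(np.sin(x))                 # square wave with period 2*pi

v = np.fft.fft(f) / N                  # v_n approximates the e^{inx} coefficients
# For the square wave the odd harmonics dominate: |v_n| ~ 2/(pi*n) for odd n.
for n in [1, 2, 3, 4, 5]:
    print(n, abs(v[n]))
```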

Convolution and Fourier transformation

Convolution is a vector operation on two vectors: $c * d = d * c$, with $(c * d)_n = \sum_{i+j=n} c_i d_j = \sum_i c_i d_{n-i}$. This is exactly multiplying polynomials: the coefficients of the product polynomial are the convolution of the two coefficient vectors. The Fourier transformation expands the $x$ basis into the infinite exponential basis $e^{ikx}$. Multiplication in $x$ (time) space becomes convolution in $k$ (frequency) space. If the time domain is periodic, its Fourier transform is discrete.
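A short sketch of both statements, assuming NumPy: the convolution of two coefficient vectors equals the coefficients of the product polynomial, and a sufficiently long FFT turns that convolution into pointwise multiplication.

```python
# Convolution as polynomial multiplication, and the same result via the FFT.
import numpy as np

c = np.array([1.0, 2.0, 3.0])        # 1 + 2x + 3x^2
d = np.array([4.0, 5.0])             # 4 + 5x

conv = np.convolve(c, d)             # coefficients of (1 + 2x + 3x^2)(4 + 5x)
print(conv)                          # [ 4. 13. 22. 15.]

# Same result through the frequency domain: pad both to the output length,
# multiply the DFTs pointwise, and transform back.
n = len(c) + len(d) - 1
via_fft = np.fft.ifft(np.fft.fft(c, n) * np.fft.fft(d, n)).real
print(np.allclose(via_fft, conv))    # True
```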

Lagrange dual problem and conjugate function

An optimization problem has two components: the objective function $f_0 : \mathbb{R}^n \to \mathbb{R}$ and the constraints. The objective function and the constraints keep each other in check and balance at a saddle point, i.e., the optimal point. The Lagrange dual problem of the primal problem also solves the optimization by providing a lower bound. The dual problem can be expressed through the conjugate function $f^*(y) = \sup_x \left( x^T y - f(x) \right)$.
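A minimal numerical sketch of the conjugate function, assuming NumPy; the choice $f(x) = x^2$, for which $f^*(y) = y^2/4$, and the grid search over $x$ are illustration choices only.

```python
# Conjugate function f*(y) = sup_x (x*y - f(x)) for f(x) = x^2.
import numpy as np

f = lambda x: x ** 2
xs = np.linspace(-10, 10, 200001)            # grid over which to take the supremum

def conjugate(y):
    # The supremum is attained at x = y/2, giving f*(y) = y^2 / 4.
    return np.max(xs * y - f(xs))

for y in [-2.0, 0.5, 3.0]:
    print(y, conjugate(y), y ** 2 / 4)       # the two values agree
```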

Approximation

The purpose of approximation is finding the optimal point $x$, i.e., the point where $\nabla F(x) = 0$. We need a step (search) direction $\Delta x$ and a step size $t$. A Taylor approximation is a polynomial whose argument is the step and whose parameters are the derivatives at the start point. The first-degree Taylor approximation adds one term to the start point $(x_0, F(x_0))$. The added term $\nabla F(x_0)^T \Delta x$ consists of a parameter (the gradient $\nabla F(x_0)$) and an argument (the step $\Delta x$).
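A minimal sketch of a step built from this first-order model, assuming NumPy; the test function $F$, the fixed step size $t$, and the iteration count are arbitrary choices, not a prescription.

```python
# Gradient step from the first-order Taylor model F(x + dx) ~ F(x) + grad F(x)^T dx.
import numpy as np

def F(x):
    return 0.5 * x @ x + np.sin(x[0])         # an arbitrary smooth test function

def grad_F(x):
    return x + np.array([np.cos(x[0]), 0.0])

x = np.array([2.0, -1.5])                     # start point x_0
t = 0.2                                       # fixed step size for the sketch
for _ in range(50):
    x = x - t * grad_F(x)                     # step direction dx = -grad F(x)

print(x, grad_F(x))                           # grad F(x) ~ 0 at the optimal point
```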

Steady state equilibrium

- The meaning of $A^T$
- Steady state equilibrium
- Graph Laplacian matrix $A^T C A$
- Differential equations and the Laplacian matrix
- A derivative is a graph without a branch.
- Row space and column space are dual. $A$ and $A^T$ are dual.

ref) Linear Algebra and Learning from Data, Part IV, Gilbert Strang
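A small sketch of the $A^T C A$ construction on a three-node graph, assuming NumPy; the graph, the incidence matrix $A$, and the conductances in $C$ are arbitrary choices for illustration.

```python
# Graph Laplacian A^T C A and its steady state (Strang's framework).
import numpy as np

# A 3-node triangle graph: edges 0->1, 1->2, 0->2 (rows of the incidence matrix A).
A = np.array([[-1.0,  1.0,  0.0],
              [ 0.0, -1.0,  1.0],
              [-1.0,  0.0,  1.0]])
C = np.diag([1.0, 2.0, 3.0])                  # arbitrary positive edge conductances

L = A.T @ C @ A                               # weighted graph Laplacian
print(L)
print(np.allclose(L @ np.ones(3), 0))         # True: constant potentials give zero flow,
                                              # the steady state lies in the nullspace of L
```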