Abstract
We begin by constructing the separable extension of a basis to functions of several variables, and define Sobolev regularity in any dimension p through the decay of the Fourier coefficients, captured by a convergent series. We demonstrate the equivalence between this regularity and the decay of the approximation error obtained from M Fourier coefficients in dimension p. This theorem highlights the curse of dimensionality: achieving a fixed approximation error requires a number of coefficients M that grows exponentially with the dimension. In the case of the Fourier basis, these approximations can also be interpreted as filters that keep only the low frequencies.
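As an illustration (not part of the original abstract), the following minimal Python sketch performs this low-frequency filtering on a smooth periodic test function in dimension p = 2. The helper `lowpass_approximation` is hypothetical; it keeps the Fourier coefficients whose frequency indices satisfy |k| ≤ k_max along every axis, so the number of retained coefficients is M = (2 k_max + 1)^p, which grows exponentially with the dimension p.

```python
# Hypothetical sketch: linear Fourier approximation by keeping only low frequencies.
import numpy as np

def lowpass_approximation(x, k_max):
    """Keep Fourier coefficients whose frequency indices satisfy |k| <= k_max."""
    X = np.fft.fftn(x)
    freqs = [np.fft.fftfreq(n, d=1.0 / n) for n in x.shape]  # integer frequencies
    mask = np.ones(x.shape, dtype=bool)
    for axis, f in enumerate(freqs):
        shape = [1] * x.ndim
        shape[axis] = -1
        mask &= np.abs(f).reshape(shape) <= k_max
    return np.real(np.fft.ifftn(X * mask)), int(mask.sum())

# Smooth periodic test function in dimension p = 2.
n, p = 64, 2
grid = np.meshgrid(*[np.linspace(0, 2 * np.pi, n, endpoint=False)] * p)
x = np.exp(np.sin(grid[0]) + np.cos(grid[1]))

for k_max in (2, 4, 8):
    approx, M = lowpass_approximation(x, k_max)
    err = np.linalg.norm(x - approx) / np.linalg.norm(x)
    # M grows like (2 * k_max + 1)**p: exponential in the dimension p.
    print(f"k_max={k_max:2d}  M={M:5d}  relative error={err:.2e}")
```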
The optimality of linear approximations is studied in a probabilistic framework using principal components. When x is a random vector, we show that the error of a linear approximation from M coefficients in an orthonormal basis depends only on its covariance matrix. We prove the Karhunen-Loève approximation theorem: the bases that minimize the linear approximation error are those that diagonalize the covariance matrix. The vectors of such a basis are called principal components. If x(t) is stationary, then its covariance matrix is a convolution matrix, which is diagonalized by the Fourier basis. The principal components are therefore sinusoids, and linear Fourier approximations are optimal in this case. This completes the tour of the triangle between regularity, approximation and parsimony in the linear case.
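The following Python sketch (again not from the abstract, and built on an assumed toy model) illustrates these points numerically: a stationary signal is simulated by circular filtering of white noise, its empirical covariance is diagonalized to obtain the principal components, the expected linear approximation error from the M first components is the sum of the discarded eigenvalues, and, because the covariance is approximately circulant, each principal component is close to a sinusoid.

```python
# Hypothetical sketch: principal components of a stationary signal.
import numpy as np

rng = np.random.default_rng(0)
n, n_samples = 64, 5000

# Stationary process: circular convolution of white noise with a smoothing filter.
h = np.exp(-0.5 * np.arange(n))
noise = rng.standard_normal((n_samples, n))
x = np.real(np.fft.ifft(np.fft.fft(noise, axis=1) * np.fft.fft(h), axis=1))

# Empirical covariance matrix and its principal components (eigenvectors).
cov = np.cov(x, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Expected linear approximation error from the M first principal components
# equals the sum of the discarded covariance eigenvalues.
for M in (4, 16, 64):
    print(f"M={M:2d}  expected error={eigvals[M:].sum():.3f}")

# Stationarity check: the covariance is approximately circulant, so each
# principal component has a Fourier transform concentrated on one frequency.
spectrum = np.abs(np.fft.fft(eigvecs[:, 0]))
print("dominant frequency of the first component:", np.argmax(spectrum))
```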