Tikhonov regularization
From Wikipedia, the free encyclopedia
Tikhonov regularization is the most commonly used method of regularization of ill-posed problems, named for Andrey Tychonoff. In statistics, the method is also known as ridge regression. It is related to the Levenberg–Marquardt algorithm for non-linear least-squares problems.
The standard approach to solve an overdetermined system of linear equations given as

    Ax = b

is known as linear least squares and seeks to minimize the residual

    ‖Ax − b‖²
where ‖·‖ is the Euclidean norm. However, the matrix A may be ill-conditioned or singular, yielding a non-unique solution. In order to give preference to a particular solution with desirable properties, the regularization term ‖Γx‖² is included in this minimization:

    ‖Ax − b‖² + ‖Γx‖²
for some suitably chosen Tikhonov matrix, Γ. In many cases, this matrix is chosen as the identity matrix Γ = I, giving preference to solutions with smaller norms. In other cases, highpass operators (e.g., a difference operator or a weighted Fourier operator) may be used to enforce smoothness if the underlying vector is believed to be mostly continuous. This regularization improves the conditioning of the problem, thus enabling a numerical solution. An explicit solution, denoted by x̂, is given by:

    x̂ = (AᵀA + ΓᵀΓ)⁻¹Aᵀb
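The explicit solution x̂ = (AᵀA + ΓᵀΓ)⁻¹Aᵀb can be sketched directly in NumPy. The matrix A, vector b, and the value of α below are illustrative assumptions, not taken from the article:

```python
import numpy as np

# Illustrative overdetermined system (values chosen for demonstration)
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# Tikhonov matrix chosen as Gamma = alpha * I (a common choice)
alpha = 0.1
Gamma = alpha * np.eye(A.shape[1])

# Explicit regularized solution: solve (A^T A + Gamma^T Gamma) x = A^T b
x_hat = np.linalg.solve(A.T @ A + Gamma.T @ Gamma, A.T @ b)
```

In practice one solves the regularized normal equations with a linear solver rather than forming the inverse explicitly, which is cheaper and numerically safer.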
The effect of regularization may be varied via the scale of matrix Γ. For Γ = αI, when α = 0 this reduces to the unregularized least-squares solution, provided that (AᵀA)⁻¹ exists.