In the case of a linear system which is underdetermined, or an otherwise non-[[invertible matrix]], singular value decomposition (SVD) is equally useful. With {{mvar|A}} factored as {{math|''U''Σ''V''<sup>T</sup>}}, a satisfactory solution uses the Moore–Penrose [[pseudoinverse]], {{math|''V''Σ<sup>+</sup>''U''<sup>T</sup>}}, where {{math|Σ<sup>+</sup>}} merely replaces each non-zero diagonal entry with its reciprocal. Set {{math|'''x'''}} to {{math|''V''Σ<sup>+</sup>''U''<sup>T</sup>'''b'''}}.
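
A minimal sketch of this recipe in Python with NumPy (the system shown is an illustrative underdetermined example, and the rank tolerance is an arbitrary choice; NumPy's <code>np.linalg.pinv</code> packages the same computation):

<syntaxhighlight lang="python">
import numpy as np

# Illustrative underdetermined system: two equations, three unknowns.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
b = np.array([7.0, 8.0])

# Thin SVD: A = U @ diag(s) @ Vt.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Sigma^+ : reciprocate the non-zero singular values, leave zeros alone.
tol = 1e-12 * s.max()                                # illustrative rank cutoff
s_plus = np.array([1.0 / sv if sv > tol else 0.0 for sv in s])

# x = V Sigma^+ U^T b, the minimum-norm least-squares solution.
x = Vt.T @ (s_plus * (U.T @ b))

assert np.allclose(A @ x, b)                         # solves the system
assert np.allclose(x, np.linalg.pinv(A) @ b)         # matches NumPy's pseudoinverse
</syntaxhighlight>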
 
The case of a square invertible matrix also holds interest. Suppose, for example, that {{mvar|A}} is a {{nowrap|3 × 3}} rotation matrix which has been computed as the composition of numerous twists and turns. Floating-point arithmetic does not match the mathematical ideal of real numbers, so {{mvar|A}} has gradually lost its true orthogonality. A [[Gram–Schmidt process]] could [[orthogonalization|orthogonalize]] the columns, but it is not the most reliable, nor the most efficient, nor the most invariant method. The [[polar decomposition]] factors a matrix into a pair, one of which is the unique ''closest'' orthogonal matrix to the given matrix, or one of the closest if the given matrix is singular. (Closeness can be measured by any [[matrix norm]] invariant under an orthogonal change of basis, such as the spectral norm or the Frobenius norm.) For a near-orthogonal matrix, rapid convergence to the orthogonal factor can be achieved by a "[[Newton's method]]" approach due to {{harvtxt|Higham|1986}} ([[#CITEREFHigham1990|1990]]), repeatedly averaging the matrix with its inverse transpose. {{harvtxt|Dubrulle|1999}} has published an accelerated method with a convenient convergence test.
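
A sketch of the averaging iteration in Python with NumPy (the function name, tolerance, iteration cap, and drifted test matrix below are illustrative choices, not taken from the cited papers):

<syntaxhighlight lang="python">
import numpy as np

def nearest_orthogonal(Q, tol=1e-12, max_iter=50):
    """Newton-style iteration: repeatedly average Q with its inverse
    transpose.  Converges rapidly when Q is already near-orthogonal."""
    for _ in range(max_iter):
        Q_next = 0.5 * (Q + np.linalg.inv(Q).T)
        if np.linalg.norm(Q_next - Q) < tol:     # Frobenius norm by default
            return Q_next
        Q = Q_next
    return Q

# A rotation matrix that has drifted slightly from exact orthogonality.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
drifted = R + 1e-4 * np.array([[0.3, -0.7], [0.5, 0.2]])  # arbitrary perturbation

Q = nearest_orthogonal(drifted)
assert np.allclose(Q.T @ Q, np.eye(2))           # orthogonality restored
</syntaxhighlight>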
 
For example, consider a non-orthogonal matrix for which the simple averaging algorithm takes seven steps

:<math>\begin{bmatrix}3 & 1\\7 & 5\end{bmatrix}
\rightarrow
\begin{bmatrix}1.8125 & 0.0625\\3.4375 & 2.6875\end{bmatrix}
\rightarrow \cdots \rightarrow
\begin{bmatrix}0.8 & -0.6\\0.6 & 0.8\end{bmatrix}</math>

and which acceleration trims to two steps (with {{mvar|γ}} = 0.353553, 0.565685).
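
The seven-step count can be reproduced with a short self-contained check (again Python with NumPy; the stopping tolerance of 10<sup>−9</sup> on the orthogonality defect is an illustrative choice):

<syntaxhighlight lang="python">
import numpy as np

Q = np.array([[3.0, 1.0],
              [7.0, 5.0]])
steps = 0
while np.linalg.norm(Q.T @ Q - np.eye(2)) > 1e-9:   # orthogonality defect
    Q = 0.5 * (Q + np.linalg.inv(Q).T)              # average with inverse transpose
    steps += 1

print(steps)   # 7
print(Q)       # approximately [[0.8, -0.6], [0.6, 0.8]]
</syntaxhighlight>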