Orthogonal matrix: Difference between revisions

Edited footnote date to match citation
m removed incorrect bolding
 
(15 intermediate revisions by 9 users not shown)
Line 1:
{{Short description|Real square matrix whose columns and rows are orthogonal unit vectors}}
{{for|matrices with orthogonality over the [[complex number]] field|unitary matrix}}
{{More footnotes needed|date=May 2023}}
In [[linear algebra]], an '''orthogonal matrix''', or '''orthonormal matrix''', is a real [[square matrix]] whose columns and rows are [[Orthonormality|orthonormal]] [[Vector (mathematics and physics)|vectors]].
 
Line 13 ⟶ 14:
An orthogonal matrix {{mvar|Q}} is necessarily invertible (with inverse {{math|1=''Q''<sup>−1</sup> = ''Q''<sup>T</sup>}}), [[Unitary matrix|unitary]] ({{math|1=''Q''<sup>−1</sup> = ''Q''<sup>∗</sup>}}), where {{math|1=''Q''<sup>∗</sup>}} is the [[Hermitian adjoint]] ([[conjugate transpose]]) of {{mvar|Q}}, and therefore [[Normal matrix|normal]] ({{math|1=''Q''<sup>∗</sup>''Q'' = ''QQ''<sup>∗</sup>}}) over the [[real number]]s. The [[determinant]] of any orthogonal matrix is either +1 or −1. As a [[Linear map|linear transformation]], an orthogonal matrix preserves the [[inner product]] of vectors, and therefore acts as an [[isometry]] of [[Euclidean space]], such as a [[Rotation (mathematics)|rotation]], [[Reflection (mathematics)|reflection]] or [[Improper rotation|rotoreflection]]. In other words, it is a [[unitary transformation]].
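A minimal numerical sketch of these properties (not part of the article's sources; the helper names <code>matmul</code>, <code>apply</code>, and <code>dot</code> are illustrative): the standard 2 × 2 rotation matrix satisfies {{math|1=''Q''<sup>T</sup>''Q'' = ''I''}}, has determinant +1, and preserves dot products.

```python
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def apply(A, v):
    """Apply the matrix A to the vector v."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

theta = 0.7
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Q^T Q = I, so the inverse of Q is simply its transpose.
QtQ = matmul(transpose(Q), Q)
assert all(abs(QtQ[i][j] - (i == j)) < 1e-12
           for i in range(2) for j in range(2))

# det Q = cos^2(theta) + sin^2(theta) = +1 (a reflection would give -1).
det = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]
assert abs(det - 1.0) < 1e-12

# The transformation preserves dot products, hence lengths and angles.
u, v = [3.0, -1.0], [2.0, 5.0]
assert abs(dot(u, v) - dot(apply(Q, u), apply(Q, v))) < 1e-12
```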
 
The set of {{math|''n'' × ''n''}} orthogonal matrices, under multiplication, forms the [[group (mathematics)|group]] {{math|O(''n'')}}, known as the [[orthogonal group]]. The [[subgroup]] {{math|SO(''n'')}} consisting of orthogonal matrices with determinant +1 is called the [[Orthogonal group#special orthogonal group|special orthogonal group]], and each of its elements is a '''special orthogonal matrix'''. As a linear transformation, every special orthogonal matrix acts as a rotation.
 
==Overview==
[[File:Matrix multiplication transpose.svg|thumb|275px|Visual understanding of multiplication by the transpose of a matrix. If A is an orthogonal matrix, the ''ij''-th element of the product AA<sup>T</sup> vanishes for i ≠ j, because the ''i''-th row of A is orthogonal to the ''j''-th row of A.]]
An orthogonal matrix is the real specialization of a unitary matrix, and thus always a [[normal matrix]]. Although we consider only real matrices here, the definition can be used for matrices with entries from any [[field (mathematics)|field]]. However, orthogonal matrices arise naturally from [[dot product]]s, and for matrices of complex numbers that leads instead to the unitary requirement. Orthogonal matrices preserve the dot product,<ref>[http://tutorial.math.lamar.edu/Classes/LinAlg/OrthogonalMatrix.aspx "Paul's online math notes"]{{Citation broken|date=January 2013|note=See talk page.}}, Paul Dawkins, [[Lamar University]], 2008. Theorem 3(c)</ref> so, for vectors {{math|'''u'''}} and {{math|'''v'''}} in an {{mvar|n}}-dimensional real [[Euclidean space]]
 
<math display="block">{\mathbf u} \cdot {\mathbf v} = \left(Q {\mathbf u}\right) \cdot \left(Q {\mathbf v}\right) </math>
where {{mvar|Q}} is an orthogonal matrix. To see the inner product connection, consider a vector {{math|'''v'''}} in an {{mvar|n}}-dimensional real [[Euclidean space]]. Written with respect to an orthonormal basis, the squared length of {{math|'''v'''}} is {{math|'''v'''<sup>T</sup>'''v'''}}. If a linear transformation, in matrix form {{math|''Q'''''v'''}}, preserves vector lengths, then
Line 36 ⟶ 39:
*<math>
\begin{bmatrix}
\cos \theta & -\sin \theta \\
\sin \theta & \cos \theta \\
\end{bmatrix}</math> &emsp;&emsp; (rotation about the origin)
*<math>
\begin{bmatrix}
0 & 1\\
1 & 0
\end{bmatrix}</math> &emsp;&emsp; (permutation of coordinate axes)
 
The identity is also a permutation matrix.
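As an illustration (an example not in the article), any permutation matrix is orthogonal because its rows are distinct standard basis vectors, hence orthonormal; with integer entries the check is exact.

```python
# A permutation matrix has exactly one 1 in each row and each column.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

n = len(P)
# Compute P^T P entrywise; distinct standard basis columns give the identity.
PtP = [[sum(P[k][i] * P[k][j] for k in range(n)) for j in range(n)]
       for i in range(n)]
assert PtP == [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```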
Line 270 ⟶ 273:
This may be combined with the Babylonian method for extracting the square root of a matrix to give a recurrence which converges to an orthogonal matrix quadratically:
<math display="block">Q_{n + 1} = 2 M \left(Q_n^{-1} M + M^\mathrm{T} Q_n\right)^{-1}</math>
where {{math|1=''Q''<sub>0</sub> = ''M''}}.
 
These iterations are stable provided the [[condition number]] of {{mvar|M}} is less than three.<ref>[http://www.maths.manchester.ac.uk/~nareports/narep91.pdf "Newton's Method for the Matrix Square Root"] {{Webarchive|url=https://web.archive.org/web/20110929131330/http://www.maths.manchester.ac.uk/~nareports/narep91.pdf |date=2011-09-29 }}, Nicholas J. Higham, Mathematics of Computation, Volume 46, Number 174, 1986.</ref>
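The recurrence above can be sketched numerically (a minimal pure-Python illustration restricted to 2 × 2 matrices; the sample matrix <code>M</code> and helper names are hypothetical, not from the cited paper). Starting from a well-conditioned <code>M</code>, the iterates converge to an orthogonal matrix.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def inv2(A):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]

def scale(c, A):
    return [[c * x for x in row] for row in A]

def add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

# A slightly non-orthogonal, well-conditioned matrix (condition number < 3).
M = [[1.0,  0.1],
     [-0.05, 1.0]]

Q = [row[:] for row in M]  # Q_0 = M
for _ in range(20):
    # Q_{n+1} = 2 M (Q_n^{-1} M + M^T Q_n)^{-1}
    Q = scale(2.0, matmul(M, inv2(add(matmul(inv2(Q), M),
                                      matmul(transpose(M), Q)))))

# The limit is orthogonal: Q^T Q = I to high accuracy.
QtQ = matmul(transpose(Q), Q)
assert all(abs(QtQ[i][j] - (i == j)) < 1e-8
           for i in range(2) for j in range(2))
```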
Line 292 ⟶ 295:
There is no standard terminology for these matrices. They are variously called "semi-orthogonal matrices", "orthonormal matrices", "orthogonal matrices", and sometimes simply "matrices with orthonormal rows/columns".
 
For the case {{math|''n'' ≤ ''m''}}, matrices with orthonormal columns may be referred to as [[k-frame|orthogonal ''k''-frames]], and they are elements of the [[Stiefel manifold]].
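For illustration (a sketch, not from the article's sources), applying the Gram–Schmidt process to the columns of a tall matrix produces such a matrix with orthonormal columns: {{math|''Q''<sup>T</sup>''Q''}} is the small identity, while {{math|''QQ''<sup>T</sup>}} is only a projection, not the full identity. The sample matrix <code>A</code> is hypothetical.

```python
import math

# A tall 3 x 2 matrix with linearly independent columns (example data).
A = [[1.0, 1.0],
     [1.0, 0.0],
     [0.0, 1.0]]

# Gram-Schmidt on the columns of A.
cols = [list(c) for c in zip(*A)]
ortho = []
for v in cols:
    for u in ortho:
        proj = sum(a * b for a, b in zip(u, v))
        v = [a - proj * b for a, b in zip(v, u)]
    norm = math.sqrt(sum(a * a for a in v))
    ortho.append([a / norm for a in v])

Q = [list(row) for row in zip(*ortho)]  # 3 x 2, orthonormal columns

# Q^T Q is the 2 x 2 identity...
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
assert all(abs(QtQ[i][j] - (i == j)) < 1e-12
           for i in range(2) for j in range(2))

# ...but Q Q^T is a rank-2 projection, not the 3 x 3 identity.
QQt = [[sum(Q[i][k] * Q[j][k] for k in range(2)) for j in range(3)]
       for i in range(3)]
assert any(abs(QQt[i][j] - (i == j)) > 1e-6
           for i in range(3) for j in range(3))
```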
 
==See also==