Orthogonalization
orthogonalization process
An algorithm to construct, for a given linearly independent system of vectors in a Euclidean or Hermitian space $ V $, an orthogonal system of non-zero vectors generating the same subspace in $ V $. The best known is the Schmidt (or Gram–Schmidt) orthogonalization process, in which from a linearly independent system $ a_{1}, \dots, a_{k} $ an orthogonal system $ b_{1}, \dots, b_{k} $ is constructed such that every vector $ b_{i} $ ($ i = 1, \dots, k $) is expressed linearly in terms of $ a_{1}, \dots, a_{i} $, i.e. $ b_{i} = \sum_{j=1}^{i} \gamma_{ij} a_{j} $, where $ C = \| \gamma_{ij} \| $ is an upper-triangular matrix. It is possible to construct the system $ \{ b_{i} \} $ so that it is orthonormal and the diagonal entries $ \gamma_{ii} $ of $ C $ are positive; the system $ \{ b_{i} \} $ and the matrix $ C $ are then defined uniquely by these conditions.
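For example, in the Euclidean plane with the standard inner product one may take $ a_{1} = (1, 1) $, $ a_{2} = (0, 2) $; then

$$ b_{1} = a_{1} = (1, 1), \qquad b_{2} = a_{2} - a_{1} = (-1, 1). $$

These vectors are orthogonal, span the same plane, and each $ b_{i} $ is a linear combination of $ a_{1}, \dots, a_{i} $; normalizing them gives the orthonormal pair $ \tfrac{1}{\sqrt 2}(1, 1) $, $ \tfrac{1}{\sqrt 2}(-1, 1) $.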
The Gram–Schmidt process is as follows. Put $ b_{1} = a_{1} $; if the vectors $ b_{1}, \dots, b_{i} $ have already been constructed, then
$$ b_{i+1} = a_{i+1} + \sum_{j=1}^{i} \alpha_{j} b_{j}, $$
where
$$ \alpha_{j} = - \frac{( a_{i+1}, b_{j} )}{( b_{j}, b_{j} )}, $$
$ j = 1, \dots, i $, are obtained from the condition that the vector $ b_{i+1} $ be orthogonal to $ b_{1}, \dots, b_{i} $. Geometrically, at every step the vector $ b_{i+1} $ is the perpendicular dropped from the end of the vector $ a_{i+1} $ onto the linear hull of $ a_{1}, \dots, a_{i} $. The product of the lengths $ | b_{1} | \cdots | b_{k} | $ is equal to the volume of the parallelepiped constructed on the vectors of the system $ \{ a_{i} \} $ as edges. By normalizing the vectors $ b_{i} $, the required orthonormal system is obtained. An explicit expression of the vectors $ b_{i} $ in terms of $ a_{1}, \dots, a_{k} $ is given by the formula
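The process translates directly into code. The following is a minimal sketch in plain Python (real Euclidean case; vectors are assumed to be given as lists of real coordinates, and the function name gram_schmidt is only illustrative):

```python
def gram_schmidt(a):
    """Orthogonalize a linearly independent list of vectors.

    Implements the recursion above: b_{i+1} = a_{i+1} + sum_j alpha_j b_j,
    with alpha_j = -(a_{i+1}, b_j) / (b_j, b_j).
    """
    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    b = []
    for ai in a:
        bi = list(ai)
        for bj in b:
            alpha = -dot(ai, bj) / dot(bj, bj)       # orthogonality condition
            bi = [x + alpha * y for x, y in zip(bi, bj)]
        b.append(bi)
    return b

# Example: a_1 = (1,1,0), a_2 = (1,0,1), a_3 = (0,1,1)
print(gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]))
# [[1.0, 1.0, 0.0], [0.5, -0.5, 1.0], [-0.666..., 0.666..., 0.666...]] (approximately)
```

Normalizing each returned vector by its Euclidean length then gives the orthonormal system.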
$$ b_{i} = \frac{1}{G_{i-1}} \left| \begin{array}{llll} ( a_{1}, a_{1} ) &\dots &( a_{1}, a_{i-1} ) &a_{1} \\ \dots &\dots &\dots &\dots \\ ( a_{i}, a_{1} ) &\dots &( a_{i}, a_{i-1} ) &a_{i} \\ \end{array} \right| , $$

where $ G_{i} $ is the Gram determinant of the system $ a_{1}, \dots, a_{i} $, with $ G_{0} = 1 $ by definition. (The determinant on the right-hand side has to be formally expanded by the last column.) The norm of these orthogonal vectors is given by $ \| b_{i} \| = \sqrt{G_{i}/G_{i-1}} $. Thus, the corresponding orthonormal system takes the form

$$ q_{i} = \frac{b_{i}}{\| b_{i} \|} = \sqrt{\frac{G_{i-1}}{G_{i}}}\; b_{i}, \qquad i = 1, \dots, k. $$
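The norm relation can be checked numerically. The following sketch (plain Python, reusing the three-vector example from the sketch above with its output hard-coded) compares $ | b_{i} | $ with $ \sqrt{G_{i}/G_{i-1}} $:

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def det(m):
    # Laplace expansion along the first row (adequate for the small matrices used here)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** c * m[0][c] * det([row[:c] + row[c + 1:] for row in m[1:]])
               for c in range(len(m)))

def gram_det(a, i):
    # Gram determinant G_i = det[(a_r, a_s)], r, s = 1..i; G_0 = 1 by definition
    return 1.0 if i == 0 else det([[dot(a[r], a[s]) for s in range(i)] for r in range(i)])

a = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]     # the system a_1, a_2, a_3
b = [[1.0, 1.0, 0.0], [0.5, -0.5, 1.0], [-2/3, 2/3, 2/3]]   # its orthogonalization (see above)
for i in range(1, len(a) + 1):
    print(math.sqrt(dot(b[i - 1], b[i - 1])),                # |b_i|
          math.sqrt(gram_det(a, i) / gram_det(a, i - 1)))    # sqrt(G_i / G_{i-1}); the values agree
```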
This process can also be used for a countable system of vectors.
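For example, applying the process to the countable system of monomials $ 1, x, x^{2}, \dots $ with the inner product $ (f, g) = \int_{-1}^{1} f(x) g(x) \, dx $ yields, up to scalar factors, the Legendre polynomials. A small exact-arithmetic sketch (plain Python; representing polynomials by coefficient lists is only an illustrative convention):

```python
from fractions import Fraction

def poly_mul(p, q):
    # product of polynomials given as coefficient lists (index = power of x)
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            r[i + j] += pi * qj
    return r

def inner(p, q):
    # (p, q) = integral of p(x) q(x) over [-1, 1]; the integral of x^n is 2/(n+1) for even n, 0 for odd n
    return sum(c * Fraction(2, n + 1) for n, c in enumerate(poly_mul(p, q)) if n % 2 == 0)

monomials = [[1], [0, 1], [0, 0, 1], [0, 0, 0, 1]]   # 1, x, x^2, x^3
b = []
for a in monomials:
    bi = [Fraction(c) for c in a]
    for bj in b:
        alpha = -inner(a, bj) / inner(bj, bj)
        bi = [c + alpha * (bj[k] if k < len(bj) else 0) for k, c in enumerate(bi)]
    b.append(bi)
print(b)   # 1, x, x^2 - 1/3, x^3 - (3/5)x : proportional to the Legendre polynomials
```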
The Gram–Schmidt process can be interpreted as the decomposition of a non-singular square matrix into the product of an orthogonal matrix (a unitary one, in the case of a Hermitian space) and an upper-triangular matrix with positive diagonal entries, this product being a particular example of an Iwasawa decomposition.
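In matrix language this is the QR decomposition. For the plane example above, writing $ a_{1} = (1, 1) $ and $ a_{2} = (0, 2) $ as the columns of a matrix, it reads

$$ \begin{pmatrix} 1 & 0 \\ 1 & 2 \end{pmatrix} = \frac{1}{\sqrt 2} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} \sqrt 2 & \sqrt 2 \\ 0 & \sqrt 2 \end{pmatrix}, $$

where the columns of the first factor are the normalized vectors $ b_{1}, b_{2} $ and the second factor is upper triangular with positive diagonal entries.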
References
[1] F.R. Gantmacher, "The theory of matrices", 1, Chelsea, reprint (1977) (Translated from Russian)
[2] A.G. Kurosh, "Higher algebra", MIR (1972) (Translated from Russian)