Fundamental matrix
 
matrizant

The transition matrix $ X( t) $ of solutions of a system of linear ordinary differential equations

$$ \tag{*} \dot{x} = A(t) x, \qquad x \in \mathbf R^{n}, $$

normalized at the point $ t _ {0} $. The fundamental matrix is the unique continuous solution of the matrix initial value problem

$$ \dot{X} = A(t) X, \qquad X(t_{0}) = I $$

(here $I$ denotes the identity matrix), provided the matrix-valued function $A(t)$ is locally summable on some interval $J \subset \mathbf R$ containing $t_{0}$, $t \in J$.

Every matrix $M(t)$ built of column-solutions $x_{1}, \dots, x_{m}$ of the system (*), where $m$ is a natural number, is expressible as $M(t) = X(t) M(t_{0})$. In particular, every solution $x(t)$ of (*) can be written in the form $x(t) = X(t) x_{0}$, where $x_{0} = x(t_{0})$.
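For a concrete system the fundamental matrix can be computed numerically by integrating the matrix initial value problem above column by column. The following minimal sketch (Python with NumPy/SciPy; the coefficient matrix $A(t)$, the interval and all names are illustrative assumptions, not part of the article) builds $X(t)$ and checks the relation $x(t) = X(t) x_{0}$ against direct integration of (*):

<pre>
import numpy as np
from scipy.integrate import solve_ivp


def A(t):
    # illustrative locally summable coefficient matrix A(t)
    return np.array([[0.0, 1.0],
                     [-1.0, -0.1 * np.cos(t)]])


def fundamental_matrix(t, t0=0.0):
    """Integrate dX/dt = A(t) X with X(t0) = I and return X(t)."""
    n = A(t0).shape[0]
    rhs = lambda s, y: (A(s) @ y.reshape(n, n)).ravel()
    sol = solve_ivp(rhs, (t0, t), np.eye(n).ravel(),
                    rtol=1e-10, atol=1e-12)
    return sol.y[:, -1].reshape(n, n)


t0, t1 = 0.0, 1.0
X1 = fundamental_matrix(t1, t0)

# every solution of (*) is x(t) = X(t) x0 with x0 = x(t0)
x0 = np.array([1.0, 0.0])
x_direct = solve_ivp(lambda s, x: A(s) @ x, (t0, t1), x0,
                     rtol=1e-10, atol=1e-12).y[:, -1]
print(np.allclose(X1 @ x0, x_direct))   # True up to integration accuracy
</pre>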

The expansion

$$ X(t) = I + \int\limits_{t_{0}}^{t} A(s) \, ds + \int\limits_{t_{0}}^{t} A(s) \int\limits_{t_{0}}^{s} A(r) \, dr \, ds + \dots, $$

converges absolutely for every $t \in J$ and uniformly on every compact interval in $J$. Moreover, the Liouville–Ostrogradski formula holds ($\mathop{\rm Sp}$ denotes the trace):

$$ \mathop{\rm det} X(t) = \mathop{\rm exp} \int\limits_{t_{0}}^{t} \mathop{\rm Sp} A(s) \, ds. $$
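As a quick numerical illustration of this identity, one may compare $\mathop{\rm det} X(t)$ with $\mathop{\rm exp} \int_{t_{0}}^{t} \mathop{\rm Sp} A(s) \, ds$; the sketch below (Python with NumPy/SciPy; the matrix $A(t)$ and the interval are illustrative only) does exactly that:

<pre>
import numpy as np
from scipy.integrate import solve_ivp, quad


def A(t):
    # illustrative coefficient matrix
    return np.array([[0.0, 1.0],
                     [-1.0, -0.1 * np.cos(t)]])


t0, t1, n = 0.0, 2.0, 2

# fundamental matrix X(t1): integrate dX/dt = A(t) X, X(t0) = I
rhs = lambda s, y: (A(s) @ y.reshape(n, n)).ravel()
X1 = solve_ivp(rhs, (t0, t1), np.eye(n).ravel(),
               rtol=1e-10, atol=1e-12).y[:, -1].reshape(n, n)

trace_integral, _ = quad(lambda s: np.trace(A(s)), t0, t1)
print(np.linalg.det(X1), np.exp(trace_integral))   # the two values agree
</pre>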

If the matrix $A(t)$ satisfies the Lappo-Danilevskii condition

$$ A(t) \cdot \int\limits_{t_{0}}^{t} A(s) \, ds = \int\limits_{t_{0}}^{t} A(s) \, ds \cdot A(t), $$

then

$$ X( t) = \mathop{\rm exp} \int\limits _ {t _ {0} } ^ { t } A( s) ds. $$
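A simple class of matrices satisfying the Lappo-Danilevskii condition consists of $A(t) = \alpha(t) B$ with a scalar function $\alpha$ and a constant matrix $B$: here $A(t)$ commutes with $\int_{t_{0}}^{t} A(s) \, ds = B \int_{t_{0}}^{t} \alpha(s) \, ds$, so that

$$ X(t) = \mathop{\rm exp} \left( B \int\limits_{t_{0}}^{t} \alpha(s) \, ds \right). $$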

In particular, if $ A( t) \equiv A $ is a constant matrix, then

$$ X( t) = e ^ {A( t- t _ {0} ) } . $$
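In this constant-coefficient case the fundamental matrix can be evaluated with a matrix-exponential routine, and the expansion above reduces to the exponential series; a short numerical sketch (Python with NumPy/SciPy; the matrix and interval are chosen only for illustration):

<pre>
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -0.3]])          # illustrative constant matrix
t0, t1 = 0.0, 1.5
n = A.shape[0]

# fundamental matrix by direct integration of dX/dt = A X, X(t0) = I
rhs = lambda s, y: (A @ y.reshape(n, n)).ravel()
X1 = solve_ivp(rhs, (t0, t1), np.eye(n).ravel(),
               rtol=1e-10, atol=1e-12).y[:, -1].reshape(n, n)
print(np.allclose(X1, expm(A * (t1 - t0))))        # X(t) = e^{A(t - t0)}

# for constant A the expansion above is the exponential series
S, term = np.eye(n), np.eye(n)
for k in range(1, 25):
    term = term @ (A * (t1 - t0)) / k
    S = S + term
print(np.allclose(S, expm(A * (t1 - t0))))
</pre>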

If $X_{A}(t)$ is the fundamental matrix of the system (*) with coefficient matrix $A(t)$ and $B(t)$ is another locally summable matrix-valued function on $J$, then

$$ X_{A+B}(t) = X_{A}(t) X_{D}(t), $$

where

$$ D(t) = [X_{A}(t)]^{-1} B(t) X_{A}(t). $$
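This factorization can be checked numerically: integrate the fundamental matrices of $A$, of $A + B$ and of $D$, and compare. A minimal sketch (Python with NumPy/SciPy; $A(t)$, $B(t)$ and all names are illustrative assumptions):

<pre>
import numpy as np
from scipy.integrate import solve_ivp


def A(t):
    return np.array([[0.0, 1.0],
                     [-1.0, -0.1 * np.cos(t)]])   # illustrative A(t)


def B(t):
    return np.array([[0.0, 0.0],
                     [0.2 * np.sin(t), 0.0]])     # illustrative B(t)


t0, t1, n = 0.0, 2.0, 2


def fundamental(coeff):
    """Return a callable t -> X(t) for dX/dt = coeff(t) X, X(t0) = I."""
    rhs = lambda s, y: (coeff(s) @ y.reshape(n, n)).ravel()
    sol = solve_ivp(rhs, (t0, t1), np.eye(n).ravel(),
                    dense_output=True, rtol=1e-10, atol=1e-12)
    return lambda t: sol.sol(t).reshape(n, n)


X_A = fundamental(A)
X_AB = fundamental(lambda t: A(t) + B(t))

# D(t) = X_A(t)^{-1} B(t) X_A(t)
D = lambda t: np.linalg.solve(X_A(t), B(t) @ X_A(t))
X_D = fundamental(D)

print(np.allclose(X_AB(t1), X_A(t1) @ X_D(t1), atol=1e-6))
</pre>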

The fundamental matrix makes it possible to write every solution of the inhomogeneous system

$$ \dot{x} = A( t) x + b( t) , $$

in which the function $ b( t) $ is locally summable on $ J $, in the form of Cauchy's formula

$$ x(t) = X(t) x(t_{0}) + \int\limits_{t_{0}}^{t} C(t, s) b(s) \, ds, \qquad t \in J; $$

here

$$ C(t, s) = X(t) [X(s)]^{-1} $$

is called the Cauchy matrix of (*). The Cauchy matrix $ C( t, s) $ is jointly continuous in its arguments on $ J \times J $ and for arbitrary $ t, s, r \in J $ it has the properties

1) $C(t, s) = C(t, t_{0}) [C(s, t_{0})]^{-1}$;

2) $ C( t, s) = C( t, r) C( r, s) $;

3) $C(s, t) = [C(t, s)]^{-1}$;

4) $ C( t, t) = I $;

5) $| C(t, s) | \leq \mathop{\rm exp} \int_{s}^{t} | A(r) | \, dr$, $s \leq t$, where $| \cdot |$ is the matrix norm induced by the norm on $\mathbf R^{n}$;

6) if $ H( t, s) $ is the Cauchy matrix of the adjoint system

$$ \dot{x} = - A ^ {*} ( t) x, $$

then

$$ H(t, s) = [C^{*}(t, s)]^{-1}. $$
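The following sketch (Python with NumPy/SciPy; the coefficient matrix, the forcing term and all names are illustrative assumptions) evaluates the Cauchy matrix $C(t, s) = X(t)[X(s)]^{-1}$, checks properties 2) and 4), and compares Cauchy's formula with direct integration of the inhomogeneous system:

<pre>
import numpy as np
from scipy.integrate import solve_ivp


def A(t):
    return np.array([[0.0, 1.0],
                     [-1.0, -0.1 * np.cos(t)]])   # illustrative A(t)


def b(t):
    return np.array([0.0, np.sin(2.0 * t)])       # illustrative forcing term


t0, t1, n = 0.0, 2.0, 2
x0 = np.array([1.0, -0.5])

# fundamental matrix as a continuous function of t
rhs = lambda s, y: (A(s) @ y.reshape(n, n)).ravel()
sol = solve_ivp(rhs, (t0, t1), np.eye(n).ravel(),
                dense_output=True, rtol=1e-10, atol=1e-12)
X = lambda t: sol.sol(t).reshape(n, n)
C = lambda t, s: X(t) @ np.linalg.inv(X(s))        # Cauchy matrix C(t, s)

# properties 4) and 2)
print(np.allclose(C(t1, t1), np.eye(n)))
print(np.allclose(C(t1, t0), C(t1, 1.0) @ C(1.0, t0)))

# Cauchy's formula: x(t) = X(t) x(t0) + integral of C(t, s) b(s) ds
ss = np.linspace(t0, t1, 4001)
vals = np.array([C(t1, s) @ b(s) for s in ss])
h = ss[1] - ss[0]
integral = h * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])
x_formula = X(t1) @ x0 + integral

# compare with direct integration of the inhomogeneous system
x_direct = solve_ivp(lambda s, x: A(s) @ x + b(s), (t0, t1), x0,
                     rtol=1e-10, atol=1e-12).y[:, -1]
print(np.allclose(x_formula, x_direct, atol=1e-6))
</pre>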

References

[1] N. Bourbaki, "Elements of mathematics. Functions of a real variable" , Addison-Wesley (1976) (Translated from French)
[2] F.R. [F.R. Gantmakher] Gantmacher, "The theory of matrices" , 1 , Chelsea, reprint (1977) (Translated from Russian)
[3] B.P. Demidovich, "Lectures on the mathematical theory of stability" , Moscow (1967) (In Russian)
[4] V.A. Yakubovich, V.M. Starzhinskii, "Linear differential equations with periodic coefficients" , Wiley (1975) (Translated from Russian)

Comments

The term "matrizant" is no longer in common use; instead the term "transition matrixtransition matrix" has become popular for what is called above "fundamental matrix" . See also Fundamental system of solutions.

Cauchy's formula is often called the variation of constants formula, and the Cauchy matrix is also called the transition matrix (cf. also Cauchy matrix).

References

[a1] R.W. Brockett, "Finite dimensional linear systems" , Wiley (1970)
[a2] J.K. Hale, "Ordinary differential equations" , Wiley (1980)
This article was adapted from an original article by Yu.V. Komlenko (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.