Consider the following model of switched multilayer neural networks [3]:

$\dot{x}(t) = -A_\alpha x(t) + W_\alpha f(V_\alpha x(t)) + u(t)$,  (1)

$y(t) = C x(t)$,  (2)

where $x(t) \in \mathbb{R}^n$ is the state vector, $V_\alpha x(t)$ is a linear combination of the states, $A_\alpha$ ($A_\alpha = \operatorname{diag}\{a_1, \ldots, a_n\}$, $a_i > 0$) is the self-feedback matrix, $W_\alpha$ and $V_\alpha$ are the connection weight matrices, $f(\cdot)$ is the nonlinear function vector satisfying the global Lipschitz condition with Lipschitz constant $L_f > 0$, $u(t)$ is an external input vector, and $C$ is a known constant matrix. $\alpha$ is a switching signal which takes its values in the finite set $\mathcal{I} = \{1, 2, \ldots, N\}$. The matrices $(A_\alpha, W_\alpha, V_\alpha)$ are allowed to take values, at an arbitrary time, in the finite set $\{(A_1, W_1, V_1), \ldots, (A_N, W_N, V_N)\}$. Throughout this paper, we assume that the switching rule $\alpha$ is not known a priori and that its instantaneous value is available in real time. Define the indicator function $\xi(t) = [\xi_1(t), \ldots, \xi_N(t)]^{\top}$, where

$\xi_i(t) = \begin{cases} 1, & \text{when the switched system is described by the } i\text{th mode } (A_i, W_i, V_i), \\ 0, & \text{otherwise,} \end{cases}$
with $i = 1, 2, \ldots, N$. By using this indicator function, the model of the switched multilayer neural networks (1)-(2) can be written as

$\dot{x}(t) = \sum_{i=1}^{N} \xi_i(t) \big[ -A_i x(t) + W_i f(V_i x(t)) + u(t) \big]$,  (3)

$y(t) = C x(t)$,  (4)
where $\sum_{i=1}^{N} \xi_i(t) = 1$ is satisfied under any switching rules. Let $\gamma > 0$ be a predefined level of disturbance attenuation. In this paper, for a given $\gamma > 0$, we derive sets of criteria such that the switched multilayer neural network (3)-(4) with $u(t) = 0$ is exponentially stable ($\|x(t)\| \le \kappa e^{-\rho t} \|x(0)\|$, where $\kappa \ge 1$ and $\rho > 0$) and
$\int_0^{\infty} y^{\top}(t) y(t) \, dt \le \gamma^2 \int_0^{\infty} u^{\top}(t) u(t) \, dt$  (5)
under zero-initial conditions for all nonzero $u(t) \in \mathcal{L}_2[0, \infty)$, where $\mathcal{L}_2[0, \infty)$ is the space of square integrable vector functions over $[0, \infty)$.
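As a quick illustration of the model (3)-(4) and the attenuation bound (5), the following Python sketch integrates a hypothetical two-mode network with a forward-Euler scheme and estimates the truncated ratio $\|y\|_2 / \|u\|_2$. Every matrix, the switching pattern, the noise level, and the choice $f = \tanh$ (so that $L_f = 1$) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical two-mode network (N = 2); these matrices are illustrative
# placeholders, not parameters taken from the paper.
A = [np.diag([2.0, 3.0]), np.diag([2.5, 2.0])]          # self-feedback A_i > 0
W = [np.array([[0.3, -0.2], [0.1, 0.4]]),
     np.array([[-0.2, 0.1], [0.3, 0.2]])]               # connection weights W_i
V = [np.array([[0.5, 0.1], [-0.3, 0.4]]),
     np.array([[0.2, -0.4], [0.1, 0.3]])]               # connection weights V_i
C = np.eye(2)                                            # known output matrix
f = np.tanh                                              # Lipschitz nonlinearity, L_f = 1

def alpha(t):
    """Assumed periodic switching: mode 1 on [2k, 2k+1), mode 2 otherwise."""
    return 0 if int(t) % 2 == 0 else 1

dt, T = 1e-3, 20.0
x = np.zeros(2)                                          # zero-initial condition
y_energy = u_energy = 0.0
rng = np.random.default_rng(0)

for k in range(int(T / dt)):
    t = k * dt
    i = alpha(t)                                         # indicator: xi_i(t) = 1 for the active mode
    u = 0.1 * rng.standard_normal(2)                     # small white-noise disturbance
    x = x + dt * (-A[i] @ x + W[i] @ f(V[i] @ x) + u)    # Euler step of (3)
    y = C @ x                                            # output (4)
    y_energy += y @ y * dt                               # accumulate ||y||_2^2 on [0, T]
    u_energy += u @ u * dt                               # accumulate ||u||_2^2 on [0, T]

print("empirical L2 gain estimate:", np.sqrt(y_energy / u_energy))
```

An empirical ratio below a candidate $\gamma$ is of course no proof; the criteria derived below certify the bound for all admissible inputs and switching rules.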
A generalized exponential stability criterion for the switched multilayer neural network (3)-(4) is derived in the following theorem.
Theorem 1 For given $\gamma > 0$ and $\rho > 0$, the switched multilayer neural network (3)-(4) is generalized exponentially stable if the conditions (6)-(9) hold for $i = 1, 2, \ldots, N$, where $\lambda_{\min}(\cdot)$ denotes the minimum eigenvalue of a matrix and $P$ satisfies the Lyapunov inequality.
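For reference, the proofs of Theorem 1 and Theorem 2 below repeatedly invoke the weighted Young inequality [20]: for any vectors $a, b \in \mathbb{R}^n$ and any scalar $\varepsilon > 0$,

$2 a^{\top} b \le \varepsilon \, a^{\top} a + \varepsilon^{-1} b^{\top} b$,

which follows from expanding $(\sqrt{\varepsilon}\, a - \varepsilon^{-1/2} b)^{\top} (\sqrt{\varepsilon}\, a - \varepsilon^{-1/2} b) \ge 0$; the proofs apply it with particular choices of $\varepsilon$.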
Proof We consider the Lyapunov function $V(x(t)) = x^{\top}(t) P x(t)$. The time derivative of the function along the trajectory of (3) satisfies

$\dot{V}(x(t)) = \sum_{i=1}^{N} \xi_i(t) \big[ x^{\top}(t) ( -A_i^{\top} P - P A_i ) x(t) + 2 x^{\top}(t) P W_i f(V_i x(t)) + 2 x^{\top}(t) P u(t) \big]$.  (10)
Applying Young’s inequality [20], we have $2 x^{\top}(t) P W_i f(V_i x(t)) \le x^{\top}(t) P W_i W_i^{\top} P x(t) + L_f^2 x^{\top}(t) V_i^{\top} V_i x(t)$, and $2 x^{\top}(t) P u(t) \le \gamma^{-2} x^{\top}(t) P P x(t) + \gamma^2 u^{\top}(t) u(t)$. Substituting these inequalities into (10), we have

$\dot{V}(x(t)) \le \sum_{i=1}^{N} \xi_i(t) \, x^{\top}(t) \big[ -A_i^{\top} P - P A_i + P W_i W_i^{\top} P + L_f^2 V_i^{\top} V_i + \gamma^{-2} P P \big] x(t) + \gamma^2 u^{\top}(t) u(t)$.  (11)
If the following condition is satisfied,

$-A_i^{\top} P - P A_i + P W_i W_i^{\top} P + L_f^2 V_i^{\top} V_i + \gamma^{-2} P P + C^{\top} C + \rho P \le 0$  (12)

for $i = 1, 2, \ldots, N$, we have

$\dot{V}(x(t)) \le -\rho V(x(t)) - y^{\top}(t) y(t) + \gamma^2 u^{\top}(t) u(t)$.  (13)
The following inequalities
(14)
for $i = 1, 2, \ldots, N$, imply the condition (12). Thus, we obtain (6) and (7). Under the zero-initial condition, we have $V(x(0)) = 0$ and $V(x(t)) \ge 0$ for all $t \ge 0$. Define
$J(T) = \int_0^T \big[ y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) \big] \, dt$.  (15)
Then, for any nonzero $u(t)$, we obtain

$J(T) \le \int_0^T \big[ y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) + \dot{V}(x(t)) \big] \, dt$.

From (13), we have $y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) + \dot{V}(x(t)) \le -\rho V(x(t)) \le 0$. It means $J(T) \le 0$.
The condition (9) implies

$\int_0^T y^{\top}(t) y(t) \, dt \le \gamma^2 \int_0^T u^{\top}(t) u(t) \, dt$.  (16)
Taking the supremum over $T > 0$ leads to (5). This completes the proof. □
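The two ingredients named in Theorem 1, a minimum eigenvalue and a matrix $P$ solving a Lyapunov inequality, can be computed numerically. The sketch below is a minimal illustration under assumed data: the mode matrix A1 and the right-hand side Q are hypothetical, and the specific inequalities (6)-(9) are not reproduced by this code.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical data for one mode; these are not the matrices of the paper.
A1 = np.diag([2.0, 3.0])      # self-feedback matrix A_1 (positive diagonal)
Q = np.eye(2)                 # chosen positive definite right-hand side

# solve_continuous_lyapunov(a, q) solves a @ X + X @ a.T = q, so the call
# below returns P with A1 @ P + P @ A1.T = Q, i.e., a positive definite P
# satisfying a Lyapunov equation for the stable matrix -A1.
P = solve_continuous_lyapunov(-A1, -Q)

print("P =\n", P)
print("P > 0:", bool(np.all(np.linalg.eigvalsh(P) > 0)))          # certificate positivity
print("lambda_min(Q) =", float(np.linalg.eigvalsh(Q).min()))      # minimum eigenvalue, as in Theorem 1
```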
Corollary 1 When $u(t) = 0$, the conditions (6)-(9) ensure that the switched multilayer neural network (3)-(4) is exponentially stable.
Proof When $u(t) = 0$, from (13), $\dot{V}(x(t)) \le -\rho V(x(t))$ for all $t \ge 0$. Thus, for any $t \ge 0$, it implies that

$V(x(t)) \le e^{-\rho t} V(x(0))$.  (17)

Finally, since $\lambda_{\min}(P) \|x(t)\|^2 \le V(x(t)) \le \lambda_{\max}(P) \|x(t)\|^2$, we have

$\|x(t)\| \le \sqrt{\lambda_{\max}(P) / \lambda_{\min}(P)} \, e^{-\rho t / 2} \|x(0)\|$.  (18)
This completes the proof. □
In the next theorem, we derive a new set of LMI criteria for the generalized exponential stability of the switched multilayer neural network (3)-(4). This set of LMI criteria can be checked readily via standard numerical algorithms [21, 22].
Theorem 2 For given level $\gamma > 0$ and $\rho > 0$, the switched multilayer neural network (3)-(4) is generalized exponentially stable if there exist a positive symmetric matrix $P$ and a positive scalar $\epsilon$ such that the LMIs (19) and (20) hold for $i = 1, 2, \ldots, N$.
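Since the explicit blocks of (19)-(20) are not reproduced here, the following CVXPY sketch only illustrates how an LMI of this family can be checked numerically; the Schur-complement block structure shown is the one induced by a condition of the type (12) with the scalar $\epsilon$ from Young's inequality, and all matrices as well as the scalars Lf, gamma, rho, and margin are assumptions for illustration.

```python
import numpy as np
import cvxpy as cp

# Illustrative single-mode data (n = 2); not the matrices of the paper.
A = np.diag([2.0, 3.0])                  # self-feedback matrix
W = np.array([[0.3, -0.2], [0.1, 0.4]])  # connection weights
V = np.array([[0.5, 0.1], [-0.3, 0.4]])
C = np.eye(2)
Lf, gamma, rho = 1.0, 1.5, 0.2           # assumed Lipschitz constant, level, decay rate

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)  # Lyapunov matrix P > 0
eps = cp.Variable()                      # positive scalar from Young's inequality
Z = np.zeros((n, n))

# Schur-complement style LMI (assumed form): feasibility of M << 0 certifies
# a quadratic condition like (12) for this mode.
M = cp.bmat([
    [-A.T @ P - P @ A + eps * (Lf**2 * V.T @ V) + C.T @ C + rho * P, P @ W, P],
    [W.T @ P, -eps * np.eye(n), Z],
    [P, Z, -(gamma**2) * np.eye(n)],
])

margin = 1e-6
constraints = [P >> margin * np.eye(n), eps >= margin,
               M << -margin * np.eye(3 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)   # pure feasibility problem
prob.solve(solver=cp.SCS)
print("status:", prob.status)                    # 'optimal' indicates feasibility
```

For the switched network, one such block constraint per mode $i = 1, \ldots, N$ is imposed with a common $P$ and $\epsilon$.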
Proof Consider the Lyapunov function $V(x(t)) = x^{\top}(t) P x(t)$. Applying Young’s inequality [20], we have $2 x^{\top}(t) P W_i f(V_i x(t)) \le \epsilon^{-1} x^{\top}(t) P W_i W_i^{\top} P x(t) + \epsilon L_f^2 x^{\top}(t) V_i^{\top} V_i x(t)$. By using this inequality, the time derivative of $V(x(t))$ along the trajectory of (3) is

$\dot{V}(x(t)) \le \sum_{i=1}^{N} \xi_i(t) \, x^{\top}(t) \big[ -A_i^{\top} P - P A_i + \epsilon^{-1} P W_i W_i^{\top} P + \epsilon L_f^2 V_i^{\top} V_i \big] x(t) + 2 x^{\top}(t) P u(t)$.  (21)
If the LMI (19) is satisfied, we have

$\dot{V}(x(t)) \le -\rho V(x(t)) - y^{\top}(t) y(t) + \gamma^2 u^{\top}(t) u(t)$.  (22)
Under the zero-initial condition, one has $V(x(0)) = 0$ and $V(x(t)) \ge 0$ for all $t \ge 0$. Define

$J(T) = \int_0^T \big[ y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) \big] \, dt$.  (23)
Then, for any nonzero $u(t)$, we obtain

$J(T) \le \int_0^T \big[ y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) + \dot{V}(x(t)) \big] \, dt$.

From (22), we have $y^{\top}(t) y(t) - \gamma^2 u^{\top}(t) u(t) + \dot{V}(x(t)) \le -\rho V(x(t)) \le 0$. It means $J(T) \le 0$.
The LMI (20) implies

$\int_0^T y^{\top}(t) y(t) \, dt \le \gamma^2 \int_0^T u^{\top}(t) u(t) \, dt$.  (24)
Taking the supremum over $T > 0$ leads to (5). This completes the proof. □
Corollary 2 When $u(t) = 0$, the LMI conditions (19)-(20) ensure that the switched multilayer neural network (3)-(4) is exponentially stable.
Proof When $u(t) = 0$, from (22), we have

$\dot{V}(x(t)) \le -\rho V(x(t))$.  (25)

Thus, for any $t \ge 0$, it implies that

$V(x(t)) \le e^{-\rho t} V(x(0))$.  (26)

Finally, since $\lambda_{\min}(P) \|x(t)\|^2 \le V(x(t)) \le \lambda_{\max}(P) \|x(t)\|^2$, we have

$\|x(t)\| \le \sqrt{\lambda_{\max}(P) / \lambda_{\min}(P)} \, e^{-\rho t / 2} \|x(0)\|$.  (27)
This completes the proof. □
Example 1 Consider the switched neural network (3)-(4), where
Applying Theorem 2 with the prescribed $\gamma$ and $\rho$, we obtain
The switching signal $\alpha(t)$ is given by

$\alpha(t) = \begin{cases} 1, & t \in [2k, 2k+1), \\ 2, & t \in [2k+1, 2k+2), \end{cases} \qquad k \in Z,$

where $Z$ is the set of all nonnegative integers. Figure 1 shows the state trajectories when the external input $u(t)$ is a white noise.
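Figure 1 cannot be reproduced here without the example's matrices, but the following sketch, reusing the same placeholder two-mode data as the earlier simulation together with the periodic switching pattern above, generates comparable state trajectories under a white-noise input; the initial condition and noise level are also assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder two-mode data; the example's actual matrices are not shown here.
A = [np.diag([2.0, 3.0]), np.diag([2.5, 2.0])]
W = [np.array([[0.3, -0.2], [0.1, 0.4]]),
     np.array([[-0.2, 0.1], [0.3, 0.2]])]
V = [np.array([[0.5, 0.1], [-0.3, 0.4]]),
     np.array([[0.2, -0.4], [0.1, 0.3]])]

dt, T = 1e-3, 10.0
ts = np.arange(0.0, T, dt)
x = np.array([1.0, -1.0])                 # assumed nonzero initial condition
rng = np.random.default_rng(1)
traj = np.empty((ts.size, 2))

for k, t in enumerate(ts):
    i = 0 if int(t) % 2 == 0 else 1       # alpha(t): mode 1 on [2k, 2k+1), mode 2 on [2k+1, 2k+2)
    u = 0.1 * rng.standard_normal(2)      # white-noise external input
    x = x + dt * (-A[i] @ x + W[i] @ np.tanh(V[i] @ x) + u)
    traj[k] = x

plt.plot(ts, traj[:, 0], label="x1(t)")
plt.plot(ts, traj[:, 1], label="x2(t)")
plt.xlabel("t"); plt.ylabel("states"); plt.legend(); plt.show()
```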