Exponential stability for a stochastic delay neural network with impulses
Advances in Difference Equations volume 2014, Article number: 250 (2014)
Abstract
In this paper, we study the exponential stability of a stochastic delay neural network with impulses. By employing fixed point theory and some analysis techniques, sufficient conditions are derived for the exponential stability in mean square of the considered system.
MSC:34K20.
1 Introduction
Neural networks have found important applications in various areas such as combinatorial optimization, signal processing, pattern recognition, and the solution of nonlinear algebraic equations. Many practical systems exhibit time delays, and time-delay systems have therefore attracted considerable attention [1–8]. As is well known, stochastic functional differential systems, which include stochastic delay differential systems, have been widely used, since stochastic modeling plays an important role in many branches of science and engineering [9]. Consequently, the stability analysis of these systems has received a lot of attention [10–12].
In some applications, besides delay and stochastic effects, impulsive effects are also likely to be present [13, 14]; they may either stabilize or destabilize a system. Therefore, it is of interest to take delay effects, stochastic effects, and impulsive effects into account when studying the dynamical behavior of neural networks.
In [11], Guo et al. studied the exponential stability of a stochastic neutral cellular neural network without impulses and, by using fixed point theory, obtained new criteria for the exponential stability in mean square of the considered network. To the best of the authors' knowledge, only a few papers use fixed point theory to discuss the stability of stochastic neural networks. In this paper, we study the exponential stability of a stochastic neural network with impulses by means of the contraction mapping theorem and Krasnoselskii's fixed point theorem.
2 Some preliminaries
Throughout this paper, unless otherwise specified, we let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions, i.e., it is right continuous and $\mathcal{F}_0$ contains all P-null sets, and we let the initial data belong to the family of all bounded, $\mathcal{F}_0$-measurable functions. Let $\mathbb{R}^n$ denote the n-dimensional real space equipped with the Euclidean norm $|\cdot|$; for a matrix, its norm is denoted by $\|\cdot\|$.
In this paper, by using fixed point theory, we discuss the stability of an impulsive stochastic delayed neural network, referred to below as system (2.1).
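A typical system of this type, written here only as a sketch (the matrices $A$ and $B$, the activation $f$, the bounded delay $\tau(t)$, the diffusion $\sigma$, and the impulse functions $I_k$ acting at times $t_k$ are assumed names rather than the exact notation of (2.1)), is

$$
\begin{cases}
dx(t) = \bigl[-A\,x(t) + B\,f\bigl(x(t - \tau(t))\bigr)\bigr]\,dt + \sigma\bigl(t, x(t), x(t - \tau(t))\bigr)\,dw(t), & t \ge 0,\ t \ne t_k,\\
x(t_k^{+}) - x(t_k^{-}) = I_k\bigl(x(t_k)\bigr), & k = 1, 2, \ldots,\\
x(s) = \phi(s), & s \in [-\tau, 0],
\end{cases}
$$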
where the constants involved are positive; $x(t) = (x_1(t), \ldots, x_n(t))^{T}$, with the norm defined by $|x(t)| = \bigl(\sum_{i=1}^{n} x_i^{2}(t)\bigr)^{1/2}$, is the state vector; $B = (b_{ij})_{n \times n}$ is the connection weight constant matrix with appropriate dimensions; and $\tau(t)$ represents the time-varying parameter, which is uncertain but bounded.
Here $f$ is the neuron activation function with $f(0) = 0$, and $w(t)$ is an m-dimensional Brownian motion defined on $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$. The stochastic disturbance term $\sigma\bigl(t, x(t), x(t - \tau(t))\bigr)$ can be viewed as stochastic perturbations on the neuron states and delayed neuron states, with $\sigma(t, 0, 0) \equiv 0$. $I_k(x(t_k))$ is the impulse at the moment $t_k$, and $\{t_k\}$ is a strictly increasing sequence such that $\lim_{k \to \infty} t_k = \infty$; $x(t_k^{+})$ and $x(t_k^{-})$ stand for the right-hand and left-hand limits of $x(t)$ at $t_k$, respectively. $x(t_k^{+}) - x(t_k^{-}) = I_k(x(t_k))$ shows the abrupt change of $x(t)$ at the impulsive moment $t_k$, and $x(t_k^{-}) = x(t_k)$.
The local Lipschitz condition and the linear growth condition on the functions $f$ and $\sigma$ guarantee the existence and uniqueness of a global solution of system (2.1); we refer to [9] for detailed information. Clearly, system (2.1) admits the trivial solution $x(t) \equiv 0$.
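To make the setting concrete, the following is a minimal numerical sketch of one sample path of such a system, using the Euler–Maruyama scheme together with state jumps at the impulse times. All parameter values, as well as the particular drift, diffusion, and impulse maps, are illustrative assumptions matching the sketched form above, not an example taken from the paper.

```python
import numpy as np

# Hypothetical illustration (all constants are made up, not from the paper):
# Euler-Maruyama simulation of one path of an impulsive stochastic delayed
# network of the sketched form
#   dx = [-A x(t) + B f(x(t - tau))] dt + sigma(x(t), x(t - tau)) dw(t),
#   x(t_k^+) = x(t_k^-) + I_k(x(t_k)),   with I_k(x) = gamma * x.

rng = np.random.default_rng(0)

n, dt, T, tau = 2, 1e-3, 10.0, 0.5           # dimension, step, horizon, delay
A = np.diag([3.0, 4.0])                      # positive decay matrix
B = np.array([[0.5, -0.3], [0.2, 0.4]])      # connection weight matrix
f = np.tanh                                  # activation, f(0) = 0
sigma = lambda x, xd: 0.1 * np.diag(xd)      # diffusion vanishing at the origin
impulse_times = np.arange(1.0, T, 1.0)       # t_k = 1, 2, ...
gamma = -0.2                                 # Lipschitz impulse map, I_k(0) = 0

steps = int(T / dt)                          # grid points on (0, T]
dsteps = int(tau / dt)                       # history points on [-tau, 0]
x = np.zeros((dsteps + steps + 1, n))
x[: dsteps + 1] = 0.5                        # constant initial history phi

k = 0
for i in range(dsteps, dsteps + steps):
    t_next = (i + 1 - dsteps) * dt           # time of the new grid point
    xd = x[i - dsteps]                       # delayed state x(t - tau)
    dw = rng.normal(scale=np.sqrt(dt), size=n)
    x[i + 1] = x[i] + (-A @ x[i] + B @ f(xd)) * dt + sigma(x[i], xd) @ dw
    if k < len(impulse_times) and t_next >= impulse_times[k]:
        x[i + 1] += gamma * x[i + 1]         # apply the impulse I_k
        k += 1

print("sample value of |x(T)|^2:", float(np.sum(x[-1] ** 2)))
```

In a simulation of this kind, the mean square $E|x(t)|^2$ can be estimated by averaging $|x(t)|^2$ over many independent paths; this is the quantity that Definition 2.1 below requires to decay exponentially.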
Definition 2.1 System (2.1) is said to be exponentially stable in mean square for all admissible uncertainties if there exist a solution x of (2.1) and a pair of positive constants β and μ for which the mean square of the solution decays exponentially, in the sense sketched below.
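A standard way of writing this requirement (the exact factor in front of the exponential may involve the initial data; the form below is a sketch) is
$$
E\bigl|x(t)\bigr|^{2} \;\le\; \beta \,\sup_{-\tau \le s \le 0} E\bigl|\phi(s)\bigr|^{2}\, e^{-\mu t}, \qquad t \ge 0 .
$$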
In order to prove the exponential stability in mean square of system (2.1), we need the following lemmas.
Lemma 2.1 (Krasnoselskii)
Suppose that Ω is a Banach space and X is a bounded, convex, and closed subset of Ω. Let the operators $U, S : X \to \Omega$ satisfy the following conditions:
- (1) $Ux + Sy \in X$ for any $x, y \in X$;
- (2) U is a contraction mapping;
- (3) S is continuous and compact.
Then $U + S$ has a fixed point in X.
Lemma 2.2 ($C_p$ inequality)
If $p \ge 1$ and $a_1, a_2, \ldots, a_m \in \mathbb{R}$, then
$$\Bigl|\sum_{i=1}^{m} a_i\Bigr|^{p} \;\le\; m^{p-1} \sum_{i=1}^{m} |a_i|^{p},$$
and, for random variables $\xi_1, \ldots, \xi_m$ with finite pth moments,
$$E\Bigl|\sum_{i=1}^{m} \xi_i\Bigr|^{p} \;\le\; m^{p-1} \sum_{i=1}^{m} E|\xi_i|^{p}.$$
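For instance, with $p = 2$ and $m = 4$ this gives
$$E\bigl|\xi_1 + \xi_2 + \xi_3 + \xi_4\bigr|^{2} \;\le\; 4\bigl(E|\xi_1|^{2} + E|\xi_2|^{2} + E|\xi_3|^{2} + E|\xi_4|^{2}\bigr),$$
which is the way the terms of the integral equation (3.1) below are typically separated before each one is estimated; the choice $m = 4$ here is only illustrative.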
3 Main results
Let $\mathcal{B}$ be the Banach space of all $\mathcal{F}_t$-adapted processes $x(t, \omega) : [-\tau, \infty) \times \Omega \to \mathbb{R}^n$ such that $x(t)$ is continuous in mean square at every $t \ne t_k$, the limits $x(t_k^{+})$ and $x(t_k^{-})$ exist, and $x(t_k^{-}) = x(t_k)$, $k = 1, 2, \ldots$; the space is equipped with the norm
$$\|x\|_{\mathcal{B}} = \Bigl(\sup_{t \ge -\tau} E|x(t)|^{2}\Bigr)^{1/2}.$$
Let Λ be the set of all $x \in \mathcal{B}$ such that $x(s) = \phi(s)$ on $[-\tau, 0]$ and $E|x(t)|^{2} \to 0$ as $t \to \infty$. It is clear that Λ is a bounded, convex, and closed subset of $\mathcal{B}$.
To obtain our results, we suppose that the following conditions are satisfied (an illustrative sketch of conditions of this type is given after the list):
- (H1) there exist constants such that ;
- (H2) there exist constants such that ;
- (H3) there exists an such that ;
- (H4) for each $t_k$, the mapping $I_k$ satisfies $I_k(0) = 0$ and is a globally Lipschitz function with Lipschitz constants ;
- (H5) there exists a constant ρ such that ;
- (H6) there exists a constant p such that , for and .
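For orientation only, Lipschitz-type hypotheses of this kind are commonly stated as follows; the constants $L$, $\mu_\sigma$, and $q_k$ and the exact inequalities below are illustrative assumptions rather than the original conditions (H1)-(H6):
$$|f(u) - f(v)| \le L\,|u - v|, \qquad
\bigl|\sigma(t, u_1, v_1) - \sigma(t, u_2, v_2)\bigr|^{2} \le \mu_\sigma\bigl(|u_1 - u_2|^{2} + |v_1 - v_2|^{2}\bigr),$$
$$|I_k(u) - I_k(v)| \le q_k\,|u - v|, \qquad I_k(0) = 0, \qquad k = 1, 2, \ldots .$$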
The solution $x(t)$ of system (2.1) is, in the time variable t, a piecewise continuous vector-valued function with discontinuities of the first kind at the points $t_k$ ($k = 1, 2, \ldots$), where it is left continuous, i.e., $x(t_k^{-}) = x(t_k)$.
Theorem 3.1 Assume that (H1)-(H6) hold and that the following condition (P1) is satisfied:
Then system (2.1) is exponentially stable in mean square for all admissible uncertainties, that is, $E|x(t)|^{2} \to 0$ as $t \to \infty$.
Proof System (2.1) is equivalent to
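the integral equation (3.1). Written here only as a sketch, in the variation-of-constants form that matches the system sketched in Section 2 (the symbols $A$, $B$, $f$, $\sigma$, $\tau(t)$, $I_k$, and $t_k$ are the assumed names used there), this equation reads, for $t \ge 0$,
$$
x(t) = e^{-At}\phi(0)
+ \int_{0}^{t} e^{-A(t-s)} B f\bigl(x(s - \tau(s))\bigr)\,ds
+ \int_{0}^{t} e^{-A(t-s)} \sigma\bigl(s, x(s), x(s - \tau(s))\bigr)\,dw(s)
+ \sum_{0 < t_k < t} e^{-A(t - t_k)} I_k\bigl(x(t_k)\bigr),
$$
with $x(s) = \phi(s)$ on $[-\tau, 0]$.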
Define an operator Q on Λ by $(Qx)(t) = \phi(t)$ for $t \in [-\tau, 0]$, and for $t \ge 0$ we define $(Qx)(t)$ as the right-hand side of (3.1). From the definition of Λ, we have $x(s) = \phi(s)$ on $[-\tau, 0]$ for every $x \in \Lambda$, and $E|x(t)|^{2} \to 0$ as $t \to \infty$.
Next, we prove that $Q(\Lambda) \subset \Lambda$. It is clear that $(Qx)(t)$ is continuous on $[-\tau, 0]$. For a fixed time $t$ with $t \ne t_k$, it is easy to check that the non-impulsive terms on the right-hand side of (3.1) are continuous in mean square at $t$. In the following, we check the mean square continuity of the impulsive term at a fixed time $t$ ($t \ne t_k$).
Let $|r|$ be sufficiently small; we obtain
Hence, the impulsive term is continuous in mean square at the fixed time $t$, for $t \ne t_k$. On the other hand, for $t = t_k$, it is easy to check that the non-impulsive terms of (3.1) are continuous in mean square at the fixed time $t_k$.
Let $r > 0$ be small enough; we have
which implies that the right-hand limit $(Qx)(t_k^{+})$ exists.
Let $r > 0$ be small enough; we have
which implies that $(Qx)(t_k^{-}) = (Qx)(t_k)$.
Hence, we see that $(Qx)(t)$ is continuous in mean square on $[0, \infty) \setminus \{t_k : k \ge 1\}$, and for every $t_k$ the limits $(Qx)(t_k^{+})$ and $(Qx)(t_k^{-})$ exist. Furthermore, we also obtain $Qx \in \mathcal{B}$.
It follows from (3.1) that
By (H3), it is easy to see that the first term on the right-hand side tends to 0 in mean square as $t \to \infty$. Now, we prove that the remaining terms also tend to 0 in mean square as $t \to \infty$.
Note that, for any $\varepsilon > 0$, there exists a time $T_1 > 0$ such that $t \ge T_1$ implies $E|x(t - \tau(t))|^{2} < \varepsilon$. Hence, we have from (H1), (H3)
where $\lambda_{\min}(A)$ represents the minimal eigenvalue of A (so that, for instance when A is a positive diagonal matrix, $\|e^{-At}\| \le e^{-\lambda_{\min}(A) t}$). Thus, we have that this term tends to 0 in mean square as $t \to \infty$.
From (H2) and (H3), we have
As $t \to \infty$, we have $E|x(t)|^{2} \to 0$. Then, for any $\varepsilon > 0$, there exists a nonimpulsive point $T_2 > 0$ such that $s \ge T_2$ implies $E|x(s)|^{2} < \varepsilon$. It then follows from conditions (H4)-(H6) that
Thus we conclude that $E|(Qx)(t)|^{2} \to 0$ as $t \to \infty$, and therefore $Q(\Lambda) \subset \Lambda$.
Finally, we prove that Q is a contraction mapping. For any $x, y \in \Lambda$, we obtain
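an estimate of the following shape (written here only as a sketch; the constant $k$ below collects the decay and Lipschitz constants from (H1)-(H6)):
$$\sup_{t \ge 0} E\bigl|(Qx)(t) - (Qy)(t)\bigr|^{2} \;\le\; k \,\sup_{t \ge -\tau} E\bigl|x(t) - y(t)\bigr|^{2}.$$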
From condition (P1), we find that Q is a contraction mapping. Hence, by the contraction mapping principle, we see that Q has a unique fixed point $x(\cdot)$ in Λ, which is a solution of (2.1) with $x(s) = \phi(s)$ on $[-\tau, 0]$ and $E|x(t)|^{2} \to 0$ as $t \to \infty$. □
The second result is established using Krasnoselskii’s fixed point theorem.
Theorem 3.2 Assume that (H1)-(H6) hold and that the following condition (P2) is satisfied:
Then system (2.1) is exponentially stable in mean square for all admissible uncertainties, that is, $E|x(t)|^{2} \to 0$ as $t \to \infty$.
Proof For $x \in \Lambda$, define the operators U and S as sketched below.
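One natural choice of this splitting, consistent with the integral form sketched for (3.1) above, is the following; the assignment of terms to U and S is an assumption made here for illustration rather than the original definition:
$$(Ux)(t) = e^{-At}\phi(0) + \sum_{0 < t_k < t} e^{-A(t - t_k)} I_k\bigl(x(t_k)\bigr),$$
$$(Sx)(t) = \int_{0}^{t} e^{-A(t-s)} B f\bigl(x(s - \tau(s))\bigr)\,ds + \int_{0}^{t} e^{-A(t-s)} \sigma\bigl(s, x(s), x(s - \tau(s))\bigr)\,dw(s),$$
for $t \ge 0$, with $(Ux)(t) = \phi(t)$ and $(Sx)(t) = 0$ on $[-\tau, 0]$, so that $Qx = Ux + Sx$.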
By the proof of Theorem 3.1, we can verify that $Ux + Sy \in \Lambda$ when $x, y \in \Lambda$, and that S is mean square continuous.
Next, we show that U is a contraction mapping. For $x, y \in \Lambda$, we have
From the condition (P2), we find that U is a contraction mapping.
Finally, we prove that S is compact.
Let $D \subset \Lambda$ be a bounded set; for any $x \in D$ and $t \ge 0$, we have
Therefore, we can conclude that $\{Sx : x \in D\}$ is uniformly bounded.
Further, let $x \in D$ and $s_1, s_2 \ge 0$ with $|s_1 - s_2|$ sufficiently small; we have
Thus, the equicontinuity of S is obtained. According to the PC-type Ascoli-Arzela lemma [[15], Lemma 2.4], $S(D)$ is relatively compact in Λ. Therefore S is compact. By Lemma 2.1, $U + S$ has a fixed point x in Λ, and we note that $x(s) = \phi(s)$ on $[-\tau, 0]$ and $E|x(t)|^{2} \to 0$ as $t \to \infty$. This completes the proof. □
References
Huang H, Cao JD: Exponential stability analysis of uncertain stochastic neural networks with multiple delays. Nonlinear Anal., Real World Appl. 2007, 8: 646-653. 10.1016/j.nonrwa.2006.02.003
Wang ZD, Liu YR, Liu XH: On global asymptotic stability analysis of neural networks with discrete and distributed delays. Phys. Lett. A 2005, 345(5-6):299-308.
Wang ZD, Lauria S, Fang JA, Liu XH: Exponential stability of uncertain stochastic neural networks with mixed time-delays. Chaos Solitons Fractals 2007, 32: 62-72. 10.1016/j.chaos.2005.10.061
Wan L, Sun J: Mean square exponential stability of stochastic delayed Hopfield neural networks. Phys. Lett. A 2005, 343: 306-318. 10.1016/j.physleta.2005.06.024
Zheng Z: Theory of Functional Differential Equations. Anhui Education Press, Hefei; 1994.
Wei J: The Degeneration Differential Systems with Delay. Anhui University Press, Hefei; 1998.
Wei J: The constant variation formulae for singular fractional differential systems with delay. Comput. Math. Appl. 2010, 59(3):1184-1190. 10.1016/j.camwa.2009.07.010
Wei J: Variation formulae for time varying singular fractional delay differential systems. Fract. Differ. Calc. 2011, 1(1):105-115.
Mao X: Stochastic Differential Equations and Applications. Horwood, New York; 1997.
Mao X: Razumikhin-type theorems on exponential stability of stochastic functional differential equations. Stoch. Process. Appl. 1996, 65: 233-250. 10.1016/S0304-4149(96)00109-3
Guo C, O’Regan D, Deng F, Agarwal RP: Fixed points and exponential stability for a stochastic neutral cellular neural network. Appl. Math. Lett. 2013, 26: 849-853. 10.1016/j.aml.2013.03.011
Luo J: Exponential stability for stochastic neutral partial functional differential equations. J. Math. Anal. Appl. 2009, 355: 414-425. 10.1016/j.jmaa.2009.02.001
Peng S, Jia B: Some criteria on pth moment stability of impulsive stochastic functional differential equations. Stat. Probab. Lett. 2010, 80: 1085-1092. 10.1016/j.spl.2010.03.002
Zhang YT, Luo Q: Global exponential stability of impulsive cellular neural networks with time-varying delays via fixed point theory. Adv. Differ. Equ. 2013., 2013: Article ID 23
Bainov DD, Simeonov PS: Impulsive Differential Equations: Periodic Solutions and Applications. Longman, New York; 1993.
Acknowledgements
This research was supported by the National Natural Science Foundation of China (No. 11371027); the National Natural Science Foundation of China, Tian Yuan (No. 11326115); the Research Project for Academic Innovation (No. yfc100002); the Program of Natural Science of Colleges of Anhui Province (No. KJ2013A032); the Program of Natural Science Research in Anhui Universities (No. KJ2011A020 and No. KJ2012A019); and the Special Research Fund for the Doctoral Program of the Ministry of Education of China (No. 20123401120001 and No. 20103401120002).
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0), which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Zhang, S., Jiang, W. & Zhang, Z. Exponential stability for a stochastic delay neural network with impulses. Adv Differ Equ 2014, 250 (2014). https://doi.org/10.1186/1687-1847-2014-250
Keywords
- neural network
- exponential stability
- impulse
- fixed point theory