Biperiodicity in neutral-type delayed difference neural networks
Advances in Difference Equations volume 2012, Article number: 5 (2012)
Abstract
In this article, we employ Krasnoselskii's fixed point theorem to obtain new biperiodicity criteria for neutral-type difference neural networks with delays. It is shown that the neutral-type term can lead to biperiodicity, that is, the coexistence of a positive periodic sequence solution and its anti-sign periodic sequence solution. We illustrate the biperiodicity dynamics of neutral-type delayed difference neural networks by two numerical examples.
Mathematics Subject Classification 2010: 39A23; 39A10.
1 Introduction
It is well known that neural networks with delays exhibit rich dynamical behavior, which has recently been investigated by Huang and Li [1] and the references therein. It is natural that such systems should contain some information about the past rate of change, since this allows them to describe and model the dynamics of neural network applications more effectively [2–4]. As a consequence, researchers have paid increasing attention to the stability of neural networks described by nonlinear delay differential equations of neutral type (see [4–8]).
Cheng et al. [6] first investigated the global asymptotic stability of a class of neutral-type neural networks with delays. A delay-dependent criterion was obtained in [5] by using Lyapunov stability theory and linear matrix inequalities. Recently, less conservative robust stability criteria for neutral-type networks with delays were proposed in [4] by using a new Lyapunov-Krasovskii functional and a novel series compensation technique. For more related results, we refer to [4, 7] and the references cited therein.
Difference equations, or discrete-time analogs of differential equations, can preserve the convergence dynamics of their continuous-time counterparts to some degree [9]. Hence, owing to their usage in computer simulations and applications, such discrete-type or difference networks have been studied in depth by the authors of [10–15] and extended to periodic or almost periodic difference neural systems [16–21].
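For readers new to discrete-time analogs, the passage from a delayed differential network to its difference counterpart can be sketched in a few lines. The following minimal Python illustration uses an assumed scalar model with illustrative parameters; it is not one of the systems studied in this article:

```python
import math

# A hypothetical scalar delayed neural network u'(t) = -a u(t) + b g(u(t - tau)) + I,
# discretized with unit step into the difference analog
#     u(n + 1) = (1 - a) u(n) + b g(u(n - k)) + I,   k = tau.
# All parameter values below are illustrative only.

def g(v):
    # Bounded, strictly increasing activation (as required later in (H1)).
    return math.tanh(v)

def iterate(a, b, I, k, u0, steps):
    """Iterate the difference analog from the constant initial history u0."""
    hist = [u0] * (k + 1)          # history on [-k, 0]
    for _ in range(steps):
        hist.append((1 - a) * hist[-1] + b * g(hist[-1 - k]) + I)
    return hist

traj = iterate(a=0.5, b=0.3, I=0.1, k=2, u0=0.0, steps=200)
# With 0 < a < 1 and a bounded activation, the iterates remain bounded
# and settle down to an equilibrium.
print(abs(traj[-1]) < 10, abs(traj[-1] - traj[-2]) < 1e-6)
```

Because 0 < a < 1, the linear part of the iteration is contractive, which is precisely the role played by the condition 0 < a_i(n) < 1 in assumption (H1) below.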
However, few papers deal with the multiperiodicity of neutral-type difference neural networks with delays. Stimulated by the articles [22, 23], in this article we consider the corresponding neutral-type difference version of (1.1) as follows:
where . Our main aim is to study the biperiodicity of the above neutral-type difference neural networks. Some new criteria for the coexistence of a periodic sequence solution of (1.2) and its anti-sign counterpart are derived by using Krasnoselskii's fixed point theorem. Our results are completely different from the existing monoperiodicity results in [16–20].
The rest of this article is organized as follows. In Section 2, we make some preparations by giving several lemmas and Krasnoselskii's fixed point theorem. In Section 3, we give new criteria for the biperiodicity of (1.2). Finally, two numerical examples are given to illustrate our results.
2 Preliminaries
We begin this section by introducing some notations and some lemmas. Let be the set of all real T-periodic sequences defined on ℤ, where T is an integer with T ≥ 1. Then is a Banach space when it is endowed with the norm
Denote [a, b]ℤ := {a, a + 1, ..., b}, where a, b ∈ ℤ and a ≤ b. Let C((-∞, 0]ℤ, ℝ^m) be the set of all continuous and bounded functions ψ(s) = (ψ_1(s), ψ_2(s), ..., ψ_m(s))^T mapping (-∞, 0]ℤ into ℝ^m. For any given ψ ∈ C((-∞, 0]ℤ, ℝ^m), we denote by {u(n; ψ)} the sequence solution of system (1.2). Next, we present the basic assumptions:
Assumption (H1): Each a_i(·), b_ij(·), d_ij(·), and I_i(·) is a T-periodic function defined on ℤ, and 0 < a_i(n) < 1. The activation g_j(·) is strictly increasing and bounded with for all v ∈ ℝ. The kernel h_j : ℕ → ℝ+ is a bounded sequence with , where .
For each and any n ∈ ℤ, we let
Since 0 < a_i(n) < 1 for all n ∈ [0, T - 1]ℤ, each G_i(n, p) is nonzero and
Lemma 2.1. For each i ∈ ℕ and every p ∈ ℤ+,
holds for any sequence solution {u(n)} of (1.2), where the shift operator is defined as for and p ∈ ℤ+.
Proof.
The proof is complete.
Lemma 2.2. Assume that (H1) holds. A sequence is a solution of (1.2) if and only if
where G_i(n, p) is defined by (2.1) for and p ∈ ℤ+.
Proof. Rewrite (1.2) as
where and n ∈ ℤ+. Summing (2.3) from n to n + T - 1, we obtain
That is,
Since u i (n + T) = u i (n), we obtain
It follows from Lemma 2.1 that
Therefore, one gets from (2.4) that
Dividing both sides of the above equation by completes the proof.
In what follows, we state Krasnoselskii's theorem.
Lemma 2.3. (Krasnoselskii) Let M be a nonempty closed convex subset of a Banach space (X, ‖·‖).
Suppose that C and B map M into X such that
(i) x, y ∈ M implies that Cx + By ∈ M,
(ii) C is continuous and CM is contained in a compact set, and
(iii) B is a contraction mapping.
Then there exists z ∈ M with z = Cz + Bz.
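To make the role of the decomposition concrete, the following toy computation (a hypothetical scalar example, not the operators constructed in Section 3) splits a map into a contraction B and a continuous map C with bounded range, then locates a point z with z = Cz + Bz by simple iteration:

```python
import math

# Toy Krasnoselskii-style decomposition on the real line (illustrative only):
# B is a contraction with Lipschitz constant 0.4, and C is continuous and
# bounded, so it maps bounded sets into relatively compact sets.

def B(z):
    return 0.4 * z

def C(z):
    return 0.5 * math.cos(z)

# Iterate z -> Cz + Bz; in this toy case the sum is itself a contraction,
# so the iteration converges to the fixed point guaranteed by the theorem.
z = 0.0
for _ in range(100):
    z = C(z) + B(z)

residual = abs(z - (C(z) + B(z)))
print(residual < 1e-10)
```

In the fixed point arguments of Section 3, convergence of a naive iteration is not available in general; the theorem yields existence from the compactness of C and the contractivity of B alone, which is why the operator in (1.2) is split rather than treated as a single map.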
3 Biperiodicity of neutral-type difference networks
Due to the introduction of the neutral term, we must construct two closed convex subsets and in , which necessitates the use of Krasnoselskii's fixed point theorem. As a consequence, we are able to derive new biperiodicity criteria for (1.2): there exist a positive T-periodic sequence solution in and an anti-sign T-periodic sequence solution in . Next, for the case c_ij ≥ 0, we present the following assumption:
Assumption (H2): For each , c_ij ≥ 0 and b_ii(n) > 0; g_j(·) satisfies g_j(-v) = -g_j(v) for all v ∈ ℝ. Moreover, there exist constants α > 0 and β > 0 with α < β such that for all
where
Construct two subsets of as follows:
Obviously, and are two closed convex subsets of Banach space . Define the map by
and the map by
where Σ = R or L. Due to this fact, the latter map defines a contraction mapping.
Proposition 3.1. Under the basic assumptions (H1) and (H2), for each Σ, the operator CΣ is completely continuous on .
Proof. For any given Σ and u , we have two cases for the estimation of (CΣu) i (n).
Case 1: If Σ = R and u ∈ , then u_i(n) ∈ [α, β] holds for each i and all n ∈ ℤ. It follows from (3.1) and (H2) that
and
Case 2: If Σ = L and u ∈ , then u_i(n) ∈ [-β, -α] holds for each i and all n ∈ ℤ. It follows from (3.1) and (H2) that
and
It follows from the above two cases for the estimation of (CΣu)_i(n) that . This shows that CΣ() is uniformly bounded. Together with the continuity of CΣ, this implies that for any bounded sequence {ψ_n} in , there exists a subsequence whose image under CΣ is convergent. Therefore, CΣ is compact on . This completes the proof.
Theorem 3.1. Under the basic assumptions (H1) and (H2), for each Σ, (1.2) has a T-periodic solution uΣ satisfying uΣ ∈ .
Proof. Let . We shall show that . For simplicity, we only consider the case Σ = R. It follows from (2.2) and (H2) that
On the other hand,
Therefore, all the hypotheses of Lemma 2.3 are satisfied. Hence, (1.2) has a T-periodic solution uR satisfying uR ∈ . Almost the same argument applies to the case Σ = L. The proof is complete.
For the case c_ij < 0, we present the following assumption:
Assumption : For each , c_ij ≤ 0 and . There exist constants α > 0 and β > 0 with α < β such that for all n ∈ ℤ
where
Arguing as in Proposition 3.1, we obtain the following.
Proposition 3.2. Under the basic assumptions (H1) and , for each Σ, the operator CΣ is completely continuous on .
Proof. For any given Σ and u ∈ , we have two cases for the estimation of (CΣu)_i(n).
Case 1: If Σ = R and u ∈ , then u_i(n) ∈ [α, β] holds for each i and all n ∈ ℤ. It follows from (3.1) and that
and
Case 2: If Σ = L and u ∈ , then u_i(n) ∈ [-β, -α] holds for each i and all n ∈ ℤ. It follows from (3.1) and that
and
By a similar argument, we prove that CΣ is continuous and compact on . This completes the proof.
Theorem 3.2. Under the basic assumptions (H1) and , for each Σ, (1.2) has a T-periodic solution uΣ satisfying uΣ ∈ .
Proof. Let . We shall show that . For simplicity, we only consider the case Σ = L. It follows from (2.2) and that
On the other hand,
Therefore, all the hypotheses of Lemma 2.3 are satisfied. Hence, (1.2) has a T-periodic solution uL satisfying uL ∈ . By a similar argument, one can prove the case Σ = R. This completes the proof.
4 Numerical examples
Example 1. Consider the following neutral-type difference neural network with delays:
where
Obviously, the sigmoidal function tanh(z) is strictly increasing on ℝ with |tanh(z)| < 1, so it is easy to check that (H1) holds. After some computations, we have
Take α = 3, β = 160 and define
From Figure 1, we can check that assumption (H2) holds. By Theorem 3.1, there exist a positive ten-periodic sequence solution of (4.1) and a negative ten-periodic sequence solution. For the coexistence of the positive periodic sequence solution and its anti-sign counterpart, see Figures 2 and 3. For a phase view of the biperiodicity dynamics of (4.1), see Figure 4.
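The mechanism behind the coexistence can be reproduced in a few lines. The sketch below uses a simplified scalar network with assumed two-periodic coefficients and omits the neutral term, so its parameters are not those of system (4.1); it iterates the same dynamics from a positive and from a negative constant initial history, and by the oddness g(-v) = -g(v) the two trajectories settle onto a positive periodic sequence solution and its anti-sign counterpart:

```python
import math

# Illustrative scalar periodic difference network (parameters assumed; the
# neutral term is omitted for simplicity):
#     u(n + 1) = (1 - a(n)) u(n) + b(n) tanh(u(n - 1)),
# with two-periodic coefficients a(n), b(n) and 0 < a(n) < 1.

def a(n):
    return 0.5 + 0.2 * (-1) ** n     # decay rate, stays in (0, 1)

def b(n):
    return 2.0 + 0.5 * (-1) ** n     # connection weight, strong enough
                                     # to make the zero solution unstable

def simulate(u0, steps=400, k=1):
    hist = [u0] * (k + 1)            # constant initial history
    for n in range(steps):
        hist.append((1 - a(n)) * hist[-1] + b(n) * math.tanh(hist[-1 - k]))
    return hist

pos = simulate(3.0)                  # positive initial data
neg = simulate(-3.0)                 # anti-sign initial data

# The tails are mirror images of each other and two-periodic.
print(abs(pos[-1] + neg[-1]) < 1e-6,
      abs(pos[-1] - pos[-3]) < 1e-6,
      pos[-1] > 0)
```

Because tanh is odd, this toy network inherits the sign symmetry exploited in Theorem 3.1: negating the initial history negates the entire trajectory, so the positive periodic attractor automatically coexists with its anti-sign counterpart.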
Example 2. Consider the following neutral-type difference neural network with delays:
where
Obviously, (H1) holds. From some computations, we have
Let α = 1 and β = 20. We can check that assumption holds. By Theorem 3.2, there exist a positive ten-periodic sequence solution of (4.2) and its anti-sign counterpart. For the coexistence of the positive T-periodic sequence solution and its anti-sign counterpart, see Figure 5. Figure 6 shows a phase view of the biperiodicity dynamics of (4.2).
5 Remarks and open problems
To the best of the authors' knowledge, this is the first time that biperiodicity criteria for neutral-type difference neural networks with delays have been studied.
Our new assumptions (H2) and indicate that the neutral term plays an important role in the biperiodicity dynamics. Such a study has not previously appeared in the literature. However, there is still more to do, and we propose the following open problems for future research:
(i) If we relax the conditions c_ij ≤ 0 or c_ij ≥ 0 for all on the neutral term, do multiperiodic dynamics still exist?
(ii) Evidently, the biperiodicity in our work depends on the boundedness of the activation functions. Can this requirement be relaxed while still obtaining periodic sequence solutions, and are such solutions always of anti-sign?
Discussing the sign of each c_ij and considering the analytic properties of the activation functions is a possible way to investigate these problems.
References
Huang LH, Li XM: Dynamics of Cellular Neural Networks. Science Press, Beijing; 2007.
Bellen A, Guglielmi N, Ruehli AE: Methods for linear systems of circuit delay differential equations of neutral type. IEEE Trans Circuits Syst I Fundam Theory Appl 1999, 46: 212–215. 10.1109/81.739268
Clarkson ID, Goodall DP: On the stabilizability of imperfectly known nonlinear delay systems of the neutral type. IEEE Trans Autom Control 2000, 45: 2326–2331. 10.1109/9.895567
Zhang HG, Liu ZW, Huang GB: Novel delay-dependent robust stability analysis for switched neutral-type neural networks with time-varying delays via SC technique. IEEE Trans Syst Man Cybern B: Cybernetics 2010, 40: 1480–1491.
Park JH, Kwon OM, Lee SM: LMI optimization approach on stability for delayed neural network of neutral-type. J Comput Appl Math 2008, 196: 224–236.
Cheng CJ, Liao TL, Yan JJ, Hwang CC: Globally asymptotic stability of a class of neutral-type neural networks with delays. IEEE Trans Syst Man Cybern B: Cybernetics 2006, 36: 1191–1195.
Samli R, Arik S: New results for global stability of a class of neutral-type neural systems with time delays. Appl Math Comput 2009, 210: 564–570. 10.1016/j.amc.2009.01.031
Rakkiyappan P, Balasubramaniam P: New global exponential stability results for neutral type neural networks with distributed time delays. Neurocomputing 2008, 71: 1039–1045. 10.1016/j.neucom.2007.11.002
Kelley W, Peterson A: Difference Equations: An Introduction with Applications. Harcourt Academic Press, San Diego; 2001.
Chen LN, Aihara K: Chaos and asymptotical stability in discrete-time neural networks. Physica D: Nonlinear Phenomena 1997, 104: 286–325. 10.1016/S0167-2789(96)00302-8
Liang JL, Cao JD, Ho DWC: Discrete-time bidirectional associative memory neural networks with variable delays. Phys Lett A 2005, 335: 226–234. 10.1016/j.physleta.2004.12.026
Liu YR, Wang ZD, Serrano A, Liu XH: Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis. Phys Lett A 2007, 362: 480–488. 10.1016/j.physleta.2006.10.073
Mohamad S: Global exponential stability in continuous-time and discrete-time delayed bidirectional neural networks. Physica D 2001, 159: 233–51. 10.1016/S0167-2789(01)00344-X
Chen WH, Lu XM, Liang DY: Global exponential stability for discrete-time neural networks with variable delays. Phys Lett A 2006, 358: 186–198. 10.1016/j.physleta.2006.05.014
Brucoli M, Carnimeo L, Grassi G: Discrete-time cellular neural networks for associative memories with learning and forgetting capabilities. IEEE Trans Circ Sys I 1995, 42: 396–399. 10.1109/81.401156
Wang L, Zou X: Capacity of stable periodic solutions in discrete-time bidirectional associative memory neural networks. IEEE Trans Circ Syst II 2004, 51: 315–319. 10.1109/TCSII.2004.829571
Zeng ZG, Wang J: Multiperiodicity of discrete-time delayed neural networks evoked by periodic external inputs. IEEE Trans Neural Netw 2006, 17: 1141–1151. 10.1109/TNN.2006.877533
Zhao HY, Sun L, Wang GL: Periodic oscillation of discrete-time bidirectional associative memory neural networks. Neurocomputing 2007, 70: 2924–2930. 10.1016/j.neucom.2006.11.010
Zou L, Zhou Z: Periodic solutions for nonautonomous discrete-time neural networks. Appl Math Lett 2006, 19: 174–185. 10.1016/j.aml.2005.05.004
Zhou Z, Wu JH: Stable periodic orbits in nonlinear discrete-time neural networks with delayed feedback. Comput Math Appl 2003, 45: 935–942. 10.1016/S0898-1221(03)00066-X
Huang ZK, Wang XH, Gao F: The existence and global attractivity of almost periodic sequence solution of discrete-time neural networks. Phys Lett A 2006, 350: 182–191. 10.1016/j.physleta.2005.10.022
Raffoul Y: Periodic solutions for scalar and vector nonlinear difference equations. Panamer J Math 1999, 9: 97–111.
Raffoul Y, Yankson E: Positive periodic solutions in neutral delay difference equations. Adv Dyn Syst Appl 2010, 5: 123–130.
Acknowledgements
This research was supported by National Natural Science Foundation of China under Grant 11101187, the Foundation for Young Professors of Jimei University and the Foundation of Fujian Higher Education (JA10184,JA11154,JA11144).
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
All authors contributed to all parts of the manuscript, and they have read and approved the final version.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Huang, Z., Raffoul, Y.N. Biperiodicity in neutral-type delayed difference neural networks. Adv Differ Equ 2012, 5 (2012). https://doi.org/10.1186/1687-1847-2012-5