Mathematical and Computer Modelling 52 (2010) 386–396
Numerical simulation of periodic solutions for a class of numerical discretization neural networks

Chun Lu a,∗, Xiaohua Ding b, Mingzhu Liu c

a School of Science, Qingdao Technological University, Qingdao 266520, China
b Department of Mathematics, Harbin Institute of Technology (Weihai), Weihai 264209, China
c Department of Mathematics, Harbin Institute of Technology, Harbin 150001, China
Article history: Received 28 May 2009; Received in revised form 6 March 2010; Accepted 8 March 2010

Keywords: Neural network; Periodic solution; Globally exponentially stable; Coincidence degree; Lyapunov function
Abstract

In this paper, the existence and global exponential stability of periodic solutions for a class of numerical discretization neural networks are investigated. Using coincidence degree theory and the Lyapunov method, sufficient conditions for the existence and global exponential stability of periodic solutions are obtained. Numerical simulations are given at the end of this paper.
© 2010 Elsevier Ltd. All rights reserved.
1. Introduction
In this paper, we consider the existence and stability of a numerical discretization neural network resulting from the semi-discretization technique for neural networks with finite delays and distributed delays:

$$\dot{x}_i(t) = -b_i(t)x_i(t) + f_i\Big(t, x_1(t), \ldots, x_m(t);\; x_1(t-\tau_{i1}(t)), \ldots, x_m(t-\tau_{im}(t));\; \int_0^{+\infty} k_{i1}(s)x_1(t-s)\,\mathrm{d}s, \ldots, \int_0^{+\infty} k_{im}(s)x_m(t-s)\,\mathrm{d}s\Big) + I_i(t), \quad i = 1, \ldots, m. \tag{1.1}$$
System (1.1) is a more general form of neural networks with delay. Many authors have discussed other neural networks with finite delays [1–8], and most of their systems can be deduced from (1.1). For system (1.1) we make the following assumptions:

(H1): For each $i \in \{1, \ldots, m\}$, $b_i(t)$ is bounded and continuous with $b_i(t) > 0$ and $b_i(t+\omega) = b_i(t)$ for $t \in \mathbb{R}^+$; $I_i(t)$ is bounded, continuous and $I_i(t+\omega) = I_i(t)$ for $t \in \mathbb{R}^+$, where $\omega$ is a positive constant.

(H2): For each $i, j \in \{1, \ldots, m\}$, $k_{ij}: \mathbb{R}^+ \to \mathbb{R}^+$ satisfies
$$\int_0^{+\infty} k_{ij}(s)\,\mathrm{d}s = 1, \qquad \int_0^{+\infty} s\,k_{ij}(s)\,\mathrm{d}s < \infty.$$
This paper is supported by the National Natural Science Foundation of China (10671047) and the foundation of HITC (200713).
∗ Corresponding author. Tel.: +86 13864220728. E-mail address: [email protected] (C. Lu).

0895-7177/$ – see front matter © 2010 Elsevier Ltd. All rights reserved. doi:10.1016/j.mcm.2010.03.005
(H3): For each $i \in \{1, \ldots, m\}$, $f_i: \mathbb{R}^+ \times \mathbb{R}^m \times \mathbb{R}^m \times \mathbb{R}^m \to \mathbb{R}$ is bounded and continuous, and there exist nonnegative, bounded and continuous functions $\alpha_{ij}(t)$, $\beta_{ij}(t)$, $\gamma_{ij}(t)$ defined on $\mathbb{R}^+$ such that
$$|f_i(t, u_1, \ldots, u_m; v_1, \ldots, v_m; w_1, \ldots, w_m) - f_i(t, \bar{u}_1, \ldots, \bar{u}_m; \bar{v}_1, \ldots, \bar{v}_m; \bar{w}_1, \ldots, \bar{w}_m)| \le \sum_{j=1}^{m}\big[\alpha_{ij}(t)|u_j - \bar{u}_j| + \beta_{ij}(t)|v_j - \bar{v}_j| + \gamma_{ij}(t)|w_j - \bar{w}_j|\big], \quad i = 1, \ldots, m$$
for any $(u_1, \ldots, u_m), (\bar{u}_1, \ldots, \bar{u}_m), (v_1, \ldots, v_m), (\bar{v}_1, \ldots, \bar{v}_m), (w_1, \ldots, w_m), (\bar{w}_1, \ldots, \bar{w}_m) \in \mathbb{R}^m$.

For system (1.1), we consider initial conditions of the form
$$x_i(t) = \varphi_i(t), \quad t \in (-\infty, 0], \; i = 1, \ldots, m \tag{1.2}$$
where $\varphi_i(t)$ is bounded and continuous on $(-\infty, 0]$.

For convenience, let $\mathbb{Z}$ denote the set of all integers, $\mathbb{Z}_0^+ = \{0, 1, 2, \ldots\}$, $[a, b]_{\mathbb{Z}} = \{a, a+1, \ldots, b-1, b\}$ where $a, b \in \mathbb{Z}$, $a \le b$, and $[a, +\infty)_{\mathbb{Z}} = \{a, a+1, a+2, \ldots\}$ where $a \in \mathbb{Z}$. We begin approximating the continuous-time network (1.1) by replacing the integral terms with discrete sums of the form
$$\int_0^{+\infty} k_{ij}(s)x_j(t-s)\,\mathrm{d}s \approx \sum_{[s/h]=1}^{+\infty} \omega_{ij}(h)\,k_{ij}\!\left(\left[\tfrac{s}{h}\right]h\right) x_j\!\left(\left[\tfrac{t}{h}\right]h - \left[\tfrac{s}{h}\right]h\right) \tag{1.3}$$

for $t \in [nh, (n+1)h)$, $s \in [ph, (p+1)h)$, $n \in \mathbb{Z}_0^+$, $p \in \mathbb{Z}^+$, where $[r]$ denotes the integer part of a real number $r$, $h > 0$ is a fixed number denoting a uniform discretization step size satisfying $h = \omega/\bar{\omega}$ with $\bar{\omega} \in \mathbb{N}$, and $\omega_{ij}(h) > 0$ for $h > 0$ with $\omega_{ij}(h) \approx h + O(h^2)$ for small $h > 0$. We note that the $\omega_{ij}(h)$ are chosen so that the analogue kernels
$$H_{ij}\!\left(\left[\tfrac{s}{h}\right]h\right) = \omega_{ij}(h)\,k_{ij}\!\left(\left[\tfrac{s}{h}\right]h\right) \tag{1.4}$$
satisfy the following properties:
$$(\mathrm{H4}):\quad H_{ij}: \mathbb{Z}^+ \to [0, +\infty), \qquad \sum_{p=1}^{+\infty} H_{ij}(p) = 1, \qquad \sum_{p=1}^{+\infty} p\,H_{ij}(p) < +\infty.$$
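For the exponential kernel $k_{ij}(s) = e^{-s}$ (the kernel later used in Section 4), one convenient choice satisfying $\omega_{ij}(h) = h + O(h^2)$ is $\omega_{ij}(h) = e^h - 1$, which makes the analogue kernel sum to exactly 1. A quick numerical check of (H4), with an illustrative step size $h = 0.1$ (a sketch, not part of the paper's argument):

```python
import math

# Check of (H4) for the exponential kernel k(s) = e^{-s}: with the choice
# w(h) = e^h - 1 (one convenient choice satisfying w(h) = h + O(h^2)),
# the analogue kernel H(p) = w(h) * k(ph) sums to 1 and has a finite
# first moment.  h = 0.1 is an illustrative step size.
h = 0.1

def H(p):
    return (math.exp(h) - 1.0) * math.exp(-p * h)

total = sum(H(p) for p in range(1, 5000))             # should be ~1   (H4)
first_moment = sum(p * H(p) for p in range(1, 5000))  # should be finite (H4)
print(total, first_moment)
```

The geometric sums confirm both conditions; the truncation at 5000 terms only discards an exponentially small tail.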
This analogue has been employed elsewhere (see, for instance, [9–12]) in the formulation of discrete-time analogues of the distributed delay. With (1.3) and (1.4), we approximate (1.1) by differential equations with piecewise constant arguments of the form
$$\dot{x}_i(t) = -b_i\!\left(\left[\tfrac{t}{h}\right]h\right)x_i(t) + f_i\!\left(\left[\tfrac{t}{h}\right]h,\; x_1\!\left(\left[\tfrac{t}{h}\right]h\right), \ldots, x_m\!\left(\left[\tfrac{t}{h}\right]h\right);\; x_1\!\left(\left[\tfrac{t}{h}\right]h - \tau_{i1}\!\left(\left[\tfrac{t}{h}\right]h\right)\right), \ldots, x_m\!\left(\left[\tfrac{t}{h}\right]h - \tau_{im}\!\left(\left[\tfrac{t}{h}\right]h\right)\right);\; \sum_{[s/h]=1}^{+\infty} H_{i1}\!\left(\left[\tfrac{s}{h}\right]h\right)x_1\!\left(\left[\tfrac{t}{h}\right]h - \left[\tfrac{s}{h}\right]h\right), \ldots, \sum_{[s/h]=1}^{+\infty} H_{im}\!\left(\left[\tfrac{s}{h}\right]h\right)x_m\!\left(\left[\tfrac{t}{h}\right]h - \left[\tfrac{s}{h}\right]h\right)\right) + I_i\!\left(\left[\tfrac{t}{h}\right]h\right) \tag{1.5}$$
for t ∈ [nh, (n + 1)h), s ∈ [ph, (p + 1)h), n ∈ Z+0 , p ∈ Z+. Noting that[ th
]= n,
[ sh
]= p and adopting the notation
u(n) = u(nh), we rewrite (1.5) as
$$\dot{x}_i(t) = -b_i(n)x_i(t) + f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big) + I_i(n), \quad t \ge 0, \; i = 1, \ldots, m. \tag{1.6}$$
Integrating (1.6) over the interval $[nh, t)$, where $t < (n+1)h$, we get
$$x_i(t)e^{b_i(n)t} - x_i(n)e^{b_i(n)nh} = \frac{e^{b_i(n)t} - e^{b_i(n)nh}}{b_i(n)}\left[f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big) + I_i(n)\right], \quad i = 1, \ldots, m \tag{1.7}$$
and by letting $t \to (n+1)h$ above, we obtain
$$x_i(n+1) = x_i(n)e^{-b_i(n)h} + \theta_i(h)\left(f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big) + I_i(n)\right), \quad i = 1, \ldots, m, \; n \in \mathbb{Z}_0^+ \tag{1.8}$$
where
$$\theta_i(h) = \frac{1 - e^{-b_i(n)h}}{b_i(n)}, \quad i = 1, \ldots, m, \; n \in \mathbb{Z}_0^+.$$
It is not difficult to verify that $\theta_i(h) > 0$ if $b_i(n), h > 0$, and that $\theta_i(h) \approx h + O(h^2)$ for small $h > 0$. Also, one can show that (1.8) converges towards (1.1) as $h \to 0^+$. In studying the discrete-time analogue (1.8), we assume that
$$h \in (0, +\infty), \quad b_i: \mathbb{Z} \to (0, +\infty), \quad \tau_{ij}: \mathbb{Z} \to \mathbb{Z}_0^+, \quad i, j = 1, 2, \ldots, m \tag{1.9}$$
and that the functions $f_i$ satisfy (H1)–(H4). The system (1.8) is supplemented with initial values given by
$$x_i(s) = \varphi_i(s), \quad s \in \mathbb{Z}_0^- = \{0, -1, -2, \ldots\}, \qquad \tau = \max_{1 \le i,j \le m}\sup\{\tau_{ij}(n), n \in \mathbb{Z}\}.$$
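The update rule (1.8) can be iterated directly once the distributed sum is truncated. The sketch below does this for a scalar toy network ($m = 1$); the coefficient `b`, nonlinearity `f`, input `I`, delay `tau` and truncation point `P` are all hypothetical stand-ins (loosely modelled on the examples of Section 4), not the paper's notation:

```python
import math

# A sketch of iterating the semi-discrete scheme (1.8) for a scalar network
# (m = 1).  b, f, I and tau below are hypothetical stand-ins; the distributed
# sum is truncated at P terms, the usual practical compromise.
P, h = 100, 0.025

def b(n):                     # positive periodic coefficient (hypothetical)
    return 5.0 * (math.cos(7 * n * math.pi) + 4.0)

def I(n):                     # periodic external input (hypothetical)
    return 8.0 * math.sin(n * math.pi)

def H(p):                     # discrete kernel satisfying (H4) for k(s) = e^{-s}
    return (math.exp(h) - 1.0) * math.exp(-p * h)

def f(x_now, x_delayed, x_dist):   # bounded nonlinearity (hypothetical)
    return (x_now / (1.0 + x_now ** 2)
            + x_delayed / (2.0 + x_delayed ** 2)
            + x_dist / (1.0 + x_dist ** 2))

def step(history, n, tau=8):
    """history[-1] = x(n), history[-1-p] = x(n-p); returns x(n+1) by (1.8)."""
    bn = b(n)
    theta = (1.0 - math.exp(-bn * h)) / bn       # theta_i(h) from (1.8)
    x_dist = sum(H(p) * history[-1 - p] for p in range(1, P + 1))
    return (history[-1] * math.exp(-bn * h)
            + theta * (f(history[-1], history[-1 - tau], x_dist) + I(n)))

# initial data phi on {-P, ..., 0}, then iterate (1.8)
history = [3.0 * math.sin(math.pi * s * h) + 7.0 for s in range(-P, 1)]
for n in range(200):
    history.append(step(history, n))
```

Because $f$ and $I$ are bounded and $b$ is bounded away from zero, the iterates remain bounded, which mirrors the a priori estimate used in Section 2.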
In what follows, for convenience, we will use the notation
$$I_\omega = \{0, 1, \ldots, \omega - 1\}, \qquad \bar{u} = \frac{1}{\omega}\sum_{k=0}^{\omega-1} u(k),$$
where $u(k)$ is an $\omega$-periodic sequence of real numbers defined for $k \in \mathbb{Z}$, and the notation
$$\alpha_{ij}^M = \max_{n \in \mathbb{Z}}\alpha_{ij}(n), \qquad \beta_{ij}^M = \max_{n \in \mathbb{Z}}\beta_{ij}(n), \qquad \gamma_{ij}^M = \max_{n \in \mathbb{Z}}\gamma_{ij}(n),$$
$$M = \sup_{u \in \mathbb{R}^+ \times \mathbb{R}^m \times \mathbb{R}^m \times \mathbb{R}^m}|f_i(u)|, \qquad \underline{b}_i = \min_{n \in I_\omega} b_i(n), \qquad J^M = \max_{n \in I_\omega}|I_i(n)|, \quad i = 1, 2, \ldots, m.$$
2. Existence of periodic solutions

In this section, based on Mawhin's continuation theorem (see [13–17]), we shall study the existence of at least one periodic solution of (1.8).

Let $X$, $Z$ be normed vector spaces, $L: \mathrm{Dom}\,L \subset X \to Z$ a linear mapping, and $N: X \to Z$ a continuous mapping. The mapping $L$ is called a Fredholm mapping of index zero if $\dim\mathrm{Ker}\,L = \mathrm{codim}\,\mathrm{Im}\,L < \infty$ and $\mathrm{Im}\,L$ is closed in $Z$. If $L$ is a Fredholm mapping of index zero and there exist continuous projectors $P: X \to X$ and $Q: Z \to Z$ such that $\mathrm{Im}\,P = \mathrm{Ker}\,L$ and $\mathrm{Ker}\,Q = \mathrm{Im}\,L = \mathrm{Im}(I - Q)$, then $L|_{\mathrm{Dom}\,L \cap \mathrm{Ker}\,P}: (I - P)X \to \mathrm{Im}\,L$ is invertible; we denote the inverse of that map by $k_P$. If $\Omega$ is an open bounded subset of $X$, the mapping $N$ is called $L$-compact on $\bar{\Omega}$ if $QN(\bar{\Omega})$ is bounded and $k_P(I - Q)N: \bar{\Omega} \to X$ is compact. Since $\mathrm{Im}\,Q$ is isomorphic to $\mathrm{Ker}\,L$, there exists an isomorphism $J: \mathrm{Im}\,Q \to \mathrm{Ker}\,L$.
Lemma 2.1. Let $L$ be a Fredholm mapping of index zero and let $N$ be $L$-compact on $\bar{\Omega}$. Suppose:

(a) for each $\lambda \in (0, 1)$, every solution $x$ of $Lx = \lambda Nx$ satisfies $x \notin \partial\Omega$;
(b) $QNx \neq 0$ for each $x \in \partial\Omega \cap \mathrm{Ker}\,L$, and $\deg\{JQN, \Omega \cap \mathrm{Ker}\,L, 0\} \neq 0$.

Then the equation $Lx = Nx$ has at least one solution lying in $\mathrm{Dom}\,L \cap \bar{\Omega}$.
Theorem 2.1. Assume that (H1)–(H4) and (1.9) hold. Then system (1.8) has at least one ω-periodic solution.
Proof. Define
$$l^m = \{x = \{x(k)\}: x(k) \in \mathbb{R}^m, k \in \mathbb{Z}\}, \qquad |x| = \left(\sum_{i=1}^m x_i^2\right)^{1/2}, \quad \forall x \in \mathbb{R}^m.$$
Let $l^\omega \subset l^m$ denote the subspace of all $\omega$-periodic sequences equipped with the norm $\|\cdot\|$, i.e., $\|x\| = \max_{k \in I_\omega}|x(k)|$ for any $x(k) = (x_1(k), x_2(k), \ldots, x_m(k))^{\mathrm{T}} \in l^\omega$, $k \in \mathbb{Z}$. It is easy to prove that $l^\omega$ is a finite-dimensional Banach space. Let
$$l_0^\omega = \left\{y = \{y(k)\} \in l^\omega: \frac{1}{\omega}\sum_{k=0}^{\omega-1} y(k) = 0\right\}, \qquad l_c^\omega = \left\{y = \{y(k)\} \in l^\omega: y(k) = c \in \mathbb{R}^m, k \in \mathbb{Z}\right\};$$
then it follows that $l_0^\omega$ and $l_c^\omega$ are both closed linear subspaces of $l^\omega$ and
$$l^\omega = l_0^\omega \oplus l_c^\omega, \qquad \dim l_c^\omega = m.$$
Now let us define $X = Y = l^\omega$, $(Lx)(k) = x(k+1) - x(k)$ for $x \in X$, $k \in \mathbb{Z}$, and
$$(Nx_i)(n) = x_i(n)\big(e^{-b_i(n)h} - 1\big) + \theta_i(h)\left(f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big) + I_i(n)\right), \quad i = 1, \ldots, m, \; n \in \mathbb{Z}_0^+. \tag{2.1}$$
It is trivial to see that $L$ is a bounded linear operator with
$$\mathrm{Ker}\,L = l_c^\omega, \qquad \mathrm{Im}\,L = l_0^\omega, \qquad \dim\mathrm{Ker}\,L = m = \mathrm{codim}\,\mathrm{Im}\,L;$$
then it follows that $L$ is a Fredholm mapping of index zero. Define
$$Px = \frac{1}{\omega}\sum_{k=0}^{\omega-1} x(k), \; x \in X, \qquad Qy = \frac{1}{\omega}\sum_{k=0}^{\omega-1} y(k), \; y \in Y.$$
It is not difficult to show that $P$ and $Q$ are continuous projectors such that
$$\mathrm{Im}\,P = \mathrm{Ker}\,L \quad \text{and} \quad \mathrm{Ker}\,Q = \mathrm{Im}\,L = \mathrm{Im}(I - Q).$$
Furthermore, the generalized inverse (of $L$) $k_P: \mathrm{Im}\,L \to \mathrm{Ker}\,P \cap \mathrm{Dom}\,L$ is given by
$$(k_P y)(n) = \sum_{\nu=0}^{n-1} y(\nu) - \frac{1}{\omega}\sum_{\nu=1}^{\omega}\sum_{s=0}^{\nu-1} y(s). \tag{2.2}$$
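The formula (2.2) can be checked numerically: for a mean-zero $\omega$-periodic sequence $y$ (i.e. $y \in \mathrm{Im}\,L = l_0^\omega$), the sequence $x = k_P y$ should satisfy $(Lx)(n) = x(n+1) - x(n) = y(n)$ and have zero mean ($x \in \mathrm{Ker}\,P$). A minimal sketch; the period $\omega = 7$ and the random test sequence are arbitrary illustrative choices:

```python
import random

# Check of (2.2): for mean-zero periodic y, x = k_P y should satisfy
# x(n+1) - x(n) = y(n) (k_P inverts L) and mean(x) = 0 (x lies in Ker P).
omega = 7
random.seed(0)
y = [random.uniform(-1.0, 1.0) for _ in range(omega)]
mean_y = sum(y) / omega
y = [v - mean_y for v in y]                   # project y onto l^omega_0

def kP(y, n):
    m = len(y)
    c = sum(sum(y[s] for s in range(nu)) for nu in range(1, m + 1)) / m
    return sum(y[nu] for nu in range(n)) - c  # formula (2.2)

x = [kP(y, n) for n in range(omega + 1)]
residual = max(abs((x[n + 1] - x[n]) - y[n]) for n in range(omega))
mean_x = sum(x[:omega]) / omega
print(residual, mean_x)
```

Both quantities vanish up to floating-point rounding, confirming that (2.2) is the inverse of $L$ on $l_0^\omega$ with range in $\mathrm{Ker}\,P$.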
Clearly, $QN$ and $k_P(I - Q)N$ are continuous, since $X$ is a finite-dimensional Banach space. Using the Arzelà–Ascoli theorem, it is not difficult to show that $k_P(I - Q)N(\bar{\Omega})$ is compact for any open bounded set $\Omega \subset X$. Moreover, $QN(\bar{\Omega})$ is bounded. Thus $N$ is $L$-compact on $\bar{\Omega}$. Since $\mathrm{Im}\,Q = \mathrm{Ker}\,L$, the isomorphism $J$ from $\mathrm{Im}\,Q$ to $\mathrm{Ker}\,L$ is $I$. We are now in a position to search for an appropriate open, bounded subset $\Omega \subset X$ for the continuation theorem. Corresponding to the operator equation $Lx = \lambda Nx$, $\lambda \in (0, 1)$, we have
$$x_i(n+1) - x_i(n) = \lambda\left(x_i(n)\big(e^{-b_i(n)h} - 1\big) + \theta_i(h)f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big) + \theta_i(h)I_i(n)\right). \tag{2.3}$$
Suppose that $(x_1(n), x_2(n), \ldots, x_m(n))^{\mathrm{T}} \in X$ is a solution of system (2.3) for a certain $\lambda \in (0, 1)$. In view of (2.3), we have
$$\max_{n \in I_\omega}|x_i(n)| = \max_{n \in I_\omega}|x_i(n+1)| \le \max_{n \in I_\omega}\left(\big(1 + \lambda(e^{-\underline{b}_i h} - 1)\big)|x_i(n)| + \lambda\theta_i(h)\left|f_i\Big(n, x_1(n), \ldots, x_m(n);\; x_1(n-\tau_{i1}(n)), \ldots, x_m(n-\tau_{im}(n));\; \sum_{p=1}^{+\infty} H_{i1}(p)x_1(n-p), \ldots, \sum_{p=1}^{+\infty} H_{im}(p)x_m(n-p)\Big)\right| + \lambda\theta_i(h)|I_i(n)|\right). \tag{2.4}$$
Hence
$$(1 - e^{-\underline{b}_i h})\max_{n \in I_\omega}|x_i(n)| \le \theta_i(h)[M + J^M], \quad i = 1, 2, \ldots, m,$$
that is,
$$\max_{n \in I_\omega}|x_i(n)| \le \frac{\theta_i(h)[M + J^M]}{1 - e^{-\underline{b}_i h}} := A_i, \quad i = 1, 2, \ldots, m.$$
Denote $A = \big(\sum_{i=1}^m A_i^2\big)^{1/2} + B$, where $B > 0$ is a constant. Clearly, $A$ is independent of $\lambda$. Now we take $\Omega = \{x \in X: \|x\| < A\}$.
This $\Omega$ satisfies condition (a) of Lemma 2.1. When $x \in \partial\Omega \cap \mathrm{Ker}\,L = \partial\Omega \cap \mathbb{R}^m$, $x$ is a constant vector in $\mathbb{R}^m$ with $\|x\| = A$. Then, enlarging $A$ if necessary, we have
$$x^{\mathrm{T}}QNx = \sum_{i=1}^m\left[-x_i^2\sum_{s=0}^{\omega-1}\frac{1 - e^{-b_i(s)h}}{\omega} + x_i\big(\bar{f}_i(\cdot, x_1, \ldots, x_m) + \bar{I}_i\big)\theta_i(h)\right] \le \sum_{i=1}^m\big[-(1 - e^{-\underline{b}_i h})|x_i|^2 + |x_i|(M + J^M)\theta_i(h)\big] \le -\min_{1 \le i \le m}\{1 - e^{-\underline{b}_i h}\}\|x\|^2 + \sqrt{m}(M + J^M)\max_{1 \le i \le m}\theta_i(h)\|x\| < 0.$$
Therefore, $QNx \neq 0$ for any $x \in \partial\Omega \cap \mathrm{Ker}\,L$. Let $\psi(\gamma; x) = -\gamma x + (1 - \gamma)QNx$, $\gamma \in [0, 1]$; then for any $x \in \partial\Omega \cap \mathrm{Ker}\,L$, $x^{\mathrm{T}}\psi(\gamma; x) < 0$. From the homotopy invariance of the Brouwer degree, it follows that
$$\deg\{JQN, \Omega \cap \mathrm{Ker}\,L, 0\} = \deg\{-x, \Omega \cap \mathrm{Ker}\,L, 0\} \neq 0.$$
Condition (b) of Lemma 2.1 is also satisfied. Thus, by Lemma 2.1 we conclude that $Lx = Nx$ has at least one solution in $X$; that is, (1.8) has at least one $\omega$-periodic solution. The proof is complete.
3. Stability of periodic solutions
In this section, we shall construct appropriate Lyapunov functions to study the stability of periodic solutions of (1.8).
Theorem 3.1. Suppose that assumptions (H1)–(H4) and (1.9) hold; furthermore, assume that $\tau_{ij}(n) \equiv \tau_{ij} \in \mathbb{Z}^+$, $n \in \mathbb{Z}$, $i, j = 1, 2, \ldots, m$, are constants and that
$$b_i(n) > \left(\sum_{j=1}^m \alpha_{ji}^M + \sum_{j=1}^m \beta_{ji}^M + \sum_{j=1}^m \gamma_{ji}^M\right), \qquad \sum_{p=1}^{+\infty} p\lambda^p H_{ji}(p) < +\infty.$$
Then the $\omega$-periodic solution $x^*(n) = (x_1^*(n), x_2^*(n), \ldots, x_m^*(n))^{\mathrm{T}}$ of (1.8) is unique and globally exponentially stable in the sense that there exist constants $\lambda > 1$ and $\delta \ge 1$ such that
$$\sum_{i=1}^m|x_i(n) - x_i^*(n)| \le \delta\left(\frac{1}{\lambda}\right)^n\sum_{i=1}^m\sup_{s \in \mathbb{Z}_0^-}|x_i(s) - x_i^*(s)|, \quad n \in \mathbb{Z}_0^+. \tag{3.1}$$
Proof. Let $x(n) = (x_1(n), x_2(n), \ldots, x_m(n))^{\mathrm{T}}$ be an arbitrary solution of (1.8), and let $x^*(n) = (x_1^*(n), x_2^*(n), \ldots, x_m^*(n))^{\mathrm{T}}$ be an $\omega$-periodic solution of (1.8). Then
$$|x_i(n+1) - x_i^*(n+1)| \le |x_i(n) - x_i^*(n)|e^{-\underline{b}_i h} + \theta_i(h)\sum_{j=1}^m \alpha_{ij}^M|x_j(n) - x_j^*(n)| + \theta_i(h)\sum_{j=1}^m \beta_{ij}^M|x_j(n-\tau_{ij}) - x_j^*(n-\tau_{ij})| + \theta_i(h)\sum_{j=1}^m \gamma_{ij}^M\sum_{p=1}^{+\infty} H_{ij}(p)|x_j(n-p) - x_j^*(n-p)|, \quad i = 1, 2, \ldots, m. \tag{3.2}$$
Now we consider the functions $\beta_i(\cdot, \cdot)$, $i = 1, 2, \ldots, m$, defined by
$$\beta_i(\upsilon_i, n) = 1 - \upsilon_i e^{-b_i(n)h} - \upsilon_i\theta_i(h)\sum_{j=1}^m \alpha_{ji}^M - \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\upsilon_i^{\tau_{ji}+1} - \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}\upsilon_i^{p+1}H_{ji}(p),$$
where $\upsilon_i \in [1, +\infty)$, $n \in I_\omega$, $i = 1, 2, \ldots, m$. Since
$$\beta_i(1, n) = 1 - e^{-b_i(n)h} - \theta_i(h)\sum_{j=1}^m \alpha_{ji}^M - \theta_i(h)\sum_{j=1}^m \beta_{ji}^M - \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M = \theta_i(h)\left(b_i(n) - \sum_{j=1}^m \alpha_{ji}^M - \sum_{j=1}^m \beta_{ji}^M - \sum_{j=1}^m \gamma_{ji}^M\right) \ge \min_{1 \le i \le m}\{\theta_i(h)\}\,\eta > 0, \quad i = 1, 2, \ldots, m \tag{3.3}$$
where
$$\eta = \min_{1 \le i \le m}\left\{\underline{b}_i - \sum_{j=1}^m \alpha_{ji}^M - \sum_{j=1}^m \beta_{ji}^M - \sum_{j=1}^m \gamma_{ji}^M\right\},$$
using the continuity of $\beta_i(\upsilon_i, n)$ on $[1, +\infty)$ with respect to $\upsilon_i$ and the fact that $\beta_i(\upsilon_i, n) \to -\infty$ as $\upsilon_i \to +\infty$ uniformly in $n \in I_\omega$, $i = 1, 2, \ldots, m$, we see that there exists $\upsilon_i^*(n) \in (1, +\infty)$ such that $\beta_i(\upsilon_i^*(n), n) = 0$ for $n \in I_\omega$, $i = 1, 2, \ldots, m$. Choosing $\lambda = \min\{\upsilon_i^*(n), n \in I_\omega, i = 1, 2, \ldots, m\}$, where $\lambda > 1$, we obtain $\beta_i(\lambda, n) \ge 0$ for all $n \in I_\omega$, $i = 1, 2, \ldots, m$, with the implication
$$\lambda e^{-b_i(n)h} + \lambda\theta_i(h)\sum_{j=1}^m \alpha_{ji}^M + \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\lambda^{\tau_{ji}+1} + \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}\lambda^{p+1}H_{ji}(p) \le 1, \quad n \in I_\omega, \; i = 1, 2, \ldots, m.$$
Hence
$$\lambda e^{-\underline{b}_i h} + \lambda\theta_i(h)\sum_{j=1}^m \alpha_{ji}^M + \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\lambda^{\tau_{ji}+1} + \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}\lambda^{p+1}H_{ji}(p) \le 1, \quad i = 1, 2, \ldots, m. \tag{3.4}$$
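In practice, $\lambda$ can be located numerically as the smallest root of $\beta_i(\cdot, n)$. The sketch below does this by bisection for a scalar example ($m = 1$) with hypothetical constants `b = 3`, `alpha = beta_c = gamma = 0.5`, `tau = 2` (chosen so that the condition of Theorem 3.1 holds) and the exponential analogue kernel of (H4):

```python
import math

# Locate lambda > 1 with beta(lambda) >= 0 by bisection, for a scalar
# example (m = 1) with hypothetical constants; note that
# b > alpha + beta_c + gamma, so beta(1) = theta*(b - alpha - beta_c - gamma) > 0.
h, b, alpha, beta_c, gamma, tau = 0.1, 3.0, 0.5, 0.5, 0.5, 2
theta = (1.0 - math.exp(-b * h)) / b

def H(p):                                   # kernel of (H4) for k(s) = e^{-s}
    return (math.exp(h) - 1.0) * math.exp(-p * h)

def beta(v):                                # beta_i(v, n) from the proof
    dist = sum(v ** (p + 1) * H(p) for p in range(1, 2000))
    return (1.0 - v * math.exp(-b * h) - v * theta * alpha
            - theta * beta_c * v ** (tau + 1) - theta * gamma * dist)

# beta(1) > 0 and beta decreases to -infinity, so a root v* > 1 exists;
# bracket it, then bisect, keeping beta(lo) > 0 as the loop invariant.
lo, hi = 1.0, 1.0
while beta(hi) > 0.0:
    hi *= 1.01
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if beta(mid) > 0.0:
        lo = mid
    else:
        hi = mid
lam = lo
print(lam)
```

Any $\lambda$ in $(1, \upsilon^*)$ then satisfies the inequality (3.4) for these parameters, which is all the stability proof requires.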
Now let us consider
$$u_i(n) = \frac{\lambda^n|x_i(n) - x_i^*(n)|}{\theta_i(h)}, \quad n \in \mathbb{Z}, \; i = 1, 2, \ldots, m. \tag{3.5}$$
Using (3.2) and (3.5), we derive that
$$u_i(n+1) \le \lambda e^{-\underline{b}_i h}u_i(n) + \lambda\sum_{j=1}^m \theta_j(h)\alpha_{ij}^M u_j(n) + \sum_{j=1}^m \theta_j(h)\beta_{ij}^M\lambda^{\tau_{ij}+1}u_j(n-\tau_{ij}) + \sum_{j=1}^m \theta_j(h)\gamma_{ij}^M\sum_{p=1}^{+\infty}\lambda^{p+1}H_{ij}(p)u_j(n-p). \tag{3.6}$$
We consider the Lyapunov function
$$V(n) = \sum_{i=1}^m\left(u_i(n) + \sum_{j=1}^m \theta_j(h)\beta_{ij}^M\lambda^{\tau_{ij}+1}\sum_{s=n-\tau_{ij}}^{n-1}u_j(s) + \sum_{j=1}^m \theta_j(h)\gamma_{ij}^M\sum_{p=1}^{+\infty}H_{ij}(p)\lambda^{p+1}\sum_{s=n-p}^{n-1}u_j(s)\right). \tag{3.7}$$
Computing the difference $\Delta V(n) = V(n+1) - V(n)$ along (3.7), we obtain
$$\Delta V(n) = \sum_{i=1}^m\left(u_i(n+1) - u_i(n) + \sum_{j=1}^m \theta_j(h)\beta_{ij}^M\lambda^{\tau_{ij}+1}\big(u_j(n) - u_j(n-\tau_{ij})\big) + \sum_{j=1}^m \theta_j(h)\gamma_{ij}^M\sum_{p=1}^{+\infty}H_{ij}(p)\lambda^{p+1}\big(u_j(n) - u_j(n-p)\big)\right)$$
$$\le \sum_{i=1}^m\left((\lambda e^{-\underline{b}_i h} - 1)u_i(n) + \lambda\sum_{j=1}^m \theta_j(h)\alpha_{ij}^M u_j(n) + \sum_{j=1}^m \theta_j(h)\beta_{ij}^M\lambda^{\tau_{ij}+1}u_j(n) + \sum_{j=1}^m \theta_j(h)\gamma_{ij}^M\sum_{p=1}^{+\infty}\lambda^{p+1}H_{ij}(p)u_j(n)\right)$$
$$\le -\sum_{i=1}^m\left(1 - \lambda e^{-\underline{b}_i h} - \lambda\theta_i(h)\sum_{j=1}^m \alpha_{ji}^M - \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\lambda^{\tau_{ji}+1} - \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}\lambda^{p+1}H_{ji}(p)\right)u_i(n),$$
and by using (3.4) in the above we deduce that $\Delta V(n) \le 0$ for $n \in \mathbb{Z}_0^+$. From this result and (3.7) it follows that
$$\sum_{i=1}^m u_i(n) \le V(n) \le V(0) \quad \text{for } n \in \mathbb{Z}^+.$$
Thus,
$$\lambda^n\sum_{i=1}^m\frac{|x_i(n) - x_i^*(n)|}{\theta_i(h)} = \sum_{i=1}^m u_i(n) \le \sum_{i=1}^m\left(u_i(0) + \sum_{j=1}^m \theta_j(h)\beta_{ij}^M\lambda^{\tau_{ij}+1}\sum_{s=-\tau_{ij}}^{-1}u_j(s) + \sum_{j=1}^m \theta_j(h)\gamma_{ij}^M\sum_{p=1}^{+\infty}H_{ij}(p)\lambda^{p+1}\sum_{s=-p}^{-1}u_j(s)\right)$$
$$\le \sum_{i=1}^m\left(1 + \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\lambda^{\tau_{ji}+1}\tau_{ji} + \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}H_{ji}(p)\lambda^{p+1}p\right)\sup_{s \in \mathbb{Z}_0^-}\frac{|x_i(s) - x_i^*(s)|}{\theta_i(h)}.$$
Therefore, we obtain the assertion (3.1), where
$$\delta = \frac{\max_{1 \le i \le m}\theta_i(h)}{\min_{1 \le i \le m}\theta_i(h)}\,\sigma \ge 1, \qquad \sigma = \max_{1 \le i \le m}\left\{1 + \theta_i(h)\sum_{j=1}^m \beta_{ji}^M\lambda^{\tau_{ji}+1}\tau_{ji} + \theta_i(h)\sum_{j=1}^m \gamma_{ji}^M\sum_{p=1}^{+\infty}H_{ji}(p)\lambda^{p+1}p\right\} \ge 1.$$
We conclude from (3.1) that the unique periodic solution of (1.8) is globally exponentially stable, and this completes the proof.
4. Numerical simulations
In this section, we consider the following discrete-time neural networks with delays:
Example 4.1.
$$x(n+1) = x(n)e^{-5h[\cos(7n\pi)+4]} + \frac{1 - e^{-5h[\cos(7n\pi)+4]}}{5[\cos(7n\pi)+4]}\left[\frac{y(n)}{2 + [y(n)]^2} + \frac{z(n)}{3 + [z(n)]^2} + \frac{x(n-8)}{3 + [x(n-8)]^2} + \frac{\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)}{1 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 8\sin(n\pi)\right],$$
$$y(n+1) = y(n)e^{-4h[\sin(5n\pi)+7]} + \frac{1 - e^{-4h[\sin(5n\pi)+7]}}{4[\sin(5n\pi)+7]}\left[\frac{x(n)}{1 + [x(n)]^2} + \frac{y(n)}{3 + [y(n)]^2} + \frac{y(n-1)}{1 + [y(n-1)]^2} + \frac{\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)}{2 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 7\cos(n\pi)\right],$$
$$z(n+1) = z(n)e^{-6h[\sin(3n\pi)+6]} + \frac{1 - e^{-6h[\sin(3n\pi)+6]}}{6[\sin(3n\pi)+6]}\left[\frac{x(n)h}{1 + [x(n)]^2} + \frac{z(n)h}{3 + [z(n)]^2} + \frac{z(n-6)h}{1 + [z(n-6)]^2} + \frac{\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)h}{3 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 2\sin(n\pi)\right], \tag{4.1}$$
with initial conditions
(1) $\varphi_1(n) = 3\sin(\pi nh) + 7$, $\varphi_2(n) = 1.8\cos(3\pi nh) + 1$, $\varphi_3(n) = \sin(2\pi nh) - 20$;
(2) $\varphi_1(n) = 5\sin(2\pi nh) - 6$, $\varphi_2(n) = 0.8\cos(2\pi nh) + 15$, $\varphi_3(n) = \sin(4\pi nh) - 6$;
(3) $\varphi_1(n) = 3\sin(\pi nh) + 17$, $\varphi_2(n) = 3\cos(\pi nh) - 15$, $\varphi_3(n) = 2\sin(2\pi nh) + 2$;
respectively.
It is easy to verify that
$$b_i(n) > \left(\sum_{j=1}^m \alpha_{ji}^M + \sum_{j=1}^m \beta_{ji}^M + \sum_{j=1}^m \gamma_{ji}^M\right),$$
and that the functions $H_{ij}(p)$ satisfy
$$\sum_{p=1}^{100}(e^h - 1)e^{-ph} \approx 1, \qquad \sum_{p=1}^{100}p(e^h - 1)e^{-ph} < \sum_{p=1}^{+\infty}p(e^h - 1)e^{-ph} < +\infty,$$
so condition (H4) is satisfied. Furthermore, using the same method as in Theorem 3.1, we can choose a constant $\lambda$ such that
$$\sum_{p=1}^{100}p\lambda^p H_{ji}(p) = \sum_{p=1}^{100}p\lambda^p(e^h - 1)e^{-ph} < \sum_{p=1}^{+\infty}p\lambda^p(e^h - 1)e^{-ph} = \frac{\lambda(e^h - 1)e^{-h}}{(1 - \lambda e^{-h})^2} < +\infty.$$
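The closed form of the tail series can be confirmed numerically; a quick check with the illustrative values $\lambda = 1.01$, $h = 0.1$ (any $\lambda$ with $\lambda e^{-h} < 1$ works):

```python
import math

# Numerical confirmation of the closed form
#   sum_{p>=1} p lam^p (e^h - 1) e^{-ph} = lam (e^h - 1) e^{-h} / (1 - lam e^{-h})^2,
# valid whenever lam * e^{-h} < 1; lam = 1.01, h = 0.1 are illustrative values.
lam, h = 1.01, 0.1
x = lam * math.exp(-h)
assert x < 1.0                                # convergence condition

partial = sum(p * (math.exp(h) - 1.0) * x ** p for p in range(1, 5000))
closed = lam * (math.exp(h) - 1.0) * math.exp(-h) / (1.0 - x) ** 2
print(partial, closed)
```

The partial sum and the closed form agree up to the exponentially small truncated tail, which is the standard derivative-of-geometric-series identity $\sum_{p \ge 1} p x^p = x/(1-x)^2$ with $x = \lambda e^{-h}$.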
Thus, by Theorems 2.1 and 3.1, the system (4.1) has a unique positive 2-periodic solution which is globally exponentially stable (see Fig. 1).
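The convergence behaviour can be reproduced by iterating (4.1) directly. The sketch below uses the step size $h = 0.025$ and the truncation $P = 100$ from (4.1), runs initial conditions (1) and (2), and checks that the two trajectories merge, as global exponential stability predicts; the step count and tolerance are illustrative choices:

```python
import math

# Iterating system (4.1) from two different initial conditions; global
# exponential stability predicts the trajectories approach the same
# 2-periodic solution.  steps and the final tolerance are illustrative.
P, h = 100, 0.025
w = [(math.exp(h) - 1.0) * math.exp(-p * h) for p in range(P + 1)]  # kernel weights

def simulate(phi1, phi2, phi3, steps):
    # xs[P + n] holds x(n); indices 0..P carry the initial data on {-P,...,0}
    xs = [phi1(s) for s in range(-P, 1)]
    ys = [phi2(s) for s in range(-P, 1)]
    zs = [phi3(s) for s in range(-P, 1)]
    for n in range(steps):
        i = P + n
        x, y, z = xs[i], ys[i], zs[i]
        s = sum(w[p] * xs[i - p] for p in range(1, P + 1))  # distributed term
        bx = 5.0 * (math.cos(7 * n * math.pi) + 4.0)
        by = 4.0 * (math.sin(5 * n * math.pi) + 7.0)
        bz = 6.0 * (math.sin(3 * n * math.pi) + 6.0)
        fx = (y / (2 + y * y) + z / (3 + z * z)
              + xs[i - 8] / (3 + xs[i - 8] ** 2)
              + s / (1 + s * s) + 8 * math.sin(n * math.pi))
        fy = (x / (1 + x * x) + y / (3 + y * y)
              + ys[i - 1] / (1 + ys[i - 1] ** 2)
              + s / (2 + s * s) + 7 * math.cos(n * math.pi))
        fz = (x * h / (1 + x * x) + z * h / (3 + z * z)
              + zs[i - 6] * h / (1 + zs[i - 6] ** 2)
              + s * h / (3 + s * s) + 2 * math.sin(n * math.pi))
        xs.append(x * math.exp(-bx * h) + (1 - math.exp(-bx * h)) / bx * fx)
        ys.append(y * math.exp(-by * h) + (1 - math.exp(-by * h)) / by * fy)
        zs.append(z * math.exp(-bz * h) + (1 - math.exp(-bz * h)) / bz * fz)
    return xs, ys, zs

steps = 2000
traj1 = simulate(lambda s: 3 * math.sin(math.pi * s * h) + 7,
                 lambda s: 1.8 * math.cos(3 * math.pi * s * h) + 1,
                 lambda s: math.sin(2 * math.pi * s * h) - 20, steps)
traj2 = simulate(lambda s: 5 * math.sin(2 * math.pi * s * h) - 6,
                 lambda s: 0.8 * math.cos(2 * math.pi * s * h) + 15,
                 lambda s: math.sin(4 * math.pi * s * h) - 6, steps)
gap = max(abs(u[-1] - v[-1]) for u, v in zip(traj1, traj2))
print(gap)  # small: the two trajectories have numerically merged
```

This reproduces the qualitative content of Fig. 1 without plotting: the component-wise gap between the two runs shrinks to numerical noise.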
Example 4.2.
$$x(n+1) = x(n)e^{-4h[\sin(7n\pi)+5]} + \frac{1 - e^{-4h[\sin(7n\pi)+5]}}{4[\sin(7n\pi)+5]}\left[\frac{x(n)}{1 + [x(n)]^2} + \frac{y(n)}{2 + [y(n)]^2} + \frac{z(n)}{3 + [z(n)]^2} + \frac{x(n-1)}{1 + [x(n-1)]^2} + \frac{\sum_{p=1}^{200}(e^h - 1)e^{-ph}x(n-p)}{1 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 5\sin(n\pi)\right],$$
$$y(n+1) = y(n)e^{-4h[\sin(5n\pi)+7]} + \frac{1 - e^{-4h[\sin(5n\pi)+7]}}{4[\sin(5n\pi)+7]}\left[\frac{x(n)}{1 + [x(n)]^2} + \frac{y(n)}{2 + [y(n)]^2} + \frac{y(n)}{3 + [y(n)]^2} + \frac{x(n-1)}{1 + [x(n-1)]^2} + \frac{\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)}{2 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 6\cos(n\pi)\right],$$
$$z(n+1) = z(n)e^{-4h[\sin(3n\pi)+7]} + \frac{1 - e^{-4h[\sin(3n\pi)+7]}}{4[\sin(3n\pi)+7]}\left[\frac{x(n)h}{1 + [x(n)]^2} + \frac{z(n)h}{2 + [y(n)]^2} + \frac{z(n)h}{3 + [z(n)]^2} + \frac{x(n-1)h}{1 + [x(n-1)]^2} + \frac{\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)h}{3 + \left[\sum_{p=1}^{100}(e^h - 1)e^{-ph}x(n-p)\right]^2} + 7\sin(n\pi)\right], \tag{4.2}$$
with initial conditions
(1) $\varphi_1(n) = 7$, $\varphi_2(n) = 1$, $\varphi_3(n) = -20$;
(2) $\varphi_1(n) = -6$, $\varphi_2(n) = 5$, $\varphi_3(n) = -6$;
(3) $\varphi_1(n) = 1$, $\varphi_2(n) = -5$, $\varphi_3(n) = 2$;
respectively.
It is straightforward to check that all the conditions needed in Theorems 2.1 and 3.1 are satisfied. Therefore, system (4.2) has exactly one 2-periodic solution, which is globally exponentially stable (see Fig. 2).
Fig. 1. "−·" represents the state trajectory of the neural network (4.1) with initial condition (1); "···" represents the state trajectory with initial condition (2); "−" represents the state trajectory with initial condition (3). The three panels show $x$, $y$ and $z$ against $t$. The three state trajectories of the neural network (4.1) all converge exponentially to a unique 2-periodic solution ($h = 0.025$).
Fig. 2. Seen in the three-dimensional phase spaces ($x$–$y$–$t$, $y$–$z$–$t$, $z$–$x$–$t$ and $x$–$y$–$z$), the neural network (4.2) has a unique globally exponentially stable 2-periodic solution ($h = 0.025$).
Acknowledgements
The authors would like to thank the Editor and the reviewers for their helpful comments and constructive suggestions, which have been very useful for improving the presentation of this paper.
References
[1] Y.Q. Li, L.H. Huang, Exponential convergence behavior of solutions to shunting inhibitory cellular neural networks with delays and time-varying coefficients, Math. Comput. Modelling 48 (2008) 499–504.
[2] W. Wu, B.T. Cui, X.Y. Lou, Global exponential stability of Cohen–Grossberg neural networks with distributed delays, Math. Comput. Modelling (2008) 868–873.
[3] H.J. Jiang, J.D. Cao, BAM-type Cohen–Grossberg neural networks with time delays, Math. Comput. Modelling 47 (2008) 92–103.
[4] S.J. Guo, L.H. Huang, Stability of nonlinear waves in a ring of neurons with delays, J. Differential Equations 236 (2007) 343–374.
[5] Y.R. Liu, Z.D. Wang, X.H. Liu, Asymptotic stability for neural networks with mixed time-delays: the discrete-time case, Neural Netw. 22 (2009) 67–74.
[6] S.M. Zhong, X.Z. Liu, Exponential stability and periodicity of cellular neural networks with time delay, Math. Comput. Modelling 45 (2007) 1231–1240.
[7] J.H. Park, A new stability analysis of delayed cellular neural networks, Appl. Math. Comput. 181 (2006) 200–205.
[8] M. Fan, X.F. Zou, Global asymptotic stability of a class of nonautonomous integro-differential systems and applications, Nonlinear Anal. 57 (2004) 111–135.
[9] S. Mohamad, A.G. Naim, Discrete-time analogues of integrodifferential equations modelling bidirectional neural networks, J. Comput. Appl. Math. 138 (2002) 1–20.
[10] Y.K. Li, Global stability and existence of periodic solutions of discrete delayed cellular neural networks, Phys. Lett. A 333 (2004) 51–61.
[11] S. Mohamad, K. Gopalsamy, Neuronal dynamics in time varying environments: continuous and discrete time models, Discrete Contin. Dyn. Syst. 6 (2000) 841–860.
[12] P. Liu, X. Cui, A discrete model of competition, Math. Comput. Simulation 49 (1999) 1–12.
[13] R.E. Gaines, J.L. Mawhin, Coincidence Degree and Nonlinear Differential Equations, Springer-Verlag, Berlin, 1977.
[14] T. Yoshizawa, Stability Theory by Liapunov's Second Method, Math. Soc. Japan, Tokyo, 1966.
[15] T. Yoshizawa, Stability Theory and the Existence of Periodic Solutions and Almost-Periodic Solutions, Applied Math. Sciences, vol. 14, Springer-Verlag, 1975.
[16] M. Fan, K. Wang, Periodic solutions of a discrete time nonautonomous ratio-dependent predator–prey system, Math. Comput. Modelling 35 (2002) 951–961.
[17] M. Fan, K. Wang, Existence and global attractivity of positive periodic solutions of periodic n-species Lotka–Volterra competition systems with several deviating arguments, Math. Biosci. 160 (1999) 47–61.