
Global Stability Analysis for Periodic Solution in Discontinuous Neural Networks with Nonlinear Growth Activations

Abstract

This paper considers a new class of additive neural networks where the neuron activations are modelled by discontinuous functions with nonlinear growth. By the Leray–Schauder alternative theorem in differential inclusion theory, matrix theory, and a generalized Lyapunov approach, a general result is derived which ensures the existence and global asymptotic stability of a unique periodic solution for such neural networks. The obtained results apply to neural networks with a broad range of activation functions, assuming neither boundedness nor monotonicity, and also show that Forti's conjecture is true for discontinuous neural networks with nonlinear growth activations.

1. Introduction

The stability of neural networks, which includes the stability of periodic solutions and the stability of equilibrium points, has been extensively studied by many authors; see, for example, [1–15]. In [1–4], the authors investigated the stability of periodic solutions of neural networks with or without time delays, where the assumptions on the neuron activation functions include Lipschitz conditions, boundedness, and/or monotonic increase. Recently, in [13–15], the authors discussed global stability of the equilibrium points of neural networks with discontinuous neuron activations. In particular, in [14], Forti conjectured that all solutions of neural networks with discontinuous neuron activations converge to an asymptotically stable limit cycle whenever the neuron inputs are periodic functions. As far as we know, only the works of Wu [5, 7] and Papini and Taddei [9] deal with this conjecture. However, the activation functions are required to be monotonic in [5, 7, 9] and bounded in [5, 7].

In this paper, without assuming boundedness or monotonicity of the activation functions, we study the existence of periodic solutions for discontinuous neural networks with nonlinear growth activations by means of the Leray–Schauder alternative theorem in differential inclusion theory and some new analysis techniques. By constructing suitable Lyapunov functions, we give a general condition for the global asymptotic stability of the periodic solution. The results obtained in this paper show that Forti's conjecture in [14] is true for discontinuous neural networks with nonlinear growth activations.

For later discussion, we introduce the following notations.

Let $x = (x_1, x_2, \ldots, x_n)' \in \mathbb{R}^n$, where the prime means the transpose. By $x \ge 0$ (resp., $x > 0$) we mean that $x_i \ge 0$ (resp., $x_i > 0$) for all $i = 1, 2, \ldots, n$. $\|x\|$ denotes the Euclidean norm of $x$. $\langle \cdot, \cdot \rangle$ denotes the inner product. $\|T\|_2$ denotes the 2-norm of a matrix $T$, that is, $\|T\|_2 = \sqrt{\rho(T'T)}$, where $\rho(T'T)$ denotes the spectral radius of $T'T$.

Given a set $E \subset \mathbb{R}^n$, by $\overline{\mathrm{co}}[E]$ we denote the closure of the convex hull of $E$, and $P_{kc}(\mathbb{R}^n)$ denotes the collection of all nonempty, closed, and convex subsets of $\mathbb{R}^n$. Let $X$ be a Banach space with norm $\|\cdot\|_X$. By $L^1([0,\omega];\mathbb{R}^n)$ we denote the Banach space of the Lebesgue integrable functions $x : [0,\omega] \to \mathbb{R}^n$ equipped with the norm $\int_0^\omega \|x(t)\|\,dt$. Let $V : \mathbb{R}^n \to \mathbb{R}$ be a locally Lipschitz continuous function. Clarke's generalized gradient [16] of $V$ at $x$ is defined by

$$\partial V(x) = \overline{\mathrm{co}}\Big[\lim_{k \to \infty} \nabla V(x_k) : x_k \to x,\ x_k \notin \Omega_V,\ x_k \notin N\Big], \tag{1.1}$$

where $\Omega_V \subset \mathbb{R}^n$ is the set of Lebesgue measure zero on which $\nabla V$ does not exist, and $N \subset \mathbb{R}^n$ is an arbitrary set with measure zero.
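As a standard illustration (ours, not from the source), take $V(x) = |x|$ on $\mathbb{R}$; the computation below follows directly from (1.1):

```latex
% V(x) = |x| is differentiable except at x = 0, with nabla V(x) = sign(x).
% Collecting all limits of gradients along sequences x_k -> x avoiding the
% null set and taking the closed convex hull as in (1.1) gives
\[
\partial V(x) =
\begin{cases}
\{1\}, & x > 0,\\[2pt]
[-1,\,1], & x = 0,\\[2pt]
\{-1\}, & x < 0.
\end{cases}
\]
```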

The rest of this paper is organized as follows. Section 2 develops a discontinuous neural network model with nonlinear growth activations and gives some preliminaries. Section 3 presents the proof of the existence of a periodic solution. Section 4 discusses the global asymptotic stability of the neural network. Illustrative examples showing the effectiveness of the obtained results are provided in Section 5.

2. Model Description and Preliminaries

The model we consider in the present paper is the neural network modeled by the differential equation

$$\frac{dx(t)}{dt} = Bx(t) + Tg(x(t)) + I(t), \tag{2.1}$$

where $x(t) = (x_1(t), \ldots, x_n(t))'$ is the vector of neuron states at time $t$; $B$ is an $n \times n$ matrix representing the neuron inhibition; $T$ is an $n \times n$ neuron interconnection matrix; $g(x) = (g_1(x_1), \ldots, g_n(x_n))'$ represents the neuron input-output activations; and $I(t)$ is a continuous $\omega$-periodic vector function denoting the neuron inputs.

Throughout the paper, we assume that

each $g_i$ has only a finite number of discontinuity points in every compact set of $\mathbb{R}$ and, at each discontinuity point, finite right and left limits exist.

Moreover, $g$ has the nonlinear growth property; that is, for all $x \in \mathbb{R}^n$,

(2.2)

where , are constants, and .

In addition, we assume that $B$ satisfies $\langle x, Bx \rangle \le -b\|x\|^2$ for all $x \in \mathbb{R}^n$, where $b > 0$ is a constant.
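For concreteness, here is a minimal sketch (ours, not the authors') of one activation of the admitted type, assuming the growth bound in (2.2) has the sublinear form $|g_i(s)| \le c_i|s|^{\sigma} + d_i$ with $0 \le \sigma < 1$; the exact form of (2.2) did not survive extraction.

```python
import numpy as np

def g(s):
    """A discontinuous, unbounded activation with sublinear growth.

    Discontinuous at s = 0 (jump of size 1), unbounded as |s| -> infinity,
    and |g(s)| <= |s|**(1/3) + 0.5, i.e. c = 1, d = 0.5, sigma = 1/3.
    """
    return np.cbrt(s) + 0.5 * np.sign(s)

# Finite one-sided limits at the discontinuity point 0:
eps = 1e-9
print(g(-eps), g(eps))  # approximately -0.5 and +0.5
```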

Under the above assumptions, $g$ may be undefined at the points where it is discontinuous, so (2.1) is a differential equation with a discontinuous right-hand side. For (2.1), we adopt the following definition of a solution in the sense of Filippov [17].

Definition 2.1.

Under the above assumptions, a solution of (2.1) on an interval $[0, \mathcal{T}]$, $\mathcal{T} > 0$, with the initial value $x(0) = x_0$ is an absolutely continuous function $x(t)$ on $[0, \mathcal{T}]$ satisfying

$$\frac{dx(t)}{dt} \in Bx(t) + T\,\overline{\mathrm{co}}[g(x(t))] + I(t) \quad \text{for a.e. } t, \tag{2.3}$$

where $\overline{\mathrm{co}}[g(x)] = (\overline{\mathrm{co}}[g_1(x_1)], \ldots, \overline{\mathrm{co}}[g_n(x_n)])'$ and $\overline{\mathrm{co}}[g_i(x_i)] = [\min\{g_i(x_i^-), g_i(x_i^+)\}, \max\{g_i(x_i^-), g_i(x_i^+)\}]$.

It is easy to see that $x \mapsto \overline{\mathrm{co}}[g(x)]$ is an upper semicontinuous set-valued map with nonempty compact convex values; hence, it is measurable [18]. By the measurable selection theorem [19], if $x(t)$ is a solution of (2.1), then there exists a measurable function $\gamma(t) \in \overline{\mathrm{co}}[g(x(t))]$ such that

$$\frac{dx(t)}{dt} = Bx(t) + T\gamma(t) + I(t) \quad \text{for a.e. } t. \tag{2.4}$$
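As a numerical illustration (ours; all parameter values are placeholders, and the additive form of (2.1) is assumed as reconstructed above), a solution of (2.4) can be approximated by forward Euler steps, using at each step the single-valued selection $\gamma = g(x)$, which is valid off the discontinuity set:

```python
import numpy as np

def euler_filippov(B, T, g, I, x0, h=1e-3, t_end=50.0):
    """Forward-Euler approximation of dx/dt = B x + T gamma + I(t),
    with gamma(t) in co[g(x(t))]; off the discontinuity set gamma = g(x)."""
    steps = int(t_end / h)
    x = np.array(x0, dtype=float)
    traj = np.empty((steps + 1, x.size))
    traj[0] = x
    for k in range(steps):
        gamma = g(x)  # a measurable selection of co[g(x)], a.e. t
        x = x + h * (B @ x + T @ gamma + I(k * h))
        traj[k + 1] = x
    return traj

# Illustrative placeholder data (not from the paper):
B = np.diag([-3.0, -3.0])
T = np.array([[0.5, -0.2], [0.3, 0.4]])
g = lambda x: np.cbrt(x) + 0.5 * np.sign(x)     # discontinuous, unbounded
I = lambda t: np.array([np.sin(t), np.cos(t)])  # 2*pi-periodic input
traj = euler_filippov(B, T, g, I, x0=[2.0, -1.0])
```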

Consider the following differential inclusion boundary-value problem:

$$\frac{dx(t)}{dt} \in Bx(t) + T\,\overline{\mathrm{co}}[g(x(t))] + I(t) \quad \text{a.e. } t \in [0, \omega], \qquad x(0) = x(\omega). \tag{2.5}$$

It easily follows that if $x(t)$ is a solution of (2.5), then the function $x^*(t)$ defined by

$$x^*(t) = x(t - k\omega), \quad t \in [k\omega, (k+1)\omega],\ k = 0, 1, 2, \ldots \tag{2.6}$$

is an $\omega$-periodic solution of (2.1). Hence, for the neural network (2.1), finding periodic solutions is equivalent to finding solutions of the boundary-value problem (2.5).

Definition 2.2.

The periodic solution $x^*(t)$ with initial value $x^*(0)$ of the neural network (2.1) is said to be globally asymptotically stable if it is stable and if, for any solution $x(t)$ whose existence interval is $[0, +\infty)$, we have $\lim_{t \to +\infty} \|x(t) - x^*(t)\| = 0$.

Lemma 2.3.

If $X$ is a Banach space, $K \subset X$ is nonempty, closed, and convex with $0 \in K$, and $G : K \to P_{kc}(K)$ is an upper semicontinuous set-valued map which maps bounded sets into relatively compact sets, then one of the following statements is true:

(a) the set $S = \{x \in K : x \in \lambda G(x) \text{ for some } \lambda \in (0,1)\}$ is unbounded;

(b) the map $G$ has a fixed point in $K$, that is, there exists $x \in K$ such that $x \in G(x)$.

Lemma 2.3 is the Leray–Schauder alternative theorem, whose proof can be found in [20]. Define the following:

(2.7)

then this defines a class of norms, and the corresponding function spaces are Banach spaces under these norms.

If $V : \mathbb{R}^n \to \mathbb{R}$ is (i) regular in the sense of [16]; (ii) positive definite, that is, $V(x) > 0$ for $x \neq 0$ and $V(0) = 0$; and (iii) radially unbounded, that is, $V(x) \to +\infty$ as $\|x\| \to +\infty$, then $V$ is said to be C-regular.

Lemma 2.4 (Chain Rule [15]).

If $V : \mathbb{R}^n \to \mathbb{R}$ is C-regular and $x(t) : [0, +\infty) \to \mathbb{R}^n$ is absolutely continuous on any compact interval of $[0, +\infty)$, then $x(t)$ and $V(x(t))$ are differentiable for a.e. $t \in [0, +\infty)$, and one has

$$\frac{d}{dt}V(x(t)) = \langle \zeta(t), \dot{x}(t) \rangle, \quad \forall \zeta(t) \in \partial V(x(t)). \tag{2.8}$$

3. Existence of Periodic Solution

Theorem 3.1.

If the assumptions of Section 2 hold, then for any $x_0 \in \mathbb{R}^n$, (2.1) has at least one solution defined on $[0, +\infty)$ with the initial value $x(0) = x_0$.

Proof.

By the assumptions on $g$, it is easy to see that $x \mapsto \overline{\mathrm{co}}[g(x)]$ is an upper semicontinuous set-valued map with nonempty, compact, and convex values. Hence, by Definition 2.1, the local existence of a solution of (2.1) with $x(0) = x_0$ is obvious [17].

Since $I(t)$ is a continuous $\omega$-periodic vector function, it is bounded; that is, there exists a constant $m > 0$ such that $\|I(t)\| \le m$ for all $t$. By the growth assumption (2.2), we have

(31)

By (2.2), we can choose a constant $r > 0$ such that, when $\|x\| \ge r$,

(32)

By (2.4), (3.1), (3.2), and the Cauchy inequality, when $\|x\| \ge r$,

(33)

Therefore, by (3.3), it follows that the solution remains bounded on its interval of existence. This means that the local solution is bounded, and hence can be extended; thus (2.1) has at least one solution with the initial value $x(0) = x_0$ on $[0, +\infty)$. This completes the proof.
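The boundedness step can be sketched as follows (our reconstruction, assuming the dissipativity condition $\langle x, Bx \rangle \le -b\|x\|^2$ with $b > 0$ and a sublinear growth bound $\|\gamma\| \le c\|x\|^{\sigma} + d$, $0 \le \sigma < 1$, for $\gamma \in \overline{\mathrm{co}}[g(x)]$):

```latex
% A priori estimate along a solution x(t) of (2.4), for a.e. t:
\begin{aligned}
\frac{1}{2}\frac{d}{dt}\|x(t)\|^{2}
  &= \langle x, Bx \rangle + \langle x, T\gamma \rangle + \langle x, I(t) \rangle\\
  &\le -b\|x\|^{2} + \|T\|_{2}\,\|x\|\bigl(c\|x\|^{\sigma} + d\bigr) + m\|x\|.
\end{aligned}
% Since sigma < 1, the term -b||x||^2 dominates for large ||x||, so
% d||x||^2/dt < 0 whenever ||x|| >= r with r large enough; hence the
% solution remains in the ball of radius max{r, ||x_0||} for all time.
```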

Theorem 3.1 shows the existence of solutions of (2.1). In the following, we will prove that (2.1) has an $\omega$-periodic solution.

Let $(Lx)(t) = \dot{x}(t) - Bx(t)$ for all $x$ with $x(0) = x(\omega)$; then $L$ is a linear operator.

Proposition 3.2.

$L$ is bounded, one-to-one, and surjective.

Proof.

For any $x$, we have

(3.4)

which implies that $L$ is bounded.

Suppose that $Lx = 0$; then

$$\dot{x}(t) = Bx(t). \tag{3.5}$$

By the assumption on $B$,

(3.6)

Noting that $x(0) = x(\omega)$, we have

(3.7)

By (3.6),

(3.8)

Hence $x = 0$. This shows that $L$ is one-to-one.

Let $h \in L^1([0,\omega];\mathbb{R}^n)$. In order to verify that $L$ is surjective, we will prove that there exists $x$ such that

$$Lx = h, \tag{3.9}$$

that is, we will prove that there exists a solution of the differential equation

$$\dot{x}(t) = Bx(t) + h(t), \qquad x(0) = x(\omega). \tag{3.10}$$

Consider the Cauchy problem

$$\dot{x}(t) = Bx(t) + h(t), \qquad x(0) = x_0. \tag{3.11}$$

It is easily checked that

$$x(t) = e^{Bt}x_0 + \int_0^t e^{B(t-s)}h(s)\,ds \tag{3.12}$$

is the solution of (3.11). By (3.12), requiring $x(\omega) = x_0$, we obtain

$$e^{B\omega}x_0 + \int_0^\omega e^{B(\omega-s)}h(s)\,ds = x_0, \tag{3.13}$$

that is,

$$\big(E - e^{B\omega}\big)x_0 = \int_0^\omega e^{B(\omega-s)}h(s)\,ds. \tag{3.14}$$

By the assumption on $B$, $E - e^{B\omega}$ is a nonsingular matrix, where $E$ is the identity matrix. Thus, by (3.14), if we take $x_0$ as

$$x_0 = \big(E - e^{B\omega}\big)^{-1}\int_0^\omega e^{B(\omega-s)}h(s)\,ds \tag{3.15}$$

in (3.12), then (3.12) is a solution of (3.10). This shows that $L$ is surjective. This completes the proof.

By the Banach inverse operator theorem, $L^{-1}$ is a bounded linear operator.
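As a numerical companion to (3.12)–(3.15) (our sketch; the matrices and forcing are placeholders), the initial value producing the unique $\omega$-periodic solution of the linear problem $\dot{x} = Bx + h(t)$ can be evaluated with matrix exponentials and quadrature:

```python
import numpy as np
from scipy.linalg import expm

def periodic_initial_condition(B, h, omega, n_quad=2000):
    """Evaluate x0 = (E - e^{B omega})^{-1} int_0^omega e^{B(omega-s)} h(s) ds,
    that is, formula (3.15), by the trapezoidal rule."""
    n = B.shape[0]
    s = np.linspace(0.0, omega, n_quad)
    vals = np.array([expm(B * (omega - si)) @ h(si) for si in s])
    ds = s[1] - s[0]
    integral = ds * (vals.sum(axis=0) - 0.5 * (vals[0] + vals[-1]))
    return np.linalg.solve(np.eye(n) - expm(B * omega), integral)

# Placeholder data: B dissipative, h a 2*pi-periodic forcing.
B = np.array([[-2.0, 0.5], [-0.5, -2.0]])
h = lambda t: np.array([np.sin(t), np.cos(t)])
x0 = periodic_initial_condition(B, h, omega=2 * np.pi)
# Integrating (3.11) from x0 returns to x0 after one period.
```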

For any $x$, define the set-valued map $N(x)$ as

$$N(x) = \big\{\gamma \in L^1([0,\omega];\mathbb{R}^n) : \gamma(t) \in \overline{\mathrm{co}}[g(x(t))] \ \text{a.e. } t \in [0,\omega]\big\}. \tag{3.16}$$

Then $N$ has the following properties.

Proposition 3.3.

$N$ has nonempty closed convex values in $L^1([0,\omega];\mathbb{R}^n)$ and is also upper semicontinuous into $L^1([0,\omega];\mathbb{R}^n)$ endowed with the weak topology.

Proof.

The closedness and convexity of the values of $N$ are clear. Next, we verify the nonemptiness. In fact, for any $x$, there exists a sequence of step functions $\{x_k\}$ such that $x_k \to x$ a.e. on $[0,\omega]$. By the assumption (1) and the continuity properties involved, the map $t \mapsto \overline{\mathrm{co}}[g(x(t))]$ is graph measurable. Hence, for any $k$, it admits a measurable selector $\gamma_k$. By the assumption (2), $\{\gamma_k\}$ is uniformly integrable. So, by the Dunford–Pettis theorem, there exists a subsequence such that $\gamma_k \to \gamma$ weakly in $L^1$. Hence, from [21, Theorem 3.1], we have

(3.17)

Noting that $\overline{\mathrm{co}}[g(\cdot)]$ is an upper semicontinuous set-valued map with nonempty closed convex values, we get $\gamma(t) \in \overline{\mathrm{co}}[g(x(t))]$ for a.e. $t \in [0,\omega]$. Therefore, $\gamma \in N(x)$. This shows that $N(x)$ is nonempty.

Finally, we prove that $N$ is upper semicontinuous into $L^1([0,\omega];\mathbb{R}^n)$ with the weak topology. Let $C$ be a nonempty and weakly closed subset of $L^1([0,\omega];\mathbb{R}^n)$; then we need only prove that the set

$$N^{-}(C) = \{x : N(x) \cap C \neq \emptyset\} \tag{3.18}$$

is closed. Let $\{x_k\} \subset N^{-}(C)$ and assume $x_k \to x$; then there exists a subsequence such that $x_k \to x$ a.e. on $[0,\omega]$. Take $\gamma_k \in N(x_k) \cap C$, $k \ge 1$; then, by the assumption (2) and the Dunford–Pettis theorem, there exists a subsequence such that $\gamma_k \to \gamma$ weakly in $L^1$. As before, we have

(3.19)

This implies $\gamma \in N(x) \cap C$, that is, $x \in N^{-}(C)$; hence $N^{-}(C)$ is closed. The proof is complete.

Theorem 3.4.

Under the assumptions of Section 2, there exists a solution of the boundary-value problem (2.5); that is, the neural network (2.1) has an $\omega$-periodic solution.

Proof.

Consider the set-valued map $x \mapsto L^{-1}(TN(x) + I)$. Since $L^{-1}$ is continuous and $N$ is upper semicontinuous, this set-valued map is upper semicontinuous. Let $D$ be a bounded set; then

(3.20)

is a bounded subset of $L^1([0,\omega];\mathbb{R}^n)$. Since $L^{-1}$ is a bounded linear operator, the image of this set under $L^{-1}$ is bounded in a space which is compactly embedded in the state space, and hence is relatively compact there. Therefore, by Proposition 3.3, $x \mapsto L^{-1}(TN(x) + I)$ is an upper semicontinuous set-valued map which maps bounded sets into relatively compact sets.

For any $x$ with $\|x\|$ large enough, by (3.1) and the Cauchy inequality,

(3.21)

Arguing as in (3.2), we can choose a constant $r_1 > 0$ such that, when $\|x\| \ge r_1$,

(3.22)

Therefore, when $\|x\| \ge r_1$, by (3.21),

(3.23)

Set

(3.24)

In the following, we will prove that this set is bounded. Let $x$ belong to the set (3.24). By the definition of $N$, there exists a measurable selection $\gamma \in N(x)$ such that

(3.25)

By (3.23) and (3.25), we claim that $\|x(t)\|$ remains bounded by a constant independent of $x$. Suppose this fails. Since $x(t)$ is continuous, we can choose a point such that

(3.26)

Thus, there exists a constant such that the corresponding estimate holds on that interval. By (3.23) and (3.25),

(3.27)

This is a contradiction. Thus the bound holds for every $x$ in the set (3.24). Furthermore, we have

(3.28)

This shows that the set (3.24) is bounded.

By Lemma 2.3, the set-valued map $x \mapsto L^{-1}(TN(x) + I)$ has a fixed point; that is, there exists $x$ such that $x \in L^{-1}(TN(x) + I)$. Hence there exists a measurable selection $\gamma \in N(x)$ such that

(3.29)

By the definition of $N$, $\gamma(t) \in \overline{\mathrm{co}}[g(x(t))]$ a.e. Moreover, by Definition 2.1 and (3.29), $x(t)$ is a solution of the boundary-value problem (2.5); that is, the neural network (2.1) has an $\omega$-periodic solution. The proof is complete.

4. Global Asymptotic Stability of the Periodic Solution

Theorem 4.1.

Suppose that the assumptions of Section 2 and the following assumptions are satisfied.

For each $i$, there exists a constant $q_i \ge 0$ such that, for any two different numbers $u, v \in \mathbb{R}$, for all $\eta \in \overline{\mathrm{co}}[g_i(u)]$ and all $\zeta \in \overline{\mathrm{co}}[g_i(v)]$,

$$\frac{\eta - \zeta}{u - v} \ge -q_i. \tag{4.1}$$

In addition, $B$ is a diagonal matrix, and there exists a positive diagonal matrix $P$ such that

(4.2)

where $\lambda_{\min}(\cdot)$ denotes the minimum eigenvalue of the indicated symmetric matrix. Then the neural network (2.1) has a unique $\omega$-periodic solution which is globally asymptotically stable.
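Conditions of the type in (4.2) are typically checked by computing the minimum eigenvalue of a symmetric matrix assembled from $B$, $T$, and $P$. The exact matrix in (4.2) did not survive extraction, so the sketch below (ours) only shows the generic verification pattern with the placeholder matrix $-(PB + B'P)$; the matrix actually required by (4.2) should be substituted:

```python
import numpy as np

def min_eig_condition(B, T, P, margin=0.0):
    """Generic checker for a condition lambda_min(S) > margin, where S is a
    symmetric matrix built from the network data. Placeholder: S = -(PB + B'P);
    the real (4.2) also involves T."""
    S = -(P @ B + B.T @ P)                 # substitute the matrix from (4.2)
    lam_min = np.linalg.eigvalsh(S).min()  # eigvalsh: symmetric eigenvalues
    return lam_min, lam_min > margin

B = np.diag([-4.0, -5.0])            # diagonal, as required in Theorem 4.1
T = np.array([[0.5, -0.3], [0.2, 0.6]])
P = np.diag([1.0, 2.0])              # positive diagonal matrix
print(min_eig_condition(B, T, P))    # (8.0, True) for these placeholders
```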

Proof.

By the above assumptions, there exists a positive constant $\epsilon$ such that

(4.3)

for all $i$ and all relevant arguments, and

(4.4)

In fact, from (4.2), we have

(4.5)

Choose $\epsilon$ small enough; then (4.3) holds by (4.1), and (4.4) is also satisfied. Under the assumptions of Theorem 4.1, the assumptions of Theorem 3.4 are easily verified, so the neural network (2.1) has an $\omega$-periodic solution $x^*(t)$. Consider the change of variables $z(t) = x(t) - x^*(t)$, which transforms (2.4) into the differential equation

$$\frac{dz(t)}{dt} = Bz(t) + T\big(\gamma(t) - \gamma^*(t)\big), \tag{4.6}$$

where $z(t) = x(t) - x^*(t)$, $\gamma(t) \in \overline{\mathrm{co}}[g(x(t))]$, and $\gamma^*(t) \in \overline{\mathrm{co}}[g(x^*(t))]$. Obviously, $z(t) \equiv 0$ is a solution of (4.6).

Consider the following Lyapunov function:

(4.7)

By (4.3),

(4.8)

and thus $V$ is positive definite and radially unbounded. In addition, it is easy to check that $V$ is regular. This implies that $V$ is C-regular. We now calculate the derivative of $V$ along the solutions of (4.6). By Lemma 2.4, (4.3), and (4.4),

(4.9)

Thus, the solution $z(t) \equiv 0$ of (4.6) is globally asymptotically stable, and so is the periodic solution $x^*(t)$ of the neural network (2.1). Consequently, the periodic solution is unique. The proof is complete.
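The concluding step is the standard Lyapunov argument; under our reading of (4.9) it runs as follows (a sketch, assuming (4.9) yields a bound of the form $dV/dt \le -\eta\|z\|^2$ with $\eta > 0$):

```latex
% For a.e. t, dV(z(t))/dt <= -eta ||z(t)||^2 <= 0, so V(z(t)) is
% nonincreasing; positive definiteness and radial unboundedness of V
% then keep z(t) bounded. Integrating the differential inequality,
\[
V(z(t)) - V(z(0)) \le -\eta \int_{0}^{t} \|z(s)\|^{2}\, ds
\quad\Longrightarrow\quad
\int_{0}^{\infty} \|z(s)\|^{2}\, ds \le \frac{V(z(0))}{\eta} < \infty,
\]
% which, together with the boundedness of dz/dt on bounded sets, forces
% z(t) -> 0 as t -> infinity, i.e., x(t) -> x^*(t).
```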

Remark 4.2.

(1) If each $g_i$ is nondecreasing, then the first assumption of Theorem 4.1 obviously holds; thus, that assumption is more general.

(2) In [14], Forti et al. considered delayed neural networks modelled by the differential equation

$$\frac{dx(t)}{dt} = -Dx(t) + Tg(x(t)) + T^{\tau}g(x(t-\tau)) + I, \tag{4.10}$$

where $D$ is a positive diagonal matrix and $T^{\tau}$ is an $n \times n$ constant matrix which represents the delayed neuron interconnections. When $g$ satisfies the assumption (1) and is nondecreasing and bounded, [14] investigated the existence and global exponential stability of the equilibrium point, and global convergence in finite time, for the neural network (4.10). Finally, Forti conjectured that the neural network

$$\frac{dx(t)}{dt} = -Dx(t) + Tg(x(t)) + T^{\tau}g(x(t-\tau)) + I(t) \tag{4.11}$$

has a unique periodic solution, and all solutions converge to this asymptotically stable limit cycle, whenever $I(t)$ is a periodic function. When $T^{\tau} = 0$, the neural network (4.11) reduces to the neural network (2.1) without delays. Thus, without assuming boundedness or monotonicity of the activation functions, Theorem 4.1 of this paper shows that Forti's conjecture is true for discontinuous neural networks with nonlinear growth activations and without delays.

5. Illustrative Example

Example 5.1.

Consider the three-dimensional neural network (2.1) defined by

(5.1)

It is easy to see that the activation is discontinuous, unbounded, and nonmonotonic, and that it satisfies the assumptions of Section 2. With the matrices and parameters chosen in (5.1), we have

(5.2)

All the assumptions of Theorem 4.1 hold, so the neural network in Example 5.1 has a unique $\omega$-periodic solution which is globally asymptotically stable.

Figures 1 and 2 show the state trajectory of this neural network with a random initial value. It can be seen that this trajectory converges to the unique periodic solution of the neural network, in accordance with the conclusion of Theorem 4.1.
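The specific data of (5.1) were not recoverable here, so the following self-contained sketch (ours) simulates a three-dimensional instance with placeholder parameters of the admitted type (discontinuous, unbounded, nonmonotonic activation; $2\pi$-periodic input) and checks numerically that the trajectory approaches a periodic orbit:

```python
import numpy as np

# Placeholder data of the admitted type (not the paper's (5.1)).
B = np.diag([-6.0, -6.0, -6.0])
T = np.array([[ 0.4, -0.2,  0.1],
              [ 0.1,  0.3, -0.2],
              [-0.1,  0.2,  0.4]])
omega = 2 * np.pi

def g(x):
    # Discontinuous at 0, unbounded, nonmonotonic: cube root + jump + ripple.
    return np.cbrt(x) + 0.5 * np.sign(x) - 0.3 * np.sin(2 * x)

def I(t):
    return np.array([np.sin(t), np.cos(t), np.sin(t + 1.0)])

h, t_end = 1e-3, 40 * np.pi          # integrate over twenty periods
steps = int(t_end / h)
x = np.array([2.0, -1.0, 0.5])       # a "random" initial value
traj = np.empty((steps + 1, 3))
traj[0] = x
for k in range(steps):
    x = x + h * (B @ x + T @ g(x) + I(k * h))
    traj[k + 1] = x

# Periodicity check: ||x(t + omega) - x(t)|| should shrink as t grows.
lag = int(omega / h)
print(np.linalg.norm(traj[-1] - traj[-1 - lag]))  # near 0 after transients
```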

Figure 1. Time-domain behavior of the state variables $x_1(t)$, $x_2(t)$, and $x_3(t)$.

Figure 2. Phase-plane behavior of the state variables $x_1(t)$, $x_2(t)$, and $x_3(t)$.

References

  1. Di Marco M, Forti M, Tesi A: Existence and characterization of limit cycles in nearly symmetric neural networks. IEEE Transactions on Circuits and Systems I 2002, 49(7):979-992. 10.1109/TCSI.2002.800481

  2. Chen B, Wang J: Global exponential periodicity and global exponential stability of a class of recurrent neural networks. Physics Letters A 2004, 329(1-2):36-48. 10.1016/j.physleta.2004.06.072

  3. Cao J: New results concerning exponential stability and periodic solutions of delayed cellular neural networks. Physics Letters A 2003, 307(2-3):136-147. 10.1016/S0375-9601(02)01720-6

  4. Liu Z, Chen A, Cao J, Huang L: Existence and global exponential stability of periodic solution for BAM neural networks with periodic coefficients and time-varying delays. IEEE Transactions on Circuits and Systems I 2003, 50(9):1162-1173. 10.1109/TCSI.2003.816306

  5. Wu H, Li Y: Existence and stability of periodic solution for BAM neural networks with discontinuous neuron activations. Computers & Mathematics with Applications 2008, 56(8):1981-1993. 10.1016/j.camwa.2008.04.027

  6. Wu H, Xue X, Zhong X: Stability analysis for neural networks with discontinuous neuron activations and impulses. International Journal of Innovative Computing, Information and Control 2007, 3(6B):1537-1548.

  7. Wu H: Stability analysis for periodic solution of neural networks with discontinuous neuron activations. Nonlinear Analysis: Real World Applications 2009, 10(3):1717-1729. 10.1016/j.nonrwa.2008.02.024

  8. Wu H, San C: Stability analysis for periodic solution of BAM neural networks with discontinuous neuron activations and impulses. Applied Mathematical Modelling 2009, 33(6):2564-2574. 10.1016/j.apm.2008.07.022

  9. Papini D, Taddei V: Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations. Physics Letters A 2005, 343(1-3):117-128.

  10. Wu H, Xue X: Stability analysis for neural networks with inverse Lipschitzian neuron activations and impulses. Applied Mathematical Modelling 2008, 32(11):2347-2359. 10.1016/j.apm.2007.09.002

  11. Wu H, Sun J, Zhong X: Analysis of dynamical behaviors for delayed neural networks with inverse Lipschitz neuron activations and impulses. International Journal of Innovative Computing, Information and Control 2008, 4(3):705-715.

  12. Wu H: Global exponential stability of Hopfield neural networks with delays and inverse Lipschitz neuron activations. Nonlinear Analysis: Real World Applications 2009, 10(4):2297-2306. 10.1016/j.nonrwa.2008.04.016

  13. Forti M, Nistri P: Global convergence of neural networks with discontinuous neuron activations. IEEE Transactions on Circuits and Systems I 2003, 50(11):1421-1435. 10.1109/TCSI.2003.818614

  14. Forti M, Nistri P, Papini D: Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain. IEEE Transactions on Neural Networks 2005, 16(6):1449-1463. 10.1109/TNN.2005.852862

  15. Forti M, Grazzini M, Nistri P, Pancioni L: Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations. Physica D 2006, 214(1):88-99. 10.1016/j.physd.2005.12.006

  16. Clarke FH: Optimization and Nonsmooth Analysis, Canadian Mathematical Society Series of Monographs and Advanced Texts. John Wiley & Sons, New York, NY, USA; 1983.

  17. Filippov AF: Differential Equations with Discontinuous Right-Hand Side, Mathematics and Its Applications (Soviet Series). Kluwer Academic Publishers, Boston, Mass, USA; 1984.

  18. Aubin J-P, Cellina A: Differential Inclusions: Set-Valued Maps and Viability Theory, Grundlehren der Mathematischen Wissenschaften. Volume 264. Springer, Berlin, Germany; 1984.

  19. Aubin J-P, Frankowska H: Set-Valued Analysis, Systems and Control: Foundations and Applications. Volume 2. Birkhäuser, Boston, Mass, USA; 1990.

  20. Dugundji J, Granas A: Fixed Point Theory. Vol. I, Monografie Matematyczne 61. Polish Scientific Publishers, Warsaw, Poland; 1982.

  21. Papageorgiou NS: Convergence theorems for Banach space valued integrable multifunctions. International Journal of Mathematics and Mathematical Sciences 1987, 10(3):433-442. 10.1155/S0161171287000516


Acknowledgments

The authors are extremely grateful to the anonymous reviewers for their valuable comments and suggestions, which helped to enrich the content and improve the presentation of this paper. This work was supported by the National Science Foundation of China (60772079) and the National 863 Plans Projects (2006AA04z212).

Author information


Corresponding author

Correspondence to Huaiqin Wu.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Cite this article

Li, Y., Wu, H. Global Stability Analysis for Periodic Solution in Discontinuous Neural Networks with Nonlinear Growth Activations. Adv Differ Equ 2009, 798685 (2009). https://doi.org/10.1155/2009/798685
