- Open Access
Delay-probability-distribution-dependent stability criteria for discrete-time stochastic neural networks with random delays
© Zhou et al.; licensee Springer. 2013
- Received: 20 June 2013
- Accepted: 10 September 2013
- Published: 8 November 2013
The problem of delay-probability-distribution-dependent robust stability for a class of discrete-time stochastic neural networks (DSNNs) with delays and parameter uncertainties is investigated. The information on the probability distribution of the delay is considered and transformed into parameter matrices of the transformed DSNN model. In the DSNN model, the time-varying delay is characterized by introducing a Bernoulli stochastic variable. By constructing an augmented Lyapunov-Krasovskii functional and introducing some analysis techniques, some novel delay-distribution-dependent conditions guaranteeing that the DSNN is robustly globally exponentially stable in the mean square are derived. Finally, a numerical example is provided to demonstrate the reduced conservatism and effectiveness of the proposed methods.
- discrete-time stochastic neural networks
- discrete time-varying delays
- robust exponential stability
In the past few decades, neural networks (NNs) have received considerable attention owing to their potential applications in a variety of areas such as signal processing [1], pattern recognition [2], static image processing [3], associative memory [4], combinatorial optimization [5], and so on. In recent years, the stability problem of time-delay NNs has become a topic of great theoretical and practical importance, since inherent time delays and unavoidable parameter uncertainties arise in many biological and artificial NNs because of the finite speed of information processing and the parameter fluctuations of hardware implementations. Considerable effort has been devoted to the stability analysis of NNs with time-varying delays and parameter uncertainties; see [6–15] and the references therein.
The majority of the existing results have been limited to continuous-time and deterministic NNs. On the one hand, in the implementation and application of NNs, discrete-time neural networks (DNNs) play a more important role than their continuous-time counterparts in today's digital world. More specifically, under mild restrictions, DNNs can faithfully preserve the dynamical characteristics and functional similarity of continuous-time NNs, and even their physical or biological reality. On the other hand, when modeling real NN systems, stochastic disturbance is probably the main source of performance degradation of the implemented NN. Thus, research on the dynamical behavior of discrete-time stochastic neural networks (DSNNs) with time-varying delays and parameter uncertainties is necessary. Recently, stability analysis for DSNNs with time-varying delays and parameter uncertainties has received more and more interest, and some stability criteria have been proposed in [16–20]. In [16], Liu and his coauthors studied a class of DSNNs with time-varying delays and parameter uncertainties and proposed some delay-dependent sufficient conditions guaranteeing global robust exponential stability by using the Lyapunov method and the linear matrix inequality technique. Employing a similar technique, the result obtained in [16] was improved by Zhang et al. in [17] and by Luo and his coauthors in [18].
In practice, the time-varying delay in some NNs often occurs in a stochastic fashion [21–26]. That is, the time-varying delay in some NNs may be subject to probabilistic measurement delays. In some NNs, the output signal of a node is transferred to another node through multiple branches with arbitrary time delays, which are random, and whose probabilities can often be estimated by statistical methods such as the normal distribution, uniform distribution, Poisson distribution, or Bernoulli random binary distribution. In most of the existing references on DSNNs, the deterministic time-delay case was considered, and the stability criteria were derived based on the information of the variation range of the time delay [16–20], or on the variation range of the time delay together with the time delays themselves. However, it often occurs in real systems that the maximum value of the delay is very large, while the probability of the delay taking such a large value is very small. Considering only the variation range of the time delay may then lead to a more conservative result. Yet, as far as we know, little attention has been paid to the stability of DSNNs with stochastic time delay when both the variation range and the probability distribution of the time delay are taken into account. More recently, in [28], some sufficient conditions on robust global exponential stability were derived for a class of DSNNs involving parameter uncertainties and stochastic delays. Nevertheless, the robust global exponential stability analysis problem for uncertain DSNNs with random delays has not been adequately investigated and remains challenging.
In this paper, some new improved delay-probability-distribution-dependent stability criteria, which guarantee the robust global exponential stability of discrete-time stochastic neural networks with time-varying delay, are obtained by constructing a novel augmented Lyapunov-Krasovskii functional. These new conditions are less conservative than those obtained in [16–18] and [28]. A numerical example is also provided to illustrate the improvement of the proposed criteria.
The notations are quite standard. Throughout this paper, $\mathbb{Z}^{+}$ stands for the set of nonnegative integers, and $\mathbb{R}^{n}$ and $\mathbb{R}^{n\times m}$ denote, respectively, the n-dimensional Euclidean space and the set of all $n\times m$ real matrices. The superscript 'T' denotes the transpose, and the notation $X \ge Y$ (respectively, $X > Y$) means that X and Y are symmetric matrices and that $X - Y$ is positive semi-definite (respectively, positive definite). $\|\cdot\|$ is the Euclidean norm in $\mathbb{R}^{n}$. I is the identity matrix with appropriate dimensions. If A is a matrix, $\|A\|$ denotes its operator norm, i.e., $\|A\| = \sqrt{\lambda_{\max}(A^{T}A)}$, where $\lambda_{\max}(\cdot)$ (respectively, $\lambda_{\min}(\cdot)$) means the largest (respectively, smallest) eigenvalue of a matrix. Moreover, let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions (i.e., the filtration contains all P-null sets and is right continuous). $\mathbb{E}\{\cdot\}$ stands for the mathematical expectation operator with respect to the given probability measure P. The asterisk ∗ in a matrix is used to denote a term that is induced by symmetry. Matrices, if their dimensions are not explicitly stated, are assumed to have compatible dimensions. Sometimes, the arguments of a function will be omitted in the analysis when no confusion can arise.
where , , and are known constants.
Remark 1 The constants in Assumption 1 are allowed to be positive, negative, or zero. Hence, the activation functions could be non-monotonic and are more general than the usual sigmoid functions and the commonly used Lipschitz conditions.
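For concreteness, sector-type conditions of the kind described in Assumption 1 are typically written as follows (the bounds $l_i^{-}$, $l_i^{+}$ are assumed placeholders for the paper's constants, which are elided in this extraction):

```latex
l_i^{-} \le \frac{f_i(s_1) - f_i(s_2)}{s_1 - s_2} \le l_i^{+},
\qquad \forall s_1 \ne s_2,\ i = 1, \dots, n,
```

which allows each bound to take either sign and hence covers non-monotonic activations, in line with the remark above.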
where the scalars involved are known constants. Thus, assumption condition (8) is a special case of assumption condition (7). It should be pointed out that the robust delay-distribution-dependent stability criteria for DSNNs with time-varying delay based on (7) are generally less conservative than those based on (8).
It is obvious that the two delay subsets are disjoint (their intersection is the empty set). It is easy to check that the value 1 of the stochastic variable implies that the first event takes place, and the value 0 means that the second one happens.
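The event structure described above is commonly encoded by a Bernoulli indicator; a sketch under assumed notation ($\tau_m \le \tau_0 \le \tau_M$ for the delay bounds and $\delta_0$ for the probability, standing in for the paper's elided symbols) is:

```latex
\delta(k) =
\begin{cases}
1, & \tau(k) \in [\tau_m, \tau_0], \\
0, & \tau(k) \in (\tau_0, \tau_M],
\end{cases}
\qquad
\operatorname{Prob}\{\delta(k) = 1\} = \mathbb{E}\{\delta(k)\} = \delta_0,
\quad
\operatorname{Prob}\{\delta(k) = 0\} = 1 - \delta_0 .
```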
where is a constant.
Assumption 5 Assume that for any , is independent of .
Remark 4 It is noted that the binary stochastic variable, first introduced in the earlier literature, has been successfully used in [25, 26, 28]. By introducing new functions together with the stochastic variable sequence, system (1) is transformed into (14). In (14), the probabilistic effects of the time delay have been translated into the parameter matrices of the transformed system. Then, stochastic stability criteria based on the new model (14) can be derived, which reveal the relationship between the stability of the system and both the variation range of the time delay and the probability distribution parameter.
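As a sketch of the transformation described in Remark 4 (the matrices A, B, C, activation f, and delays $\tau_1(k)$, $\tau_2(k)$ below are hypothetical stand-ins for the paper's actual system data), the delayed term splits along the Bernoulli indicator:

```latex
x(k+1) = A x(k) + B f\bigl(x(k)\bigr)
       + \delta(k)\, C f\bigl(x(k - \tau_1(k))\bigr)
       + \bigl(1 - \delta(k)\bigr)\, C f\bigl(x(k - \tau_2(k))\bigr),
```

so the distribution parameter enters the mean dynamics through $\mathbb{E}\{\delta(k)\} = \delta_0$, which is how the probability information becomes part of the parameter matrices of the transformed system.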
It is obvious that x = 0 is a trivial solution of DSNN (15).
The following definition and lemmas are needed to conclude our main results.
Definition 2.1 
for all parameter uncertainties satisfying the admissible condition.
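The body of the definition is elided in this extraction; a standard statement of robust mean-square exponential stability (the constants $\alpha$, $\beta$, the delay bound $\tau_M$, and the initial function $\phi$ below are assumed notation) would require that

```latex
\mathbb{E}\bigl\{\|x(k)\|^{2}\bigr\}
\le \alpha \,\beta^{k} \sup_{-\tau_M \le s \le 0} \mathbb{E}\bigl\{\|\phi(s)\|^{2}\bigr\},
\qquad k \in \mathbb{Z}^{+},
```

hold for some $\alpha > 0$ and $0 < \beta < 1$, uniformly over all admissible parameter uncertainties.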
Lemma 2.1 
Lemma 2.2 
where the unit column vector has a single entry equal to one on its i-th row and zeros elsewhere.
By Definition 2.1, the DSNN (15) is globally exponentially stable in the mean square. This completes the proof. □
Remark 5 In Theorem 3.1, free-weighting matrices R, H, S, T are introduced by constructing a new Lyapunov functional (19). On the one hand, in (19), the useful information of the time delays is taken into account sufficiently. On the other hand, additional terms are introduced that make full use of the information of the activation function, which makes this stability criterion generally less conservative than those obtained in [16–18, 28]. However, because of the parameter uncertainties contained in (18), it is difficult to use Theorem 3.1 directly to determine the stability of the DSNN (15). Thus, it is necessary to give another criterion as follows.
which implies that (46) holds. This completes the proof. □
Remark 6 In the corresponding special case, the DSNN (15) reduces to (1), which has been well investigated in [16–18]. By setting the relevant matrices in Theorem 3.2 accordingly and deleting the fifth row and the corresponding fifth column of (46), we obtain the stability condition for system (1), which can easily be seen to be equivalent to Theorem 3.2 in .
then we get the following results.
For given parameters, the allowable upper bounds of the time delay under different probability distributions of the delay, obtained by Theorem 3.2:
Remark 7 From this example, we can see that the stability conditions in this paper depend on the time delays themselves, the variation interval, and the distribution probability of the delay, rather than on the time-delay interval alone, which distinguishes them from the traditional delay-dependent stability conditions.
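To illustrate the qualitative behavior discussed in Remark 7, the following Monte Carlo sketch simulates a small discrete-time neural network whose delay switches between a short and a long value according to a Bernoulli variable. The matrices A and B, the delay bounds, and the probability delta0 are hypothetical illustrative choices, not the data of the paper's example; the simulation only shows the mean-square energy decaying for one such configuration, not a stability proof.

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.diag([0.3, 0.2])          # state matrix (Schur stable), assumed
B = np.array([[0.10, -0.05],
              [0.05,  0.10]])    # delayed connection weights, assumed
tau1, tau2 = 2, 6                # short and long delay values
delta0 = 0.9                     # Prob{delay takes the short value}

def simulate(steps=200):
    # history buffer long enough for the largest possible delay
    x = [rng.standard_normal(2) for _ in range(tau2 + 1)]
    for _ in range(steps):
        # Bernoulli choice between the short and the long delay
        tau = tau1 if rng.random() < delta0 else tau2
        x_new = A @ x[-1] + B @ np.tanh(x[-1 - tau])
        x.append(x_new)
    return np.array(x)

# average squared norm over independent trials: mean-square energy
trials = [simulate() for _ in range(50)]
ms = np.mean([np.sum(t**2, axis=1) for t in trials], axis=0)

# the final mean-square energy should be far below the initial level
print(ms[-1] < 1e-3 * ms[tau2])
```

Varying delta0 (or tau2) in this sketch shows how the distribution of the delay, not just its range, influences the decay of the mean-square energy.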
In this paper, the robust delay-probability-distribution-dependent stochastic stability problem for a class of DSNNs with parameter uncertainties has been studied. By means of the LMI technique combined with Lyapunov stability theory, a new augmented Lyapunov-Krasovskii functional has been constructed, and some novel sufficient conditions ensuring robust global exponential stability in the mean square sense have been derived. Compared with some previous works established in the literature, the new criteria derived in this paper are less conservative. A numerical example has been presented to show the validity of these new sufficient conditions.
The authors would like to thank the editor, the associate editor and the anonymous referees for their detailed comments and valuable suggestions which considerably improved the presentation of this paper. The work of Xia Zhou is supported by the National Natural Science Foundation of China (No. 11226140) and the Anhui Provincial Colleges and Universities Natural Science Foundation (No. KJ2013Z267). The work of Yong Ren is supported by the National Natural Science Foundation of China (No. 10901003 and 11126238), the Distinguished Young Scholars of Anhui Province (No. 1108085J08), the Key Project of Chinese Ministry of Education (No. 211077) and the Anhui Provincial Natural Science Foundation (No. 10040606Q30).
- Suaha FBM, Ahmad M, Taib MN: Applications of artificial neural network on signal processing of optical fibre pH sensor based on bromophenol blue doped with sol–gel film. Sens. Actuators B, Chem. 2003, 90: 182-188. 10.1016/S0925-4005(03)00026-1
- Anagun AS: A neural network applied to pattern recognition in statistical process control. Comput. Ind. Eng. 1998, 35: 185-188.
- Wilson CL, Watson CI, Paek EG: Effect of resolution and image quality on combined optical and neural network fingerprint matching. Pattern Recognit. 2000, 33: 317-331. 10.1016/S0031-3203(99)00052-7
- Miyoshi S, Yanai HF, Okada M: Associative memory by recurrent neural networks with delay elements. Neural Netw. 2004, 17: 55-63. 10.1016/S0893-6080(03)00207-7
- Ding Z, Leung H, Zhu Z: A study of the transiently chaotic neural network for combinatorial optimization. Math. Comput. Model. 2002, 36: 1007-1020. 10.1016/S0895-7177(02)00254-6
- Shao JL, Huang TZ, Wang XP: Further analysis on global robust exponential stability of neural networks with time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 2012, 17: 1117-1124. 10.1016/j.cnsns.2011.08.022
- Mathiyalagan K, Sakthivel R, Marshal S: New robust exponential stability results for discrete-time switched fuzzy neural networks with time delays. Comput. Math. Appl. 2012, 64: 2926-2938. 10.1016/j.camwa.2012.08.008
- Hou LL, Zong GD, Wu YQ: Robust exponential stability analysis of discrete-time switched Hopfield neural networks with time delay. Nonlinear Anal. Hybrid Syst. 2011, 5: 525-534. 10.1016/j.nahs.2010.10.014
- Wang WQ, Zhong SM, Nguang SK, Liu F: Novel delay-dependent stability criterion for uncertain genetic regulatory networks with interval time-varying delays. Neurocomputing 2013, 121: 170-178.
- Faydasicok O, Arik S: Robust stability analysis of a class of neural networks with discrete time delays. Neural Netw. 2012, 29-30: 52-59.
- Deng FQ, Hua MG, Liu XZ, Peng YJ, Fei JT: Robust delay-dependent exponential stability for uncertain stochastic neural networks with mixed delays. Neurocomputing 2011, 74: 1503-1509. 10.1016/j.neucom.2010.08.027
- Wang T, Zhang C, Fei SM: Further stability criteria on discrete-time delayed neural networks with distributed delay. Neurocomputing 2013, 111: 195-203.
- Mahmoud MS, Ismail A: Improved results on robust exponential stability criteria for neutral-type delayed neural networks. Appl. Math. Comput. 2010, 217: 3011-3019. 10.1016/j.amc.2010.08.034
- Wang WQ, Zhong SM: Stochastic stability analysis of uncertain genetic regulatory networks with mixed time-varying delays. Neurocomputing 2012, 82: 143-156.
- Zhu S, Shen Y: Robustness analysis for connection weight matrices of global exponential stability of stochastic recurrent neural networks. Neural Netw. 2013, 38: 17-22.
- Liu YR, Wang ZD, Liu XH: Robust stability of discrete-time stochastic neural networks with time-varying delays. Neurocomputing 2008, 71: 823-833. 10.1016/j.neucom.2007.03.008
- Zhang YJ, Xu SY, Zeng ZP: Novel robust stability criteria of discrete-time stochastic recurrent neural networks with time delay. Neurocomputing 2009, 72: 3343-3351. 10.1016/j.neucom.2009.01.014
- Luo MZ, Zhong SM, Wang RJ, Kang W: Robust stability analysis for discrete-time stochastic neural networks systems with time-varying delays. Appl. Math. Comput. 2009, 209: 305-313. 10.1016/j.amc.2008.12.084
- Gao M, Cui BT: Global robust exponential stability of discrete-time interval BAM neural networks with time-varying delays. Appl. Math. Model. 2009, 33: 1270-1284. 10.1016/j.apm.2008.01.019
- Udpin S, Niamsup P: Robust stability of discrete-time LPD neural networks with time-varying delay. Commun. Nonlinear Sci. Numer. Simul. 2009, 14: 3914-3924. 10.1016/j.cnsns.2008.08.018
- Balasubramaniam P, Vembarasan V, Rakkiyappan R: Delay-dependent robust exponential state estimation of Markovian jumping fuzzy Hopfield neural networks with mixed random time-varying delays. Commun. Nonlinear Sci. Numer. Simul. 2011, 16: 2109-2129. 10.1016/j.cnsns.2010.08.024
- Lou XY, Ye Q, Cui BT: Exponential stability of genetic regulatory networks with random delays. Neurocomputing 2010, 73: 759-769. 10.1016/j.neucom.2009.10.006
- Ray A: Output feedback control under randomly varying distributed delays. J. Guid. Control Dyn. 1994, 17: 701-711. 10.2514/3.21258
- Nilsson J, Bernhardsson B, Wittenmark B: Stochastic analysis and control of real-time systems with random time delays. Automatica 1998, 34: 57-64. 10.1016/S0005-1098(97)00170-2
- Yue D, Zhang YJ, Tian EG, Peng C: Delay-distribution-dependent exponential stability criteria for discrete-time recurrent neural networks with stochastic delay. IEEE Trans. Neural Netw. 2008, 19: 1299-1306.
- Tang Y, Fang JA, Xia M, Yu DM: Delay-distribution-dependent stability of stochastic discrete-time neural networks with randomly mixed time-varying delays. Neurocomputing 2009, 72: 3830-3838.
- Wu ZG, Su HY, Chu J, Zhou WN: Improved result on stability analysis of discrete stochastic neural networks with time delay. Phys. Lett. A 2009, 373: 1546-1552. 10.1016/j.physleta.2009.02.056
- Zhang YJ, Yue D, Tian EG: Robust delay-distribution-dependent stability of discrete-time stochastic neural networks with time-varying delay. Neurocomputing 2009, 72: 1265-1273. 10.1016/j.neucom.2008.01.028
This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.