- Research
- Open Access
Analysis of the equilibrium points of background neural networks with uniform firing rate
- Fang Xu^{1},
- Lingling Liu^{2} and
- Jianying Xiao^{2}
https://doi.org/10.1186/s13662-017-1322-z
© The Author(s) 2017
- Received: 29 May 2017
- Accepted: 18 August 2017
- Published: 6 October 2017
Abstract
In this paper, we give a complete analysis of the equilibrium points of background neural networks with uniform firing rate. By using the continuity and monotonicity of certain functions together with Rolle’s theorem, the number of equilibrium points and their locations are obtained. Moreover, some novel sufficient conditions are given to guarantee the stability of the equilibrium points of the network model by utilizing Taylor’s theorem. A simulation example is conducted to illustrate the theories developed in this paper.
Keywords
- equilibrium point
- Rolle’s theorem
- background neural networks
1 Introduction
Dynamical analysis is one of the most important issues for recurrent neural networks, and many results on this topic have been reported in the literature; see Attneave [1]; Cohen and Grossberg [2]; Forti [3]; Hahnloser [4]; Zeng et al. [5]; Zeng and Wang [6]; Cao and Wang [7]; Chen et al. [8]; Chen [9]; Tang et al. [10]; Zhang et al. [11]; Zuo et al. [12]; Zhang et al. [13]; Li and Cao [14]; Samidurai and Manivannan [15] and the references therein. It is also an essential step towards successful applications such as signal processing and optimization. Analysis of the equilibrium points of recurrent neural networks is a very important part of dynamical analysis (Cheng et al. [16]; Qu et al. [17]). In particular, the existence and stability of the equilibrium points of various types of recurrent neural networks have attracted significant attention from many researchers (Manivannan et al. [18]; Nie and Cao [19]; Manivannan et al. [20]). Generally, two approaches are used to prove the existence of equilibria for a neural network model. One is to show that a mapping derived from the neural network is a homeomorphism (Forti and Tesi [21]; Chen [9]; Lu [22]; Zhao and Zhu [23]). The other uses Brouwer’s fixed-point theorem (Forti [3]; Forti and Tesi [21]; Guo and Huang [24]; Wang [25]; Miller and Michel [26]).
In order to interpret the phenomena and exhibit how the dynamical states of recurrent neural networks are affected by a given external background input, the background neural network model was proposed in Salinas [27]. By utilizing theoretical models and computer simulations, it has been shown that small changes in this model may shift a network from a relatively quiet state to some other state with highly complex dynamics.
To the best of our knowledge, few references have studied the dynamical properties of the background neural network model; see Zhang and Yi [28]; Wan et al. [29]; Xu and Yi [30]. Since the network equations (2.1) are nonlinear and coupled, neither the well-known homeomorphism method nor Brouwer’s fixed-point theorem can easily be used to investigate the equilibrium points of (2.1). The only known theoretical results in the literature on local stability conditions of the equilibria for background neural networks are obtained by computing eigenvalues at the equilibria (Salinas [27]). Unfortunately, the equilibrium analysis problem for background neural networks with uniform firing rate (the uniform firing rate means that the firing rate is the same for all neurons) remains far from complete. The major difficulty stems from the fact that the network model consists of highly nonlinear coupled equations. The lack of basic information on the equilibria creates difficulties in discussing the dynamical properties and bifurcations of background neural networks.
In this paper, we give a complete analysis of the equilibria of background neural networks with uniform firing rate. For the first time, we transform the equilibrium problem of the background neural network with uniform firing rate into a root-finding problem for a cubic equation. Rather than following the common approach of computing the roots of the cubic directly, we analyze the equilibria through a geometric formulation of the parameter conditions of the background neural networks. Correspondingly, the number and coordinates of the equilibria are determined by using continuity and monotonicity, together with Rolle’s theorem. Furthermore, novel sufficient stability conditions for the equilibria are given. These studies of the background neural network with uniform firing rate provide an insightful understanding of the computational performance of system (2.1).
The rest of this paper is organized as follows. In Section 2, preliminaries are given. In Section 3, we establish conditions for the exact number of equilibria for the background networks. Locations of these equilibria are obtained. Moreover, we formulate novel sufficient conditions for stability of such equilibria. In Section 4, a simulation example is presented to illustrate the theoretical results. In Section 5, conclusions are drawn.
2 Preliminaries
3 Equilibria and qualitative properties
In this section, we present novel sufficient conditions which guarantee the existence and the number of equilibria for network (2.3). Our approach is based on a geometrical observation. We also establish stability criteria of these equilibria through the Taylor expansion at some equilibrium.
Theorem 1
The number of equilibria and their locations
| Conditions | Number | Equilibria |
|---|---|---|
| \(\mathcal{T}_{111}\) | 1 | \(E_{1}^{(1)}\) |
| \(\mathcal{T}_{112}\) | 2 | \(E_{2}^{(1)}\), \(E_{3}^{(1)}\) |
| \(\mathcal{T}_{113}\) | 3 | \(E_{4}^{(1)}\), \(E_{5}^{(1)}\), \(E_{6}^{(1)}\) |
| \(\mathcal{T}_{114}\) | 2 | \(E_{7}^{(1)}\), \(E_{8}^{(1)}\) |
| \(\mathcal{T}_{115}\) | 1 | \(E_{9}^{(1)}\) |
| \(\mathcal{T}_{12}\) | 1 | \(E_{10}^{(1)}\) |
| \(\mathcal{T}_{13}\) | 1 | \(E_{11}^{(1)}\) |
| \(\mathcal{T}_{21}\) | 1 | \(E_{1}^{(2)}\) |
| \(\mathcal{T}_{31}\) | 1 | \(E_{1}^{(3)}\) |
Proof
System (2.3) has at most three equilibria in \((0,\Pi )\) because \(\operatorname{deg}(P)=3\).
We separately discuss the three cases: \(\Delta_{P'(x)}>0,\Delta_{P'(x)}=0\) and \(\Delta_{P'(x)}<0\).
Case 1: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)>0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs>0\). Note that \(2ab-1\) may be positive, negative or zero. Thus, according to the sign of \(2ab-1\), we discuss the following three subcases.
Subcase 1.1: \(ab<\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}<\frac{1}{2}\).
- (a)
\(\zeta_{-}\), \(\zeta_{+}\) are the minimum point and the maximum point of \(P(x)\), respectively. By simple computation, we obtain \(\zeta_{+}< \Pi \). According to Rolle’s theorem and the continuity of \(P(x)\), \(P(x)\) has a unique positive real zero \(x_{1}^{*}\in (\zeta_{+}, \Pi )\) when \(P(\zeta_{-})>0\). Obviously, system (2.3) has a unique equilibrium \(E_{1}^{(1)}:x_{1}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(a).
- (b)
\(P(x)\) has two positive real zeros \(\zeta_{-}\) and \(x_{2}^{*} \in (\zeta_{+},\Pi )\) when \(P(\zeta_{-})= 0\). Then it follows that system (2.3) has two equilibrium points \(E_{2}^{(1)}:\zeta_{-}\) and \(E_{3}^{(1)}:x_{2}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(b).
- (c.1)
\(P(x)\) has three distinct positive real zeros \(x_{3}^{*}\in (0, \zeta_{-}),x_{4}^{*}\in (\zeta_{-},\zeta_{+}),x_{5}^{*}\in (\zeta_{+}, \Pi )\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})>0\). It is also clear that system (2.3) has three equilibrium points \(E_{4}^{(1)}:x_{3} ^{*}\in (0,\zeta_{-})\), \(E_{5}^{(1)}:x_{4}^{*}\in (\zeta_{-},\zeta _{+})\) and \(E_{6}^{(1)}:x_{5}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(c). \(P(\zeta_{-})<0, P(\zeta_{+})>0\) on the solid curve.
- (c.2)
\(P(x)\) has two positive real zeros \(x_{6}^{*}\in (0,\zeta_{-})\) and \(\zeta_{+}\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})=0\). Then system (2.3) has two equilibrium points \(E_{7}^{(1)}:x_{6}^{*}\in (0, \zeta_{-})\) and \(E_{8}^{(1)}:\zeta_{+}\), as shown in Figure 1(c). \(P(\zeta_{-})<0, P(\zeta_{+})=0\) on the dotted curve.
- (c.3)
\(P(x)\) has one positive real zero \(x_{7}^{*}\in (0,\zeta_{-})\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})<0\). Then system (2.3) has a unique equilibrium \(E_{9}^{(1)}:x_{7}^{*}\in (0,\zeta_{-})\), as shown in Figure 1(c). \(P(\zeta_{-})<0\) and \(P(\zeta_{+})<0\) on the star-shaped curve.
Subcase 1.2: \(ab>\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}>\frac{1}{2}\).
Subcase 1.3: \(ab=\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}=\frac{1}{2}\).
\(P'(x)=0\) has the zeros \(\zeta_{-}=0\) and \(\zeta_{+}=\frac{2a^{2}}{3c}>0\), so \(P(x)\) has only one positive critical point. \(P(x)\) has a unique positive real zero \(x_{9}^{*}\in (\zeta_{+}, \Pi )\), implying that system (2.3) has a unique equilibrium \(E_{11}^{(1)}:x_{9}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 2(e).
Case 2: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)=0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs=0\).
\(P'(x)\) has a repeated zero \(\frac{a^{2}}{3c}\) of multiplicity 2, and \(\frac{a^{2}}{3c}\) is not an extreme point of \(P(x)\). Combining this with (3.1), \(P(x)\) has a positive zero \(x_{10}^{*}\in (0,\Pi )\), which implies that system (2.3) has a unique equilibrium \(E_{1}^{(2)}:x_{10}^{*}\in (0,\Pi )\).
Case 3: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)<0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs<0\).
\(P'(x)\) has no real zero, so \(P(x)\) has no extreme point. By (3.1), \(P(x)\) is a strictly monotonically decreasing function. Thus, \(P(x)\) has a unique positive zero \(x_{11}^{*}\in (0,\Pi )\), and system (2.3) accordingly has a unique equilibrium \(E_{1}^{(3)}:x_{11}^{*}\in (0,\Pi )\).
The proof is completed. □
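For ease of reference, the critical points used throughout the proof can be written explicitly. Assuming \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1\) (the form stated in Section 4), the quadratic formula gives
\[
\zeta_{\mp }=\frac{a^{2}\mp \sqrt{\Delta_{P'(x)}}}{3c},\qquad \Delta_{P'(x)}=a^{4}+3c(2ab-1),
\]
so \(\Delta_{P'(x)}\) is one quarter of the discriminant of the quadratic \(P'(x)\). In particular, \(ab=\frac{1}{2}\) yields \(\zeta_{-}=0\) and \(\zeta_{+}=\frac{2a^{2}}{3c}\) (Subcase 1.3), while \(\Delta_{P'(x)}=0\) yields the repeated zero \(\frac{a^{2}}{3c}\) (Case 2).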
Remark 1
It should be noted that, as shown in the proof of Theorem 1, we separately discuss the existence and the number of the equilibria in three cases based on the sign of the discriminant \(\Delta_{P'(x)}\). We have employed continuity and monotonicity of the function \(P(x)\), combined with Rolle’s theorem, to estimate the coordinates and the number of the equilibria. These techniques have provided some new analysis approaches, rather than simply using the well-known homeomorphism method and Brouwer’s fixed-point theorem. Moreover, these results may provide important theoretical foundations to further analyze the limit cycle and bifurcations of networks (2.3).
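The counting argument of Theorem 1 can be sketched in code. The following Python function is an illustration only, under stated assumptions: it takes the cubic in the form suggested by \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1\) (Section 4), with a caller-supplied constant term standing in for the value fixed by (3.1), and assumes the boundary behavior implied by (3.1), namely that \(P\) is positive near 0 and negative near Π, so that each interior sign change yields one zero.

```python
import math

def P(x, c, a2, lin, const):
    # Cubic whose derivative is P'(x) = -3c x^2 + 2*a2 x + lin, where a2
    # stands for a^2 and lin for 2ab - 1; `const` is a caller-supplied
    # stand-in for the constant term fixed by (3.1).
    return -c * x**3 + a2 * x**2 + lin * x + const

def count_equilibria(c, a2, lin, const):
    """Count the zeros of P in (0, Pi) via the sign tests of Theorem 1.

    Assumes c > 0 and the boundary behavior implied by (3.1): P positive
    near 0 and negative near Pi.
    """
    delta = a2**2 + 3 * c * lin              # Delta_{P'(x)} = a^4 + 3c(2ab - 1)
    if delta <= 0:
        return 1                             # Cases 2 and 3: unique equilibrium
    r = math.sqrt(delta)
    zm, zp = (a2 - r) / (3 * c), (a2 + r) / (3 * c)
    if zm <= 0:
        return 1                             # Subcases 1.2 and 1.3
    pm, pp = P(zm, c, a2, lin, const), P(zp, c, a2, lin, const)
    if pm > 0:
        return 1                             # Subcase (a)
    if pm == 0:
        return 2                             # Subcase (b)
    if pp > 0:
        return 3                             # Subcase (c.1)
    return 2 if pp == 0 else 1               # Subcases (c.2) / (c.3)
```

For example, for the synthetic cubic \(P(x)=-(x-1)(x-2)(x-4)\) (i.e., \(c=1\), \(a^{2}=7\), \(2ab-1=-14\), constant term 8), the function returns 3, matching the three-equilibria condition \(\mathcal{T}_{113}\).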
Next, we discuss the stability of the equilibrium points of system (2.3) by utilizing the Taylor expansion.
Theorem 2
Stability of the equilibrium points for (2.3)
| Conditions | Equilibria (stability) |
|---|---|
| \(\mathcal{T}_{111}\) | \(E_{1}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{112}\) | \(E_{2}^{(1)}\) (unstable), \(E_{3}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{113}\) | \(E_{4}^{(1)}\) (asymptotically stable), \(E_{5}^{(1)}\) (unstable), \(E_{6}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{114}\) | \(E_{7}^{(1)}\) (asymptotically stable), \(E_{8}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{115}\) | \(E_{9}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{12}\) | \(E_{10}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{13}\) | \(E_{11}^{(1)}\) (asymptotically stable) |
| \(\mathcal{T}_{21}\) | \(E_{1}^{(2)}\) (asymptotically stable) |
| \(\mathcal{T}_{31}\) | \(E_{1}^{(3)}\) (asymptotically stable) |
Proof
Let \(x^{*}\) be a general equilibrium of system (2.3).
Because \(F(x^{*})=0\), the first-order Taylor expansion gives \(F(x)\approx F'(x^{*})(x-x^{*})\) near \(x^{*}\), so the local behavior of \(\dot{x}(t)\) is governed by the sign of \(F'(x^{*})\). Therefore, the stability of the equilibrium is determined by the sign of \(P'(x^{*})\). Concretely, \(P(x)\) is monotonically decreasing in \((\zeta_{+},\Pi )\) and \(P'(x_{1}^{*})<0\). Thus, the equilibrium point \(E_{1}^{(1)}\) is asymptotically stable. Similarly, the equilibria \(E_{3}^{(1)}, E_{4}^{(1)}, E_{6}^{(1)}, E_{7}^{(1)}, E_{9}^{(1)}, E_{10}^{(1)}, E_{11}^{(1)}, E_{1}^{(2)}\) and \(E_{1}^{(3)}\) are asymptotically stable because \(P'(x_{i}^{*})<0\) for \(i=2,3,5,6,7,8,9,10,11\).
\(P(x)\) is monotonically increasing in \((\zeta_{-},\zeta_{+})\) and \(P'(x_{4}^{*})>0\), which means that the equilibrium point \(E_{5}^{(1)}\) is unstable.
Moreover, the sign of \(F''(x^{\ast })\) is the same as that of \(P''(x^{\ast })\) because \(1+cx^{{\ast }^{2}}>0\). At the equilibria \(E_{2}^{(1)}:\zeta_{-}\) and \(E_{8}^{(1)}:\zeta_{+}\) we have \(P'(\zeta_{-})=0\) and \(P'(\zeta_{+})=0\), respectively, and thus \(F'(\zeta_{-})=F'(\zeta_{+})=0\).
\(P''(\zeta_{-})>0\) because the graph of \(P(x)\) is concave up (\(\zeta_{-}\) is a local minimum) in a neighborhood of \(E_{2}^{(1)}\); since \(P'(\zeta_{-})=0\) and \(P''(\zeta_{-})>0\), \(E_{2}^{(1)}\) is unstable. Similarly, \(P''(\zeta_{+})<0\) because the graph of \(P(x)\) is concave down (\(\zeta_{+}\) is a local maximum) in a neighborhood of \(E_{8}^{(1)}\); since \(P'(\zeta_{+})=0\) and \(P''(\zeta_{+})<0\), \(E_{8}^{(1)}\) is asymptotically stable.
The proof is completed. □
Remark 2
It should be mentioned that we have derived the conditions for stability of the equilibria by using the Taylor expansion. In Figures 1 and 2, we see that an equilibrium \(x^{*}\) is stable or unstable according to the sign of \(P'(x^{*})\). This method is simpler and more convenient than constructing suitable Lyapunov-Krasovskii functionals or energy functions.
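The sign test of Theorem 2 can be sketched as follows. This Python snippet is an illustration only (not the authors' code), assuming the form \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1\), with `a2` standing for \(a^{2}\) and `lin` for \(2ab-1\):

```python
def classify(x_star, c, a2, lin):
    """First/second-derivative sign test at an equilibrium, as in Theorem 2.

    Uses P'(x) = -3c x^2 + 2*a2 x + lin and P''(x) = -6c x + 2*a2.
    """
    p1 = -3 * c * x_star**2 + 2 * a2 * x_star + lin
    if p1 < 0:
        return "asymptotically stable"
    if p1 > 0:
        return "unstable"
    # Degenerate case P'(x*) = 0: fall back on concavity, as for E_2^(1)
    # (concave up, unstable) and E_8^(1) (concave down, stable).
    p2 = -6 * c * x_star + 2 * a2
    return "unstable" if p2 > 0 else "asymptotically stable"
```

For the synthetic cubic \(P(x)=-(x-1)(x-2)(x-4)\) used earlier as a toy example, the middle zero \(x=2\) lies between the critical points and is classified unstable, while the outer zeros are classified asymptotically stable, matching the alternation seen under \(\mathcal{T}_{113}\).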
4 Simulations
In this section, one example will be provided to illustrate and verify the theoretical results obtained in the above sections.
Example
Case 1: Consider a class of background neural networks (2.2) with \(w_{\mathrm{tot}}=1.8965, h=4.6457, vN=0.0900\) and \(s=50\). Thus, we obtain \(\frac{w^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=41.3951\). We randomly select the initial point \(x(0)\in (0,80)\).
Here, the parameters satisfy our conditions in Theorem 1: \(\Delta_{P'(x)}=0.0017>0,ab=0.1762< \frac{1}{2}\).
By simple computation, equation \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1=0\) is found to have two distinct roots \(\zeta_{-}=5.7367 ,\zeta_{+}=20.9044\), with \(P(\zeta_{-})=-1.2559<0,P(\zeta_{+})=1.8846>0\).
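These reported values can be cross-checked with Vieta's formulas for \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1\). The short Python sketch below recovers \(a^{2}\) and \(3c\) from the printed roots and the value \(ab=0.1762\) (these recovered coefficients are inferences from the reported figures, not the model's defining parameters), then recomputes the discriminant:

```python
# Cross-check of the reported Case 1 quantities via Vieta's formulas for
# P'(x) = -3c x^2 + 2 a^2 x + (2ab - 1). The roots and ab below are the
# values reported in the text; a^2 and 3c are inferred, not model inputs.
zm, zp = 5.7367, 20.9044
ab = 0.1762

three_c = (1 - 2 * ab) / (zm * zp)       # product of roots = (1 - 2ab)/(3c)
a2 = (zm + zp) * three_c / 2             # sum of roots = 2 a^2 / (3c)
delta = a2**2 + three_c * (2 * ab - 1)   # Delta_{P'(x)} = a^4 + 3c(2ab - 1)

print(round(delta, 4))                   # close to the reported 0.0017
```

The recovered discriminant agrees with the reported value \(\Delta_{P'(x)}=0.0017\) to the printed precision.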
In this case, system (2.3) has multiple stable equilibrium points, so the network is multistable. Mathematically, multistability allows a network to possess multiple stable fixed points, periodic orbits or limit cycles. Multiple stable equilibrium points or periodic orbits are accompanied by the existence of continuous attractors. Continuous attractors have been found in many applications, including visual perception, visual images, eye memory, etc. The existence of unstable equilibria is essential in winner-take-all problems (Yi et al. [31]). Therefore, the work proposed in this manuscript can be applied in the aforementioned applications.
Case 2: If \(w_{\mathrm{tot}}=1.2, h=12, vN=0.02\) and \(s=63.36\), then \(\frac{w^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=75.2727\). We randomly select the initial point \(x(0)\in (0,80)\).
Case 3: If \(w_{\mathrm{tot}}=1.12, h=10, vN=0.03\) and \(s=65\), then \(\frac{w ^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=44.3518\). We randomly select the initial point \(x(0)\in (0,35)\).
In Cases 2 and 3, the network (2.3) has a unique equilibrium point, which is stable. Such convergent behavior is called ‘monostability’. Monostable networks can be used to solve optimization problems. Under some conditions, the network (2.3) in this paper is a monostable neural network. Therefore, the proposed work provides a novel model for optimization problems.
5 Conclusions
In this paper, the dynamical properties of the equilibria of the background neural network (2.2) (i.e., (2.3)), such as the number of equilibria and their locations, were analyzed. Firstly, conditions for the number and locations of the equilibria of the network were established. Secondly, conditions for the stability of the equilibria were derived. The parametric relations between the dynamical properties of the equilibria and the network parameters were revealed. These results are primarily based on an observation of the geometric structure of the equation \(P(x)=0\). These studies enrich the analytical results for the equilibrium points in related work.
The studies of background neural networks with uniform firing rate (since the firing rate is the same for all neurons, system (2.3) can be regarded as one-dimensional) may be further developed for general higher-dimensional systems; e.g., the mathematical methods in this paper can be applied to analyze the equilibria of background neural networks with two subnetworks exhibiting rival states (i.e., 2D background neural networks). Rivaling steady states have significant meaning in the development of practical applications of 2D neural networks. Such switching problems (Terman and Rubin [32]; Toth et al. [33]) and binocular rivalry (Shpiro et al. [34]) are interesting topics for further research. Many practical problems, such as mechanical design and electrical networks, can be formulated as switching problems. In recent years, dynamical analysis of switched systems has attracted considerable research interest (Cao et al. [35]; Syed Ali et al. [36]). Our results provide further insight into deriving parameter conditions for switched systems.
Declarations
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China under Grant 61202045 and Grant 11501475, and in part by the Program of Science and Technology of Sichuan Province of China under Grant No. 2016JY0067.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
References
- Attneave, F: Multistability in perception. Sci. Am. 225, 63-71 (1971)
- Cohen, MA, Grossberg, S: Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans. Syst. Man Cybern. 13, 815-826 (1983)
- Forti, M: On global asymptotic stability of a class of nonlinear systems arising in neural network theory. J. Differ. Equ. 113, 246-264 (1994)
- Hahnloser, RLT: On the piecewise analysis of networks of linear threshold neurons. Neural Netw. 11, 691-697 (1998)
- Zeng, Z, Wang, J, Liao, X: Global exponential stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 50(10), 1353-1358 (2003)
- Zeng, Z, Wang, J: Multiperiodicity and exponential attractivity evoked by periodic external inputs in delayed cellular neural networks. Neural Comput. 18(4), 848-870 (2006)
- Cao, J, Wang, J: Global asymptotic stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 50(1), 34-44 (2003)
- Chen, T, Lu, W, Chen, G: Dynamical behaviors of a large class of general delayed neural networks. Neural Comput. 17(4), 949-968 (2005)
- Chen, Y: Global asymptotic stability of delayed Cohen-Grossberg neural networks. IEEE Trans. Circuits Syst. I, Regul. Pap. 53(2), 351-357 (2006)
- Tang, HJ, Tan, KC, Zhang, W: Cyclic dynamics analysis for networks of linear threshold neurons. Neural Comput. 17(1), 97-114 (2005)
- Zhang, L, Yi, Z, Yu, J: Multiperiodicity and attractivity of delayed recurrent neural networks with unsaturating piecewise linear transfer functions. IEEE Trans. Neural Netw. 19(1), 158-167 (2008)
- Zuo, Z, Yang, C, Wang, Y: A new method for stability analysis of recurrent neural networks with interval time-varying delay. IEEE Trans. Neural Netw. 21(2), 339-344 (2010)
- Zhang, H, Wang, Z, Liu, D: A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Trans. Neural Netw. Learn. Syst. 25(7), 1229-1262 (2014)
- Li, R, Cao, J: Dissipativity analysis of memristive neural networks with time-varying delays and randomly occurring uncertainties. Math. Methods Appl. Sci. 39(11), 2896-2915 (2016)
- Samidurai, R, Manivannan, R: Delay-range-dependent passivity analysis for uncertain stochastic neural networks with discrete and distributed time-varying delays. Neurocomputing 185(12), 191-201 (2016)
- Cheng, C, Lin, K, Shin, C: Multistability in recurrent neural networks. SIAM J. Appl. Math. 66(4), 1301-1320 (2006)
- Qu, H, Yi, Z, Wang, X: Switching analysis of 2-D neural networks with nonsaturating linear threshold transfer functions. Neurocomputing 72, 413-419 (2008)
- Manivannan, R, Mahendrakumar, G, Samidurai, R, Cao, J, Alsaedi, A: Exponential stability and extended dissipativity criteria for generalized neural networks with interval time-varying delay signals. J. Franklin Inst. 354(11), 4353-4376 (2017)
- Nie, X, Cao, J: Existence and global stability of equilibrium point for delayed competitive neural networks with discontinuous activation functions. Int. J. Syst. Sci. 43(3), 459-474 (2012)
- Manivannan, R, Samidurai, R, Cao, J, Alsaedi, A, Alsaadi, FE: Global exponential stability and dissipativity of generalized neural networks with time-varying delay signals. Neural Netw. 87, 149-159 (2017)
- Forti, M, Tesi, A: New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 42(7), 354-366 (1995)
- Lu, H: On stability of nonlinear continuous-time neural networks with delays. Neural Netw. 13(10), 1135-1143 (2000)
- Zhao, W, Zhu, Q: New results of global robust exponential stability of neural networks with delays. Nonlinear Anal., Real World Appl. 11(2), 1190-1197 (2010)
- Guo, S, Huang, L: Stability analysis of Cohen-Grossberg neural networks. IEEE Trans. Neural Netw. 17(1), 106-117 (2006)
- Wang, L: Stability of Cohen-Grossberg neural networks with distributed delays. Appl. Math. Comput. 160(1), 93-110 (2005)
- Miller, RK, Michel, AN: Ordinary Differential Equations. Academic Press, New York (1982)
- Salinas, E: Background synaptic activity as a switch between dynamical states in a network. Neural Comput. 15, 1439-1475 (2003)
- Zhang, L, Yi, Z: Dynamical properties of background neural networks with uniform firing rate and background input. Chaos Solitons Fractals 33(3), 979-985 (2007)
- Wan, M, Gou, J, Wang, D, Wang, X: Dynamical properties of discrete-time background neural networks with uniform firing rate. Math. Probl. Eng. 2013(1), 289-325 (2013)
- Xu, F, Yi, Z: Convergence analysis of a class of simplified background neural networks with two subnetworks. Neurocomputing 74(18), 3877-3883 (2011)
- Yi, Z, Heng, PA, Fung, PF: Winner-take-all discrete recurrent neural networks. IEEE Trans. Circuits Syst. II 47, 1584-1589 (2000)
- Terman, D, Rubin, JE, Yew, AC, Wilson, CJ: Activity patterns in a model for the subthalamopallidal network of the basal ganglia. J. Neurosci. 22(7), 2963-2976 (2002)
- Toth, LJ, Assad, JA: Dynamic coding of behaviourally relevant stimuli in parietal cortex. Nature 415, 165-168 (2002)
- Shpiro, A, Morenobote, R, Rubin, N, Rinzel, J: Balance between noise and adaptation in competition models of perceptual bistability. J. Comput. Neurosci. 27(1), 37-54 (2009)
- Cao, J, Rakkiyappan, R, Maheswari, K, Chandrsekar, A: Exponential \(H_{\infty }\) filtering analysis for discrete-time switched neural networks with random delays using sojourn probabilities. Sci. China, Technol. Sci. 59(3), 387-402 (2016)
- Syed Ali, M, Saravanan, S, Cao, J: Finite-time boundedness, \(L_{2}\)-gain analysis and control of Markovian jump switched neural networks with additive time-varying delays. Nonlinear Anal. Hybrid Syst. 23, 27-43 (2017)