

Analysis of the equilibrium points of background neural networks with uniform firing rate

Abstract

In this paper, we give a complete analysis of the equilibrium points of background neural networks with uniform firing rate. By using the continuity and monotonicity of certain functions together with Rolle’s theorem, the number of equilibrium points and their locations are obtained. Moreover, some novel sufficient conditions are given to guarantee the stability of the equilibrium points of the network model by utilizing Taylor’s theorem. A simulation example is presented to illustrate the theory developed in this paper.

1 Introduction

Dynamical analysis is one of the most important issues in the study of recurrent neural networks, and many results on this topic have been reported in the literature; see Atteneave [1]; Cohen and Grossberg [2]; Forti [3]; Hahnloser [4]; Zeng et al. [5]; Zeng and Wang [6]; Cao and Wang [7]; Chen et al. [8]; Chen [9]; Tang et al. [10]; Zhang et al. [11]; Zuo et al. [12]; Zhang et al. [13]; Li and Cao [14]; Samidurai and Manivannan [15] and the references therein. It is also an essential step towards successful applications such as signal processing and optimization. Analysis of the equilibrium points of recurrent neural networks is a very important part of dynamical analysis (Cheng et al. [16]; Qu et al. [17]). In particular, the existence and stability of the equilibrium points of various types of recurrent neural networks have attracted significant attention from many researchers (Manivannan et al. [18]; Nie and Cao [19]; Manivannan et al. [20]). Generally, two approaches are used to prove the existence of equilibria for a neural network model. One is to show that a mapping derived from the neural network is a homeomorphism (Forti and Tesi [21]; Chen [9]; Lu [22]; Zhao and Zhu [23]). The other relies on Brouwer’s fixed-point theorem (Forti [3]; Forti and Tesi [21]; Guo and Huang [24]; Wang [25]; Miller and Michel [26]).

In order to interpret such phenomena and exhibit how the dynamical states of recurrent neural networks are affected by a given external background input, the background neural network model was proposed by Salinas [27]. By means of theoretical models and computer simulations, it has been shown that small changes in the background input may shift the network from a relatively quiet state to another state with highly complex dynamics.

To the best of our knowledge, few references have studied the dynamical properties of the background neural network model; see Zhang and Yi [28]; Wan et al. [29]; Xu and Yi [30]. Since the network equations (2.1) are nonlinear and coupled, neither the well-known homeomorphism method nor Brouwer’s fixed-point theorem can easily be applied to investigate the equilibrium points of (2.1). The only known theoretical results on local stability conditions for the equilibria of background neural networks were obtained by computing eigenvalues at the equilibria (Salinas [27]). Unfortunately, the equilibrium analysis of background neural networks with uniform firing rate (a uniform firing rate means that the firing rate is the same for all neurons) remains far from complete. The major difficulty stems from the fact that the network model consists of highly nonlinear coupled equations. The lack of basic information on the equilibria creates difficulties in discussing the dynamical properties and bifurcations of background neural networks.

In this paper, we give a complete analysis of the equilibria of background neural networks with uniform firing rate. For the first time, we transform the equilibrium problem of the background neural network with uniform firing rate into a root-finding problem for a cubic equation. Rather than following the common approach of computing the roots of the cubic equation explicitly, we analyze the equilibria through a geometric formulation of the parameter conditions of the background neural network. Accordingly, the number and coordinates of the equilibria are determined by using continuity and monotonicity together with Rolle’s theorem. Furthermore, novel sufficient stability conditions for the equilibria are given. The study of the background neural network with uniform firing rate provides an insightful understanding of the computational performance of system (2.1).

The rest of this paper is organized as follows. In Section 2, preliminaries are given. In Section 3, we establish conditions for the exact number of equilibria for the background networks. Locations of these equilibria are obtained. Moreover, we formulate novel sufficient conditions for stability of such equilibria. In Section 4, a simulation example is presented to illustrate the theoretical results. In Section 5, conclusions are drawn.

2 Preliminaries

The background neural network model is described by the following system of nonlinear differential equations:

$$\begin{aligned} \tau \dot{x}_{i}(t)=-x_{i}(t)+\frac{(\sum_{j}w_{ij}x_{j}(t)+h_{i})^{2}}{s+v\sum_{j}x^{2}_{j}(t)}\quad (i=1, \ldots ,n), \end{aligned}$$
(2.1)

for \(t\geq 0\), where \(x_{i}\) denotes the activity of neuron i, \(h_{i}\) represents its external input, \(w_{ij}\) represents the excitatory synaptic connection from neuron j to neuron i, v is the inhibitory synaptic connection by which a neuron decreases another neuron’s gain, τ is a time constant, and s is a saturation constant. All these quantities are nonnegative. If the firing rate is the same for all neurons, then the network equations (2.1) reduce to the following scalar nonlinear equation:

$$ \tau \dot{x}(t)=-x(t)+\frac{(w_{\mathrm{tot}}x(t)+h)^{2}}{s+vNx^{2}(t)}, $$
(2.2)

for all \(t \geq 0\), where x denotes the uniform firing rate, \(w_{\mathrm{tot}}\) denotes the total excitatory synaptic connection strength, and N is the total number of neurons.

For simplicity, we consider system (2.2) in the following equivalent form:

$$ \dot{x}(t)=-x(t)+\frac{(ax(t)+b)^{2}}{1+cx^{2}(t)}:=F(x). $$
(2.3)

Herein, let \(\tau =1\), \(a=\frac{w_{\mathrm{tot}}}{\sqrt{s}} > 0\), \(b=\frac{h}{ \sqrt{s}} > 0\), \(c=\frac{vN}{s}> 0\).
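For reference, the reduction above is easy to express in code. The following minimal Python sketch (with illustrative, hypothetical parameter values) computes a, b and c from \(w_{\mathrm{tot}}\), h, s and vN and evaluates the right-hand side F of (2.3).

```python
import numpy as np

# Illustrative (hypothetical) network constants; any positive values may be used.
w_tot, h, s, vN = 1.9, 4.6, 50.0, 0.09

a = w_tot / np.sqrt(s)   # a = w_tot / sqrt(s) > 0
b = h / np.sqrt(s)       # b = h / sqrt(s) > 0
c = vN / s               # c = vN / s > 0

def F(x):
    """Right-hand side of the reduced model (2.3) with tau = 1."""
    return -x + (a * x + b) ** 2 / (1.0 + c * x ** 2)
```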

An equilibrium \(x^{*}\) of network (2.3) satisfies

$$ F \bigl(x^{*} \bigr): = \frac{-x^{*}(1+c{x^{*}}^{2})+(ax^{*}+b)^{2}}{1+c{x^{*}} ^{2}}=0. $$

Let

$$\begin{aligned} P(x)&:=-x \bigl(1+cx^{2} \bigr)+(ax+b)^{2} \\ &=-cx^{3}+a^{2}x^{2}+(2ab-1)x+b^{2}, \end{aligned}$$

i.e.,

$$ F(x)=\frac{P(x)}{1+cx^{2}}. $$

We suppose \(x(0)>0\). By the variation-of-constants formula,

$$\begin{aligned} x(t) = x(0)e^{-t} + \int_{0}^{t}e^{-(t-\theta )} \frac{ ( w_{\mathrm{tot}}x( \theta )+ h ) ^{2}}{s+vNx^{2}(\theta )}\,d\theta > 0, \end{aligned}$$

so the trajectory of system (2.2) remains positive.

From reference Zhang and Yi [28], we have

$$ x(t)< x(0)+\frac{w^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=x(0)+ \frac{a^{2}}{c}+b^{2}+1. $$

Denote

$$ \Pi =x(0)+\frac{a^{2}}{c}+b^{2}+1. $$

Because \(1+cx^{2}(t)>0\), the equilibria of system (2.3) are determined by zeros of \(P(x)\) in the interval \(I:=(0,\Pi )\).
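Since the equilibria of (2.3) are exactly the zeros of the cubic \(P(x)\) in \((0,\Pi )\), they can be located numerically. The sketch below is one possible cross-check (the helper name `equilibria` is ours, not from the analysis above); it assumes the coefficients a, b, c and the initial value x(0) are given.

```python
import numpy as np

def equilibria(a, b, c, x0):
    """Zeros of P(x) = -c x^3 + a^2 x^2 + (2ab - 1) x + b^2 inside (0, Pi)."""
    Pi = x0 + a**2 / c + b**2 + 1.0                  # upper bound from Zhang and Yi [28]
    roots = np.roots([-c, a**2, 2*a*b - 1.0, b**2])  # coefficients, highest degree first
    real = roots[np.abs(roots.imag) < 1e-9].real     # keep only the real roots
    return sorted(r for r in real if 0.0 < r < Pi)
```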

3 Equilibria and qualitative properties

In this section, we present novel sufficient conditions which guarantee the existence and the number of equilibria of network (2.3). Our approach is based on a geometric observation. We also establish stability criteria for these equilibria through the Taylor expansion of F at the equilibria.

Theorem 1

System (2.3) has at most three equilibria in \((0,\Pi )\). The number of equilibria and their locations are described in Table 1.

Table 1 The number of equilibria and their locations

Proof

System (2.3) has at most three equilibria in \((0,\Pi )\) because \(\operatorname{deg}(P)=3\).

Next, we will discuss the zeros of

$$P(x):=-cx^{3}+a^{2}x^{2}+(2ab-1)x+b^{2} \quad (a>0,b>0,c>0). $$

According to the definition of \(P(x)\), we have

$$\begin{aligned} \textstyle\begin{cases} \lim_{x\rightarrow +\infty }P(x)= -\infty , \\ \lim_{x\rightarrow -\infty }P(x)= +\infty , \\ P(0)= b^{2} >0 . \end{cases}\displaystyle \end{aligned}$$
(3.1)

In order to state our results easily, we partition the parameter conditions ensuring the number and locations of equilibria into the following subregions:

$$\begin{aligned}& \mathcal{T}_{111}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}< \frac{1}{2},P(\zeta_{-})>0 \biggr\} , \\& \mathcal{T}_{112}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}< \frac{1}{2},P(\zeta_{-})=0 \biggr\} , \\& \mathcal{T}_{113}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}< \frac{1}{2},P(\zeta_{-})< 0,P( \zeta_{+})>0 \biggr\} , \\& \mathcal{T}_{114}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}< \frac{1}{2},P(\zeta_{-})< 0,P( \zeta_{+})=0 \biggr\} , \\& \mathcal{T}_{115}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}< \frac{1}{2},P(\zeta_{-})< 0,P( \zeta_{+})< 0 \biggr\} , \\& \mathcal{T}_{12}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}>\frac{1}{2},P(\zeta_{+})>0 \biggr\} , \\& \mathcal{T}_{13}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs > 0,\frac{w _{\mathrm{tot}}h}{s}=\frac{1}{2},P(\zeta_{+})>0 \biggr\} , \\& \mathcal{T}_{21}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs = 0,\frac{w _{\mathrm{tot}}h}{s} < \frac{1}{2} \biggr\} , \\& \mathcal{T}_{31}: = \biggl\{ w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs < 0,\frac{w _{\mathrm{tot}}h}{s} < \frac{1}{2} \biggr\} . \end{aligned}$$

The equation \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1=0\) has the two roots:

$$ \begin{aligned} &\zeta_{-}=\frac{a^{2}-\sqrt{a^{4}+3c(2ab-1)}}{3c}, \\ & \zeta_{+}=\frac{a ^{2}+\sqrt{a^{4}+3c(2ab-1)}}{3c}\quad (\zeta_{-}\leq \zeta_{+}). \end{aligned} $$

We have

$$\begin{aligned}& P(\zeta_{-})=-\frac{-2a^{6}+2a^{4}\sqrt{a^{4}+6abc-3c}+12abc\sqrt{a^{4}+6abc-3c}-6c\sqrt{a^{4}+6abc-3c}-18a^{3}bc+9a^{2}c-27b^{2}c^{2}}{27c^{2}}, \\& P(\zeta_{+})=\frac{2a^{6}+2a^{4}\sqrt{a^{4}+6abc-3c}+12abc\sqrt{a^{4}+6abc-3c}-6c\sqrt{a^{4}+6abc-3c}+18a^{3}bc-9a^{2}c+27b^{2}c^{2}}{27c^{2}}. \end{aligned}$$

Consider the discriminant of the quadratic polynomial \(P'(x)\),

$$ \Delta_{P'(x)}:=a^{4}+3c(2ab-1). $$

We separately discuss the three cases \(\Delta_{P'(x)}>0\), \(\Delta_{P'(x)}=0\) and \(\Delta_{P'(x)}<0\).
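Because the closed-form expressions for \(P(\zeta_{\pm })\) are easy to mistype, a quick symbolic/numeric cross-check (not part of the proof) can confirm that \(\zeta_{\pm }\) are the critical points of P and that the quoted formula for \(P(\zeta_{+})\) is consistent; the numeric parameter values used below are arbitrary.

```python
import sympy as sp

a, b, c = sp.symbols('a b c', positive=True)
x = sp.symbols('x', real=True)

P  = -c*x**3 + a**2*x**2 + (2*a*b - 1)*x + b**2
Pp = sp.diff(P, x)

D  = sp.sqrt(a**4 + 3*c*(2*a*b - 1))   # square root of the discriminant of P'
zm = (a**2 - D) / (3*c)                # zeta_-
zp = (a**2 + D) / (3*c)                # zeta_+

# zeta_- and zeta_+ are indeed the critical points of P
print(sp.simplify(Pp.subs(x, zm)), sp.simplify(Pp.subs(x, zp)))   # -> 0 0

# Numerical spot check of the closed form of P(zeta_+) quoted above
closed = (2*a**6 + 2*a**4*D + 12*a*b*c*D - 6*c*D
          + 18*a**3*b*c - 9*a**2*c + 27*b**2*c**2) / (27*c**2)
vals = {a: sp.Rational(7, 10), b: sp.Rational(13, 10), c: sp.Rational(2, 5)}
print(sp.N(P.subs(x, zp).subs(vals) - closed.subs(vals)))          # -> ~ 0
```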

Case 1: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)>0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs>0\). However, \(2ab-1\) may be positive, negative or zero. Thus, according to the sign of \(2ab-1\), we need to discuss the following three subcases.

Subcase 1.1: \(ab<\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}<\frac{1}{2}\).

Equation \(P'(x)=0\) has two distinct roots \(\zeta_{-},\zeta_{+}(0< \zeta_{-}<\zeta_{+})\) in this case.

  1. (a)

    \(\zeta_{-}\), \(\zeta_{+}\) are the local minimum point and the local maximum point of \(P(x)\), respectively. By simple computation, we obtain \(\zeta_{+}< \Pi \). According to Rolle’s theorem and the continuity of \(P(x)\), \(P(x)\) has a unique positive real zero \(x_{1}^{*}\in (\zeta_{+}, \Pi )\) when \(P(\zeta_{-})>0\). Obviously, system (2.3) has a unique equilibrium \(E_{1}^{(1)}:x_{1}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(a).

    Figure 1. P has one, two or three zeros in subcase 1.1.

  2. (b)

    \(P(x)\) has two positive real zeros \(\zeta_{-}\) and \(x_{2}^{*} \in (\zeta_{+},\Pi )\) when \(P(\zeta_{-})= 0\). Then it follows that system (2.3) has two equilibrium points \(E_{2}^{(1)}:\zeta_{-}\) and \(E_{3}^{(1)}:x_{2}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(b).

  3. (c.1)

    \(P(x)\) has three distinct positive real zeros \(x_{3}^{*}\in (0, \zeta_{-}),x_{4}^{*}\in (\zeta_{-},\zeta_{+}),x_{5}^{*}\in (\zeta_{+}, \Pi )\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})>0\). It is also clear that system (2.3) has three equilibrium points \(E_{4}^{(1)}:x_{3} ^{*}\in (0,\zeta_{-})\), \(E_{5}^{(1)}:x_{4}^{*}\in (\zeta_{-},\zeta _{+})\) and \(E_{6}^{(1)}:x_{5}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 1(c). \(P(\zeta_{-})<0, P(\zeta_{+})>0\) on the solid curve.

  4. (c.2)

    \(P(x)\) has two positive real zeros \(x_{6}^{*}\in (0,\zeta_{-})\) and \(\zeta_{+}\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})=0\). Then system (2.3) has two equilibrium points \(E_{7}^{(1)}:x_{6}^{*}\in (0, \zeta_{-})\) and \(E_{8}^{(1)}:\zeta_{+}\), as shown in Figure 1(c). \(P(\zeta_{-})<0, P(\zeta_{+})=0\) on the dotted curve.

  5. (c.3)

    \(P(x)\) has one positive real zero \(x_{7}^{*}\in (0,\zeta_{-})\) when \(P(\zeta_{-})<0\) and \(P(\zeta_{+})<0\). Then system (2.3) has a unique equilibrium \(E_{9}^{(1)}:x_{7}^{*}\in (0,\zeta_{-})\), as shown in Figure 1(c). \(P(\zeta_{-})<0\) and \(P(\zeta_{+})<0\) on the star-shaped curve.

Subcase 1.2: \(ab>\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}>\frac{1}{2}\).

\(P'(x)=0\) has two distinct zeros \(\zeta_{-},\zeta_{+}( \zeta_{-}< 0 < \zeta_{+})\). \(P(x)\) has a unique positive real zero \(x_{8}^{*}\in ( \zeta_{+},\Pi )\). Then we see that system (2.3) has a unique equilibrium \(E_{10}^{(1)}:x_{8}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 2(d).

Figure 2. P has one zero in subcases 1.2 and 1.3.

Subcase 1.3: \(ab=\frac{1}{2}\), i.e., \(\frac{w_{\mathrm{tot}}h}{s}=\frac{1}{2}\).

In this case \(2ab-1=0\), so \(P'(x)=0\) has the two zeros \(\zeta_{-}=0\) and \(\zeta_{+}=\frac{2a^{2}}{3c}>0\). \(P(x)\) has a unique positive real zero \(x_{9}^{*}\in (\zeta_{+}, \Pi )\), implying that system (2.3) has a unique equilibrium \(E_{11}^{(1)}:x_{9}^{*}\in (\zeta_{+},\Pi )\), as shown in Figure 2(e).

Case 2: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)=0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs=0\).

\(P'(x)\) has the repeated zero \(\frac{a^{2}}{3c}\) of multiplicity 2, so \(P'(x)\leq 0\) for all x and \(\frac{a^{2}}{3c}\) is not an extreme point of \(P(x)\); hence \(P(x)\) is non-increasing. Combining this with (3.1), \(P(x)\) has a unique positive zero \(x_{10}^{*}\in (0,\Pi )\), which implies that system (2.3) has a unique equilibrium \(E_{1}^{(2)}:x_{10}^{*}\in (0,\Pi )\).

Case 3: \(\Delta_{P'(x)}=a^{4}+3c(2ab-1)<0\), i.e., \(w_{\mathrm{tot}}^{4}+6vNw_{\mathrm{tot}}h-3vNs<0\).

\(P'(x)\) has no real zero, so \(P'(x)<0\) for all x and \(P(x)\) has no extreme point; that is, \(P(x)\) is strictly monotonically decreasing. Combining this with (3.1), \(P(x)\) has a unique positive zero \(x_{11}^{*}\in (0,\Pi )\). System (2.3) accordingly has a unique equilibrium \(E_{1}^{(3)}:x_{11}^{*}\in (0,\Pi )\).

The proof is completed. □
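The case analysis above can be summarized algorithmically. The following Python sketch is a numerical companion to Table 1 rather than part of the proof; the helper name `classify` is ours, and the exact-equality subcases \(P(\zeta_{\pm })=0\) are only meaningful up to the floating-point tolerance `tol`.

```python
import numpy as np

def classify(a, b, c, tol=1e-12):
    """Number of equilibria of (2.3), following the cases of Theorem 1."""
    P = lambda x: -c*x**3 + a**2*x**2 + (2*a*b - 1.0)*x + b**2
    disc = a**4 + 3.0*c*(2.0*a*b - 1.0)       # discriminant of P'(x)
    if disc <= tol:                           # Cases 2 and 3: P is non-increasing
        return 1
    zm = (a**2 - np.sqrt(disc)) / (3.0*c)     # zeta_-
    zp = (a**2 + np.sqrt(disc)) / (3.0*c)     # zeta_+
    if 2.0*a*b >= 1.0:                        # Subcases 1.2 and 1.3
        return 1
    # Subcase 1.1: the count depends on the signs of P at the local extrema
    if P(zm) > tol:
        return 1                              # (a)
    if abs(P(zm)) <= tol or abs(P(zp)) <= tol:
        return 2                              # (b) or (c.2)
    return 3 if P(zp) > tol else 1            # (c.1) or (c.3)
```

For instance, with the parameters of the example in Section 4 (Case 1), `classify` returns 3, consistent with region \(\mathcal{T}_{113}\).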

Remark 1

It should be noted that, as shown in the proof of Theorem 1, we discuss the existence and the number of the equilibria separately in three cases according to the sign of the discriminant \(\Delta_{P'(x)}\). We employ the continuity and monotonicity of the function \(P(x)\), combined with Rolle’s theorem, to estimate the number and coordinates of the equilibria. These techniques provide new analysis approaches, in contrast to the well-known homeomorphism method and Brouwer’s fixed-point theorem. Moreover, these results may provide a theoretical foundation for further analysis of limit cycles and bifurcations of network (2.3).

Next, we will discuss stability of the equilibrium point of system (2.3) by utilizing the Taylor expansion.

Theorem 2

The stability of the equilibrium points of system (2.3) is described in Table 2.

Table 2 Stability of the equilibrium points for (2.3)

Proof

Let \(x^{*}\) be a general equilibrium of system (2.3).

The Taylor expansion of \(F(x)\) at equilibrium \(x^{*}\) is described by

$$ \dot{x}(t)=F \bigl(x^{*} \bigr)+F' \bigl(x^{*} \bigr) \bigl(x-x^{*} \bigr)+O \bigl( \bigl(x-x^{*} \bigr)^{2} \bigr). $$
(3.2)

Since \(F(x)=\frac{P(x)}{1+cx^{2}}\), we have

$$ F'(x)=\frac{P'(x)(1+cx^{2})-2cxP(x)}{(1+cx^{2})^{2}}. $$

Denote \(Q(x)=P'(x)(1+cx^{2})-2cxP(x)\), thus

$$ F'(x)=\frac{Q(x)}{(1+cx^{2})^{2}}. $$

Then \(F'(x^{*})=\frac{Q(x^{*})}{(1+c(x^{*})^{2})^{2}}=\frac{P'(x^{*})(1+c(x^{*})^{2})}{(1+c(x^{*})^{2})^{2}}\), since \(P(x^{*})=0\). Hence the sign of \(F'(x^{*})\) is the same as that of \(P'(x^{*})\), because \(1+c(x^{*})^{2}>0\).

Because \(F(x^{*})=0\), the linearization of (2.3) at \(x^{*}\) is \(\dot{x}(t)=F'(x^{*})(x-x^{*})\), so the equilibrium is asymptotically stable if \(F'(x^{*})<0\) and unstable if \(F'(x^{*})>0\). Therefore, the stability of the equilibrium is determined by the sign of \(P'(x^{*})\). Concretely, \(P(x)\) is monotonically decreasing in \((\zeta_{+},\Pi )\) and \(P'(x_{1}^{*})<0\). Thus, the equilibrium point \(E_{1}^{(1)}\) is asymptotically stable. Similarly, the equilibria \(E_{3}^{(1)}, E_{4}^{(1)}, E_{6}^{(1)}, E_{7}^{(1)}, E_{9}^{(1)}, E_{10}^{(1)}, E_{11}^{(1)}, E_{1}^{(2)}\) and \(E_{1}^{(3)}\) are asymptotically stable, because \(P'(x_{i}^{*})<0\) for \(i=2,3,5,6,7,8,9,10,11\).

\(P(x)\) is monotonically increasing in \((\zeta_{-},\zeta_{+})\) and \(P'(x_{4}^{*})>0\), which means that the equilibrium point \(E_{5}^{(1)}\) is unstable.

If \(P'(x^{*})=0\), then the Taylor expansion of \(F(x)\) at the equilibrium \(x^{*}\) gives

$$ \begin{aligned} &\dot{x}(t)=\tfrac{1}{2}F'' \bigl(x^{*} \bigr) \bigl(x-x^{*} \bigr)^{2}+O \bigl( \bigl(x-x^{*} \bigr)^{3} \bigr), \\ &F''(x)=\frac{Q'(x)(1+cx^{2})^{2}-Q(x)\cdot 2(1+cx^{2})\cdot 2cx}{(1+cx^{2})^{4}}, \end{aligned} $$
(3.3)

where \(Q'(x)=P''(x)(1+cx^{2})+P'(x)\cdot 2cx-P'(x)\cdot 2cx-P(x) \cdot 2c\). Thus, we obtain

$$ Q' \bigl(x^{*} \bigr)=P'' \bigl(x^{*} \bigr) \bigl(1+c \bigl(x^{*} \bigr)^{2} \bigr)+P' \bigl(x^{*} \bigr)\cdot 2cx^{*}-P' \bigl(x^{*} \bigr)\cdot 2cx^{*}-P \bigl(x^{*} \bigr) \cdot 2c. $$

Since \(P(x^{*})=0\) and \(P'(x^{*})=0\), we have \(Q'(x^{*})=P''(x^{*})(1+c(x^{*})^{2})\) and \(Q(x^{*})=P'(x^{*})(1+c(x^{*})^{2})-2cx^{*}P(x^{*})=0\).

Therefore, we get

$$\begin{aligned} F'' \bigl(x^{*} \bigr) =& \frac{Q'(x^{*})(1+c(x^{*})^{2})^{2}-Q(x^{*})\cdot 2(1+c(x^{*})^{2})\cdot 2cx^{*}}{(1+c(x^{*})^{2})^{4}} \\ =& \frac{Q'(x^{*})(1+c(x^{*})^{2})^{2}}{(1+c(x^{*})^{2})^{4}} \\ =& \frac{P''(x^{*})(1+c(x^{*})^{2})}{(1+c(x^{*})^{2})^{4}}. \end{aligned}$$

It means that the sign of \(F''(x^{*})\) is the same as that of \(P''(x^{*})\), because \(1+c(x^{*})^{2}>0\). We have \(P'(\zeta_{-})=0\) and \(P'(\zeta_{+})=0\) at the equilibria \(E_{2}^{(1)}:\zeta_{-}\) and \(E_{8}^{(1)}:\zeta_{+}\), respectively; thus \(F'(\zeta_{-})=0\) and \(F'(\zeta_{+})=0\).

Since \(\zeta_{-}\) is a local minimum point of \(P(x)\), the graph of \(P(x)\) is concave up (convex) in a neighborhood of \(E_{2}^{(1)}\), so \(P''(\zeta_{-})>0\). From \(P'(\zeta_{-})=0\) and \(P''(\zeta_{-})>0\), the equilibrium \(E_{2}^{(1)}\) is unstable. Similarly, \(\zeta_{+}\) is a local maximum point of \(P(x)\), so the graph of \(P(x)\) is concave down in a neighborhood of \(E_{8}^{(1)}\) and \(P''(\zeta_{+})<0\). From \(P'(\zeta_{+})=0\) and \(P''(\zeta_{+})<0\), the equilibrium \(E_{8}^{(1)}\) is asymptotically stable.

The proof is completed. □
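The sign test used in this proof translates directly into a small numerical routine. The sketch below (the helper name `stability` is ours) labels an equilibrium \(x^{*}\) of (2.3) by the sign of \(P'(x^{*})\) and, in the degenerate case \(P'(x^{*})=0\), by the sign of \(P''(x^{*})\), exactly as in Table 2.

```python
def stability(a, b, c, x_star, tol=1e-9):
    """Stability label of an equilibrium x* of (2.3) via the sign of P'(x*)."""
    dP  = -3.0*c*x_star**2 + 2.0*a**2*x_star + (2.0*a*b - 1.0)   # P'(x*)
    d2P = -6.0*c*x_star + 2.0*a**2                               # P''(x*)
    if dP < -tol:
        return "asymptotically stable"       # F'(x*) < 0
    if dP > tol:
        return "unstable"                    # F'(x*) > 0
    # Degenerate case P'(x*) = 0: use the second-order term as in the proof
    return "unstable" if d2P > 0.0 else "asymptotically stable"
```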

Remark 2

It should be mentioned that we have derived the stability conditions for the equilibria by using the Taylor expansion. In Figures 1 and 2, we see that an equilibrium is stable or unstable according to the sign of \(P'(x^{*})\) at the equilibrium \(x^{*}\). This method is simpler and more convenient than constructing suitable Lyapunov-Krasovskii functionals or energy functions.

4 Simulations

In this section, one example will be provided to illustrate and verify the theoretical results obtained in the above sections.

Example

Case 1: Consider the background neural network (2.2) with \(w_{\mathrm{tot}}=1.8965, h=4.6457, vN=0.0900\) and \(s=50\). Thus, we obtain \(\frac{w^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=41.3951\). We randomly select the initial point \(x(0)\in (0,80)\).

Here, the parameters satisfy our conditions in Theorem 1: \(\Delta_{P'(x)}=0.0017>0,ab=0.1762< \frac{1}{2}\).

By simple computation, equation \(P'(x)=-3cx^{2}+2a^{2}x+2ab-1=0\) is found to have two distinct roots \(\zeta_{-}=5.7367 ,\zeta_{+}=20.9044\), with \(P(\zeta_{-})=-1.2559<0,P(\zeta_{+})=1.8846>0\).

The parameters therefore belong to \(\mathcal{T}_{113}\); thus system (2.3) has three equilibria \(E_{4}^{(1)}:x_{3}^{*}\in (0,5.7367)\), \(E_{5}^{(1)}:x_{4}^{*}\in (5.7367,20.9044)\) and \(E_{6}^{(1)}:x_{5}^{*}\in (20.9044,41.3951)\). According to Theorem 2, \(E_{4}^{(1)}\) and \(E_{6}^{(1)}\) are asymptotically stable, and \(E_{5}^{(1)}\) is unstable. Figure 3 illustrates these theoretical results.

Figure 3. The locations and stability of the equilibria of network (2.3) with \(w_{\mathrm{tot}}=1.8965\), \(h=4.6457\), \(vN=0.0900\) and \(s=50\).
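To reproduce the qualitative picture of Figure 3, one may integrate (2.3) numerically from several initial points. The sketch below uses scipy's solve_ivp with arbitrary initial values in (0, 80); the printed end values are indicative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

w_tot, h, vN, s = 1.8965, 4.6457, 0.0900, 50.0
a, b, c = w_tot / np.sqrt(s), h / np.sqrt(s), vN / s

F = lambda t, x: -x + (a * x + b) ** 2 / (1.0 + c * x ** 2)

# Equilibria: real zeros of P(x) = -c x^3 + a^2 x^2 + (2ab - 1) x + b^2
r = np.roots([-c, a**2, 2 * a * b - 1.0, b**2])
print("equilibria:", np.sort(r[np.abs(r.imag) < 1e-9].real))

# Trajectories from a few (arbitrary) initial points in (0, 80)
for x0 in (1.0, 5.0, 10.0, 30.0, 70.0):
    sol = solve_ivp(F, (0.0, 200.0), [x0], rtol=1e-8)
    print(f"x(0) = {x0:5.1f}  ->  x(200) = {sol.y[0, -1]:.4f}")
```

Trajectories starting below the unstable equilibrium \(E_{5}^{(1)}\) settle at \(E_{4}^{(1)}\), while those starting above it settle at \(E_{6}^{(1)}\), in agreement with Theorem 2.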

In this case, system (2.3) has multiple stable equilibrium points; that is, the network is multistable. Mathematically, multistability allows a network to possess multiple stable fixed points, periodic orbits or limit cycles. Multiple stable equilibrium points or periodic orbits are often accompanied by the existence of continuous attractors. Continuous attractors have been found in many applications, including visual perception, visual images, eye memory, etc. The existence of unstable equilibria is essential in winner-take-all problems (Yi et al. [31]). Therefore, the results of this paper can be applied to the aforementioned applications.

Case 2: If \(w_{\mathrm{tot}}=1.2, h=12, vN=0.02\) and \(s=63.36\), then \(\frac{w^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=75.2727\). We randomly select the initial point \(x(0)\in (0,80)\).

In this case \(\Delta_{P'(x)}=0\), so it follows from Theorems 1 and 2 that system (2.3) has a unique equilibrium point \(E_{1}^{(2)}:x _{10}^{*}\in (0,75.2727)\), and it is asymptotically stable. The location and stability of the equilibrium points of this system are illustrated in Figure 4.

Figure 4. The location and stability of the equilibrium point \(E_{1}^{(2)}\) of network (2.3) with \(w_{\mathrm{tot}}=1.2\), \(h=12\), \(vN=0.02\) and \(s=63.36\).
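A quick numerical check of Case 2 (a sketch in the same spirit as above) confirms that the discriminant \(\Delta_{P'(x)}\) is essentially zero for these constants and that \(P(x)\) has a single positive real zero:

```python
import numpy as np

w_tot, h, vN, s = 1.2, 12.0, 0.02, 63.36
a, b, c = w_tot / np.sqrt(s), h / np.sqrt(s), vN / s

print("discriminant:", a**4 + 3.0 * c * (2.0 * a * b - 1.0))   # ~ 0 (Case 2)
r = np.roots([-c, a**2, 2 * a * b - 1.0, b**2])
print("real zeros:", r[np.abs(r.imag) < 1e-9].real)            # one positive real zero
```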

Case 3: If \(w_{\mathrm{tot}}=1.12, h=10, vN=0.03\) and \(s=65\), then \(\frac{w ^{2}_{\mathrm{tot}}}{vN}+\frac{h^{2}}{s}+1=44.3518\). We randomly select the initial point \(x(0)\in (0,35)\).

In this case \(\Delta_{P'(x)}<0\), so system (2.3) has a unique equilibrium point \(E_{1}^{(3)}:x_{11}^{*}\in (0,44.3518)\) and it is asymptotically stable. The results are shown in Figure 5.

Figure 5. The location and stability of the equilibrium point \(E_{1}^{(3)}\) of network (2.3) with \(w_{\mathrm{tot}}=1.12\), \(h=10\), \(vN=0.03\) and \(s=65\).

In Cases 2 and 3, the network (2.3) has a unique equilibrium point, which is stable. Such convergent behavior is called ‘monostability’. Monostable networks can be used to solve optimization problems. Under the corresponding parameter conditions, the network (2.3) studied in this paper is monostable, so the proposed model may also be useful for optimization problems.

5 Conclusions

In this paper, the dynamical properties of the equilibria of the background neural network (2.2) (i.e., (2.3)) were analyzed, in particular the number of equilibria and their locations. First, conditions determining the number and locations of the equilibria of the network were established. Second, conditions for the stability of the equilibria were derived. The parametric relations between the dynamical properties of the equilibria and the network parameters were thereby revealed. These results are primarily based on an observation of the geometric structure of the equation \(P(x)=0\), and they enrich the analytical results on equilibrium points in related work.

The study of background neural networks with uniform firing rate (a uniform firing rate means that the firing rate is the same for all neurons, so system (2.3) can be regarded as one-dimensional) may be further developed for general higher-dimensional systems; e.g., the mathematical methods of this paper can be applied to analyze the equilibria of background neural networks with two subnetworks that exhibit rival states (i.e., 2D background neural networks). The rivaling steady states are significant for the development of practical applications of 2D neural networks. The switch problem (Terman and Rubin [32]; Toth et al. [33]) and binocular rivalry (Shpiro et al. [34]) are interesting topics for further research. Many practical problems, such as mechanical design and electrical networks, can be formulated as switch problems. In recent years, dynamical analysis of switched systems has attracted considerable research interest (Cao et al. [35]; Syed Ali et al. [36]). Our work provides further insight into deriving parameter conditions for switched systems.

References

  1. Atteneave, F: Multistability in perception. Sci. Am. 225, 63-71 (1971)

  2. Cohen, MA, Grossberg, S: Absolute stability of global pattern formation and parallel memory storage by competitive neural networks. IEEE Trans. Syst. Man Cybern. 13, 815-826 (1983)

  3. Forti, M: On global asymptotic stability of a class of nonlinear systems arising in neural network theory. J. Differ. Equ. 113, 246-264 (1994)

  4. Hahnloser, RLT: On the piecewise analysis of networks of linear threshold neurons. Neural Netw. 11, 691-697 (1998)

  5. Zeng, Z, Wang, J, Liao, X: Global exponential stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 50(10), 1353-1358 (2003)

  6. Zeng, Z, Wang, J: Multiperiodicity and exponential attractivity evoked by periodic external inputs in delayed cellular neural networks. Neural Comput. 18(4), 848-870 (2006)

  7. Cao, J, Wang, J: Global asymptotic stability of a general class of recurrent neural networks with time-varying delays. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 50(1), 34-44 (2003)

  8. Chen, T, Lu, W, Chen, G: Dynamical behaviors of a large class of general delayed neural networks. Neural Comput. 17(4), 949-968 (2005)

  9. Chen, Y: Global asymptotic stability of delayed Cohen-Grossberg neural networks. IEEE Trans. Circuits Syst. I, Regul. Pap. 53(2), 351-357 (2006)

  10. Tang, HJ, Tan, KC, Zhang, W: Cyclic dynamics analysis for networks of linear threshold neurons. Neural Comput. 17(1), 97-114 (2005)

  11. Zhang, L, Yi, Z, Yu, J: Multiperiodicity and attractivity of delayed recurrent neural networks with unsaturating piecewise linear transfer functions. IEEE Trans. Neural Netw. 19(1), 158-167 (2008)

  12. Zuo, Z, Yang, C, Wang, Y: A new method for stability analysis of recurrent neural networks with interval time-varying delay. IEEE Trans. Neural Netw. 21(2), 339-344 (2010)

  13. Zhang, H, Wang, Z, Liu, D: A comprehensive review of stability analysis of continuous-time recurrent neural networks. IEEE Trans. Neural Netw. Learn. Syst. 25(7), 1229-1262 (2014)

  14. Li, R, Cao, J: Dissipativity analysis of memristive neural networks with time-varying delays and randomly occurring uncertainties. Math. Methods Appl. Sci. 39(11), 2896-2915 (2016)

  15. Samidurai, R, Manivannan, R: Delay-range-dependent passivity analysis for uncertain stochastic neural networks with discrete and distributed time-varying delays. Neurocomputing 185(12), 191-201 (2016)

  16. Cheng, C, Lin, K, Shin, C: Multistability in recurrent neural networks. SIAM J. Appl. Math. 66(4), 1301-1320 (2006)

  17. Qu, H, Yi, Z, Wang, X: Switching analysis of 2-D neural networks with nonsaturating linear threshold transfer functions. Neurocomputing 72, 413-419 (2008)

  18. Manivannan, R, Mahendrakumar, G, Samidurai, R, Cao, J, Alsaedi, A: Exponential stability and extended dissipativity criteria for generalized neural networks with interval time-varying delay signals. J. Franklin Inst. 354(11), 4353-4376 (2017)

  19. Nie, X, Cao, J: Existence and global stability of equilibrium point for delayed competitive neural networks with discontinuous activation functions. Int. J. Syst. Sci. 43(3), 459-474 (2012)

  20. Manivannan, R, Samidurai, R, Cao, J, Alsaedi, A, Alsaadi, FE: Global exponential stability and dissipativity of generalized neural networks with time-varying delay signals. Neural Netw. 87, 149-159 (2017)

  21. Forti, M, Tesi, A: New conditions for global stability of neural networks with application to linear and quadratic programming problems. IEEE Trans. Circuits Syst. I, Fundam. Theory Appl. 42(7), 354-366 (1995)

  22. Lu, H: On stability of nonlinear continuous-time neural networks with delays. Neural Netw. 13(10), 1135-1143 (2000)

  23. Zhao, W, Zhu, Q: New results of global robust exponential stability of neural networks with delays. Nonlinear Anal., Real World Appl. 11(2), 1190-1197 (2010)

  24. Guo, S, Huang, L: Stability analysis of Cohen-Grossberg neural networks. IEEE Trans. Neural Netw. 17(1), 106-117 (2006)

  25. Wang, L: Stability of Cohen-Grossberg neural networks with distributed delays. Appl. Math. Comput. 160(1), 93-110 (2005)

  26. Miller, RK, Michel, AN: Ordinary Differential Equations. Academic Press, New York (1982)

  27. Salinas, E: Background synaptic activity as a switch between dynamical states in a network. Neural Comput. 15, 1439-1475 (2003)

  28. Zhang, L, Yi, Z: Dynamical properties of background neural networks with uniform firing rate and background input. Chaos Solitons Fractals 33(3), 979-985 (2007)

  29. Wan, M, Gou, J, Wang, D, Wang, X: Dynamical properties of discrete-time background neural networks with uniform firing rate. Math. Probl. Eng. 2013(1), 289-325 (2013)

  30. Xu, F, Yi, Z: Convergence analysis of a class of simplified background neural networks with two subnetworks. Neurocomputing 74(18), 3877-3883 (2011)

  31. Yi, Z, Heng, PA, Fung, PF: Winner-take-all discrete recurrent neural networks. IEEE Trans. Circuits Syst. II 47, 1584-1589 (2000)

  32. Terman, D, Rubin, JE, Yew, AC, Wilson, CJ: Activity patterns in a model for the subthalamopallidal network of the basal ganglia. J. Neurosci. 22(7), 2963-2976 (2002)

  33. Toth, LJ, Assad, JA: Dynamic coding of behaviourally relevant stimuli in parietal cortex. Nature 415, 165-168 (2002)

  34. Shpiro, A, Morenobote, R, Rubin, N, Rinzel, J: Balance between noise and adaptation in competition models of perceptual bistability. J. Comput. Neurosci. 27(1), 37-54 (2009)

  35. Cao, J, Rakkiyappan, R, Maheswari, K, Chandrsekar, A: Exponential \(H_{\infty }\) filtering analysis for discrete-time switched neural networks with random delays using sojourn probabilities. Sci. China, Technol. Sci. 59(3), 387-402 (2016)

  36. Syed Ali, M, Saravanan, S, Cao, J: Finite-time boundedness, \(L_{2}\)-gain analysis and control of Markovian jump switched neural networks with additive time-varying delays. Nonlinear Anal. Hybrid Syst. 23, 27-43 (2017)

Acknowledgements

This work was supported in part by the National Science Foundation of China under Grant 61202045, Grant 11501475, and in part by the Program of Science and Technology of Sichuan Province of China under Grant No. 2016JY0067.

Author information

Corresponding author

Correspondence to Fang Xu.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors contributed equally to the writing of this paper. All authors read and approved the manuscript.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Xu, F., Liu, L. & Xiao, J. Analysis of the equilibrium points of background neural networks with uniform firing rate. Adv Differ Equ 2017, 314 (2017). https://doi.org/10.1186/s13662-017-1322-z
