Mean-square exponential input-to-state stability of stochastic inertial neural networks

By introducing parameters perturbed by white noises, we propose a class of stochastic inertial neural networks in random environments. By constructing two Lyapunov–Krasovskii functionals, we establish the mean-square exponential input-to-state stability of the addressed model, which generalizes and refines recent results. In addition, an example with a numerical simulation is carried out to support the theoretical findings.

However, both the reduced-order and non-reduced-order methods involve only deterministic inertial neural networks and do not incorporate stochastic inertial neural networks subject to environmental fluctuations. Remarkably, Haykin [37] has pointed out that in real nervous systems and in the implementation of artificial neural networks, synaptic transmission is a noisy process caused by random fluctuations in neurotransmitter release and other probabilistic factors; hence noise, being unavoidable, should be taken into account in modeling.
σ_i denotes the noise intensity. Then, corresponding to inertial neural network (1.1), we obtain the following stochastic system (1.4). Obviously, the white noise disturbance term σ_i x_i(t) dB_i(t) induces randomness, so the traditional deterministic inertial neural network (1.1) becomes the stochastic system (1.4). One difficulty of this paper is handling the white noise disturbances; the other is introducing a suitable concept of stability that describes the dynamics of (1.4) precisely. The main aim of this paper is to investigate the mean-square exponential input-to-state stability of stochastic inertial neural network (1.4) with initial conditions (1.2). Input-to-state stability differs from traditional notions of stability, such as asymptotic stability, almost sure stability, and exponential stability, which require the system states to converge to an equilibrium point as time tends to infinity; instead, it describes the system states varying within a certain region under external inputs. For more details about input-to-state stability, one can refer to [38–42]. However, to the best of our knowledge, mean-square exponential input-to-state stability of stochastic inertial neural networks has scarcely been studied.
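To see how a multiplicative noise term of the form σ_i x_i(t) dB_i(t) enters the second-order dynamics, the following sketch applies a simple Euler–Maruyama discretization to a hypothetical two-neuron stochastic inertial network. All coefficients (a, b, C, σ, u) and the tanh activation are illustrative assumptions, not the parameters of system (1.4).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-neuron stochastic inertial network (all values assumed)
a = np.array([2.0, 2.0])       # damping coefficients
b = np.array([1.5, 1.5])       # restoring coefficients
C = np.array([[0.3, -0.2],
              [0.1, 0.25]])    # connection weights
sigma = np.array([0.2, 0.3])   # noise intensities sigma_i
u = np.array([0.1, -0.1])      # bounded external input

h, T = 1e-3, 5.0               # step size and horizon
N = int(T / h)
x = np.array([0.5, -0.5])      # x_i(0)
y = np.array([0.0, 0.0])       # x_i'(0)

for _ in range(N):
    dB = rng.normal(0.0, np.sqrt(h), size=2)   # Brownian increments
    f = np.tanh(x)                             # activation
    # write x'' = -a x' - b x + C f(x) + u as a first-order system (x, y = x'),
    # with the multiplicative noise sigma_i * x_i * dB_i added to the y-equation
    x_new = x + h * y
    y_new = y + h * (-a * y - b * x + C @ f + u) + sigma * x * dB
    x, y = x_new, y_new

print(x, y)
```

The multiplicative structure means the noise vanishes at the origin, so the unforced equilibrium is preserved while trajectories are perturbed in proportion to the state.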
The remainder of this paper is organized as follows. In Sect. 2, we give the main result: several sufficient conditions that ensure that the stochastic inertial neural network (1.4) is mean-square exponentially input-to-state stable. In Sect. 3, we provide a numerical example to check the effectiveness of the developed result. Finally, we summarize and evaluate our work in Sect. 4.

Mean-square exponential stability
Although Wang and Chen [43] studied the mean-square exponential stability of stochastic inertial neural network (1.4) with two groups of different initial conditions (1.2), their notion is not appropriate for mean-square exponential input-to-state stability. Fortunately, motivated by Zhu and Cao [38], who introduced the definition of mean-square exponential input-to-state stability for stochastic delayed neural networks, together with the mean-square exponential stability of Wang and Chen [43], we present the following definition.
Proof Let x(t) = (x_1(t), x_2(t), . . . , x_n(t)) be a solution of stochastic system (1.4) with initial values (1.2) such that In view of (2.1) and (2.2), for i ∈ J, we can find a sufficiently small positive number λ such that Then we construct the following two Lyapunov–Krasovskii functionals: Using Itô's formula, we obtain the following stochastic differentials: and where L is the weak infinitesimal operator such that and Integrating both sides of (2.5) and (2.6) and taking the expectation operator, we obtain from (2.3), (2.4), (2.7), and (2.8) that and (2.10). Choosing γ = max_{i∈J}{α_i² + |α_i γ_i|, β̄_i + ᾱ_i² + |ᾱ_i γ̄_i|} and β = min_{i∈J}{β_i, β̄_i}, we obtain from (2.9) and (2.10) that and Combining (2.11) and (2.12), the following holds: which, together with Definition 2.1, implies that the stochastic inertial neural network (1.4) is mean-square exponentially input-to-state stable. This completes the proof of Theorem 2.1.
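For reference, the stochastic differentials (2.5) and (2.6) follow from the standard scalar form of Itô's formula: for an SDE dx(t) = f(t, x(t)) dt + g(t, x(t)) dB(t) and a functional V ∈ C^{1,2},

```latex
dV(t,x(t)) = \mathcal{L}V(t,x(t))\,dt + \frac{\partial V}{\partial x}(t,x(t))\,g(t,x(t))\,dB(t),
\qquad
\mathcal{L}V = \frac{\partial V}{\partial t} + \frac{\partial V}{\partial x}\,f
             + \frac{1}{2}\,g^{2}\,\frac{\partial^{2} V}{\partial x^{2}}.
```

Taking expectations removes the dB(t) term, since the Itô integral is a martingale under the usual integrability conditions; this is why only the weak infinitesimal operator L survives after the expectation operator is applied in (2.9) and (2.10).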

An illustrative example
To verify the correctness and effectiveness of the theoretical results, we present an example with numerical simulations.
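Since the example's parameters are not reproduced here, the following is a hypothetical stand-in for such a simulation: it estimates E[|x(t)|² + |x'(t)|²] by Monte Carlo over Euler–Maruyama paths of a one-neuron stochastic inertial system driven by a bounded input, illustrating the kind of mean-square boundedness the input-to-state stability property predicts. All parameter values are assumptions, not those of the paper's example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-neuron stochastic inertial system (all parameters assumed)
a, b, c, sigma = 3.0, 2.0, 0.4, 0.3   # damping, restoring, weight, noise intensity
h, T, M = 1e-3, 4.0, 200              # step size, horizon, number of Monte Carlo paths
N = int(T / h)

x = np.full(M, 1.0)                   # x(0) for every path
y = np.zeros(M)                       # x'(0)

for k in range(N):
    t = k * h
    u = 0.2 * np.sin(t)               # bounded external input
    dB = rng.normal(0.0, np.sqrt(h), size=M)
    x_new = x + h * y
    y_new = y + h * (-a * y - b * x + c * np.tanh(x) + u) + sigma * x * dB
    x, y = x_new, y_new

# Monte Carlo estimate of the mean-square state at time T
ms = np.mean(x**2 + y**2)
print(ms)
```

Under these (assumed) well-damped parameters, the estimate settles to a small value determined by the input bound, rather than growing, which is the qualitative behavior the ISS estimate describes.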

Concluding remarks
In this paper, we have studied the mean-square exponential input-to-state stability for a class of stochastic inertial neural networks. By applying the non-reduced-order method and Lyapunov–Krasovskii functionals, we have obtained several sufficient conditions guaranteeing the mean-square exponential input-to-state stability of the suggested stochastic system, a topic that has been considered by few authors. An example and its numerical simulation have been presented to verify the theoretical result.