Global exponential stability of Markovian jumping stochastic impulsive uncertain BAM neural networks with leakage, mixed time delays, and α-inverse Hölder activation functions

This paper concerns enhanced results on robust finite-time passivity for uncertain discrete-time Markovian jumping BAM delayed neural networks with leakage delay. By constructing a proper Lyapunov–Krasovskii functional candidate and applying the reciprocally convex combination method and the linear matrix inequality (LMI) technique, we derive several sufficient conditions verifying the passivity of discrete-time BAM neural networks. Further, some sufficient conditions for finite-time boundedness and passivity under parameter uncertainties are proposed by employing zero inequalities. Finally, the enlargement of the feasible region of the proposed criteria is shown via numerical examples with simulations that illustrate the applicability and usefulness of the proposed method.


Introduction and problem statement with preliminaries
There has been growing research interest in recurrent neural networks (RNNs) in recent years. This family of architectures includes various types of neural networks, such as bidirectional associative memory (BAM) neural networks, Hopfield neural networks, cellular neural networks, Cohen–Grossberg neural networks, and neural and social networks, which have received great attention due to their wide applications in classification, signal and image processing, parallel computing, associative memories, optimization, cryptography, and so on. The bidirectional associative memory (BAM) neural network model was initially coined by Kosko, see [1,2]. It is a special class of RNNs with the ability to store bipolar vector pairs. It is composed of neurons arranged in two layers, the X-layer and the Y-layer; the neurons in one layer are fully interconnected to the neurons in the other layer. BAM neural networks are designed in such a way that, for a given external input, they reveal only one globally asymptotically or exponentially stable equilibrium point. Hence, considerable effort has been devoted to the stability analysis of neural networks, and, as a result, a large number of sufficient conditions have been proposed to guarantee the global asymptotic or exponential stability of the addressed neural networks.
Furthermore, the existence of time delays in the network may result in poor performance, instability, or chaos. Time delays can be classified into two types: discrete and distributed delays. Here we take both kinds of delay into account while modeling our network system, because a neural network usually has a spatial extent due to the presence of parallel pathways with a variety of axon sizes and lengths. It is therefore worthwhile to inspect the dynamical behavior of neural systems with both types of time delay; see, for instance, [3][4][5][6][7][8][9][10][11].
In [12], Shu et al. considered BAM neural networks with discrete and distributed time delays, and some sufficient conditions were obtained to ensure global asymptotic stability. Time delays in the leakage term also have a great impact on the dynamic behavior of neural networks. However, so far there have been very few works on neural networks with time delay in the leakage term; see, for instance, [13][14][15][16][17].
Further, the stability performance of the state variables under leakage time delays was discussed by Lakshmanan et al. in [18]. While modeling a real nervous system, stochastic noise and parameter uncertainties are inevitable and should be taken into account. In a real nervous system, synaptic transmission is a noisy process caused by random fluctuations in the release of neurotransmitters, and the connection weights of the neurons depend on resistance and capacitance values that are subject to variation. Therefore, it is of practical significance to investigate stochastic disturbances in the stability of time-delayed neural networks with parameter uncertainties; see [19][20][21][22] and the references cited therein. Moreover, impulsive effects are likely to exist in a wide variety of evolutionary processes, abruptly changing the states at certain moments of time [23][24][25][26][27][28].
Abrupt jumps of the system parameters can be described by a finite-state Markov process. Recently, the researchers in [29,30] investigated Markovian jumps in BAM neural networks (BAMNNs); exploiting the stochastic Lyapunov–Krasovskii functional (LKF) approach, new sufficient conditions were derived for global exponential stability in the mean square.
BAM-type NNs with Markovian jumping parameters and leakage terms were described by Wang et al. in [31]. In [32], a robust stability problem was studied and some delay-dependent conditions were derived for neutral-type NNs with time-varying delays. The authors in [33][34][35] developed conditions for the stability analysis of neural networks using an integral inequality approach. Criteria for the stability of neural networks with time-varying delays were established in [36][37][38]. It should be noted that all the results reported in the literature above are concerned only with Markovian jumping SNNs whose neuron activation functions are of Lipschitz type. Up to now, very little attention has been paid to the problem of global exponential stability of Markovian jumping SBAMNNs with non-Lipschitz activation functions, which frequently appear in realistic neural networks. This situation motivates our present problem, i.e., α-inverse Hölder activation functions.
The main objective of this paper is to study the delay-dependent exponential stability problem for a class of Markovian jumping uncertain BAM neural networks with mixed time delays, leakage delays, and α-inverse Hölder activation functions under stochastic noise perturbation.

Assumption 1
(1) f_i, f̃_j are monotonic increasing continuous functions.
(2) For any ρ_1, ρ_2, θ_1, θ_2 ∈ R, there exist respective scalars q_{ρ_1} > 0, r_{ρ_1} > 0 and q_{ρ_2} > 0, r_{ρ_2} > 0, correlated with ρ_1, ρ_2 and with α > 0, β > 0, such that:

Assumption 2 g_i, h_i and g̃_j, h̃_j are continuous and satisfy:

Remark 1.1 In [39], a function f_i satisfying Assumption 1 is said to be an α-inverse Hölder activation function, which is a non-Lipschitz function. This class of activation functions plays an important role in the stability analysis of neural networks, and many examples arise in engineering mathematics: f(θ) = arctan θ and f(θ) = θ^3 + θ are 1-inverse Hölder functions, while f(θ) = θ^3 is a 3-inverse Hölder function.
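For concreteness, one way to state the α-inverse Hölder property, consistent with the examples in Remark 1.1 and the usage in [39] (this is a reconstruction, not a quotation from the source), is:

```latex
% Reconstruction (hedged): f is said to be \alpha-inverse H\"older if, for
% every \rho \in \mathbb{R}, there exist scalars q_\rho > 0 and r_\rho > 0
% such that
\[
  \bigl| f(\theta) - f(\rho) \bigr| \;\ge\; q_\rho \, |\theta - \rho|^{\alpha}
  \qquad \text{whenever } |\theta - \rho| \le r_\rho .
\]
```

Under this reading, f(θ) = θ^3 is 3-inverse Hölder at ρ = 0 (since |θ^3 − 0| = |θ − 0|^3), while f(θ) = arctan θ and f(θ) = θ^3 + θ admit linear lower bounds near every ρ and are therefore 1-inverse Hölder.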
Remark 1.2 From Assumption 2 we deduce that e_i, ẽ_j and k_i, k̃_j are positive scalars, so E, Ẽ and K, K̃ are positive definite diagonal matrices. The relations among the different activation functions f_i, f̃_j (which are α-inverse Hölder activation functions), g_i, g̃_j, and h_i, h̃_j are implicitly established in Theorem 3.2; such relations have not previously been reported in the literature.
The matrices R_{bi} (b = 1, 2, 3) and R_{cj} (c = 1, 2, 3) are known positive definite matrices with appropriate dimensions.
Consider a general stochastic system. Let u(t; ξ) and v(t; ξ̃) denote the state trajectories from the initial data ξ and ξ̃. By the generalized Itô formula, one obtains the estimates used below. Clearly, system (4) admits a trivial solution u(t; 0) ≡ 0 and v(t; 0) ≡ 0 corresponding to the initial data ξ = 0 and ξ̃ = 0, respectively. For simplicity, we write u(t; ξ) = u(t) and v(t; ξ̃) = v(t).

Definition 1.3 The equilibrium point of neural network (4) is said to be globally exponentially stable in the mean square if, for any initial data ξ and ξ̃, there exist positive constants η and T, correlated with ξ and ξ̃, such that, when t > T, the following inequality holds:

Definition 1.4 We introduce the stochastic Lyapunov–Krasovskii functional V ∈ C^{2,1}(R^+ × R^n × R^n × M; R^+) of system (4); the weak infinitesimal generator of the random process is given by:
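A standard form of the mean-square exponential bound that Definition 1.3 refers to, with illustrative constant names (γ for the initial-data-dependent constant, ε for the decay rate; neither name is taken from the source), is:

```latex
\[
  \mathbb{E}\,\| u(t;\xi) \|^{2} + \mathbb{E}\,\| v(t;\tilde{\xi}) \|^{2}
  \;\le\; \gamma \, e^{-\varepsilon t}, \qquad t > T,
\]
% where \gamma > 0 depends on the initial data \xi, \tilde{\xi} and
% \varepsilon > 0 is the exponential decay rate.
```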

Lemma 1.6 ([39]) If f_i is an α-inverse Hölder function and f_i(0) = 0, then there exist constants q_{i0} > 0 and r_{i0} > 0 such that:

Lemma 1.7 ([21]) For any real matrix M > 0, scalars a and b with 0 ≤ a < b, and a vector function x(α) such that the following integrals are well defined, we have:
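In its standard form, the Jensen-type integral inequality of [21] that this lemma asserts reads:

```latex
\[
  \Bigl( \int_{a}^{b} x(\alpha)\, d\alpha \Bigr)^{T} M
  \Bigl( \int_{a}^{b} x(\alpha)\, d\alpha \Bigr)
  \;\le\; (b - a) \int_{a}^{b} x^{T}(\alpha)\, M\, x(\alpha)\, d\alpha .
\]
```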

Lemma 1.8 ([39]) Let x, y ∈ R^n, and let G be a positive definite matrix; then:
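In its standard form, the inequality asserted by this lemma is:

```latex
\[
  2 x^{T} y \;\le\; x^{T} G x + y^{T} G^{-1} y ,
\]
```

which follows from expanding (G^{1/2}x − G^{−1/2}y)^T (G^{1/2}x − G^{−1/2}y) ≥ 0.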

Lemma 1.11 ([33]) For given matrices D, E, and F with F^T F ≤ I and a scalar ε > 0, the following inequality holds:

DFE + E^T F^T D^T ≤ ε DD^T + ε^{-1} E^T E.

Remark 1.12 Lakshmanan et al. [18] analyzed time-delayed BAM neural networks and ensured stability performance in the presence of leakage delay. In [12], the authors discussed asymptotic stability of BAM neural networks with mixed time delays and uncertain parameters; moreover, comparisons of the maximum allowable upper bounds of the discrete time-varying delays were listed.
Lou and Cui [29] discussed exponential stability conditions for time-delayed BAM NNs with Markovian jump parameters. Further, stochastic effects on neural networks and exponential stability criteria were discussed by Huang and Li in [40] with the aid of Lyapunov–Krasovskii functionals. In all the above-mentioned references, the stability problem for BAM neural networks was considered with only leakage delays, mixed time delays, stochastic effects, Markovian jump parameters, or parameter uncertainties; these factors have not previously been taken into account together, and exponential stability under all of these delays simultaneously has not been investigated. Taking all these facts into account makes the present research challenging and novel.

Global exponential stability of uncertain system
Now consider the following BAM neural networks with stochastic noise disturbance, Markovian jump parameters, leakage delay, and mixed time delays in the presence of parameter uncertainties:
Here N_5 and N_6 are positive definite diagonal matrices defined analogously, and the remaining terms are defined in a similar way; this structure characterizes how the deterministic uncertain parameters enter the nominal matrices H_i, H̃_j, R_{bi} (b = 1, 2, 3), R_{cj} (c = 1, 2, 3), S, S̃, T, T̃, N_1, N_2, N_3, N_4, N_5, and N_6. The real matrix F(t), which may be time-varying, is unknown and satisfies F^T(t)F(t) ≤ I.

Remark 3.1 Overall, the stability of time-delayed neural networks depends fully on the Lyapunov–Krasovskii functional and LMI machinery. In particular, depending on the neural network under study, different types of LKF are chosen to establish system stability. Up to now, no one has considered uncertain parameters in the Lyapunov–Krasovskii functional terms themselves. This gap is filled for the first time in this work, and the approach yields more advanced and less conservative stability results.
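The norm-bounded uncertainty structure above (a nominal matrix perturbed by a term of the form D F(t) E with F^T(t)F(t) ≤ I) and the bound of Lemma 1.11 can be checked numerically. The sketch below uses generic random matrices; the names D, E, F come from the lemma's generic statement, not from the paper's system matrices:

```python
import numpy as np

# Generic matrices for the norm-bounded uncertainty D @ F @ E (illustrative,
# not the paper's system matrices).
rng = np.random.default_rng(0)
n = 4
D = rng.standard_normal((n, n))
E = rng.standard_normal((n, n))

# Draw a random F and rescale so its spectral norm is at most 1, i.e. F^T F <= I.
F = rng.standard_normal((n, n))
F /= max(1.0, np.linalg.norm(F, 2))

# Lemma 1.11 bound:  D F E + E^T F^T D^T  <=  eps * D D^T + (1/eps) * E^T E
# for any eps > 0, whenever F^T F <= I.
eps = 0.7
lhs = D @ F @ E + (D @ F @ E).T
rhs = eps * D @ D.T + (1.0 / eps) * E.T @ E

# rhs - lhs must be positive semidefinite (smallest eigenvalue >= 0 up to
# floating-point tolerance).
print(np.all(np.linalg.eigvalsh(rhs - lhs) >= -1e-8))  # → True
```

The bound holds for every admissible F because rhs − lhs can be written as a Gram matrix plus ε^{-1} E^T (I − F^T F) E, both positive semidefinite.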
where the remaining blocks are the same as in Theorem 2.1, and ∗ denotes the symmetric terms.
Hence, by applying the same procedure as in Theorem 2.1, using Assumption 5 and Lemmas 1.8, 1.9, 1.10, and 1.11, and putting η = max{max_{i∈M} η_i, max_{j∈M} η̃_j}, we obtain from (28) and Definition 2 (the weak infinitesimal operator LV) the required bound, where the corresponding terms are given in Theorem 2.1. The remaining proof is similar to that of Theorem 2.1, and we conclude that the uncertain neural network (28) is globally robustly exponentially stable in the mean square.
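The Lyapunov-based reasoning underlying these theorems can be illustrated on a toy linear system: for dx/dt = Ax with A Hurwitz, the Lyapunov equation A^T P + P A = −Q with Q > 0 has a unique solution P > 0, so V(x) = x^T P x decays along trajectories. The matrices below are illustrative only, not the paper's BAM system matrices:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# A toy Hurwitz matrix (eigenvalues -2 and -3) and a positive definite Q.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)

# solve_continuous_lyapunov solves  a X + X a^T = q;  taking a = A^T and
# q = -Q yields the Lyapunov equation  A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

# P satisfies the Lyapunov equation and is positive definite, so
# V(x) = x^T P x is a valid Lyapunov function for dx/dt = A x.
print(np.allclose(A.T @ P + P @ A, -Q))   # → True
print(np.all(np.linalg.eigvalsh(P) > 0))  # → True
```

The paper's LMI conditions generalize this idea: instead of solving an equation, one searches for Lyapunov–Krasovskii functional terms making a block matrix inequality negative definite.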

Conclusions
In this paper, we have treated the problem of global exponential stability analysis for stochastic impulsive uncertain BAMNNs with leakage delay terms. By employing Lyapunov stability theory and the LMI framework, we have obtained a new sufficient condition justifying the global exponential stability of stochastic impulsive uncertain BAMNNs with two kinds of time-varying delays and leakage delays. The advantage of this paper is that different types of uncertain parameters were introduced into the Lyapunov–Krasovskii functionals, and the exponential stability behavior was studied in this setting. Additionally, two numerical examples have been provided to show the usefulness of the obtained deterministic and uncertain results. To the best of our knowledge, there are no results on the exponential stability analysis of inertial-type BAM neural networks with both time-varying delays using the Wirtinger-based inequality, which might be our future research work.

Notations
R is the set of real numbers; R^n is the n-dimensional Euclidean space; R^{n×n} denotes the set of all n × n real matrices; Z^+ is the set of all positive integers. For any matrix A, A^T is the transpose of A and A^{-1} is the inverse of A; ∗ denotes the symmetric terms in a symmetric matrix. A positive definite matrix A is denoted by A > 0 and a negative definite A by A < 0; λ_min(·) and λ_max(·) denote the minimum and maximum eigenvalues of a real symmetric matrix; I_n denotes the n × n identity matrix; x = (x_1, x_2, . . . , x_n)^T and y = (y_1, y_2, . . . , y_n)^T are column vectors; x^T y = Σ_{i=1}^n x_i y_i and ‖x‖ = (Σ_{i=1}^n x_i^2)^{1/2}; ẋ(t), ẏ(t) denote the derivatives of x(t) and y(t), respectively; C^{2,1}(R^+ × R^n × M; R^+) is the family of all nonnegative functions V(t, u(t), i) on R^+ × R^n × M that are continuously twice differentiable in u and once differentiable in t; (Ω, F, {F_t}_{t≥0}, P) is a complete probability space, where the filtration {F_t}_{t≥0} satisfies the usual conditions.