- Open Access
Global robust exponential synchronization of BAM recurrent FNNs with infinite distributed delays and diffusion terms on time scales
© Zhao; licensee Springer. 2014
- Received: 3 September 2014
- Accepted: 28 November 2014
- Published: 16 December 2014
In this article, the global robust exponential synchronization of reaction-diffusion BAM recurrent fuzzy neural networks (FNNs) with infinite distributed delays on time scales is investigated. By applying a Lyapunov functional and inequality techniques, some sufficient criteria are established that guarantee the global robust exponential synchronization of reaction-diffusion BAM recurrent FNNs with infinite distributed delays on time scales. An example is given to illustrate the effectiveness of the results.
- globally robust exponential synchronization
- reaction-diffusion BAM recurrent FNNs
- infinite distributed delays
- Lyapunov functional
- time scales
The study of artificial neural networks has attracted much attention because of their potential applications in signal processing, image processing, pattern classification, quadratic optimization, associative memory, moving object speed detection, etc. Many models of neural networks have been proposed. One important model is the bidirectional associative memory (BAM) neural network, first introduced by Kosko [1–3]. It is a special class of recurrent neural networks that can store bipolar vector pairs. A BAM neural network is composed of neurons arranged in two layers, the X-layer and the Y-layer. The neurons in one layer are fully interconnected to the neurons in the other layer. Through iterations of forward and backward information flows between the two layers, it performs a two-way associative search for stored bipolar vector pairs and generalizes the single-layer auto-associative Hebbian correlation to a two-layer pattern-matched heteroassociative circuit. Therefore, this class of networks has good application prospects in fields such as pattern recognition, signal and image processing, and artificial intelligence. In general, artificial neural networks exhibit complex dynamical behaviors such as stability, synchronization, periodic or almost periodic solutions, invariant sets and attractors, and so forth; we refer to [5–27] and the references cited therein. The analysis of dynamical behaviors is therefore a necessary step in the practical design of neural networks. Since the BAM model was proposed by Kosko, it has attracted much attention over the past two decades [28–48].
The dynamical behaviors such as uniqueness, global asymptotic stability, exponential stability and invariant sets and attractors of the equilibrium point or periodic solutions were investigated for BAM neural networks with different types of time delays (see [28–44, 48]).
Synchronization has attracted much attention since it was proposed by Carroll et al. [49, 50]. The principle of drive-response synchronization is as follows: the drive system sends a signal through a channel to the response system, which uses this signal to synchronize itself with the drive system. That is, the response system is influenced by the behavior of the drive system, but the drive system is independent of the response system. In recent years, many results on the synchronization of delayed neural networks have been reported in the literature [5, 6, 8–15, 27, 36, 49, 50].
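The drive-response structure can be sketched with a toy scalar model (all parameters below are illustrative, not the paper's reaction-diffusion system): the drive state x evolves independently, while the response state y receives the coupling k(x − y), so the error e = x − y obeys e′ = −(a + k)e and decays exponentially.

```python
# Toy drive-response synchronization sketch (illustrative parameters only,
# not the paper's system). The drive evolves independently; the response
# receives the drive signal through the coupling term k*(x - y).

def simulate(a=1.0, k=3.0, dt=0.01, steps=2000):
    x, y = 1.0, -2.0                        # drive and response initial states
    for _ in range(steps):
        dx = -a * x + 0.5                   # drive: independent of y
        dy = -a * y + 0.5 + k * (x - y)     # response: coupled to the drive
        x += dt * dx
        y += dt * dy
    return abs(x - y)                       # synchronization error |e|

final_error = simulate()                    # error is driven to ~0
```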
As is well known, both in biological and man-made neural networks, strictly speaking, diffusion effects cannot be avoided when electrons are moving in asymmetric electromagnetic fields, so we must consider that the activations vary in space as well as in time. Many researchers have studied the dynamical properties of continuous time reaction-diffusion neural networks (see, for example, [8, 11, 17, 18, 24, 25, 27, 32, 48]).
where ; . is a time scale and is unbounded, and is a constant time delay. and is a bounded compact set with smooth boundary ∂Ω in space . , . and are the states of the ith neuron and the jth neuron at time t and in space x, respectively. and are constant input vectors. The smooth functions and correspond to the transmission diffusion operators of the ith neuron and the jth neuron, respectively. , , , , , , , , , , , , , , , , , , are constants. and denote the rates with which the ith neuron and the jth neuron reset their potentials to the resting state in isolation when disconnected from the network and external inputs, respectively. , , , , , , , , , , , , , , denote the connection weights. () and () denote the activation function of the jth neuron of the Y-layer on the ith neuron of the X-layer and of the ith neuron of the X-layer on the jth neuron of the Y-layer at time t and in space x, respectively. () denotes the fuzzy activation function of the jth neuron on the ith neuron inside the X-layer. () denotes the fuzzy activation function of the ith neuron on the jth neuron inside the Y-layer. () denotes the bias of the jth neuron on the ith neuron inside the X-layer. () denotes the bias of the ith neuron on the jth neuron inside the Y-layer. ⋀ and ⋁ denote the fuzzy AND and fuzzy OR operations, respectively. , are rd-continuous with respect to and continuous with respect to .
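In FNN models of this kind, the fuzzy AND ⋀ and fuzzy OR ⋁ are typically realized as the minimum and maximum of the weighted activations, as in fuzzy cellular neural networks. A minimal sketch, with illustrative weights and activation values (not the paper's coefficients):

```python
# Sketch of the fuzzy AND (min) and fuzzy OR (max) terms that appear in
# fuzzy neural network models of this type. Weights and activations below
# are illustrative placeholders, not the paper's coefficients.

def fuzzy_and(weights, activations):
    """Fuzzy AND term: minimum of the weighted activations."""
    return min(w * a for w, a in zip(weights, activations))

def fuzzy_or(weights, activations):
    """Fuzzy OR term: maximum of the weighted activations."""
    return max(w * a for w, a in zip(weights, activations))

weights = [0.5, -1.0, 2.0]
acts = [0.2, 0.4, 0.1]
and_term = fuzzy_and(weights, acts)  # min(0.1, -0.4, 0.2) = -0.4
or_term = fuzzy_or(weights, acts)    # max(0.1, -0.4, 0.2) = 0.2
```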
In order to investigate the global robust exponential synchronization for system (1.1)-(1.3), the quantities , , , , , , , , , , , and may be considered as intervals as follows: , , , , , , , , , , , , , .
where , τ is a positive integer, , , .
If we choose $\mathbb{T}=\mathbb{R}$, then $\sigma(t)=t$ and $\mu(t)=0$. In this case, system (1.1)-(1.3) is the continuous reaction-diffusion BAM recurrent FNNs (1.4)-(1.6). If $\mathbb{T}=\mathbb{Z}$, then $\sigma(t)=t+1$ and $\mu(t)=1$, and system (1.1)-(1.3) is the discrete difference reaction-diffusion BAM recurrent FNNs (1.7)-(1.9). In this paper, we study the global robust exponential synchronization of reaction-diffusion BAM recurrent FNNs (1.1)-(1.3), which unify the continuous case and the discrete difference case. Moreover, system (1.1)-(1.3) is a good model for handling many problems such as predator-prey forecasting or the optimization of goods output.
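The unification of the two cases can be seen directly from the delta derivative at a right-scattered point, $f^{\Delta}(t)=(f(t+\mu(t))-f(t))/\mu(t)$: with $\mu=1$ this is the forward difference, and as $\mu\to 0$ it tends to the ordinary derivative. A quick numerical sketch with $f(t)=t^{2}$:

```python
# Numerical sketch of the delta derivative of f(t) = t^2 with graininess
# mu: f^Delta(t) = (f(t + mu) - f(t)) / mu. With mu = 1 (T = Z) this is
# the forward difference 2t + 1; as mu -> 0 (T = R) it tends to the
# ordinary derivative 2t.

def delta_derivative(f, t, mu):
    return (f(t + mu) - f(t)) / mu

f = lambda t: t * t
discrete = delta_derivative(f, 3.0, mu=1.0)     # (16 - 9) / 1 = 7 = 2t + 1
continuous = delta_derivative(f, 3.0, mu=1e-6)  # approximately 6 = 2t
```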
The rest of this paper is organized as follows. In Section 2, some notation and basic theorems and lemmas on time scales are given. In Section 3, the main results on global robust exponential synchronization are obtained by constructing an appropriate Lyapunov functional and applying inequality techniques. In Section 4, an example is given to illustrate the effectiveness of our results.
In this section, we first recall some basic definitions and lemmas on time scales which are used in what follows.
A point $t\in\mathbb{T}$ is called left-dense if $t>\inf\mathbb{T}$ and $\rho(t)=t$, left-scattered if $\rho(t)<t$, right-dense if $t<\sup\mathbb{T}$ and $\sigma(t)=t$, and right-scattered if $\sigma(t)>t$. If $\mathbb{T}$ has a left-scattered maximum $m$, then $\mathbb{T}^{k}=\mathbb{T}\setminus\{m\}$; otherwise $\mathbb{T}^{k}=\mathbb{T}$. If $\mathbb{T}$ has a right-scattered minimum $m$, then $\mathbb{T}_{k}=\mathbb{T}\setminus\{m\}$; otherwise $\mathbb{T}_{k}=\mathbb{T}$.
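The jump operators and graininess behind these notions can be illustrated on a small finite time scale (the set below is an illustrative choice, not from the paper):

```python
# Jump operators and graininess on a small finite time scale. The set T
# below is an illustrative example only.

T = [0.0, 1.0, 1.5, 3.0]

def sigma(t):
    """Forward jump operator: inf{s in T : s > t}; sigma(max T) = max T."""
    later = [s for s in T if s > t]
    return min(later) if later else t

def rho(t):
    """Backward jump operator: sup{s in T : s < t}; rho(min T) = min T."""
    earlier = [s for s in T if s < t]
    return max(earlier) if earlier else t

def mu(t):
    """Graininess function mu(t) = sigma(t) - t."""
    return sigma(t) - t

# 1.0 is right-scattered (sigma(1.0) = 1.5 > 1.0) and left-scattered
# (rho(1.0) = 0.0 < 1.0); the maximum 3.0 satisfies sigma(3.0) = 3.0.
```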
Definition 2.1 ()
A function $f:\mathbb{T}\to\mathbb{R}$ is called regulated provided its right-sided limits exist (finite) at all right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at all left-dense points in $\mathbb{T}$.
Definition 2.2 ()
A function $f:\mathbb{T}\to\mathbb{R}$ is called rd-continuous provided it is continuous at right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at left-dense points in $\mathbb{T}$. The set of rd-continuous functions $f:\mathbb{T}\to\mathbb{R}$ will be denoted by $C_{rd}=C_{rd}(\mathbb{T},\mathbb{R})$.
Definition 2.3 ()
Assume $f:\mathbb{T}\to\mathbb{R}$ and let $t\in\mathbb{T}^{k}$. Then $f^{\Delta}(t)$ is defined to be the number (provided it exists) with the property that, for any $\varepsilon>0$, there exists a neighborhood $U$ of $t$ such that
$$\bigl|f(\sigma(t))-f(s)-f^{\Delta}(t)\,(\sigma(t)-s)\bigr|\le\varepsilon\,|\sigma(t)-s|$$
for all $s\in U$. We call $f^{\Delta}(t)$ the delta (or Hilger) derivative of $f$ at $t$. The set of functions $f:\mathbb{T}\to\mathbb{R}$ that are delta differentiable and whose delta derivative is rd-continuous is denoted by $C_{rd}^{1}(\mathbb{T},\mathbb{R})$.
If f is continuous, then f is rd-continuous. If f is rd-continuous, then f is regulated. If f is delta differentiable at t, then f is continuous at t.
Lemma 2.1 ()
Let $f$ be regulated. Then there exists a function $F$ which is delta differentiable with region of differentiation $D$ such that $F^{\Delta}(t)=f(t)$ for all $t\in D$.
Definition 2.4 ()
Assume $f:\mathbb{T}\to\mathbb{R}$ is a regulated function. Any function $F$ as in Lemma 2.1 is called a Δ-antiderivative of $f$, and we write $\int f(t)\,\Delta t=F(t)+C$, where $C$ is an arbitrary constant and $F$ is a Δ-antiderivative of $f$. We define the Cauchy integral by $\int_{r}^{s}f(t)\,\Delta t=F(s)-F(r)$ for all $r,s\in\mathbb{T}$.
A function $F:\mathbb{T}\to\mathbb{R}$ is called an antiderivative of $f:\mathbb{T}\to\mathbb{R}$ provided $F^{\Delta}(t)=f(t)$ for all $t\in\mathbb{T}^{k}$.
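On the discrete time scale $\mathbb{T}=\mathbb{Z}$ the Cauchy integral reduces to a finite sum, $\int_{a}^{b}f(t)\,\Delta t=\sum_{t=a}^{b-1}f(t)$. A small sketch (the choice $f(t)=t$ is illustrative):

```python
# On T = Z the Cauchy (delta) integral is a finite sum. With f(t) = t and
# the Delta-antiderivative F(t) = t(t-1)/2, the fundamental relation
# int_a^b f(t) Delta t = F(b) - F(a) can be checked directly.

def delta_integral_Z(f, a, b):
    """Delta integral of f over [a, b) on the time scale Z."""
    return sum(f(t) for t in range(a, b))

F = lambda t: t * (t - 1) // 2               # Delta-antiderivative of f(t) = t
total = delta_integral_Z(lambda t: t, 0, 5)  # 0 + 1 + 2 + 3 + 4 = 10
check = F(5) - F(0)                          # also 10
```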
Lemma 2.2 ()
if for all , then ,
if on , then .
If , then .
The generalized exponential function has the following properties.
Lemma 2.3 ()
for all ;
Lemma 2.4 ()
Lemma 2.5 ()
where . If $t$ is right-scattered and $f$ is continuous at $t$, this reduces to $f^{\Delta}(t)=\dfrac{f(\sigma(t))-f(t)}{\mu(t)}$.
Next, we introduce the Banach space which is suitable for system (1.1)-(1.3).
Let be an open bounded domain in with smooth boundary ∂Ω. Let be the set consisting of all vector functions that are rd-continuous with respect to and continuous with respect to . For every and , we define the set . Then is a Banach space with the norm , where . Let consist of all functions which map into and are rd-continuous with respect to and continuous with respect to . For every and , we define the set . Then is a Banach space equipped with the norm , where , , .
where () and () are error functions. () is a constant error weighting coefficient. , , , .
The following definition is significant to study the global robust exponential synchronization of coupled neural networks (1.1)-(1.3) and (2.1)-(2.3).
where α is called the degree of exponential synchronization on time scales.
In this section, we consider the global robust exponential synchronization of the coupled systems (1.1)-(1.3) and (2.1)-(2.3). First, we introduce some useful lemmas.
Lemma 3.1 ()
Lemma 3.2 ()
Throughout this paper, we always assume that:
(H1) The neuron activations , , and are Lipschitz continuous; that is, there exist positive constants , , and such that , , , for any , ; .
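As a concrete illustration of (H1) (not from the paper), the common activation tanh is Lipschitz with constant $L=1$, since $|\tanh'(u)|\le 1$. A quick numerical spot check:

```python
import math

# Spot check (illustrative) that tanh satisfies the Lipschitz condition of
# (H1) with constant L = 1: |tanh(u) - tanh(v)| <= L * |u - v| for all u, v.

def lipschitz_ok(f, L, pairs):
    """Check the Lipschitz bound on a sample of (u, v) pairs."""
    return all(abs(f(u) - f(v)) <= L * abs(u - v) + 1e-12 for u, v in pairs)

pairs = [(-2.0, 1.5), (0.3, 0.31), (5.0, -5.0), (0.0, 0.0)]
ok = lipschitz_ok(math.tanh, 1.0, pairs)  # True for all sampled pairs
```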
Theorem 3.1 Assume that (H1)-(H3) hold. Then the controlled slave system (2.1)-(2.3) is globally robustly exponentially synchronous with the master system (1.1)-(1.3).
where , , .
where , , .