Bochner definition of Stepanov-like almost automorphic functions on time scales and an application to cellular neural networks with delays

The definition of Stepanov-like almost automorphic functions on time scales has been proposed in the literature, but at least one result involving the Bochner transform was incorrect. In this work, we give a Bochner definition of Stepanov-like almost automorphic functions on time scales and prove that a function is Stepanov-like almost automorphic if and only if it satisfies this Bochner definition. The Bochner definition corrects the faulty result and completes the theory of Stepanov-like almost automorphic functions on time scales. As applications, we discuss the almost automorphy of a certain dynamic equation and of some cellular neural networks with delays on time scales.


Introduction
In 1988, S. Hilger first introduced the theory of time scales in his PhD thesis (see [1]), which unifies the continuous and discrete settings. The field of differential equations on time scales thus extends both classical differential equations and difference equations on R. Moreover, this theory provides important tools in economics, population models, quantum physics, and other areas, and many researchers have taken an interest in it (see [2][3][4][5]). Dynamic systems and cellular neural networks are two important models in applications, and in recent years the investigation of these two systems on time scales has attracted the attention of many mathematicians.
The theory of almost periodicity on time scales dates back to 2011. The authors of [6,7] first extended the classical almost periodic functions on R to time scales and applied them to investigate the almost periodicity of some high-order Hopfield neural networks on time scales. Thereafter, the theory of almost periodicity on time scales attracted the attention of many scientists. Lizama, Mesquita, and Ponce [8] gave a related concept of almost periodicity on time scales and constructed an almost periodic function on R as well as on time scales; these results enriched the theory. Following the development of the theory on R, generalizations of the definition of an almost periodic function on time scales are also interesting topics. Li and Wang defined pseudo almost periodicity in [9], and then Li and Zhao investigated weighted pseudo almost periodic functions [10]. Meanwhile, Lizama and Mesquita [11] defined almost automorphy on time scales. Furthermore, Stepanov-like definitions were also suggested: Tang and Li [12,13] gave the concepts of Stepanov-like almost periodic and Stepanov-like pseudo almost periodic functions on time scales, and Dhama and Abbas [14] introduced weighted Stepanov-like pseudo almost automorphic functions on time scales. In the definition of Stepanov-like functions on R, the Bochner transform is an important tool, but it is not available on time scales. To solve this problem, Tang and Li [12] proposed a Bochner-like transform on time scales. At least one result in [14] involving the Bochner definition was not true, so it is important to give the Bochner definition of Stepanov-like almost automorphic functions on time scales.
The first contribution of our work is the introduction of the Bochner definition of Stepanov-like almost automorphic functions on time scales via a Bochner-like transform; this definition is equivalent to the earlier definition of Stepanov-like almost automorphic functions on time scales given in [14], and it corrects the faulty result in [14].

Secondly, as an application, we study the almost automorphy of the following dynamic equation:

u^Δ(s) = A(s)u(s) + ϕ(s), s ∈ T, (1.1)

with A(s) almost automorphic and ϕ(s) Stepanov-like almost automorphic on T. The almost automorphy [11] and pseudo almost periodicity [13] of equation (1.1) have been studied with ϕ(s) almost automorphic and pseudo almost periodic, respectively, so our results extend these earlier investigations.

Finally, we show a result on the almost automorphy of the following system:

u_l^Δ(s) = -a_l(s)u_l(s) + Σ_{m=1}^n b_lm(s)F_m(u_m(s)) + Σ_{m=1}^n c_lm(s)F_m(u_m(s - γ_lm)) + I_l(s),

for s ∈ T, l = 1, 2, . . . , n. This system is called a cellular neural network, where u_l(s) denotes the activation of the lth neuron at time s, a_l(s) corresponds to the rate at which cell l resets its potential to the resting state when isolated from the other cells and inputs at time s, γ_lm denotes the transmission delay, I_l(s) and F_m correspond to the external input and the activation function, respectively, and b_lm(s) and c_lm(s) are the connection weights at time s.

Our work is divided into four parts. In the second section, we present preliminaries, including the definitions and properties of Stepanov-like almost automorphic functions on time scales. The third section is devoted to establishing the almost automorphy result for a linear dynamic equation on time scales. In the last section, we show the existence of an almost automorphic solution for some cellular neural networks with delays, and present an example.

Definitions and properties
In the sequel, we set Z = {s : s is an integer}, R = {s : s is a real number}, C = {s : s is a complex number}, and R^+ = {s : s is a real number and s ≥ 0}.
The Euclidean space R^n is equipped with the Euclidean norm | · |. Let X be a Banach space with norm ‖ · ‖. We now recall some definitions about time scales, taken from [2,4]. If T is a nonempty closed subset of R, we say it is a time scale. For s ∈ T, the forward jump operator σ : T → T and the backward jump operator ρ : T → T are defined by σ(s) = inf{τ ∈ T : τ > s} and ρ(s) = sup{τ ∈ T : τ < s}. If ρ(s) = s, we call s left-dense; otherwise, we call it left-scattered. Similarly, if σ(s) = s, then s ∈ T is called right-dense; otherwise, s ∈ T is said to be right-scattered. The graininess function μ : T → R^+ is defined by μ(s) = σ(s) - s. Let a_1, a_2 ∈ T with a_1 ≤ a_2; we write [a_1, a_2]_T = {s ∈ T : a_1 ≤ s ≤ a_2}, while [a_1, a_2] denotes the usual interval in R.
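The jump operators and graininess are easy to experiment with numerically. The following Python sketch (not part of the paper; the finite set T below is an arbitrary illustration) implements σ, ρ, and μ on a finite time scale directly from the definitions above.

```python
def sigma(T, s):
    """Forward jump operator: sigma(s) = inf{t in T : t > s} (s itself if none)."""
    later = [t for t in T if t > s]
    return min(later) if later else s

def rho(T, s):
    """Backward jump operator: rho(s) = sup{t in T : t < s} (s itself if none)."""
    earlier = [t for t in T if t < s]
    return max(earlier) if earlier else s

def mu(T, s):
    """Graininess function: mu(s) = sigma(s) - s."""
    return sigma(T, s) - s

# An illustrative finite time scale with uneven gaps.
T = [0.0, 0.5, 1.0, 3.0, 4.0]

# 1.0 is right-scattered here: sigma(1.0) = 3.0, so mu(1.0) = 2.0.
```

In a finite set every interior point is both left- and right-scattered; dense points only arise for time scales containing genuine intervals of R.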

Definition 2.2 ([2])
Let ϕ : T → X and s ∈ T^k. If there exists ϕ^Δ(s) ∈ X such that, for any positive constant ε, we can find a neighborhood U_s of s satisfying

‖[ϕ(σ(s)) - ϕ(t)] - ϕ^Δ(s)[σ(s) - t]‖ ≤ ε|σ(s) - t| for all t ∈ U_s,

then we call ϕ^Δ(s) the delta derivative of ϕ at s. The definition of integration on time scales is summarized in [12], and the authors of [3] showed that the integral ∫_{[a_1,a_2]_T} ϕ(s) Δs also fits into the Lebesgue integration theory on T. In the following, we always suppose that T is invariant under translation, that is, the translation set Π = {τ ∈ R : s ± τ ∈ T for all s ∈ T} satisfies Π ≠ {0}; its properties are given in [12]. Set

L := inf{|τ| : τ ∈ Π, τ ≠ 0}, (2.1)

and, in the following, work under the assumption that p ≥ 1. Define the norm ‖ · ‖_{S^p} over translation windows of length L, with L defined in (2.1). If ‖ϕ‖_{S^p} < ∞, we call the function ϕ S^p-bounded, and denote the space of all such functions by BS^p(T, X).
Remark 2.1 In [12], the authors established several results about the space BS^p(T, X); we refer the reader there for details.

Definition 2.5 ([14])
A function ϕ ∈ BS^p(T, X) is said to be Stepanov-like almost automorphic if, for every sequence {α_n} ⊂ Π, we can extract a subsequence {α′_n} and find a function φ̃ ∈ L^p_loc(T, X) such that, for each s ∈ T,

( ∫_{[s,s+L]_T} ‖ϕ(τ + α′_n) - φ̃(τ)‖^p Δτ )^{1/p} → 0 and ( ∫_{[s,s+L]_T} ‖φ̃(τ - α′_n) - ϕ(τ)‖^p Δτ )^{1/p} → 0

as n → ∞. In [14], the authors claimed a conclusion, hereafter (A), asserting that ϕ ∈ S^pAA(T, X) if and only if the associated transform φ̃ belongs to AA(T, L^p([0, L], X)). But this result is not true, and the following example shows it.
We can check that ϕ ∈ S^pAA(T, R), but φ̃ ∉ AA(T, L^p([0, L], R)); that is, conclusion (A) fails for this ϕ.
It is therefore important to introduce a correct Bochner definition of Stepanov-like almost automorphic functions on time scales. In the following, we give such a definition by applying the Bochner-like transform introduced in [12].
If T ≠ R, let ω be a left-scattered point in T. For any fixed s ∈ T, it is not difficult to verify that there exists a unique integer n_s such that s - n_s L ∈ [ω, ω + L)_T.
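For a concrete discrete time scale such as T = hZ (where L = h and every point is left-scattered), the unique integer n_s with s - n_s L ∈ [ω, ω + L) is simply floor((s - ω)/L). A small sketch (the scale T = 0.5·Z and the sample points are illustrative assumptions):

```python
import math

def n_s(s, omega, L):
    """Unique integer n with s - n*L in [omega, omega + L)."""
    return math.floor((s - omega) / L)

# T = 0.5 * Z, so L = 0.5; take omega = 0.0 as the reference point.
L, omega = 0.5, 0.0
for s in [-1.5, -0.5, 0.0, 2.0, 7.5]:
    n = n_s(s, omega, L)
    assert omega <= s - n * L < omega + L  # membership in [omega, omega + L)
```

Uniqueness is immediate: two distinct integers n would put s - nL in disjoint half-open windows.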
We call ϕ^c the Bochner-like transform of ϕ.
By using the Bochner-like transform, we can present the following Bochner definition.

Definition 2.7 Let ϕ ∈ BS^p(T, X). If the Bochner-like transform ϕ^c ∈ AA(T, BS^p(T, X)), we say that ϕ is Stepanov-like almost automorphic on T in the sense of the Bochner definition. Denote the set of all such functions by S^pAA(T, X).
Proof If T = R, we easily get n_{s+α}L = s + α = n_sL + α, and the conclusion is true. If T ≠ R, then (s + α) - (n_sL + α) = s - n_sL ∈ [ω, ω + L)_T. Since n_{s+α} is unique, we have n_{s+α} = n_s + α/L; thus n_{s+α}L = n_sL + α, and this completes the proof.
Theorem 2.1 For ϕ ∈ BS^p(T, X), the following statements are equivalent:
(i) ϕ is Stepanov-like almost automorphic in the sense of Definition 2.5;
(ii) ϕ is Stepanov-like almost automorphic in the sense of the Bochner definition (Definition 2.7), i.e., ϕ^c ∈ AA(T, BS^p(T, X)).

Proof For T = R, let ϕ, φ̃ ∈ BS^p(T, X) and consider a sequence {α_n} ⊂ Π. From Lemma 2.2, for each s ∈ T, the convergences required in (i) and in (ii) can be translated into one another, so the two statements are equivalent for T = R. The case T ≠ R follows by the same argument, and this completes the proof.
Remark 2.2 (i) Clearly, AA(T, X) ⊂ S^pAA(T, X).
(ii) By the above Theorem 2.1, we easily get the following result: ϕ ∈ S^pAA(T, X) if and only if, for each sequence {α_n} in Π, there exist a subsequence {α′_n} ⊂ {α_n} and φ̃ ∈ BS^p(T, X) such that ϕ^c(s + α′_n) → φ̃^c(s) and φ̃^c(s - α′_n) → ϕ^c(s) in BS^p(T, X) as n → ∞, pointwise on T. Thus the Bochner definition of Stepanov-like almost automorphic functions is equivalent to Definition 2.12 in [14].
(iii) If the convergence "pointwise" in (ii) is replaced by "uniform", we obtain the Stepanov-like almost periodic functions on T; thus S^pAP(T, X) ⊂ S^pAA(T, X).

Remark 2.3 ϕ ∈ S^pAA(Z, X) if and only if ϕ ∈ AA(Z, X). Indeed, letting T = Z, we easily get Π = Z, L = 1, and μ(s) = 1 for s ∈ Z. If ϕ ∈ S^pAA(Z, X), then for any {α_n} in Π there exist a subsequence {α′_n} and a function φ̃ such that the limit relations of Definition 2.5 hold as n → ∞ for every s ∈ Z. Then ϕ(s + α′_n) - φ̃(s) → 0 as n → ∞ for every s ∈ Z, and similarly φ̃(s - α′_n) - ϕ(s) → 0. That is, ϕ is an almost automorphic sequence on Z.
From Definition 2.7, the space S^pAA(T, X) directly inherits the following important properties from AA(T, BS^p(T, X)). Now we give the composition theorem on T. In the proof, we will apply the following assumption:

(H) The operator T(ϕ(ν(·)))(s) is continuous on the set of S^p-bounded functions ν : T → R at each s ∈ T; that is, for any positive constant ε, we can find a positive constant δ such that if T(ν_1 - ν_2)(s) < δ, then T(ϕ(ν_1(·)) - ϕ(ν_2(·)))(s) < ε for each s ∈ T.
Proof Since ν ∈ S^pAA(T, R), we know that for each {α_n} ⊂ Π there exist a subsequence {α′_n} ⊂ {α_n} and ν̃ ∈ BS^p(T, R) satisfying (2.2), the limit relations of Definition 2.5, for every s ∈ T. From assumption (H) and (2.2), we have ϕ(ν(s + α′_n)) → ϕ(ν̃(s)) pointwise on T. By a similar argument, we get ϕ(ν̃(s - α′_n)) → ϕ(ν(s)) pointwise on T. Therefore ϕ(ν(s - γ)) ∈ S^pAA(T, R), and the proof is complete.
Remark 2.4 Theorem 3.19 in [11] gives a result on the composition of almost automorphic functions with continuous functions. Our result extends Theorem 3.19 to Stepanov-like almost automorphic functions by replacing continuity with respect to the sup norm by continuity with respect to the S^p norm.

The almost automorphy for a dynamic system on T
In this section, the concept of the exponential function on time scales is given first. We say that k : T → R is regressive if 1 + μ(s)k(s) ≠ 0 for s ∈ T^k. Denote

R = R(T) = R(T, R) = {k ∈ C_rd(T, R) : 1 + μ(s)k(s) ≠ 0 for all s ∈ T^k}.

Suppose that k, l ∈ R and define k ⊕ l and ⊖k by

(k ⊕ l)(s) = k(s) + l(s) + μ(s)k(s)l(s), (⊖k)(s) = -k(s) / (1 + μ(s)k(s)),

for s ∈ T^k. One can easily check that the set R is an Abelian group under the operation ⊕.
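At a fixed point s these "circle" operations are easy to check numerically; the following sketch (the values of μ(s), k, and l are arbitrary illustrations, not from the paper) verifies that 0 is the identity and ⊖k is the inverse of k under ⊕.

```python
def oplus(k, l, mu):
    """(k ⊕ l)(s) = k(s) + l(s) + mu(s) k(s) l(s), evaluated pointwise."""
    return k + l + mu * k * l

def ominus(k, mu):
    """(⊖k)(s) = -k(s) / (1 + mu(s) k(s)), evaluated pointwise."""
    return -k / (1.0 + mu * k)

mu_s = 1.0       # e.g. T = Z has graininess mu(s) = 1
k, l = 0.3, -0.2

assert oplus(k, 0.0, mu_s) == k                      # 0 is the identity
assert abs(oplus(k, ominus(k, mu_s), mu_s)) < 1e-12  # ⊖k inverts k
assert oplus(k, l, mu_s) == oplus(l, k, mu_s)        # commutativity
```

The inverse identity follows algebraically: k ⊕ (⊖k) = k - k(1 + μk)/(1 + μk) = 0, which is what the second assertion checks up to rounding.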
For k ∈ R, the exponential function is defined by

e_k(s, t) = exp( ∫_{[t,s]_T} ξ_{μ(τ)}(k(τ)) Δτ ), where ξ_h(z) = Log(1 + hz)/h for h ≠ 0 and ξ_0(z) = z,

and Log means the principal logarithm. We say that the homogeneous equation

u^Δ(s) = A(s)u(s) (3.2)

admits an exponential dichotomy on the time scale T [3] if there exist a projection P and positive constants ω, K such that

‖e_A(s, σ(τ))P‖ ≤ K e_{⊖ω}(s, σ(τ)) for s ≥ σ(τ), ‖e_A(s, σ(τ))(I - P)‖ ≤ K e_{⊖ω}(σ(τ), s) for s ≤ σ(τ),

for s ∈ T.
Proof By a similar argument as in the proof of Lemma 4.7 in [13], it is not difficult to show that the function in (3.4) is the unique bounded solution for (3.1). Now we proceed in two steps to prove it is almost automorphic.
Step 1. We show a property of e_A(s, σ(τ)). From the definition of an almost automorphic function on T, we know that for any {α_n} in Π we can find a subsequence {α′_n} ⊂ {α_n} and functions Ã, φ̃ satisfying the corresponding limit relations pointwise on T as n → ∞. From the variation of constants formula, by (3.5) and [10, Theorem 2.1], for every ε > 0 we can find N_1 ∈ Z^+ such that, when n > N_1, the corresponding estimate for e_A(s + α′_n, σ(τ + α′_n)) - e_Ã(s, σ(τ)) holds for s ∈ T, since on the interval [σ(τ), s]_T the functions A(r), Ã(r), and 1 + ωμ(r) are all bounded.
Step 2. We prove that u is almost automorphic. By (3.6), for the ε given above, there is N ∈ Z^+ with N > N_1 such that, when n > N, the estimate (3.7) holds for every s ∈ T. By the Hölder inequality, (3.3), and Proposition 3.1, for n > N and s ∈ T we obtain the bound (3.8), where 1/p + 1/q = 1. By (3.7) and Proposition 3.1, we get the bound (3.9) for n > N and s ∈ T, where again 1/p + 1/q = 1. By (3.8) and (3.9), the quantities K_1 and K_2 are convergent, and hence, for every point s in T, lim_{n→∞} |u(s + α′_n) - ũ(s)| = 0. By a similar argument, we can show that lim_{n→∞} |ũ(s - α′_n) - u(s)| = 0 for every s ∈ T.
Therefore, the solution u ∈ AA(T, R), and this completes the proof.

(ii) Assume T = Z; then equation (3.1) is the difference equation u(n + 1) - u(n) = A(n)u(n) + ϕ(n). From Remark 2.3, we know that a Stepanov-like almost automorphic function on Z coincides with an almost automorphic sequence, so Theorem 3.1 reduces to Theorem 3.8 in [15]. If T = R, condition (3.3) clearly implies the exponential dichotomy condition on R, and Theorem 3.1 generalizes Theorem 3.2 in [15]. That is, our Theorem 3.1 unifies and generalizes the two previous results.
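On T = Z the equation u(n + 1) - u(n) = A(n)u(n) + ϕ(n) can be iterated directly as u(n + 1) = (1 + A(n))u(n) + ϕ(n). The constant data below (A ≡ -1/2, ϕ ≡ 1) are an illustrative assumption, not taken from the paper; with them the iteration contracts toward the fixed point u* = 2.

```python
def step(u, A_n, phi_n):
    """One step of u(n+1) - u(n) = A(n) u(n) + phi(n), i.e. the case T = Z."""
    return (1.0 + A_n) * u + phi_n

u = 0.0
for _ in range(60):
    u = step(u, -0.5, 1.0)   # iteration u -> 0.5 u + 1

# The fixed point of u = 0.5 u + 1 is u* = 2.
assert abs(u - 2.0) < 1e-9
```

Here |1 + A| = 1/2 < 1 plays the role of the dichotomy/stability condition: it forces a unique bounded solution to which all iterates converge.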

The almost automorphy for cellular neural networks on T
In this part, we show a result on almost automorphy for the following system:

u_l^Δ(s) = -a_l(s)u_l(s) + Σ_{m=1}^n b_lm(s)F_m(u_m(s)) + Σ_{m=1}^n c_lm(s)F_m(u_m(s - γ_lm)) + I_l(s), (4.1)

for s ∈ T, l = 1, 2, . . . , n, under the following assumption:

(H_2) For every l = 1, 2, . . . , n, we have a_l > 0 and -a_l(s) ∈ R^+ for s ∈ T.
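To get a feel for system (4.1), one can specialize to T = Z, where u_l^Δ(s) = u_l(s + 1) - u_l(s). The sketch below simulates such a delayed network; all numerical data (weights, decay rates, inputs, delay, and the tanh activation) are illustrative assumptions chosen so that the decay terms dominate, in the spirit of conditions (H_2)-(H_3), not values from the paper.

```python
import math

def cnn_step(hist, a, b, c, I, gamma, F=math.tanh):
    """One T = Z step of the delayed cellular neural network (4.1).
    hist is a list of past state vectors [..., u(s-1), u(s)], len(hist) > gamma."""
    u_now = hist[-1]
    u_del = hist[-1 - gamma]        # delayed state u(s - gamma)
    n = len(u_now)
    nxt = []
    for l in range(n):
        rate = -a[l] * u_now[l] + I[l]
        for m in range(n):
            rate += b[l][m] * F(u_now[m]) + c[l][m] * F(u_del[m])
        nxt.append(u_now[l] + rate)  # u(s+1) = u(s) + u^Delta(s)
    return nxt

# Two neurons, small weights so the decay terms dominate.
a = [0.5, 0.5]
b = [[0.05, 0.02], [0.01, 0.05]]
c = [[0.03, 0.01], [0.02, 0.03]]
I = [0.1, -0.1]
gamma = 2

hist = [[0.0, 0.0]] * (gamma + 1)   # constant initial history
for _ in range(200):
    hist.append(cnn_step(hist, a, b, c, I, gamma))

# With dominant decay the trajectory stays bounded.
assert all(abs(x) < 10 for x in hist[-1])
```

With a uniform delay one could take a single γ, but the per-pair delays γ_lm of (4.1) would only require indexing hist at different depths per connection.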
where λ_{a_l} is given in Proposition 3.1. By a similar argument as in the proof of [6, Lemma 2.15], we can get the following result, and we omit the detailed proof here.

Proof Let E = {ψ ∈ S^pAA(T, R^n) : ‖ψ‖_{S^p} ≤ r_0}, where

r_0 = max_{1≤l≤n} { λ_{a_l} ( Σ_{m=1}^n C_m(‖b_lm‖ + ‖c_lm‖) + ‖I_l‖_{S^p} ) }.

For any given ψ = (ψ_1, ψ_2, . . . , ψ_n)^T ∈ E, we investigate the linear system

u_l^Δ(s) = -a_l(s)u_l(s) + Σ_{m=1}^n b_lm(s)F_m(ψ_m(s)) + Σ_{m=1}^n c_lm(s)F_m(ψ_m(s - γ_lm)) + I_l(s).

Let ψ, χ ∈ E. Then, by (H_1), we can estimate the distance between the corresponding solutions, and by (H_3) we conclude that the solution operator Φ : E → E is a contraction mapping. Thus Φ has a unique fixed point u*(s) in E. Therefore, system (4.1) admits a unique almost automorphic solution, and this completes the proof.