

Logarithmic density and logarithmic statistical convergence

Abstract

In this paper we use the idea of logarithmic density to define the concept of logarithmic statistical convergence. We find the relations of logarithmic statistical convergence with statistical convergence, with statistical summability (H,1) introduced by Móricz (Analysis 24:127-145, 2004), and with $[H,1]_q$-summability. We also give a subsequence characterization of statistical summability (H,1).

MSC:40A05, 40A30.

1 Introduction and preliminaries

The concept of statistical summability (H,1), which is a generalization of statistical convergence due to Fast [1], has recently been introduced by Móricz [2]. In this paper we use the idea of logarithmic density to define the concept of logarithmic statistical convergence. We find its relation with statistical convergence and statistical summability (H,1). We further define $[H,1]_q$-summability and establish some inclusion relations.

Definition 1.1 Let $\mathbb{N}$ be the set of all natural numbers and let $\chi_E$ denote the characteristic function of $E\subseteq\mathbb{N}$. Put $d_n(E)=\frac{1}{n}\sum_{k=1}^{n}\chi_E(k)$ and $\delta_n(E)=\frac{1}{l_n}\sum_{k=1}^{n}\frac{\chi_E(k)}{k}$ for $n\in\mathbb{N}$, where $l_n=\sum_{k=1}^{n}1/k$ ($n=1,2,3,\ldots$). The numbers $\underline{d}(E)=\liminf_{n\to\infty}d_n(E)$ and $\overline{d}(E)=\limsup_{n\to\infty}d_n(E)$ are called the lower and upper asymptotic density of $E$, respectively. Similarly, the numbers $\underline{\delta}(E)=\liminf_{n\to\infty}\delta_n(E)$ and $\overline{\delta}(E)=\limsup_{n\to\infty}\delta_n(E)$ are called the lower and upper logarithmic density of $E$, respectively. If $\underline{d}(E)=\overline{d}(E)=d(E)$, then $d(E)$ is called the asymptotic density of $E$; respectively, if $\underline{\delta}(E)=\overline{\delta}(E)=\delta_{\ln}(E)$, then $\delta_{\ln}(E)$ is called the logarithmic density of $E$.

Note that if each weight $1/k$ is replaced by $1$, then $l_n=n$ and hence $\delta_{\ln}(E)$ reduces to $d(E)$.
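As a small numerical companion to Definition 1.1, the sketch below (an illustration of ours; the set of perfect squares and all helper names are arbitrary choices, not anything from the text) computes $d_n(E)$ and $\delta_n(E)$ for a few values of $n$.

```python
# Illustrative only: estimate the asymptotic and logarithmic densities of
# E = {perfect squares} by computing d_n(E) and delta_n(E) for a few n.
from math import isqrt

def chi_E(k):
    """Characteristic function of the example set E (perfect squares)."""
    return 1 if isqrt(k) ** 2 == k else 0

def d_n(n):
    """d_n(E) = (1/n) * sum_{k<=n} chi_E(k)."""
    return sum(chi_E(k) for k in range(1, n + 1)) / n

def delta_n(n):
    """delta_n(E) = (1/l_n) * sum_{k<=n} chi_E(k)/k, where l_n = sum_{k<=n} 1/k."""
    l_n = sum(1.0 / k for k in range(1, n + 1))
    return sum(chi_E(k) / k for k in range(1, n + 1)) / l_n

for n in (10**2, 10**4, 10**6):
    print(n, d_n(n), delta_n(n))
# Both columns shrink towards 0, so for this particular E the asymptotic and
# logarithmic densities agree: d(E) = delta_ln(E) = 0.
```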

Now recall the concept of statistical convergence of real sequences (see Fast [1] and Fridy [3]).

Definition 1.2 A sequence $x=(x_k)$ is said to be statistically convergent to $L$ if for every $\epsilon>0$, $d(\{k: |x_k-L|\ge\epsilon\})=0$. That is,

$$\lim_{n\to\infty}\frac{1}{n}\bigl|\{k\le n: |x_k-L|\ge\epsilon\}\bigr|=0.$$
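For a concrete instance of Definition 1.2, the following sketch (our own example; the sequence and function names are illustrative) evaluates the defining ratio for the sequence with $x_k=k$ when $k$ is a perfect square and $x_k=0$ otherwise.

```python
# Illustrative check of Definition 1.2: the proportion of indices k <= n with
# |x_k - 0| >= eps shrinks like sqrt(n)/n for this example sequence.
from math import isqrt

def x(k):
    return k if isqrt(k) ** 2 == k else 0  # large spikes on a density-zero set

def exceedance_ratio(n, L=0.0, eps=0.5):
    """(1/n) * |{k <= n : |x_k - L| >= eps}|."""
    return sum(1 for k in range(1, n + 1) if abs(x(k) - L) >= eps) / n

for n in (10**2, 10**4, 10**6):
    print(n, exceedance_ratio(n))
# The ratios are 0.1, 0.01, 0.001; they tend to 0, consistent with st-lim x = 0
# even though the sequence itself is unbounded.
```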

Several extensions, variants and generalizations of this notion have been investigated by various authors, namely [2, 4-16].

2 Logarithmic statistical convergence

In this section we define logarithmic statistical convergence and $[H,1]_q$-summability and establish some inclusion relations.

Definition 2.1 A sequence $x=(x_k)$ is said to be logarithmic statistically convergent to $L$ if for every $\epsilon>0$ the set $\{k: |x_k-L|\ge\epsilon\}$ has logarithmic density zero. That is,

$$\lim_{n\to\infty}\frac{1}{l_n}\Bigl|\Bigl\{k\le n: \frac{1}{k}|x_k-L|\ge\epsilon\Bigr\}\Bigr|=0.$$
(2.1)

In this case we write $st_{\ln}\text{-}\lim x=L$, and we denote the set of all logarithmic statistically convergent sequences by $st_{\ln}$.
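The quantity on the left-hand side of (2.1) can be evaluated directly; the sketch below does so for an illustrative sequence of our own choosing, equal to $k$ on the very sparse set $\{2^{2^j}: j\ge 0\}$ and to $1$ elsewhere, with candidate limit $L=1$.

```python
# A small numerical sketch of formula (2.1); the test sequence is ours, not
# one from the paper: x_k = k when k = 2^(2^j) for some j >= 0, else x_k = 1.
def x(k):
    m = k
    while m % 2 == 0:
        m //= 2
    if m != 1:
        return 1                      # k is not a power of two
    e = k.bit_length() - 1            # here k = 2^e
    return k if e > 0 and (e & (e - 1)) == 0 else 1   # e itself a power of two

def lhs_of_2_1(n, L=1.0, eps=0.5):
    """(1/l_n) * |{k <= n : (1/k)|x_k - L| >= eps}| as in (2.1)."""
    l_n = sum(1.0 / k for k in range(1, n + 1))
    count = sum(1 for k in range(1, n + 1) if abs(x(k) - L) / k >= eps)
    return count / l_n

for n in (10**2, 10**4, 10**6):
    print(n, lhs_of_2_1(n))
# The exceedance count grows roughly like log2(log2(n)) while l_n ~ log(n),
# so the printed ratio decreases towards 0, though very slowly, for this x.
```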

Remark 2.1 One might say that logarithmic statistical convergence is a special case of weighted statistical convergence [15] with $p_k=\frac{1}{k}$. But this is not exactly true, since for $p_k=\frac{1}{k}$ we have $P_n=\sum_{k=1}^{n}p_k=\sum_{k=1}^{n}1/k\sim\log n$ ($n=1,2,3,\ldots$), and consequently the definition of weighted statistical convergence gives $\lim_{n\to\infty}\frac{1}{l_n}|\{k\le l_n\sim\log n: \frac{1}{k}|x_k-L|\ge\epsilon\}|=0$. So one can see the difference between this and (2.1): in (2.1) the enclosed set has larger cardinality.

Definition 2.2 Let $\tau_n:=\frac{1}{l_n}\sum_{k=1}^{n}\frac{x_k}{k}$, where $l_n=\sum_{k=1}^{n}1/k\sim\log n$ ($n=1,2,3,\ldots$). We say that $x=(x_k)$ is $(H,1)$-summable to $L$ if the sequence $\tau=(\tau_n)$ converges to $L$, i.e., $(H,1)\text{-}\lim x=L$.

If each weight $1/k$ is replaced by $1$, then $l_n=n$ and $(H,1)$-summability reduces to $(C,1)$-summability.
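A minimal sketch of Definition 2.2 (the function name and the test sequence are ours): it computes the $(H,1)$ means $\tau_n$ of $x=(1,0,1,0,\ldots)$, which drift towards $\frac{1}{2}$, though only at a logarithmic rate.

```python
# Sketch: tau_n = (1/l_n) * sum_{k<=n} x_k / k, computed incrementally.
def harmonic_means(x_vals):
    """Return [tau_1, tau_2, ..., tau_n] for the finite prefix x_vals."""
    taus, weighted_sum, l_n = [], 0.0, 0.0
    for k, xk in enumerate(x_vals, start=1):
        weighted_sum += xk / k   # running sum of x_k / k
        l_n += 1.0 / k           # running l_n = sum_{k} 1/k
        taus.append(weighted_sum / l_n)
    return taus

x = [1 if k % 2 == 1 else 0 for k in range(1, 10**5 + 1)]
taus = harmonic_means(x)
print(taus[99], taus[9999], taus[-1])
# Roughly 0.57, 0.54, 0.53: the means creep towards 1/2, but the convergence
# is only logarithmic in n.
```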

Definition 2.3 A sequence $x=(x_k)$ is said to be $[H,1]_q$-summable ($0<q<\infty$) to the limit $L$ if $\lim_{n\to\infty}\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^{q}=0$, and we write $x_k\to L\,[H,1]_q$. In this case $L$ is called the $[H,1]_q$-limit of $x$.

Let $q=1$. If each weight $1/k$ is replaced by $1$, then $l_n=n$ and $[H,1]_q$-summability reduces to strong $(C,1)$-summability. Also, $[H,1]_q$-summability is a special case of $[\overline{N},p_n]_q$-summability (cf. [15]) with $p_k=\frac{1}{k}$.
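In the same hedged spirit, the next sketch evaluates the $[H,1]_q$ mean of Definition 2.3 for the sample sequence $x_k=1+1/k$ (our choice) and candidate limit $L=1$.

```python
# Sketch: the [H,1]_q mean (1/l_n) * sum_{k<=n} (1/k)|x_k - L|^q.
def h1q_mean(x_vals, L, q):
    n = len(x_vals)
    l_n = sum(1.0 / k for k in range(1, n + 1))
    total = sum(abs(xk - L) ** q / k for k, xk in enumerate(x_vals, start=1))
    return total / l_n

for n in (10**3, 10**5):
    x = [1.0 + 1.0 / k for k in range(1, n + 1)]
    print(n, h1q_mean(x, L=1.0, q=1.0))
# The mean decreases towards 0 (about 0.22 then 0.14 here), so this x is
# [H,1]_1-summable to 1; the decay is slow because the inner sum stays
# bounded while l_n grows only like log n.
```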

Recently, Móricz [2] has defined the concept of statistical summability (H,1) as follows.

Definition 2.4 A sequence $x=(x_k)$ is said to be statistically summable $(H,1)$ to $L$ if the sequence $\tau=(\tau_n)$ is statistically convergent to $L$, i.e., $st\text{-}\lim\tau=L=H(st)\text{-}\lim x$. We denote by $H(st)$ the set of all sequences which are statistically summable $(H,1)$, and we call such sequences statistically $(H,1)$-summable sequences.

Remark 2.2 If $x=(x_k)$ is bounded, then $st\text{-}\lim_{k}x_k=L$ implies $(C,1)\text{-}\lim_{k}x_k=L$ (see [17]). The converse is obviously not true; e.g., $x=(1,0,1,0,\ldots)$ is $(C,1)$-summable to $\frac{1}{2}$ but not statistically convergent. However, for bounded sequences, statistical convergence to some number is equivalent to strong Cesàro summability to the same number. But for logarithmic statistical convergence the situation is different (see [8]).
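The example named in Remark 2.2 is easy to check numerically; the sketch below (ours) computes the Cesàro mean of $x=(1,0,1,0,\ldots)$ and the proportion of indices with $|x_k-L|\ge\frac{1}{4}$ for a few candidate limits $L$.

```python
# Remark 2.2, numerically: x = (1, 0, 1, 0, ...).
n = 10**5
x = [1 if k % 2 == 1 else 0 for k in range(1, n + 1)]

print("Cesaro mean:", sum(x) / n)            # essentially 0.5

for L in (0.0, 0.5, 1.0):
    proportion = sum(1 for xk in x if abs(xk - L) >= 0.25) / n
    print("L =", L, "-> exceedance proportion:", proportion)
# The proportion is about 0.5 for L = 0 or 1 and exactly 1 for L = 0.5; no
# choice of L can make it tend to 0, so x is not statistically convergent,
# even though its Cesaro means converge.
```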

Theorem 2.1 Statistical convergence implies logarithmic statistical convergence, but the converse need not be true.

Proof It is well known that for each $E\subseteq\mathbb{N}$, $\underline{d}(E)\le\underline{\delta}(E)\le\overline{\delta}(E)\le\overline{d}(E)$ (see [18], pp. 70-75, 95-96). Hence if $d(E)$ exists, then $\delta_{\ln}(E)$ also exists and $d(E)=\delta_{\ln}(E)$. Hence statistical convergence implies logarithmic statistical convergence.

Take $E_k=\{k^{k^2}+1, k^{k^2}+2,\ldots,k^{k^2+1}\}$ ($k\in\mathbb{N}$) and $E=\bigcup_{k=2}^{\infty}E_k$. Let $E(n)=\sum_{k=1}^{n}\chi_E(k)$ denote the number of elements of $E$ not exceeding $n$, so that $d_n(E)=E(n)/n$ for $n\in\mathbb{N}$. Then

$$\overline{d}(E)\ge\limsup_{k\to\infty}\frac{E(k^{k^2+1})}{k^{k^2+1}}\ge\limsup_{k\to\infty}\frac{k^{k^2+1}-k^{k^2}}{k^{k^2+1}}=1.$$

Hence $\overline{d}(E)=1$.

Since $\sum_{j\in E_k}\frac{1}{j}=\ln k+O\bigl(k^{-k^2}\bigr)$ ($k\in\mathbb{N}$, $k\ge 2$), we get

$$\overline{\delta}(E)\le\lim_{n\to\infty}\frac{\sum_{k=1}^{n}\ln k+O(1)}{\sum_{j=1}^{n^{n^2+1}}\frac{1}{j}}\le\lim_{n\to\infty}\frac{n\ln n+O(1)}{(n^{2}+1)\ln n+O(1)}=0.$$

Hence $\delta_{\ln}(E)=0$ and consequently $\underline{d}(E)=0$, i.e., $d(E)$ does not exist. Define the sequence $x=(x_k)$ by

$$x_k=\begin{cases}1 & \text{if } k\in E,\\ 0 & \text{if } k\in\mathbb{N}\setminus E.\end{cases}$$

Since $\delta_{\ln}(E)=0$, we have $st_{\ln}\text{-}\lim_{n}x_n=0$. But $(C,1)\text{-}\lim_{n}x_n$ does not exist, because $\frac{1}{n}\sum_{m=1}^{n}x_m=\frac{E(n)}{n}$ ($n\in\mathbb{N}$) does not converge, and hence $st\text{-}\lim_{n}x_n$ does not exist.

This completes the proof. □
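To make the two estimates in the proof more tangible, here is a small arithmetic check (ours; it evaluates only the closed-form bounds, since the sets $E_k$ are far too large to enumerate): the counting ratio $1-1/k$ at $n=k^{k^2+1}$ tends to $1$, while the ratio $\bigl(\sum_{j\le k}\ln j\bigr)/\bigl((k^2+1)\ln k\bigr)$, which bounds the logarithmic mass up to the $O(1)$ terms in the proof, tends to $0$.

```python
# Evaluate the two bounds from the proof of Theorem 2.1 for small k.
from math import log

for k in range(2, 9):
    counting_lower_bound = 1.0 - 1.0 / k     # lower bound for E(n)/n at n = k^(k^2+1)
    log_mass_upper_bound = (sum(log(j) for j in range(2, k + 1))
                            / ((k * k + 1) * log(k)))   # logarithmic-mass ratio, O(1) terms dropped
    print(k, round(counting_lower_bound, 3), round(log_mass_upper_bound, 3))
# The first column tends to 1 (so the upper asymptotic density of E is 1),
# while the second tends to 0 (so the logarithmic density of E is 0).
```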

3 Main results

In the following theorem we establish the relation between logarithmic statistical convergence and Móricz’s statistical summability (H,1).

Theorem 3.1 If a sequence $x=(x_k)$ is bounded and logarithmic statistically convergent to $L$, then it is statistically summable $(H,1)$ to $L$, but not conversely.

Proof Let $x=(x_k)$ be bounded and logarithmic statistically convergent to $L$. Write $K_\epsilon:=\{k\in\mathbb{N}: \frac{1}{k}|x_k-L|\ge\epsilon\}$. Then

$$|\tau_n-L|=\biggl|\frac{1}{l_n}\sum_{k=1}^{n}\frac{x_k}{k}-L\biggr|=\biggl|\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}(x_k-L)\biggr|\le\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|\le\frac{1}{l_n}\sum_{k\in K_\epsilon}|x_k-L|\le\frac{1}{l_n}\Bigl(\sup_{k}|x_k-L|\Bigr)|K_\epsilon|\to 0$$

as $n\to\infty$, which implies that $\tau_n\to L$ as $n\to\infty$. That is, $x$ is $(H,1)$-summable to $L$ and hence statistically summable $(H,1)$ to $L$.

For the converse, we consider the special case in which each weight $1/k$ is replaced by $1$, so that $l_n=n$ as above. Consider the sequence $x=(x_k)$ defined by

$$x_k=\begin{cases}1 & \text{if } k \text{ is odd},\\ 0 & \text{if } k \text{ is even}.\end{cases}$$

Of course this sequence is not logarithmic statistically convergent. On the other hand, $x$ is $(H,1)$-summable to $\frac{1}{2}$ and hence statistically summable $(H,1)$ to $\frac{1}{2}$.

This completes the proof of the theorem. □

Remark 3.1 The above theorem is analogous to Theorem 2.1 of [15], but here it holds for any bounded sequence.

In the next result we establish the inclusion relation between logarithmic statistical convergence and [ H , 1 ] q -summability.

Theorem 3.2 (a) If $0<q<\infty$ and a sequence $x=(x_k)$ is $[H,1]_q$-summable to the limit $L$, then it is logarithmic statistically convergent to $L$.

(b) If $(x_k)$ is bounded and logarithmic statistically convergent to $L$, then $x_k\to L\,[H,1]_q$.

Proof (a) If $0<q<\infty$ and $x_k\to L\,[H,1]_q$, then, with $K_\epsilon:=\{k\le n: \frac{1}{k}|x_k-L|\ge\epsilon\}$,

$$0\le\frac{\epsilon^{q}}{l_n}|K_\epsilon|\le\frac{1}{l_n}\sum_{k\in K_\epsilon}\frac{1}{k}|x_k-L|^{q}\le\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^{q}\to 0$$

as $n\to\infty$. That is, $\lim_{n\to\infty}\frac{1}{l_n}|K_\epsilon|=0$ and so $\delta_{\ln}(K_\epsilon)=0$. Hence $x=(x_k)$ is logarithmic statistically convergent to $L$.

(b) Suppose that $x=(x_k)$ is bounded and logarithmic statistically convergent to $L$. Then, for $\epsilon>0$, we have $\delta_{\ln}(K_\epsilon)=0$. Since $x\in l_{\infty}$, there exists $M>0$ such that $|x_k-L|\le M$ ($k=1,2,\ldots$). We have

$$\frac{1}{l_n}\sum_{k=1}^{n}\frac{1}{k}|x_k-L|^{q}=\frac{1}{l_n}\sum_{k\le n,\ k\notin K_\epsilon}\frac{1}{k}|x_k-L|^{q}+\frac{1}{l_n}\sum_{k\le n,\ k\in K_\epsilon}\frac{1}{k}|x_k-L|^{q}=S_1(n)+S_2(n),$$

where

$$S_1(n)=\frac{1}{l_n}\sum_{k\le n,\ k\notin K_\epsilon}\frac{1}{k}|x_k-L|^{q}\quad\text{and}\quad S_2(n)=\frac{1}{l_n}\sum_{k\le n,\ k\in K_\epsilon}\frac{1}{k}|x_k-L|^{q}.$$

Now if $k\notin K_\epsilon$, then $S_1(n)<\epsilon^{q}$. For $k\in K_\epsilon$, we have

$$S_2(n)\le\Bigl(\sup_{k}|x_k-L|^{q}\Bigr)\frac{|K_\epsilon|}{l_n}\le M^{q}\,\frac{|K_\epsilon|}{l_n}\to 0$$

as $n\to\infty$, since $\delta_{\ln}(K_\epsilon)=0$. Hence $x_k\to L\,[H,1]_q$.

This completes the proof of the theorem. □

Remark 3.2 The above theorem is analogous to Theorem 2.2 of [15], but with fewer restrictions on the sequence $x=(x_k)$.

In the next result we characterize statistical summability $(H,1)$ through $(H,1)$-summable subsequences.

Theorem 3.3 A sequence $x=(x_k)$ is statistically summable $(H,1)$ to $L$ if and only if there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta(K)=1$ and $(H,1)\text{-}\lim x_{k_n}=L$.

Proof Suppose that there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta(K)=1$ and $(H,1)\text{-}\lim x_{k_n}=L$. Let $\epsilon>0$. Then there is a positive integer $N$ such that for $n>N$,

$$|\tau_{k_n}-L|<\epsilon.$$
(3.1)

Put $K_\epsilon:=\{n\in\mathbb{N}: |\tau_{k_n}-L|\ge\epsilon\}$ and $K':=\{k_{N+1},k_{N+2},\ldots\}$. Then $\delta(K')=1$ and $K_\epsilon\subseteq\mathbb{N}-K'$, which implies that $\delta(K_\epsilon)=0$. Hence $x=(x_k)$ is statistically summable $(H,1)$ to $L$.

Conversely, let $x=(x_k)$ be statistically summable $(H,1)$ to $L$. For $r=1,2,3,\ldots$, put $K_r:=\{j\in\mathbb{N}: |\tau_{k_j}-L|\ge 1/r\}$ and $M_r:=\{j\in\mathbb{N}: |\tau_{k_j}-L|<1/r\}$. Then $\delta(K_r)=0$,

$$M_1\supseteq M_2\supseteq\cdots\supseteq M_i\supseteq M_{i+1}\supseteq\cdots,$$
(3.2)

and

$$\delta(M_r)=1,\quad r=1,2,3,\ldots.$$
(3.3)

Now we have to show that, for $j\in M_r$, $(x_{k_j})$ is $(H,1)$-summable to $L$. Suppose that $(x_{k_j})$ is not $(H,1)$-summable to $L$. Then there is $\epsilon>0$ such that $|\tau_{k_j}-L|\ge\epsilon$ for infinitely many terms. Let $M_\epsilon:=\{j\in\mathbb{N}: |\tau_{k_j}-L|<\epsilon\}$ with $\epsilon>1/r$ ($r=1,2,3,\ldots$). Then

$$\delta(M_\epsilon)=0,$$
(3.4)

and by (3.2), $M_r\subseteq M_\epsilon$. Hence $\delta(M_r)=0$, which contradicts (3.3), and therefore $(x_{k_j})$ is $(H,1)$-summable to $L$.

This completes the proof of the theorem. □

Similarly we can prove the following dual statement.

Theorem 3.4 A sequence $x=(x_k)$ is logarithmic statistically convergent to $L$ if and only if there exists a set $K=\{k_1<k_2<\cdots<k_n<\cdots\}\subseteq\mathbb{N}$ such that $\delta_{\ln}(K)=1$ and $\lim_{n\to\infty}x_{k_n}=L$.
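To see the right-hand side of this characterization on a concrete example (entirely ours, with illustrative helper names), the sketch below takes $x_k=k$ on the sparse set $\{2^{2^j}: j\ge 0\}$, $x_k=1$ elsewhere, and $K$ equal to the complement of that sparse set: numerically $\delta_n(K)$ is close to $1$, and along $K$ the sequence is constantly $1$.

```python
# Theorem 3.4, illustrated on an example of our own choosing.
def in_sparse_set(k):
    """True iff k = 2^(2^j) for some j >= 0 (i.e. k = 2, 4, 16, 256, ...)."""
    if k < 2 or k & (k - 1):
        return False                      # not a power of two
    e = k.bit_length() - 1                # k = 2^e
    return e > 0 and (e & (e - 1)) == 0   # e is itself a power of two

n = 10**6
l_n = 0.0
mass_K = 0.0
for k in range(1, n + 1):
    l_n += 1.0 / k
    if not in_sparse_set(k):              # k belongs to K; there x_k = 1
        mass_K += 1.0 / k

print("delta_n(K) =", mass_K / l_n)       # about 0.95 at n = 10**6, tending to 1
# Along K the subsequence (x_{k_n}) is constantly 1, so lim_n x_{k_n} = 1,
# which is the right-hand side of the characterization for L = 1.
```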

References

1. Fast H: Sur la convergence statistique. Colloq. Math. 1951, 2: 241-244.

2. Móricz F: Theorems relating to statistical harmonic summability and ordinary convergence of slowly decreasing or oscillating sequences. Analysis 2004, 24: 127-145.

3. Fridy JA: On statistical convergence. Analysis 1985, 5: 301-313.

4. Alotaibi A, Mursaleen M: A-Statistical summability of Fourier series and Walsh-Fourier series. Appl. Math. Inf. Sci. 2012, 6(3): 535-538.

5. Edely OHH, Mursaleen M: On statistical A-summability. Math. Comput. Model. 2009, 49: 672-680. doi:10.1016/j.mcm.2008.05.053

6. Fridy JA, Orhan C: Lacunary statistical convergence. Pac. J. Math. 1993, 160: 43-51. doi:10.2140/pjm.1993.160.43

7. Kolk E: The statistical convergence in Banach spaces. Tartu Ülik. Toim. 1991, 928: 41-52.

8. Kostyrko P, Šalát T, Wilczyński W: I-Convergence. Real Anal. Exch. 2000/2001, 26(2): 669-686.

9. Mohiuddine SA, Alotaibi A, Mursaleen M: Statistical convergence of double sequences in locally solid Riesz spaces. Abstr. Appl. Anal. 2012, 2012: Article ID 719729. doi:10.1155/2012/719729

10. Mursaleen M, Alotaibi A, Mohiuddine SA: Statistical convergence through de la Vallée-Poussin mean in locally solid Riesz spaces. Adv. Differ. Equ. 2013, 2013: Article ID 66. doi:10.1186/1687-1847-2013-66

11. Mursaleen M, Alotaibi A: Statistical summability and approximation by de la Vallée-Poussin mean. Appl. Math. Lett. 2011, 24: 320-324. (Erratum: Appl. Math. Lett. 2012, 25: 665.) doi:10.1016/j.aml.2010.10.014

12. Mursaleen M, Alotaibi A: On I-convergence in random 2-normed spaces. Math. Slovaca 2011, 61(6): 933-940. doi:10.2478/s12175-011-0059-5

13. Mursaleen M, Edely OHH: Generalized statistical convergence. Inf. Sci. 2004, 162: 287-294. doi:10.1016/j.ins.2003.09.011

14. Mursaleen M, Edely OHH: On the invariant mean and statistical convergence. Appl. Math. Lett. 2009, 22: 1700-1704. doi:10.1016/j.aml.2009.06.005

15. Mursaleen M, Karakaya V, Ertürk M, Gürsoy F: Weighted statistical convergence and its application to Korovkin type approximation theorem. Appl. Math. Comput. 2012, 218: 9132-9137. doi:10.1016/j.amc.2012.02.068

16. Mursaleen M, Mohiuddine SA: On ideal convergence in probabilistic normed spaces. Math. Slovaca 2012, 62: 49-62. doi:10.2478/s12175-011-0071-9

17. Connor JS: The statistical and strong p-Cesàro convergence of sequences. Analysis 1988, 8: 47-63.

18. Ostmann HH: Additive Zahlentheorie I. Springer, Berlin; 1956.


Acknowledgements

The authors would like to thank the Deanship of Scientific Research at King Abdulaziz University for its financial support under grant number 151-130-1432.

Author information

Corresponding author

Correspondence to Mohammad Mursaleen.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

All authors contributed equally and significantly in writing this paper. All authors read and approved the final manuscript.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Alghamdi, M.A., Mursaleen, M. & Alotaibi, A. Logarithmic density and logarithmic statistical convergence. Adv Differ Equ 2013, 227 (2013). https://doi.org/10.1186/1687-1847-2013-227
