Shannon type inequalities via time scales theory

*Correspondence: iqrar@math.qau.edu.pk. 1 Department of Mathematics, University of Sargodha, Sargodha, Pakistan. Full list of author information is available at the end of the article.

Abstract The aim of the present paper is to obtain Shannon type inequalities using the extended version of Jensen's inequality in the time scales setting. The concept of differential entropy of a continuous random variable on time scales is introduced, and its bounds for some particular distributions are also estimated.


Introduction and preliminaries
In recent times, Shannon entropy and the Zipf–Mandelbrot law have been topics of great interest; see, for example, [1, 9, 11, 12]. The concept of Shannon entropy, the central notion of information theory, is sometimes referred to as a measure of uncertainty. Shannon entropy allows one to estimate the average minimum number of bits needed to encode a string of symbols, based on the alphabet size and the frequencies of the symbols.
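The encoding interpretation above can be illustrated with a short numerical sketch (an illustration added here, not part of the paper's development; the function name is ours):

```python
import math
from collections import Counter

def shannon_entropy_bits(text):
    """Shannon entropy in bits per symbol: H = -sum_i p_i * log2(p_i)."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A 4-symbol alphabet with frequencies 1/2, 1/4, 1/8, 1/8:
# H = 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits per symbol,
# so on average at least 1.75 * len(text) bits are needed to encode the string.
print(shannon_entropy_bits("aaaabbcd"))  # 1.75
```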
The following definition of Shannon entropy is given in [8]: for a positive probability distribution $r = (r_1, \dots, r_n)$ with $\sum_{i=1}^{n} r_i = 1$, the Shannon entropy of $r$ is $H(r) = -\sum_{i=1}^{n} r_i \log r_i$. A fundamental inequality related to the notion of Shannon entropy is the following inequality given in [16]:
$$-\sum_{i=1}^{n} r_i \log r_i \le -\sum_{i=1}^{n} r_i \log f_i, \qquad (1)$$
which is valid for all $r_i, f_i > 0$ with $\sum_{i=1}^{n} r_i = \sum_{i=1}^{n} f_i$, with equality in (1) if and only if $r_i = f_i$ for all $i$. This result, sometimes called the fundamental lemma of information theory, has extensive applications (see [14]). In [13], Matić et al. gave a refinement of Shannon's inequality in its discrete and integral forms by presenting upper estimates of the difference between its two sides. In [10], Sadia et al. studied some interesting results related to bounds of the Shannon entropy by using nonincreasing (nondecreasing) sequences of real numbers. One of the main approaches to unifying continuous and discrete mathematics is time scale calculus, which was founded by the German mathematician Stefan Hilger in 1988. A time scale is an arbitrary nonempty closed subset of the real numbers. For an introduction to the theory of dynamic equations on time scales, see [6]. In [7], Guseinov studied the process of Riemann and Lebesgue integration on time scales.
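Inequality (1) can be checked numerically. The sketch below (an added illustration; `gibbs_gap` is our name) computes the difference of the two sides, which must be nonnegative for positive tuples with equal sums and vanish exactly when $r_i = f_i$:

```python
import math
import random

def gibbs_gap(r, f, base=2):
    """Right-minus-left gap of inequality (1): sum_i r_i * log(r_i / f_i).
    Nonnegative for positive r, f with sum(r) == sum(f); zero iff r == f."""
    return sum(ri * math.log(ri / fi, base) for ri, fi in zip(r, f))

random.seed(0)
r = [random.random() + 0.1 for _ in range(6)]
f = [random.random() + 0.1 for _ in range(6)]
scale = sum(r) / sum(f)
f = [scale * fi for fi in f]          # enforce sum(f) == sum(r)
assert gibbs_gap(r, f) >= 0           # the fundamental lemma of information theory
assert abs(gibbs_gap(r, r)) < 1e-12   # equality case: r_i = f_i for all i
```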
Bohner and Guseinov [4, 5] defined multiple Riemann and multiple Lebesgue integration on time scales and compared the Lebesgue Δ-integral with the Riemann Δ-integral. Various authors have examined certain integral inequalities on time scales.

Definition 3 ([6]) A function $F : \mathbb{T} \to \mathbb{R}$ is called an antiderivative of $f : \mathbb{T} \to \mathbb{R}$ provided $F^{\Delta}(t) = f(t)$ holds for all $t \in \mathbb{T}^{k}$, and the delta integral of $f$ is then defined by $\int_a^b f(t)\,\Delta t = F(b) - F(a)$.

The following theorems are useful in the proofs of the main results. In particular, the weighted Jensen inequality on time scales [6] states that if $g$ is convex, then
$$g\!\left(\frac{\int_a^b r(s)\,\xi(s)\,\Delta s}{\int_a^b r(s)\,\Delta s}\right) \le \frac{\int_a^b r(s)\,g(\xi(s))\,\Delta s}{\int_a^b r(s)\,\Delta s}. \qquad (3)$$
The inequality in (3) is strict if $g$ is strictly convex.
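As a concrete reduction (an added illustration, not from the paper), on the time scale $\mathbb{T} = \mathbb{Z}$ the delta integral $\int_a^b f(t)\,\Delta t$ becomes the finite sum $\sum_{t=a}^{b-1} f(t)$, while on $\mathbb{T} = \mathbb{R}$ it is the ordinary integral. The snippet below checks the integer-case reduction against an antiderivative in the sense of Definition 3:

```python
def delta_integral_Z(f, a, b):
    """Delta integral on T = Z: ∫_a^b f(t) Δt = sum_{t=a}^{b-1} f(t)."""
    return sum(f(t) for t in range(a, b))

# Antiderivative check: F(t) = t*(t-1)/2 satisfies
# F^Δ(t) = F(t+1) - F(t) = t on T = Z, hence ∫_0^5 t Δt = F(5) - F(0) = 10.
F = lambda t: t * (t - 1) // 2
assert delta_integral_Z(lambda t: t, 0, 5) == F(5) - F(0) == 10
print(delta_integral_Z(lambda t: t, 0, 5))  # 10
```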

Main results
Throughout the paper, 'log' refers to logarithms to base b for some fixed b > 1. We begin with the following result.
Now, by adding $\log\!\left(\frac{\int_a^b r(s)\,\xi(s)\,\Delta s}{\int_a^b r(s)\,\Delta s}\right)$ to both sides of (7), we obtain inequality (5). Inequality (6) is a straightforward consequence of the following inequality given in [13]:

Shannon entropy
Consider X to be a continuous random variable with a nonnegative density function $r(s)$ on $\mathbb{T}$ such that $\int_a^b r(s)\,\Delta s = 1$. Whenever this integral exists, we have the following definition.

Definition 4
The nominal differential entropy of X on the time scale $\mathbb{T}$ is defined by
$$\bar{h}_b(X) = \int_a^b r(s)\,\log\frac{1}{r(s)}\,\Delta s.$$
The following result is the time scale extension of the integral Shannon inequality [13, Theorem 18]. Moreover, one can obtain results related to Shannon entropy by choosing the time scale to be the set of integers with positive probability distributions in the following result.
Suppose that for b > 1 at least one of the following Δ-integrals is finite. Since (11) holds, replacing $x$ by $f(s)/r(s)$ and multiplying both sides by $r(s)$, we obtain the required estimate. Whenever $Q_r$ is finite, $Q_r - J = Q_f$ is also finite; conversely, if $Q_f$ is finite, then $Q_f + J = Q_r$ is finite as well. Therefore we may write $J = Q_r - Q_f$, and consequently the desired result is proved.
Suppose that for b > 1 at least one of the following Δ-integrals is finite. In the proof of our next result, we need the following weighted Grüss type inequality on time scales, established by Sarikaya et al. in [17].

Entropy of continuous random variable
In the sequel, we denote the mean and variance of a continuous random variable X by $\mu_m = \int_a^b s\,r(s)\,\Delta s$ and $v^2 = \int_a^b (s - \mu_m)^2\,r(s)\,\Delta s$, respectively.

Theorem 7 Consider a continuous random variable X with density function $r(s)$ ($s \in \mathbb{T}$).
(a) If X has a finite mean $\mu_m$ and a finite variance $v^2$, then $\bar{h}_b(X)$ is finite and bounded above by $\log(\lambda v\sqrt{2\pi e})$.
(b) Suppose that X has a finite mean $\mu_m$ and $r(s) = 0$ for all $s < 0$; then $\bar{h}_b(X)$ is finite and bounded above by $\log(\lambda \mu_m e)$.
Again apply Corollary 1 to obtain the required result; cf. [13, Theorem 21(a), (b)].
Remark 4 Theorem 7 shows that $\bar{h}_b(X) \approx \log(\lambda v\sqrt{2\pi e})$ whenever the distribution of X is close to the Gaussian distribution with variance $v^2$. If the distribution of X is close to the exponential distribution with mean $\mu_m$, then $\bar{h}_b(X) \approx \log(\lambda \mu_m e)$.
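As a sanity check of Remark 4 in the classical case $\mathbb{T} = \mathbb{R}$, the sketch below approximates $h(X) = -\int r(s)\ln r(s)\,ds$ by the trapezoidal rule. This is an added illustration under two stated assumptions: the constant $\lambda$ is taken to reduce to 1 for $\mathbb{T} = \mathbb{R}$, and logarithms are taken to base $e$ so that the Gaussian and exponential entropies equal $\ln(v\sqrt{2\pi e})$ and $\ln(\mu_m e) = 1 + \ln \mu_m$:

```python
import math

def differential_entropy(r, a, b, n=200000):
    """Approximate h(X) = -∫_a^b r(s) ln r(s) ds by the trapezoidal rule
    (classical case T = R, natural logarithm)."""
    h = (b - a) / n
    def g(s):
        v = r(s)
        return -v * math.log(v) if v > 0 else 0.0
    total = (g(a) + g(b)) / 2 + sum(g(a + i * h) for i in range(1, n))
    return total * h

# Gaussian with standard deviation v: h = ln(v * sqrt(2*pi*e)).
v = 1.7
gauss = lambda s: math.exp(-s * s / (2 * v * v)) / (v * math.sqrt(2 * math.pi))
print(differential_entropy(gauss, -12 * v, 12 * v))  # ≈ ln(v * sqrt(2*pi*e))

# Exponential with mean mu: h = ln(mu * e) = 1 + ln(mu).
mu = 0.8
expo = lambda t: math.exp(-t / mu) / mu
print(differential_entropy(expo, 0.0, 40 * mu))      # ≈ 1 + ln(mu)
```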

Theorem 8
(a) Under the assumptions of Theorem 7(a). The following generalization of Jensen's inequality on time scales, established by Anwar et al. [3], is needed in the proof of our next result.

Theorem 9 Let J ⊂ R be an interval and assume that
The following result is a generalization of Theorem 3.
Proof Use inequality (21) and follow steps similar to those in the proof of Theorem 3 to obtain the stated result.
Corollary 5 Assume the conditions of Proposition 1 with $\int_D \psi(w)\,\Delta w = 1$; then we have:

Remark 5 Choose $\mathbb{T} = \mathbb{R}$ with $\int_D \psi(w)\,\Delta w = 1$ in Proposition 1 to get [13, Proposition 1].
Suppose that X and Z are random variables whose distributions have density functions $r(s)$ and $r(z)$, respectively, and let $r(s, z)$ be the joint density function of $(X, Z)$. By analogy, we may state the following definition.
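By way of illustration in the discrete case (an added sketch following the stated analogy, not the paper's time-scale definition), the entropy built from a joint distribution is subadditive, $H(X, Z) \le H(X) + H(Z)$, with equality exactly when the joint density factors as $r(s, z) = r(s)\,r(z)$:

```python
import math

def H(p):
    """Shannon entropy (base 2) of a flattened list of probabilities."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A joint distribution for (X, Z) with dependent coordinates (rows: X, cols: Z).
joint = [[0.30, 0.10],
         [0.05, 0.55]]
px = [sum(row) for row in joint]            # marginal distribution of X
pz = [sum(col) for col in zip(*joint)]      # marginal distribution of Z
h_joint = H([p for row in joint for p in row])
assert h_joint <= H(px) + H(pz)             # subadditivity of joint entropy

# For an independent joint density r(s, z) = r(s) r(z) the two sides agree.
indep = [[a * b for b in pz] for a in px]
assert abs(H([p for row in indep for p in row]) - (H(px) + H(pz))) < 1e-9
```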

Conclusion
In this paper, Shannon type inequalities on time scales have been established by using the time scales version of Jensen's inequality. Bounds are obtained for some Shannon type inequalities which have a direct connection to information theory. Differential entropy on time scales has been introduced, and its bounds for some particular distributions have been obtained. The given results generalize the corresponding results established by Matić, Pearce, and Pečarić in [13], and the idea may stimulate further research in the theory of Shannon entropy, delta integrals, and generalized convex functions.