Estimation of divergences on time scales via the Green function and Fink’s identity
Advances in Difference Equations volume 2021, Article number: 394 (2021)
Abstract
The aim of the present paper is to obtain new generalizations of an inequality for n-convex functions involving Csiszár divergence on time scales using the Green function along with Fink’s identity. Some new results in h-discrete calculus and quantum calculus are also presented. Moreover, inequalities for some divergence measures are also deduced.
1 Introduction
The development of the theory of time scales was initiated by Hilger in 1988 as a theory capable of containing both difference and differential calculus in a consistent framework. The books of Bohner and Peterson [8, 9] are compact references that cover much of time scales calculus. This theory allows one to gain insight into, and a precise understanding of, the differences between discrete and continuous systems.
In recent years, new developments in the theory and applications of dynamic derivatives on time scales have emerged. Many results from the continuous case carry over to the discrete one very easily, but some turn out to be completely different. The study of time scales reveals such discrepancies and helps us understand the difference between the two cases. Results in time scale calculus thus unify and extend their continuous and discrete counterparts. This hybrid theory is also extensively applied to dynamic inequalities.
Various linear and nonlinear integral inequalities on time scales have been established by many authors [3, 4, 32, 35].
Quantum calculus, or q-calculus, is often called calculus without limits. In 1910, Jackson [18] described the q-analogues of the derivative and integral operators along with their applications; he was the first to develop q-calculus in a systematic way. It is worth noting that quantum integral inequalities can be more significant and constructive than their classical counterparts, mainly because quantum integral inequalities can describe the hereditary properties of the phenomena and processes under consideration.
Recently, there has been rapid development in q-calculus. Consequently, new generalizations of the classical approach to quantum calculus have been proposed and analyzed in the literature; see [10, 17, 27, 44] and the references therein. The concepts of quantum calculus on finite intervals were given by Tariboon and Ntouyas [37, 38], who obtained certain q-analogues of classical mathematical objects, which motivated numerous researchers to explore the subject in detail. Subsequently, several new results related to quantum counterparts of classical mathematical results have been established; see [7, 29, 34].
A divergence measure is a measure of distance between two probability distributions. The idea of a divergence measure is used to solve a variety of problems in probability theory. In the literature, several types of divergence measures exist that compare two probability distributions and are used in statistics and information theory. Information and divergence measures are very useful and play a vital role in various areas, namely sensor networks [24], testing the order in a Markov chain [26], finance [33], economics [39], and approximation of probability distributions [14]. Shannon entropy and the related measures are often used in different fields such as information theory, molecular ecology, population genetics, statistical physics, and dynamical systems (see [13, 25]). Kullback–Leibler divergence is one of the best known among information divergences; it is used in information theory, mathematical statistics, and signal processing (see [42]). Jeffreys distance and triangular discrimination have many applications in statistics, information theory, and pattern recognition (see [23, 40, 41]).
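To make these measures concrete, the discrete forms of three of the divergences treated later in the paper can be sketched as follows; this is an illustrative sketch for finite probability vectors, not code from the source:

```python
import math

def kl_divergence(p, q):
    # Discrete Kullback-Leibler divergence: sum_i p_i * ln(p_i / q_i).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def jeffreys_distance(p, q):
    # Jeffreys distance (symmetrized KL): sum_i (p_i - q_i) * ln(p_i / q_i).
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def triangular_discrimination(p, q):
    # Triangular discrimination: sum_i (p_i - q_i)^2 / (p_i + q_i).
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
```

Each quantity is nonnegative and vanishes exactly when the two distributions coincide; the Jeffreys distance satisfies \(D_{J}(p, q) = D(p, q) + D(q, p)\).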
Recently, various types of bounds on the distance, divergence, and information measures have been obtained (see [2, 6, 12, 15, 19, 22, 36] and the references therein). In [1], Adeel et al. generalized Levinson’s inequality for 3-convex function by using two Green functions. Moreover, the obtained results are applied to information theory via f-divergence, Rényi divergence, and Shannon entropy. In [21], Khan et al. introduced a new functional based on a classical f-divergence functional and obtained some estimates for the new functionals, the f-divergence, and Rényi divergence. In [11], Butt et al. established new refinements of Popoviciu’s inequality for higher order convex functions utilizing Abel–Gontscharoff interpolation in combination with new Green functions. New inequalities are obtained for n-convex functions. They also gave applications in information theory by finding new estimates for relative, Shannon, and Zipf–Mandelbrot entropies.
Motivated by the above discussion, we generalize an inequality involving Csiszár divergence on time scales for n-convex functions by using the Green function along with Fink’s identity. In addition, we estimate Kullback–Leibler divergence, differential entropy, Shannon entropy, Jeffreys distance, and triangular discrimination on time scales by using the obtained results.
2 Preliminaries
Throughout this paper, assume that \(\mathbb{T}\) is a time scale, \(a, b \in\mathbb{T}\) with \(a < b\). The following definitions and results are given in [8].
For \(\zeta\in\mathbb{T}\), the forward jump operator \(\sigma: \mathbb {T} \rightarrow\mathbb{T}\) is defined by \(\sigma(\zeta) := \inf\{s \in\mathbb{T} : s > \zeta\}\).
A function \(g : \mathbb{T} \rightarrow\mathbb{R}\) is known as right-dense continuous (rd-continuous), provided it is continuous at right-dense points in \(\mathbb{T}\) and its left-sided limit exists (finite) at left-dense points in \(\mathbb{T}\). The set of all rd-continuous functions will be denoted in this paper by \(C_{rd}\). \(\mathbb{T}^{k}\) is defined as follows:
Suppose that \(g : \mathbb{T} \rightarrow\mathbb{R}\) and \(\zeta\in \mathbb{T}^{k}\). The delta derivative \(g^{\Delta}(\zeta)\) is defined to be the number (provided it exists) with the property that for each \(\epsilon> 0\) there exists a neighborhood U of ζ such that
\(\vert g(\sigma(\zeta)) - g(\lambda) - g^{\Delta}(\zeta) (\sigma(\zeta) - \lambda) \vert \leq\epsilon \vert \sigma(\zeta) - \lambda \vert \)
holds for all \(\lambda\in U\). Then g is said to be delta differentiable at ζ.
For \(\mathbb{T}= \mathbb{R}\), \(g^{\Delta}\) is the usual derivative \(g^{\prime}\), while for \(\mathbb{T} = \mathbb{Z}\), \(g^{\Delta}\) turns into the forward difference operator \(\Delta g(\zeta) = g(\zeta+1) - g(\zeta)\). If \(\mathbb{T} = \overline{q^{\mathbb{Z}}} = \{q^{n}: n \in\mathbb{Z} \} \cup\{0\}\) with \(q > 1\), then \(g^{\Delta}\) becomes the so-called q-difference operator \(D_{q}g(\zeta) = \frac{g(q\zeta) - g(\zeta)}{(q - 1)\zeta}\) for \(\zeta\neq0\).
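As a quick illustration of these special cases, the forward difference on \(\mathbb{T} = \mathbb{Z}\) and the q-difference operator on \(\mathbb{T} = \overline{q^{\mathbb{Z}}}\) (with the standard formula \(g^{\Delta}(\zeta) = (g(q\zeta) - g(\zeta))/((q-1)\zeta)\), assumed here) can be sketched as:

```python
def delta_derivative_Z(g, z):
    # On T = Z the delta derivative is the forward difference.
    return g(z + 1) - g(z)

def delta_derivative_q(g, z, q=2.0):
    # On T = q^Z (q > 1), sigma(z) = q*z, so the delta derivative is the
    # q-difference operator (g(q z) - g(z)) / ((q - 1) z) for z != 0.
    return (g(q * z) - g(z)) / ((q - 1) * z)

square = lambda x: x ** 2
# Forward difference of x^2 at z gives 2z + 1; the q-derivative gives (q + 1)z.
```

For instance, at \(z = 3\) the forward difference of \(x^2\) is 7, while the q-derivative with \(q = 2\) is \(3z = 9\); both recover \(2z\) as the step shrinks.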
Theorem A
(Existence of antiderivatives)
Every rd-continuous function has an antiderivative. If \(x_{0} \in \mathbb{T}\), then F defined by
\(F(\zeta) := \int_{x_{0}}^{\zeta}f(\tau)\Delta\tau \quad \text{for } \zeta\in\mathbb{T}\)
is an antiderivative of f.
For \(\mathbb{T} = \mathbb{R}\), we obtain \(\int_{a}^{b}f(\zeta)\Delta\zeta= \int_{a}^{b}f(\zeta) d\zeta\), and if \(\mathbb{T} = \mathbb{N}\), then \(\int_{a}^{b}f(\zeta)\Delta\zeta= \sum_{\zeta=a}^{b-1}f(\zeta)\), where \(a, b \in\mathbb{T}\) with \(a\leq b\).
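These special cases of the delta integral can be sketched for \(\mathbb{T} = h\mathbb{Z}\), which reduces to the sum above when \(h = 1\) (an illustrative sketch, not code from the source):

```python
def delta_integral_hZ(f, a, b, h=1.0):
    # On T = hZ the delta integral over [a, b) is a left Riemann-type sum:
    # int_a^b f(z) Delta z = sum over z = a, a+h, ..., b-h of h * f(z).
    total, z = 0.0, a
    while z < b - 1e-12:   # small tolerance guards against float drift
        total += h * f(z)
        z += h
    return total
```

For \(h = 1\) and \(f(\zeta) = \zeta\) this reproduces \(\sum_{\zeta=a}^{b-1}f(\zeta)\), e.g. \(\int_{0}^{4}\zeta\,\Delta\zeta = 0 + 1 + 2 + 3 = 6\).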
3 Improvement of the inequality involving Csiszár divergence
Assume that \(\mathbb{T}\) is a time scale and let Ω denote the set of all probability densities on \(\mathbb{T}\), namely
Let \(\zeta_{1}, \zeta_{2} \in\mathbb{R}\), where \(\zeta_{1} < \zeta _{2}\). Consider the Green function \(G : [\zeta_{1}, \zeta_{2}]\times [\zeta_{1}, \zeta_{2}] \rightarrow\mathbb{R}\) defined by
\(G(x, s) = \frac{1}{\zeta_{2}-\zeta_{1}} \begin{cases} (x-\zeta_{2})(s-\zeta_{1}), & \zeta_{1}\leq s\leq x, \\ (s-\zeta_{2})(x-\zeta_{1}), & x\leq s\leq\zeta_{2}, \end{cases}\) (1)
where G is convex and continuous with respect to both x and s. It is notable that (see, for example, [20, 28, 30, 43]) any function \(\Psi\in C^{2}([\zeta_{1}, \zeta_{2}],\mathbb{R})\) can be written as
\(\Psi(x) = \frac{\zeta_{2}-x}{\zeta_{2}-\zeta_{1}}\Psi(\zeta_{1}) + \frac{x-\zeta_{1}}{\zeta_{2}-\zeta_{1}}\Psi(\zeta_{2}) + \int_{\zeta_{1}}^{\zeta_{2}} G(x, s)\Psi^{\prime\prime}(s)\,ds,\) (2)
where \(G(x, s)\) is defined in (1).
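The representation (2) can be checked numerically. The sketch below assumes the standard Green function \(G(x, s) = (x-\zeta_{2})(s-\zeta_{1})/(\zeta_{2}-\zeta_{1})\) for \(s \leq x\) and \((s-\zeta_{2})(x-\zeta_{1})/(\zeta_{2}-\zeta_{1})\) for \(x \leq s\), together with the two-point linear interpolation term; both are assumptions of this sketch rather than formulas quoted verbatim from the paper:

```python
def green(x, s, z1, z2):
    # Assumed Green function on [z1, z2] x [z1, z2]; it vanishes at the
    # endpoints in x and is convex and continuous in both variables.
    if s <= x:
        return (x - z2) * (s - z1) / (z2 - z1)
    return (s - z2) * (x - z1) / (z2 - z1)

# Check Psi(x) = linear interpolation of Psi between z1 and z2
#              + int_{z1}^{z2} G(x, s) Psi''(s) ds
# for Psi(x) = x**3 (so Psi''(s) = 6 s) on [1, 2] at x = 1.5.
z1, z2, x = 1.0, 2.0, 1.5
psi = lambda t: t ** 3
n = 2000
h = (z2 - z1) / n
mid = [z1 + (k + 0.5) * h for k in range(n)]    # midpoint rule nodes
integral = h * sum(green(x, s, z1, z2) * 6 * s for s in mid)
linear = ((z2 - x) * psi(z1) + (x - z1) * psi(z2)) / (z2 - z1)
reconstructed = linear + integral               # should be close to 1.5**3
```

Here the linear part equals 4.5 and the integral equals −1.125 analytically, recovering \(1.5^{3} = 3.375\).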
In [5], Ansari et al. proved the following inequality.
Theorem B
Let \(\Psi: [0, \infty) \rightarrow\mathbb{R}\) be a convex function on the interval \([\zeta_{1}, \zeta_{2}] \subset[0, \infty)\) and \(\zeta_{1}\leq1 \leq \zeta_{2}\). If \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1}\leq\frac{p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\), then
Motivated by inequality (3), we begin with the following result.
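For orientation, in the purely discrete case with normalized densities, the quantity in Theorem B is the classical Csiszár Ψ-divergence, and inequality (3) reduces to the Jensen-type bound \(\sum_{y} p_{2}(y)\,\Psi(p_{1}(y)/p_{2}(y)) \geq\Psi(1)\). A small numerical sanity check of this discrete analogue (an illustration, not the time-scale statement itself):

```python
import math

def csiszar_divergence(psi, p1, p2):
    # Discrete Csiszar Psi-divergence: sum_y p2(y) * psi(p1(y) / p2(y)).
    return sum(q * psi(p / q) for p, q in zip(p1, p2))

psi = lambda x: x * math.log(x)   # convex on (0, inf), psi(1) = 0
p1 = [0.2, 0.5, 0.3]
p2 = [0.3, 0.4, 0.3]
# Jensen's inequality gives csiszar_divergence(psi, p1, p2) >= psi(1) = 0.
```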
Theorem 1
Under the assumptions of Theorem B, with \(\int_{a}^{b} p_{1}(y) \Delta y = \int_{a}^{b} p_{2}(y) \Delta y = 1\), inequalities (3) and (4) are equivalent,
where \(G(\cdot, s)\) is defined in (1) and \(s \in[\zeta_{1}, \zeta_{2}]\). Moreover, if we reverse the inequality in both (3) and (4), then again (3) and (4) are equivalent.
Proof
Let (3) be valid. Since the function \(G(\cdot, s)\) \((s \in [\zeta_{1}, \zeta_{2}])\) is continuous and convex, (4) holds.
Let (4) be valid. Let \(\Psi\in C^{2}([\zeta_{1}, \zeta_{2}], \mathbb{R})\). Then, by using (2), one can get
Utilize Fubini’s theorem with \(\int_{a}^{b} p_{1}(y) \Delta y = \int _{a}^{b} p_{2}(y) \Delta y = 1\) in (5) to obtain
For all \(s \in[\zeta_{1}, \zeta_{2}]\), if the function Ψ is convex, then \(\Psi^{\prime\prime}(s)\geq0\), and thus for every convex function \(\Psi\in C^{2}([\zeta_{1}, \zeta _{2}],\mathbb{R})\) inequality (3) holds. One can prove the last part of the theorem analogously. □
Remark 1
Under the assumptions of Theorem 1, the following two statements are equivalent:
- \((c^{\prime}_{1})\): If the function \(\Psi\in C([\zeta_{1}, \zeta _{2}],\mathbb{R})\) is concave, then the reverse inequality in (3) holds.
- \((c^{\prime}_{2})\): For all \(s \in[\zeta_{1}, \zeta_{2}]\), the reverse inequality in (4) holds.
In addition, if we reverse the inequality in both statements \((c^{\prime}_{1})\) and \((c^{\prime}_{2})\), then again \((c^{\prime}_{1})\) and \((c^{\prime}_{2})\) are equivalent.
Theorem 2
Assume the conditions of Theorem 1 and define the following functional:
if the inequality in (4) holds for all \(s \in[\zeta_{1}, \zeta_{2}]\).
Remark 2
Suppose that all the assumptions of Theorem 2 hold. If Ψ is continuous and convex, then \(\mathfrak{J}_{1}(\Psi) \geq0\).
The following theorem is proved by Fink in [16].
Theorem 3
Let \(f : [\zeta_{1}, \zeta_{2}]\rightarrow\mathbb{R}\), \(n\geq1\), and let \(f^{(n-1)}\) be absolutely continuous on \([\zeta_{1}, \zeta_{2}]\), where \(\zeta_{1}, \zeta_{2} \in\mathbb{R}\). Then
where
4 Interpolation of the functional involving Csiszár divergence by Fink’s identity
Theorem 4
Assume \(n \in\mathbb{Z}^{+}\) and let the function \(\Psi: [\zeta_{1}, \zeta _{2}] \rightarrow\mathbb{R}\) be such that \(\Psi^{(n-1)}\) is absolutely continuous and \(\zeta_{1} \leq1 \leq\zeta_{2}\). If \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1} \leq\frac {p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\), then we have the following new identity:
where
Proof
Use (2) in (6) and the linearity of \(\mathfrak {J}_{1}(\cdot)\) to obtain
Replacing n with \(n - 2\) in (7), one gets
Use (13) in (12) and rearrange the indices to have
Utilize Fubini’s theorem on the last term of (14) to obtain (9). □
Example 1
Choose \(\mathbb{T} = \mathbb{R}\) in Theorem 4 to get the same result as can be obtained from [15, (2.1)] by utilizing (1) and (7).
Example 2
Put \(\mathbb{T} = h\mathbb{Z}~(h>0)\) in Theorem 4 to obtain a new identity in h-discrete calculus with the following values:
and
Remark 3
Choose \(h = 1\) in Example 2. Suppose that \(a = 0, b = n, p_{1}(j) = (p_{1})_{j}\), and \(p_{2}(j) = (p_{2})_{j}\) to get a new identity in the discrete case with the following values:
and
Example 3
Use \(\mathbb{T} = q^{\mathbb{N}_{0}}~(q > 1), a = q^{l}\), and \(b = q^{n}\) with \(l < n\) in Theorem 4 to obtain a new identity in q-calculus with the following values:
and
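In the q-calculus setting of Example 3 the delta integral becomes a Jackson-type sum, since \(\sigma(q^{j}) = q^{j+1}\) on \(\mathbb{T} = q^{\mathbb{N}_{0}}\). A sketch of that reduction (illustrative, with a hypothetical function name):

```python
def delta_integral_q(f, l, n, q=2.0):
    # On T = q^{N_0} with a = q^l and b = q^n (l < n), the delta integral is
    # int_a^b f(z) Delta z = sum_{j=l}^{n-1} (q^{j+1} - q^j) * f(q^j),
    # because mu(q^j) = sigma(q^j) - q^j = (q - 1) q^j.
    return sum((q ** (j + 1) - q ** j) * f(q ** j) for j in range(l, n))
```

For \(f \equiv1\) the sum telescopes to \(b - a = q^{n} - q^{l}\), as it must on any time scale.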
As a consequence of the identities obtained above, the following theorem yields a substantial generalization of inequalities involving the Csiszár divergence on time scales for n-convex \((n \geq3)\) functions.
Theorem 5
Assume the conditions of Theorem 4. Also suppose that Ψ is an n-convex function such that \(\Psi^{(n-1)}\) is absolutely continuous. If
then
Proof
Since \(\Psi^{(n-1)}\) is absolutely continuous on \([\zeta_{1}, \zeta _{2}]\), \(\Psi^{(n)}\) exists almost everywhere. Given that Ψ is n-convex, we have \(\Psi^{(n)}(x)\geq0\) for all \(x \in[\zeta_{1}, \zeta_{2}]\) (see [31, p. 16]). The required result now follows from Theorem 4. □
Theorem 6
Suppose that all the assumptions of Theorem 4 hold. Let \(\Psi\in C^{n}[\zeta_{1}, \zeta_{2}]\) be such that \(\Psi^{(n-1)}\) is absolutely continuous. Moreover, for the functional \(\mathfrak{J}_{1}(\cdot)\) given in (6), we get the following:
\((i)\) Inequality (17) holds provided that n is even and \(n \geq4\).
\((ii)\) Let inequality (17) be satisfied and
for all \(s \in[\zeta_{1}, \zeta_{2}]\). Then
Proof
It is obvious that the Green function \(G(\cdot, s)\) given in (1) is convex. Therefore, by applying Theorem 2 and using Remark 2, one has \(\mathfrak{J}_{1}(G(\cdot, s)) \geq0\).
\((i)\) Since \(W^{[\zeta_{1}, \zeta_{2}]}(t, x) \geq0\) for \(n=4, 6, \ldots\), (16) holds. As Ψ is n-convex, utilizing Theorem 5 one gets (17).
\((ii)\) Use (18) in (17) to get (19). □
Remark 4
Grüss-, Čebyšev-, and Ostrowski-type bounds corresponding to the obtained generalizations can also be deduced.
5 Application to information theory
Shannon entropy is a fundamental notion in information theory and is often regarded as a measure of uncertainty. The entropy of a random variable is defined in terms of its probability distribution and can serve as a good measure of uncertainty or predictability. The Shannon entropy allows the estimation of the average minimum number of bits needed to encode a string of symbols, based on the alphabet size and the frequency of the symbols.
5.1 Differential entropy on time scales
Consider a positive density function p on a time scale associated with a continuous random variable X, with \(\int_{a}^{b} p(\zeta)\Delta\zeta = 1\), whenever the integral exists.
In [4], Ansari et al. defined the so-called differential entropy on a time scale by
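Assuming the definition from [4] takes the usual form \(\tilde{h}_{\bar{b}}(X) = \int_{a}^{b} p(\zeta)\log\frac{1}{p(\zeta)}\Delta\zeta\) (an assumption of this sketch), the h-discrete case can be computed as:

```python
import math

def entropy_hZ(p, a, b, h=1.0):
    # Time-scale entropy on T = hZ, assuming the definition
    # int_a^b p(z) * log(1 / p(z)) Delta z, which on hZ becomes the sum
    # of h * p(z) * log(1 / p(z)) over z = a, a+h, ..., b-h.
    total, z = 0.0, a
    while z < b - 1e-12:
        total += h * p(z) * math.log(1.0 / p(z))
        z += h
    return total

# Uniform density on {0, 1, 2, 3} with h = 1: p = 1/4, entropy = log 4.
uniform_entropy = entropy_hZ(lambda z: 0.25, 0.0, 4.0, 1.0)
```

For the uniform density this recovers the familiar discrete Shannon entropy \(\log4\) of four equally likely outcomes.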
Theorem 7
Let X be a continuous random variable and \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1}\leq\frac{p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\). If n is even \((n = 6, 8,\ldots)\), then
where
and
Proof
It is obvious that the Green function \(G(\cdot, s)\) given in (1) is convex, therefore by using Remark 2, \(\mathfrak{J}_{1}G(\cdot, s)\geq0\). Let
Consequently,
Since Φ is n-convex for even n, where \(n > 4\), (16) holds for even values of \(n \geq6\). The function \(\Psi(x) = -\log x\) is n-convex for \(n = 6, 8, \ldots\) . Use \(\Psi(x) = -\log x\) in Theorem 5 to get (21), where \(\tilde{h}_{\bar{b}}(X)\) is given in (20). □
Example 4
Choose \(\mathbb{T} = \mathbb{R}\) in Theorem 7 to have a new inequality with the following values:
where
and
Example 5
Choose \(\mathbb{T} = h\mathbb{Z}, h > 0\) in Theorem 7 to get a new inequality for the Shannon entropy in h-discrete calculus with the following values:
where
and
Remark 5
Choose \(h = 1\) in Example 5. Suppose that \(a = 0\), \(b = n\), \(p_{1}(j) = (p_{1})_{j}\), and \(p_{2}(j) = (p_{2})_{j}\) to get a new inequality involving the discrete Shannon entropy with the following values:
where
and
Example 6
Choose \(\mathbb{T} = q^{\mathbb{N}_{0}} (q > 1), a = q^{l}\), and \(b = q^{n}\) with \(l < n\) in Theorem 7 to obtain a new inequality for the Shannon entropy in q-calculus with the following values:
where
and
5.2 Kullback–Leibler divergence
Kullback–Leibler divergence on time scales is defined in [5] as follows:
Theorem 8
Let X be a continuous random variable and \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1}\leq\frac{p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\). If n is even \((n = 6, 8,\ldots)\), then
where
and
where \(D(p_{1}, p_{2})\) is given in (23).
Proof
The function \(\Psi(x) = x\ln x\) is n-convex for \(n = 6, 8, \ldots\) . Use \(\Psi(x) = x\ln x \) in Theorem 5 to get (24). □
Example 7
Choose \(\mathbb{T} = \mathbb{R}\) in Theorem 8 to have a new inequality in classical calculus with the following values:
where
is Kullback–Leibler divergence and
Example 8
Choose \(\mathbb{T} = h\mathbb{Z}~(h>0)\) in Theorem 8 to get a new inequality in h-discrete calculus with the following values:
where
and
Remark 6
Choose \(h = 1\) in Example 8. Suppose that \(a = 0, b = n, p_{1}(j) = (p_{1})_{j}\), and \(p_{2}(j) = (p_{2})_{j}\) to get a new inequality involving discrete Kullback–Leibler divergence with the following values:
where
and
Example 9
Choose \(\mathbb{T} = q^{\mathbb{N}_{0}} (q > 1), a = q^{l}\), and \(b = q^{n}\) with \(l < n\) in Theorem 8 to have a new inequality involving Kullback–Leibler divergence in q-calculus with the following values:
where
and
5.3 Jeffreys distance
Jeffreys distance on time scales is defined in [5] as follows:
Theorem 9
Let X be a continuous random variable and \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1}\leq\frac{p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\). If n is even \((n = 6, 8,\ldots)\), then
where
and
where \(D_{J}(p_{1}, p_{2})\) is given in (25).
Proof
The function \(\Psi(x) = (x-1)\ln x\) is n-convex for \(n = 6, 8, \ldots\) . Use \(\Psi(x) = (x-1)\ln x \) in Theorem 5 to get (26). □
Example 10
Choose \(\mathbb{T} = \mathbb{R}\) in Theorem 9 to have a new inequality in classical calculus with the following values:
where
is Jeffreys distance and
Example 11
Choose \(\mathbb{T} = h\mathbb{Z}~(h>0)\) in Theorem 9 to get a new inequality in h-discrete calculus with the following values:
where
and
Remark 7
Put \(h = 1\) in Example 11. Suppose that \(a = 0, b = n, p_{1}(j) = (p_{1})_{j}\), and \(p_{2}(j) = (p_{2})_{j}\) to get a new inequality involving discrete Jeffreys distance with the following values:
where
and
Example 12
Choose \(\mathbb{T} = q^{\mathbb{N}_{0}} (q > 1), a = q^{l}\), and \(b = q^{n}\) with \(l < n\) in Theorem 9 to have a new inequality involving Jeffreys distance in q-calculus with the following values:
where
and
5.4 Triangular discrimination
Triangular discrimination on time scales is defined in [5] as follows:
Theorem 10
Let X be a continuous random variable and \(p_{1}, p_{2} \in\Omega\) with \(\zeta_{1}\leq\frac{p_{1}(y)}{p_{2}(y)} \leq\zeta_{2}\) for all \(y \in\mathbb{T}\). If n is even \((n = 6, 8,\ldots)\), then
where
and
where \(D_{\Delta}(p_{1}, p_{2})\) is given in (27).
Proof
The function \(\Psi(x) = \frac{(x - 1)^{2}}{x + 1}\) is n-convex for \(n = 6, 8, \ldots\) . Use \(\Psi(x) = \frac{(x - 1)^{2}}{x + 1} \) in Theorem 5 to get (28). □
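All four kernel functions used in Sect. 5, namely \(-\log x\), \(x\ln x\), \((x-1)\ln x\), and \((x-1)^{2}/(x+1)\), are n-convex for even \(n \geq6\) because their sixth derivatives are positive on \((0, \infty)\). The closed forms below were computed by hand for this sketch and can be spot-checked numerically:

```python
# Sixth derivatives (hand-computed for this sketch) of the kernels of
# Theorems 7-10; each is positive on (0, inf), hence each kernel is
# 6-convex there, and similarly n-convex for every even n >= 6.
sixth_derivatives = {
    "-log(x)":        lambda x: 120.0 / x ** 6,
    "x*log(x)":       lambda x: 24.0 / x ** 5,
    "(x-1)*log(x)":   lambda x: 24.0 / x ** 5 + 120.0 / x ** 6,
    "(x-1)**2/(x+1)": lambda x: 2880.0 / (x + 1) ** 7,
}

all_positive = all(
    f(v) > 0 for f in sixth_derivatives.values() for v in (0.5, 1.0, 2.0, 10.0)
)
```

The last entry follows from rewriting \((x-1)^{2}/(x+1) = (x+1) - 4 + 4/(x+1)\), whose sixth derivative is \(4 \cdot6!/(x+1)^{7}\).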
Example 13
Choose \(\mathbb{T} = \mathbb{R}\) in Theorem 10 to have a new inequality in classical calculus with the following values:
where
is triangular discrimination and
Example 14
Choose \(\mathbb{T} = h\mathbb{Z}~(h>0)\) in Theorem 10 to get a new inequality in h-discrete calculus with the following values:
where
and
Remark 8
Take \(h = 1\) in Example 14 and consider \(a = 0, b = n, p_{1}(j) = (p_{1})_{j}\), and \(p_{2}(j) = (p_{2})_{j}\) to get a new inequality involving discrete triangular discrimination with the following values:
where
and
Example 15
Choose \(\mathbb{T} = q^{\mathbb{N}_{0}} (q > 1), a = q^{l}\), and \(b = q^{n}\) with \(l < n\) in Theorem 10 to have a new inequality involving triangular discrimination in q-calculus with the following values:
where
and
Availability of data and materials
Data sharing is not applicable to this paper as no data sets were generated or analyzed during the current study.
References
Adeel, M., Khan, K.A., Pečarić, Ð., Pečarić, J.: Generalization of the Levinson inequality with applications to information theory. J. Inequal. Appl. 2019, 230 (2019)
Adil Khan, M., Husain, Z., Chu, Y.M.: New estimates for Csiszár divergence and Zipf-Mandelbrot entropy via Jensen-Mercer’s inequality. Complexity 2020, 8928691 (2020)
Agarwal, R., Bohner, M., Peterson, A.: Inequalities on time scales: a survey. Math. Inequal. Appl. 4, 535–557 (2001)
Ansari, I., Khan, K.A., Nosheen, A., Pečarić, Ð., Pečarić, J.: Shannon type inequalities via time scales theory. Adv. Differ. Equ. 2020, 135 (2020)
Ansari, I., Khan, K.A., Nosheen, A., Pečarić, Ð., Pečarić, J.: Some inequalities for Csiszár divergence via theory of time scales. Adv. Differ. Equ. 2020, 698 (2020)
Ansari, I., Khan, K.A., Nosheen, A., Pečarić, Ð., Pečarić, J.: Estimation of divergence measures via weighted Jensen inequality on time scales. J. Inequal. Appl. 2021, 93 (2021)
Ben Makhlouf, A., Kharrat, M., Hammami, M.A., Baleanu, D.: Henry-Gronwall type q-fractional integral inequalities. Math. Methods Appl. Sci. 44(2), 3–9 (2021)
Bohner, M., Peterson, A.: Dynamic Equations on Time Scales. Birkhäuser, Boston (2001)
Bohner, M., Peterson, A.: Advances in Dynamic Equations on Time Scales. Birkhäuser, Boston (2003)
Brahim, K., Bettaibi, N., Sellemi, M.: On some Feng Qi type q-integral inequalities. J. Inequal. Pure Appl. Math. 9(2), 1–7 (2008)
Butt, S.I., Rasheed, T., Pečarić, Ð., Pečarić, J.: Combinatorial extensions of Popoviciu’s inequality via Abel-Gontscharoff polynomial with applications in information theory. Rad Hrvat. Akad. Znan. Umjet. Mat. Znan. 542, 59–80 (2020)
Cerone, P., Dragomir, S.S.: Some new Ostrowski-type bounds for the Čebyšev functional and applications. J. Math. Inequal. 8(1), 159–170 (2014)
Chao, A., Jost, L., Hsieh, T.C., Ma, K.H., Sherwin, W.B., Rollins, L.A.: Expected Shannon entropy and Shannon differentiation between subpopulations for neutral genes under the finite island model. PLoS ONE 10(6), 1–24 (2015)
Chow, C.K., Lin, C.N.: Approximating discrete probability distributions with dependence trees. IEEE Trans. Inf. Theory 14(3), 462–467 (1968)
Dragomir, S.S.: Other inequalities for Csiszár divergence and applications. Preprint, RGMIA Res. Rep. Coll. (2000)
Fink, A.M.: Bounds of the deviation of a function from its averages. Czechoslov. Math. J. 42(117), 289–310 (1992)
Gauchman, H.: Integral inequalities in q-calculus. Comput. Math. Appl. 47(2–3), 281–300 (2004)
Jackson, F.H.: On q-definite integrals. Q. J. Pure Appl. Math. 41, 193–203 (1910)
Jain, K.C., Mathur, R.: A symmetric divergence measure and its bounds. Tamkang J. Math. 42(4), 493–503 (2011)
Khan, A.R., Pečarić, J., Lipanović, M.R.: n-Exponential convexity for Jensen-type inequalities. J. Math. Inequal. 7(3), 313–335 (2013)
Khan, K.A., Niaz, T., Pečarić, Ð., Pečarić, J.: Refinement of Jensen’s inequality and estimation of f- and Rényi divergence via Montgomery identity. J. Inequal. Appl. 2018, 318 (2018)
Khan, M.A., Pečarić, Ð., Pečarić, J.: A new refinement of the Jensen inequality with applications in information theory. Bull. Malays. Math. Sci. Soc. 44, 267–278 (2021)
Kullback, S.: Information Theory and Statistics. Peter Smith, Gloucester (1978)
Leandro, P.: Statistical Inference Based on Divergence Measures. Chapman and Hall, London (2006)
Lesne, A.: Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics. Math. Struct. Comput. Sci. 24(3), E240311 (2014)
Lyndon, S.H.: Region segmentation using information divergence measures. Med. Image Anal. 8, 233–244 (2004)
Miao, Y., Qi, F.: Several q-integral inequalities. J. Math. Inequal. 3(1), 115–121 (2009)
Niculescu, C.P., Persson, L.E.: Convex Functions and Their Applications. A Contemporary Approach. Springer, New York (2006)
Noor, M.A., Awan, M.U., Noor, K.I.: Quantum Ostrowski inequalities for q-differentiable convex functions. J. Math. Inequal. 10(4), 1013–1018 (2016)
Pečarić, J., Perić, I., Rodić Lipanović, M.: Uniform treatment of Jensen type inequalities. Math. Rep. 16(2), 183–205 (2014)
Pečarić, J., Proschan, F., Tong, Y.L.: Convex Functions, Partial Orderings, and Statistical Applications. Mathematics in Science and Engineering, vol. 187. Academic Press, Boston (1992)
Saker, S.H.: Some nonlinear dynamic inequalities on time scales. Math. Inequal. Appl. 14(3), 633–645 (2011)
Sen, A.: On Economic Inequality. Oxford University Press, London (1973)
Sudsutad, W., Ntouyas, S.K., Tariboon, J.: Quantum integral inequalities for convex functions. J. Math. Inequal. 9(3), 781–793 (2015)
Sun, Y.G., Hassan, T.: Some nonlinear dynamic integral inequalities on time scales. Appl. Math. Comput. 220(4), 221–225 (2013)
Taneja, I.J.: Bounds on non symmetric divergence measures in terms of symmetric divergence measures. J. Comb. Inf. Syst. Sci. 29(14), 115–134 (2005)
Tariboon, J., Ntouyas, S.K.: Quantum calculus on finite intervals and applications to impulsive difference equations. Adv. Differ. Equ. 2013, 1 (2013)
Tariboon, J., Ntouyas, S.K.: Quantum integral inequalities on finite intervals. J. Inequal. Appl. 2014, 1 (2014)
Theil, H.: Economics and Information Theory. North-Holland, Amsterdam (1967)
Topsoe, F.: Some inequalities for information divergence and related measures of discrimination. RGMIA Res. Rep. Collect. 2(1), 85–98 (1999)
Tou, J.T., Gonzales, R.C.: Pattern Recognition Principle. Addison-Wesley, Reading (1974)
Wedrowska, E.: Application of Kullback-Leibler relative entropy for studies on the divergence of household expenditures structures. Olszt. Econ. J. 6, 133–142 (2011)
Widder, D.V.: Completely convex function and Lidstone series. Trans. Am. Math. Soc. 51, 387–398 (1942)
Zhu, C., Yang, W., Zhao, Q.: Some new fractional q-integral Grüss-type inequalities and other inequalities. J. Inequal. Appl. 2012(1), 299 (2012)
Acknowledgements
The authors wish to thank the anonymous referees for their very careful reading of the manuscript and fruitful comments and suggestions. The research of the 5th author (Josip Pečarić) is supported by the Ministry of Education and Science of the Russian Federation (Agreement number 02.a03.21.0008).
Funding
There is no funding for this work.
Contributions
All authors jointly worked on the results and they read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Ansari, I., Khan, K.A., Nosheen, A. et al. Estimation of divergences on time scales via the Green function and Fink’s identity. Adv Differ Equ 2021, 394 (2021). https://doi.org/10.1186/s13662-021-03550-2