New bounds for soft margin estimator via concavity of Gaussian weighting function
Advances in Difference Equations volume 2020, Article number: 644 (2020)
Abstract
In the present article, we develop bounds for the soft margin estimator of "Identification of Patient Zero in Static and Temporal Networks: Robustness and Limitations". To achieve these bounds, we utilize the concavity of the Gaussian weighting function and the well-known Jensen inequality. To acquire more general bounds for the soft margin estimator, we consider general functions defined on rectangles. We also use the behavior of the Jaccard similarity function to extract further bounds for the soft margin estimator.
1 Introduction
Infectious diseases are among the most dangerous threats to human society. When an infectious disease becomes an epidemic, it causes great loss of human life and damages the economy on a large scale. Epidemic infectious diseases are particularly dangerous because they spread rapidly to a massive number of people in a given population within a limited period of time. Many factors contribute to epidemic infectious diseases, such as climate change, genetic change, globalization, and urbanization, and most of these factors are to some extent caused by humans. Researchers from many fields have contributed to the detection of epidemic sources and the control of epidemic spreading. Mathematicians have also played a vital role in the modeling of epidemic spreading.
Contagion processes are among the most studied dynamic processes on real-life complex networks of public interest [11, 12, 22, 24]. To model epidemic spreading, epidemiologists frequently use compartmental models such as the SIR model [17], the SIS model [16], and the SEIR model [20]. These models are very important for explicitly modeling and estimating the numbers of susceptible and infected individuals in a population at risk.
Epidemiologists have obtained many models for epidemic source detection by imposing restrictions on the network structure, on the spreading process of compartmental models (SIR, SIS), or on both [12–14, 23, 25, 28]. Epidemiologists also analyze virus genetic evolution [15, 26] and detect the epidemic source or perform backtracking from given data [10]. Zhu et al. [28] initiated a model in which the source node minimizes the maximum distance to the infected nodes on infinite trees. Altarelli et al. [8] estimated the epidemic source by a message-passing method, replacing the independence assumption with a tree-like contact network. Lokhov et al. [21] estimated the probability that a given node produces the observed snapshot by considering the SIR model and using a message-passing algorithm. Antulov-Fantulin et al. [9] proposed a model to analyze source probability estimators. They dropped the independence assumptions on nodes and all network structures and analyzed source probability estimators for general compartmental models. The soft margin estimator for the model of Antulov-Fantulin et al. [9] is given by

\[ \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )=\frac{1}{n}\sum_{i=1}^{n}\exp \biggl(-\frac{ (\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-1 )^{2}}{a^{2}} \biggr), \quad (1) \]
where \(\overrightarrow{R_{\theta }}\) is a binary vector that indicates the random outcome of the epidemic process, \(\{\overrightarrow{r}_{\theta,1},\overrightarrow{r}_{\theta,2}, \ldots,\overrightarrow{r}_{\theta,n}\}\) are sample vectors representing the n independent outcomes of the epidemic process with source θ, \(\varphi:{\mathbb{R}^{n}}\times {\mathbb{R}^{n}}\rightarrow [0,1]\) is the Jaccard similarity function, computed by dividing the cardinality of the intersection of the sets of infected nodes in \(\overrightarrow{r}_{1}\) and \(\overrightarrow{r}_{2}\) by the cardinality of their union, \(\varphi (\overrightarrow{r_{*}},\overrightarrow{r}_{\theta,i})\) is a random variable that measures the similarity between the fixed realization vector \(\overrightarrow{r_{*}}\) and the random realization vector \(\overrightarrow{r}_{\theta,i}\), and \(\exp (-\frac{(x-1)^{2}}{a^{2}} )\) is the Gaussian weighting function with \(a>0\).
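To make the estimator concrete, the following Python sketch computes the Jaccard similarity and the soft margin estimate from binary infection vectors. The function names `jaccard` and `soft_margin_estimator` are our illustrative choices, not from [9]; the sketch assumes binary NumPy arrays marking infected nodes.

```python
import numpy as np

def jaccard(r1, r2):
    # |infected(r1) ∩ infected(r2)| / |infected(r1) ∪ infected(r2)|
    inter = np.logical_and(r1, r2).sum()
    union = np.logical_or(r1, r2).sum()
    return inter / union if union > 0 else 0.0

def soft_margin_estimator(r_star, samples, a):
    # Average Gaussian weight exp(-(phi - 1)^2 / a^2) of the similarities
    # between the observed realization r_star and the n simulated realizations.
    sims = np.array([jaccard(r_star, r) for r in samples])
    return float(np.mean(np.exp(-((sims - 1.0) ** 2) / a ** 2)))
```

When every simulated realization coincides with the observed one, all similarities equal 1 and the estimate is exactly 1.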
We will use the following hypothesis for the construction of our results throughout the paper.
\(\mathbf{H:}\) Let \(R_{\theta }\) be a binary vector, \(\{\overrightarrow{r}_{\theta,1},\overrightarrow{r}_{\theta,2}, \ldots,\overrightarrow{r}_{\theta,n}\}\) be n independent vectors, \(\overrightarrow{r}_{*}\) be a fixed realization vector, a be a positive real number, \(\varphi:\mathbb{R}^{n}\times \mathbb{R}^{n}\mapsto [0,1]\) be the Jaccard similarity function, and \(\hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\) be the soft margin estimator as given in (1).
In the remaining portion of this section, we briefly discuss convexity and concavity.
The notion of convex and concave functions is important in all fields of science, especially in mathematics, because of its notable properties. Therefore many generalized and interesting results for convex and concave functions and their applications have been obtained [1–7, 18, 19, 27].
Now, the formal definition of convex and concave functions is stated as follows.
Definition 1
Let I be an arbitrary interval in \(\mathbb{R}\). Then the function \(\Psi:I\rightarrow \mathbb{R}\) is convex if the inequality

\[ \Psi \bigl(\lambda x+(1-\lambda )y \bigr)\leq \lambda \Psi (x)+(1-\lambda )\Psi (y) \quad (2) \]
holds for all \(x,y\in I\) and \(\lambda \in [0,1]\).
If inequality (2) holds in the reverse direction, then the function \(\Psi:I\rightarrow \mathbb{R}\) is said to be concave.
Many inequalities have been proved for convex and concave functions. Among these, one of the most prominent and dynamic is the well-known Jensen inequality. Jensen's inequality is one of the most important and general inequalities in the sense that many other inequalities can be deduced from it. Its formal statement reads as follows.
Theorem 1
Let I be an interval in \(\mathbb{R}\), \(\mathbf{x}=(x_{1},x_{2},\ldots,x_{n})\) be an n-tuple such that \(x_{i}\in {I}\) for all \(i\in \{1,2,\ldots,n\}\), and \(\mathbf{p}=(p_{1},p_{2},\ldots,p_{n})\) be a positive n-tuple of real entries with \(P_{n}=\sum_{i=1}^{n}p_{i}\). If the function \(\Psi:{I}\rightarrow \mathbb{R}\) is convex, then

\[ \Psi \Biggl(\frac{1}{P_{n}}\sum_{i=1}^{n}p_{i}x_{i} \Biggr)\leq \frac{1}{P_{n}}\sum_{i=1}^{n}p_{i}\Psi (x_{i}). \quad (3) \]
If the function \(\Psi:{I}\rightarrow \mathbb{R}\) is concave, then inequality (3) holds in the reverse direction.
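As a quick numerical illustration of Theorem 1 in the concave case (a sketch, not part of the paper), the following snippet checks the reversed Jensen inequality for the Gaussian weighting function with random points and weights:

```python
import math
import random

a = math.sqrt(2)
psi = lambda x: math.exp(-((x - 1) ** 2) / a ** 2)  # concave on [0, 1] for a >= sqrt(2)

random.seed(0)
x = [random.random() for _ in range(100)]        # points x_i in [0, 1)
p = [random.random() + 0.1 for _ in range(100)]  # positive weights p_i
Pn = sum(p)

lhs = sum(pi * psi(xi) for pi, xi in zip(p, x)) / Pn   # (1/P_n) sum p_i Psi(x_i)
rhs = psi(sum(pi * xi for pi, xi in zip(p, x)) / Pn)   # Psi((1/P_n) sum p_i x_i)
assert lhs <= rhs  # inequality (3) reversed, since Psi is concave
```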
In this paper, we advance the idea of giving bounds for the soft margin estimator given in (1) by using the existing notion of concave functions. To achieve bounds for the soft margin estimator, we use the concavity of the Gaussian weighting function and Jensen's inequality. To obtain some more general and clearer bounds for the soft margin estimator, we use some general functions defined on rectangles, which are monotonic with respect to the first variable. We also utilize the behavior of the Jaccard similarity function to obtain the desired bounds for the soft margin estimator.
2 Main results
In order to build our results, we first establish the following lemma, which will be needed in the proofs of our main results.
Lemma 1
The Gaussian weighting function \(\Psi:[0,1]\rightarrow \mathbb{R}\) defined by

\[ \Psi (x)=\exp \biggl(-\frac{(x-1)^{2}}{a^{2}} \biggr) \]
is concave for all \(a\in [\sqrt{2},\infty )\).
Proof
To show the concavity of the Gaussian function \(\Psi (x)\), we use the second derivative test. Differentiating \(\Psi (x)\) twice with respect to x, we get

\[ \Psi ''(x)=\exp \biggl(-\frac{(x-1)^{2}}{a^{2}} \biggr) \biggl(\frac{4(x-1)^{2}}{a^{4}}-\frac{2}{a^{2}} \biggr)=\frac{2}{a^{4}}\exp \biggl(-\frac{(x-1)^{2}}{a^{2}} \biggr) \bigl(2(x-1)^{2}-a^{2} \bigr). \]

Since

\[ \frac{2}{a^{4}}\exp \biggl(-\frac{(x-1)^{2}}{a^{2}} \biggr)>0, \]

we just need to show that

\[ 2(x-1)^{2}-a^{2}\leq 0. \]

Clearly,

\[ 2(x-1)^{2}\leq 2 \quad \text{for all } x\in [0,1] \quad (4) \]

and

\[ -a^{2}\leq -2 \quad \text{for all } a\in [\sqrt{2},\infty ). \quad (5) \]

Now, adding (4) and (5), we obtain

\[ 2(x-1)^{2}-a^{2}\leq 0. \]

Hence

\[ \Psi ''(x)\leq 0 \]

for all \(x\in [0,1]\) and \(a\in [\sqrt{2},\infty )\). Consequently,

\[ \Psi (x)=\exp \biggl(-\frac{(x-1)^{2}}{a^{2}} \biggr) \]

is a concave function for all \(x\in [0,1]\) and \(a\in [\sqrt{2},\infty )\). □
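Lemma 1 can be sanity-checked numerically. The sketch below evaluates \(\Psi ''\) on a grid of \([0,1]\) for several admissible values of a (the helper `psi_dd` is our own name, not from the paper):

```python
import math

def psi_dd(x, a):
    # Second derivative of Psi(x) = exp(-(x-1)^2 / a^2):
    # Psi''(x) = exp(-(x-1)^2 / a^2) * (4(x-1)^2/a^4 - 2/a^2)
    return math.exp(-((x - 1) ** 2) / a ** 2) * (4 * (x - 1) ** 2 / a ** 4 - 2 / a ** 2)

# Psi'' <= 0 on [0, 1] whenever a >= sqrt(2), confirming concavity;
# a small tolerance absorbs the boundary case x = 0, a = sqrt(2), where Psi'' = 0.
for a in (math.sqrt(2), 2.0, 10.0):
    assert all(psi_dd(k / 100, a) <= 1e-12 for k in range(101))
```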
In the following result, we obtain bounds for the soft margin estimator by using the concavity of the Gaussian function.
Theorem 2
Let hypothesis H hold with \(a\in [\sqrt{2},\infty )\). Then

\[ \Biggl(1-\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \Biggr)\exp \biggl(-\frac{1}{a^{2}} \biggr)+\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})\leq \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\leq \exp \Biggl(-\frac{ (\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-1 )^{2}}{a^{2}} \Biggr). \quad (6) \]
Proof
By Lemma 1, the Gaussian function \(\Psi (x)=\exp (-\frac{(x-1)^{2}}{a^{2}} )\) is concave on \([0,1]\) for \(a\in [\sqrt{2},\infty )\). Therefore, its graph lies above the chord joining the endpoints \((0,\Psi (0))\) and \((1,\Psi (1))\), that is,

\[ \Psi (x)\geq (1-x)\Psi (0)+x\Psi (1)=(1-x)\exp \biggl(-\frac{1}{a^{2}} \biggr)+x, \quad x\in [0,1]. \quad (7) \]

Now, putting \(x=\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})\) in (7), we obtain

\[ \Psi \bigl(\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \bigr)\geq \bigl(1-\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \bigr)\exp \biggl(-\frac{1}{a^{2}} \biggr)+\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}). \quad (8) \]

Multiplying both sides of (8) by \(\frac{1}{n}\) and taking summation over i, we get

\[ \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\geq \Biggl(1-\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \Biggr)\exp \biggl(-\frac{1}{a^{2}} \biggr)+\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}). \quad (9) \]

Since, by Lemma 1, the Gaussian function \(\Psi \) is concave, Theorem 1 (with \(p_{i}=\frac{1}{n}\)) gives

\[ \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )=\frac{1}{n}\sum_{i=1}^{n}\Psi \bigl(\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \bigr)\leq \Psi \Biggl(\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \Biggr). \quad (10) \]

Now, combining (9) and (10), we obtain (6). □
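The two-sided bound of Theorem 2 is easy to verify numerically. In this sketch the list `sims` plays the role of the similarities \(\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})\) (randomly generated stand-ins, not epidemic data):

```python
import math
import random

a = math.sqrt(2)
psi = lambda x: math.exp(-((x - 1) ** 2) / a ** 2)

random.seed(1)
sims = [random.random() for _ in range(1000)]     # stand-ins for the Jaccard similarities
m = sum(sims) / len(sims)                         # mean similarity
estimate = sum(psi(s) for s in sims) / len(sims)  # soft margin estimate (1)

lower = (1 - m) * math.exp(-1 / a ** 2) + m       # chord of Psi over [0, 1], cf. (9)
upper = psi(m)                                    # Jensen upper bound, cf. (10)
assert lower <= estimate <= upper
```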
In the following theorem, we obtain sharper bounds for the soft margin estimator by imposing a restriction on the Jaccard function.
Theorem 3
Let all the hypotheses of Theorem 2 hold. If \(0< d\leq \varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})\leq D<1\), then

\[ \frac{ (D-\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) )\exp (-\frac{(d-1)^{2}}{a^{2}} )+ (\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-d )\exp (-\frac{(D-1)^{2}}{a^{2}} )}{D-d}\leq \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\leq \exp \Biggl(-\frac{ (\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-1 )^{2}}{a^{2}} \Biggr). \quad (11) \]
Proof
By Lemma 1, for \(a\in [\sqrt{2},\infty )\) and \(x\in [d,D]\), the Gaussian function \(\Psi (x)=\exp (-\frac{(x-1)^{2}}{a^{2}} )\) is concave. Therefore, its graph lies above the chord joining \((d,\Psi (d))\) and \((D,\Psi (D))\), that is,

\[ \Psi (x)\geq \frac{(D-x)\Psi (d)+(x-d)\Psi (D)}{D-d}, \quad x\in [d,D]. \quad (12) \]

Now, substituting \(x=\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})\) in (12) and then multiplying by \(\frac{1}{n}\) and taking summation over i, we obtain

\[ \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\geq \frac{ (D-\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) )\Psi (d)+ (\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-d )\Psi (D)}{D-d}. \quad (13) \]

By Lemma 1, the Gaussian function \(\Psi \) is concave. Therefore, using Theorem 1, we have

\[ \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\leq \Psi \Biggl(\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \Biggr). \quad (14) \]

Now, comparing (13) and (14), we achieve

\[ \frac{ (D-\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) )\Psi (d)+ (\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i})-d )\Psi (D)}{D-d}\leq \hat{P}(\overrightarrow{R}=\overrightarrow{r}_{*}|\Theta =\theta )\leq \Psi \Biggl(\frac{1}{n}\sum_{i=1}^{n}\varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{\theta,i}) \Biggr), \quad (15) \]

which is equivalent to (11). □
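Numerically, the chord over \([d,D]\) in Theorem 3 gives a tighter lower bound than the chord over \([0,1]\) used in Theorem 2. The following sketch uses randomly generated similarities confined to \([d,D]\) (illustrative values, not epidemic data):

```python
import math
import random

a, d, D = math.sqrt(2), 0.2, 0.8
psi = lambda x: math.exp(-((x - 1) ** 2) / a ** 2)

random.seed(2)
sims = [d + (D - d) * random.random() for _ in range(1000)]  # similarities in [d, D]
m = sum(sims) / len(sims)
estimate = sum(psi(s) for s in sims) / len(sims)

lower_dD = ((D - m) * psi(d) + (m - d) * psi(D)) / (D - d)  # chord over [d, D], cf. (13)
lower_01 = (1 - m) * psi(0) + m * psi(1)                    # chord over [0, 1], Theorem 2
assert lower_01 <= lower_dD <= estimate <= psi(m)
```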
In the following theorem, we obtain general bounds for the soft margin estimator by considering a general function defined on rectangles that is increasing with respect to the first variable.
Theorem 4
Let hypothesis H hold with \(a\in [\sqrt{2},\infty )\). Also assume that ϒ is an interval in \(\mathbb{R}\), \(F:\Upsilon \times \Upsilon \rightarrow \mathbb{R}\) is an increasing function with respect to the first variable, and \(\phi:[0,1]\rightarrow \Upsilon \) is an arbitrary function. Then
Proof
By utilizing inequality (6) and the increasing property of F with respect to the first variable, we get (16). □
In the following result, we obtain more general bounds for the soft margin estimator by using a general function defined on rectangles and imposing a restriction on the Jaccard function.
Theorem 5
Let hypothesis H hold with \(a\in [\sqrt{2},\infty )\). Also assume that ϒ is an interval in \(\mathbb{R}\) and \(F:\Upsilon \times \Upsilon \rightarrow \mathbb{R}\) is an increasing function with respect to the first variable. If \(0< d\leq \varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{ \theta,i})\leq D<1\) and \(\phi:[d,D]\rightarrow \Upsilon \) is an arbitrary function, then
Furthermore, the right-hand side of (17) is a decreasing function of D and an increasing function of d.
Proof
By utilizing inequality (11) and the increasing property of F with respect to the first variable, we obtain (17).
Now, we show that the right-hand side of (17) is a decreasing function of D.
Let \(d\leq k_{1}< k_{2}\leq D\). By Lemma 1, the Gaussian function \(\Psi (x)=\exp (-\frac{(x-1)^{2}}{a^{2}} )\) is concave for \(a\in [\sqrt{2},\infty )\). Therefore, the first-order divided difference of \(\Psi \) is decreasing, that is,

\[ \frac{\Psi (k_{2})-\Psi (d)}{k_{2}-d}\leq \frac{\Psi (k_{1})-\Psi (d)}{k_{1}-d}. \quad (18) \]

Multiplying both sides of (18) by \(x-d\geq 0\) and then adding \(\Psi (d)=\exp (-\frac{(d-1)^{2}}{a^{2}} )\), we get

\[ \Psi (d)+\frac{x-d}{k_{2}-d} \bigl(\Psi (k_{2})-\Psi (d) \bigr)\leq \Psi (d)+\frac{x-d}{k_{1}-d} \bigl(\Psi (k_{1})-\Psi (d) \bigr). \quad (19) \]
By utilizing (19) and the fact that \([d,k_{1}]\subseteq [d,k_{2}]\) and F is increasing with respect to the first variable, we attain
Hence, (20) proves that the right-hand side of (17) is a decreasing function of D.
Similarly, we can prove that the right-hand side of (17) is an increasing function of d. □
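The monotonicity claim of Theorem 5 can also be observed numerically: for concave Ψ, the value at a fixed point x of the chord over \([d,D]\) decreases as D grows. A sketch with illustrative values d = 0.1 and x = 0.5 (the helper `chord` is our own name):

```python
import math

a, d, x = math.sqrt(2), 0.1, 0.5
psi = lambda t: math.exp(-((t - 1) ** 2) / a ** 2)

def chord(D):
    # Value at x of the chord of Psi over [d, D]; requires d < x <= D.
    return psi(d) + (x - d) * (psi(D) - psi(d)) / (D - d)

# The lower bound weakens (decreases) as D increases, in line with Theorem 5.
values = [chord(D) for D in (0.5, 0.6, 0.7, 0.8, 0.9)]
assert all(v1 >= v2 for v1, v2 in zip(values, values[1:]))
```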
In the succeeding theorem, we obtain general bounds for the soft margin estimator by taking a general function defined on rectangles that is decreasing with respect to the first variable.
Theorem 6
Let hypothesis H hold with \(a\in [\sqrt{2},\infty )\). Also assume that ϒ is an interval in \(\mathbb{R}\), \(F:\Upsilon \times \Upsilon \rightarrow \mathbb{R}\) is a decreasing function with respect to the first variable and \(\phi:[0,1]\rightarrow \Upsilon \) is an arbitrary function. Then
Proof
By utilizing inequality (6) and the decreasing property of F with respect to the first variable, we get (21). □
In the next result, we obtain more general bounds for the soft margin estimator by using a general function defined on rectangles that is decreasing with respect to the first variable and imposing a restriction on the Jaccard function.
Theorem 7
Let hypothesis H hold with \(a\in [\sqrt{2},\infty )\). Also assume that ϒ is an interval in \(\mathbb{R}\) and \(F:\Upsilon \times \Upsilon \rightarrow \mathbb{R}\) is a decreasing function with respect to the first variable. If \(0< d\leq \varphi (\overrightarrow{r}_{*},\overrightarrow{r}_{ \theta,i})\leq D<1\) and \(\phi:[d,D]\rightarrow \Upsilon \) is an arbitrary function, then
Furthermore, the right-hand side of (22) is an increasing function of D and a decreasing function of d.
Proof
By using inequality (11) and the decreasing property of F with respect to the first variable, we get (22).
Now, we show that the right-hand side of (22) is an increasing function of D.
Let \(d\leq k_{1}< k_{2}\leq D\). By Lemma 1, the Gaussian function \(\Psi (x)=\exp (-\frac{(x-1)^{2}}{a^{2}} )\) is concave for \(a\in [\sqrt{2},\infty )\). Therefore, the first-order divided difference of \(\Psi \) is decreasing, that is,

\[ \frac{\Psi (k_{2})-\Psi (d)}{k_{2}-d}\leq \frac{\Psi (k_{1})-\Psi (d)}{k_{1}-d}. \quad (23) \]

Multiplying both sides of (23) by \(x-d\geq 0\) and then adding \(\Psi (d)=\exp (-\frac{(d-1)^{2}}{a^{2}} )\), we get

\[ \Psi (d)+\frac{x-d}{k_{2}-d} \bigl(\Psi (k_{2})-\Psi (d) \bigr)\leq \Psi (d)+\frac{x-d}{k_{1}-d} \bigl(\Psi (k_{1})-\Psi (d) \bigr). \quad (24) \]
By utilizing (24) and the fact that \([d,k_{1}]\subseteq [d,k_{2}]\) and F is decreasing with respect to the first variable, we obtain
Hence (25) confirms that the right-hand side of (22) is an increasing function of D.
Similarly, we can prove that the right-hand side of (22) is a decreasing function of d. □
3 Conclusion
In this paper, we derived some useful bounds for the soft margin estimator given in (1) with the help of the notion of concavity. In obtaining these bounds, we exploited the properties of the Jaccard similarity function. To obtain more general bounds for the soft margin estimator, we considered general functions defined on rectangles that are monotonic with respect to the first variable.
Availability of data and materials
Not applicable.
References
Adil Khan, M., Khan, J., Pečarić, J.: Generalization of Jensen’s and Jensen–Steffensen’s inequalities by generalized majorization theorem. J. Math. Inequal. 11(4), 1049–1074 (2017)
Adil Khan, M., Khan, S., Chu, Y.-M.: A new bound for the Jensen gap with applications in information theory. IEEE Access 8, 98001–98008 (2020)
Adil Khan, M., Latif, N., Pečarić, J.: Generalization of majorization theorem. J. Math. Inequal. 9(3), 847–872 (2015)
Adil Khan, M., Latif, N., Pečarić, J.: Generalization of majorization theorem via Abel–Gontscharoff polynomial. Rad Hrvat. Akad. Znan. Umjet. Mat. Znan. 19(523), 91–116 (2015)
Adil Khan, M., Pečarić, Ð., Pečarić, J.: New refinement of the Jensen inequality associated to certain functions with applications. J. Inequal. Appl. 2020, Article ID 74 (2020)
Adil Khan, M., Wu, S.-H., Ullah, H., Chu, Y.-M.: Discrete majorization type inequalities for convex functions on rectangles. J. Inequal. Appl. 2019, Article ID 16 (2019)
Ahmed, K., Adil Khan, M., Khan, S., Ali, A., Chu, Y.-M.: New estimates for generalized Shannon and Zipf–Mandelbrot entropies via convexity results. Results Phys. 18, Article ID 103305 (2020)
Altarelli, F., Braunstein, A., Dall'Asta, L., Lage-Castellanos, A., Zecchina, R.: Bayesian inference of epidemics on networks via belief propagation. Phys. Rev. Lett. 112(11), 118701 (2014)
Antulov-Fantulin, N., Lančić, A., Šmuc, T., Štefančić, H., Šikić, M.: Identification of patient zero in static and temporal networks: robustness and limitations. Phys. Rev. Lett. 114, 248701 (2015)
Auerbach, D.M., Darrow, W.W., Jaffe, H.W., Curran, J.W.: Cluster of cases of the acquired immune deficiency syndrome. Am. J. Med. 76, 487–491 (1984)
Castellano, C., Pastor-Satorras, R.: Thresholds for epidemic spreading in networks. Phys. Rev. Lett. 105, 218701 (2010)
Colizza, V., Pastor-Satorras, R., Vespignani, A.: Reaction–diffusion processes and metapopulation models in heterogeneous networks. Nat. Phys. 3(4), 276–282 (2007)
Comin, C.H., da F. Costa, L.: Identifying the starting point of a spreading process in complex networks. Phys. Rev. E 84, 056105 (2011)
Dong, W., Zhang, W., Tan, C.W.: Rooting out the rumor culprit from suspects. In: IEEE Intl. Symp. on Information Theory, pp. 2671–2675 (2013)
Du, X., Dong, L., Lan, Y., Peng, Y., Wu, A., Zhang, Y., Huang, W., Wang, D., Wang, M., Guo, Y.: Mapping of H3N2 influenza antigenic evolution in China reveals a strategy for vaccine strain recommendation. Nat. Commun. 3(1), 1–9 (2012)
Gray, A., Greenhalgh, D., Hu, L., Mao, X., Pan, J.: A stochastic differential equation SIS epidemic model. SIAM J. Appl. Math. 71(3), 876–902 (2011)
Keeling, M.J., Rohani, P.: Modeling Infectious Diseases in Humans and Animals. Princeton University Press, Princeton (2011)
Khan, S., Adil Khan, M., Butt, S.I., Chu, Y.-M.: A new bound for the Jensen gap pertaining twice differentiable functions with applications. Adv. Differ. Equ. 2020, Article ID 333 (2020)
Khan, S., Adil Khan, M., Chu, Y.-M.: New converses of the Jensen’s inequality via Green functions with applications. Rev. R. Acad. Cienc. Exactas Fís. Nat., Ser. A Mat. 114(3), Article ID 114 (2020)
Legrand, J., Grais, R.F., Boelle, P.Y., Valleron, A.J., Flahault, A.: Understanding the dynamics of Ebola epidemics. Epidemiol. Infect. 135, 610–621 (2007)
Lokhov, A.Y., Mézard, M., Ohta, H., Zdeborová, L.: Inferring the origin of an epidemic with a dynamic message-passing algorithm. Phys. Rev. E 90(1), 012801 (2014)
Pastor-Satorras, R., Vespignani, A.: Epidemic spreading in scale-free networks. Phys. Rev. Lett. 86(14), 3200 (2001)
Pinto, P.C., Thiran, P., Vetterli, M.: Locating the source of diffusion in large-scale networks. Phys. Rev. Lett. 109(6), 068702 (2012)
Vespignani, A.: Modelling dynamical processes in complex socio-technical systems. Nat. Phys. 8(1), 32–39 (2012)
Wang, Z., Dong, W., Zhang, W., Tan, C.W.: Rumor source detection with multiple observations: fundamental limits and algorithms. In: ACM SIGMETRICS Perform. Eval. Rev., pp. 1–13 (2014)
Worobey, M., Han, G.Z., Rambaut, A.: Genesis and pathogenesis of the 1918 pandemic H1N1 influenza a virus. Proc. Natl. Acad. Sci. USA 111, 8107–8112 (2014)
Wu, S., Adil Khan, M., Haleemzai, H.U.: Refinements of majorization inequality involving convex functions via Taylor’s theorem with mean value form of the remainder. Mathematics 7(8), 663 (2019)
Zhu, K., Ying, L.: Information source detection in the SIR model: a sample path based approach. In: Information Theory and Applications Workshop. ITA, pp. 1–9. IEEE Press, New York (2013)
Acknowledgements
The publication was supported by the Ministry of Education and Science of the Russian Federation (the Agreement number No. 02.a03.21.0008).
Funding
There is no funding.
Author information
Contributions
All the authors contributed equally to the writing of this paper. All the authors read and approved the final manuscript.
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Ullah, H., Adil Khan, M. & Pečarić, J. New bounds for soft margin estimator via concavity of Gaussian weighting function. Adv Differ Equ 2020, 644 (2020). https://doi.org/10.1186/s13662-020-03103-z