New bounds for the soft margin estimator via concavity of the Gaussian weighting function

In this article, we develop bounds for the soft margin estimator of "Identification of Patient Zero in Static and Temporal Networks: Robustness and Limitations". To obtain these bounds, we exploit the concavity of the Gaussian weighting function together with the well-known Jensen inequality. To derive more general bounds for the soft margin estimator, we consider general functions defined on rectangles. We also use the behavior of the Jaccard similarity function to extract sharper bounds for the soft margin estimator.


Introduction
Infectious diseases are among the most dangerous threats to human society. When an infectious disease becomes an epidemic, it causes great loss of human life and damages the economy on a large scale. Epidemic infectious diseases are especially dangerous because they spread rapidly to a massive number of people in a given population within a limited period of time. Factors contributing to epidemic infectious diseases include climate change, genetic change, globalization, and urbanization, and most of these factors are to some extent caused by humans. Researchers from many fields have contributed to the detection of epidemic sources and the control of epidemic spreading. Mathematicians, in particular, have played a vital role in modeling epidemic spreading.
Contagion processes are among the most widely studied dynamic processes on real-life complex networks of public interest [11, 12, 22, 24]. To model epidemic spreading, epidemiologists frequently use compartmental models such as the SIR [17], SIS [16], and SEIR [20] models. These models are very important for explicitly modeling and estimating the numbers of susceptible and infected individuals in a population at risk.
Epidemiologists have obtained many models for epidemic source detection by imposing restrictions on the network structure, on the spreading process of compartmental models (SIR, SIS), or on both [12-14, 23, 25, 28]. Epidemiologists also analyze the genetic evolution of viruses [15, 26] and detect the epidemic source, or perform back-tracking, by using the given data [10]. Zhu et al. [28] initiated a model in which they established that the maximum distance to the infected nodes can be minimized by the source nodes on infinite trees. Altarelli et al. [8] estimated the epidemic source by using the message-passing method, where they replaced the independence assumption by a tree-like contact network. Lokhov et al. [21] estimated the probability that a given node produces the observed snapshot by considering the SIR model and using a message-passing algorithm. Antulov-Fantulin et al. [9] proposed a model to analyze source probability estimators. They dropped the independence assumptions on the nodes and on the network structure and analyzed the source probability estimators for general compartmental models. The soft margin estimator for the model of Antulov-Fantulin et al. [9] is given by
$$\hat{P}(\vec{r}^{\,*}\mid\theta)=\frac{1}{n}\sum_{i=1}^{n}\exp\Bigl(-\frac{(\phi(\vec{r}^{\,*},\vec{r}_{\theta,i})-1)^{2}}{a^{2}}\Bigr),\qquad(1)$$
where $\vec{R}_{\theta}$ is a binary vector that indicates the random outcome of the epidemic process, $\{\vec{r}_{\theta,1},\ldots,\vec{r}_{\theta,n}\}$ are the sample vectors representing the $n$ independent outcomes of the epidemic process with source $\theta$, $\phi:\mathbb{R}^{n}\times\mathbb{R}^{n}\to[0,1]$ is the Jaccard similarity function, which is calculated by dividing the cardinality of the intersection of the sets of infected nodes in $\vec{r}_{1},\vec{r}_{2}$ by the cardinality of their union, $\phi(\vec{r}^{\,*},\vec{r}_{\theta,i})$ is a random variable that measures the similarity between the fixed realization vector $\vec{r}^{\,*}$ and the random realization vector $\vec{r}_{\theta,i}$, and $\exp(-(x-1)^{2}/a^{2})$ is the Gaussian weighting function with $a>0$. We will refer to this setting as hypothesis H throughout the paper.
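As a concrete illustration, the estimator (1) can be sketched in a few lines of Python. The representation of realization vectors as sets of infected node labels and the toy data below are our own illustrative assumptions, not taken from [9]:

```python
import math

def jaccard(r1, r2):
    """Jaccard similarity: |intersection| / |union| of the infected-node sets."""
    s1, s2 = set(r1), set(r2)
    return len(s1 & s2) / len(s1 | s2)

def soft_margin_estimate(r_star, samples, a):
    """Sample-average form of the soft margin estimator (1): the mean Gaussian
    weight exp(-(phi - 1)**2 / a**2) of the Jaccard similarities between the
    observed realization r_star and each sampled outcome."""
    return sum(math.exp(-(jaccard(r_star, r) - 1) ** 2 / a ** 2)
               for r in samples) / len(samples)

# Illustrative toy data: an observed infected set and two sampled outbreaks.
r_star = {1, 2, 3}
samples = [{1, 2, 3}, {1, 2}]
estimate = soft_margin_estimate(r_star, samples, a=math.sqrt(2))
```

Since every Gaussian weight lies in (0, 1], the estimate always lies in (0, 1], and it equals 1 exactly when every sampled outcome coincides with the observed realization.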
The notion of convex and concave functions is influential in all fields of science, especially in mathematics, because of its notable properties. Therefore many generalized and interesting results for convex and concave functions and their applications have been established [1-7, 18, 19, 27]. The formal definition of convex and concave functions is stated as follows.

Definition 1
Let $I$ be an arbitrary interval in $\mathbb{R}$. A function $f: I \to \mathbb{R}$ is said to be convex if
$$f(\lambda x + (1-\lambda)y) \le \lambda f(x) + (1-\lambda)f(y)\qquad(2)$$
holds for all $x, y \in I$ and $\lambda \in [0,1]$. If inequality (2) holds in the reverse direction, then the function $f: I \to \mathbb{R}$ is said to be concave.
Many inequalities have been proved for convex and concave functions. Among them, one of the most prominent and dynamic is the well-known Jensen inequality. Jensen's inequality is among the most general inequalities in the sense that many other inequalities can be deduced from it. The formal statement of Jensen's inequality reads as follows.

Theorem 1 Let $I$ be an interval in $\mathbb{R}$, $x = (x_1, x_2, \ldots, x_n)$ be an $n$-tuple such that $x_i \in I$ for all $i \in \{1, 2, \ldots, n\}$, and $p = (p_1, p_2, \ldots, p_n)$ be a positive $n$-tuple of real entries with $P_n = \sum_{i=1}^{n} p_i$. If the function $f: I \to \mathbb{R}$ is convex, then
$$f\Bigl(\frac{1}{P_n}\sum_{i=1}^{n} p_i x_i\Bigr) \le \frac{1}{P_n}\sum_{i=1}^{n} p_i f(x_i).\qquad(3)$$
If the function $f: I \to \mathbb{R}$ is concave, then inequality (3) holds in the reverse direction.
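As a quick sanity check, Jensen's inequality (3) can be verified numerically for the convex function $f(x) = x^2$; the points and positive weights below are arbitrary illustrative choices:

```python
# Numerical spot check of Jensen's inequality (3) for a convex function.
f = lambda x: x * x          # convex on R
x = [0.2, 0.5, 0.9]          # points in the interval I
p = [1.0, 2.0, 3.0]          # positive weights, P_n = sum(p)
P_n = sum(p)
weighted_mean = sum(pi * xi for pi, xi in zip(p, x)) / P_n
lhs = f(weighted_mean)                                  # f of the weighted mean
rhs = sum(pi * f(xi) for pi, xi in zip(p, x)) / P_n     # weighted mean of f
```

For a concave function the comparison reverses, which is the direction used later for the Gaussian weighting function.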
In this paper, we develop bounds for the soft margin estimator given in (1) by exploiting the existing notion of concave functions. To obtain these bounds, we use the concavity of the Gaussian weighting function together with Jensen's inequality. To obtain more general and explicit bounds for the soft margin estimator, we use general functions defined on rectangles that are monotonic with respect to the first variable. We also utilize the behavior of the Jaccard similarity function to obtain the desired bounds for the soft margin estimator.

Main results
In order to establish our results, we first prove the following lemma, which will be needed later.

Lemma 1 The Gaussian weighting function $\psi(x) = \exp(-(x-1)^{2}/a^{2})$ is concave on $[0,1]$ for every $a \in [\sqrt{2}, \infty)$.
Proof To show the concavity of the Gaussian function $\psi(x)$, we use the second derivative test. Differentiating $\psi(x)$ twice with respect to $x$, we get
$$\psi'(x) = -\frac{2(x-1)}{a^{2}}\exp\Bigl(-\frac{(x-1)^{2}}{a^{2}}\Bigr),\qquad \psi''(x) = \frac{2}{a^{4}}\exp\Bigl(-\frac{(x-1)^{2}}{a^{2}}\Bigr)\bigl(2(x-1)^{2}-a^{2}\bigr).$$
Here $\exp(-(x-1)^{2}/a^{2}) > 0$ and $a^{4} > 0$.
So, we just need to show that $2(x-1)^{2} - a^{2} \le 0$. Since $x \in [0,1]$, we have
$$2(x-1)^{2} \le 2,\qquad(4)$$
and since $a \ge \sqrt{2}$, we have
$$-a^{2} \le -2.\qquad(5)$$
Now, adding (4) and (5), we obtain $2(x-1)^{2} - a^{2} \le 0$, so $\psi''(x) \le 0$ on $[0,1]$, which proves the concavity of $\psi$.

In the following result, we acquire bounds for the soft margin estimator by adopting the concavity of the Gaussian function.

Theorem 2 Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$. Then
$$\exp\Bigl(-\frac{1}{a^{2}}\Bigr)(1-\bar{\phi}) + \bar{\phi} \le \hat{P}(\vec{r}^{\,*}\mid\theta) \le \exp\Bigl(-\frac{(\bar{\phi}-1)^{2}}{a^{2}}\Bigr),\qquad(6)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$.

Proof The upper bound follows by applying Jensen's inequality (3), reversed for concave functions, with equal weights $p_{i} = 1/n$ to the concave function $\psi$ of Lemma 1. For the lower bound, the concavity of $\psi$ on $[0,1]$ gives the chord bound $\psi(x) \ge (1-x)\psi(0) + x\psi(1) = (1-x)\exp(-1/a^{2}) + x$ for every $x \in [0,1]$; applying it at each $\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$ and averaging yields the left inequality.
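The sign analysis behind Lemma 1 can be spot-checked numerically: below, the second derivative of the Gaussian weighting function is evaluated on a grid over [0, 1] for a = √2, and at a single point for a value a < √2 where concavity fails (a minimal sketch with illustrative grid resolution):

```python
import math

def psi_dd(x, a):
    """Second derivative of psi(x) = exp(-(x - 1)**2 / a**2), namely
    (2 / a**4) * exp(-(x - 1)**2 / a**2) * (2 * (x - 1)**2 - a**2)."""
    return (2 / a ** 4) * math.exp(-(x - 1) ** 2 / a ** 2) * (2 * (x - 1) ** 2 - a ** 2)

a = math.sqrt(2)
grid = [i / 100 for i in range(101)]
concave_on_grid = all(psi_dd(x, a) <= 0 for x in grid)   # psi'' <= 0 on [0, 1]
fails_below_threshold = psi_dd(0.0, 1.0) > 0             # a = 1 < sqrt(2): not concave at 0
```

The second check shows that the threshold a ≥ √2 is not an artifact of the proof: for a = 1 the second derivative is already positive at x = 0.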
In the following theorem, we get sharper bounds for the soft margin estimator by imposing a restriction on the Jaccard function.

Theorem 3 Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$, and suppose that $d \le \phi(\vec{r}^{\,*}, \vec{r}_{\theta,i}) \le D$ for all $i \in \{1, \ldots, n\}$, where $0 \le d < D \le 1$. Then
$$\frac{(D-\bar{\phi})\exp\bigl(-\frac{(d-1)^{2}}{a^{2}}\bigr) + (\bar{\phi}-d)\exp\bigl(-\frac{(D-1)^{2}}{a^{2}}\bigr)}{D-d} \le \hat{P}(\vec{r}^{\,*}\mid\theta) \le \exp\Bigl(-\frac{(\bar{\phi}-1)^{2}}{a^{2}}\Bigr),\qquad(11)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$.

Proof The upper bound is that of (6). For the lower bound, the concavity of $\psi$ on $[d, D]$ gives $\psi(x) \ge \frac{(D-x)\psi(d) + (x-d)\psi(D)}{D-d}$ for every $x \in [d, D]$; applying it at each $\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$ and averaging, and noting that the chord is linear in $x$, yields the left inequality of (11).
In the following theorem, we acquire general bounds for the soft margin estimator by considering a general function defined on rectangles that is increasing with respect to the first variable.

Theorem 4 Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$. Also assume that $\Upsilon$ is an interval in $\mathbb{R}$, $F: \Upsilon \times \Upsilon \to \mathbb{R}$ is an increasing function with respect to the first variable, and $\varphi: [0,1] \to \Upsilon$ is an arbitrary function. Then for every $u \in [0,1]$,
$$F\Bigl(\exp\Bigl(-\frac{1}{a^{2}}\Bigr)(1-\bar{\phi}) + \bar{\phi},\ \varphi(u)\Bigr) \le F\bigl(\hat{P}(\vec{r}^{\,*}\mid\theta),\ \varphi(u)\bigr) \le F\Bigl(\exp\Bigl(-\frac{(\bar{\phi}-1)^{2}}{a^{2}}\Bigr),\ \varphi(u)\Bigr),\qquad(16)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$.

Proof By utilizing inequality (6) and the increasing property of $F$ with respect to the first variable, we get (16).
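The mechanism of this theorem can be illustrated numerically: applying a function F that is increasing in its first variable to both sides of the Jensen-type comparison (6) preserves the inequality. The similarity values and the particular F below are illustrative assumptions:

```python
import math

def psi(x, a):
    """Gaussian weighting function exp(-(x - 1)**2 / a**2)."""
    return math.exp(-(x - 1) ** 2 / a ** 2)

a = math.sqrt(2)
phis = [0.2, 0.6, 0.9]                                # hypothetical Jaccard similarities
mean_phi = sum(phis) / len(phis)
estimate = sum(psi(t, a) for t in phis) / len(phis)   # soft margin estimate
upper = psi(mean_phi, a)                              # Jensen-type upper bound

F = lambda u, v: u ** 3 + v    # increasing in the first variable (for u >= 0)
y = 0.5                        # arbitrary fixed second argument
```

Since `estimate <= upper` and F is increasing in its first argument, `F(estimate, y) <= F(upper, y)` as well.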
In the following result, we obtain more specific bounds for the soft margin estimator by using a general function defined on rectangles and imposing a restriction on the Jaccard function.

Theorem 5 Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$. Also assume that $\Upsilon$ is an interval in $\mathbb{R}$, $F: \Upsilon \times \Upsilon \to \mathbb{R}$ is an increasing function with respect to the first variable, and $\varphi: [0,1] \to \Upsilon$ is an arbitrary function. If $d \le \phi(\vec{r}^{\,*}, \vec{r}_{\theta,i}) \le D$ for all $i \in \{1, \ldots, n\}$, where $0 \le d < D \le 1$, then for every $u \in [0,1]$,
$$F\bigl(\hat{P}(\vec{r}^{\,*}\mid\theta),\ \varphi(u)\bigr) \ge F\Bigl(\frac{(D-\bar{\phi})\exp\bigl(-\frac{(d-1)^{2}}{a^{2}}\bigr) + (\bar{\phi}-d)\exp\bigl(-\frac{(D-1)^{2}}{a^{2}}\bigr)}{D-d},\ \varphi(u)\Bigr),\qquad(17)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$. Furthermore, the right-hand side of (17) is a decreasing function of $D$ and an increasing function of $d$.
Proof By utilizing the lower bound in inequality (11) and the increasing property of $F$ with respect to the first variable, we obtain (17). Now, we show that the right-hand side of (17) is a decreasing function of $D$.
Let $D_{2} \ge D_{1} > d$ and write $\psi(t) = \exp(-(t-1)^{2}/a^{2})$. Since $\psi$ is concave, the first-order divided difference of $\psi(x)$ is decreasing, that is,
$$\frac{\psi(D_{2}) - \psi(d)}{D_{2} - d} \le \frac{\psi(D_{1}) - \psi(d)}{D_{1} - d}.\qquad(18)$$
Multiplying both sides of (18) by $x - d \ge 0$ and then adding $\exp\bigl(-\frac{(d-1)^{2}}{a^{2}}\bigr)$, we get
$$\psi(d) + (x-d)\frac{\psi(D_{2}) - \psi(d)}{D_{2} - d} \le \psi(d) + (x-d)\frac{\psi(D_{1}) - \psi(d)}{D_{1} - d}.\qquad(19)$$
By utilizing (19) and the increasing property of $F$ with respect to the first variable, we obtain, for the fixed second argument $y$ of $F$ in (17),
$$F\Bigl(\psi(d) + (x-d)\frac{\psi(D_{2}) - \psi(d)}{D_{2} - d},\ y\Bigr) \le F\Bigl(\psi(d) + (x-d)\frac{\psi(D_{1}) - \psi(d)}{D_{1} - d},\ y\Bigr).\qquad(20)$$
Hence (20) proves that the right-hand side of (17) is a decreasing function of $D$.
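The monotonicity argument used here can be checked numerically: the chord value psi(d) + (x - d) * (psi(D) - psi(d)) / (D - d), at a fixed evaluation point x, decreases as D grows, and by concavity it never exceeds psi(x). The particular values of d, x, and D below are illustrative:

```python
import math

def psi(t, a=math.sqrt(2)):
    """Gaussian weighting function exp(-(t - 1)**2 / a**2)."""
    return math.exp(-(t - 1) ** 2 / a ** 2)

def chord(x, d, D):
    """Secant-line value psi(d) + (x - d) * (psi(D) - psi(d)) / (D - d)."""
    return psi(d) + (x - d) * (psi(D) - psi(d)) / (D - d)

d, x = 0.1, 0.4
values = [chord(x, d, D) for D in (0.5, 0.7, 0.9)]   # decreasing as D grows
```

The same computation with d varied instead of D would illustrate the companion claim that the chord value increases in d.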
Similarly, we can prove that the right-hand side of (17) is an increasing function of d.
In the succeeding theorem, we acquire general bounds for the soft margin estimator by taking a general function defined on rectangles that is decreasing with respect to the first variable.

Theorem 6
Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$. Also assume that $\Upsilon$ is an interval in $\mathbb{R}$, $F: \Upsilon \times \Upsilon \to \mathbb{R}$ is a decreasing function with respect to the first variable, and $\varphi: [0,1] \to \Upsilon$ is an arbitrary function. Then for every $u \in [0,1]$,
$$F\Bigl(\exp\Bigl(-\frac{(\bar{\phi}-1)^{2}}{a^{2}}\Bigr),\ \varphi(u)\Bigr) \le F\bigl(\hat{P}(\vec{r}^{\,*}\mid\theta),\ \varphi(u)\bigr) \le F\Bigl(\exp\Bigl(-\frac{1}{a^{2}}\Bigr)(1-\bar{\phi}) + \bar{\phi},\ \varphi(u)\Bigr),\qquad(21)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$.

Proof By utilizing inequality (6) and the decreasing property of $F$ with respect to the first variable, we get (21).
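The mechanism of Theorem 6 can be illustrated numerically: a function F that is decreasing in its first variable reverses the Jensen-type comparison (6). The similarity values and the particular F below are illustrative assumptions:

```python
import math

def psi(x, a=math.sqrt(2)):
    """Gaussian weighting function exp(-(x - 1)**2 / a**2)."""
    return math.exp(-(x - 1) ** 2 / a ** 2)

phis = [0.2, 0.6, 0.9]                       # hypothetical Jaccard similarities
estimate = sum(psi(t) for t in phis) / len(phis)
upper = psi(sum(phis) / len(phis))           # Jensen-type upper bound on the estimate

F = lambda u, v: -2.0 * u + v   # decreasing in the first variable
y = 0.5                         # arbitrary fixed second argument
```

Since `estimate <= upper` and F is decreasing in its first argument, the comparison flips: `F(estimate, y) >= F(upper, y)`.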
In the next result, we obtain more specific bounds for the soft margin estimator by using a general function defined on rectangles that is decreasing with respect to the first variable, and by imposing a restriction on the Jaccard function.
Theorem 7 Let hypothesis H hold with $a \in [\sqrt{2}, \infty)$. Also assume that $\Upsilon$ is an interval in $\mathbb{R}$, $F: \Upsilon \times \Upsilon \to \mathbb{R}$ is a decreasing function with respect to the first variable, and $\varphi: [0,1] \to \Upsilon$ is an arbitrary function. If $d \le \phi(\vec{r}^{\,*}, \vec{r}_{\theta,i}) \le D$ for all $i \in \{1, \ldots, n\}$, where $0 \le d < D \le 1$, then for every $u \in [0,1]$,
$$F\bigl(\hat{P}(\vec{r}^{\,*}\mid\theta),\ \varphi(u)\bigr) \le F\Bigl(\frac{(D-\bar{\phi})\exp\bigl(-\frac{(d-1)^{2}}{a^{2}}\bigr) + (\bar{\phi}-d)\exp\bigl(-\frac{(D-1)^{2}}{a^{2}}\bigr)}{D-d},\ \varphi(u)\Bigr),\qquad(22)$$
where $\bar{\phi} = \frac{1}{n}\sum_{i=1}^{n}\phi(\vec{r}^{\,*}, \vec{r}_{\theta,i})$. Furthermore, the right-hand side of (22) is an increasing function of $D$ and a decreasing function of $d$.
Proof By using the lower bound in inequality (11) and the decreasing property of $F$ with respect to the first variable, we get (22). That the right-hand side of (22) is an increasing function of $D$ follows from (19) and the decreasing property of $F$, analogously to the proof of Theorem 5.
Similarly, we can prove that the right-hand side of (22) is a decreasing function of d.

Conclusion
In this paper, we derived useful bounds for the soft margin estimator given in (1) with the help of the notion of concavity. In obtaining these bounds, we exploited the characteristics of the Jaccard similarity function. To obtain more general bounds for the soft margin estimator, we considered general functions defined on rectangles that are monotonic with respect to the first variable.