Abstract
In this paper, we present some inequalities involving the k-gamma and k-beta functions via some classical inequalities, such as Chebyshev's inequality for synchronous (asynchronous) mappings and the Grüss and Ostrowski inequalities. We also give applications of the k-beta function in probability distributions. Most of the inequalities produced in this paper are k-analogs of existing results; if k = 1, we recover the classical ones.
1 Introduction
In this section, we present some fundamental relations for the k-gamma and k-beta functions introduced in [1–7]. In Section 2, we introduce some k-analog properties of the mapping , which are helpful in the coming sections. Sections 3 to 5 are devoted to applications of some integral inequalities, namely Chebyshev's, Grüss's, and Ostrowski's inequalities, to k-beta mappings. In the last section, we give applications of the said function to probability distributions and probability density functions.
Recently, Diaz and Pariguan [1] introduced the generalized k-gamma function as
\Gamma_k(x) = \lim_{n \to \infty} \frac{n!\, k^n (nk)^{\frac{x}{k}-1}}{(x)_{n,k}}, \quad k > 0, \qquad (1)
and also gave the properties of the said function. \Gamma_k is a one-parameter deformation of the classical gamma function such that \Gamma_k \to \Gamma as k \to 1. \Gamma_k is based on the repeated appearance of the expression of the following form:
\alpha (\alpha + k)(\alpha + 2k)(\alpha + 3k) \cdots (\alpha + (n-1)k). \qquad (2)
The function of the variable α given by the statement (2), denoted by (\alpha)_{n,k}, is called the Pochhammer k-symbol. We obtain the usual Pochhammer symbol (\alpha)_n by taking k = 1. The definition given in (1) is a generalization of \Gamma(x), and the integral form of \Gamma_k is given by
\Gamma_k(x) = \int_0^{\infty} t^{x-1} e^{-\frac{t^k}{k}}\, dt, \quad \operatorname{Re}(x) > 0. \qquad (3)
From (3), we can easily show that
\Gamma_k(x + k) = x \Gamma_k(x). \qquad (4)
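As a quick numerical sanity check on (3) and (4), the sketch below compares the integral form of \Gamma_k with the closed-form identity \Gamma_k(x) = k^{x/k-1}\Gamma(x/k) (which follows from (3) by the substitution u = t^k/k) and verifies the recurrence (4). This is an illustrative check, not part of the paper, and the function names are ours.

```python
import math

def gamma_k(x, k):
    # Closed form Gamma_k(x) = k^(x/k - 1) * Gamma(x/k), equivalent to (1) and (3).
    return k ** (x / k - 1) * math.gamma(x / k)

def gamma_k_integral(x, k, upper=20.0, n=200000):
    # Midpoint-rule evaluation of (3): int_0^inf t^(x-1) exp(-t^k / k) dt,
    # truncated at t = `upper` (the integrand is negligible beyond it here).
    h = upper / n
    return h * sum(
        ((i + 0.5) * h) ** (x - 1) * math.exp(-(((i + 0.5) * h) ** k) / k)
        for i in range(n)
    )

x, k = 2.5, 3.0
print(gamma_k(x, k))                       # closed form
print(gamma_k_integral(x, k))              # integral form, agrees numerically
print(gamma_k(x + k, k) / gamma_k(x, k))   # recurrence (4): equals x = 2.5
```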
The same authors defined the k-beta function as
B_k(x, y) = \frac{\Gamma_k(x) \Gamma_k(y)}{\Gamma_k(x + y)}, \quad x, y > 0, \qquad (5)
and the integral form of B_k(x, y) is
B_k(x, y) = \frac{1}{k} \int_0^1 t^{\frac{x}{k}-1} (1 - t)^{\frac{y}{k}-1}\, dt. \qquad (6)
From the definition of B_k(x, y) given in (5) and (6), we can easily prove that
B_k(x, y) = \frac{1}{k} B\!\left(\frac{x}{k}, \frac{y}{k}\right). \qquad (7)
Also, the researchers in [2–6] have worked on the generalized k-gamma and k-beta functions and discussed the following properties:
Using (5) and (7), we see that, for x, y > 0 and k > 0, the following properties of the k-beta function are valid (see [2, 3] and [7]):
Note that when k = 1, B_k(x, y) = B(x, y).
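Relations (5)-(7) tie B_k to the classical beta and gamma functions, so they can be checked numerically in the same spirit; the helper names below are ours, and the classical beta function is evaluated through math.gamma.

```python
import math

def gamma_k(x, k):
    # Gamma_k(x) = k^(x/k - 1) * Gamma(x/k)
    return k ** (x / k - 1) * math.gamma(x / k)

def beta_k(x, y, k):
    # Relation (7): B_k(x, y) = (1/k) * B(x/k, y/k), with the classical
    # beta function B written through math.gamma.
    return math.gamma(x / k) * math.gamma(y / k) / (k * math.gamma((x + y) / k))

x, y, k = 2.0, 3.5, 2.5
print(beta_k(x, y, k))                                    # via (7)
print(gamma_k(x, k) * gamma_k(y, k) / gamma_k(x + y, k))  # via (5), agrees
print(beta_k(x, y, 1.0))                                  # k = 1: classical B(x, y)
```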
For more details about the theory of k-special functions, such as the k-gamma function, the k-polygamma function, the k-beta function, the k-hypergeometric functions, solutions of k-hypergeometric differential equations, contiguous function relations, inequalities with applications, integral representations with applications involving k-gamma and k-beta functions, k-gamma and k-beta probability distributions, and so forth, see [8–15].
2 Main results: some k-analog properties of the mapping
For the applications of some integral inequalities involving k-gamma and k-beta functions, we have to discuss some k-analog properties regarding these mappings. For this purpose, consider the mapping , defined by
and differentiation of above equation gives
Here, we see that has the solution in the interval . Also, on and on . Thus, we conclude that is the maximum point in the interval and consequently, we have
and
Also, we have
and
Further, we observe that
Now, we have the estimations
and
Again, the second derivative of the said mapping gives
Now, consider the mapping , defined by
Here, we have and . If , then has a solution on the interval and one solution in the interval . Also, the quadratic function has a vertex at . So, the coordinates of the vertex are
and
Consequently, we have
and then we get
If , we have
From (30), if , we get
and if
Remark If k = 1, we have the properties of the mapping given in [16].
3 Chebyshev type inequalities involving k-beta and k-gamma functions
In this section, we prove some inequalities which involve k-gamma and k-beta functions by using some natural inequalities [17]. The following result is well known in the literature as Chebyshev’s integral inequality for synchronous (asynchronous) functions. Here, we use this result to prove some k-analog inequalities.
Lemma 3.1 Let be such that for all and h, , hf, and hg are integrable on I. If f, g are synchronous (asynchronous) on I, i.e.,
then we have the inequality (see [18, 19])
Lemma 3.1 can be proved by using Korkine’s identity [20],
and an inequality generalizing Chebyshev’s inequality is
provided that and , are differentiable and the first derivatives are bounded on I.
Theorem 3.2 For , let and , then we have the following inequality for the k-beta function:
where
Proof Consider the mappings
defined on the interval . Using the generalized version of Lemma 3.1, i.e., (37), along with the mappings defined above, we get
Applying (6), (39) gives
Now, taking into account the fact
for all , we can deduce the desired inequality (38). □
Corollary 3.3 For and , we have the following inequality for the k-beta function:
Proof Just use in Theorem 3.2 to get the required corollary. □
Theorem 3.4 For , , and , we have the following inequality for the k-beta function:
Proof Consider the mappings
defined on the interval , . Now, we have
Using the generalized version of Lemma 3.1, i.e., (37), along with the above results, we get
which becomes equivalent to Theorem 3.4 on applying (6) to both sides of the above inequality. □
Corollary 3.5 If and , then Theorem 3.4 takes the form
and inequality (41) is equivalent to
Proof Taking , in Theorem 3.4, we get
Use of (5) and (7) implies
By (8) and (10), inequality (41) can be obtained and some algebraic calculations give the desired inequality (42). □
4 Some other inequalities for k-beta mappings
In 1935, Grüss established an integral inequality which gives an estimation for the integral of a product in terms of the product of integrals [17]. We use the following lemma [21] to prove our next theorem which is based on the Grüss integral inequality.
Lemma 4.1 If f and g are two functions defined and integrable on , then
Theorem 4.2 Let and , then we have the following inequality for the k-beta mapping:
where
Proof Consider the mappings
defined on the interval . Using Lemma 4.1, along with the mappings defined above, we get
Applying (24) and (26) and using the fact given in Section 2, we get
□
Theorem 4.3 Let and , then the k-beta mapping satisfies the inequality
Proof Consider the mappings
defined on the interval . Here, we observe that
and
Now, by Lemma 4.1, we get the required result. □
The following inequality of Grüss type has been established in [22].
Lemma 4.4 If f and g are two functions defined and integrable on , then
provided
Theorem 4.5 Let , then we have the following inequality for the k-beta mapping:
where , .
Proof Consider the mappings
defined on the interval . Using Lemma 4.4, along with the mappings defined above, we get
provided
Now, using the fact that
and
we have our required result. □
Remarks If we use , inequalities (43) and (44) are the results for the classical beta function proved in [22].
Lemma 4.6 If f and g are two functions defined and integrable on , then we have the inequality
Theorem 4.7 If , the following inequalities for the k-beta mapping hold good:
and
Proof Consider the mappings
defined on the interval . Here, we note that
Using Lemma 4.4 for the above results, we get
provided
By (6) and the fact , we have the inequality (45). Also, and . Thus, using Lemma 4.6 we have the inequality (46). □
5 Main results: via Ostrowski’s inequality
In this section, we use the integral inequality known in the literature as Ostrowski's inequality [23]. The following lemma, concerning Ostrowski's inequality for absolutely continuous mappings whose derivatives belong to the L_p spaces, holds [24, 25]. Here, we give some lemmas which are helpful for the results involving the k-beta mapping.
Lemma 5.1 Let be an absolutely continuous mapping for which , . Then
for all , where
and the best inequality for (47) is embodied in the form
For the application of the above inequalities to some numerical quadrature rules, we have the following lemma.
Lemma 5.2 Let be an absolutely continuous mapping for which , . Then for any partition of and any intermediate point vector satisfying (), we have
Here denotes the quadrature rule of the Riemann type defined by
and the remainder satisfies the estimate
where (). Lemmas 5.1 and 5.2 are proved in [19], and the best quadrature formula that can be obtained from the above result is the one for which , , given in the following corollary.
Corollary 5.3 Let f and be as in Lemma 5.2, then
where denotes the midpoint quadrature rule, i.e.,
and the remainder satisfies the estimation
We are now able to apply the above results for Euler’s k-beta mapping.
Theorem 5.4 Let , , and . Then we have the inequality for the k-beta function as
provided that .
Proof Consider the mapping , . From Lemma 5.1 along with this mapping, we get
where and . Now, taking the derivatives of the above mapping, we have
If , and if , then , which shows that at , we have a maximum for and
Consequently, we have
for all . Thus
Using (48) and (49), we get the desired Theorem 5.4. □
Now, we have the result concerning the approximation of the k-beta function in terms of the Riemann sums.
Theorem 5.5 Let , and . If is a division of , , a sequence of intermediate points for , then we have the formula for the k-beta function:
where the remainder satisfies the estimate
where () and .
Proof Taking , , along with Lemma 5.2 we get Theorem 5.5. The proof of Lemma 5.2 is available in [11], so details are omitted. □
6 Inequalities in probability theory and applications for k-beta function
Here, we give some applications of the Ostrowski type inequality for the k-beta function and cumulative distribution functions. For this purpose, we need some basic concepts of random variables, distribution functions, probability density functions, and expected values.
A process which generates raw data is called an experiment, and an experiment which gives different results under similar conditions, even when repeated a large number of times, is termed a random experiment. A variable whose values are determined by the outcomes of a random experiment is called a random variable, or simply a variate. Random variables are usually denoted by capital letters X, Y, and Z, while the values they assume are denoted by the corresponding small letters x, y, and z. Random variables are classified into two classes, namely discrete and continuous random variables.
A random variable that can assume only a finite or countably infinite number of values is known as a discrete random variable, while a variable which can assume each and every value within some interval is called a continuous random variable. The distribution function of a random variable X, denoted by F(x), is defined by F(x) = P(X \le x), i.e., the distribution function gives the probability of the event that X takes a value less than or equal to a specified value x.
A random variable X may also be defined as continuous if its distribution function F(x) is continuous and differentiable everywhere except at isolated points in the given range. Let the derivative of F(x) be denoted by f(x), i.e., f(x) = dF(x)/dx. Since F(x) is a non-decreasing function of x, f(x) \ge 0.
Here, the function f(x) is called the probability density function, or simply the density function, of the random variable X. A probability density function has the properties that f(x) \ge 0 for all x and that its integral over the whole range is unity, and the probability that the random variable X takes on a value in a given interval is the integral of f(x) over that interval, i.e., the area under the curve between the endpoints of the interval.
A moment designates the power to which the deviations are raised before averaging them. In statistics, we have three kinds of moments:
(i) Moments about any value A: the rth power of the deviation of the variable from A, averaged over the distribution, is called the rth moment of the distribution about A.
(ii) Moments about zero: the rth power of the deviation of the variable from 0, averaged over the distribution, is called the rth moment of the distribution about 0.
(iii) Moments about the mean: the rth power of the deviation of the variable from the mean, averaged over the distribution, is called the rth moment of the distribution about the mean.
If a random variable X assumes all the values from a to b, then for a continuous distribution, the rth moments about an arbitrary number A and about 0 are given by \int_a^b (x - A)^r f(x)\, dx and \int_a^b x^r f(x)\, dx, respectively (see [26–28]).
Definition 6.1 In a random experiment with n outcomes, suppose a variable X assumes the values x_1, x_2, \ldots, x_n with corresponding probabilities p_1, p_2, \ldots, p_n; then the pairing (x_i, p_i), i = 1, 2, \ldots, n, is called a probability distribution, and \sum_{i=1}^{n} p_i = 1 (in the case of discrete distributions). Also, if f(x) is a continuous probability density function defined on an interval [a, b], then \int_a^b f(x)\, dx = 1. The expected value of the variate is defined as the first moment of the probability distribution about zero, i.e.,
E(X) = \int_a^b x f(x)\, dx.
Definition 6.2 Let X be a continuous random variable, then it is said to have a beta k-distribution of the first kind with two parameters m and n, if its probability k-density function is defined by [8]
In the above distribution, the k-beta variable of the first kind is referred to as and its k-distribution function is given by
Remarks We can call the above function an incomplete k-beta function because, if k = 1, it is the incomplete beta function, as tabulated in [29, 30].
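Since the displayed formulas for the density and the k-distribution function are not reproduced here, the sketch below assumes the first-kind k-beta density from [8], f(x) = x^{m/k-1}(1-x)^{n/k-1} / (k B_k(m, n)) for 0 < x < 1, and evaluates the corresponding k-distribution function (the "incomplete k-beta function" of the remark) by a midpoint rule; the exact parameterization should be treated as an assumption.

```python
import math

def beta_k(x, y, k):
    # B_k(x, y) = (1/k) * B(x/k, y/k)
    return math.gamma(x / k) * math.gamma(y / k) / (k * math.gamma((x + y) / k))

def beta_k_cdf(x, m, n, k, steps=200000):
    # Midpoint-rule evaluation of the assumed k-distribution function
    #   F_k(x) = (1 / (k * B_k(m, n))) * int_0^x t^(m/k - 1) (1 - t)^(n/k - 1) dt,
    # normalized so that F_k(1) = 1.
    h = x / steps
    s = sum(
        ((i + 0.5) * h) ** (m / k - 1) * (1 - (i + 0.5) * h) ** (n / k - 1)
        for i in range(steps)
    )
    return h * s / (k * beta_k(m, n, k))

m, n, k = 4.0, 6.0, 2.0
print(beta_k_cdf(1.0, m, n, k))   # total probability, ~1.0
print(beta_k_cdf(0.5, m, n, k))   # P(X <= 0.5), ~0.6875 for these parameters
```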
Proposition 6.3 For the parameters , the expected value of the k-beta random variable is given by
Proof For the k-beta random variable defined above, we observe that
Using (5), (6), and (8), we have
□
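Under the same assumed density x^{m/k-1}(1-x)^{n/k-1} / (k B_k(m, n)), the expected value of Proposition 6.3 can be checked numerically: both the direct first moment and the ratio B_k(m + k, n) / B_k(m, n) reduce to m/(m + n). The moment helper below is ours, not the paper's notation.

```python
import math

def beta_k(x, y, k):
    # B_k(x, y) = (1/k) * B(x/k, y/k)
    return math.gamma(x / k) * math.gamma(y / k) / (k * math.gamma((x + y) / k))

def beta_k_moment(r, m, n, k, steps=200000):
    # Midpoint-rule r-th moment about zero of the assumed k-beta(m, n) density:
    #   E[X^r] = (1 / (k * B_k(m, n))) * int_0^1 t^(r + m/k - 1) (1 - t)^(n/k - 1) dt.
    h = 1.0 / steps
    s = sum(
        ((i + 0.5) * h) ** (r + m / k - 1) * (1 - (i + 0.5) * h) ** (n / k - 1)
        for i in range(steps)
    )
    return h * s / (k * beta_k(m, n, k))

m, n, k = 3.0, 5.0, 2.0
print(beta_k_moment(1, m, n, k))              # E[X] numerically, ~0.375
print(beta_k(m + k, n, k) / beta_k(m, n, k))  # B_k(m+k, n)/B_k(m, n) = m/(m+n) = 0.375
```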
Lemma 6.4 Let X be a random variable taking values in the finite interval , with the cumulative distribution function . Then the following inequalities of Ostrowski type hold:
for all . All the inequalities are sharp and the constant is the best possible. Moreover, by the integration by parts formula for the Riemann-Stieltjes integral, we have
and
The proof of Lemma 6.4 is given in [19, 31]. Now, we are able to give some applications for the k-beta random variable.
Theorem 6.5 Let X be a k-beta random variable with parameters , then we have the following inequalities:
and
for all and, in particular,
and
Proof Using Lemma 6.4 along with the k-beta random variable and defined in (50) and (51) and Proposition 6.3 for the expected values, we get
and, by (52), we have
for all . In particular, for the intermediate point of the interval , i.e., at , we have the remaining results of Theorem 6.5. □
Lemma 6.6 Let X be a random variable with probability density function and with cumulative distribution function . If , , then the following inequality holds:
for all , where .
Now, we give an application of the k-beta random variable X in terms of the parameter . A k-beta random variable X with positive parameters p, q, and k has the probability density function
where is the k-beta function. Here, we observe that
Thus, we have
provided
Theorem 6.7 Let X be a k-beta random variable with parameters , and . Then we have the following inequalities:
for all and, in particular
Proof Using Lemma 6.6 along with the k-beta random variable and defined in (50), (51), and (53) for the expected values, we get the required Theorem 6.7. □
References
Diaz R, Pariguan E: On hypergeometric functions and k -Pochhammer symbol. Divulg. Mat. 2007,15(2):179–192.
Kokologiannaki CG: Properties and inequalities of generalized k -gamma, beta and zeta functions. Int. J. Contemp. Math. Sci. 2010,5(14):653–660.
Kokologiannaki CG, Krasniqi V: Some properties of k -gamma function. Matematiche 2013, LXVIII: 13–22.
Krasniqi V: A limit for the k -gamma and k -beta function. Int. Math. Forum 2010,5(33):1613–1617.
Mansoor M: Determining the k -generalized gamma function by functional equations. Int. J. Contemp. Math. Sci. 2009,4(21):1037–1042.
Mubeen S, Habibullah GM: An integral representation of some k -hypergeometric functions. Int. Math. Forum 2012,7(4):203–207.
Mubeen S, Habibullah GM: k -Fractional integrals and applications. Int. J. Contemp. Math. Sci. 2012,7(2):89–94.
Rehman G, Mubeen S, Rehman A, Naz M: On k -gamma, k -beta distributions and moment generating functions. J. Probab. Stat. 2014., 2014: Article ID 982013
Zhang J, Shi HN: Two double inequalities for k -gamma and k -Riemann zeta functions. J. Inequal. Appl. 2014., 2014: Article ID 191
Mubeen S, Naz M, Rahman G: A note on k -hypergeometric differential equations. J. Inequal. Spec. Funct. 2013,4(3):38–43.
Mubeen S, Rahman G, Rehman A, Naz M: Contiguous function relations for k -hypergeometric functions. ISRN Math. Anal. 2014., 2014: Article ID 410801
Mubeen S, Naz M, Rehman A, Rahman G: Solutions of k -hypergeometric differential equations. J. Appl. Math. 2014., 2014: Article ID 128787
Mubeen S, Rehman A, Shaheen F: Properties of k -gamma, k -beta and k -psi functions. Bothalia 2014, 44: 371–379.
Rehman A, Mubeen S, Sadiq N, Shaheen F: Some inequalities involving k -gamma and k -beta functions with applications. J. Inequal. Appl. 2014., 2014: Article ID 224
Krasniqi V: Inequalities and monotonicity for the ratio of k -gamma functions. Sci. Magna 2010,6(1):40–45.
Dragomir, SS, Kumar, P, Singh, SP: Mathematical Inequalities with Applications to the Beta and Gamma Mappings-I. Survey paper (1999)
Mitrinovic DS, Pecaric JE, Fink AM: Classical and New Inequalities in Analysis. Kluwer Academic, Dordrecht; 1993.
Kumar P, Singh SP, Dragomir SS: Some inequalities involving beta and gamma functions. Nonlinear Anal. Forum 2001,6(1):143–150.
Dragomir SS, Agarwal RP, Barnett NS: Inequalities for beta and gamma functions via some classical and new integral inequalities. J. Inequal. Appl. 2000, 5: 103–165.
Dragomir SS, Wang S: Applications of Ostrowski’s inequality for the estimation of error bounds for some special means and for some numerical quadrature rules. Appl. Math. Lett. 1998,11(1):105–109. 10.1016/S0893-9659(97)00142-0
Dragomir, NM, Pranesh, K, Dragomir, SS: On some inequalities for Euler’s gamma functions and their applications to gamma probability distribution. Survey paper (1999)
Dragomir SS: Some integral inequalities of Grüss type. Indian J. Pure Appl. Math. 2000,31(4):397–415.
Mitrinovic DS, Pecaric JE, Fink AM: Inequalities for Functions and Their Integrals and Derivatives. Kluwer Academic, Dordrecht; 1994.
Dragomir SS, Wang S: A new inequality of Ostrowski's type in L_1 norm. Indian J. Math. 1998,40(3):299–304.
Dragomir, SS: On the Ostrowski’s integral inequality to Lipschitz mappings and applications. Survey paper (1999)
Larsen RJ, Marx ML: An Introduction to Mathematical Statistics and Its Applications. 5th edition. Prentice-Hall International, Englewood Cliffs; 2011.
Walck, C: Hand-book on Statistical Distributions for Experimentalists. Last modification 10 September (2007)
Hasting NAJ, Peacock JB: Statistical Distributions. Butterworth, Stoneham; 1975.
Spanier, J, Oldham, KB: 'The Gamma Function' and 'The Incomplete Gamma Function and Related Functions'. In: An Atlas of Functions. Hemisphere, Washington, DC (1987)
Pearson K: Tables of the Incomplete Beta-Function. Cambridge University Press, Cambridge; 1934. 2nd edition (1968)
Barnett NS, Dragomir SS: An inequality of Ostrowski’s type for cumulative distribution functions. Kyungpook Math. J. 1999,39(2):303–311.
Acknowledgements
The authors are grateful to the editor for suggestions, which improved the contents of the article.
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
The authors AR and SM contributed equally to the writing of this paper and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0), which permits use, duplication, adaptation, distribution, and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Cite this article
Rehman, A., Mubeen, S. Some inequalities involving k-gamma and k-beta functions with applications - II. J Inequal Appl 2014, 445 (2014). https://doi.org/10.1186/1029-242X-2014-445