Gamma distribution from a sum of exponential random variables












I have a sequence $T_1,T_2,\ldots$ of independent exponential random variables with parameter $\lambda$. I take the sum $S=\sum_{i=1}^n T_i$ and now I would like to calculate its probability density function.



Well, I know that $P(T_i>t)=e^{-\lambda t}$ and therefore $f_{T_i}(t)=\lambda e^{-\lambda t}$, so I need to find $P(T_1+\cdots+T_n>t)$ and take the derivative. But I cannot expand the probability term; do you have any ideas?










Tags: probability, probability-theory






asked Jan 29 '14 at 0:13 by TI Jones; edited Jan 29 '14 at 0:28 by Michael Hardy












  • It's called the Erlang distribution.
    – Alex, Jan 29 '14 at 0:14










  • $P(T_1+\cdots+T_n>t)$ is $1-F_S(t)$, i.e. it relates to the cumulative distribution function, not to the density. Different degrees of computational difficulty.
    – Alecos Papadopoulos, Jan 29 '14 at 0:26




























1 Answer



















The usual way to do this is to consider the moment generating function, noting that if $S = \sum_{i=1}^n X_i$ is the sum of IID random variables $X_i$, each with MGF $M_X(t)$, then the MGF of $S$ is $M_S(t) = (M_X(t))^n$. Applied to the exponential distribution, we can get the gamma distribution as a result.
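For completeness, the MGF step alluded to above can be spelled out as follows (a short sketch filling in the computation the answer leaves implicit): for an exponential variable with rate $\lambda$, $$M_{T_i}(t) = \int_0^\infty e^{tx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - t}, \qquad t < \lambda,$$ so $$M_S(t) = \left(\frac{\lambda}{\lambda - t}\right)^{n},$$ which is exactly the MGF of a ${\rm Gamma}(n,\lambda)$ distribution; by uniqueness of MGFs, $S \sim {\rm Gamma}(n,\lambda)$, with density $f_S(t) = \frac{\lambda^n t^{n-1} e^{-\lambda t}}{(n-1)!}$ for $t>0$.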



If you don't go the MGF route, then you can prove it by induction, using the simple case of the sum of a gamma random variable and an exponential random variable with the same rate parameter. Let's actually do this. Suppose $Y \sim {\rm Gamma}(a,b)$ and $X \sim {\rm Exponential}(b)$ are independent, so that $$f_Y(y) = \frac{b^a y^{a-1} e^{-by}}{\Gamma(a)} \mathbb 1(y > 0), \quad f_X(x) = be^{-bx} \mathbb 1(x > 0), \quad a, b > 0.$$ Then, we notice that if $a = 1$, $Y$ would also be exponential (i.e., the exponential distribution is a special case of the Gamma with $a = 1$). Now consider $Z = X+Y$. The PDF is $$\begin{align*} f_Z(z) &= \int_{y=0}^z f_Y(y) f_X(z-y) \, dy \\ &= \int_{y=0}^z \frac{b^{a+1} y^{a-1} e^{-by} e^{-b(z-y)}}{\Gamma(a)} \, dy \\ &= \frac{b^{a+1} e^{-bz}}{\Gamma(a)} \int_{y=0}^z y^{a-1} \, dy \\ &= \frac{b^{a+1} e^{-bz}}{\Gamma(a)} \cdot \frac{z^a}{a} = \frac{b^{a+1} z^a e^{-bz}}{\Gamma(a+1)}. \end{align*}$$ But this is just a gamma PDF with new shape parameter $a^* = a+1$. So, it is easy to see by induction that the sum of $n$ IID exponential variables with common rate parameter $\lambda$ is gamma with shape parameter $a = n$ and rate parameter $b = \lambda$.
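As a quick numerical sanity check of this result (a sketch, not part of the original answer; it assumes NumPy and SciPy are available, and the parameter values are arbitrary illustrations), one can simulate sums of exponentials and compare them with the claimed gamma distribution:

    import numpy as np
    from scipy import stats

    # Illustrative parameters (arbitrary choices): n summands, rate lambda
    n, lam, n_samples = 5, 2.0, 100_000
    rng = np.random.default_rng(0)

    # Draw S = T_1 + ... + T_n for iid Exponential(rate=lam) variables;
    # NumPy's exponential() is parametrized by the scale 1/lambda.
    samples = rng.exponential(scale=1.0 / lam, size=(n_samples, n)).sum(axis=1)

    # Compare against Gamma(shape=n, rate=lam); SciPy's gamma uses scale = 1/rate.
    ks_stat, p_value = stats.kstest(samples, stats.gamma(a=n, scale=1.0 / lam).cdf)
    print(f"KS statistic: {ks_stat:.4f}, p-value: {p_value:.3f}")

A large p-value here is consistent with (though of course does not prove) the gamma result.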






answered Jan 29 '14 at 0:29 by heropup; edited Apr 26 '16 at 6:20
  • Thanks for your answer. Regarding your first part concerning the MGF, I have some difficulties with this: the MGF of each $T_i$ is $M_T(x)=(1-\frac{x}{\lambda})^{-1}$, therefore $M_S(x)=(1-\frac{x}{\lambda})^{-n}$, but how do I derive $f_S(x)$ from this?
    – TI Jones, Jan 29 '14 at 21:01










  • Take the PDF of a gamma distribution, and calculate its MGF. When you see it has the same form, it follows from the uniqueness of MGFs that the sum of exponential RVs is therefore gamma distributed. (A symbolic sketch of this check appears after the comments.)
    – heropup, Jan 29 '14 at 21:10










  • @heropup Could you please let me know how to find the PDF of the sum of $n$ (independent non-identically distributed) Exponential random variables?
    – sky-light, Sep 7 '16 at 21:49
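Following up on heropup's suggestion in the comment above, here is a small symbolic sketch (an illustration added here, not part of the thread; it assumes SymPy is available and fixes a concrete shape $n=3$) that computes the MGF of a gamma density and compares it with the $n$-th power of the exponential MGF:

    import sympy as sp

    # x: integration variable; lam: rate parameter; s = lam - t > 0 guarantees
    # the MGF integral converges (i.e. the MGF is evaluated at t < lam).
    x = sp.Symbol('x', positive=True)
    lam = sp.Symbol('lambda', positive=True)
    s = sp.Symbol('s', positive=True)
    n = 3  # concrete shape parameter, chosen only for illustration

    # PDF of Gamma(shape=n, rate=lam) on (0, oo)
    gamma_pdf = lam**n * x**(n - 1) * sp.exp(-lam * x) / sp.gamma(n)

    # MGF of the gamma distribution, E[exp(t*X)] evaluated at t = lam - s
    mgf_gamma = sp.integrate(sp.exp((lam - s) * x) * gamma_pdf, (x, 0, sp.oo))

    # MGF of one Exponential(rate=lam) at t = lam - s, raised to the n-th power
    mgf_sum_of_exponentials = (lam / s)**n

    print(sp.simplify(mgf_gamma - mgf_sum_of_exponentials))  # expect 0

The printed difference being $0$ is exactly the MGF identity heropup describes, checked for this particular $n$.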










