States of a Markov chain and stationary distribution
Let $X$ be a Markov chain with state space $S=\{0,1,2,\dots\}$ and transition matrix $P$ given by $p_{i,0}=\frac{i}{i+1}$ and $p_{i,i+1}=\frac{1}{i+1}$, for $i=0,1,2,\dots$. Determine which states are transient, null and non-null. Find the stationary distribution.
I have that $p_{0,0}=0,\ p_{1,0}=\frac{1}{2},\ p_{2,0}=\frac{2}{3},\ p_{3,0}=\frac{3}{4}$ and $p_{0,1}=1,\ p_{1,2}=\frac{1}{2},\ p_{2,3}=\frac{1}{3},\ p_{3,4}=\frac{1}{4}$.
I think that all states are persistent, so no states are transient.
There is a theorem which says that a state is null $\iff \lim_{n \to \infty} p_{ii}(n)=0$, so state $0$ is null. I guess all the other states are non-null, but I don't know how to prove it.
Also, I don't know how to find the stationary distribution.
Please help me with the above exercise: correct me where I'm wrong and offer tips on the rest. Anything will be much appreciated.
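For intuition, a small simulation can estimate the mean return time to state $0$ (this is only a numerical experiment, not a proof; the helper name, seed and sample size below are arbitrary choices):

```python
import random

def return_time_to_zero(rng: random.Random) -> int:
    """Run the chain from state 0 until it first returns to 0;
    return the number of steps taken."""
    state, steps = 0, 0
    while True:
        steps += 1
        # From state i: jump back to 0 with probability i/(i+1),
        # otherwise move up to state i+1 (probability 1/(i+1)).
        if rng.random() < state / (state + 1):
            return steps
        state += 1

rng = random.Random(0)   # fixed seed, arbitrary choice
n_samples = 100_000      # arbitrary sample size
estimate = sum(return_time_to_zero(rng) for _ in range(n_samples)) / n_samples
print(estimate)          # a finite value near 2.72, consistent with state 0 being non-null
```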
probability-theory stochastic-processes markov-chains
"so state 0 is null" How do you know? Did you prove that $limlimits_{n to infty} p_{00}(n)=0$? How?
– Did
Jan 6 at 12:46
I just assumed that if $p_{0,0}=0$, then $\lim_{n \to \infty}p_{0,0}(n)=0$. I know it might be naive, but I'm not sure how to approach this problem.
– MacAbra
Jan 6 at 13:10
Not especially naive, but squarely wrong because obviously absurd, right? Say, do you have any serious attempt to present?
– Did
Jan 6 at 13:21
1 Answer
You need to realise that since $p_{0,1} = 1$ and $p_{i,0} + p_{i,i+1} = 1$, we must have $p_{0,j} = 0$ for all $j \ne 1$ and $p_{i,k} = 0$ for all $k \notin \{0,\, i+1\}$.
Let $\pi_j = \frac{e^{-1}}{j!}$ for all $j \ge 0$. Then $(\pi_j)_{j \ge 0}$ is the stationary distribution, because
\begin{eqnarray}
\sum_{i=0}^\infty \pi_i &=& 1\,,\\
\sum_{i=0}^\infty \pi_i\, p_{i,j} &=& \frac{1}{j}\,\pi_{j-1} \;=\; \pi_j \quad\mbox{for $j \ge 1$, and}\\
\sum_{i=0}^\infty \pi_i\, p_{i,0}
&=& \sum_{i=0}^\infty \frac{i}{i+1}\,\pi_i \\
&=& \sum_{i=0}^\infty \left(1 - \frac{1}{i+1}\right)\pi_i\\
&=& e^{-1}\sum_{i=0}^\infty \left(\frac{1}{i!} - \frac{1}{(i+1)!}\right)\\
&=& e^{-1} \;=\; \pi_0\,,
\end{eqnarray}
where the last sum telescopes to $\frac{1}{0!}=1$.
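These balance equations can also be checked numerically on a truncated version of $P$ (a minimal sketch; the truncation level `N` and the tolerance are arbitrary choices, and the candidate $\pi$ is built recursively as $\pi_j = \pi_{j-1}/j$ to avoid overflowing the factorial):

```python
import math
import numpy as np

N = 200  # arbitrary truncation level; the mass of pi beyond N is negligible

# Truncated transition matrix: p[i, 0] = i/(i+1), p[i, i+1] = 1/(i+1), all else 0.
P = np.zeros((N, N))
for i in range(N):
    P[i, 0] = i / (i + 1)
    if i + 1 < N:
        P[i, i + 1] = 1 / (i + 1)

# Candidate stationary distribution pi_j = e^{-1}/j!, built via pi_j = pi_{j-1}/j.
pi = np.empty(N)
pi[0] = math.exp(-1)
for j in range(1, N):
    pi[j] = pi[j - 1] / j

print(np.allclose(pi @ P, pi, atol=1e-12))  # True (up to truncation error)
print(pi.sum())                             # approximately 1
```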
Consequently, since the chain is irreducible (state $0$ leads to every state via $0 \to 1 \to \cdots$, and every state leads back to $0$ in one step) and admits a stationary distribution, all states are positive recurrent; none are transient or null recurrent.
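For comparison (a complementary check, not the argument used above), the positive recurrence of state $0$ can also be seen from its mean return time: the only way back to $0$ is a path $0 \to 1 \to \cdots \to n \to 0$ of length $n+1$, so
\begin{eqnarray}
\mathbb{P}_0(T_0 = n+1) &=& \left(\prod_{k=1}^{n-1}\frac{1}{k+1}\right)\frac{n}{n+1} \;=\; \frac{n}{(n+1)!}\,,\\
\mathbb{E}_0[T_0] &=& \sum_{n=1}^{\infty}(n+1)\,\frac{n}{(n+1)!} \;=\; \sum_{n=1}^{\infty}\frac{1}{(n-1)!} \;=\; e \;=\; \frac{1}{\pi_0}\,,
\end{eqnarray}
which is finite, in agreement with the standard identity $\mathbb{E}_0[T_0] = 1/\pi_0$ for positive recurrent states.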