Every finite state Markov chain has a stationary probability distribution
I am trying to understand the following proof that every finite-state Markov chain has a stationary distribution. The proof is from here.

Let $P$ be the $k \times k$ (stochastic) transition probability matrix for our Markov chain. Now,

> ... $1$ is an eigenvalue for $P$ and therefore also for $P^t$. Writing a $P^t$-invariant $v$ as $v = v^+ - v^-$ with $v^+, v^- \in (\mathbb{R}_+)^k$, we obtain $P^t v^\pm = v^\pm$ because $P^t$ preserves the positive cone; if $v^+ \neq 0$, take $\nu = (\sum v^+_i)^{-1} \cdot v^+$, otherwise normalize $v^-$.

The main thing I don't understand is

> we obtain $P^t v^\pm = v^\pm$ because $P^t$ preserves the positive cone

Why is this true?

I also don't understand why $\nu = (\sum v^+_i)^{-1} \cdot v^+$ works if $v^+ \neq 0$.

Is there any easier way to show that every finite-state Markov chain has a stationary probability distribution?
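For concreteness, the two claims being asked about can at least be checked numerically on a small example (the 2-state matrix below is hypothetical, not from the article):

```python
# Why P^t preserves the positive cone: (P^t x)_j = sum_i P_ij * x_i is a sum
# of products of nonnegative numbers whenever x >= 0 entrywise. Because each
# row of P sums to 1, P^t also preserves the total sum of the entries, which
# is why dividing by sum_i v^+_i turns v^+ into a probability vector.
P = [[0.9, 0.1],
     [0.5, 0.5]]  # hypothetical row-stochastic matrix (rows sum to 1)

def apply_Pt(x, P):
    """Compute P^t x for a column vector x, using the rows of P."""
    k = len(P)
    return [sum(P[i][j] * x[i] for i in range(k)) for j in range(k)]

x = [2.0, 0.5]                 # a vector in the positive cone
y = apply_Pt(x, P)
assert all(c >= 0 for c in y)           # the image stays in the cone
assert abs(sum(y) - sum(x)) < 1e-12     # the entry sum is preserved
```

This is only a sanity check on one matrix, of course, not a proof, but both assertions reflect the general argument: nonnegativity of the entries of $P$ gives the cone preservation, and row-stochasticity gives the sum preservation that makes the normalization work.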
probability proof-verification markov-chains stochastic-matrices
For your second question, what must a state distribution vector look like, and what can you say about the elements of the scaled version of $v^+$?
– amd, Dec 2 '18 at 0:23

They add up to 1; that makes sense. I get that part now.
– jackson5, Dec 2 '18 at 0:25
asked Dec 1 '18 at 22:23 by jackson5
2 Answers
The wording in this article is a little ambiguous. I thought of two interpretations, the first of which is incorrect. The second is correct, but it doesn't explain the bit about "preservation of the positive cone".

It looks like it may be a case of mistakenly assuming that a mapping fixes a subset when it only preserves it (i.e., $f|_X = \text{id}_X$ vs. $\text{im}(f) \subset X$).

Interpretation 1. Maybe the statement below is being claimed:

(*) Let $C$ be the positive cone. If $P^t v = v$, then for all $v^+, v^- \in C$ such that $v = v^+ - v^-$ we have $P^t v^+ = v^+$.

This is false unless $P$ is the identity matrix. Let $x \in C$; according to (*), $\bar v^\pm = v^\pm + x$ must satisfy $P^t \bar v^+ = \bar v^+$ as well. But then linearity gives $P^t x = x$ too, so $P^t$ fixes the positive cone pointwise. This holds only if $P$ is the identity matrix, because the span of the positive cone is the whole space.

Interpretation 2. Maybe instead it means to set $v^+$ to be the vector of positive entries of $v$, with $0$s in place of the negatives. E.g., if $v = (1,0,2,-7)$ then $v^+ = (1,0,2,0)$ and $v^- = (0,0,0,7)$. Then the claim would be:

If $P^t v = v$, then $P^t v^+ = v^+$, where $v^+$ is the vector of positive entries described above.

This is true, but I don't think the cited article offers any explanation as to why, and I don't know how to prove it without using Perron-Frobenius, which is arguably a harder theorem than the one we are trying to prove.

It is a trivial consequence of Perron-Frobenius in the case of an irreducible stochastic matrix, because then either $v = v^+$ or $v = v^-$: there is a stationary state $v$ (by P-F), and the eigenspace for $\lambda = 1$ is simple (also P-F), so any invariant vector is a scalar multiple of it and inherits this property.

For reducible matrices the eigenspace for $\lambda = 1$ is no longer simple, so we can form $v = v_1 - v_2$, where $v_i$ is the stationary state for the $i$th block; then $v^+ = v_1$ and $v^- = v_2$. Following the suggestion in the article, one would then find a stationary distribution by normalizing just the positive part $v^+ = v_1$.
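To make the reducible case concrete, here is a small numerical sketch; the two-block chain and its block-stationary vectors are made up for illustration:

```python
# A reducible 4-state chain consisting of two independent blocks. The
# difference v = v1 - v2 of the two block-stationary vectors is
# P^t-invariant but signed; its positive part v^+ = v1 is invariant on its
# own and normalizes to a stationary distribution.
P = [[0.9, 0.1, 0.0, 0.0],
     [0.5, 0.5, 0.0, 0.0],
     [0.0, 0.0, 0.5, 0.5],
     [0.0, 0.0, 0.5, 0.5]]

def apply_Pt(x, P):
    """Compute P^t x for a column vector x, using the rows of P."""
    k = len(P)
    return [sum(P[i][j] * x[i] for i in range(k)) for j in range(k)]

v1 = [5/6, 1/6, 0.0, 0.0]   # stationary vector of the first block
v2 = [0.0, 0.0, 0.5, 0.5]   # stationary vector of the second block
v = [a - b for a, b in zip(v1, v2)]
v_plus = [max(c, 0.0) for c in v]    # positive part, equals v1
v_minus = [max(-c, 0.0) for c in v]  # negative part, equals v2

# v, v^+ and v^- are all invariant under P^t.
for w in (v, v_plus, v_minus):
    assert all(abs(a - b) < 1e-12 for a, b in zip(apply_Pt(w, P), w))

# Normalizing the positive part alone yields a stationary distribution.
pi = [c / sum(v_plus) for c in v_plus]
assert abs(sum(pi) - 1.0) < 1e-12
```

Here the invariance of $v^+$ and $v^-$ separately is an artifact of the block structure, which matches the discussion above: it is exactly the reducible case in which $v$ itself can fail to be a (signed multiple of a) distribution.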
edited Dec 7 '18 at 14:48; answered Dec 2 '18 at 13:37 by Ben
To your last question, as to whether there is a simpler way of proving the existence of a stationary distribution for finite-state Markov chains: that depends on what tools you have at your disposal. Here is a nice and short consequence of a fixed point theorem.

Consider a Markov chain $\mathbf{P}$ over $d$ states. The simplex $\Delta_d$ is a convex and compact subset of $\mathbb{R}^d$, which is a Euclidean vector space with the usual inner product. We can view the transition kernel as the following linear operator:
\begin{equation}
\begin{split}
\mathbf{P} : \Delta_d &\to \Delta_d \\
\mu &\mapsto \mu \mathbf{P}
\end{split}
\end{equation}
Since $\|\mathbf{P}\|_2 \leq \sqrt{d} < \infty$, the operator is bounded and therefore continuous. As a consequence, Brouwer's fixed point theorem gives some $\pi \in \Delta_d$ with $\pi \mathbf{P} = \pi$.
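Brouwer's theorem is non-constructive, but for an aperiodic chain the fixed point can be reached by simply iterating $\mu \mapsto \mu\mathbf{P}$, which stays inside the simplex at every step. A minimal sketch, with a made-up 2-state chain:

```python
# Power iteration toward the Brouwer fixed point: each step mu -> mu P maps
# the simplex into itself, and for an aperiodic chain the iterates converge
# to a stationary distribution pi with pi P = pi.
P = [[0.9, 0.1],
     [0.5, 0.5]]  # hypothetical 2-state chain (rows sum to 1)

def step(mu, P):
    """One application of the row-vector map mu -> mu P."""
    k = len(P)
    return [sum(mu[i] * P[i][j] for i in range(k)) for j in range(k)]

mu = [1.0, 0.0]          # start at a vertex of the simplex
for _ in range(200):
    mu = step(mu, P)

# mu is (numerically) a fixed point and still a probability distribution.
assert all(abs(a - b) < 1e-9 for a, b in zip(step(mu, P), mu))
assert abs(sum(mu) - 1.0) < 1e-9
```

Convergence at this rate is an extra assumption (it needs the subdominant eigenvalue of $\mathbf{P}$ to have modulus strictly below 1, which holds for aperiodic irreducible chains); the existence statement itself needs only Brouwer, as above.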
answered Mar 16 at 8:46 by ippiki-ookami