There exists a measure-preserving transformation with any given (nonnegative) entropy


Let $(X,\mathscr{B},\mu,T)$ be a measure-preserving system and let $\xi$ be a partition of $X$ with finite entropy. Then the entropy of $T$ with respect to $\xi$ is
$$h_\mu(T,\xi)=\lim_{n\to \infty}\frac{1}{n}H_\mu\left(\bigvee_{i=0}^{n-1}T^{-i}\xi\right)=\inf_{n\ge 1}\frac{1}{n}H_\mu\left(\bigvee_{i=0}^{n-1}T^{-i}\xi\right).$$
The entropy of $T$ is
$$h_\mu(T)=\sup_{\xi\,:\,H_\mu(\xi)<\infty}h_\mu(T,\xi).$$



Let $h$ be a nonnegative real number. Does there always exist a measure-preserving transformation whose entropy is exactly $h$?










  • Hint (and it's hard to think of a hint that's not a complete giveaway): look at the definition of $H_\mu$. – kimchi lover, Mar 9 at 17:14










  • @kimchilover $H_\mu$ is defined on partitions, so it is not obvious to me how looking at its definition gives me an approach. Note that I hope this proposition is true for a general measure space. – No One, Mar 9 at 20:32
measure-theory information-theory
edited Mar 9 at 17:07
asked Mar 9 at 16:31 by No One
1 Answer
For each $h>0$ there is a Bernoulli shift $B(h)$ with entropy $h$. (Details and further information are in the Wikipedia article. The hard mathematical part here is finding a $k$ such that $h<\log k$ and then finding probabilities $p_1,\ldots,p_k$ such that $h=-\sum p_i\log p_i$.)
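This step can be sketched numerically (a Python illustration of mine, not part of the original answer; the one-parameter family of distributions and the bisection solver are my own choices): pick $k$ with $h<\log k$, then slide along the family $p_1=1-(k-1)t,\ p_2=\cdots=p_k=t$ for $t\in(0,1/k]$, whose Shannon entropy increases continuously from $0$ to $\log k$, so bisection locates $t$ with $-\sum p_i\log p_i=h$.

```python
import math

def shannon_entropy(p):
    """Shannon entropy -sum p_i log p_i (natural log), with 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def bernoulli_probs(h, tol=1e-12):
    """Probabilities p_1..p_k whose Shannon entropy equals the target h.

    Uses the family p = (1-(k-1)t, t, ..., t): its entropy increases
    continuously from 0 (t -> 0) to log k (t = 1/k), so the intermediate
    value theorem guarantees a solution, which bisection finds.
    """
    if h == 0:
        return [1.0]                      # a single atom has entropy 0
    k = 2
    while math.log(k) <= h:               # ensure h < log k
        k += 1
    lo, hi = 1e-15, 1.0 / k
    while hi - lo > tol:
        t = (lo + hi) / 2
        p = [1 - (k - 1) * t] + [t] * (k - 1)
        if shannon_entropy(p) < h:
            lo = t
        else:
            hi = t
    t = (lo + hi) / 2
    return [1 - (k - 1) * t] + [t] * (k - 1)

p = bernoulli_probs(1.0)                  # target entropy: h = 1 nat
assert abs(shannon_entropy(p) - 1.0) < 1e-9
```

The family is monotone in $t$ because $\frac{d}{dt}H = (k-1)\log\frac{1-(k-1)t}{t} > 0$ for $t<1/k$, which is what makes plain bisection sufficient.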



(The construction is basically to let the coordinates be independent, identically distributed copies of a random variable which takes the value $i$ with probability $p_i$, where $T$ is the shift. Because of independence, $H_\mu(\bigvee_{i=0}^{n-1} T^{-i}\xi)$ is just the entropy of an $n$-tuple whose coordinates are independent, when $\xi$ is the partition induced by a single coordinate. Because of the multiplicative property of independence, and the way logarithms work, all the $H_\mu(\bigvee_{i=0}^{n-1} T^{-i}\xi)$ terms evaluate to $nh$, so for that $\xi$, $h_\mu(T,\xi)=h$. And so on. Billingsley's book Ergodic Theory and Information has details.)
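The identity $H_\mu\big(\bigvee_{i=0}^{n-1} T^{-i}\xi\big)=nH_\mu(\xi)$ for an i.i.d. process can be checked by brute force (a Python sketch of mine; the distribution $(0.5,0.3,0.2)$ is an arbitrary example): each length-$n$ cylinder has measure equal to the product of its coordinate probabilities, so enumerating all cylinders gives the joint entropy directly.

```python
import math
from itertools import product

def shannon_entropy(probs):
    # Shannon entropy -sum p log p (natural log), skipping zero masses
    return -sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]          # distribution of one coordinate (example values)
h1 = shannon_entropy(p)

for n in range(1, 5):
    # cylinder [x_0, ..., x_{n-1}] has measure p[x_0] * ... * p[x_{n-1}]
    cyl = [math.prod(p[i] for i in word)
           for word in product(range(3), repeat=n)]
    hn = shannon_entropy(cyl)
    assert abs(hn - n * h1) < 1e-9, (n, hn, n * h1)
```

So $\frac1n H_\mu\big(\bigvee_{i=0}^{n-1}T^{-i}\xi\big)$ is constantly $h_1$, and both the limit and the infimum in the definition equal $h_1$.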



You can pick a single measure space, say $S=([0,1],\mathcal B, \lambda)$ with Lebesgue measure and, for any given $h$, find a measure-theoretic isomorphism between $B(h)$ and $S$, and use it to make a $T$ such that $([0,1],\mathcal B, \lambda,T)$ does what you want. (Conjugate the shift on $B(h)$ by the isomorphism.)
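A concrete instance of this transfer (my example, not part of the original answer): via binary expansion, the $(\tfrac12,\tfrac12)$ Bernoulli shift is conjugate to the doubling map $T(x)=2x \bmod 1$ on $([0,1],\mathcal B,\lambda)$, which preserves Lebesgue measure and has entropy $\log 2$. A quick sanity check that $T$ pushes a uniform grid onto a (doubly covered) uniform grid:

```python
# Doubling map T(x) = 2x mod 1 on [0, 1): the Lebesgue-measure-preserving
# conjugate (via binary expansion) of the (1/2, 1/2) Bernoulli shift.
def T(x):
    return (2 * x) % 1.0

N = 1000
grid = [(i + 0.5) / N for i in range(N)]   # midpoint grid, uniform mass
image = [T(x) for x in grid]

# Measure preservation, tested on intervals [0, t]: the fraction of image
# points landing in [0, t] should equal t, the Lebesgue measure of [0, t].
for t in (0.1, 0.3, 0.5, 0.9):
    frac = sum(1 for y in image if y <= t) / N
    assert abs(frac - t) < 1e-9, (t, frac)
```

The grid check works because $T^{-1}[0,t]=[0,t/2]\cup[\tfrac12,\tfrac12+t/2]$, two intervals of total length $t$, so each test interval receives exactly its original share of points.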



I don't quite understand what your comment "I hope this proposition is true for a general measure space" means, but maybe this is good enough for you.



I am also puzzled by how you came across this definition of a shift's entropy without also coming across examples of ergodic processes such as the Bernoulli shift, which are to information theory and ergodic theory as triangles are to Euclidean geometry. Most textbooks discuss the Kolmogorov–Sinai definition and then the Ornstein theorem, by which point the answer to your question should be clear.






  • For the Bernoulli shift case, even if we have found $h=-\sum p_i\log p_i$, I am still not sure why taking limits and suprema keeps this $h$ unchanged. – No One, 2 days ago










  • I have edited my answer. – kimchi lover, 2 days ago
edited 2 days ago
answered 2 days ago by kimchi lover