Limit of $Y_n = \left(\prod\limits_{i=1}^{n} X_i\right)^{1/n}$ for $n\to\infty$
Let $(X_n)_{n\geq 1}$ be a sequence of i.i.d. random variables such that $P(X_1 = 1) = P(X_1 = 2) = \frac{1}{2}$. Let $(Y_n)_{n\geq 1}$ be defined as
$$Y_n = \left(\prod_{i=1}^{n} X_i\right)^{1/n}$$ for all $n \geq 1$.
Show that there exists a real number $a$ (and determine it) such that $Y_n \to a$ almost surely as $n \to \infty$.
probability-theory
edited Mar 10 at 15:07 by Mars Plastic
asked Mar 10 at 14:13 by Jasper
3 Answers
$$\log(Y_n) = \frac{\sum_{i=1}^{n} \log(X_i)}{n}$$
By the strong law of large numbers, $\log(Y_n)$ converges almost surely to $E(\log(X_1))$.
I'll leave the computation of the expected value to you.
answered Mar 10 at 14:32 by Jennifer
$\mathbb{E}(\log(X_1)) = \frac{1}{2}(\log(1) + \log(2))$, but I don't see why $\log(Y_n)$ converges almost surely to $\mathbb{E}(\log(X_1))$.
– Jasper
Mar 10 at 20:59
The random variables $\log(X_1), \dots, \log(X_n)$ are i.i.d., so you can apply the strong law of large numbers; thus $\log(Y_n) = \frac{\sum_{i=1}^{n} \log(X_i)}{n}$ converges almost surely to $E(\log(X_1))$.
– Jennifer
Mar 10 at 21:14
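Completing the computation from the hint: $$\mathbb{E}(\log(X_1)) = \tfrac{1}{2}\log(1) + \tfrac{1}{2}\log(2) = \tfrac{1}{2}\log(2),$$ so $\log(Y_n) \to \tfrac{1}{2}\log(2)$ almost surely, and by continuity of the exponential function $$Y_n = e^{\log(Y_n)} \to e^{\frac{1}{2}\log(2)} = \sqrt{2} \quad \text{almost surely},$$ that is, $a = \sqrt{2}$.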
Hints: Observe that $$Y_{n}=2^{\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}_{\{X_{i}=2\}}}$$ where the $\mathbf{1}_{\{X_{i}=2\}}$ are i.i.d. with a Bernoulli distribution of parameter $\frac{1}{2}$.
- If $a_n$ converges, then so does $2^{a_n}$.
- Apply the SLLN.
answered Mar 10 at 14:27 by drhab (edited Mar 10 at 14:35)
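Making the last two hints explicit: by the SLLN, $\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}_{\{X_{i}=2\}} \to \frac{1}{2}$ almost surely, and since $t \mapsto 2^{t}$ is continuous, $$Y_{n} = 2^{\frac{1}{n}\sum_{i=1}^{n}\mathbf{1}_{\{X_{i}=2\}}} \to 2^{1/2} = \sqrt{2} \quad \text{almost surely}.$$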
Hint: Look at $\log(Y_n)$ and use the law of large numbers for $(\log(X_n))_{n\ge 1}$.
answered Mar 10 at 14:32 by Mars Plastic
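As a numerical sanity check of the hints above, here is a short simulation sketch (an illustrative addition; the random seed and the sample size n = 10**6 are arbitrary choices). It averages $\log(X_i)$, exponentiates, and the final value should sit close to $\sqrt{2} \approx 1.414$.

```python
import numpy as np

# Simulation sketch: estimate the almost-sure limit of
# Y_n = (X_1 * ... * X_n)^(1/n) for i.i.d. X_i uniform on {1, 2}.
# Working on the log scale avoids overflow of the running product.
rng = np.random.default_rng(0)   # arbitrary seed, for reproducibility
n = 10**6                        # arbitrary sample size
x = rng.integers(1, 3, size=n)   # each X_i is 1 or 2 with probability 1/2

log_y = np.cumsum(np.log(x)) / np.arange(1, n + 1)   # log(Y_1), ..., log(Y_n)
y_n = np.exp(log_y[-1])

print(y_n, np.sqrt(2))           # both should be close to 1.41421...
```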