
Finding the maximum entropy.


I'm trying to solve the following question:

[Image of the problem statement: maximize the entropy $h(x_1,\dots,x_n) = -\sum_{i=1}^{n} x_i \ln x_i$ subject to the constraint $x_1 + \cdots + x_n = 1$.]

Here is my attempt using Lagrange multipliers:

$L = -x_1 \ln x_1 - x_2 \ln x_2 - \cdots - x_n \ln x_n + \lambda (x_1 + \cdots + x_n)$

$0 = \frac{\partial L}{\partial x_1} = -1 - \ln x_1 + \lambda x_1$

$\vdots$

$0 = \frac{\partial L}{\partial x_n} = -1 - \ln x_n + \lambda x_n$

Solving this gives $\lambda x_1 - \ln x_1 = \cdots = \lambda x_n - \ln x_n$.

I am not sure if this is right, but I thought this implied that $x_1 = \cdots = x_n$.

Then, using the constraint, $x_1 = \cdots = x_n = \frac{1}{n}$.

Substituting back, the maximum of $h$ is $-\ln \frac{1}{n} = \ln(n)$.

I am not a physics major, so I am not sure whether this is the correct approach.
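
For reference, the stationarity computation is usually written with the constraint entering the Lagrangian as $\lambda (x_1 + \cdots + x_n - 1)$, so that its derivative with respect to each $x_i$ is the constant $\lambda$; under that assumption the equations force every coordinate to the same value directly:

$0 = \frac{\partial L}{\partial x_i} = -\ln x_i - 1 + \lambda \implies x_i = e^{\lambda - 1}$ for every $i$

$x_1 + \cdots + x_n = n e^{\lambda - 1} = 1 \implies x_i = \frac{1}{n}$

$h_{\max} = -\sum_{i=1}^{n} \frac{1}{n} \ln \frac{1}{n} = \ln n$

Since the Hessian of $h$ is diagonal with entries $-\frac{1}{x_i} < 0$, $h$ is strictly concave on the positive orthant, so this critical point is the global maximum over the constraint set rather than a minimum or saddle point.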










lagrange-multiplier maxima-minima entropy






asked Mar 19 at 4:23









numericalorange








  • This is correct. Can you see why having each probability equal would give maximum entropy? – David G. Stork, Mar 19 at 4:27
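
A quick numerical sanity check of the $\ln n$ bound, sketched in Python (the entropy helper below is just an illustrative name, not anything from the original post):

    import math
    import random

    def entropy(p):
        # Shannon entropy with natural logarithm; terms with p_i = 0 contribute 0.
        return -sum(x * math.log(x) for x in p if x > 0)

    n = 4
    uniform = [1.0 / n] * n
    print(entropy(uniform), math.log(n))   # both print ~1.386294

    # Random distributions on n outcomes should never beat the uniform one.
    for _ in range(5):
        w = [random.random() for _ in range(n)]
        p = [x / sum(w) for x in w]
        assert entropy(p) <= math.log(n) + 1e-12

Running this prints matching values for the uniform distribution, and the assertions pass, consistent with the maximum $h_{\max} = \ln n$ found above.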









