An inequality related to the Renyi divergence




Can you prove the following?



Conjecture. Let $\lambda > 1$. Let $p_i$, $q_i$, $\mu_i$, $\nu_i$ be probability densities over $\mathbb R$ for $i = 1, \dots, n$, such that for all $i = 1, \dots, n$ (all integrations are over $\mathbb R$)



$$\int \frac{p_i(x)^\lambda}{q_i(x)^{\lambda - 1}} \, dx \le \int \frac{\mu_i(x)^\lambda}{\nu_i(x)^{\lambda - 1}} \, dx$$



Then



$$\int \frac{\bigl(\sum_i p_i(x)\bigr)^\lambda}{\bigl(\sum_i q_i(x)\bigr)^{\lambda - 1}} \, dx \le \int \frac{\bigl(\sum_i \mu_i(x)\bigr)^\lambda}{\bigl(\sum_i \nu_i(x)\bigr)^{\lambda - 1}} \, dx$$



[End of Conjecture]



The Conjecture is equivalent to its special case when $n = 2$ (by an induction argument).



Where does this conjecture come from? Well, let $p$ and $q$ be two probability densities; then the Renyi divergence $D_\lambda(p \,\|\, q)$ is defined by



$$D_\lambda(p \,\|\, q) = \frac{1}{\lambda - 1} \log \int \frac{p(x)^\lambda}{q(x)^{\lambda - 1}} \, dx.$$



Like the KL divergence (the $\lambda \to 1$ limit), it measures the difference between the two densities. The Conjecture basically says: if each pair $p_i$, $q_i$ is closer together than the corresponding pair $\mu_i$, $\nu_i$, then the same holds for their averages:



$$D_\lambda(p_i \,\|\, q_i) \le D_\lambda(\mu_i \,\|\, \nu_i)$$



$$\Rightarrow$$



$$D_\lambda\Bigl(n^{-1} \sum_i p_i \,\Big\|\, n^{-1} \sum_i q_i\Bigr) \le D_\lambda\Bigl(n^{-1} \sum_i \mu_i \,\Big\|\, n^{-1} \sum_i \nu_i\Bigr).$$



Alternatively, can you prove the Conjecture specialised to Gaussian distributions:



$$p_i \sim N(\alpha_i, \sigma^2); \qquad q_i \sim N(\beta_i, \sigma^2); \qquad \mu_i \sim N(\delta_i, \sigma^2); \qquad \nu_i \sim N(\gamma_i, \sigma^2)$$



where $|\alpha_i - \beta_i| \le |\delta_i - \gamma_i|$.



Note that



$$D_\lambda\bigl(N(\alpha, \sigma^2) \,\big\|\, N(\beta, \sigma^2)\bigr) = \frac{\lambda (\alpha - \beta)^2}{2 \sigma^2}.$$
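
As a quick numerical sanity check of this closed form, here is a minimal Python sketch (assuming NumPy and SciPy are available; the parameter values below are arbitrary) that compares a quadrature estimate of the defining integral with the formula above:

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    def renyi_gaussian_numeric(alpha, beta, sigma, lam):
        """D_lambda(N(alpha, sigma^2) || N(beta, sigma^2)) by quadrature."""
        def integrand(x):
            # p(x)^lam / q(x)^(lam - 1), computed in log space to avoid underflow
            return np.exp(lam * norm.logpdf(x, alpha, sigma)
                          - (lam - 1) * norm.logpdf(x, beta, sigma))
        value, _ = quad(integrand, -20, 20)
        return np.log(value) / (lam - 1)

    alpha, beta, sigma, lam = 0.3, -0.7, 1.2, 2.5
    print(renyi_gaussian_numeric(alpha, beta, sigma, lam))  # ~0.868
    print(lam * (alpha - beta) ** 2 / (2 * sigma ** 2))     # 0.86805...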



























probability-distributions information-theory entropy

asked Mar 14 at 12:54 by Y. Pei, edited Mar 14 at 13:27
1 Answer

The conjecture is false. Here is a simple counter-example with Gaussians and $n=2$:
\begin{align}
p_1 &\sim N(-1,1), \quad q_1 \sim N(1,1), \quad u_1 \sim N(-2,1), \quad v_1 \sim N(2,1) \\
p_2 &\sim N(-1,1), \quad q_2 \sim N(1,1), \quad u_2 \sim N(2,1), \quad v_2 \sim N(-2,1).
\end{align}



Then, $$D_\lambda(p_1 \Vert q_1) = D_\lambda(p_2 \Vert q_2) = 2\lambda < 8\lambda = D_\lambda(u_1 \Vert v_1) = D_\lambda(u_2 \Vert v_2).$$



However, $p_1 + p_2 \ne q_1 + q_2$ while $u_1 + u_2 = v_1 + v_2$, so
$$D_\lambda\bigl((u_1+u_2)/2 \,\Vert\, (v_1 + v_2)/2\bigr) = 0 < D_\lambda\bigl((p_1+p_2)/2 \,\Vert\, (q_1 + q_2)/2\bigr).$$
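
For completeness, a minimal Python sketch of this counter-example (assuming NumPy and SciPy; the choice $\lambda = 2$ and the integration range are arbitrary): the $u$/$v$ mixtures coincide, so their divergence vanishes, while the $p$/$q$ mixtures remain strictly apart.

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    lam = 2.0

    def renyi(p, q):
        """D_lambda(p || q) for densities given as callables, by quadrature."""
        value, _ = quad(lambda x: p(x) ** lam / q(x) ** (lam - 1), -15, 15)
        return np.log(value) / (lam - 1)

    # Mixtures from the counter-example (all components have unit variance).
    p_mix = lambda x: 0.5 * (norm.pdf(x, -1, 1) + norm.pdf(x, -1, 1))
    q_mix = lambda x: 0.5 * (norm.pdf(x,  1, 1) + norm.pdf(x,  1, 1))
    u_mix = lambda x: 0.5 * (norm.pdf(x, -2, 1) + norm.pdf(x,  2, 1))
    v_mix = lambda x: 0.5 * (norm.pdf(x,  2, 1) + norm.pdf(x, -2, 1))

    print(renyi(p_mix, q_mix))  # about 2*lam = 4: the p/q mixtures differ
    print(renyi(u_mix, v_mix))  # ~0: u_mix and v_mix are the same density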






answered Mar 18 at 1:31 by Artemy













• Thank you. What a nice counterexample... I upvoted your answer, but the vote is recorded and not shown because I don't have sufficient reputation points on this site. – Y. Pei, Mar 19 at 15:59

• @Y.Pei I think you should be able to "accept" the answer. – Artemy, Mar 19 at 22:17










