Calculating an infinite integral of log-normal distribution




The integral is

$$\int^\infty_0 x \exp\Big(\frac{-(\log x-\mu)^2}{2\sigma^2}\Big)\,dx$$

(up to the normalizing constant, this is the second moment of the log-normal distribution).

I've tried several substitutions, such as

$u=\log x$,

$u=\log x - \mu$,

$u=(\log x - \mu)^2$.

However, all of them lead to more complicated expressions. I can calculate the integral when the exponent contains $x$ instead of $\log x$, but with the logarithm it gets complicated. Is there a way to calculate it using some trick, at the undergraduate level?

Thanks in advance.










Tags: probability, integration, indefinite-integrals






asked Mar 9 at 17:13 by lkky7 · edited 2 days ago
  • Is there a typo? The integral as shown is for the first moment. Which moment are you trying to calculate? – Lee David Chung Lin, Mar 9 at 17:28

  • The integral in your edited question is the third moment, not the second. – saz, 2 days ago

  • Hah... Yes, it was all a typo. Thank you. – lkky7, 2 days ago














1 Answer

Denote by

$$p(x) = \frac{1}{x} \frac{1}{\sqrt{2\pi \sigma^2}} \exp \left(- \frac{(\log x- \mu)^2}{2\sigma^2} \right)$$

the probability density function of the log-normal distribution. For $y := \log x - \mu$ we have

$$\frac{dy}{dx} = \frac{1}{x} = \exp(-y-\mu),$$

i.e.

$$dx = \exp(y+\mu) \, dy,$$

and therefore a change of variables gives

$$\int_0^{\infty} x \, p(x) \, dx = \frac{1}{\sqrt{2\pi \sigma^2}} \int_{\mathbb{R}} \exp(y+\mu) \exp \left(- \frac{y^2}{2\sigma^2} \right) \, dy.$$

The right-hand side is the exponential moment of a Gaussian random variable; more precisely, if $Y \sim N(0,\sigma^2)$ then the right-hand side equals

$$\exp(\mu) \, \mathbb{E}\exp(Y).$$
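(For completeness, this exponential moment can be evaluated by completing the square in the exponent:

$$\mathbb{E}\exp(Y) = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{\mathbb{R}} \exp\left(y - \frac{y^2}{2\sigma^2}\right) dy = \exp\left(\frac{\sigma^2}{2}\right) \frac{1}{\sqrt{2\pi\sigma^2}} \int_{\mathbb{R}} \exp\left(- \frac{(y-\sigma^2)^2}{2\sigma^2}\right) dy = \exp\left(\frac{\sigma^2}{2}\right),$$

where the last step uses that the integrand is, up to the prefactor, the density of an $N(\sigma^2,\sigma^2)$ random variable and therefore integrates to $\sqrt{2\pi\sigma^2}$.)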



Since exponential moments of Gaussian random variables can be calculated explicitly, we get

$$\int_0^{\infty} x \, p(x) \, dx = \exp \left( \mu + \frac{1}{2} \sigma^2 \right).$$

Equivalently,

$$\int_{(0,\infty)} \exp \left(- \frac{(\log x-\mu)^2}{2\sigma^2} \right) \, dx = \sqrt{2\pi \sigma^2} \exp \left( \mu + \frac{1}{2} \sigma^2 \right).$$

Remark: The same reasoning also works for the higher moments

$$\int_{(0,\infty)} x^k \, p(x) \, dx$$

for $k \geq 1$. Following the argument above, we get

$$\int_{(0,\infty)} x^k \, p(x) \, dx = \exp(k \mu) \, \mathbb{E}\exp(kY) = \exp \left( k \mu + \frac{1}{2} \sigma^2 k^2 \right),$$

i.e.

$$\int_{(0,\infty)} x^{k-1} \exp \left( - \frac{(\log x-\mu)^2}{2\sigma^2} \right) \, dx = \sqrt{2\pi \sigma^2} \exp \left( k \mu + \frac{1}{2} \sigma^2 k^2 \right).$$
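As a quick numerical sanity check of this last identity, here is a minimal Python sketch (assuming NumPy and SciPy are available; the parameter values below are arbitrary test choices):

    import numpy as np
    from scipy.integrate import quad

    def integral(k, mu, sigma):
        """Numerically evaluate int_0^inf x^(k-1) * exp(-(log x - mu)^2 / (2 sigma^2)) dx."""
        f = lambda x: x**(k - 1) * np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
        value, _abserr = quad(f, 0, np.inf)
        return value

    def closed_form(k, mu, sigma):
        """The closed form sqrt(2 pi sigma^2) * exp(k*mu + k^2 * sigma^2 / 2) derived above."""
        return np.sqrt(2 * np.pi * sigma**2) * np.exp(k * mu + 0.5 * (k * sigma)**2)

    mu, sigma = 0.3, 0.7   # arbitrary test parameters
    for k in (1, 2, 3):
        print(k, integral(k, mu, sigma), closed_form(k, mu, sigma))

The two values printed for each $k$ should agree to within the quadrature tolerance.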






answered Mar 9 at 17:43 by saz · edited 2 days ago





























