The implication of zero mixed partial derivatives for a multivariate function's minimization



Suppose $f(\textbf{x}) = f(x_1,x_2)$ has mixed partial derivatives $f''_{12} = f''_{21} = 0$. Can I then say that there exist $f_1(x_1)$ and $f_2(x_2)$ such that $\min_{\textbf{x}} f(\textbf{x}) \equiv \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$? Or, even further, that
$$f(\textbf{x}) \equiv f_1(x_1) + f_2(x_2)?$$

A simple positive case is $f(x_1,x_2) = x_1^2 + x_2^3$. I cannot think of any counterexample, but I am not sure about it and may need a proof.
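For what it's worth, a quick symbolic check of this positive case (a sketch using sympy; the coupled function $g$ below is only an illustrative contrast, not part of the question):

    import sympy as sp

    x1, x2 = sp.symbols('x1 x2')

    # The positive case from above: both mixed partials vanish identically.
    f = x1**2 + x2**3
    print(sp.diff(f, x1, x2), sp.diff(f, x2, x1))   # 0 0

    # Illustrative contrast: a coupling term makes the mixed partial nonzero.
    g = x1**2 + x2**3 + x1*x2
    print(sp.diff(g, x1, x2))                       # 1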










Tags: multivariable-calculus, optimization, partial-derivative






asked Jun 24 '13 at 22:43 by jorter.ji
edited Nov 1 '15 at 20:04 by user147263








  • Concerning the passage from $f(\textbf{x}) \equiv f_1(x_1) + f_2(x_2)$ to $\min_{\textbf{x}} f(\textbf{x}) = \min_{x_1} f_1(x_1) + \min_{x_2} f_2(x_2)$, see this question. – user147263, Dec 29 '15 at 0:24










2 Answers






For a mixed derivative $f_{xy} = 0$, integrating with respect to $y$ gives:
$$
f_x(x,y) = \int f_{xy}\,dy + h(x).
$$
Integrating with respect to $x$:
$$
f(x,y) = \iint f_{xy}\,dy\,dx + \int h(x)\,dx + g(y).
$$
A similar result follows if we start from $f_{yx}$. Since $f_{xy} = 0$, this implies
$$
f(x,y) = f_1(x) + f_2(y),
$$
which is exactly the conclusion in the question.
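As a concrete sketch of these two integrations (the particular $f$ below is an assumed example, chosen only so that $f_{xy} = 0$), sympy can carry out the reconstruction:

    import sympy as sp

    x, y = sp.symbols('x y')

    # Assumed example with f_xy = 0; rebuild the separated form from f_x.
    f = sp.exp(x) + sp.sin(y)
    fx = sp.diff(f, x)
    assert sp.diff(fx, y) == 0      # f_xy = 0, so f_x depends on x alone

    f1 = sp.integrate(fx, x)        # the x-part, up to an additive constant
    f2 = sp.simplify(f - f1)        # the remainder is a function of y alone
    assert sp.diff(f2, x) == 0
    print(f1, '+', f2)              # exp(x) + sin(y)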






answered Jun 24 '13 at 23:14 by Shuhao Cao
  • Hi again. Can I say $f_1$ and $f_2$ are unique, excluding constants? – jorter.ji, Jun 26 '13 at 6:13

  • @jorter.ji If you only know that the mixed derivative is zero, then any differentiable functions $f_1$ and $f_2$ will do, so they are not unique unless you have other conditions. – Shuhao Cao, Jun 26 '13 at 12:32

  • Like what kind of conditions? Or can you think of any instance such that $f = f_1 + f_2 = f_3 + f_4$? – jorter.ji, Jun 26 '13 at 16:49

  • @jorter.ji I meant that, provided only the differential relation $f_{xy} = f_{yx} = 0$ is known, there are infinitely many choices for $g$ and $h$ in $f = g(x) + h(y)$: $f$ can be $x + y$, or $x^2 + y^2$, and so on; not that these are equal to one another. – Shuhao Cao, Jun 26 '13 at 16:52

  • Oh, I see your meaning. Actually, I already know that $f(x,y) = f_1(x) + f_2(y) + V(x,y)$, where $V(x,y)$ is an implicit but determined function, i.e., $f(x,y)$ is somehow implicitly determined; I then wonder whether $f = g(x) + h(y)$ is unique. – jorter.ji, Jun 26 '13 at 17:41
The answer of @Shuhao Cao needs the assumption that the first partial derivative is integrable.

Here I try to provide a proof without that assumption.

Restatement

I restate the conjecture under slightly weaker conditions:

If $f(x, y)$ has $f_{yx} = 0$, then $f(x, y) = f_1(x) + f_2(y)$.



Proof

By the Mean Value Theorem,
$$f_y(x, y) = f_y(0, y) + f_{yx}(\xi, y)\,x, \qquad \xi \in (0, x),$$
and since $f_{yx} = 0$ this implies $f_y(x, y) = f_y(0, y)$.

For every $x_0$, $f(x_0, y)$ is an antiderivative of $f_y(0, y)$ as a function of $y$. Any two antiderivatives differ by a constant, so $f(x, y) = f(0, y) + c(x)$ for some function $c$; setting $y = 0$ gives $c(x) = f(x, 0) - f(0, 0)$. Hence
$$f(x, y) = f(0, y) + f(x, 0) - f(0, 0) = f_1(x) + f_2(y).$$
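A quick symbolic check of this decomposition on an assumed example (the particular $f$ below is mine, chosen only to satisfy $f_{yx} = 0$; it is not taken from the proof):

    import sympy as sp

    x, y = sp.symbols('x y')

    # Assumed example with f_yx = 0.
    f = sp.Lambda((x, y), x**3 - 2*x + sp.cos(y) + 5)
    assert sp.diff(f(x, y), y, x) == 0

    # The proof's decomposition: f1(x) = f(x, 0) - f(0, 0), f2(y) = f(0, y).
    rhs = (f(x, 0) - f(0, 0)) + f(0, y)
    assert sp.simplify(f(x, y) - rhs) == 0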



Annotation

The key point is that "any two antiderivatives differ by a constant" can be proved from the Mean Value Theorem alone, with no appeal to the Riemann integral.






answered Mar 16 at 8:52 by TA123