


What does it mean when I add a new variable to my linear model and the R^2 stays the same?


I'm inclined to think that the new variable is not correlated with the response. But could the new variable instead be correlated with another variable already in the model?







Tags: linear-model, r-squared






4 votes · asked 9 hours ago by Chance113 (362)









  • It depends; could you provide us with some reduced data lines or output from your linear models? Without more information it's hard to assist you.
    – OliverFishCode, 8 hours ago

  • It shouldn't stay exactly the same unless it is perfectly orthogonal to your response, or is a linear combination of the variables already included. It may be that the change is smaller than the number of decimal places displayed.
    – gung, 8 hours ago (5)

  • @gung What you can infer is that the new variable is orthogonal to the response modulo the subspace generated by the other variables. That's more general than the two options you mention.
    – whuber, 8 hours ago (5)

  • @whuber, yes, I suppose so.
    – gung, 8 hours ago










  • Test your variables for multicollinearity (en.wikipedia.org/wiki/Multicollinearity); probably some features are linearly related. Use the car package's vif() in R (sthda.com/english/articles/39-regression-model-diagnostics/…); see the sketch after these comments.
    – Tom Zinger, 6 hours ago
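A minimal sketch of the multicollinearity check suggested in the comment above, using made-up simulated data (the names dat, fit, x1, x2, x3 are all hypothetical); vif() is provided by the car package.

    # Sketch: variance inflation factors for a fitted linear model (hypothetical data).
    library(car)                                        # provides vif()
    set.seed(1)
    dat <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
    dat$x3 <- dat$x1 + dat$x2 + rnorm(100, sd = 0.01)   # nearly collinear with x1 and x2
    dat$y  <- dat$x1 + rnorm(100)

    fit <- lm(y ~ x1 + x2 + x3, data = dat)
    car::vif(fit)                                       # very large VIFs flag the collinear predictors

Very large VIFs (well above the usual rules of thumb of 5 or 10) indicate that a predictor is close to a linear combination of the others, which is exactly the situation in which adding it leaves $R^2$ essentially unchanged.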

















2 Answers


















4 votes · answered 8 hours ago by TrynnaDoStat (5,552)













Seeing little to no change in $R^2$ when you add a variable to a linear model means that the variable has little to no additional explanatory power for the response beyond what is already in your model. As you note, this can be either because the variable tells you almost nothing about the response or because it explains the same variation in the response as the variables already in the model.
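To illustrate the two cases this answer describes, here is a small simulated sketch (all data and variable names are made up): a nearly redundant predictor and a predictor unrelated to the response both leave $R^2$ essentially where it was.

    # Simulated sketch: R^2 barely moves whether the added variable is
    # (a) nearly redundant with an existing predictor or (b) unrelated to the response.
    set.seed(42)
    n  <- 200
    x1 <- rnorm(n)
    y  <- 2 * x1 + rnorm(n)

    x_redundant <- x1 + rnorm(n, sd = 0.01)  # carries almost the same information as x1
    x_noise     <- rnorm(n)                  # carries no information about y

    r2 <- function(fit) summary(fit)$r.squared
    r2(lm(y ~ x1))                           # baseline R^2
    r2(lm(y ~ x1 + x_redundant))             # essentially unchanged
    r2(lm(y ~ x1 + x_noise))                 # essentially unchanged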






1 vote · answered 8 hours ago by user5957401 (297)













As others have alluded, seeing no change at all in $R^2$ when you add a variable to your regression is unusual. In finite samples, this should only happen when the new variable is an exact linear combination of variables already present. In that case, most standard regression routines simply exclude the variable from the regression, and $R^2$ remains unchanged because the model is effectively unchanged.

As you notice, this does not mean the variable is unimportant, only that you cannot distinguish its effect from that of the other variables in your model.

More broadly, however, I (and many here at Cross Validated) would caution against using $R^2$ for model selection and interpretation. What I've described above is how $R^2$ can fail to change while the variable is still important. Worse yet, $R^2$ can change somewhat (or even dramatically) when you include an irrelevant variable. Using $R^2$ for model selection largely fell out of favor in the 1970s, when it was dropped in favor of AIC (and its contemporaries). Today, a typical statistician would recommend cross-validation (see the site name) for model selection.

In general, adding a variable increases $R^2$, so using $R^2$ to determine a variable's importance is a bit of a wild goose chase. Even when trying to understand simple situations, you will end up with a completely absurd collection of variables.
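A hedged sketch of the exact-linear-combination case described above, with simulated data and made-up names: lm() reports an NA coefficient for the aliased column and the $R^2$ is identical to the smaller model, while even a pure-noise predictor still raises the raw $R^2$ slightly.

    # Sketch: an exact linear combination is dropped by lm() (NA coefficient),
    # leaving R^2 identical, while a pure-noise predictor still raises raw R^2 a little.
    set.seed(7)
    n  <- 100
    x1 <- rnorm(n)
    x2 <- rnorm(n)
    y  <- x1 + x2 + rnorm(n)
    x3 <- 2 * x1 - x2                        # exact linear combination of x1 and x2

    fit_base  <- lm(y ~ x1 + x2)
    fit_alias <- lm(y ~ x1 + x2 + x3)
    coef(fit_alias)                          # coefficient for x3 is NA (aliased)
    summary(fit_base)$r.squared              # identical to ...
    summary(fit_alias)$r.squared             # ... this

    x_noise   <- rnorm(n)                    # unrelated to y
    fit_noise <- lm(y ~ x1 + x2 + x_noise)
    summary(fit_noise)$r.squared             # slightly larger than the baseline
    AIC(fit_base, fit_noise)                 # AIC, by contrast, charges for the useless extra parameter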






    • Could you elaborate on "$R^2$ can change somewhat (or even dramatically) when you include an irrelevant variable", specifically on the case of a dramatic change? In which sense would the variable then be irrelevant?
      – Richard Hardy, 7 hours ago










