
Is the sample correlation always positively correlated with the sample variance?



The sample correlation $r$ and the sample standard deviation of $X$ (call it $s_X$) appear to be positively correlated when I simulate bivariate normal $X$, $Y$ with a positive true correlation (and negatively correlated when the true correlation between $X$ and $Y$ is negative). I found this somewhat counterintuitive. Very heuristically, I suppose it reflects the fact that $r$ represents the expected increase in $Y$ (in units of $\mathrm{SD}(Y)$) for a one-SD increase in $X$: if we estimate a larger $s_X$, then $r$ reflects the change in $Y$ associated with a larger change in $X$.
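This is the kind of simulation that shows the pattern (a sketch in Python/NumPy; the sample size, true correlation, and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, n, reps = 0.5, 50, 20000
cov = [[1.0, rho], [rho, 1.0]]  # unit variances, correlation rho

rs = np.empty(reps)   # sample correlations r
sxs = np.empty(reps)  # sample SDs of X
for i in range(reps):
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    x, y = xy[:, 0], xy[:, 1]
    rs[i] = np.corrcoef(x, y)[0, 1]
    sxs[i] = x.std(ddof=1)

# Across replications, r and s_X come out positively correlated when rho > 0.
print(np.corrcoef(rs, sxs)[0, 1])
```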



However, I would like to know whether $\mathrm{Cov}(r, s_X) > 0$ when $\rho > 0$ holds in general (at least in the case where $X$ and $Y$ are bivariate normal and $n$ is large). Letting $\sigma$ denote a true SD, we have:



$$\mathrm{Cov}(r, s_X) = E[r\, s_X] - \rho\, \sigma_X$$



$$\approx E\left[ \frac{\widehat{\mathrm{Cov}}(X,Y)}{s_Y} \right] - \frac{\mathrm{Cov}(X,Y)}{\sigma_Y}$$



I tried using a Taylor expansion on the first term, but it depends on $\mathrm{Cov}\left(\widehat{\mathrm{Cov}}(X,Y),\, s_Y\right)$, so that’s a dead end. Any ideas?



EDIT



Maybe a better direction would be to try to show that $\mathrm{Cov}(\widehat{\beta}, s_X) = 0$, where $\widehat{\beta}$ is the OLS coefficient from regressing $Y$ on $X$. Then, since $\widehat{\beta} = r\, \frac{s_Y}{s_X}$, we could argue that this implies the desired result. Since $\widehat{\beta}$ is almost like a difference of sample means, maybe we could get the former result using something like the known independence of the sample mean and variance for a normal RV?
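For the bivariate normal case, a conditioning argument supports this conjecture: given $X$, $E[\widehat{\beta} \mid X] = \beta$ is constant while $s_X$ is a function of $X$ alone, so $\mathrm{Cov}(\widehat{\beta}, s_X) = 0$ exactly. A numerical sketch of that check (sample size and replication count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n, reps = 0.5, 50, 20000
cov = [[1.0, rho], [rho, 1.0]]  # unit variances, correlation rho

betas = np.empty(reps)  # OLS slopes of y on x
sxs = np.empty(reps)    # sample SDs of X
for i in range(reps):
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    x, y = xy[:, 0], xy[:, 1]
    betas[i] = np.cov(x, y, ddof=1)[0, 1] / x.var(ddof=1)  # beta-hat
    sxs[i] = x.std(ddof=1)

# E[beta-hat | X] is constant, so beta-hat and s_X are uncorrelated;
# the empirical correlation should be statistically indistinguishable from 0.
print(np.corrcoef(betas, sxs)[0, 1])
```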










  • It would be unchanged. Hmm. I'm afraid I don't yet see the relevance, though. – half-pass, 7 hours ago

  • I should probably also note that while I wish this were a homework question, it's not... :) – half-pass, 7 hours ago

  • Ah, I didn't read the question carefully enough. My apologies. – jbowman, 7 hours ago

  • The first equality in your calculation is not correct. $s_X = \sqrt{s^2_X}$ is consistent for the standard deviation, but is not unbiased: en.wikipedia.org/wiki/Unbiased_estimation_of_standard_deviation – Andrew M, 7 hours ago

  • It's extremely close to unbiased for large $n$, though -- the rule-of-thumb correction factor for a normal RV is $(n - 1.5)$ vs. $(n - 1)$. – half-pass, 7 hours ago
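The near-unbiasedness mentioned in that last comment can be checked numerically: for normal samples, $E[s] = c_4(n)\,\sigma$ with $c_4(n) = \sqrt{2/(n-1)}\;\Gamma(n/2)/\Gamma((n-1)/2)$, and the rule of thumb $c_4(n) \approx \sqrt{(n-1.5)/(n-1)}$ is very close even for small $n$. A sketch (the choice $n = 10$ is arbitrary):

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n, reps, sigma = 10, 200000, 1.0

# Sample SD (divisor n-1) of each of `reps` normal samples of size n.
s = rng.normal(0.0, sigma, size=(reps, n)).std(axis=1, ddof=1)

# Exact bias constant for normal samples: E[s] = c4 * sigma.
c4 = math.sqrt(2.0 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

print(s.mean())                          # noticeably below sigma for small n
print(c4 * sigma)                        # matches the simulated mean
print(math.sqrt((n - 1.5) / (n - 1)))    # rule-of-thumb approximation to c4
```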















Tags: correlation, covariance, independence






asked 7 hours ago by half-pass; edited 5 hours ago

2 Answers

It will depend on the joint distribution. For the example you mention, the bivariate (zero-mean) normal distribution is characterized by the parameters $\rho, \sigma_X, \sigma_Y$. It follows that one can have all possible combinations of values of these three parameters, implying that no relation between $\rho$ and the standard deviations can be established.



For other bivariate distributions, the correlation coefficient may be fundamentally a function of the standard deviations (essentially both will be functions of more primitive parameters), in which case one can examine whether a monotonic relation exists.






– Alecos Papadopoulos, answered 5 hours ago
  • I understand that the three parameters can have arbitrary relationships for the BVN distribution, but I don't think it follows that the sample estimates of these are asymptotically independent. – half-pass, 5 hours ago



















Yes, it does hold asymptotically regardless of the distribution of $X$ and $Y$. I was on the right track with the Taylor expansion; I just needed to make a symmetry argument:



[image: hand-written derivation of the asymptotic covariance via Taylor expansion and a symmetry argument]






– half-pass, answered 4 hours ago
