  1. regression - When is R squared negative? - Cross Validated

    Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is equivalent to …
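A quick numerical sketch (not from the thread itself) of the identity the snippet states: for OLS with an intercept, R² computed from sums of squares equals the squared correlation between fitted and observed values, so it cannot be negative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + rng.normal(size=100)

# Fit y = a + b*x by ordinary least squares.
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

# R^2 from residual and total sums of squares ...
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# ... equals the squared correlation of fitted vs observed values.
corr = np.corrcoef(y, y_hat)[0, 1]
print(np.isclose(r_squared, corr ** 2))  # → True
```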

  2. regression - What does it mean to regress a variable against another ...

    Dec 21, 2016 · Those words connote causality, but regression can work the other way round too (use Y to predict X). The independent/dependent variable language merely specifies how one thing depends …

  3. What's the difference between correlation and simple linear regression ...

    Aug 1, 2013 · Note that one perspective on the relationship between regression & correlation can be discerned from my answer here: What is the difference between doing linear regression on y with x …
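One concrete link between the two, as a sketch with simulated data (an illustration, not the linked answer's own code): the simple-regression slope of y on x equals the Pearson correlation rescaled by the ratio of standard deviations.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(size=200)

r = np.corrcoef(x, y)[0, 1]
slope = np.polyfit(x, y, 1)[0]

# slope of y-on-x regression = r * sd(y) / sd(x)
print(np.isclose(slope, r * y.std() / x.std()))  # → True
```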

  4. Multivariable vs multivariate regression - Cross Validated

    Feb 2, 2020 · Multivariable regression is any regression model where there is more than one explanatory variable. For this reason it is often simply known as "multiple regression". In the simple …

  5. regression - Trying to understand the fitted vs residual plot? - Cross ...

    Dec 23, 2016 · A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is reasonable. The …
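The "bounce randomly around the 0 line" property can be checked numerically; a minimal sketch with simulated data (my own illustration, not the thread's): with an intercept, OLS residuals have mean exactly zero and zero correlation with the fitted values.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 150)
y = 1.0 + 3.0 * x + rng.normal(scale=2.0, size=150)

b, a = np.polyfit(x, y, 1)
fitted = a + b * x
residuals = y - fitted

# With an intercept, residuals average zero and are uncorrelated
# with the fitted values -- the "random bounce" around the 0 line.
print(np.isclose(residuals.mean(), 0.0))
print(np.isclose(np.corrcoef(fitted, residuals)[0, 1], 0.0, atol=1e-8))
```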

  6. regression - Interpret log-linear with dummy variable - Cross Validated

    Apr 30, 2019 · I have the following model: ln(y) = b0 + b1 X1 + b2 ln(X2) + b3 X3. My X1 is a dummy that can take the values 0, 1 and 2. The coefficient for the dummy 1 is -0.500. My question is how do …
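The standard reading of such a coefficient, sketched with the snippet's value of -0.500 (the interpretation below is the usual log-linear rule, not the thread's accepted answer): switching the dummy from 0 to 1 multiplies y by exp(b1), i.e. changes it by (exp(b1) - 1) × 100 percent.

```python
import numpy as np

# In ln(y) = b0 + b1*X1 + ..., a dummy coefficient b1 = -0.5 means
# y is multiplied by exp(b1) when the dummy goes from 0 to 1.
b1 = -0.5
pct_change = (np.exp(b1) - 1) * 100
print(round(pct_change, 1))  # → -39.3, i.e. roughly a 39% decrease in y
```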

  7. How should outliers be dealt with in linear regression analysis?

    What statistical tests or rules of thumb can be used as a basis for excluding outliers in linear regression analysis? Are there any special considerations for multilinear regression?

  8. Linear model with both additive and multiplicative effects

    Sep 23, 2020 · In a log-level regression, the independent variables have an additive effect on the log-transformed response and a multiplicative effect on the original untransformed response:
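The additive/multiplicative duality the snippet describes is just the exponential of a sum being a product; a minimal sketch with made-up coefficients:

```python
import numpy as np

# Log-level model: ln(y) = b0 + b1*x  =>  y = exp(b0) * exp(b1)**x.
# A unit increase in x adds b1 on the log scale and multiplies y by exp(b1).
b0, b1 = 1.0, 0.2
x = 3.0
y_from_log_scale = np.exp(b0 + b1 * x)
y_multiplicative = np.exp(b0) * np.exp(b1) ** x
print(np.isclose(y_from_log_scale, y_multiplicative))  # → True
```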

  9. correlation - What is the difference between linear regression on y ...

    The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be the ...
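The two regressions are nevertheless not the same line; a sketch with simulated data (my own illustration of the standard identity, not the answer's code): the y-on-x slope minimizes vertical errors, the x-on-y slope horizontal ones, and their product is r².

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=300)
y = 0.8 * x + rng.normal(size=300)

slope_y_on_x = np.polyfit(x, y, 1)[0]   # minimizes vertical errors
slope_x_on_y = np.polyfit(y, x, 1)[0]   # minimizes horizontal errors
r = np.corrcoef(x, y)[0, 1]

# The slopes differ, but their product equals r^2.
print(np.isclose(slope_y_on_x * slope_x_on_y, r ** 2))  # → True
```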

  10. When conducting multiple regression, when should you center your ...

    Jun 5, 2012 · In some literature, I have read that a regression with multiple explanatory variables, if they are in different units, needs to be standardized. (Standardizing consists in subtracting the mean and dividin...
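The transformation the snippet starts to define can be sketched in a few lines (an illustration of the standard z-score, not the thread's answer): subtract the mean and divide by the standard deviation, after which every predictor has mean 0 and unit variance regardless of its original units.

```python
import numpy as np

# Standardize a predictor: subtract the mean, divide by the standard
# deviation, so variables in different units share a common scale.
x = np.array([10.0, 20.0, 30.0, 40.0])
z = (x - x.mean()) / x.std(ddof=1)

print(np.isclose(z.mean(), 0.0))       # → True: centered at zero
print(np.isclose(z.std(ddof=1), 1.0))  # → True: unit variance
```

Centering alone (subtracting the mean without rescaling) is the weaker variant the question title asks about.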