Question
The Simple Linear Regression model is
Y = b0 + b1*X1 + u
and the Multiple Linear Regression model with k variables is:
Y = b0 + b1*X1 + b2*X2 + ... + bk*Xk + u
Y is the dependent variable; X1, X2, ..., Xk are the explanatory variables; b0 is the intercept; b1, b2, ..., bk are the slope coefficients; and u is the error term.
Yhat represents the OLS fitted values, uhat represents the OLS residuals, b0_hat represents the OLS estimated intercept, and b1_hat, b2_hat, ..., bk_hat represent the OLS estimated slope coefficients.
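As a minimal sketch of how these quantities relate (using simulated data; all variable names here are illustrative, not from the question), the OLS fitted values and residuals can be computed with numpy:

```python
import numpy as np

# Simulate data from a known MLR model: Y = 2 + 1.5*X1 + 0.5*X2 + u
rng = np.random.default_rng(0)
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
u = rng.normal(size=n)
Y = 2 + 1.5 * X1 + 0.5 * X2 + u

# Design matrix with a column of ones for the intercept b0
X = np.column_stack([np.ones(n), X1, X2])

# OLS estimates (b0_hat, b1_hat, b2_hat), fitted values, and residuals
b_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ b_hat   # Yhat: OLS fitted values
u_hat = Y - Y_hat   # uhat: OLS residuals
```

Because the model includes an intercept, the residuals sum to zero and each observation decomposes exactly as Y = Yhat + uhat.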
QUESTION 7
In the MLR model, the assumption of ‘linearity in parameters’ is violated if:
- one of the slope coefficients appears as a power (e.g. Y = b0 + b1*(X1^b2) + b3*X2 + u)
- the model includes the reciprocal of a variable (e.g. 1/X1)
- the model includes a variable squared (e.g. X1^2)
- the model includes a variable in its logarithmic form (i.e. log(X1) )
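A quick sketch of the distinction Question 7 tests (simulated data, hypothetical names): transformations of the *variables* such as log(X1) or X1^2 still leave the model linear in the *parameters*, so it can be written as a design matrix times a coefficient vector and estimated by OLS; a model like Y = b0 + b1*(X1^b2) + u cannot.

```python
import numpy as np

rng = np.random.default_rng(1)
X1 = rng.uniform(1, 5, size=200)
Y = 1 + 2 * np.log(X1) + 0.3 * X1**2 + rng.normal(size=200)

# Transformed regressors: the model is still linear in (b0, b1, b2),
# i.e. it has the form X @ b, so plain OLS applies.
X = np.column_stack([np.ones_like(X1), np.log(X1), X1**2])
b_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
```

By contrast, b1*(X1^b2) puts a parameter in an exponent, so no choice of design matrix makes the model linear in the parameters.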
QUESTION 8
In the MLR model, the assumption of 'no perfect collinearity' is violated if:
- the model includes two variables that are not correlated
- the model includes both X1 and X1^2 (i.e. X1-squared)
- two of the explanatory variables have a Pearson correlation equal to 0.98
- two of the explanatory variables have a Pearson correlation equal to -1
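The key distinction in Question 8 can be checked numerically (simulated data, hypothetical names): X1 and X1^2 are related *nonlinearly*, so the design matrix keeps full column rank, whereas a Pearson correlation of exactly -1 means an exact *linear* relationship, which makes the design matrix rank-deficient.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
X1 = rng.normal(size=n)

# X1 and X1^2: no exact linear relation, so no perfect collinearity.
X_a = np.column_stack([np.ones(n), X1, X1**2])
rank_a = np.linalg.matrix_rank(X_a)  # full rank: 3

# X2 = -X1 has Pearson correlation exactly -1: an exact linear relation,
# so the 'no perfect collinearity' assumption is violated.
X2 = -X1
X_b = np.column_stack([np.ones(n), X1, X2])
rank_b = np.linalg.matrix_rank(X_b)  # rank-deficient: 2
```

A correlation of 0.98, while high, is not an exact linear relation; OLS remains computable (though estimates become imprecise).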
QUESTION 9
Suppose the estimated MLR model is Yhat = 2 + 1.5*X1 + 0.5*X2 + 2*X3.
Suppose that for an observation with X1=2, X2=-2, X3=5, we observe an actual value in the sample of Y=10. What is the residual, uhat, for this observation?
- negative 4
- negative 14
- positive 4
- positive 14
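The arithmetic in Question 9 can be checked directly: plug the observation into the estimated equation to get the fitted value Yhat, then take uhat = Y - Yhat.

```python
# Estimated coefficients and the observation from Question 9
b0_hat, b1_hat, b2_hat, b3_hat = 2, 1.5, 0.5, 2
X1, X2, X3, Y = 2, -2, 5, 10

# Fitted value: 2 + 1.5*2 + 0.5*(-2) + 2*5 = 2 + 3 - 1 + 10 = 14
Y_hat = b0_hat + b1_hat * X1 + b2_hat * X2 + b3_hat * X3

# Residual: actual minus fitted
u_hat = Y - Y_hat
print(u_hat)  # -4
```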