133230DHS618 - Logistic Regression Example

School: Trident University International
Course: 618
Subject: Statistics
Date: Feb 20, 2024

Module 3 Logistic Regression Example

QUESTION: Is the assumption of collinearity met?

To test this assumption, run a linear regression, as collinearity diagnostics are not available under logistic regression. Select Analyze > Regression > Linear, then click Statistics and check Collinearity diagnostics. When you run the linear regression, you should obtain the results below; disregard everything except the collinearity statistics. Since the tolerance values are above .2 and the VIF values are below 4, the assumption of collinearity is met.
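The tolerance and VIF values that SPSS reports can also be computed by hand, which makes the thresholds above concrete. This is a minimal NumPy sketch (the predictor data here are made up for illustration, not the document's dataset): each predictor is regressed on the others, and VIF = 1 / (1 - R²), tolerance = 1 / VIF.

```python
import numpy as np

def collinearity_stats(X):
    """Return (tolerance, VIF) for each column of predictor matrix X.

    VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j on the remaining predictors; tolerance_j = 1 / VIF_j.
    """
    n, k = X.shape
    tol = np.empty(k)
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])   # add intercept
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1 - resid.var() / y.var()
        tol[j] = 1 - r2
    return tol, 1 / tol

# Illustrative data: two mildly correlated predictors (hypothetical)
rng = np.random.default_rng(0)
score = rng.normal(70, 10, 100)
exp_ = 0.3 * score + rng.normal(0, 5, 100)
tol, vif = collinearity_stats(np.column_stack([score, exp_]))
```

A tolerance near 1 (VIF near 1) means a predictor shares little variance with the others; tolerance below .2 (VIF above 5, by the reciprocal relationship) is the usual warning zone.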
Perform logistic regression for all independent variables

Select Analyze > Regression > Binary Logistic to obtain the screen below. Move the dependent variable (pass) into the Dependent box and the independent variables (score, exp) into the Covariates box. Since the independent variables are not categorical, you do not need to use the Categorical button. To obtain goodness-of-fit results and 95% CIs for the odds ratios, click Options and check Hosmer-Lemeshow goodness-of-fit, Classification plots, and CI for exp(B). Select Continue.
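Under the hood, the dialog above fits coefficients by maximum likelihood, typically via Newton-Raphson (iteratively reweighted least squares). A minimal NumPy sketch of that fit, using simulated stand-ins for the pass/score/exp variables rather than the document's data:

```python
import numpy as np

def logistic_fit(X, y, iters=25):
    """Fit logit(p) = X @ beta by Newton-Raphson; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))      # current predicted probabilities
        W = p * (1 - p)                      # diagonal of the weight matrix
        H = X.T @ (X * W[:, None])           # Hessian X' W X
        beta += np.linalg.solve(H, X.T @ (y - p))  # Newton step
    return beta

# Simulated example data (hypothetical, for illustration only)
rng = np.random.default_rng(1)
score = rng.normal(70, 10, 200)
exp_ = rng.uniform(0, 20, 200)
true_logit = -4 + 0.05 * score + 0.15 * exp_
pass_ = (rng.random(200) < 1 / (1 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(200), score, exp_])
beta = logistic_fit(X, pass_)
odds_ratios = np.exp(beta[1:])   # analogue of the Exp(B) column in SPSS
```

Exponentiating each slope gives the odds ratio per one-unit increase in that predictor, which is what the Exp(B) column in the SPSS output reports.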
Run the regression by selecting OK. Your SPSS output should include the following:

1. The Variables in the Equation table below contains B, Wald statistics, p-values, adjusted ORs, and 95% CIs for all independent variables. Only experience was statistically significant (OR = 1.158, 95% CI = 1.004-1.336). The logistic regression equation is Logit(p) = -3.166 + .147*exp + .288*score.
2. The Model Summary table below contains the -2 log likelihood and the R² statistics. These are analogous to R² in multiple regression. The model accounted for 19.9% to 27% of the variance.
3. The tables below contain the omnibus chi-square goodness-of-fit statistic and the Hosmer and Lemeshow test. The omnibus chi-square goodness-of-fit test approached significance (χ² = 5.768, df = 2, p = .056). The Hosmer and Lemeshow test indicated no significant difference between observed and expected values (χ² = 9.087, df = 4, p = .335), indicating that the model fit the data.
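The numbers in point 1 are connected: the reported OR for experience is just the exponential of its coefficient, and the fitted equation can be inverted to give a predicted pass probability. A short sketch using the coefficients from the text (the example inputs to the probability function are hypothetical):

```python
import math

# Coefficients from the fitted equation Logit(p) = -3.166 + .147*exp + .288*score
b0, b_exp, b_score = -3.166, 0.147, 0.288

# Exp(B): the odds ratio per one-unit increase in experience
or_exp = math.exp(b_exp)   # matches the reported OR of 1.158

def pass_probability(exp_years, score):
    """Invert the logit: p = 1 / (1 + e^(-logit))."""
    logit = b0 + b_exp * exp_years + b_score * score
    return 1 / (1 + math.exp(-logit))
```

Because the coefficient for experience is positive, each added year multiplies the odds of passing by about 1.158, holding score constant.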