[HW9] Hypothesis Testing and Regression Analysis
SUMMARY OUTPUT

Regression Statistics
  Multiple R           0.8973275062
  R Square             0.8051966534
  Adjusted R Square    0.7917619398
  Standard Error       2.2781474488
  Observations         32

ANOVA
                df    SS              MS             F             Significance F
  Regression     2    622.110031848   311.05501592   59.93403952   5.0020496E-11
  Residual      29    150.508718152   5.1899557983
  Total         31    772.61875
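
As a quick sanity check on the ANOVA table above, R Square and F can be recomputed from the sums of squares and degrees of freedom. The short Python sketch below is not part of the original workbook; it simply re-derives those two figures from the table's entries.

    # Re-derive R Square and F from the ANOVA sums of squares (values copied
    # from the regression output above).
    ss_regression = 622.110031848
    ss_residual = 150.508718152
    ss_total = 772.61875

    df_regression = 2
    df_residual = 29

    r_square = ss_regression / ss_total               # ~0.8052
    ms_regression = ss_regression / df_regression     # ~311.055
    ms_residual = ss_residual / df_residual           # ~5.190
    f_stat = ms_regression / ms_residual              # ~59.934

    print(f"R Square = {r_square:.10f}")
    print(f"F        = {f_stat:.8f}")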
                        Coefficients     Standard Error   t Stat          P-value        Lower 95%        Upper 95%
  Intercept             -13.365770151    7.65113640121    -1.7468999964   0.0912393677   -29.014101115    2.2825608123
  Yards/Game             0.1220941466    0.0123093957      9.9187766456   7.938703E-11    0.0969186056    0.1472696876
  Opponent Yards/Game   -0.0142907103    0.01617121466    -0.8837128587   0.3841190458   -0.0473645579    0.0187831372

Since our yards/game p-value is lower than our alpha, we can say that the yards/game predictor is a significant estimate. However, opponent yards/game is not a significant estimate, as its p-value is far larger than our alpha value. So changes in yards/game are associated with points/game, but changes in opponent yards/game are not associated with points/game.

The intercept coefficient shows that if yards/game and opponent yards/game are both zero, the predicted points/game is -13.365. The yards/game coefficient means that as yards per game increases by one unit, holding all other variables constant, points per game increases by 0.122. Even though opponent yards/game is not significant, we can still interpret its coefficient: as opponent yards per game increases by one unit, holding all other variables constant, points per game decreases by 0.014.
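
To make the interpretation of the coefficients concrete, the fitted equation Points/Game = -13.366 + 0.122 × Yards/Game - 0.014 × Opponent Yards/Game can be used to generate a prediction. The sketch below uses the coefficients from the output above; the example inputs (350 offensive yards/game and 320 opponent yards/game) are hypothetical and not taken from the underlying data.

    # Prediction from the fitted multiple regression.
    # Coefficients come from the summary output above; inputs are hypothetical.
    def predicted_points(yards_per_game: float, opp_yards_per_game: float) -> float:
        intercept = -13.365770151
        b_yards = 0.1220941466
        b_opp_yards = -0.0142907103
        return intercept + b_yards * yards_per_game + b_opp_yards * opp_yards_per_game

    # A hypothetical team averaging 350 yards/game while allowing 320 yards/game:
    print(predicted_points(350.0, 320.0))   # roughly 24.8 points per game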
SUMMARY OUTPUT

Regression Statistics
  Multiple R           0.7563284474
  R Square             0.5720327203
  Adjusted R Square    0.5577671443
  Standard Error       3.3199173918
  Observations         32

ANOVA
                df    SS              MS             F              Significance F
  Regression     1    441.963205342   441.96320534   40.098816955   5.5278694E-07
  Residual      30    330.655544658   11.021851489
  Total         31    772.61875
                        Coefficients    Standard Error   t Stat          P-value        Lower 95%       Upper 95%
  Intercept             -0.616076942    3.57169120526    -0.1724888594   0.8642116413   -7.9104435129   6.6782896289
  Passing Yards/Game     0.1041010312   0.0164395245      6.3323626676   5.527869E-07    0.0705270432   0.1376750193

Since the p-value for the passing yards per game coefficient is less than 1%, we can say that passing yards is a significant estimator for points per game. Looking at the intercept, with 0 passing yards per game the predicted points per game would be -0.61608. This may seem ridiculous since it is impossible to have negative points in a game, but we are using linear regression, so this is simply where the intercept happens to fall, and it is almost impossible to have zero passing yards in a game anyway. The passing yards per game coefficient means that if passing yards per game increases by one, points per game will increase by 0.104101. With this information, we can say that changes in passing yards per game impact our points per game.

[Scatter plot: "Passing Yards/Game vs Points/Game", with Passing Yards/Game on the x-axis (0 to 350) and Points/Game on the y-axis (5 to 40).]

The graph shows a strong positive relationship between these two variables. In other words, as passing yards per game increases, so does points per game (in general).
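
The same kind of check works for the simple regression: the t statistic for the passing yards coefficient is just the coefficient divided by its standard error, and the fitted line Points/Game = -0.616 + 0.104 × Passing Yards/Game can be evaluated at any value of interest. The sketch below uses the numbers from the output above; the 250 passing yards/game input is a hypothetical example, not a value from the data.

    # Verify the t statistic and evaluate the fitted simple-regression line.
    # Intercept, slope, and standard error come from the summary output above;
    # the 250-yard input is hypothetical.
    intercept = -0.616076942
    slope = 0.1041010312
    slope_se = 0.0164395245

    t_stat = slope / slope_se                  # ~6.332, matching the t Stat column
    predicted = intercept + slope * 250.0      # ~25.4 points per game

    print(f"t statistic: {t_stat:.4f}")
    print(f"Predicted points/game at 250 passing yards/game: {predicted:.2f}")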