Question
Just Part e
The article "The Influence of Temperature and Sunshine on the Alpha-Acid Contents of Hops" reports the following data on yield (y), mean temperature over the period between the date of coming into hops and the date of picking (x_1), and mean percentage of sunshine during the same period (x_2) for the Fuggle variety of hop. Use the following R code to complete the regression analysis:

x1 = c(16.7, 17.4, 18.4, 16.8, 18.9, 17.1, 17.3, 18.2, 21.3, 21.2, 20.7, 18.5)
x2 = c(30, 42, 47, 47, 43, 41, 48, 44, 43, 50, 56, 60)
y = c(210, 110, 103, 103, 91, 76, 73, 70, 68, 53, 45, 31)
mod = lm(y ~ x1 + x2)
summary(mod)

(a) According to the output, what is the least squares regression equation y = b_0 + b_1 x_1 + b_2 x_2?
    y = 415.11 - 6.593 x_1 - 4.504 x_2

(b) What is the estimate for sigma?
    s = 24.4545

(c) According to the model, what is the predicted value for y when x_1 = 18.2 and x_2 = 44, and what is the corresponding residual?
    yhat = 96.9414; residual y - yhat = -26.94

(d) Test H_0: beta_1 = beta_2 = 0 versus H_a: either beta_1 or beta_2 not equal to 0. From the output, state the test statistic and the p-value.
    f = 14.9, p-value = 0.001395
    State the conclusion in the problem context.
    - There is moderately suggestive evidence at least one of the explanatory variables is a significant predictor of the response.
    - There is slightly suggestive evidence at least one of the explanatory variables is a significant predictor of the response.
    - There is no suggestive evidence at least one of the explanatory variables is a significant predictor of the response.
    - There is convincing evidence at least one of the explanatory variables is a significant predictor of the response.

(e) The estimated standard deviation of yhat when x_1 = 18.2 and x_2 = 44 is s_yhat = 7.34. Use this to obtain the 95% CI for mu_Y when x_1 = 18.2 and x_2 = 44.
    (43.52, 118.54)

(f) Use the information in parts (b) and (e) to obtain a 95% PI for yield in a future experiment when x_1 = 18.2 and x_2 = 44.
    (14.20, 147.87)

Explanation / Answer
> x1<-c(16.7,17.4,18.4,16.8,18.9,17.1,17.3,18.2,21.3,21.2,20.7,18.5)
> x2<-c(30,42,47,47,43,41,48,44,43,50,56,60)
> y<-c(210,110,103,103,91,76,73,70,68,53,45,31)
> #======= Question (a) =================
> Reg<-lm(y~x1+x2)
> Reg
Call:
lm(formula = y ~ x1 + x2)
Coefficients:
(Intercept) x1 x2
415.113 -6.593 -4.504
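
As a side check for part (a), the fitted coefficients can also be pulled straight from the model object; this is a minimal sketch reusing the Reg object above, not additional output from the original session:

coef(Reg)            # named vector: (Intercept), x1, x2
round(coef(Reg), 3)  # approximately 415.113, -6.593, -4.504, matching the printout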
> #========= Question (b) ================
> #======= calculate the residual standard error, the estimate of sigma ==========
> #=== it is the positive square root of the mean square error ======
> anova(Reg)
Analysis of Variance Table
Response: y
Df Sum Sq Mean Sq F value Pr(>F)
x1 1 7245.5 7245.5 12.116 0.006930 **
x2 1 10571.2 10571.2 17.677 0.002292 **
Residuals 9 5382.2 598.0
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
> # mean square error = 598.0
> sqrt(598.0)
[1] 24.45404
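
The same estimate of sigma is also stored in the model summary, so the ANOVA step is optional; a minimal sketch reusing the Reg object:

s <- summary(Reg)$sigma   # residual standard error, approximately 24.454
s^2                       # equals the error mean square from the ANOVA table, approximately 598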
> #====================================================
> #===== Question (c) =================================
> data<-data.frame(x1=18.2,x2=44)
> ycap<-predict(Reg, data)
> ycap
1
96.96769
> #===== ycap = 96.97 and the observed value of y is 70
> Residual = 70 - 96.97
> Residual
[1] -26.97
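
Since x1 = 18.2, x2 = 44 is the 8th observation in the data (with observed y = 70), the fitted value and residual can also be read directly off the model; a minimal sketch, using the convention residual = observed - fitted:

fitted(Reg)[8]   # fitted value for observation 8, approximately 96.97
resid(Reg)[8]    # residual = 70 - 96.97, approximately -26.97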
> #=========== Question (d) ===========================
> # to test the coefficients of the regression model jointly we use the F-statistic and its p-value
> # for multiple regression these are obtained from the summary of the model
> summary(Reg)
Call:
lm(formula = y ~ x1 + x2)
Residuals:
Min 1Q Median 3Q Max
-41.730 -12.174 0.791 12.374 40.093
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 415.113 82.517 5.031 0.000709 ***
x1 -6.593 4.859 -1.357 0.207913
x2 -4.504 1.071 -4.204 0.002292 **
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 24.45 on 9 degrees of freedom
Multiple R-squared: 0.768, Adjusted R-squared: 0.7164
F-statistic: 14.9 on 2 and 9 DF, p-value: 0.001395
> # the calculated F-value is 14.9 with 2 and 9 degrees of freedom. the corresponding
> # p-value is 0.001395. on the basis of the p-value we reject the null hypothesis because
> # the p-value is less than 0.05. hence there is convincing evidence that at least one
> # coefficient of the regression model is not equal to zero.
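
The F statistic reported by summary() can also be reproduced from the ANOVA table above, since the regression sum of squares is 7245.5 + 10571.2 on 2 degrees of freedom; a minimal sketch:

F_stat <- ((7245.5 + 10571.2) / 2) / 598.0        # regression mean square / error mean square
F_stat                                            # approximately 14.9
pf(F_stat, df1 = 2, df2 = 9, lower.tail = FALSE)  # upper-tail p-value, approximately 0.0014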
> # ====================== Question (e) ===========================
> # calculate the confidence interval for the mean response when x1=18.2 and x2=44
> data<-data.frame(x1=18.2,x2=44)
> ycap<-predict(Reg, data, interval="confidence", level=0.95)
> ycap
fit lwr upr
1 96.96769 80.35953 113.5759
> # lower limit = 80.35953 and upper limit = 113.5759
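
For part (e), the same interval follows by hand from ycap +/- t * s_ycap with s_ycap = 7.34, and for part (f) a 95% prediction interval can be obtained either from predict() or from parts (b) and (e) as ycap +/- t * sqrt(s^2 + s_ycap^2). The lines below are a sketch of that calculation (the numeric comments are approximate, based on the rounded values above), not output from the original session:

t_crit <- qt(0.975, df = 9)                         # t multiplier for 95% with 9 df, about 2.262
96.97 + c(-1, 1) * t_crit * 7.34                    # 95% CI for the mean response, about (80.4, 113.6)
predict(Reg, data, interval = "prediction", level = 0.95)   # 95% PI for a future yield at x1=18.2, x2=44
96.97 + c(-1, 1) * t_crit * sqrt(24.45^2 + 7.34^2)  # 95% PI by hand, about (39.2, 154.7)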