Question
Please answer all parts with detailed solutions
data:
> y=c(0.000450, 0.000450, 0.000473, 0.000507, 0.000457, 0.000452, 0.000453, 0.000426, 0.001215, 0.001256, 0.001145, 0.001085, 0.001066, 0.001111, 0.001364, 0.001254, 0.001396, 0.001575, 0.001615, 0.001733, 0.002753, 0.003186, 0.003227, 0.003469, 0.001911, 0.002588, 0.002635, 0.002725)
> x1=c(0.0105, 0.0110, 0.0106, 0.0116, 0.0121, 0.0123, 0.0122, 0.0122, 0.0123, 0.0122, 0.0094, 0.0100, 0.0101, 0.0099, 0.0110, 0.0117, 0.0110, 0.0104, 0.0067, 0.0066, 0.0044, 0.0073, 0.0078, 0.0067, 0.0091, 0.0079, 0.0068, 0.0065)
> x2=c(90.9, 84.6, 88.9, 488.7, 454.4, 439.2, 447.1, 451.6, 487.8, 467.6, 95.4, 87.1, 82.7, 87.0, 516.4, 488.0, 534.5, 542.3, 98.8, 84.8, 69.6, 436.9, 406.3, 447.9, 58.5, 394.3, 461, 469.2)
> x3=c(0.0164,0.0165, 0.0164,0.0187,0.0187,0.0187,0.0186,0.0187,0.0192,0.0192,0.0163,0.0162,0.0162,0.0163,0.0190,0.0189,0.0189,0.0189,0.0163,0.0162,0.0163,0.0189,0.0192,0.0192,0.0164,0.0177,0.0173,0.0173)
> x4=c(0.0177,0.0172,0.0157,0.0082,0.0070,0.0065,0.0071,0.0062,0.0153,0.0129,0.0354,0.0342,0.0323,0.0337,0.0161,0.0149,0.0163,0.0164,0.0379,0.0360,0.0327,0.0263,0.0200,0.0197,0.0331,0.0674,0.0770,0.0780)
Explanation / Answer
Solution: All solutions are performed in R and RStudio.
# input the data
y = c(0.000450, 0.000450, 0.000473, 0.000507, 0.000457, 0.000452, 0.000453, 0.000426, 0.001215, 0.001256, 0.001145, 0.001085, 0.001066, 0.001111, 0.001364, 0.001254, 0.001396, 0.001575, 0.001615, 0.001733, 0.002753, 0.003186, 0.003227, 0.003469, 0.001911, 0.002588, 0.002635, 0.002725)
x1=c(0.0105, 0.0110, 0.0106, 0.0116, 0.0121, 0.0123, 0.0122, 0.0122, 0.0123, 0.0122, 0.0094, 0.0100, 0.0101, 0.0099, 0.0110, 0.0117, 0.0110, 0.0104, 0.0067, 0.0066, 0.0044, 0.0073, 0.0078, 0.0067, 0.0091, 0.0079, 0.0068, 0.0065)
x2=c(90.9, 84.6, 88.9, 488.7, 454.4, 439.2, 447.1, 451.6, 487.8, 467.6, 95.4, 87.1, 82.7, 87.0, 516.4, 488.0, 534.5, 542.3, 98.8, 84.8, 69.6, 436.9, 406.3, 447.9, 58.5, 394.3, 461, 469.2)
x3=c(0.0164,0.0165, 0.0164,0.0187,0.0187,0.0187,0.0186,0.0187,0.0192,0.0192,0.0163,0.0162,0.0162,0.0163,0.0190,0.0189,0.0189,0.0189,0.0163,0.0162,0.0163,0.0189,0.0192,0.0192,0.0164,0.0177,0.0173,0.0173)
x4=c(0.0177,0.0172,0.0157,0.0082,0.0070,0.0065,0.0071,0.0062,0.0153,0.0129,0.0354,0.0342,0.0323,0.0337,0.0161,0.0149,0.0163,0.0164,0.0379,0.0360,0.0327,0.0263,0.0200,0.0197,0.0331,0.0674,0.0770,0.0780)
z = data.frame(y, x1, x2, x3, x4)   # combine the response and the four regressors into one data frame
z.lin = lm(y ~ ., data = z)         # fit the multiple linear regression model on the data
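To inspect the fitted coefficients and their individual significance, the model object can be summarized; a minimal sketch using the z.lin object just created:
summary(z.lin)   # coefficient estimates, standard errors, t-tests and R-squared
confint(z.lin)   # 95% confidence intervals for the regression coefficients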
f)
t.test(x1, x4)   # two-sample t-test comparing the means of x1 and x4
Neither regressor is required in the model.
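If part (f) is meant to test whether x1 and x4 actually contribute to the regression of y (rather than to compare their sample means), a partial F-test on nested models is one way to check; a minimal sketch assuming the z and z.lin objects created above:
z.red = lm(y ~ x2 + x3, data = z)   # reduced model that omits x1 and x4
anova(z.red, z.lin)                 # partial F-test: do x1 and x4 jointly improve the fit?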
g) Checking for multicollinearity in the data
round(cor(z), 2)   # correlation matrix of y and the regressors (z was built above)
Looking at the correlation matrix, most of the predictor variables are highly correlated with one another, so multicollinearity is present in the data.
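The correlation matrix only shows pairwise associations; variance inflation factors (VIFs) summarize how strongly each regressor is explained by all the others. A sketch assuming the car package is installed:
library(car)   # provides vif()
vif(z.lin)     # values above roughly 10 are commonly taken to indicate serious multicollinearity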
h)
anova(z.lin)   # analysis-of-variance table (sequential sums of squares) for the fitted model
i) Since x1 and x4 are highly correlated with each other, either one of them can be dropped from the model.
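A sketch of dropping x4 (dropping x1 instead would work the same way) and checking whether the fit suffers, again assuming the z and z.lin objects from above:
z.drop = lm(y ~ x1 + x2 + x3, data = z)   # refit the model without x4
anova(z.drop, z.lin)                      # F-test: does x4 add anything once the others are included?
AIC(z.drop, z.lin)                        # compare the two models by AIC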