
For many firms, the cost of sales (Y) is a linear function of net revenue (X).


Question

For many firms, the cost of sales ($Y$) is a linear function of net revenue ($X$). Now consider a company whose predicted cost of sales satisfies the following linear relationship with net revenue:

$E(Y \mid X) = \beta_0 + \beta_1 X$

To estimate the coefficients and make predictions for the future cost of sales, we collect data for 9 consecutive years. Both variables are measured in millions of dollars. The data satisfy the following properties: $\sum_{i=1}^{9} x_i = 42{,}740$, $\sum_{i=1}^{9} y_i = 17{,}841$, $\sum_{i=1}^{9} x_i y_i = 108{,}239{,}948$, and $\sum_{i=1}^{9} x_i^2 = 259{,}833{,}038$.

a) Find the least-squares estimate of $\beta_1$, denoted $\hat{\beta}_1$. How do you interpret it?

b) From the first-order condition of least squares, calculate $\hat{\beta}_0$.

Now you are provided with the detailed values of $x_i$ and $y_i$:
$(x_1, \ldots, x_9) = (1687, 2178, 2649, 3289, 4076, 5294, 6369, 7787, 9411)$
$(y_1, \ldots, y_9) = (748, 962, 1113, 1350, 1686, 2199, 2605, 3179, 3999)$

c) What are the regression residuals for each of the nine years?

d) Suppose homoskedasticity is satisfied. What is the asymptotic variance of $\hat{\beta}_1$, $\mathrm{Avar}(\hat{\beta}_1)$? Why?

e) Let $\sigma^2 = 3000$. Test $H_0\colon \beta_1 = 0$ against $H_1\colon \beta_1 > 0$ at $\alpha = 5\%$.

f) If $\sigma^2$ is unknown, do the same test as in e). Find $\beta(\mu)$.

g) If heteroskedasticity is present, how would you estimate the asymptotic variance of $\hat{\beta}_1$? (No calculation needed.)

Explanation / Answer

a. $\hat{\beta}_1 = \dfrac{n\sum x_i y_i - (\sum x_i)(\sum y_i)}{n\sum x_i^2 - (\sum x_i)^2} = \dfrac{9 \times 108{,}239{,}948 - 42{,}740 \times 17{,}841}{9 \times 259{,}833{,}038 - 42{,}740^2}$

$\hat{\beta}_1 = \dfrac{211{,}635{,}192}{511{,}789{,}742} \approx 0.4135$

Interpretation: an additional one million dollars of net revenue is associated with roughly 0.41 million dollars of additional cost of sales, on average.
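As a quick numerical check (not part of the original solution), here is a minimal Python sketch that plugs the aggregate sums given in the question into the least-squares slope formula:

```python
# Check of the slope estimate using only the aggregate sums given in the question.
n = 9
sum_x, sum_y = 42_740, 17_841
sum_xy, sum_x2 = 108_239_948, 259_833_038

beta1_hat = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
print(round(beta1_hat, 4))  # -> 0.4135
```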

b. From the first-order condition, the fitted line passes through $(\bar{x}, \bar{y})$, so $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$. Equivalently,

$\hat{\beta}_0 = \dfrac{(\sum y_i)(\sum x_i^2) - (\sum x_i)(\sum x_i y_i)}{n\sum x_i^2 - (\sum x_i)^2} = \dfrac{17{,}841 \times 259{,}833{,}038 - 42{,}740 \times 108{,}239{,}948}{9 \times 259{,}833{,}038 - 42{,}740^2} = \dfrac{9{,}505{,}853{,}438}{511{,}789{,}742} \approx 18.5737$
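Continuing the same illustrative check (an addition, not part of the original answer), the first-order condition $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$ gives the same intercept numerically:

```python
# Intercept via the first-order condition: the fitted line passes through (x̄, ȳ).
n = 9
sum_x, sum_y = 42_740, 17_841
sum_xy, sum_x2 = 108_239_948, 259_833_038

beta1_hat = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
beta0_hat = sum_y / n - beta1_hat * sum_x / n
print(round(beta0_hat, 4))  # -> 18.5737
```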

c. The residual for each year is $e_i = y_i - \hat{y}_i$, where $\hat{y}_i = \hat{\beta}_0 + \hat{\beta}_1 x_i$ is the fitted value. The nine residuals are reported in the table at the end of this answer (after part d).

d. Under homoskedasticity, the error variance is the same constant $\sigma^2$ at every level of $x$: the points are scattered around the regression line with equal spread. In that case the asymptotic variance of $\hat{\beta}_1$ is $\mathrm{Avar}(\hat{\beta}_1) = \sigma^2 / \sum_{i=1}^{n}(x_i - \bar{x})^2$, and $\sigma^2$ is estimated by the mean squared error (MSE) of the residuals, which is why the MSE is used here.
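As a small illustration (an addition, not in the original answer), the denominator $\sum_i (x_i - \bar{x})^2$ in the homoskedastic variance formula above can be obtained directly from the sums already given in the question:

```python
# Sum of squared deviations of x from the question's sums:
# Σ(x_i − x̄)² = Σx_i² − (Σx_i)² / n.
n, sum_x, sum_x2 = 9, 42_740, 259_833_038
sxx = sum_x2 - sum_x ** 2 / n
print(round(sxx))  # ≈ 56,865,527; Avar(beta1_hat) = sigma² / sxx under homoskedasticity
```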

   x_i     y_i   Fitted y_i   Residual
  1687     748       716.18      31.82
  2178     962       919.22      42.78
  2649    1113      1113.99      -0.99
  3289    1350      1378.64     -28.64
  4076    1686      1704.08     -18.08
  5294    2199      2207.75      -8.75
  6369    2605      2652.28     -47.28
  7787    3179      3238.65     -59.65
  9411    3999      3910.21      88.79

(All values in millions of dollars; fitted values use $\hat{\beta}_0 \approx 18.5737$ and $\hat{\beta}_1 \approx 0.4135$, rounded to two decimals.)
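For completeness, a short Python sketch (not part of the original answer) that recomputes the fitted values and residuals for part (c) directly from the raw data; up to rounding, the output matches the table above:

```python
# Recompute fitted values and residuals for part (c) from the raw data.
x = [1687, 2178, 2649, 3289, 4076, 5294, 6369, 7787, 9411]
y = [748, 962, 1113, 1350, 1686, 2199, 2605, 3179, 3999]

n = len(x)
sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(a * b for a, b in zip(x, y))
sum_x2 = sum(a * a for a in x)

beta1_hat = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
beta0_hat = sum_y / n - beta1_hat * sum_x / n

for xi, yi in zip(x, y):
    fitted = beta0_hat + beta1_hat * xi
    print(f"x={xi:>5}  y={yi:>5}  fitted={fitted:9.2f}  residual={yi - fitted:8.2f}")
```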