
Question

anyone here knows R and can help with this?

5. R, Bonus. In class we talked about maximum likelihood estimation. In some cases, after writing down the log-likelihood function, we may simply take the first derivative. However, in other cases we may not be so lucky, and the function can only be maximized numerically. (For example, the log-likelihood function may be non-differentiable.) Now suppose X1, ..., Xn are i.i.d. N(μ, σ²). Code the log-likelihood function in R and figure out how to obtain the estimates by maximizing the log-likelihood function numerically. Since we know the MLEs of μ and σ exactly, you can compare your answers with the theoretical results via Monte Carlo experiments. Hint #1: You may need the command optim (https://stat.ethz.ch/R-manual/R-devel/library/stats/html/optim.html). Hint #2: Remember that maximizing L(θ | x) is the same as minimizing −L(θ | x).

Explanation / Answer

rm(list = ls(all = TRUE))

## true parameters and a simulated sample
mu <- 5; sg <- 2
x <- rnorm(500, mu, sg)
n <- length(x)

##################### normal log-likelihood ##########
## negative log-likelihood of N(mu, sg^2); th = (sg, mu)
LogL <- function(th) {
  sg <- th[1]; mu <- th[2]
  z <- -(n / 2) * log(2 * pi) - n * log(sg) - sum(((x - mu)^2) / (2 * sg^2))
  return(-z)   # nlm() minimizes, so return the NEGATIVE log-likelihood
}

## minimize -log-likelihood numerically, starting from the true values
M <- nlm(LogL, c(sg, mu), hessian = TRUE)
Msg <- M$estimate[1]; Mmu <- M$estimate[2]
np <- c(Mmu, Msg); np

[1] 5.062249 2.001793
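Hint #1 points at optim() rather than nlm(); the same fit can be done with it. A minimal self-contained sketch (it re-simulates its own data with an assumed seed, so the numbers will differ slightly from the run above):

```r
## Same idea with optim(), as suggested in Hint #1.
## th = (mu, sigma); seed and sample size are assumptions for illustration.
set.seed(1)
x <- rnorm(500, mean = 5, sd = 2)

negLogL <- function(th) {
  mu <- th[1]; sg <- th[2]
  if (sg <= 0) return(Inf)   # keep the optimizer out of the invalid region
  -sum(dnorm(x, mean = mu, sd = sg, log = TRUE))
}

## default Nelder-Mead; start from the sample mean and sd
fit <- optim(c(mean(x), sd(x)), negLogL, hessian = TRUE)
fit$par   # numerical MLEs of (mu, sigma)
```

Using dnorm(..., log = TRUE) saves writing out the log-density by hand and is numerically safer than log(dnorm(...)).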
########## closed-form (theoretical) MLEs ##############
mu0 <- mean(x)                           # MLE of mu: the sample mean
sg0 <- sqrt((1 / n) * sum((x - mu0)^2))  # MLE of sigma: divisor n, not n - 1
mp <- c(mu0, sg0); mp

[1] 5.062252 2.001794

From the results we can see that the closed-form (theoretical) MLEs agree with the numerical estimates from nlm() to about five decimal places.
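The question also asks for a Monte Carlo comparison, which a single sample does not give. One possible sketch (replication count, seed, and sample size are assumptions): repeat the simulate-then-maximize cycle many times and compare the average numerical MLEs with the average closed-form MLEs.

```r
## Monte Carlo check: average numerical vs. closed-form MLEs over R replications
set.seed(42)
R <- 200; n <- 500; mu <- 5; sg <- 2

one_rep <- function() {
  x <- rnorm(n, mu, sg)
  negLogL <- function(th) {
    if (th[2] <= 0) return(Inf)
    -sum(dnorm(x, th[1], th[2], log = TRUE))
  }
  num <- optim(c(mean(x), sd(x)), negLogL)$par        # numerical MLEs
  clo <- c(mean(x), sqrt(mean((x - mean(x))^2)))      # closed-form MLEs
  c(num, clo)
}

res <- replicate(R, one_rep())   # 4 x R matrix
## averages of (mu_num, sg_num, mu_closed, sg_closed); all should be near (5, 2)
rowMeans(res)
```

Both pairs of averages should sit close to the true (μ, σ) = (5, 2), confirming that the numerical maximizer recovers the theoretical MLEs.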