Question
Interpretation of MLE
We use maximum likelihood to estimate parameters because the parameter value with the highest likelihood is the value that has the highest probability of being correct. ‘Likelihood’ is just a different word for ‘probability’.
Here is a statement about statistics, mainly about regression, for you to ponder and comment on.
Is this statement true, false, or does its truth depend on unstated conditions?
In the last case, on what conditions does it depend, and how?
Please show me the specific process of the answer. I really have no idea how to answer this question. Could you please explain more specifically so that I can understand it better? Thank you so much.
Explanation / Answer
The MLE is the value of the parameter that supports the given sample best. In other words, we maximise the likelihood function, given the sample, with respect to the parameter, so by definition the MLE is the maximiser of the likelihood for the sample we observed. The logic is that the observed sample is treated as fixed (the sample is "like God" here), and we act as if it was the sample most likely to be drawn; we therefore look for the parameter value under which that sample is most likely to occur. Note that this makes the MLE the value that gives the observed data the highest probability, not the value that itself has the highest probability of being correct; assigning probabilities to parameter values would require a prior and Bayes' theorem.
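To make this concrete, here is the standard definition written out (a sketch in generic notation; x_1, ..., x_n and theta are just placeholder symbols, not anything specific to your course):

```latex
% Likelihood: the joint pmf/density of the observed sample, viewed as a function of \theta
L(\theta \mid x_1, \dots, x_n) = \prod_{i=1}^{n} f(x_i \mid \theta)

% MLE: the parameter value that maximises this function for the sample actually observed
\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} \; L(\theta \mid x_1, \dots, x_n)

% Note: L(\theta \mid x) is not P(\theta \mid x); turning the likelihood into a
% probability distribution over \theta requires a prior, via Bayes' theorem:
% p(\theta \mid x) \propto L(\theta \mid x) \, p(\theta)
```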
Now, in the case of a discrete distribution the pmf is a probability, and hence the likelihood function is itself a probability. But for a continuous variable the likelihood function is a density, not a probability, because the pdf is a probability density function and its value at a point can exceed 1.
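A small numerical illustration of that last point (a minimal sketch assuming scipy is installed; the sample values and parameters below are made up purely for illustration):

```python
# Likelihood values for a discrete vs. a continuous model, showing that a
# density-based likelihood need not lie in [0, 1].
from scipy.stats import bernoulli, norm

# Discrete case: the Bernoulli pmf is a genuine probability, so the likelihood
# of the observed sample (a product of pmf values) is always between 0 and 1.
sample_discrete = [1, 1, 0, 1]
p = 0.75
lik_discrete = 1.0
for x in sample_discrete:
    lik_discrete *= bernoulli.pmf(x, p)
print("Bernoulli likelihood:", lik_discrete)  # <= 1, a probability

# Continuous case: the Normal pdf is a density, not a probability; with a small
# standard deviation it easily exceeds 1 at a single point.
print("Normal density at the mean, sigma = 0.1:",
      norm.pdf(0.0, loc=0.0, scale=0.1))  # about 3.99, so not a probability
```

The Bernoulli likelihood stays in [0, 1] because it is a product of probabilities, while the Normal density at its mean with sigma = 0.1 is about 3.99, a value no probability could take.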
I hope this clears up your doubt.