Question
Table 2. Temperatures and AC operations — columns: Outdoor Temp | AC | Indoor Temp | Index
Cold Hot Hot Hot Off On Off On On Off On Off On Off Cold Cold Hot Hot Cold Cold Cold Hot Cold Hot Cold Cold Cold Cold 10

Problem 6. (20 points) Kevin learns the Naive Bayes classifier in class, so he uses the dataset in Table 2 to train a Naive Bayes classifier. He then observes a new data record:

X = {"Outdoor Temp": "Hot", "AC Running Condition": "On"}

Please illustrate how to use the Naive Bayes classifier to predict the label "Indoor Temp" for the above data object.

Explanation / Answer
Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. Maximum-likelihood training can be done by evaluating a closed-form expression, which takes linear time, rather than by the expensive iterative approximation used for many other types of classifiers.
We simply need to convert the counts to probabilities and multiply them together to get the predicted classification. Say we wanted to find the probability that the review "didn't like it" expresses a negative sentiment. We would find the total number of times the word "didn't" occurred in the negative reviews, and divide it by the total number of words in the negative reviews, to get the probability of x given y. We would then do the same for "like" and "it". We would multiply all three probabilities, and then multiply by the probability of any document expressing a negative sentiment, to get our final probability that the sentence expresses negative sentiment.
We would do the same for positive sentiment, and whichever probability is greater is the class the review is assigned to.
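The word-counting procedure described above can be sketched in code. The toy reviews and their labels below are invented for illustration; they are not from any real corpus.

```python
from collections import Counter

# Toy training corpus of (words, sentiment) pairs -- invented for illustration.
reviews = [
    (["loved", "it"], "positive"),
    (["great", "movie", "loved", "it"], "positive"),
    (["didn't", "like", "it"], "negative"),
    (["boring", "didn't", "enjoy", "it"], "negative"),
]

def class_score(words, label):
    """P(label) times the product of P(word | label), from raw counts."""
    docs = [w for w, y in reviews if y == label]
    prior = len(docs) / len(reviews)          # P(label)
    all_words = [w for doc in docs for w in doc]
    counts = Counter(all_words)
    score = prior
    for w in words:
        score *= counts[w] / len(all_words)   # P(word | label)
    return score

query = ["didn't", "like", "it"]
neg = class_score(query, "negative")
pos = class_score(query, "positive")
print("negative" if neg > pos else "positive")  # prints "negative"
```

Note that a real implementation would add Laplace smoothing so that an unseen word does not force a probability of zero, as happens for the positive class here.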
Naive Bayes converts P(Evidence | Known Outcome) into P(Outcome | Known Evidence). Often, we know how frequently some particular evidence is observed given a known outcome. We use this known fact to compute the reverse: the probability of that outcome occurring, given the evidence.
P(Outcome given Evidence) = P(Evidence given Outcome) × P(Outcome), scaled by P(Evidence)
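The rule can be checked with a quick calculation; the probabilities below are made-up numbers chosen only to show the arithmetic.

```python
# Bayes' rule: P(Outcome | Evidence) = P(Evidence | Outcome) * P(Outcome) / P(Evidence)
p_outcome = 0.3                  # P(Outcome), the prior (made-up number)
p_evidence_given_outcome = 0.8   # P(Evidence | Outcome), the likelihood (made-up)
p_evidence = 0.5                 # P(Evidence), the normalizer (made-up)

posterior = p_evidence_given_outcome * p_outcome / p_evidence
print(round(posterior, 2))  # prints 0.48
```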
Applying Bayes' rule to the given table data:
The dataset consists of the fields Index, Outdoor Temp, AC, and Indoor Temp.
Analyzing the data generally: when the outdoor temp is Hot, the AC is On and the indoor temp is Cold; similarly, when the AC is Off, the outdoor temp is Cold.
Surrounding data: indoor air temperature and outdoor air temperature.
Behavioral data: AC operation ("Turn on", "Turn off") and temperature (Hot or Cold).
If we want to predict Indoor Temp, we need the values of Outdoor Temp and the AC state; these attributes are linked together.
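The prediction itself can be sketched as follows. Because the rows of Table 2 did not survive extraction, the ten records below are hypothetical stand-ins with the same fields; with the real table, only the `data` list would change.

```python
from collections import Counter

# Hypothetical (Outdoor Temp, AC, Indoor Temp) records -- stand-ins for Table 2.
data = [
    ("Hot", "On", "Cold"), ("Hot", "On", "Cold"), ("Hot", "Off", "Hot"),
    ("Cold", "Off", "Cold"), ("Cold", "Off", "Cold"), ("Hot", "On", "Cold"),
    ("Cold", "On", "Cold"), ("Hot", "Off", "Hot"), ("Cold", "Off", "Hot"),
    ("Cold", "Off", "Cold"),
]

def predict_indoor(outdoor, ac):
    """Naive Bayes: argmax over labels of P(label) * P(outdoor|label) * P(ac|label)."""
    label_counts = Counter(indoor for _, _, indoor in data)
    best_label, best_score = None, -1.0
    for label, n in label_counts.items():
        prior = n / len(data)                                        # P(label)
        p_outdoor = sum(1 for o, _, i in data
                        if i == label and o == outdoor) / n          # P(outdoor|label)
        p_ac = sum(1 for _, a, i in data
                   if i == label and a == ac) / n                    # P(ac|label)
        score = prior * p_outdoor * p_ac
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# The new record X from the question: Outdoor Temp = "Hot", AC = "On".
print(predict_indoor("Hot", "On"))  # prints "Cold" on this illustrative data
```

For the exam answer, the same three quantities (prior, and the two class-conditional probabilities) would be read off as counts from Table 2 for each candidate label, and the label with the larger product is the prediction.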