Question
In a particular binary hypothesis testing application, the conditional density for a scalar feature y given class w1 is p(y|w1) = k1 e^(−y²/20). Given class w2, the conditional density is p(y|w2) = k2 e^(−(y−6)²/12).
a. Find k1 and k2, and plot the two densities on a single graph using Matlab/Octave.
b. Assume that the prior probabilities of the two classes are equal, and that the cost for choosing correctly is zero. If the costs for choosing incorrectly are C12 = 1 and C21 = 5, what is the expression for the Bayes risk?
c. Find the decision regions which minimize the Bayes risk, and indicate them on the plot you made in part (a).
d. For the decision regions in part (c), what is the numerical value of the Bayes risk?

Explanation / Answer
In a particular binary hypothesis testing application, the conditional density for a scalar feature y given class w1 is

p(y|w1) = k1 e^(−y²/20)

Given class w2, the conditional density is

p(y|w2) = k2 e^(−(y−6)²/12)
a. Find k1 and k2, and plot the two densities on a single graph using Matlab.
Solution: We solve for the parameters k1 and k2 by recognizing that the two densities are in the form of a normal (Gaussian) distribution, p(y|wi) = (1/(σi √(2π))) e^(−(y−µi)²/(2σi²)). For class w1:

k1 e^(−y²/20) = (1/(σ1 √(2π))) e^(−(y−µ1)²/(2σ1²))

Matching exponents, y²/20 = (y−µ1)²/(2σ1²), so

µ1 = 0, σ1² = 10
k1 = 1/(σ1 √(2π)) = 1/√(20π)

Matching the second density in the same way gives

µ2 = 6, σ2² = 6
k2 = 1/(σ2 √(2π)) = 1/√(12π)
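Part (a) asks for a Matlab/Octave plot; as a cross-check, here is an equivalent sketch in Python/NumPy (function names are my own) that defines the two densities with the constants found above and verifies numerically that each integrates to 1. The commented-out lines show how the single-graph plot would be produced with matplotlib.

```python
import numpy as np

# Constants from part (a): p(y|w1) = N(0, 10) and p(y|w2) = N(6, 6).
k1 = 1.0 / np.sqrt(20.0 * np.pi)
k2 = 1.0 / np.sqrt(12.0 * np.pi)

def p_y_given_w1(y):
    return k1 * np.exp(-y**2 / 20.0)

def p_y_given_w2(y):
    return k2 * np.exp(-(y - 6.0)**2 / 12.0)

if __name__ == "__main__":
    y = np.linspace(-25.0, 35.0, 60001)
    dy = y[1] - y[0]
    # If k1 and k2 are correct, each density integrates to ~1.
    print(np.sum(p_y_given_w1(y)) * dy)
    print(np.sum(p_y_given_w2(y)) * dy)
    # To reproduce the single-graph plot (requires matplotlib):
    # import matplotlib.pyplot as plt
    # plt.plot(y, p_y_given_w1(y), label="p(y|w1)")
    # plt.plot(y, p_y_given_w2(y), label="p(y|w2)")
    # plt.legend(); plt.xlabel("y"); plt.show()
```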
b. Assume that the prior probabilities of the two classes are equal, and that the cost for choosing correctly is zero. If the costs for choosing incorrectly are C12 = 1 and C21 = 5 (where Cij corresponds to predicting class i when it belongs to class j), what is the expression for the conditional risk?
Solution: For two-category classification, the conditional risk of taking action αi (deciding class wi) given the feature y is R(αi|y) = λi1 P(w1|y) + λi2 P(w2|y), where λij is the cost incurred for deciding wi when the true class is wj. The risk is different for each action:

R(α1|y) = λ11 P(w1|y) + λ12 P(w2|y)
R(α2|y) = λ21 P(w1|y) + λ22 P(w2|y)
The prior probabilities of the two classes are equal:

P(w1) = P(w2) = 1/2

The cost for choosing correctly is zero:

λ11 = 0, λ22 = 0

The costs for choosing incorrectly are given as C12 and C21:

λ12 = C12 = 1
λ21 = C21 = 5
Thus the expression for the conditional risk of action α1 is:

R(α1|y) = λ11 P(w1|y) + λ12 P(w2|y)
R(α1|y) = 0·P(w1|y) + 1·P(w2|y)
R(α1|y) = P(w2|y)

And the expression for the conditional risk of action α2:

R(α2|y) = λ21 P(w1|y) + λ22 P(w2|y)
R(α2|y) = 5·P(w1|y) + 0·P(w2|y)
R(α2|y) = 5·P(w1|y)
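The risk expressions above translate directly into code. Below is a minimal sketch (helper names are my own, and it assumes the densities found in part (a)): compute the two posteriors via Bayes' rule, form R(α1|y) and R(α2|y), and pick the action with the smaller conditional risk.

```python
import numpy as np

# Costs lambda_ij: cost of deciding w_i when the true class is w_j.
LAM11, LAM12 = 0.0, 1.0   # lambda_11 = 0, lambda_12 = C12 = 1
LAM21, LAM22 = 5.0, 0.0   # lambda_21 = C21 = 5, lambda_22 = 0
PRIOR1 = PRIOR2 = 0.5     # equal priors

def likelihoods(y):
    p1 = np.exp(-y**2 / 20.0) / np.sqrt(20.0 * np.pi)          # p(y|w1)
    p2 = np.exp(-(y - 6.0)**2 / 12.0) / np.sqrt(12.0 * np.pi)  # p(y|w2)
    return p1, p2

def conditional_risks(y):
    p1, p2 = likelihoods(y)
    evidence = PRIOR1 * p1 + PRIOR2 * p2   # p(y)
    post1 = PRIOR1 * p1 / evidence         # P(w1|y)
    post2 = PRIOR2 * p2 / evidence         # P(w2|y)
    r1 = LAM11 * post1 + LAM12 * post2     # R(alpha1|y) = P(w2|y)
    r2 = LAM21 * post1 + LAM22 * post2     # R(alpha2|y) = 5 P(w1|y)
    return r1, r2

def decide(y):
    r1, r2 = conditional_risks(y)
    return 1 if r1 < r2 else 2   # take the action with the smaller risk

if __name__ == "__main__":
    for y0 in (0.0, 6.0, 15.0, 30.0):
        print(y0, decide(y0))
```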
c. Find the decision regions which minimize the Bayes risk, and indicate them on the plot you made in part (a).

Solution: The Bayes risk is the integral of the conditional risk over each decision region when the optimal regions R1 and R2 are used. Solving for the optimal decision boundary is therefore a matter of finding the points where the two conditional risks are equal:

R(α1|y) = R(α2|y)
1·P(w2|y) = 5·P(w1|y)
1·p(y|w2)P(w2)/p(y) = 5·p(y|w1)P(w1)/p(y)

Given that the priors are equal, this simplifies to:

p(y|w2) = 5·p(y|w1)

Next, using the values k1 and k2 from part (a), we have expressions for p(y|w1) and p(y|w2):

(1/√(12π)) e^(−(y−6)²/12) = 5·(1/√(20π)) e^(−y²/20)

Taking the natural log of both sides (note that the constant factors are not equal, so they must be carried through rather than cancelled):

−(y−6)²/12 − ½ ln(12π) = ln 5 − ½ ln(20π) − y²/20
y²/20 − (y−6)²/12 = ln 5 + ½ ln(12/20) ≈ 1.354

Multiplying through by 60 and collecting terms gives a quadratic:

3y² − 5(y² − 12y + 36) ≈ 81.24
2y² − 60y + 261.24 ≈ 0

The decision boundary is found by solving for the roots of this quadratic: y1 = 15 − √94.38 ≈ 5.29 and y2 = 15 + √94.38 ≈ 24.71.

d. For the decision regions in part (c), what is the numerical value of the Bayes risk?

Solution: For y < 5.29 the decision rule chooses α1.
For 5.29 < y < 24.71 the decision rule chooses α2.
For y > 24.71 the decision rule chooses α1.
Thus the decision region R1 is y < 5.29 together with y > 24.71, and the decision region R2 is 5.29 < y < 24.71.

Over these regions the Bayes risk evaluates to

R = ½ ∫R1 p(y|w2) dy + (5/2) ∫R2 p(y|w1) dy ≈ 0.31
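Parts (c) and (d) can be cross-checked numerically. The sketch below (in Python rather than the Matlab/Octave the problem mentions; variable names are my own) solves the boundary quadratic with the normalizing constants kept, then integrates the risk over the resulting regions on a fine grid. Note that if the constant terms were dropped from the log-equation, the roots would instead be 15 ± 3√15 ≈ 3.38 and 26.62, which is why carrying the constants matters here.

```python
import numpy as np

def p1(y):  # p(y|w1) = N(0, 10)
    return np.exp(-y**2 / 20.0) / np.sqrt(20.0 * np.pi)

def p2(y):  # p(y|w2) = N(6, 6)
    return np.exp(-(y - 6.0)**2 / 12.0) / np.sqrt(12.0 * np.pi)

# Boundary condition p(y|w2) = 5 p(y|w1); taking logs and multiplying by 60
# yields 2y^2 - 60y + 180 + 60*ln(5*sqrt(12/20)) = 0.
c0 = 180.0 + 60.0 * np.log(5.0 * np.sqrt(12.0 / 20.0))
y_lo, y_hi = np.sort(np.roots([2.0, -60.0, c0]).real)
print(f"boundaries: {y_lo:.2f}, {y_hi:.2f}")

# Bayes risk R = 1/2 * int_{R1} p(y|w2) dy + 5/2 * int_{R2} p(y|w1) dy,
# where R2 = (y_lo, y_hi) and R1 is its complement; Riemann sum on a grid.
y = np.linspace(-60.0, 80.0, 400001)
dy = y[1] - y[0]
in_R2 = (y > y_lo) & (y < y_hi)
risk = 0.5 * np.sum(p2(y)[~in_R2]) * dy + 2.5 * np.sum(p1(y)[in_R2]) * dy
print(f"Bayes risk ~ {risk:.3f}")
```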