ID: 3591876 • Letter: N
Question
Positive points per question: 3.0 • Negative points per question: 1.0

1. Suppose our training set consists of the three negative points (1,4), (3,3), and (3,1), and the two positive points (3,6) and (5,3). If we use nearest-neighbor learning, where we classify a point to be in the class of the nearest member of the training set, what is the boundary between the positive and negative points? Identify, in the list below, the point that is classified as positive.

a) (3.6, 4.1)
b) (3.9, 4.1)
c) (4.3, 1.6)

Explanation / Answer
1. Let's first check (4.3, 1.6), since it seems nearest to the positive region. We use the squared Euclidean distance (the square root is not needed for comparing distances).

Distances from (4.3, 1.6):
to the positive point (3,6): (3 − 4.3)² + (6 − 1.6)² = 1.69 + 19.36 = 21.05
to the positive point (5,3): (5 − 4.3)² + (3 − 1.6)² = 0.49 + 1.96 = 2.45
to the nearest negative point (3,1): (3 − 4.3)² + (1 − 1.6)² = 1.69 + 0.36 = 2.05

Since 2.05 < 2.45, the nearest member of the training set is the negative point (3,1): comparing only against the positive points is not enough, the negative points must be checked too.

Now take the point (4.2, 1.9):
to the positive point (3,6): (3 − 4.2)² + (6 − 1.9)² = 1.44 + 16.81 = 18.25
to the positive point (5,3): (5 − 4.2)² + (3 − 1.9)² = 0.64 + 1.21 = 1.85
to the nearest negative point (3,1): (3 − 4.2)² + (1 − 1.9)² = 1.44 + 0.81 = 2.25

Here 1.85 < 2.25, so the nearest training point is the positive point (5,3), and (4.2, 1.9) is classified as positive.
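The calculations above can be checked with a minimal 1-nearest-neighbor sketch. Only the coordinates come from the question; the function name, labels, and structure are my own.

```python
# Minimal 1-nearest-neighbor check for the training set in question 1.
# Squared Euclidean distance preserves the ranking, so no square root.

def nearest_label(point, training):
    """Return the label of the training point closest to `point`."""
    def sq_dist(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(training, key=lambda item: sq_dist(point, item[0]))[1]

training = [
    ((1, 4), "neg"), ((3, 3), "neg"), ((3, 1), "neg"),  # negative points
    ((3, 6), "pos"), ((5, 3), "pos"),                   # positive points
]

print(nearest_label((4.3, 1.6), training))  # nearest is (3,1) -> "neg"
print(nearest_label((4.2, 1.9), training))  # nearest is (5,3) -> "pos"
```

Running it confirms that (4.3, 1.6) falls on the negative side of the boundary while (4.2, 1.9) falls on the positive side.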
2. Attribute A2 gives the best split, because it produces the largest drop in impurity between the parent node and the (weighted) child nodes, i.e. the highest information gain.
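The attribute/class table for question 2 is not shown, so the data below is made up purely to illustrate what "drop in impurity" means; only the gain formula itself is standard.

```python
# Hedged sketch: information gain = parent impurity minus the
# size-weighted impurity of the child nodes after a split.
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """The best split is the one that maximizes this drop."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# Hypothetical attribute that splits the classes perfectly:
parent = ["+", "+", "+", "-", "-", "-"]
children = [["+", "+", "+"], ["-", "-", "-"]]
print(information_gain(parent, children))  # 1.0 (maximum possible gain)
```

Comparing this value across candidate attributes is how A2 would be identified as the best split.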
3. Training samples: a = (1,5), b = (5,8), c = (3,4), d = (1,3), with initial weights w = [0, 0] and learning rates e = 2, 1, 0.5, 0.25.

For sample a: output = hardlim(w·p + bias) = hardlim([0 0]·[1 5]ᵀ + 2) = hardlim(2) = 1, then apply the perceptron update new w = w_old + e = 2.

Solving similarly for all four training samples gives w = (2.0, 0.25), and point G is misclassified.
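The question text for this part is incomplete, so the following is only a sketch of the standard perceptron step used above. The target label, bias value, and update convention (w ← w + e·error·p) are assumptions, not taken from the question.

```python
# Hedged sketch of a single perceptron update step.

def hardlim(x):
    """Hard-limit activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def perceptron_step(w, bias, p, target, e):
    """Compute the output, then adjust w and bias by e * error * input."""
    a = hardlim(sum(wi * pi for wi, pi in zip(w, p)) + bias)
    err = target - a
    w = [wi + e * err * pi for wi, pi in zip(w, p)]
    bias = bias + e * err
    return w, bias, a

# First sample from the worked answer: p = (1, 5), w = [0, 0], bias = 2.
# target = 0 is an assumed label for illustration only.
w, bias, a = perceptron_step([0, 0], 2, (1, 5), target=0, e=2)
print(a)  # hardlim(0*1 + 0*5 + 2) = hardlim(2) = 1
```

Repeating this step over all four samples (with the appropriate learning rate each time) is what produces the final weight vector reported above.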
Note: only three questions are given, not seven as stated in the instructions above.