
Problem 1 Chernoff Bound: Q.10.1 on P. 267: 20 pts


Question

Problem 1 Chernoff Bound: Q.10.1 on P. 267: 20 pts

Theorem 1. Let $X_1, \dots, X_m$ be independent and identically distributed indicator random variables, with $\mu = E[X_i]$. If $m \ge \frac{3\ln(2/\delta)}{\varepsilon^2 \mu}$, then
$$\Pr\left[\,\left|\frac{1}{m}\sum_{i=1}^{m} X_i - \mu\right| \ge \varepsilon\mu\,\right] \le \delta.$$
Please prove the above theorem by applying the Chernoff bound.
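Not part of the requested proof, but as a quick empirical sanity check on the statement, the following is a minimal Monte Carlo sketch in Python, assuming the $X_i$ are i.i.d. Bernoulli($\mu$) indicators; the parameter values (mu, eps, delta, trials) are illustrative choices, not given in the problem.

import math
import random

def empirical_failure_rate(mu=0.3, eps=0.2, delta=0.05, trials=2000):
    """Estimate Pr(|sample mean - mu| >= eps*mu) when m is chosen as in
    Theorem 1, i.e. m = ceil(3*ln(2/delta) / (eps^2 * mu))."""
    m = math.ceil(3 * math.log(2 / delta) / (eps ** 2 * mu))
    failures = 0
    for _ in range(trials):
        sample_mean = sum(random.random() < mu for _ in range(m)) / m
        if abs(sample_mean - mu) >= eps * mu:
            failures += 1
    return m, failures / trials

if __name__ == "__main__":
    m, rate = empirical_failure_rate()
    # Theorem 1 predicts the failure rate is at most delta = 0.05.
    print(f"m = {m}, empirical failure rate = {rate:.4f}")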

Problem 2 Application: DNF Counting: 4×5 pts

In class we talked about two algorithms (see Chapter 10 in the textbook) for estimating the number of solutions of a given disjunctive normal form formula F. Let us call the naive approach Alg-I (sampling from the space of all assignments) and the one that cleverly samples from the space of TRUE solutions Alg-II. Let us assume that F contains t clauses and n variables. Please answer the following:

(a) Why is Alg-I not a good sampling approach?

(b) In Alg-II, why is $|S|/|U| \ge 1/t$? In what scenario will we have $|S|/|U| = 1/t$? And in what scenario will we have $|S|/|U| = 1$?

(c) If F is a k-DNF, what is the value of $|SC_i|/|U|$? Is it a fixed number for all i?

(d) What is the major difference in the sample space between Alg-I and Alg-II, and why is Alg-II better? (An illustrative sketch of both samplers is given below.)
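To make the contrast in part (d) concrete, here is a minimal Python sketch of both samplers, under representations chosen purely for illustration: a formula is a list of clauses, and a clause is a list of literals (+i for variable i, -i for its negation). Function names such as alg1_naive and alg2_karp_luby are not from the textbook.

import random

def satisfies(assignment, clause):
    # A literal +i requires variable i to be True; -i requires it to be False.
    return all(assignment[abs(lit)] == (lit > 0) for lit in clause)

def alg1_naive(formula, n, samples):
    """Alg-I: sample uniformly from all 2^n assignments, count how many
    satisfy F, and scale the fraction back up by 2^n."""
    hits = 0
    for _ in range(samples):
        a = {v: random.random() < 0.5 for v in range(1, n + 1)}
        if any(satisfies(a, c) for c in formula):
            hits += 1
    return 2 ** n * hits / samples

def alg2_karp_luby(formula, n, samples):
    """Alg-II: sample uniformly from U = {(i, a) : a satisfies clause i};
    a pair counts as a success only when i is the first clause a satisfies."""
    sizes = [2 ** (n - len(c)) for c in formula]   # |SC_i| = 2^(n - k_i)
    total = sum(sizes)                             # |U|
    hits = 0
    for _ in range(samples):
        i = random.choices(range(len(formula)), weights=sizes)[0]
        # Uniform satisfying assignment of clause i: fix its literals,
        # set every other variable uniformly at random.
        a = {v: random.random() < 0.5 for v in range(1, n + 1)}
        for lit in formula[i]:
            a[abs(lit)] = lit > 0
        if not any(satisfies(a, formula[j]) for j in range(i)):
            hits += 1
    return total * hits / samples

if __name__ == "__main__":
    # F = (x1 AND x2) OR (NOT x1 AND x3) has exactly 4 satisfying assignments.
    F = [[1, 2], [-1, 3]]
    print(alg1_naive(F, 3, 20000), alg2_karp_luby(F, 3, 20000))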

Problem 3 Application: DNF Alg-I: Q.10.4 on P. 267: 20 pts

Suppose we have a class of instances of the DNF satisfiability problem, each with $p(n)$ satisfying truth assignments for some polynomial $p(n)$. Suppose we apply the naive approach of sampling assignments and checking whether they satisfy the formula. Show that, after sampling $2^{n/2}$ assignments, the probability of finding even a single satisfying assignment for a given instance is exponentially small in n. [Hint: Union bound]
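For orientation only, a hedged sketch of the standard union-bound calculation (not a full write-up), assuming the instance has $p(n)$ satisfying assignments among the $2^n$ possible ones: a single uniform sample is satisfying with probability $p(n)/2^n$, so by the union bound over the $2^{n/2}$ samples,
$$\Pr[\text{at least one sampled assignment satisfies } F] \le 2^{n/2} \cdot \frac{p(n)}{2^{n}} = \frac{p(n)}{2^{n/2}},$$
which is exponentially small in $n$ because $p(n)$ grows only polynomially.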

Problem 4 FPRAS: Q.10.3 on P. 267: 20 pts

Show that the following alternative definition is equivalent to the definition of an FPRAS given in the chapter: A fully polynomial randomized approximation scheme (FPRAS) for a problem is a randomized algorithm for which, given an input x and any parameter $\varepsilon$ with $0 < \varepsilon < 1$, the algorithm outputs an $(\varepsilon, 1/4)$-approximation in time that is polynomial in $1/\varepsilon$ and the size of the input x. (Hint: To boost the probability of success from 3/4 to $1 - \delta$, consider the median of several independent runs of the algorithm. Why is the median a better choice than the mean?)
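As a pointer toward the hint (a sketch under assumptions introduced here, not the proof of equivalence): run the $(\varepsilon, 1/4)$-approximation algorithm $t$ times independently and output the median of the $t$ answers, where $t$ is a parameter introduced only for this sketch. Let $Y_i$ indicate that run $i$ falls outside $(1 \pm \varepsilon)$ of the true value, so the $Y_i$ are independent with $\Pr[Y_i = 1] \le 1/4$. The median is bad only if at least half the runs are bad, and a Chernoff bound on $\sum_i Y_i$ (whose mean is at most $t/4$) gives
$$\Pr\Bigl[\sum_{i=1}^{t} Y_i \ge t/2\Bigr] \le e^{-ct}$$
for some constant $c > 0$, so $t = O(\ln(1/\delta))$ runs suffice for confidence $1 - \delta$. The mean of the runs, by contrast, can be pulled arbitrarily far off by a single bad run.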

Problem 5 Practice 1: Q.10.6 on P. 267: No need to turn in

In Question 10.6 in the textbook, it was mentioned that we can apply a strategy to throw away edges. How would you determine k(n), and how effective is this approach?

Explanation / Answer

Before we venture into the Chernoff bound, let us recall Chebyshev’s inequality, which gives a simple bound on the probability that a random variable deviates from its expected value by a certain amount.

(Chebyshev’s Inequality) Let $X : S \to \mathbb{R}$ be a random variable with expectation $E(X)$ and variance $\mathrm{Var}(X)$. Then, for any real $a > 0$,
$$P\bigl(|X - E(X)| \ge a\bigr) \le \frac{\mathrm{Var}(X)}{a^2}.$$

As we are not able to improve Markov’s Inequality and Chebyshev’s Inequality in general, it is worth considering whether we can say something stronger for a more restricted, yet interesting, class of random variables. This idea brings us to consider the case of a random variable that is the sum of a number of independent random variables.

This scenario is particularly important and ubiquitous in statistical applications. Examples of such random variables are the number of heads in a sequence of coin tosses, or the average support obtained by a political candidate in a poll.

Can Markov’s and Chebyshev’s Inequality be improved for this particular kind of random variable? Before confronting this question, let us check what Chebyshev’s Inequality (the stronger of the two) gives us for a sum of independent random variables.
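Picking up that thread, a hedged sketch, assuming as in Theorem 1 that the $X_i$ are i.i.d. indicators with mean $\mu$ (so $\mathrm{Var}(X_i) = \mu(1-\mu) \le \mu$): for the sample mean $\bar{X} = \frac{1}{m}\sum_{i=1}^{m} X_i$ we have $E[\bar{X}] = \mu$ and, by independence, $\mathrm{Var}(\bar{X}) = \mathrm{Var}(X_1)/m \le \mu/m$. Chebyshev’s Inequality with $a = \varepsilon\mu$ then gives
$$\Pr\bigl[\,|\bar{X} - \mu| \ge \varepsilon\mu\,\bigr] \le \frac{\mathrm{Var}(\bar{X})}{\varepsilon^2\mu^2} \le \frac{1}{\varepsilon^2 \mu m},$$
i.e. only an inverse-polynomial decay in $m$, whereas the Chernoff bound of Theorem 1 yields a failure probability that decays exponentially in $m$, which is why $m \ge 3\ln(2/\delta)/(\varepsilon^2\mu)$ samples already give confidence $1 - \delta$.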
