Randomized Algorithms: Problem Set 2
1 Problem 1
Karger’s paper [1] suggests that we should use Karger’s contraction algorithm to bound the number of $\alpha$-min-cuts, rather than compute exactly the probability that Karger’s algorithm returns a given $\alpha$-min-cut.
Our algorithm runs as follows: before and after each contraction, record the vertex of minimum degree (choosing an arbitrary one if several vertices attain the minimum) together with the edges incident to it; in the end, return the smallest recorded set of edges, i.e. the edges incident to the vertex of minimum degree over all the records.
Let $c$ be the size of a min-cut, so an $\alpha$-min-cut is a cut with at most $\alpha c$ edges. Without loss of generality, we can assume that the minimum degree of every graph arising during the iterative procedure is greater than $\alpha c$ (note that otherwise the recorded minimum-degree vertex already gives a cut of size at most $\alpha c$, so an $\alpha$-min-cut is returned trivially). Since the probability that a fixed $\alpha$-min-cut survives all contractions is at most the probability that it is returned, we calculate the former to lower bound the latter. Every vertex of an intermediate graph with $u$ vertices has degree at least $c$ (contraction never decreases the minimum cut), thus the graph has at least $uc/2$ edges, and the probability that an edge of the fixed $\alpha$-min-cut is contracted in this step is at most
$$\frac{\alpha c}{uc/2}=\frac{2\alpha}{u}.$$
Therefore each contraction step spares the cut with probability at least $1-\frac{2\alpha}{u}$.
For any $\alpha$-min-cut, the probability that the $i$-th contraction contracts an edge in the cut is at most $\frac{2\alpha}{n-i+1}$. The probability that all the edges in the cut survive the first $n-t$ contractions (i.e. until $t$ vertices remain) is at least
$$\prod_{i=1}^{n-t}\left(1-\frac{2\alpha}{n-i+1}\right)=\prod_{u=t+1}^{n}\left(1-\frac{2\alpha}{u}\right).$$
Notice that the factor $1-\frac{2\alpha}{u}$ becomes zero or negative when $u\le2\alpha$, so the bound is vacuous there. This implies we should stop when $2\alpha$ vertices remain (assume for simplicity that $2\alpha$ is an integer; otherwise round it up). The probability that all the edges in the cut survive up to that moment is at least
$$\prod_{u=2\alpha+1}^{n}\left(1-\frac{2\alpha}{u}\right)=\prod_{u=2\alpha+1}^{n}\frac{u-2\alpha}{u}=\frac{(n-2\alpha)!\,(2\alpha)!}{n!}=\binom{n}{2\alpha}^{-1}.$$
We then partition the remaining $2\alpha$ vertices into two parts uniformly at random. Since there are at most $2^{2\alpha-1}$ ways to partition these vertices into two parts, the probability that the surviving cut is the one returned is at least $2^{-(2\alpha-1)}$. In conclusion, any fixed $\alpha$-min-cut is returned with probability at least $\binom{n}{2\alpha}^{-1}2^{-(2\alpha-1)}$, hence there are at most $\binom{n}{2\alpha}2^{2\alpha-1}=O(n^{2\alpha})$ $\alpha$-min-cuts.
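To make the procedure concrete, the following is a minimal Python sketch of the modified contraction algorithm (an illustration written for this write-up, not code from Karger’s paper; the function names and the edge-list representation are my own assumptions). It records the minimum-degree vertex cut after every contraction, stops once $\lceil2\alpha\rceil$ super-vertices remain, and finishes with a uniformly random bipartition of the remaining super-vertices.

import math
import random
from collections import Counter

def min_degree_cut(edges, labels):
    """Size of the smallest single-super-vertex cut in the current multigraph."""
    deg = Counter()
    for u, v in edges:
        deg[labels[u]] += 1
        deg[labels[v]] += 1
    return min(deg.values())

def alpha_min_cut_trial(n, edges, alpha):
    """One run of the modified contraction procedure (illustrative sketch).

    n     : number of vertices, labelled 0..n-1 (graph assumed connected)
    edges : list of (u, v) pairs; parallel edges are allowed
    alpha : approximation factor, alpha >= 1
    Returns the size of the cut found by this trial.
    """
    labels = list(range(n))                    # labels[v] = current super-vertex of v
    work = [(u, v) for u, v in edges if u != v]
    remaining = n
    stop_at = max(2, math.ceil(2 * alpha))
    best = min_degree_cut(work, labels)        # record the minimum-degree vertex cut

    while remaining > stop_at and work:
        u, v = random.choice(work)             # contract a uniformly random edge
        a, b = labels[u], labels[v]
        for w in range(n):                     # merge super-vertex b into a
            if labels[w] == b:
                labels[w] = a
        work = [(x, y) for x, y in work if labels[x] != labels[y]]   # drop self-loops
        remaining -= 1
        if work:
            best = min(best, min_degree_cut(work, labels))

    # Finally, bipartition the remaining super-vertices uniformly at random.
    side = {s: random.randrange(2) for s in set(labels)}
    cut = sum(1 for x, y in work if side[labels[x]] != side[labels[y]])
    if cut > 0:                                # ignore the trivial one-sided partition
        best = min(best, cut)
    return best

# Example: the 6-cycle has min-cut 2, so cuts of size <= 3 are (alpha = 1.5)-min-cuts.
cycle = [(i, (i + 1) % 6) for i in range(6)]
print(min(alpha_min_cut_trial(6, cycle, alpha=1.5) for _ in range(200)))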
2 Problem 2
Let $X_1,\dots,X_n\in\{0,1\}$ be the results of $n$ independent fair coin flips. For any 3 distinct indices $i,j,k$, let $Y_{ij}=X_i\oplus X_j$, $Y_{jk}=X_j\oplus X_k$ and $Y_{ik}=X_i\oplus X_k$. Then
$$Y_{ij}\oplus Y_{jk}\oplus Y_{ik}=0,\qquad\text{so}\qquad\Pr[Y_{ij}=Y_{jk}=Y_{ik}=1]=0\ne\frac18=\Pr[Y_{ij}=1]\Pr[Y_{jk}=1]\Pr[Y_{ik}=1].$$
Therefore the $Y_{ij}$ are not mutually independent.
Let $Y_{ij}=X_i\oplus X_j$ for every pair $1\le i<j\le n$. If $\{i,j\}\cap\{k,l\}=\varnothing$, $Y_{ij}$ and $Y_{kl}$ are trivially independent, with $\Pr[Y_{ij}=a,\,Y_{kl}=b]=\frac14=\Pr[Y_{ij}=a]\Pr[Y_{kl}=b]$ for all $a,b\in\{0,1\}$. Assuming the shared index is $j=k$ (with $i\ne l$) w.l.o.g.,
$$\Pr[Y_{ij}=a,\,Y_{jl}=b]=\sum_{c\in\{0,1\}}\Pr[X_j=c]\Pr[X_i=a\oplus c]\Pr[X_l=b\oplus c]=\frac14=\Pr[Y_{ij}=a]\Pr[Y_{jl}=b].$$
Similarly we can prove the remaining overlapping cases, therefore the $Y_{ij}$ are pairwise independent. Note that
$$\mathbb E[Y_{ij}]=\frac12,\qquad\operatorname{Var}[Y_{ij}]=\frac14.$$
Since $Y_{ij}$ and $Y_{kl}$ are independent for $\{i,j\}\ne\{k,l\}$, $\mathbb E[Y_{ij}Y_{kl}]=\mathbb E[Y_{ij}]\,\mathbb E[Y_{kl}]$, i.e. $\operatorname{Cov}(Y_{ij},Y_{kl})=0$.
Let $Z=\sum_{i<j}Y_{ij}$ and $m=\binom n2$, so that $\mathbb E[Z]=\frac m2$ and, because all the covariances vanish, $\operatorname{Var}[Z]=\sum_{i<j}\operatorname{Var}[Y_{ij}]=\frac m4$. Due to Chebyshev's inequality,
$$\Pr\!\left[\,\bigl|Z-\tfrac m2\bigr|\ge t\,\right]\le\frac{\operatorname{Var}[Z]}{t^2}=\frac m{4t^2}.$$
Let $t=\epsilon m$ and $\delta=\frac1{4\epsilon^2m}$; then $\bigl|Z-\frac m2\bigr|<\epsilon m$ with probability at least $1-\delta$, even though only $n=\Theta(\sqrt m)$ truly independent random bits were used.
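The construction is easy to check empirically. Below is a small Python sketch (the experiment and all names in it are my own, not part of the problem) estimating the joint probability that two bits on disjoint pairs are both 1, which should be close to $\frac14$ as pairwise independence predicts, and the probability that the three bits on a triple are all 1, which is exactly $0$ since those three bits XOR to zero.

import itertools
import random

def xor_bits(n):
    """The C(n,2) bits Y_ij = X_i XOR X_j built from n independent fair coin flips."""
    x = [random.randrange(2) for _ in range(n)]
    return {(i, j): x[i] ^ x[j] for i, j in itertools.combinations(range(n), 2)}

trials = 20000
pair = triple = 0
for _ in range(trials):
    y = xor_bits(4)
    pair += (y[(0, 1)] == 1 and y[(2, 3)] == 1)           # should happen about 1/4 of the time
    triple += (y[(0, 1)] == y[(1, 2)] == y[(0, 2)] == 1)   # never happens: Y_01 ^ Y_12 = Y_02
print(pair / trials, triple / trials)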
3 Problem 3
Suppose $m$ balls are hashed into $n$ bins. Let the random variable $X_i$ denote the bin the $i$-th ball is hashed into, let the indicator random variable $Y_{ij}$ denote whether the $i$-th ball and the $j$-th ball are hashed into the same bin, and let $Y=\sum_{i<j}Y_{ij}$ be the number of collisions. Then, due to the linearity of expectation, $\mathbb E[Y]=\sum_{i<j}\Pr[Y_{ij}=1]$. Since the hash function is pairwise independent, $\Pr[Y_{ij}=1]=\frac1n$, so
$$\mathbb E[Y]=\binom m2\frac1n\le\frac{m^2}{2n}.$$
Let $L$ be the maximum load of a bin; if $L\ge t$, the balls in the fullest bin alone contribute $Y\ge\binom t2\ge\frac{(t-1)^2}2$ collisions. By Markov's inequality we have
$$\Pr[L\ge t]\le\Pr\!\left[Y\ge\frac{(t-1)^2}2\right]\le\frac{m^2/(2n)}{(t-1)^2/2}=\frac{m^2}{n(t-1)^2},$$
i.e. taking $t=1+m\sqrt{2/n}$, the maximum load is less than $1+m\sqrt{2/n}$ (which is $O(\sqrt n)$ when $m=n$) with probability at least $\frac12$.
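As a sanity check of this bound, here is a short Python sketch (my own illustration; the parameters are arbitrary) that draws a hash function from the Carter-Wegman family $h(x)=((ax+b)\bmod p)\bmod n$, which is a standard (approximately) 2-universal family, hashes $m$ balls into $n$ bins, and reports the number of collisions and the maximum load.

import random
from collections import Counter

def carter_wegman(p, n):
    """Draw one hash function from the family h(x) = ((a*x + b) mod p) mod n."""
    a = random.randrange(1, p)
    b = random.randrange(p)
    return lambda x: ((a * x + b) % p) % n

p, n, m = 10**9 + 7, 1000, 1000            # prime p larger than the universe, n bins, m balls
h = carter_wegman(p, n)
loads = Counter(h(ball) for ball in range(m))
collisions = sum(c * (c - 1) // 2 for c in loads.values())
max_load = max(loads.values())
print(collisions, max_load)                # E[collisions] <= m^2/(2n); max load O(sqrt(n)) w.p. >= 1/2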
To generalize this argument, we define an indicator random variable $Y_S$ for every $k$-subset $S$ of balls, denoting whether all the balls in $S$ collide (are hashed into the same bin), and let $Y=\sum_{|S|=k}Y_S$ be the number of $k$-collisions. Then, due to the linearity of expectation and the $k$-universality of the hash function,
$$\mathbb E[Y]=\sum_{|S|=k}\Pr[Y_S=1]=\binom mk\frac1{n^{k-1}}\le\frac{m^k}{k!\,n^{k-1}}.$$
Let $L$ again be the maximum load; if $L\ge t$ then $Y\ge\binom tk\ge\left(\frac tk\right)^k$, and by Markov's inequality we have
$$\Pr[L\ge t]\le\frac{m^k/(k!\,n^{k-1})}{(t/k)^k},$$
i.e. taking $t=km\left(\frac2{k!\,n^{k-1}}\right)^{1/k}=O\!\left(\frac m{n^{1-1/k}}\right)$, the maximum load is $O\!\left(m/n^{1-1/k}\right)$ (which is $O(n^{1/k})$ when $m=n$) with probability at least $\frac12$.
This bound cannot be improved if the only thing we assume is $k$-universality and we consider the number of collisions only, since then the only quantity we control is the expected number of collisions. Discarding the collisions, the only other thing we know is the marginal distribution of the load of an arbitrary single bin, to which Chebyshev's inequality can be applied; but we still cannot characterize the correlations between the loads of different bins. (By the way, we can prove the same bound by a union bound over the bins.) Therefore I believe the bound cannot be improved if the only thing we assume about the hash function is $k$-universality.
4 Problem 4
We uniformly and independently assign each vertex to $S$ or $T$ at random. For any directed edge $(u,v)$, the probability that $u$ is partitioned into $S$ and $v$ is partitioned into $T$, i.e. the probability that the weight $w_{uv}$ is counted in the solution, is $\frac12\cdot\frac12=\frac14$. Then, due to the linearity of expectation, the weight $W$ of the directed cut satisfies
$$\mathbb E[W]=\sum_{(u,v)\in E}w_{uv}\Pr[u\in S,\,v\in T]=\frac14\sum_{(u,v)\in E}w_{uv}\ge\frac{OPT}4.$$
Therefore our random sampling algorithm is a $\frac14$-approximation.
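A minimal Python sketch of this random-sampling step (the edge-list format and all names are assumptions made for the example):

import random

def random_dicut(vertices, weighted_edges):
    """Assign each vertex to S or T by a fair coin and return the directed cut weight.

    weighted_edges: list of (u, v, w), a directed edge u -> v of weight w.
    In expectation the returned weight is 1/4 of the total weight, hence >= OPT/4.
    """
    in_S = {v: random.random() < 0.5 for v in vertices}
    return sum(w for u, v, w in weighted_edges if in_S[u] and not in_S[v])

edges = [(0, 1, 3.0), (1, 2, 1.0), (2, 0, 2.0)]
print(random_dicut(range(3), edges))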
Consider $x_u$ as a 0-1 indicator which indicates whether vertex $u$ is partitioned into $S$, and introduce a variable $z_{uv}$ for every edge with the constraints $z_{uv}\le x_u$ and $z_{uv}\le1-x_v$. Since the objective below is maximized, $z_{uv}=1$ if and only if $x_u=1$ and $x_v=0$; therefore $z_{uv}$ is a 0-1 indicator which indicates whether the edge $(u,v)$ goes from $S$ to $T$. Finally the objective function of the integer program
$$\max\ \sum_{(u,v)\in E}w_{uv}z_{uv}\qquad\text{s.t.}\quad z_{uv}\le x_u,\ \ z_{uv}\le1-x_v,\ \ x_u,z_{uv}\in\{0,1\}$$
is maximized to achieve the optimal value $OPT$.
To construct an LP relaxation of the above integer program, the integrality constraint is relaxed to $0\le x_u\le1$ and $0\le z_{uv}\le1$. Let $x^*$ and $z^*$ be the fractional optimal solutions to the linear program. Observe that
$$\sum_{(u,v)\in E}w_{uv}z^*_{uv}\ge OPT\qquad\text{and}\qquad z^*_{uv}=\min\left(x^*_u,\,1-x^*_v\right).$$
Notice that the function $\min(x_u,1-x_v)$ is symmetric about the line $x_u=1-x_v$, i.e. it is invariant under exchanging $x_u$ with $1-x_v$; we try to make our function $p_{uv}=f(x^*_u)\bigl(1-f(x^*_v)\bigr)$ ($p_{uv}$ is the probability the edge $(u,v)$ is going from $S$ to $T$ when each vertex $u$ is placed in $S$ independently with probability $f(x^*_u)$) have this property too. This implies $1-f(1-x)=f(x)$, i.e. $f(x)+f(1-x)=1$.
Then we consider functions of the linear form $f(x)=\frac12+c\left(x-\frac12\right)$ with $0\le c\le1$, which satisfy this symmetry, and we have
$$\mathbb E[W]=\sum_{(u,v)\in E}w_{uv}\,f(x^*_u)\bigl(1-f(x^*_v)\bigr)=\sum_{(u,v)\in E}w_{uv}\,f(x^*_u)\,f(1-x^*_v).$$
Due to the symmetry, we focus on the area $x^*_u\le1-x^*_v$ only. Note that $z^*_{uv}=\min(x^*_u,1-x^*_v)=x^*_u$ and $f(1-x^*_v)\ge f(x^*_u)$ in this area (as $f$ is non-decreasing), so $w_{uv}f(x^*_u)f(1-x^*_v)\ge\frac{f(x^*_u)^2}{x^*_u}\,w_{uv}z^*_{uv}$. Our goal, i.e. the approximation ratio function, becomes
$$\min_{0<t\le1}\frac{f(t)^2}{t}.$$
By brute force enumerating the value of $c$, we found that when $c=\frac12$, i.e. $f(x)=\frac14+\frac x2$, the approximation ratio is exactly $\frac12$: the minimum of $\bigl(\frac14+\frac t2\bigr)^2/t$ is attained at $t=\frac12$ and equals $\frac12$. Therefore we get $f(x)=\frac14+\frac x2$ and a good approximation ratio of $\frac12$.
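Given the fractional optimum $x^*$, the rounding step can be sketched in Python as below; solving the LP itself is omitted, and the dictionary x_star is a hypothetical placeholder rather than an actual LP solution.

import random

def round_dicut(x_star, weighted_edges):
    """Round a fractional MAX-DICUT solution: put u in S with probability 1/4 + x*_u / 2.

    For every edge (u, v), Pr[u in S, v not in S] = (1/4 + x_u/2)(3/4 - x_v/2)
    >= min(x_u, 1 - x_v) / 2 >= z*_{uv} / 2, so the expected cut weight is at
    least half of the LP optimum.
    """
    in_S = {u: random.random() < 0.25 + 0.5 * x for u, x in x_star.items()}
    return sum(w for u, v, w in weighted_edges if in_S[u] and not in_S[v])

x_star = {0: 1.0, 1: 0.5, 2: 0.0}                    # hypothetical fractional values
print(round_dicut(x_star, [(0, 1, 3.0), (1, 2, 1.0), (2, 0, 2.0)]))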
Let $x_i$ be an indicator which indicates whether element $i$ is chosen into the solution; the constraint $\sum_{i\in S}x_i\ge1$, imposed for every subset $S$, guarantees every $S$ intersects with the chosen set. Finally the objective function $\sum_iw_ix_i$ is minimized to achieve the optimal value $OPT$.
Relax the constraint $x_i\in\{0,1\}$ to $0\le x_i\le1$. Let $x^*$ be the fractional optimal solution to the linear program. We try to independently choose each element $i$ at random with probability $x^*_i$. For some subset $S$, we assume it contains $k$ elements w.l.o.g. Then $1-\prod_{i\in S}(1-x^*_i)$ is the probability the subset is hit. By the linear programming constraints,
$$\sum_{i\in S}x^*_i\ge1.$$
Then, subject to this constraint, the value of $1-\prod_{i\in S}(1-x^*_i)$ is minimized when all $x^*_i=\frac1k$ (by the AM-GM inequality the product $\prod_{i\in S}(1-x^*_i)$ is maximized at the uniform point). Thus the probability that $S$ is hit is
$$1-\prod_{i\in S}(1-x^*_i)\ge1-\left(1-\frac1k\right)^k.$$
Note that the R.H.S. is monotonically decreasing in $k$ when $k\ge1$, and $\lim_{k\to\infty}\bigl(1-(1-\frac1k)^k\bigr)=1-\frac1e$. Therefore every subset is hit with probability at least $1-\frac1e$ in a single round. This implies we may repeat this random choosing procedure $T=\lceil\ln(2m)\rceil$ times, where $m$ is the number of subsets.
Therefore after $T$ rounds of the random choosing procedure, the probability that all subsets have been hit is at least $1-m\,e^{-T}\ge\frac12$ (each subset is missed in one round with probability at most $\frac1e$, hence in all $T$ rounds with probability at most $e^{-T}$; then apply a union bound). Besides, the expected increment of weight in each round of the random choosing procedure is $\sum_iw_ix^*_i\le OPT$. In conclusion, our algorithm succeeds with probability at least $\frac12$ and returns weight at most $T\cdot OPT=O(\log m)\cdot OPT$ in expectation.
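The repeated rounding procedure can be sketched in Python as follows (a sketch under the assumptions above; the LP solution x_star and the instance are hypothetical placeholders, and solving the LP is not shown).

import math
import random

def round_hitting_set(x_star, subsets, rounds=None):
    """Repeat the per-element rounding until, with probability >= 1/2, every subset is hit.

    x_star  : dict element -> fractional LP value in [0, 1]
    subsets : list of sets of elements; the LP guarantees sum of x* over each subset >= 1
    Each round adds element i independently with probability x*_i, so each subset
    is hit per round with probability >= 1 - 1/e; after ceil(ln(2m)) rounds all m
    subsets are hit with probability >= 1/2.
    """
    if rounds is None:
        rounds = max(1, math.ceil(math.log(2 * len(subsets))))
    chosen = set()
    for _ in range(rounds):
        chosen |= {i for i, x in x_star.items() if random.random() < x}
    ok = all(chosen & S for S in subsets)      # did we hit every subset?
    return chosen, ok

x_star = {1: 0.5, 2: 0.5, 3: 1.0}              # hypothetical LP solution
subsets = [{1, 2}, {2, 3}, {1, 3}]
print(round_hitting_set(x_star, subsets))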
[1] Karger, David R. "Global Min-cuts in RNC, and Other Ramifications of a Simple Min-Cut Algorithm." Proceedings of the Fourth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 1993.
RA.PS2 by himemeizhi is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.