Section 5.2, Randomized Algorithms, Juraj Hromkovic, Springer.
5.2.1 Fundamentals
Let $Random_A(x)$ be the maximum number of random bits used over all random runs (computations) of $A$ on $x$. Then, for every $n \in \mathbb{N}$, $$Random_A(n) = \max\{Random_A(x) | x \text{ is an input of size } n\}.$$
For every run (random computation) $C$ of a randomized algorithm $A$ on an input $x$, one can consider the probability of the execution of this run on $x$, denoted by $Prob_{A, x}( C )$. The probability that $A$ outputs $y$ for an input $x$, $Prob(A(x) = y)$, is the sum of all $Prob_{A, x}( C )$ over runs $C$ that output $y$. The aim of the designer of a randomized algorithm is to achieve a high $Prob(A(x) = y)$ when $y$ is the correct output for the input $x$.
Different runs of $A$ on an input $x$ may also have different lengths, i.e., they may have different costs, so the time complexity becomes a random variable. Let $Time( C )$ be the time complexity of the run $C$ of $A$ on $x$. Then the expected time complexity of $A$ on $x$ is $$Exp-Time_A(x) = \mathbb{E}[Time] = \sum\limits_C Prob_{A, x}( C ) \times Time( C ),$$ where the sum is taken over all runs $C$ of $A$ on $x$, and the expected time complexity of $A$ is $$Exp-Time_A(n) = \max \{Exp-Time_A(x) | x \text{ is an input of size } n\}.$$
It is often not easy to analyze $Exp-Time_A(n)$ for a given randomized algorithm $A$. To overcome this difficulty one also uses the worst case approach from the beginning. This means $$Time_A(x) = \max \{Time( C ) | C \text{ is a run of } A \text{ on } x\}.$$
Then, the (worst case) time complexity of $A$ is $$Time_A(n) = \max\{ Time_A(x) | x \text{ is an input of size } n\}.$$
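To make these quantities concrete, consider a toy randomized procedure whose runs have different lengths: flipping a fair coin until it lands heads. The following Python sketch (the function name `coin_until_heads` is ours, purely illustrative) compares the exact value of $Exp-Time_A(x) = \sum_C Prob_{A,x}( C ) \times Time( C )$ with an empirical average over many independent runs:

```python
import random

def coin_until_heads(rng: random.Random) -> int:
    """A toy randomized procedure whose runs have different lengths:
    flip a fair coin until it lands heads; Time(C) = number of flips."""
    flips = 0
    while True:
        flips += 1
        if rng.random() < 0.5:
            return flips

# Exp-Time is the probability-weighted sum over runs: the run with k flips
# has probability 2^{-k} and cost k, so Exp-Time = sum_k k * 2^{-k} = 2.
exact = sum(k * 2**-k for k in range(1, 60))

# Empirical check: average cost over many independent runs.
rng = random.Random(0)
avg = sum(coin_until_heads(rng) for _ in range(100_000)) / 100_000
```

Note that here $Exp-Time_A(x) = 2$, while the worst-case $Time_A(x)$ is unbounded — one reason the worst-case measure above is only used when it is finite and easier to analyze.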
5.2.2 Classification of Randomized Algorithms
Las Vegas Algorithms
A randomized algorithm $A$ is a Las Vegas algorithm computing a problem $F$ if for any input instance $x$ of $F$, $$Prob(A(x) = F(x)) = 1,$$ where $F(x)$ is the solution of $F$ for the input instance $x$. For this definition the time complexity of $A$ is always considered to be the expected time complexity $Exp-Time_A(n)$.
The second definition requires that a randomized algorithm $A$ is a Las Vegas algorithm computing a problem $F$ if for every input instance $x$ of $F$, $$Prob(A(x) = F(x)) \geqslant \dfrac{1}{2},$$ and $$Prob(A(x) = \text{“?”}) = 1 - Prob(A(x) = F(x)) \leqslant \dfrac{1}{2}.$$ In this approach, one may take $Time_A(n)$ as the time complexity of $A$, because the idea of this second approach is to stop after $Time_A(\vert x \vert)$ computation steps on $x$, and to start a new run on $x$ from the initial configuration if the output was not computed within $Time_A(\vert x \vert)$ steps.
The first approach to defining Las Vegas algorithms is usually used for computing a function, while the second one is often used for decision problems.
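As an illustration of the second definition, the following Python sketch (a toy example of ours, not from the text) searches for a marked position in an array in which at least half of the entries are assumed to be marked. A single random probe succeeds with probability at least $1/2$, a wrong index is never returned, and “?” is reported when the cutoff is exhausted; restarting from the initial configuration yields the repeated-run scheme:

```python
import random

def find_one(bits, cutoff, rng):
    """Las Vegas algorithm (second definition): look for an index i with
    bits[i] == 1 by random probing. At least half of the entries are
    assumed to be 1, so each probe succeeds with probability >= 1/2.
    A wrong index is never returned; after `cutoff` probes the
    algorithm gives up and answers "?"."""
    for _ in range(cutoff):
        i = rng.randrange(len(bits))
        if bits[i] == 1:
            return i
    return "?"

def find_one_restart(bits, cutoff, rng):
    """Restart from the initial configuration until an answer is found;
    the expected number of restarts is at most 2."""
    while True:
        ans = find_one(bits, cutoff, rng)
        if ans != "?":
            return ans
```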
One-Sided-Error Monte Carlo Algorithms
This type of randomized algorithm is considered for decision problems only. Let $L$ be a language, and let $A$ be a randomized algorithm. We say that $A$ is a one-sided-error Monte Carlo algorithm recognizing $L$ if
- for every $x \in L$, $Prob(A(x) = 1) \geqslant \dfrac{1}{2}$, and
- for every $x \notin L$, $Prob(A(x) = 0) = 1$.
A one-sided-error Monte Carlo algorithm may err in one direction only.
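A classical instance is Freivalds' technique for verifying matrix products: for the language $L = \{(A, B, C) : A \cdot B \neq C\}$, a random 0/1 vector $r$ serves as a potential witness, and the check $A(Br) = Cr$ costs only $O(n^2)$ time. The sketch below (function names are our own) also shows how $k$ independent repetitions shrink the one-sided error below $2^{-k}$:

```python
import random

def freivalds(A, B, C, rng):
    """One-sided-error Monte Carlo test for L = {(A, B, C) : A*B != C}
    (Freivalds' technique). Picks a random 0/1 vector r and compares
    A(Br) with Cr, using only O(n^2) arithmetic operations.
    If A*B == C, every r passes the check, so the answer 0 is always
    correct; if A*B != C, the answer 1 comes with probability >= 1/2."""
    n = len(A)
    r = [rng.randrange(2) for _ in range(n)]
    Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
    ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
    Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
    return 1 if ABr != Cr else 0

def freivalds_amplified(A, B, C, rng, k=20):
    """k independent repetitions push the one-sided error below 2^{-k}:
    answer 1 as soon as any single run finds a witness."""
    return 1 if any(freivalds(A, B, C, rng) for _ in range(k)) else 0
```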
Two-Sided-Error Monte Carlo Algorithms
Let $F$ be a computing problem. We say that a randomized algorithm $A$ is a two-sided-error Monte Carlo algorithm computing $F$ if there exists a real number $\varepsilon$, $0 < \varepsilon < \dfrac{1}{2}$, such that for every input $x$ of $F$, $$Prob(A(x) = F(x)) \geqslant \dfrac{1}{2} + \varepsilon.$$
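The constant gap $\varepsilon$ above $\dfrac{1}{2}$ is what makes two-sided-error algorithms useful: running $A$ independently $k$ times and taking the majority answer drives the error down exponentially (by Chernoff bounds, roughly $e^{-2\varepsilon^2 k}$). A small Python sketch with a made-up noisy subroutine (`noisy_parity` is purely illustrative, not from the text):

```python
import random
from collections import Counter

def amplify_majority(algorithm, x, runs, rng):
    """Run a two-sided-error Monte Carlo `algorithm` independently `runs`
    times and return the majority answer. Each run being correct with
    probability >= 1/2 + eps makes the majority wrong with probability
    at most about e^{-2 * eps^2 * runs}."""
    answers = Counter(algorithm(x, rng) for _ in range(runs))
    return answers.most_common(1)[0][0]

# Toy two-sided-error algorithm: answers the parity of x correctly
# with probability 0.6, i.e., eps = 0.1.
def noisy_parity(x, rng):
    correct = x % 2
    return correct if rng.random() < 0.6 else 1 - correct

result = amplify_majority(noisy_parity, 7, 501, random.Random(42))
```

With $\varepsilon = 0.1$ and 501 runs the majority is wrong only with negligible probability, even though a single run errs 40% of the time.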
Unbounded-Error Monte Carlo Algorithms
These are the most general randomized algorithms, also simply called Monte Carlo algorithms. Let $F$ be a computing problem. We say that a randomized algorithm $A$ is an (unbounded-error) Monte Carlo algorithm computing $F$ if, for every input $x$ of $F$, $$Prob(A(x) = F(x)) > \dfrac{1}{2}.$$
Randomized Optimization Algorithms
Definition 5.2.2.10 Let $U = (\Sigma_I, \Sigma_O, L, L_I, \mathcal{M}, cost, goal)$ be an optimization problem. For any real $\delta > 1$, a randomized algorithm $A$ is called a randomized $\delta$-approximation algorithm for $U$ if
- $Prob(A(x) \in \mathcal{M}(x)) = 1$, and
- $Prob(R_A(x) \leqslant \delta) \geqslant \dfrac{1}{2}$, where $R_A(x)$ denotes the approximation ratio of the output $A(x)$,

for every $x \in L_I$.
For every function $f: \mathbb{N} \rightarrow \mathbb{R}^+$, we say that $A$ is a randomized $f(n)$-approximation algorithm for $U$ if $$Prob(A(x) \in \mathcal{M}(x)) = 1 \text{ and } Prob(R_A(x) \leqslant f(\vert x \vert)) \geqslant \dfrac{1}{2}$$ for every $x \in L_I$.
A randomized algorithm $A$ is called a randomized polynomial-time approximation scheme (RPTAS) for $U$ if there exists a function $p : \mathbb{N} \times \mathbb{R}^+ \rightarrow \mathbb{N}$ such that for every $(x, \delta) \in L_I \times \mathbb{R}^+$,
- $Prob(A(x, \delta) \in \mathcal{M}(x)) = 1$ (for every random choice $A$ computes a feasible solution of $U$),
- $Prob(\varepsilon_A(x, \delta) \leqslant \delta) \geqslant \dfrac{1}{2}$ (a feasible solution whose relative error is at most $\delta$ is produced with probability at least 1/2), and
- $Time_A(x, \delta^{-1}) \leqslant p(\vert x \vert, \delta^{-1})$, where, for every fixed $\delta$, $p$ is a polynomial in $\vert x \vert$.
If $p(\vert x \vert, \delta^{-1})$ is polynomial in both its arguments $\vert x \vert$ and $\delta^{-1}$, then we say that $A$ is a randomized fully polynomial-time approximation scheme (RFPTAS) for $U$.
Definition 5.2.2.11 Let $U = (\Sigma_I, \Sigma_O, L, L_I, \mathcal{M}, cost, goal)$ be an optimization problem. For any real $\delta > 1$, a randomized algorithm $A$ is called a randomized $\delta$-expected approximation algorithm for $U$ if
- $Prob(A(x) \in \mathcal{M}(x)) = 1$, and
- $\mathbb{E}[R_A(x)] \leqslant \delta$
for every $x \in L_I$.
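A standard example fitting Definition 5.2.2.11 is the naive randomized algorithm for MAX-CUT: placing every vertex on a random side of the partition cuts each edge with probability $1/2$, so the expected cut size is at least half the optimum, giving $\mathbb{E}[R_A(x)] \leqslant 2$. A minimal Python sketch (our own illustration, not from the text):

```python
import random

def random_cut(n, edges, rng):
    """Randomized 2-expected-approximation sketch for MAX-CUT: assign
    each of the n vertices to a random side. Every edge is cut with
    probability 1/2, so E[cut] = |E|/2 >= Opt/2, i.e. E[R_A(x)] <= 2.
    Any partition is feasible, so a feasible solution is always output."""
    side = [rng.randrange(2) for _ in range(n)]
    cut_size = sum(1 for (u, v) in edges if side[u] != side[v])
    return side, cut_size

# Triangle graph: Opt = 2, expected cut size = 3/2.
triangle = [(0, 1), (1, 2), (0, 2)]
avg = sum(random_cut(3, triangle, random.Random(s))[1]
          for s in range(10_000)) / 10_000
```

Note the contrast with Definition 5.2.2.10: only the expectation of the ratio is bounded here; no guarantee with probability $\geqslant 1/2$ is claimed for any single run.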
5.2.3 Paradigms of Design of Randomized Algorithms
- Foiling an adversary.
- Abundance of witnesses.
- Fingerprinting.
- Random sampling.
- Relaxation and random rounding.
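As one concrete illustration of the fingerprinting paradigm (combined with abundance of witnesses), the sketch below compares two polynomials, given as coefficient lists, by evaluating both at a single random point of a large prime field instead of comparing all coefficients; the function name and the choice of field are ours:

```python
import random

def polys_equal_fingerprint(p, q, rng, field_size=2**31 - 1):
    """Fingerprinting sketch: instead of comparing all coefficients of
    two polynomials p and q, evaluate both at one random point of the
    prime field GF(2^31 - 1). If p == q, the answer "equal" is always
    correct; if p != q, the nonzero difference polynomial of degree d
    has at most d roots, so the random point hides the difference with
    probability at most d / field_size (one-sided error)."""
    x = rng.randrange(field_size)

    def evaluate(coeffs):
        return sum(a * pow(x, i, field_size)
                   for i, a in enumerate(coeffs)) % field_size

    return evaluate(p) == evaluate(q)
```

Almost every field element is a witness of inequality here, which is exactly the abundance-of-witnesses idea; the fingerprint (one field element) replaces the full coefficient comparison.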