# 7. The Probabilistic Method

## Tournament

Tournament $$T(V,E)$$: $$n$$ players, each pair plays a match, and $$u$$ points to $$v$$ if and only if $$u$$ beats $$v$$.

$$k$$-paradoxical: For every $$k$$-subset $$S$$ of $$V$$, there is a player in $$V \setminus S$$ who beats all players in $$S$$.

Theorem (Erdős 1963): If $${n \choose k} \left( 1 - 2^{-k} \right)^{n-k} < 1$$ then there is a $$k$$-paradoxical tournament of $$n$$ players.

Proof: Pick a random tournament $$T$$ on $$n$$ players $$[n]$$ (random $$\rightarrow$$ each player beats the other with probability $$1 / 2$$, independently over pairs). For any $$S \in {[n] \choose k}$$, define the event $$A_S$$: no player in $$V \setminus S$$ beats all players in $$S$$.

For a fixed player $$u \in V \setminus S$$,

$Pr[u \text{ does not beat all of } S] = 1 - (1 / 2)^k$

By independence over the $$n - k$$ players outside $$S$$,

$Pr[A_S] = \left( 1 - 2^{-k} \right)^{n-k}$

By the union bound,

$Pr[T \text{ is not } k\text{-paradoxical}] \leqslant \sum_{S \in {[n] \choose k}} Pr[A_S] = {n \choose k} \left( 1 - 2^{-k} \right)^{n-k} < 1.$
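The argument can be checked by brute force for small parameters. A minimal sketch in Python (helper names `random_tournament` and `is_k_paradoxical` are mine, not from the notes): we sample random tournaments and test the $$k$$-paradoxical property directly.

```python
import itertools
import random

def random_tournament(n, rng):
    """beats[u][v] is True iff u beats v; each match is a fair coin flip."""
    beats = [[False] * n for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < 0.5:
                beats[u][v] = True
            else:
                beats[v][u] = True
    return beats

def is_k_paradoxical(beats, k):
    """Every k-subset S must have some player outside S who beats all of S."""
    n = len(beats)
    for S in itertools.combinations(range(n), k):
        if not any(all(beats[u][v] for v in S)
                   for u in set(range(n)) - set(S)):
            return False
    return True

# For k = 1 the bound C(n,1) * (1/2)^(n-1) < 1 already holds at n = 3,
# so a random search over 3-player tournaments should find a 1-paradoxical one
# (the cyclic triangle, which occurs with probability 1/4 per sample).
rng = random.Random(0)
found = any(is_k_paradoxical(random_tournament(3, rng), 1) for _ in range(100))
```

Note that being $$1$$-paradoxical just means no player is undefeated, which is why the cyclic triangle qualifies and the transitive ordering does not.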

## The Probabilistic Method

• Pick a random ball from a box: $$Pr[\text{it is blue}] > 0 \Rightarrow$$ there is a blue ball.
• Define a probability space $$\Omega$$, and a property $$P$$: $\underset{x}{Pr}[P(x)] > 0$ $$\Rightarrow \exists x \in \Omega$$ with the property $$P$$.

## Averaging Principle

• The average height of the students in a class is $$l$$ $$\Rightarrow$$ there is a student of height $$\geqslant l$$ (and one of height $$\leqslant l$$).
• For a random variable $$X$$,
• $$\exists x \leqslant E[X]$$, such that $$X = x$$ is possible,
• $$\exists x \geqslant E[X]$$, such that $$X = x$$ is possible.
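A minimal numeric illustration of the two bullets, using a fair die (the variable names are mine):

```python
# A fair die: X is uniform on {1, ..., 6}, so E[X] = 3.5.
outcomes = [1, 2, 3, 4, 5, 6]
mean = sum(outcomes) / len(outcomes)  # E[X] = 3.5

# The averaging principle guarantees both lists are non-empty:
at_least_mean = [x for x in outcomes if x >= mean]  # some outcome >= E[X]
at_most_mean = [x for x in outcomes if x <= mean]   # some outcome <= E[X]
```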

## Hamiltonian Path in Tournament

Hamiltonian path: a path visiting every vertex exactly once.

Theorem (Szele 1943): There is a tournament on $$n$$ players with at least $$n! 2^{-(n-1)}$$ Hamiltonian paths.

Proof: Pick a random tournament $$T$$ on $$n$$ players $$[n]$$. For every permutation $$\pi$$ of $$[n]$$, define the indicator $X_\pi = \begin{cases} 1, & \pi(1) \rightarrow \pi(2) \rightarrow \cdots \rightarrow \pi(n) \text{ is a Hamiltonian path in } T, \\ 0, & \text{otherwise}.\end{cases}$

$E[X_\pi] = Pr[X_\pi = 1] = (1 / 2)^{n-1}$

Let $$X = \sum_\pi X_\pi$$ be the number of Hamiltonian paths in $$T$$. By linearity of expectation,

$E[X] = \sum_\pi E[X_\pi] = n! \, 2^{-(n-1)},$

so by the averaging principle some tournament has at least $$n! 2^{-(n-1)}$$ Hamiltonian paths.
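The expectation $$n! \, 2^{-(n-1)}$$ can be checked empirically for small $$n$$. A sketch in Python (helper names are mine): we count Hamiltonian paths by testing all permutations, then average over many random tournaments.

```python
import itertools
import random

def random_tournament(n, rng):
    """beats[u][v] is True iff u beats v; each match is a fair coin flip."""
    beats = [[False] * n for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < 0.5:
                beats[u][v] = True
            else:
                beats[v][u] = True
    return beats

def count_hamiltonian_paths(beats):
    """Count permutations p with p[0] -> p[1] -> ... -> p[n-1] all directed edges."""
    n = len(beats)
    return sum(
        all(beats[p[i]][p[i + 1]] for i in range(n - 1))
        for p in itertools.permutations(range(n))
    )

# For n = 5 the expectation is n!/2^(n-1) = 120/16 = 7.5; the Monte Carlo
# average over many random tournaments should land close to that.
rng = random.Random(1)
n, trials = 5, 2000
avg = sum(count_hamiltonian_paths(random_tournament(n, rng))
          for _ in range(trials)) / trials
```

As a sanity check, a transitive tournament ($$u$$ beats $$v$$ iff $$u < v$$) has exactly one Hamiltonian path.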

## Large Independent Set

An independent set $$S$$: no two vertices in $$S$$ are adjacent.

Finding a maximum independent set is NP-hard.

A uniformly random $$S$$ is very unlikely to be an independent set!

Theorem: If a graph $$G$$ has $$n$$ vertices and $$m \geqslant n / 2$$ edges, then there exists an independent set $$S$$ of size at least $$\dfrac{n^2}{4m}$$.

Proof

1. Sample a random $$S$$: each vertex is chosen independently with probability $$p$$
2. modify $$S$$ to $$S^*$$ (an independent set): $$\forall uv \in E$$, if $$u, v \in S$$, then delete one of $$u$$, $$v$$ from $$S$$.

Define $$Y$$: the number of edges with both endpoints in $$S$$. We have

$Y = \sum_{uv \in E} Y_{uv}, \ \ \ Y_{uv} = \begin{cases}1, & u, v \in S, \\ 0, &\text{otherwise}.\end{cases}$

$E[\vert S \vert] = np, \ \ \ E[Y] = \sum_{uv \in E} E[Y_{uv}] = mp^2$

Since each such edge costs at most one deleted vertex,

$E[\vert S^* \vert] \geqslant E[\vert S \vert] - E[Y] = np - mp^2$

Setting $$p = \dfrac{n}{2m}$$ (which maximizes $$np - mp^2$$, and satisfies $$p \leqslant 1$$ since $$m \geqslant n / 2$$), we get $$E[\vert S^* \vert] \geqslant \dfrac{n^2}{4m}$$.

$$\dfrac{n^2}{4m} = \dfrac{n}{2d}$$, where $$d = \dfrac{2m}{n}$$ is the average degree.
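The sample-and-modify proof is constructive, so it translates directly into a randomized algorithm. A minimal sketch in Python (function names are mine; the example graph is an 8-cycle, for which the bound $$n^2 / 4m = 2$$):

```python
import random

def sample_and_modify(n, edges, p, rng):
    """One round of the proof's two steps; always returns an independent set."""
    # Step 1: include each vertex in S independently with probability p.
    S = {v for v in range(n) if rng.random() < p}
    # Step 2: for every edge still inside S, delete one endpoint.
    for u, v in edges:
        if u in S and v in S:
            S.discard(u)
    return S

# Example: the 8-cycle has n = 8 and m = 8, so the guaranteed size is
# n^2/4m = 2 (the true maximum independent set has size 4).
n = 8
edges = [(i, (i + 1) % n) for i in range(n)]
p = n / (2 * len(edges))  # the proof's optimal choice p = n/2m
rng = random.Random(0)
best = max((sample_and_modify(n, edges, p, rng) for _ in range(200)), key=len)
```

Since the expected size of one sample is at least $$n^2 / 4m$$, the best of many samples reaches that size with high probability.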

## Markov's Inequality

For non-negative $$X$$, for any $$t > 0$$, $Pr[X \geqslant t] \leqslant \dfrac{E[X]}{t}.$

Proof: Define the indicator $$Y$$ as

$Y = \begin{cases}1, & X \geqslant t, \\ 0, & \text{otherwise}. \end{cases}$

Since $$X \geqslant 0$$ and $$Y = 1$$ only when $$X / t \geqslant 1$$,

$Y \leqslant \left\lfloor \dfrac{X}{t} \right\rfloor \leqslant \dfrac{X}{t},$

$Pr[X \geqslant t] = E[Y] \leqslant E\left[ \dfrac{X}{t} \right] = \dfrac{E[X]}{t}.$
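A quick empirical check of the inequality (variable names are mine; the choice of $$X$$ as a sum of ten uniforms is just an example):

```python
import random

rng = random.Random(0)
# X = sum of 10 Uniform[0,1) variables: non-negative, with E[X] = 5.
samples = [sum(rng.random() for _ in range(10)) for _ in range(100_000)]
mean = sum(samples) / len(samples)

t = 8.0
empirical_tail = sum(x >= t for x in samples) / len(samples)
markov_bound = mean / t  # Markov: Pr[X >= t] <= E[X]/t
```

Here the bound $$E[X]/t = 5/8$$ is far from tight (the true tail is tiny, since $$X$$ concentrates around $$5$$), which is typical: Markov's inequality trades precision for generality.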

## Chromatic Number

For a graph $$G(V,E)$$, define

• girth $$g(G)$$: length of the shortest cycle,
• chromatic number $$\chi(G)$$: minimum number of colors needed to properly color the vertices of $$G$$.

Intuition: graphs of large girth are locally tree-like, so they seem easy to color; the theorem says the chromatic number can nevertheless be large.

Theorem (Erdős 1959) For all $$k$$, $$l$$, there exists a finite graph $$G$$ with $$\chi(G) \geqslant k$$ and $$g(G) \geqslant l$$.

Define independence number $$\alpha(G)$$: size of the largest independent set in $$G$$.

$\text{if } \alpha(G) \leqslant \dfrac{n}{k},$

$\text{then } \chi(G) \geqslant \dfrac{n}{\alpha(G)} \geqslant k.$
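The bound $$\chi(G) \geqslant n / \alpha(G)$$ holds because each color class is an independent set, so $$\chi(G)$$ classes of size at most $$\alpha(G)$$ must cover all $$n$$ vertices. A brute-force sketch in Python checking it on the 5-cycle (function names are mine; feasible only for small $$n$$):

```python
import itertools

def independence_number(n, edges):
    """alpha(G): size of the largest independent set (brute force)."""
    for size in range(n, 0, -1):
        for S in itertools.combinations(range(n), size):
            chosen = set(S)
            if not any(u in chosen and v in chosen for u, v in edges):
                return size
    return 0

def chromatic_number(n, edges):
    """chi(G): least k admitting a proper k-coloring (brute force)."""
    for k in range(1, n + 1):
        for coloring in itertools.product(range(k), repeat=n):
            if all(coloring[u] != coloring[v] for u, v in edges):
                return k
    return n

# The 5-cycle: alpha = 2, chi = 3, and indeed chi >= n/alpha = 2.5.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
alpha = independence_number(n, edges)
chi = chromatic_number(n, edges)
```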

Theorem': For all $$k$$, $$l$$, there exists a finite graph $$G$$ on $$n$$ vertices with $$\alpha(G) \leqslant \dfrac{n}{k}$$ and $$g(G) \geqslant l$$.

Define a random graph $$G(n, p)$$ on $$\vert V \vert = n$$ vertices: for all $$u, v \in V$$, $$Pr[\{u, v\} \in E] = p$$ (independently).

A uniformly random graph: $$G(n, \dfrac{1}{2})$$ (every graph on $$n$$ vertices is equally likely).

Plan: fix any large $$k$$, $$l$$; for a suitable $$n$$, generate a random graph $$G \sim G(n, p)$$ such that

$\begin{cases}Pr[\alpha(G) > n / k] < 1 / 2, \\ Pr[g(G) < l] < 1 / 2. \end{cases}$
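Both probabilities in the plan can be estimated by simulation for small parameters. A sketch in Python (function names are mine): a $$G(n, p)$$ sampler plus an exact girth routine, used to Monte Carlo the event $$g(G) < l$$.

```python
import math
import random
from collections import deque

def gnp(n, p, rng):
    """Sample G(n, p): each pair becomes an edge independently with prob p."""
    return [(u, v) for u in range(n) for v in range(u + 1, n)
            if rng.random() < p]

def girth(n, edges):
    """Length of the shortest cycle (math.inf if acyclic): for each edge
    {u, v}, remove it and add 1 to the shortest remaining u-v path (BFS)."""
    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    best = math.inf
    for u, v in edges:
        adj[u].discard(v)
        adj[v].discard(u)
        dist = {u: 0}
        queue = deque([u])
        while queue:
            x = queue.popleft()
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    queue.append(y)
        if v in dist:
            best = min(best, dist[v] + 1)
        adj[u].add(v)
        adj[v].add(u)
    return best

# Monte Carlo estimate of Pr[g(G) < l] for one small parameter setting
# (n, p, l chosen only for illustration).
rng = random.Random(0)
n, p, l, trials = 12, 0.1, 4, 500
short_cycle_rate = sum(girth(n, gnp(n, p, rng)) < l
                       for _ in range(trials)) / trials
```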