### Quick description

If you are trying to optimize a parameter associated with a combinatorial structure, and if the extremal examples appear to be highly "spread about" and unstructured, then you may well do best to consider examples that have been generated randomly. You are unlikely to prove an exact formula this way, but impressively sharp bounds can be obtained, often bounds that nobody knows how to prove in any other way.

This (unfinished) article is a general introduction to the method. It contains a few easy examples, together with some discussion about when one should expect it to be useful. Links to more detailed articles about specific techniques associated with the probabilistic method can be found on the probabilistic combinatorics front page.

### Prerequisites

Basic concepts of combinatorics and graph theory.

### Example 1

The Ramsey number $R(k)$ is defined to be the smallest $n$ such that if you colour the edges of the complete graph $K_n$ with two colours, then there must be $k$ vertices such that all the edges joining them have the same colour. We call such a collection of vertices a *monochromatic $K_k$*. There is a nice inductive argument due to Erdős and Szekeres that shows that $R(k)$ is at most $4^k$.

There is also a simple but revolutionary argument of Erdős that proves that $R(k)$ is at least $(1+o(1))\frac{k}{e\sqrt{2}}2^{k/2}$. It goes as follows. For each edge of $K_n$ we toss a coin, and if it comes up heads then we colour the edge red, and if it comes up tails then we colour the edge blue. In other words, we colour the edges randomly.

To see that this works, let us work out the expected number of monochromatic $K_k$s. First, we note that there are $\binom{n}{k}$ possible sets of $k$ vertices. Secondly, we note that for any given set of $k$ vertices, the probability that all the edges linking those vertices are red is $2^{-\binom{k}{2}}$, and so is the probability that these edges are all blue. Therefore, the expected number of monochromatic $K_k$s is $2\binom{n}{k}2^{-\binom{k}{2}}$.
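This expectation is easy to check numerically. The sketch below (an illustration only, with made-up function names, not part of the original argument) computes $2\binom{n}{k}2^{-\binom{k}{2}}$ exactly and compares it with the average count of monochromatic $K_k$s over many random colourings of a small complete graph.

```python
from itertools import combinations
from math import comb
import random

def expected_mono(n, k):
    # Expected number of monochromatic K_k's in a uniformly random
    # 2-colouring of K_n: 2 * C(n,k) * 2^(-C(k,2)).
    return 2 * comb(n, k) * 2 ** (-comb(k, 2))

def count_mono(n, k, rng):
    # Colour each edge red (True) or blue (False) by a fair coin toss,
    # then count the k-sets whose C(k,2) edges all received one colour.
    colour = {e: rng.random() < 0.5 for e in combinations(range(n), 2)}
    total = 0
    for S in combinations(range(n), k):
        edges = [colour[e] for e in combinations(S, 2)]
        if all(edges) or not any(edges):
            total += 1
    return total

rng = random.Random(0)
print(expected_mono(6, 3))  # 5.0

# Empirical average over many random colourings of K_6; should be close to 5.
trials = 2000
avg = sum(count_mono(6, 3, rng) for _ in range(trials)) / trials
print(avg)
```

For $n=6$, $k=3$ the formula gives $2\binom{6}{3}2^{-3}=5$, and the Monte Carlo average agrees; this is consistent with the classical fact that $R(3)=6$, since an expectation of 5 certainly does not rule out monochromatic triangles.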

A straightforward computation shows ( You want the details? OK, here are the details. Trivially, $\binom{n}{k}\le n^k/k!$, which can be shown to be at most $(en/k)^k$, since $k!\ge(k/e)^k$. Therefore, the expectation is at most $2(en/k)^k2^{-\binom{k}{2}}$. For this to be less than 1 we need $en/k$ to be less than $2^{(k-1)/2-1/k}$, for which it is sufficient if $n\le\frac{k}{e\sqrt{2}}2^{k/2-1/k}$. This one can check by taking logs, rearranging, and exponentiating. So we really can conclude ) that if $n\le\frac{k}{e\sqrt{2}}2^{k/2-1/k}$, which is $(1+o(1))\frac{k}{e\sqrt{2}}2^{k/2}$, then this expected number is less than 1. But this implies that there is a non-zero probability that the actual number is 0. In other words, there exists a colouring with no monochromatic $K_k$.
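The conclusion can also be sanity-checked numerically (this is a check, not a proof, and it says nothing about the $o(1)$ term): for $n$ just below $\frac{k}{e\sqrt{2}}2^{k/2}$, the expected number of monochromatic $K_k$s really does drop below 1.

```python
from math import comb, e, floor, sqrt

# For each k, take n = floor((k / (e*sqrt(2))) * 2^(k/2)) and verify that
# the expected number of monochromatic K_k's, 2*C(n,k)*2^(-C(k,2)),
# is below 1, so some colouring of K_n has no monochromatic K_k,
# and hence R(k) > n.
for k in range(3, 26):
    n = floor(k / (e * sqrt(2)) * 2 ** (k / 2))
    expectation = 2 * comb(n, k) * 2 ** (-comb(k, 2))
    assert expectation < 1, (k, n, expectation)
    print(f"k={k:2d}  n={n:6d}  E[# monochromatic K_k] = {expectation:.4f}")
```

The expectations stay comfortably below 1 for every $k$ in the range, in line with the estimate above (Stirling's formula suggests the bound is roughly $2/\sqrt{2\pi k}$ at this value of $n$).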

### General discussion

The above argument is an example of an averaging argument, since it depends on the principle that a random variable must have a non-zero probability of being less than its average (and also a non-zero probability of being more than its average). This very simple principle, often known as Markov's inequality, is surprisingly powerful in probabilistic combinatorics. However, there are many circumstances where it is not strong enough: it is for this reason that the probabilistic method can be considered an entire area of mathematics rather than a single clever observation.
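One standard way to state the principle in the form used above (for a nonnegative integer-valued random variable $X$, here the number of monochromatic $K_k$s) is:

```latex
% Markov's inequality: for a nonnegative random variable X and a > 0,
%   P(X >= a) <= E[X] / a.
% Applied with a = 1 to a nonnegative integer-valued count X, it gives
% the first-moment principle:
\Pr[X \ge 1] \le \mathbb{E}[X],
\qquad\text{so}\qquad
\mathbb{E}[X] < 1 \;\implies\; \Pr[X = 0] > 0 .
```

In the Ramsey argument, $\Pr[X=0]>0$ is exactly the statement that some colouring has no monochromatic $K_k$.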

## Comments

## There might be a mistake

Sun, 24/05/2009 - 23:16 — Anonymous (not verified)

According to the line "There is also a simple but revolutionary argument of Erdős that proves that $R(k)$ is at least $(1+o(1))\frac{k}{e\sqrt{2}}2^{k/2}$."

If $(1+o(1))\frac{k}{e\sqrt{2}}2^{k/2}$ is the right estimate, then there might be a mistake in the straightforward-computation part of the article: the first estimate of $n$ seems to be missing the constant factor $k$ and the $\sqrt{2}$. Thus the final estimate of $n$ is missing a $k$, and the $\sqrt{2}$ should come from the $-1/2$ in the exponent of $2^{(k-1)/2}$.