Tricki
a repository of mathematical know-how

Revision of Useful heuristic principles for guessing probabilistic estimates from Mon, 15/12/2008 - 18:40

Quick description

If you are hoping to use the probabilistic method as part of a long and complicated proof, you will probably want to begin by producing a plausible sketch of the argument before you go into the technical details. For this purpose it is very useful to be able to guess upper (and sometimes lower) bounds for the probabilities of various complicated events. This article discusses heuristic principles that can help one do this, and illustrates them with examples.

Principle 1: pretend your variables are independent

This is the single most useful method for guessing probabilistic bounds: if you have some variables that exhibit a reasonable degree of independence, then they will probably obey estimates very similar to the ones that would apply if they actually were independent. Of course, one needs to be clear about what "a reasonable degree of independence" might mean, so let us look at a few examples.

Example 1

Let G be a random graph with n vertices in which each edge is chosen independently with probability \lambda n^{-1}. Let \tau be the number of triangles in G. Then \tau is a random variable: how should we expect \tau to be distributed?

Given any triangle T in the complete graph on the n vertices of G, the probability that all three edges of T belong to G is \lambda^3n^{-3}. The number of such triangles is \binom n3, which is about n^3/6, so by linearity of expectation the expectation of \tau is about \lambda^3/6. Also, if T_1,\dots,T_k are edge-disjoint triangles, then the events "T_i belongs to G" are independent; and a typical pair of triangles is disjoint.
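The expectation computation above can be written out in one line:

```latex
\mathbb{E}[\tau] \;=\; \binom{n}{3}\bigl(\lambda n^{-1}\bigr)^{3}
\;=\; \binom{n}{3}\lambda^{3} n^{-3}
\;\approx\; \frac{n^{3}}{6}\,\lambda^{3} n^{-3}
\;=\; \frac{\lambda^{3}}{6}.
```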

It is clear, then, that there is a lot of independence about. What would happen if the events "triangle T belongs to G" were all independent? Then \tau would be a sum of \binom n3 Bernoulli random variables, each equal to 1 with probability \lambda^3n^{-3}. In other words, it would count the number of occurrences of a large number of independent unlikely events. The probability distribution that is appropriate for this is the Poisson distribution, so we might guess that \tau is distributed roughly like a Poisson random variable of mean \lambda^3/6.
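As a sanity check of this heuristic, here is a minimal Monte Carlo sketch (the parameter choices n = 100, \lambda = 2 and the trial count are illustrative, not from the article): it samples many copies of G, counts triangles, and compares the empirical mean and the empirical probability of having no triangles against the Poisson predictions \lambda^3/6 and e^{-\lambda^3/6}.

```python
import itertools
import math
import random

def count_triangles(n, p, rng):
    """Sample a graph on n vertices with each edge present independently
    with probability p, and return the number of triangles."""
    adj = [set() for _ in range(n)]
    for u, v in itertools.combinations(range(n), 2):
        if rng.random() < p:
            adj[u].add(v)
            adj[v].add(u)
    # Each triangle {u, v, w} with u < v < w is counted exactly once.
    return sum(
        1
        for u in range(n)
        for v in adj[u] if v > u
        for w in adj[u] & adj[v] if w > v
    )

n, lam, trials = 100, 2.0, 1000   # illustrative parameters
rng = random.Random(0)
counts = [count_triangles(n, lam / n, rng) for _ in range(trials)]

mean = sum(counts) / trials        # prediction: lambda^3 / 6, about 1.33
p_zero = counts.count(0) / trials  # prediction: exp(-lambda^3 / 6), about 0.26
print(mean, lam**3 / 6)
print(p_zero, math.exp(-lam**3 / 6))
```

Note that for finite n the exact mean is \binom n3 \lambda^3 n^{-3}, slightly below \lambda^3/6, so the empirical figures should match the predictions only approximately.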

Parent article

Probabilistic combinatorics front page