If you want to construct a set or function with strange properties, then often a good way of doing so is to define a sequence of sets or functions that converges to some kind of limit. If the sets or functions in the sequence have the sort of behaviour you want on finer and finer distance scales, then the limit may have it at all distance scales.
Prerequisites: a familiarity with the basic concepts of real analysis.
Suppose that you are asked to find a function from $\mathbb{R}$ to $\mathbb{R}$ that is continuous everywhere and differentiable nowhere. No function that one can define directly by means of a formula seems to have that property, so what can one do instead?
One answer is to use a limiting argument. We shall construct a sequence of functions $f_1, f_2, f_3, \dots$ that converges to a function $f$ in some suitable sense: a good choice is uniform convergence, since then if all the $f_n$ are continuous we are guaranteed that $f$ will be as well. To get this to work, we shall also want the functions $f_n$ to become "less and less differentiable" as we proceed. The main challenge is to decide what this might mean.
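The fact just quoted, that a uniform limit of continuous functions is continuous, is the standard $\epsilon/3$ argument, which can be sketched as follows: choose $n$ with $\sup_x |f(x)-f_n(x)| < \epsilon/3$, and then use the continuity of $f_n$ at $y$ to obtain, for all $x$ sufficiently close to $y$,
$$|f(x)-f(y)| \le |f(x)-f_n(x)| + |f_n(x)-f_n(y)| + |f_n(y)-f(y)| < \tfrac{\epsilon}{3}+\tfrac{\epsilon}{3}+\tfrac{\epsilon}{3} = \epsilon.$$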
This is not as hard as it sounds, because it can be done by means of a fairly mechanical technique: we write down the epsilon-delta definition of non-differentiability and then try to get our functions to do what is required for smaller and smaller values of whichever of epsilon and delta has "for all" in front of it.
Let's see how this works. We know that $f$ will not be differentiable anywhere if for every $x$ there exists $\epsilon > 0$ such that for every $\delta > 0$ we can find $u$ and $v$ within $\delta$ of $x$ such that the difference quotients $\frac{f(u)-f(x)}{u-x}$ and $\frac{f(v)-f(x)}{v-x}$ differ by at least $\epsilon$. The "for all" comes in front of $\delta$ here, so it is natural to try to ensure that $f_n$ has the following property: for every $x$ we can find $u$ and $v$ within $2^{-n}$ of $x$ such that the corresponding difference quotients differ by at least 1. Here, 1 is an arbitrary choice of $\epsilon$. If it turns out that we are forced to let $\epsilon$ depend on $x$ we can always go back and try again.
Before we think about how to produce a sequence of functions, let's just think whether any function $f_n$ satisfies the above property. It isn't very hard to see that it does: for example, take any function that oscillates a reasonable amount with a wavelength smaller than or comparable to $2^{-n}$. One could take a sawtooth curve, say, or a sine wave.
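A quick numerical spot check makes this concrete. The sketch below (in Python; the particular triangle wave, with slopes $\pm 1$ and half-period $\delta$, and all the function names are choices made for this illustration, not part of the argument) verifies that around every sampled point there are two points within $\delta$ whose difference quotients differ by at least 1:

```python
def sawtooth(x, delta):
    """Triangle wave with slopes +1/-1 and half-period delta (illustrative choice)."""
    return delta - abs((x % (2 * delta)) - delta)

def quotient_spread(f, x, delta):
    """Gap between the extreme difference quotients taken from points within delta of x."""
    eps = 1e-6 * delta
    candidates = [x - delta, x + delta, x - eps, x + eps]
    quots = [(f(u) - f(x)) / (u - x) for u in candidates]
    return max(quots) - min(quots)

delta = 0.01
f = lambda x: sawtooth(x, delta)
# At every sampled point, two difference quotients within delta differ by >= 1.
spread = min(quotient_spread(f, 0.000137 * i, delta) for i in range(1000))
```

The four candidate points suffice because the slope of the triangle wave flips sign within every half-period; this is a spot check rather than a proof.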
Now let's pass to the question of constructing a sequence of functions in such a way that the limit does what we want. First, here are two ideas that don't work. One cannot take $f_n(x)$ to be $\sin(nx)$, because these functions do not converge uniformly. And one cannot take $f_n(x)$ to be $n^{-1}\sin(nx)$, because the rapid oscillation is "ironed out" in the limit and we get 0. So what we want seems to be increasingly small and increasingly rapid oscillation that somehow doesn't destroy the cruder and slower oscillation of earlier functions.
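Both failures can be observed numerically. In this sketch (Python; taking $\sin(nx)$ and $\sin(nx)/n$ as illustrative instances of the two ideas), the first sequence is far from uniformly Cauchy, while the second has sup norm shrinking to 0, so its uniform limit is the differentiable constant function 0:

```python
import math

# Sample points covering one full period.
xs = [2 * math.pi * i / 1000 for i in range(1001)]

# First failed idea: f_n(x) = sin(nx).  Consecutive terms stay far apart
# in sup norm, so the sequence is not uniformly Cauchy and has no uniform limit.
gap = max(abs(math.sin(10 * x) - math.sin(11 * x)) for x in xs)

# Second failed idea: f_n(x) = sin(nx)/n.  The sup norm is 1/n, so the
# sequence converges uniformly, but only to 0: the oscillation is ironed out.
peak = max(abs(math.sin(50 * x) / 50) for x in xs)
```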
Now a construction suggests itself. We let $f_1$ be something like $\sin x$. Then we let $f_2$ approximate $f_1$ closely enough to keep the large-scale properties of $f_1$, but we superimpose a much smaller and faster oscillation. And we continue this process. The sort of function we might end up with is something like $f(x)=\sum_{n=1}^{\infty}2^{-n}\sin(2^n x)$. The precise details are not really necessary – it is clear that this function inherits the bad behaviour of all the functions $f_n$, or will do if we replace $2^n$ by a function of $n$ that grows sufficiently rapidly. Or at any rate, it is clear that some approach of this kind will be successful.
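A quick numerical look at one concrete candidate of this kind, say $f(x)=\sum_{n\ge 1}2^{-n}\sin(2^n x)$ (a Weierstrass-type series chosen here purely for illustration), shows both halves of the plan at work: the partial sums are uniformly Cauchy, while the difference quotients at a point (here $x=0$) grow instead of settling towards a derivative:

```python
import math

def f(x, N=30):
    """Partial sum of the series sum_{n>=1} 2^{-n} sin(2^n x)."""
    return sum(2.0 ** -n * math.sin(2.0 ** n * x) for n in range(1, N + 1))

# Uniform convergence: the tail beyond level N is at most sum_{n>N} 2^{-n} = 2^{-N},
# so the partial sums are uniformly Cauchy and the limit is continuous.
tail = abs(f(0.7, 30) - f(0.7, 25))

# Non-differentiability at 0: the difference quotients over the scales 2^{-k}
# spread further and further apart as k grows.
quots = [f(2.0 ** -k) / 2.0 ** -k for k in range(2, 20)]
osc = max(quots) - min(quots)
```

This is evidence rather than proof, but it illustrates why each $f_n$'s oscillation survives in the limit: the later terms are too small to cancel it, yet large enough at their own scale to disrupt the quotients.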
It must be stressed that there is nothing special about $\sin$ here – any old function that oscillates will do.
There exists a continuous function from $[0,1]$ onto the unit square $[0,1]^2$, known as a space-filling curve. To construct it, one constructs a uniformly Cauchy sequence of functions that wiggle about more and more, and thereby come closer and closer to more and more of $[0,1]^2$. Then the limiting function has the desired property. (This account needs to be expanded. For now look at the Wikipedia article on the topic.)
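For concreteness, here is a sketch (in Python; the function name is this illustration's choice) of the finite stages of one standard construction, the Hilbert curve, computed by the classical bit-manipulation formula. Stage $k$ visits every cell of a $2^k \times 2^k$ grid, moving one cell at a time, which is exactly the "wiggling about more and more" described above: the stage-$k$ path passes within roughly $2^{-k}$ of every point of the square.

```python
def hilbert_point(order, d):
    """Cell (x, y) visited at step d by the order-`order` Hilbert curve
    on a 2**order by 2**order grid (standard bit-manipulation formula)."""
    x = y = 0
    s = 1
    while s < 2 ** order:
        rx = 1 & (d // 2)
        ry = 1 & (d ^ rx)
        if ry == 0:          # rotate/reflect the sub-square so pieces join up
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        d //= 4
        s *= 2
    return x, y

# Stage 4: a path through all 256 cells of the 16-by-16 grid.
stage = [hilbert_point(4, d) for d in range(4 ** 4)]
```

Because consecutive stages agree on which quarter, sixteenth, and so on of the square each portion of the path occupies, the sequence is uniformly Cauchy and the limit is continuous.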
The Koch snowflake is an example of a continuous function from $[0,1]$ to the plane whose image is a path of infinite length. It is built up as a uniform limit of functions whose images are paths with lengths tending to infinity.
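The iteration behind this can be sketched briefly (Python with complex arithmetic; the helper names are invented for this illustration). Each step replaces every segment by four segments one third as long, so the length is multiplied by $4/3$ per step and tends to infinity, while the curves themselves move less and less between steps and therefore converge uniformly:

```python
def koch_step(points):
    """Replace each segment [a, b] by the four segments of the Koch construction."""
    spike = complex(0.5, 3 ** 0.5 / 2)  # unit rotation by 60 degrees
    out = []
    for a, b in zip(points, points[1:]):
        d = (b - a) / 3
        out += [a, a + d, a + d + d * spike, a + 2 * d]
    out.append(points[-1])
    return out

def length(points):
    return sum(abs(b - a) for a, b in zip(points, points[1:]))

# Iterate on the unit segment: lengths grow like (4/3)^k while the
# endpoints (and, more generally, the limiting curve) stay put.
curve = [complex(0, 0), complex(1, 0)]
lengths = []
for _ in range(6):
    curve = koch_step(curve)
    lengths.append(length(curve))
```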