In the last post we developed the machinery of generic ultrapowers, which enabled us to go from a uniform normal ideal in $V$ to an elementary embedding $j\colon V\to M$ with $M$ lying in a generic extension of $V$, *if* we assume that we have a Woodin cardinal. The model $M$ furthermore enjoyed the property of being closed under countable sequences in the forcing extension. We now introduce a generalisation of such an embedding, which grants more closure — this is precisely given by the *stationary tower*.

So, say we want to construct a forcing notion $\mathbb P$ such that a generic $g$ for $\mathbb P$ induces an elementary embedding $j\colon V\to M$ satisfying that $M$ is closed under ${<}\delta$-sequences in $V[g]$. If we focus on a particular $\lambda<\delta$, we essentially want to show that $\operatorname{crit}(j)$ is “$\lambda$-supercompact” in $V[g]$.

To witness this we need a normal fine ultrafilter over $\mathcal P(\lambda)$, so our forcing notion should approximate such measures for every $\lambda<\delta$ *simultaneously*. Since we want our resulting measures to be normal, we’re not interested in considering non-stationary sets. A naive attempt would then be to define

$\mathbb P:=\{\langle\lambda,a\rangle\mid\lambda<\delta\text{ and }a\subseteq\mathcal P(\lambda)\text{ is stationary}\}$.

What should the ordering be? We at least want to connect stationary subsets of $\mathcal P(\lambda)$ with stationary subsets of $\mathcal P(\lambda')$ for every $\lambda<\lambda'$. If $a\subseteq\mathcal P(\lambda)$ is stationary then $\bigcup a=\lambda$, so we don’t really need to keep track of the $\lambda$’s as well. The first condition is then that for $b\leq a$,

$\bigcup a\subseteq\bigcup b$.

We want something more than this though, as otherwise everything would be comparable. For, say, $a,b$ such that $\bigcup a=\bigcup b$, we would want that $b\leq a$ iff $b\subseteq a$. This leads us to the following definition.

**Definition.** The *$\gamma$-stationary tower* for some uncountable $\gamma$, associated to a Woodin cardinal $\delta$, is the poset

$\mathbb P^\gamma_{<\delta}:=\{a\in V_\delta\mid a\text{ is stationary and }a\subseteq\mathcal P_\gamma({\textstyle\bigcup}a)\}$,

where $b\leq a$ iff $\bigcup a\subseteq\bigcup b$ and $\{Z\cap\bigcup a\mid Z\in b\}\subseteq a$.

The $\delta$-stationary tower is denoted by $\mathbb P_{<\delta}$ and called the **full stationary tower**, and the $\omega_1$-stationary tower is denoted by $\mathbb Q_{<\delta}$ and called the **countable stationary tower**.
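As a quick sanity check on the ordering (this little derivation is mine, not from any source), note that conditions with the same union are compared simply by containment, which is just what we asked for above:

```latex
% If \bigcup a = \bigcup b, then b \leq a reduces to b \subseteq a:
\begin{align*}
  b \leq a &\iff \bigcup a \subseteq \bigcup b
              \text{ and } \bigl\{\, Z \cap \bigcup a \bigm| Z \in b \,\bigr\} \subseteq a \\
           &\iff \bigl\{\, Z \cap \bigcup b \bigm| Z \in b \,\bigr\} \subseteq a
              && \text{(since } \bigcup a = \bigcup b\text{)} \\
           &\iff b \subseteq a
              && \text{(as } Z \subseteq \bigcup b \text{ for every } Z \in b\text{).}
\end{align*}
```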

So, let’s see that this actually works. I’ll focus on the full stationary tower, but the same proof works for any other variant as well. We need to find a $V$-extender in $V[g]$ witnessing the existence of a $j\colon V\to M$ such that $M$ is closed under sequences in $V[g]$ of length ${<}\delta$. To each non-empty $X\in V_\delta$ we define a measure $\mu_X$ given by

$a\in\mu_X$ iff $\{Z\cap X\mid Z\in b\}\subseteq a$ for some $b\in g$ such that $X\subseteq\bigcup b$.

**Proposition.** $\mu_X$ is a $V$-measure over $\mathcal P(X)$, for every non-empty $X\in V_\delta$.

**Proof.** That $\mu_X$ is a filter over $\mathcal P(X)$ follows from $g$ being a filter over $\mathbb P_{<\delta}$. To show that $\mu_X$ is an ultrafilter, let $a\subseteq\mathcal P(X)$ be any set. We then want to show that

$D:=\{b\in\mathbb P_{<\delta}\mid X\subseteq\bigcup b\text{ and }(\{Z\cap X\mid Z\in b\}\subseteq a\text{ or }\{Z\cap X\mid Z\in b\}\subseteq\mathcal P(X)\smallsetminus a)\}$

is dense.

Let therefore $b\in\mathbb P_{<\delta}$ be arbitrary; by strengthening $b$ if necessary we may assume that $X\subseteq\bigcup b$. Define the two sets

$b_0:=\{Z\in b\mid Z\cap X\in a\}$ and $b_1:=\{Z\in b\mid Z\cap X\notin a\}$.

Then as $b=b_0\cup b_1$ is stationary, one of the two is stationary, say $b_i$. Then we’re done, since $b_i\leq b$ and $b_i\in D$. **QED**

That $\mu_X$ is actually $V$-normal and $V$-fine, i.e. that the corresponding embedding $j\colon V\to M$ satisfies that $M$ is closed under ${<}\delta$-sequences in $V[g]$, is quite a long proof and I won’t give it here. The proof goes indirectly by showing that if $\delta$ satisfies that given any sequence of ${<}\delta$ many predense subsets of $\mathbb P_{<\delta}$ we can find an inaccessible $\kappa<\delta$ such that every predense set restricted to $V_\kappa$ is so-called *semi-proper*, then $M$ is closed under ${<}\delta$-sequences. It turns out that when $\delta$ is Woodin, this condition is satisfied.
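For reference, here is how $V$-fineness and $V$-normality of $\mu_X$ are usually spelled out (standard definitions, stated here for convenience; the quantification over $F\in V$ is the “$V$-” part):

```latex
% V-fineness: every point of X is captured on a measure one set.
% V-normality: regressive functions in V are constant on a measure one set.
\begin{align*}
  \textit{fine:}\quad
    & \{\, Z \in \mathcal{P}(X) \mid x \in Z \,\} \in \mu_X
      \quad \text{for every } x \in X, \\
  \textit{normal:}\quad
    & \text{whenever } F \in V,\ a \in \mu_X \text{ and } F(Z) \in Z
      \text{ for all non-empty } Z \in a, \\
    & \text{there is some } b \in \mu_X \text{ with } b \subseteq a
      \text{ on which } F \text{ is constant.}
\end{align*}
```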

We still need to show that all the resulting ultrapowers give rise to a *single* ultrapower, giving us closure under all sequences of length ${<}\delta$ simultaneously. In other words, we need a directed system, just like we have with extenders. But this turns out to be straightforward, using the following lemma, which follows just by going through the definitions.

**Lemma.** Let $X\subseteq Y$ be non-empty sets in $V_\delta$, let $a\subseteq\mathcal P(X)$ and define $a^Y:=\{Z\in\mathcal P(Y)\mid Z\cap X\in a\}$.

- If $a$ is stationary then $a^Y$ is stationary.
- $a\in\mu_X$ iff $a^Y\in\mu_Y$.

We can then for $X\subseteq Y$ define maps $k_{X,Y}\colon\operatorname{Ult}(V,\mu_X)\to\operatorname{Ult}(V,\mu_Y)$ as $k_{X,Y}([f]_{\mu_X}):=[f^Y]_{\mu_Y}$, where $f^Y(Z):=f(Z\cap X)$. This gives us a directed system, so taking the direct limit of the ultrapowers we wind up with our elementary $j\colon V\to M$ in the generic extension such that $M$ is closed under ${<}\delta$-sequences in $V[g]$.
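Concretely, with maps of the form $k_{X,Y}([f]_{\mu_X}):=[f^Y]_{\mu_Y}$ where $f^Y(Z):=f(Z\cap X)$ (one standard way to set this up), directedness comes down to the maps commuting, which is a one-line computation:

```latex
% For X \subseteq Y \subseteq W: k_{Y,W} \circ k_{X,Y} = k_{X,W}.
\begin{align*}
  k_{Y,W}\bigl(k_{X,Y}([f]_{\mu_X})\bigr)
    &= \bigl[(f^Y)^W\bigr]_{\mu_W}, \\
  (f^Y)^W(Z) = f^Y(Z \cap Y)
    &= f\bigl((Z \cap Y) \cap X\bigr) = f(Z \cap X) = f^W(Z),
\end{align*}
```

so that $(f^Y)^W=f^W$ and hence $k_{Y,W}\circ k_{X,Y}=k_{X,W}$.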

One neat feature is that we can more or less choose the critical point of $j$ to be *anything we want*. More specifically, we can set it to be *any* uncountable regular cardinal $\kappa<\delta$. To see this, note first that $\kappa=\{\alpha\mid\alpha<\kappa\}$ is stationary in $\mathcal P(\kappa)$, so that $\kappa\in\mathbb P_{<\delta}$. Now let $g$ be $V$-generic such that $\kappa\in g$. Analogously to measures, where $a\in\mu$ iff $[\operatorname{id}]_\mu\in j(a)$, we get that $j''\kappa\in j(\kappa)$. This means that $j''\kappa$ is an ordinal, so that $j''\kappa=\kappa$, and furthermore that $\kappa\in j(\kappa)$, making it the critical point of $j$.
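The previous paragraph compresses into the following chain (here $\kappa$ is viewed as the condition $\{\alpha\mid\alpha<\kappa\}\subseteq\mathcal P(\kappa)$, and I'm using the standard fact that $a\in g$ iff $j''\bigcup a\in j(a)$):

```latex
\begin{align*}
  \kappa \in g
    &\implies j''\kappa \in j(\kappa)
       && \text{(normality: } a \in g \iff j''{\textstyle\bigcup} a \in j(a)\text{)} \\
    &\implies j''\kappa \text{ is an ordinal}
       && \text{(the elements of } j(\kappa) \text{ are ordinals)} \\
    &\implies j(\alpha) = \alpha \text{ for all } \alpha < \kappa
       && \text{(by induction on } \alpha\text{)} \\
    &\implies \kappa = j''\kappa \in j(\kappa),
       && \text{so } \operatorname{crit}(j) = \kappa.
\end{align*}
```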

Summing up, we more or less did the same thing as with the generic ultrapowers, but we needed a way to go from only a single generic to ultrapowers at every “level”, i.e. on every $X\in V_\delta$, which were coherent. Now, given this new tool at hand, what can we use it for? It turns out that the stationary tower forcing and the genericity iterations seem to give “dual” proofs of various applications. I’ll get back to that later.

For now, all there is left to say is: merry Christmas everyone!