One of the primary ways CPUs are made faster and more energy efficient is by shrinking their transistors. This both reduces the energy lost as heat and increases the number of transistors a chip can contain, which roughly translates to higher performance. Historically, the number of transistors per square inch has doubled approximately every two years (an observation known as "Moore's law"), but that pace has slowed in recent years.
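
A quick sketch of what that doubling rate implies: density after n years is roughly the starting density times 2^(n/2). The function and the starting figure of one million transistors per square millimetre below are illustrative assumptions, not numbers from any particular chip.

```python
def projected_density(initial_density, years, doubling_period=2.0):
    """Project transistor density, assuming it doubles every `doubling_period` years."""
    return initial_density * 2 ** (years / doubling_period)

# Hypothetical figures: one million transistors per mm^2 today would be
# expected to reach about 32 million per mm^2 a decade later.
print(projected_density(1_000_000, years=10))  # 32000000.0
```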

Traditionally, the stated process size was the size of the smallest transistors the process could produce, but that no longer seems to be the case, and the exact meaning of the figures quoted today is a little hazy: I found multiple conflicting definitions, and they vary between companies.

In 1971, the smallest process size was 10µm. By 1985, this had been reduced to 1µm. Ten years later, in 1995, it was down to 350nm, and 2004 saw a shrink to 90nm. In 2014, Intel released its 14nm Broadwell microarchitecture, and 10nm processes are expected in 2017.
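
To get a rough sense of what those shrinks add up to, note that transistor area scales very roughly with the square of the feature size, so halving the feature size fits about four times as many transistors into the same area. The sketch below uses that simplification; real processes don't scale anywhere near this cleanly.

```python
# Rough density gain from a feature-size shrink, assuming transistor area
# scales with the square of the feature size (a big simplification).
def density_gain(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

print(f"{density_gain(10_000, 14):,.0f}x")  # 10µm -> 14nm: ~510,204x the transistors per area
print(f"{density_gain(90, 14):,.0f}x")      # 90nm -> 14nm: ~41x
```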

10nm is really, really small. At and below 7nm, quantum tunneling becomes a problem: the barriers inside a transistor become so thin that electrons can leak straight through them, making it hard to switch the transistor off reliably.