If you thought the infamous six blind men found too many ways to misunderstand an elephant, you don't want to face the chaos surrounding a current Slashdot discussion of Moore's Law. "It says things get faster!" "No, it says things get cheaper!" "No, it says you'll only actually do things if they can be done in less than some threshold time!"
Wrong, wrong, wrong, and in a way that actually matters.
Moore's Law is a statement about the economics of building chips: a prediction, which has proved remarkably accurate, that optimal per-transistor cost will be found — as time goes by — in chip designs of ever-rising device count.
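That economic reading can be sketched numerically. The figures below are purely illustrative assumptions, not industry data: they take one "Moore cycle" to mean a doubling of device count at roughly constant chip cost, so per-transistor cost halves each cycle.

```python
# Illustrative sketch only: assumes device counts double each "Moore
# cycle" while the cost of a leading-edge chip stays roughly constant,
# so the cost per transistor halves every cycle. All numbers here are
# hypothetical placeholders, not measured data.

def moore_projection(base_transistors: float, cycles: int, chip_cost: float = 100.0):
    """Return (cycle, device count, cost per transistor) for each cycle."""
    rows = []
    for cycle in range(cycles + 1):
        devices = base_transistors * 2 ** cycle
        rows.append((cycle, devices, chip_cost / devices))
    return rows

for cycle, devices, unit_cost in moore_projection(1e9, 4):
    print(f"cycle {cycle}: {devices:.0e} devices, ${unit_cost:.2e} per transistor")
```

Run over a few cycles, the per-transistor cost column falls geometrically even though the chip's sticker price never moves, which is the prediction the column is describing.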
That leaves it up to chip designers to find something actually useful to do with all those devices (e.g., transistors and gates) — and of late, the best thing they've been able to do is to build chips with ever larger numbers of processing cores. It's proven to be no small task to reoptimize software for best use of that new hardware environment.
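The easy cases of that reoptimization are the embarrassingly parallel ones, where the work splits cleanly into independent chunks. A minimal sketch, assuming a trivially divisible workload (the function name and chunking scheme are illustrative, not from the column):

```python
# A minimal sketch of splitting work across workers, assuming an
# embarrassingly parallel task: summing a large list in chunks.
# Names and the worker count are illustrative assumptions.
from concurrent.futures import ThreadPoolExecutor

def chunked_sum(values, workers=4):
    """Split the input across workers and combine the partial sums."""
    chunk = max(1, len(values) // workers)
    pieces = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, pieces))

# Caveat: in CPython, threads won't actually speed up CPU-bound work
# because of the GIL; a real port would use processes or a language
# with true parallelism. The decomposition shown here is only the easy
# part -- most software doesn't split this cleanly, which is why the
# many-core transition has been no small task.
```

When the workload carries shared state or ordering constraints, this clean split disappears, and that is exactly where the reoptimization effort goes.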
What that means is that the future jurisdiction of Moore's Law is in realms where concurrency is easy to achieve: in places like the multi-tenant facilities of true cloud computing service providers. The economics of the cloud are already attractive, and within the next three years (two Moore cycles) they'll be utterly compelling.