When I read about the planned new undersea cable from the U.S. to Southeast Asia, the first thing that came to mind was Neal Stephenson’s 1999 novel Cryptonomicon — and the non-fiction story he wrote for Wired magazine in 1996 about the wiring of the planet, which clearly served as useful research for some of the novel’s scenes.
In particular, I recalled the conversation in Chapter 92 of the novel that compares anti-cable sabotage to a nuclear war: "Easy to start. Devastating in its results." More cables, separated across more routes, are therefore a Very Good Thing if they reduce anyone’s temptation to make a preemptive strike.
People outside the telecom industry often seem to think that satellite communications make cables almost quaint. Even a simple DSL connection to a home, though, demonstrates the advantages of wire or fiber-optic links: the DSL line will typically provide four to ten times the data rate of a competing end-user satellite service, while the satellite link is far more vulnerable to weather and burdened by severe packet latency. At the macro scale, cable capacity is so abundant and so cheap that it has undercut the economics of launching new satellite capacity, which continues to fall short of critical needs.
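That latency point isn’t a quibble: for a geostationary satellite, physics sets a hard floor. A back-of-the-envelope sketch (ignoring ground-segment and processing delays, which only make things worse):

```python
# Minimum round-trip latency over a geostationary satellite link.
# A rough sketch: assumes signals travel at light speed and ignores
# all terrestrial and processing delays.
SPEED_OF_LIGHT_KM_S = 299_792.458
GEO_ALTITUDE_KM = 35_786  # altitude of geostationary orbit

one_hop_s = GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S  # ground -> satellite
one_way_s = 2 * one_hop_s                          # ground -> satellite -> ground
round_trip_ms = 2 * one_way_s * 1000               # request out, response back

print(f"Minimum round-trip latency: {round_trip_ms:.0f} ms")  # ~477 ms
```

Nearly half a second before a single byte of the reply arrives — no amount of bandwidth fixes that for interactive protocols, which is why terrestrial fiber keeps winning.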
Questions of connectivity seem quite personal to me, just now, because of an experience I had last week. A deteriorating direct-buried cable destabilized my DSL connection, and left me dependent for two full days — it seemed much longer — on a combination of Blackberry and dial-up modem access. As a combination, they worked rather well: the Blackberry gave me always-on notification of anything that needed an immediate response, while the dial-up link could be brought up as needed to transfer a file or view a full-screen presentation. Trying to see the glass as at least one-third full, this beat the heck out of having only one phone line and having that line go down — which might be a plausible scenario for any number of telecommuters, perhaps in the San Francisco area? Hypothetically?
And note that, precisely because my data were residing in the cloud and not on any one device, any connection to the cloud was as good as any other — at least, in terms of everything but speed and screen size.
For developers, the import of all this is that application designs should provide intermediate levels of service when network access is degraded. An application design that assumes always-on, high-bandwidth access as an all-or-nothing proposition is asking for trouble. That’s true regardless of whether the application itself is running locally or remotely, since interaction with remote data and with partners’ federated services is now the norm for even on-premises software.
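One way to sketch that "intermediate levels of service" idea is a fetch path that degrades to a local copy instead of failing outright. The function name, URL handling, and file-based cache below are illustrative assumptions, not any particular product’s API:

```python
import json
import os
import urllib.error
import urllib.request

CACHE_PATH = "report_cache.json"  # hypothetical local cache file


def fetch_report(url, timeout=5):
    """Return (data, status), where status records how fresh the data is.

    Tries the live service first; on any network failure, degrades to
    the last cached copy rather than presenting an all-or-nothing error.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
        with open(CACHE_PATH, "w") as f:  # refresh the cache on success
            json.dump(data, f)
        return data, "live"
    except (urllib.error.URLError, OSError, json.JSONDecodeError):
        if os.path.exists(CACHE_PATH):
            with open(CACHE_PATH) as f:
                return json.load(f), "cached (possibly stale)"
        return None, "unavailable"
```

The caller can then label stale data in the UI rather than showing a blank screen — exactly the always-on-notification-plus-occasional-sync pattern that made the Blackberry-and-dial-up combination workable.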
Keep the big picture of connectivity in mind.