Monday 11 October 2010

Unreliable networks and tools to use them

Even though our internet connections keep getting faster and more reliable, sometimes we have to deal with file transfers and remote control over network connections that fall below the minimum reliability threshold modern tools assume. In those cases, it helps to have systems in place that can tolerate a lot more network interruption than is expected these days.

For instance, say you need to run some maintenance programs on a server that's halfway around the world, and the network between that server and your desk is dropping out once every minute. A normal remote desktop session isn't going to cut it, because it expects a constant connection and a certain minimum bandwidth. You'd do better with commands relayed asynchronously, perhaps via an intermediary.
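To make that concrete, here's a rough Python sketch of the intermediary idea: commands get queued as files in a spool directory that some store-and-forward mechanism syncs across the flaky link, and a worker on the server runs whatever has arrived whenever it arrives. The paths and filenames here are purely illustrative, not any particular tool's layout.

# A minimal sketch of asynchronous command relay, assuming a shared
# spool directory that both ends can eventually reach. All names and
# paths are hypothetical.
import subprocess
import time
from pathlib import Path

SPOOL = Path("/var/spool/relay")        # commands arrive here as *.cmd files
RESULTS = Path("/var/spool/relay/out")  # results go back as *.out files

def run_pending_commands():
    """Poll the spool, run each queued command, and record its output.

    Because the queue is just files, a network dropout only delays
    delivery; the command still runs when it finally arrives.
    """
    RESULTS.mkdir(parents=True, exist_ok=True)
    for cmd_file in sorted(SPOOL.glob("*.cmd")):
        command = cmd_file.read_text().strip()
        result = subprocess.run(
            command, shell=True, capture_output=True, text=True
        )
        out_file = RESULTS / (cmd_file.stem + ".out")
        out_file.write_text(
            f"exit={result.returncode}\n{result.stdout}{result.stderr}"
        )
        cmd_file.unlink()  # done; remove it from the queue

if __name__ == "__main__":
    while True:
        run_pending_commands()
        time.sleep(60)  # poll again in a minute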

And very large file transfers in such circumstances shouldn't have to be done in one go, because a single uninterrupted attempt probably won't succeed. You need something that can resume after a dropout and retry the parts that didn't come through intact. Today's standard methods simply don't expect or handle unreliable networks.
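Here's a similar sketch for the transfer side: a download that resumes from whatever is already on disk and retries after each dropout, assuming the server supports HTTP range requests. The URL and filename are placeholders, not anything specific.

# A minimal sketch of a resumable download with retries. Assumes the
# server honours HTTP Range requests; URL and filename are placeholders.
import time
import urllib.error
import urllib.request
from pathlib import Path

def resumable_download(url: str, dest: str, chunk_size: int = 64 * 1024) -> None:
    """Fetch url into dest, resuming from whatever is already on disk."""
    path = Path(dest)
    while True:
        have = path.stat().st_size if path.exists() else 0
        request = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})
        try:
            with urllib.request.urlopen(request, timeout=30) as response:
                with path.open("ab") as out:
                    while True:
                        chunk = response.read(chunk_size)
                        if not chunk:
                            return  # finished cleanly
                        out.write(chunk)
        except urllib.error.HTTPError as err:
            if err.code == 416:  # range not satisfiable: file already complete
                return
            time.sleep(10)  # back off, then try again from where we left off
        except (urllib.error.URLError, OSError):
            time.sleep(10)  # connection dropped mid-transfer; resume on next pass

if __name__ == "__main__":
    resumable_download("https://example.com/big-file.iso", "big-file.iso")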

Mokalus of Borg

PS - Akira would probably fit the bill for the first part.
PPS - And I've already written about BitTorrent for the second.
