It’s been observed, both in science fact and in science fiction, that the radio transmissions of any sufficiently advanced civilization are indistinguishable from random noise. Think about this for one second. The most efficient way of utilising available bandwidth is to encrypt the stuff that needs to stay secret, and compress the hell out of everything else. End result – every repeating pattern in the signal becomes obliterated, either because it’s been deliberately hidden to prevent the signal being decrypted, or because compression algorithms have exploited – and thus eliminated – the redundancy in the repeating patterns. If E.T. is out there, he’s probably using gzip and PGP, and so even if we could pick up his transmissions they’d look exactly like random noise coming from every other corner of the universe.
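You can see the "compression destroys patterns" half of this effect with nothing but Python's standard library. This is a minimal sketch (the helper name and sample text are my own, purely illustrative): it measures the Shannon entropy, in bits per byte, of a highly repetitive message before and after gzip compression. The compressed bytes sit much closer to the 8 bits/byte ceiling of uniformly random data.

```python
import gzip
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 = indistinguishable from random)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A highly repetitive "transmission" -- full of redundancy, low entropy.
plain = b"HELLO WORLD " * 1000
compressed = gzip.compress(plain)

print(f"plain:      {shannon_entropy(plain):.2f} bits/byte")
print(f"compressed: {shannon_entropy(compressed):.2f} bits/byte")
```

Encrypting the result would push it closer still to 8 bits/byte, since a decent cipher's output is by design statistically indistinguishable from random noise.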
Does the same thing apply to software development teams? What happens once you’ve established a solid system for code reuse and a culture of refining and refactoring, implemented continuous integration for builds, testing and deployment, and generally automated or eliminated every routine, repetitive manual process you can find? When you reach a point where every predictable aspect of the build and release process has been automated, you’ve effectively guaranteed that the only bits left for the people to do are the creative, unpredictable, subjective, random aspects of the process.
Does that mean that, to an outside observer, the behaviour of any sufficiently advanced software team is indistinguishable from chaos?
(And if it is… what does that mean for new hires trying to learn your culture and processes?)
Photo from oooJasonOoo via Flickr, used by permission under Creative Commons. Thanks.