Connecting Gibbon and Systems Thinking

I was rather hoping that someone would respond to yesterday’s post with:

Yes, Matt, that’s all well and good, but you still didn’t answer Harry’s Question – what does the Decline and Fall of the Roman Empire have to do with software testing?

No such luck, but I’m going to forge ahead anyway.

But first, I’d like to discuss general systems thinking. At the Agile Conference this year, one of the speakers, Peter Coffee, discussed the book Systemantics. It was an excellent keynote; you can download the audio here.

The author of Systemantics (yes, the title is a play on words) proposes a few simple rules:

1) Complex Systems that are designed (by humans) in a big-bang manner don’t work.
2) Any sufficiently complex system that does work evolved.
3) Complex systems that grow by evolution are incredibly inefficient. When you find a working system, it is only working because some subsystem is compensating for another subsystem that is, in fact, failing.

When you stop and think about it, this makes incredible sense. Why do humans have two eyes, two ears, two lungs, and two kidneys? So that if any one of them fails, it won’t kill you. You need redundancy.

The space shuttle, when it was originally proposed, was supposed to be this incredibly re-usable launch vehicle that could fly a mission a week. Yet we have five of them and fly a couple of missions a year – because the thing was developed big-bang style as the successor to the capsule program, and it’s error-prone.

Why is it that in any software development shop, some department (dev, test, analysis, PM) is always failing and some other department has to take up the slack? Because that’s what happens in large systems that evolve – like a business.

The IRS Tax Code is entirely messed up, but it seems to mostly work well enough, enough of the time. It evolved organically. Marxism, as espoused by Lenin and Stalin, sought to do a total re-distribution of wealth in a big-bang fashion, and it failed. (Then again, Marx, Lenin, and Stalin had other problems.)

What does this have to do with Gibbon’s work, The Decline and Fall of the Roman Empire?

Gibbon gives insight and examples about one particular system – an entire culture – and how it fell apart.

In the past half year, when I find a system that is overly complex on its face – a system that simply cannot work because it is too confusing – I refer to the system as “Byzantine.” “Byzantine” is actually a reference to the Byzantine Empire, the eastern half of the Roman Empire that survived after Rome itself fell. The Byzantine Empire was complex and had an incredible number of government offices that provided a salary with no actual work. Just figuring out who had the authority to grant you what you wanted probably involved a number of meetings, bribes, and letters.

When I call something “Byzantine,” it usually means that it should be scrapped entirely and started over. I got that from Gibbon, and I would like to close by quoting Weinberg’s book on requirements:

Many psychological experiments have demonstrated that meaningful information is easier to recall than meaningless information. For instance, computer programmers can find errors in code by trying to memorize code sections and then recall them from memory. The parts that are hardest to remember are those that don’t make sense, and are in fact more likely to contain errors. The same is true of parts of requirements documents. – Pg. 92

… And the same is true of human systems. If you want a tool to help you find “bad smells” in code, try Martin Fowler’s “Refactoring.” If you want tools to help find “bad smells” in systems, you can check out The Decline and Fall or the latest version of John Gall’s Systemantics.

If a big part of testing is applied critical thinking, then I think finding “bad smells” counts.
