Thursday, October 25, 2007

Incorporating inefficiencies/constraints into your conceptualisation of systems

Companies and governments are examples of systems that have goals or purposes. I think that when we try to reason about such systems, we find it very difficult to factor in the real-world inefficiencies and constraints that apply to, and within, them.

We tend to conceptualise and reason about those systems as if the agents within them had full, perfect information and a clear path to work towards those "goals". (I don't think most of us realise we do this, though we can learn to notice it.)

But there are all sorts of constraints there: people have limited information -- often lacking information they'd need to do their job properly -- and not everyone has incentives that are aligned with the system's overall "goals".

Let's consider an example. If you didn't know better, you might think that, of all software, enterprise software would be the best. Unlike music players or computer games, this is serious business, and companies pay lots of money for it. And these are companies in highly-competitive environments; since the software is crucial to their operations, they'd need it to be good.

But enterprise software is -- I'm told -- generally not that good.

In an idealised system, the software would be designed to meet the user's needs. But there are several practicalities in the actual systems that cause inefficiencies, skewing the system away from the idealisation.

Signal vs. Noise says that enterprise software sucks because it's not really designed for the end-users, but to meet the buying criteria of the software purchasers within large companies, who are not themselves end-users of the software.

The buyers don't have as keen a sense of what the software is required to do, and how well it does it, so their evaluation of the software is skewed towards "the feature list, future promises, and buzz words".

Paul Graham mentions another evaluation criterion used by software buyers: making a choice that appears "safe" or prudent:

There used to be a saying in the corporate world: "No one ever got fired for buying IBM." You no longer hear this about IBM specifically, but the idea is very much alive; there is a whole category of "enterprise" software companies that exist to take advantage of it. People buying technology for large organizations don't care if they pay a fortune for mediocre software. It's not their money. They just want to buy from a supplier who seems safe—a company with an established name, confident salesmen, impressive offices, and software that conforms to all the current fashions. Not necessarily a company that will deliver so much as one that, if they do let you down, will still seem to have been a prudent choice. So companies have evolved to fill that niche.

Leaving this example now, I think one of the reasons it's important to understand topics like economics or evolutionary biology (and probably the law -- especially its historical development) is to gain a better appreciation for, and awareness of, the effect of constraints upon systems.
