I've spoken before (and followed up here) about the notion of 'simplicity' and how I think people's view of it is too, well, simplistic, and subjective. (I also gave a link to Linus Torvalds' complaint about the mindless drive for simplicity in some software.)
This post is about two different perspectives for assessing the simplicity/complexity of software systems, arguing that the one typically used is too narrow.
This post is just a very quick draft. It's the kind of thing I'd normally write and then save as a draft in Blogger, but I thought I might as well just post it anyway.
When people talk about simplicity as a desirable property of software, I think their consideration is too focused on the small scale of a particular application: within that app, they want the core set of features used in the majority of tasks. What they don't consider as much is complexity at the global level, across all of the software they use.
I think that greater simplicity at the local level can lead to greater complexity at the global level. What we might be able to do instead is add a bit of extra complexity at the local level that contributes to a decrease in complexity at the global level -- and which, in fact, once it has been learnt, can lead to greater simplicity within the individual programs.
This would primarily be done, I think, by finding the deeper common structure present across the different data, tasks and applications, and providing explicit support for it.
I think this can be illustrated with an example: consider cut-and-paste.
I seem to remember that, back when I just had DOS on my computer, many programs had no cut-and-paste feature. This made them simpler, and if all you could do was cut and paste within a single application, it would have been debatable whether such a feature was worth the added complexity.
But cut-and-paste is one of those features that really benefits from being usable across multiple applications: you can't fully appreciate its utility unless you consider things on a global scale.
And while cut-and-paste does add a certain amount of complexity from the standpoint of an individual program, that complexity is amortised across the set of applications, because you only have to learn it once, rather than anew for each new application.
I think the global level -- the overall experience -- is what really counts, not a blinkered view that only considers individual applications as stand-alone cases.
So what else can be done, beyond cut-and-paste?
Here's an example. Much of the data in our systems takes the form of sets and lists. We could have a system-wide, global capability for manipulating sets and lists -- set operations such as union and difference, say -- applicable to any suitable data in the system: taking the union of two collections, getting their difference, and so on. The specific program could constrain which operations should be applicable to a given set of data.
I often wish I could do such things with MP3 playlists, files in directories, emails, and query results in various different apps.
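To make that a bit more concrete, here's a rough Python sketch of the shape such a capability might take. Everything in it is hypothetical (the ItemSet wrapper, the allowed_ops constraint, the playlist data); the point is just that the operations are defined once, at the system level, and each application only decides which of them apply to its data.

    # A sketch only -- none of these names (ItemSet, allowed_ops) are a real
    # API; they're hypothetical, just to illustrate the shape of the idea.

    class ItemSet:
        """A system-level wrapper around any application's collection of items."""

        def __init__(self, items, allowed_ops=("union", "difference")):
            # The owning application can constrain which operations
            # apply to its data.
            self.items = frozenset(items)
            self.allowed_ops = frozenset(allowed_ops)

        def _check(self, op):
            if op not in self.allowed_ops:
                raise ValueError(f"operation {op!r} not permitted on this data")

        def union(self, other):
            self._check("union")
            return ItemSet(self.items | other.items, self.allowed_ops)

        def difference(self, other):
            self._check("difference")
            return ItemSet(self.items - other.items, self.allowed_ops)

    # The same two operations, defined once, work on tracks in playlists --
    # and would work equally on files in directories, emails, query results...
    weekend = ItemSet({"track_a", "track_b", "track_c"})
    workout = ItemSet({"track_b", "track_d"})

    both = weekend.union(workout)               # every track from either playlist
    only_weekend = weekend.difference(workout)  # tracks not in the workout list

The individual applications contribute only their data and their constraints; the operations themselves are learnt once and reused everywhere, just as with cut-and-paste.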
Of course, you'd need to figure out a way to handle this in the UI (amongst other practicalities), but that's outside the scope of this post. I'm not talking about specific technical approaches for implementing this, or their feasibility in the current system/infrastructure ecosystem, pluses and minuses... just trying to introduce the concepts of local and global complexity and the trade-offs between them.