A while ago, I read a brilliant blog post called FlexDev. It spoke about software development methodologies. Lately, it seems like lots of people are talking about methodologies, from XP to Agile to Scrum to BDUF to "traditional" waterfall. In my career, I've seen several companies move from waterfall to iteration-based processes. Overall, these transitions have been very successful. At Clique, one reason the transition succeeded was that the developers had bought into the process. But another important reason it worked was that the iterative approach fit their product -- a cutting-edge product with no current users, and thus unknown customer requirements. The point made in FlexDev is that the methodology or process must fit the problem. Since software crosses so many domains, the problems are varied and require different tools and processes. Joel discusses some of these in Five Worlds. The key in my mind is to Refactor the Process on a regular basis.
Yeah yeah, Boyee! That's all cool and all, but that's not the point I want to make in this article. Rather, I want to talk about levels of abstraction. Uber, meta, recursion, all that hub-bub. 'Cuz software is doing it to itself now. Checkit:
We used to concern ourselves with actually making computers do things. Uh, back then we called them CPUs, microprocessors, or even just ICs. Feeding a program to a computer used to be a challenge. A big one, involving humans toggling switches, and then figuring out how they mis-toggled. That sucked, so lazy computer people made primitive data entry systems: punch cards! Oh, but all those 1s and 0s seemed dizzying, so they came up with hex. After a while, even hand-assembling your program was annoying, so they came up with assemblers which could understand op-codes! Next thing you know, there's permanent storage, CRTs, and keyboards! Pretty soon K&R made C, letters are better than words, and Bill Gates is richer than Ecuador. Yes, the country. Some say nothing worthwhile has happened since.
Well, then we took computers and stuck them together in different ways. Multiple processes, multi-threading, hyper-threading, dual-proc, quad-core, multi-processor boxes, distributed algorithms, grid computing, and ... the Internet. Coolness. Some unenlightened think this represents a lot of hot air, but we technorati know better and talk about clouds. If a butterfly flaps its wings in Jipijapa, will my cloud start to hailstorm on my crowd-sourced captcha?
But wait, something else was happening. Sure, C evolved into C++, Java, C#, and a whole host of other things. And let's not BASH too much: with Java the VM was born again, and suddenly all the trendy programming languages started with the letter 'P'. Sure, we've separated UI from processing from data. And somewhere along the way we got specialists -- Windows UI wizards, Linux hackers, SQL masters. As the tools got more powerful, and we came to understand software as its own unique form of engineering, the way we built software (and software companies) diverged from other disciplines.
And now we have the "Methodology Wars". We're at the point where we can look at the software development process -- the collaborative, multi-human endeavor of building software -- as its own meta-thing, and figure out ways to improve its efficacy. Now we're developing the process by which we iterate on and improve software development companies and their methodologies. Is this an evolutionary algorithm? The Genetic Algorithm?
Go wetware, go! People are certifiably super-cool: we have psychology, sociology, and self-referential economic systems. We can look at things in memespace, ideology space, etc. We can look at the propagation of human social systems based on their reproductive fecundity and fidelity: Which ideas and ideologies propagate with high fidelity, versus those that generate many new offshoots each "generation"? Which meme-systems "infect" many new carriers, and is this propagation by birth, by conversion, or by coercion? Do these systems have self-reinforcing mechanisms to lock adherents into a particular memespace? Do the socio-cognitive systems create intrinsic value? Is this value accretive and transferable -- like a Lamarckian system? In the battle for memespace, can the impact of this value offset the cost of adopting the meme-system? Is the impact of that value to the meme-holders sufficient, viewed in the competitive landscape, to remain viable and sustainable, even in a minority-penetration situation? Can it offset virulently adoptable, but perhaps less valuable, memes?
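If you squint, the fidelity/fecundity framing really is a genetic algorithm. Here's a toy sketch of that analogy -- every trait, number, and selection rule below is my own illustrative assumption, not anything from memetics proper: each "meme-system" carries a fidelity (chance an offspring is a faithful copy) and a fecundity (carriers reached per generation), and the fastest-spreading variants crowd out the rest.

```python
import random

random.seed(42)  # deterministic toy run

# A meme-system is a (fidelity, fecundity) pair -- hypothetical traits.
# fidelity: probability an offspring copies the meme unchanged.
# fecundity: how many new carriers it reaches each generation.

def step(population, capacity=100):
    """One generation: reproduce by fecundity, mutate when fidelity
    fails, then cull to carrying capacity by fecundity (selection)."""
    offspring = []
    for fidelity, fecundity in population:
        for _ in range(fecundity):
            if random.random() < fidelity:
                # high-fidelity propagation: a faithful copy
                offspring.append((fidelity, fecundity))
            else:
                # low-fidelity copy: a new offshoot with tweaked traits
                new_fid = min(1.0, max(0.0, fidelity + random.uniform(-0.1, 0.1)))
                new_fec = max(1, fecundity + random.choice([-1, 0, 1]))
                offspring.append((new_fid, new_fec))
    # selection pressure: virulently adoptable memes win the slots
    offspring.sort(key=lambda m: m[1], reverse=True)
    return offspring[:capacity]

# two competing meme-systems: faithful-but-slow vs. sloppy-but-viral
population = [(0.9, 2), (0.5, 3)]
for _ in range(10):
    population = step(population)

print(f"{len(population)} surviving carriers; top fecundity = {population[0][1]}")
```

Note the deliberate simplification: selection here rewards only spread, not the "intrinsic value" question from above -- adding a value term to the sort key is exactly where the value-versus-virulence trade-off would show up.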
And what does this have to do with computers, information processing systems, and the business of software development? Let's use collective intelligence to find out!