My thoughts on the growth of the Linux kernel and the status quo of using and developing software.
Prompted by discussion of this article: 1986 Mac Plus Vs. 2007 AMD DualCore. You Won't Believe Who Wins
[Ed: My response to accusations of Linux Kernel bloat]
The [Linux] kernel never really has been the problem. In the 1 to 2 MB of compressed, compiled code on my computer (gentoo-sources plus my custom .config and a couple of patch sets from future merges), there is some of the most advanced file system, networking, protocol, hardware, and scheduling code ever conceived. Indeed, there are many areas that need work and are constantly being updated, but find me another kernel that supports NUMA, scales nearly linearly with SMP, implements fair queuing for both I/O and CPU scheduling, can run tickless (no fixed timer interrupt), supports virtualization, and covers a wide gamut of platforms and hardware. It runs on systems as small as a microcontroller and as large as BlueGene/L. Did I mention it is free and I can learn from and hack on it?
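If you're curious whether your own kernel has these features switched on, they all correspond to ordinary Kconfig options. Here's a minimal sketch: on a real system the config usually lives at /proc/config.gz or /boot/config-$(uname -r), but this example greps an inline sample so it runs anywhere. The particular option names (CONFIG_NUMA, CONFIG_SMP, CONFIG_NO_HZ, CONFIG_FAIR_GROUP_SCHED, CONFIG_VIRTUALIZATION) are real, but which ones appear depends on your kernel version and architecture.

```shell
# Sample .config fragment standing in for /boot/config-$(uname -r)
cat > sample.config <<'EOF'
CONFIG_NUMA=y
CONFIG_SMP=y
CONFIG_NO_HZ=y
CONFIG_FAIR_GROUP_SCHED=y
CONFIG_VIRTUALIZATION=y
EOF

# Check each feature flag mentioned above
for opt in NUMA SMP NO_HZ FAIR_GROUP_SCHED VIRTUALIZATION; do
    if grep -q "^CONFIG_${opt}=y" sample.config; then
        echo "${opt}: enabled"
    else
        echo "${opt}: not set"
    fi
done
```

Point sample.config at your actual kernel config (e.g. `zcat /proc/config.gz`) to see how much of this your running kernel carries.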
The kernel isn't really expanding at a rate to be concerned about, because only a small subset of it ends up being needed by most users and systems. No, the problem really lies in user space. A modern UNIX userland involves many, many layers of programs interacting and building on top of each other, and I don't see that getting better in the future. As higher- and higher-level programming languages come into use, more layers are added to the onion. This makes a programmer's life easier and allows more complex systems to be designed, but there are many drawbacks as well: bug creep, feature creep, usability problems, complexity, and resource usage all come to mind.
Do I know the answer? Not at all. I don't think there is one. Software will develop organically in the wake of hardware progress for the foreseeable future. If and when that progress slows, perhaps things will change course: a sea change of compiler optimization, small-is-beautiful engineering, and an emphasis on efficiency.