Amidst trying to grok mathematical formulas and other assorted mischief from my university work, I came across this post commenting on how easy it is to install Plone on Ubuntu Linux nowadays, and it got me thinking… so let me put up the big red rant alert. You’ve been warned.
Despite all its virtues, Linux has never been known for making things particularly easy for the end user attempting to install software. Traditionally an operating system built by hackers for hackers from day zero, it has been desperately trying to adapt ever since it perceived its own brilliance and how appealing it could be to the layman computer user. Surely this has improved over the years, and perhaps I’m not even the best person in the world to talk about it, considering all the years I spent using Slackware, which is known for not having proper package management. That kind of spoilt me the wrong way, as I had this weird tendency to compile everything from source, frequently losing track of what I had - and had not - installed on my system.
Granted, nowadays software in Linux is becoming easier and easier to install:
- Python applications are usually installed with minimal fuss simply by typing python setup.py install. More recently, setuptools even takes care of finding the most up-to-date package out there, downloading it, compiling it if need be, and installing it in the appropriate location - all with a simple easy_install <package name>.
- RPM-based distributions provide adequate package management and any given package is usually a breeze to install.
- And of course, by now users of Debian and Debian-based distributions are screaming bloody murder. sudo apt-get install <pkg> or its younger, more evolved cousin Synaptic pretty much rules their world.
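The fragmentation above can be sketched in a few lines of shell: even a script that merely wants to name the conventional install command has to branch on the distribution family first. This is an illustrative sketch, not a real installer - the package name "hello" and the easy_install fallback are placeholder assumptions.

```shell
# Sketch only: report the conventional install command for this machine's
# package-management family. "hello" is a placeholder package name.
if command -v apt-get >/dev/null 2>&1; then
    INSTALL_CMD="sudo apt-get install hello"    # Debian family
elif command -v rpm >/dev/null 2>&1; then
    INSTALL_CMD="rpm -ivh hello.rpm"            # RPM family
else
    INSTALL_CMD="easy_install hello"            # setuptools-style fallback
fi
echo "On this system you would run: $INSTALL_CMD"
```

The point is precisely that the branch exists at all: the same OS kernel, three unrelated vocabularies.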
I’m sure there are other kinds of apps with their own entirely distinct brand of easy installation. Now, this is all fine and dandy, but I still see one slight itch: as with much else concerning Linux and the open source community in general, there is no unified view of what installing software should be. After all, the underlying OS is the same for everyone, is it not? Why, then, should installing different applications vary wildly, even if most methods are easy to begin with?
OK, I hear you say: “hey, I’ve been using Debian for so long and I just apt-get install everything the same way, be it Python apps or anything else.” Sure you do, but then there are a dozen other major distributions, each doing things their own easy way, differing in subtle and often deceiving ways. Plus, Linux is not just Debian, though it properly includes it.
The point may be moot, but honestly, coming from a Linux background, I find this is one of the biggest problems keeping it from seriously challenging both Mac OS X and Windows XP/Vista on the desktop. If one of the most common tasks an end user performs on any OS - installing software - can be enigmatic, how can the OS truly be from hackers to users, and not just from hackers to other hackers?
Unfortunately, different installation procedures for different apps on the same operating system are just one example of many such disparities - surely freedom and a multitude of choice can be good things, but why are there so many full-featured window managers for Linux, with none getting it fundamentally right after all these years?
This is not meant to bash any project in particular, nor the community in general - don’t bite the hand that feeds you, in a way. The very nature of Linux, with its highly distributed development, instigates design by committee which, more often than not, doesn’t yield optimal results - thus GNOME and KDE are both good pieces of software, but hardly pose a real challenge to the big players in their market. Obviously, this reasoning doesn’t apply to smaller pieces of software such as blog, wiki, chat or mail software - no one can beat open source on that in my book. I’m talking larger-scale engineering here; after all, these are the building blocks of the OS for the consumer.
So, it seems the problem is far from trivial - otherwise it would have been solved already, right? It seems the reason the open source community has worked so well thus far is the very same one that ultimately keeps it from truly conquering the real market out there on its many fronts. How can the open source community work towards the unified view I find lacking without losing its most valuable asset? Not tooting anyone’s horn, but one such example comes precisely from the Debian community: the DCC Alliance aims to unify Debian-based distributions into a cohesive whole while promoting independent development of its individual members. Shouldn’t an effort like this be amplified and applied to the community as a whole, bringing together the best players in the different key areas while leveraging their individual virtues at the same time?