Older blog entries for pfremy (starting at number 11)

Slashdot interview
Scott Tape said in the Slashdot Interview:

In my experience, programmers like to write code. Period. They don't like to write documentation, they don't like to write system tests, and they don't like to write unit tests. Programmers are also optimists--how else could they tackle building these enormously complex systems and think they had any chance of working? Programmers like instant gratification (who doesn't?). They enjoy coming up with a solution to a problem and seeing that solution implemented immediately.

Because programmers are optimists, that is reflected in their unit tests. Time and time again I've seen developer-written tests that demonstrate the feature works -- because the tests reflect the thinking of the developer about how the feature will be used. They rarely do a good job of testing corner cases, limits, or "unusual" situations (like running out of memory or other finite resources).

While all the remarks are good, the conclusion that programmers will not enjoy test-first methodology is wrong. To my surprise, I have enjoyed it a lot. Here are some arguments:

  • Programmers like to write code: with test-first methodology, programmers write even more code.
  • Programmers enjoy solving technical challenges: building a system so that it can be fully tested is a technical challenge, which makes the test system as interesting to write as the code of the system itself.
  • Programmers enjoy instant gratification: having one's tests run successfully is instant gratification, and programmers enjoy it a lot. It may sound silly, but I enjoy seeing my
     Result: 37 / 37 successful tests, 100% tests passed
    And I like seeing the number of tests (== features) moving forward. It means I am making actual, measurable progress on my application.
  • Writing tests is boring: it is boring if you write them once the application is ready. If you write them before your code, it gets interesting, because it is equivalent to writing code. At some point, writing the tests gets more interesting than the code, because the code "just runs the tests, you know, not very exciting".
  • Corner cases are not tested: in test-first, it is recommended to test the corner cases before the normal case. That way, they are a lot easier to deal with, and you are not tempted to skip them. Having plenty of tests already working also helps in building tests for corner cases. When it is just a matter of calling 3 tests in a row and adding 10 lines of code, corner-case testing is a lot more common.
  • Geeks like to achieve good work: having the system run many tests successfully is rewarding. You know that your system is fully working and tested. It will be easy to maintain and extend. If you modify it, just run the tests to check that it still works. This is really reassuring.
Advogato
One week ago, I submitted an article for posting. Since then, I have had no news. I don't know whether my article is waiting in a long queue for moderation approval, but that seems improbable given the sheer number of articles that get posted these days.

I can only suppose that it has been rejected, but I have received no mail confirming this. And if it has been rejected, I would like to know why. I invested my time in writing a reasonably good article. I certainly find it more interesting than the latest "Advogato users are lame" piece.

For a site about open-source, I would expect the processing to be more ... open.

Qt FUD
There is a lot of FUD going on about Qt these days. Four years ago, people would complain that Qt was not free. Now, they complain that it is too free and that it forces you to write free programs. How evil!

I would like to remind a few points:

  • RMS recommends the GPL for libraries that implement something already available in a non-free form. Qt clearly stands there, so it should have RMS's blessing now.
  • Qt is GPL. There is no license issue around it anymore.
  • Many free software bigots like to brag about how free software gives you more freedom and more choice. In this specific case, the GPL version of Qt does not let you choose the license for your application, which could be seen as a restriction of your freedom as a user. However, if you pay, you can have all the freedom you want.
  • Some people would like you to believe that Qt is not suitable for commercial applications, because either it is too expensive, or it requires you to develop a GPL application.

    Well, some business models can work very well with a GPL application. In that case, Qt is free (if you use the Unix version) and the argument does not apply.

    As for the other point: if your business cannot afford a Qt license, or a PyQt license (a lot cheaper, see the PyQt website), then your business has a serious problem.

    If your business makes real money, then giving back to Trolltech to help make Qt better is a good idea. I don't see why we should support people who make money off the hard work of others without giving back. When you use Qt in a non-free environment, you are forced to give back. And Trolltech gives back to you too: every version of Qt has new wonders.

For the record, I built a small business based on Qt with friends, and it was successful (see www.yalbi.com). It is perfectly possible to use Qt in a commercial environment. In fact, if it weren't, I wonder how Trolltech would have survived all these years. Trolltech does not have any VC money; it is self-funded. So it is profitable thanks to all the businesses that use it.

I have even used Qt to write Windows-only programs. It is a lot better than MFC, .NET or VB, and in the end it is a lot less expensive. What you pay for the license, you save in development time, feature richness, maintainability and royalties (many Windows libraries have royalty schemes).

My first time configuring procmail. I really wonder what the author was smoking when he designed the syntax. I now better understand the old saying: LSD and Unix both come from Berkeley. This is no coincidence.

How can someone come up with such a braindead syntax? ':0:' means use a lock file when delivering. '!' means forward to an address. My God.
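For reference, a minimal .procmailrc illustrating both constructs (the header patterns, folder name and address below are placeholders, not from my actual setup):

```
# ':0' starts a recipe; the second ':' asks procmail to use a
# lock file while delivering to the mailbox "spam-box".
:0:
* ^Subject:.*viagra
spam-box

# '!' is the forward action: matching mail goes to another address.
:0
* ^From:.*lists\.example\.org
! me@example.com
```

Terse to the point of obfuscation, but it does fit a lot of filtering in very few lines.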

Anyway, the good part is that it is very standard: fetchmail can deliver directly to procmail. And once you have set up the basic rules, you won't touch them often. So it is more or less OK.

Now my spam is all filtered out. Mmh, what a delight. A thousand thanks to bogofilter and Paul Graham.

I am using and patching CppUnit. This is such a beast!

CppUnit was developed after JUnit, which was itself derived from SmalltalkUnit, written by the author of Extreme Programming. I think something has been lost somewhere on the way from SmalltalkUnit to CppUnit.

Some core principles of Extreme Programming are:

  • use the simplest design that works
  • you are not going to need it
  • refactor mercilessly
  • do things only once in your code

I look at CppUnit and I see exactly the opposite of this. Ironically, CppUnit is used to develop test suites in the Extreme Programming spirit. Briefly, CppUnit has:

  • a very complicated architecture. There are so many classes delegating everything to one another that you don't get a clue about who does what.
  • lots of empty classes. This comes from historical reasons, I think: they changed the internals but kept the classes around.
  • lots of duplicated interfaces, probably for the same reason.
  • despite so many classes, no extensibility where I need it.

It looks to me like the typical example of over-engineering. People have read Design Patterns and want to apply it desperately. And they have also read about templates and want a maximum number of them.

CppUnit should fit in 3 or 4 classes. No more. I'll rewrite it one day.

Free Software Happiness

    Yesterday, I looked around at Free Software and I was happy.

    Three years ago, if you had asked me about the state of Free Software, I would have given a reluctant answer. Yes, there were good developments on the server side. But the desktop was really poor compared with Windows or Mac.

    But yesterday, I looked back at what has been achieved, and I was happy. Most of the problems are either addressed or under active development. The things that please me (non-exhaustive and in no particular order):

    • X: the old beast was seen as a major blocker for further development of graphical software: no hardware-accelerated drivers, a very old and big codebase, no anti-aliasing, no fancy cursors. But time has proven that the beast can be improved and can bring a high-quality desktop environment. We even have promises of better-quality desktops than in other OSes.

    • supermount: you can now insert your floppy, type ls /mnt/floppy, change your floppy and type ls /mnt/floppy again. What a delight. Manual mounting of removable devices has not been justified for ages.

    • USB hotplug: OK, I heard it does not work perfectly, but we are not far from that. What a difference between today and three years ago.

    • journaling FS: you can boot quickly, you can turn off your computer, no more five minutes of waiting in front of a fsck.

    • KDE/Gnome: we now have a good deal of graphical software, with (and this is the most important) a consistent look and feel. No need to relearn the GUI when switching from XFig to xv.

    • good distro installers: all major distros are now at the point where it is easier to install Linux than Windows. Dynamic resizing of partitions was heartily welcome.

    • CUPS: we can manage printers in a modern way. This is the kind of service improvement I expect from technology.

    • Evolution: we have a clone of Outlook, so people can have a very smooth transition from Windows to Unix.

    • Samba: really mature, stable integration into Windows environments.

    • OpenOffice: we finally have a reasonable alternative to Microsoft Office, on Windows or Unix.

    • France: I see with joy Free Software making its way through the educational system. The government has issued recommendations for Free Software. Almost all major IT magazines have had a special edition on Free Software.

    • Germany: some companies have had a public contract to improve KDE the way the government wanted it. I couldn't dream of more.

    • Other countries: many good stories of Free Software in other governments, like Peru, Spain, ...

    • mplayer, xine: we can play DVDs almost as easily as on other OSes.

    • IBM is committing to Free Software with strong support. They help a lot in ways that no individual or project could. That is great.

    • Viruses: Windows has seen many virulent viruses these last years. There is more and more spyware and unwanted software on everybody's Windows. People are starting to realise the value of controlling their own computer.

    • Gnome/KDE integration: a common dock application, a common description of applications, this is the present. The future looks bright. I am eagerly waiting for common themes, common mimetype databases and things like that. It will arrive one year or another. It is unavoidable and needed.

    • SourceForge: you may criticise certain aspects of it, but SourceForge is a great repository of, and incentive for, free software. It is now very easy to get the necessary infrastructure for any project.

    There is also progress that you must really be a geek to appreciate:

    • Mozilla: certainly the most powerful browser for geeks. If you want a browser feature, you certainly have it in Mozilla. And it works on windows!

    • bogofilter: this made my day. The final weapon against spam has been found. I can't wait to see it in action on my mail account.

    • ssh, ssl, https, gpg: we are using advanced crypto tools. We are the future.

    • Gentoo: the coolest distribution ever. A self-compiling distro and bleeding-edge packages are great. For me, it combines the advantages of Slackware, Debian and Mandrake, with none of the problems I had with those three.

    • Linux: some really interesting stuff in the upcoming stable kernel. We are on the way to a kernel superior to its major competitors.

    • Konqueror: the best-integrated application ever. You can rip a CD, browse an FTP site, browse the web, browse your shares, preview many types of files, manage CVS, ...

    • CVS and co: despite a few problems, CVS rocks. A very lightweight and stable client, ssh connections, mails with diffs of commits, web browsing of the repository, concurrent modification of the same file: CVS is years ahead of what my company is using for managing its software.

    • ccache, distcc, Valgrind and Cachegrind: these are very powerful developer tools that have no equivalent elsewhere.

    • cool tools: this is not new, but perl, python, cron, ssh, cvs, grep, bash and find are really good tools that can be used for plenty of things.

    • Linus uses BitKeeper: some see it as a failure for Free Software, but I am just so happy to see Linus finally using a revision control system. It makes it much easier to follow Linus's developments, to patch, to fork, and so on. I am sure this improves the kernel's quality.

    We are going to take over the world. I never doubted it, because you cannot stop Free Software. But now I am seeing it happening, and I feel the thrill.

    There is still progress to be made in development tools. IDEs with tight integration of every development tool (gcc, make, gdb, CVS, ...) are seeing the light of day, but plenty of progress can still be made. I am also waiting for a good C++ refactoring tool.

    I wonder why everything I post is interpreted as an aggressive move. Too sensitive subjects? My bad grasp of English? I thought I had written this last diary entry with a clearer intent. I question Eazel's contribution to Free Software, not Nautilus's capabilities. And yet I am attacked over Nautilus.

    Thomasvs: Here are a few points: You specifically mention Nautilus 1.0 without mentioning that this is the OLD Nautilus developed by Eazel. All your points are ONLY valid about the OLD Nautilus, and even then they're skewed at best.

    I thought it was obvious that I am referring to the old Nautilus developed by Eazel. I was referring specifically to an interview where all the modifications that have gone into Nautilus 2.0 are explained.

    Why don't we compare KDE1 or KDE2 Konqueror with the Nautilus you bash ?

    What I wanted to compare was specifically the development done by Eazel, not the features of Nautilus 2.0. An equivalent comparison would be, for example, the developments made by TheKompany. They have the same problem: they usually do not use KDE's technologies. They do not integrate much with KDE. TheKompany now develops everything using Qt only, so KDE integration is even more limited. Just like Eazel.

    One big difference between TheKompany's contribution to KDE and Eazel's contribution to Gnome is that none of TheKompany's applications are labelled "core KDE applications". The fact that they are not under a free license certainly helps. :-)

    Yes, you are right that the original Nautilus 1.0 developers didn't put integration with the rest of GNOME high on their priority list. That is because Eazel was foremost a COMPANY trying to make MONEY on Eazel services served by Nautilus.

    Which was exactly my point. I was not willing to discuss further than that. Now, another question is: should Gnome accept this? Given that Eazel did not care about Gnome's goal, should Nautilus have been in Gnome? My personal answer is no. I have no problem with any company making money by building an application that does not integrate with Gnome, but then it should not be distributed as part of the Gnome desktop. That is contrary to the goal of the desktop project.

    Now the maintainers are spending effort to integrate Nautilus properly. Had the Eazel guys done Nautilus the right way, this effort could have been spared.

    You say "it is not possible to reuse parts of it". Well, ha ha ;) All code that is reusable has been abstracted out into other libraries which now are used throughout Gnome.

    Are you speaking about Nautilus 2.0 or Nautilus 1.0? I was referring specifically to Nautilus 1.0. I understand from the interview that most of the problems of Eazel's Nautilus are being addressed. That's good news.

    Anyway, I drew my conclusions from the interview I read. This is not the result of five months of research on Nautilus, but a diary entry reflecting what I understood from an interview. The interview makes it clear that it is not possible to reuse the icon view of Nautilus. I find this surprising, because Gnome has all the technologies to make this possible. Does Nautilus (1 or 2) use Bonobo in any way? I thought this was exactly the thing Bonobo was created for.

    You should have picked another application to try to bash on integration issues, because Nautilus is a really good example on how to do integration nicely.

    I am still under the impression that Nautilus's integration into Gnome could be far stronger than it is.

    You do both KDE and Gnome a disservice by trying to instigate some sort of fight based on very skewed assumptions and giving a lot of reasons that are easily disproved. Code talks, and my code disproved your point.

    Sorry, but I was talking neither about Gnome nor about KDE, but about the way Eazel contributed to Free Software. I think they missed the culture, and people like the current Nautilus maintainers have to come after them to clean things up. That's a pity.

    I think Ximian did far better work with Evolution. Not only do I find Evolution more useful than Nautilus, it also integrates properly with Gnome.

    Rereading my diary entry of October 25, about the Nautilus interview, I was not satisfied. The entry was aggressive, unjustified in certain areas, and not clear in its intent (too much KDE ranting, too much Gnome bashing). I have rewritten it in a less aggressive stance. I keep the responses because they are interesting.

    My point was:

    1. Konqueror uses KDE technologies (kio slaves and KParts) to provide all its features. So anything available in Konqueror is available to any KDE application.
    2. Gnome has equivalent technologies: gnome-vfs and Bonobo.
    3. Nautilus 1.0, a core Gnome application, provides the same services as Konqueror but doesn't use Gnome technologies and sometimes even duplicates them (theme handling).

    My conclusion is that the Nautilus developers didn't think it was important to develop an application that would integrate with Gnome. They just thought about developing an application and did not care that much about Gnome's goal.

    The interview also gives the impression that the Nautilus code is not modular (it is not possible to reuse any part of it independently, nor is it possible to integrate external code), not optimised, and not very good:

    you really need to be two people to maintain Nautilus [...] the new code is also vastly more readable and somewhat better performing than the old code [...] the [sidebar] code was horrible [...] The Icon view is quite integrated with the core Nautilus code at the moment, so it is very hard to do things like this [...] Right now the CVS view has to recreate the whole directory view, which is a pain. It leads to views that don't integrate well with the rest of nautilus (and more often, views that just aren't written) [...] I don't think using the nautilus codebase is such a good idea, as Nautilus has an architecture that is overly complicated for a fileselection dialog [...] The Nautilus views require to much of the Nautilus internal asynchronous machinery, which we don't export (for various reasons) [...] there isn't anyone with concrete plans for fixing the mime system. [...]

    All this makes me think that the Nautilus developers did not get the Free Software and especially the Unix spirit. Unix is all about independent, modular tools and pieces of code that interact together. The advantages of Free Software usually come from this modularity. We also strive to produce the best code. Given all the optimisations that were still possible on Nautilus, I think they missed that point too.

    A few preliminary remarks:

    1. It is a lot more interesting to troll on Advogato than anywhere else :-) You get interesting replies.
    2. Being able to post comments on diary entries would make it easier to discuss the subject.
    3. I am going on holiday, so I won't (to my regret) be able to continue this very interesting discussion.

    Uraeus:

    As for the file access we have something called gnome-vfs in GNOME that does the same thing as kioslaves in KDE.

    I am aware of that. But if there is good gnome-vfs support, why are there _seven scripts_ on the Nautilus script page just to handle an archive file? Why do you need a script to play files in Xine or Xmms? Or to call a shredder? Is gnome-vfs lacking all these features? I was also referring to the NFS/Samba question. Why isn't Nautilus able to browse Samba and NFS shares? In KDE, all these things are provided by the kio_slave library.

    The separate theme handling was a mistake, but has now been 99% fixed

    I think this mistake is a hint of the way Nautilus was developed: independently of Gnome. Eazel did not take the time to contribute to Gnome the way it should have been done. They just developed and did not care.

    Why isn't it possible to provide a Nautilus view of a folder? Why isn't it possible for Nautilus to embed a CVS frontend? Both these things are possible in KDE, and not because someone took care of implementing them specifically. They are available because a CVS frontend application (Cervisia) and a file manager application (Konqueror) had them. Both applications use KDE's very simple component system, and that's it: all other apps get the feature. Every app shares all its features with the others, without extra work. That fits the integrated, consistent desktop goal.

    From what I read, Nautilus is a very monolithic application. Where are the Bonobo components? Isn't that the kind of thing Bonobo is supposed to handle? For every cool feature request, the answer seems to be "sorry, but the architecture doesn't allow that currently".

    I think (and correct me if I am wrong) that the reason Nautilus is so monolithic is that:

    1. Gnome's component support was not mature when Nautilus was developed
    2. The Nautilus developers did not care. They simply did not fit into the integrated desktop vision. They developed an independent application that would use Gnome but not integrate with it.

    Please understand that I am not criticising the current Nautilus. The issues are obviously being addressed. I am criticising the way Eazel created Nautilus.

    As for its own icon management [...] if anything you could argue that the functionality was misplaced inside Nautilus or eel instead of being placed in a more core GNOME library.

    Indeed. So one application has features that the others do not and may be lacking? This is exactly the opposite of the goal of a desktop such as Gnome or KDE. The projects are not about having good applications; we already have that. The projects are about having consistent, integrated applications that communicate with each other. A side effect of this goal is the avoidance of code duplication.

    On the other hand this is how we develop new stuff in GNOME. We include them it outer libs or applications and as they mature and prove usefull we migrate them further down in the toolchain.

    So this means that some applications have features the others are lacking? I do not see this as the right way to proceed. If a feature is useful, it should be in a core library, so that a maximum number of applications use it and provide a consistent interface. You want to avoid code duplication and UI inconsistency.

    Well a vast majority of the bugs in Nautilus bugzilla are leftovers from the Eazel days.

    OK, forget the bug issue. The interview said nothing about this, however.

    As for slow, well yes it is slower than windows explorer, but it is faster than Konqueror.....

    All I have to say is :-) . But what I was underlining is that Eazel seems to have developed the thing without optimising it at all.

    You did not oppose my statement about the code being complicated and buggy. I take this as agreement.

    As for most of the future features are things that KDE already have. Yes, some of these features you already have, yet some like the video preview stuff is ugly hacks.

    I don't know if they are hacks, but they are cool, and they are a technological advance of KDE over Gnome. :-)

    And there are other things in Nautilus and GNOME where you implemented them after us, SVG support comes to mind as an example.

    My point was not to count which features KDE or Gnome has, but to highlight the fact that KDE applications use core KDE technologies to provide many services, in a way that makes them both easy to code and available to all applications. From my experience, Gnome applications have tended to implement their stuff in each application, without providing enough code/feature sharing.

    And before you jump: I understand this is being addressed in Gnome 2.

    As for lacking a 'large shared vision', I think you are mistaken. But truly we have had more discussions about these issues in GNOME than you have had in KDE, but I think this is mostly because KDE have almost no power over these issues, you have put yourself in a position where most important design decisions are handed down to you by TrollTech.

    This last sentence shows how much you ignore the way KDE is developed. You are simply, completely wrong, but I do not blame you for that. This is a common misconception. KDE has full control over its development. It will be hard to convince you, but maybe these few facts will bring some hints:

    • There are very few KDE developers who work for Trolltech (or, put differently, very few Trolltech employees who develop KDE). All the KDE developers who have been employed by Trolltech have almost stopped working on KDE. However, they do work hard to make Qt good. Having a good Qt is good for KDE.

    • All interesting KDE technologies (DCOP, KParts, kio_slave, XML-GUI, mimetype handling, KDE's service trader, aRts, ...) are developed completely independently of Trolltech and Qt. These technologies depend on Qt the way Gnome applications depend on glib, and no more. Saying that this gives Trolltech control of KDE is exactly like saying the glib developers control Gnome.

    • Something that may surprise you is that Trolltech actually takes ideas from KDE for Qt, not the opposite. Their QSocket code, their QProcess code, their handling of dynamic libraries, their Qt translator: all this was first seen in KDE and then added to Qt with KDE's experience. As a matter of fact, KDE doesn't use the Qt version of those.

    • Qt and KDE have completely different goals. The goal of Qt is to be a good cross-platform library for application development. KDE's goal is to bring a consistent, easy-to-use desktop to Unix.

    • Qt is GPL. So if the Qt developed by Trolltech doesn't please KDE, KDE could fork it. But Trolltech has always helped KDE get better, mainly by providing a very good Qt. There has never been any need for forking. But do not think it could not happen: KDE has 'forked' or recreated many tools because they did not suit KDE's needs. Qt would be no exception.

    • Trolltech has zero influence on KDE development or KDE releases. Remember, they make money by selling non-GPL Qt versions on Windows and Unix platforms. KDE is cool for them because it provides free publicity and serves as a testbed, but they have no interest at all in controlling KDE.

    I hope this explains why Trolltech has never influenced KDE's design. The opposite is, however, true: Trolltech has already been working on stuff that only KDE would use. I challenge you or anybody to cite one single thing that proves that Trolltech is controlling KDE, its architecture or its developers.

    It is interesting, in fact, that the large shared vision is not discussed in KDE. It seems to be implicitly shared and agreed upon by all the developers. We have far fewer flames than in Gnome, for example. Almost everybody agrees on what is the right thing to do.

    pfremy I think your reading of my interview is heavily coloured by your premade opinion. While there are issues in Nautilus as my interview did illustrate your are blowing these issues out of proportion.

    Rereading my post, I confess the end of it is getting a bit too far :-). And please, my first name is Philippe.

    Hadess

    Every file access is done through kio_slaves: in Gnome, every file access is made via gnome-vfs methods (I wrote one for my Rio500, at the time, there are plenty more).

    I was not suggesting that Gnome lacks this feature; I was wondering why it doesn't seem to be available in Nautilus. Why all these extra scripts? Why no access to SMB and NFS shares?

    The UI of every KDE appliation is controlled through a XML file: Yep, we do that as well, libglade is taking care of it.

    The XML-GUI stuff provides more than libglade (last time I checked). It provides default shortcuts for all applications, a common menu organisation, and menu merging for components.

    duplicate features of Gnome into Nautilus: hmm, which ones ?

    Theme management? Okay, this is being addressed. As I already said, I am not criticising the current Nautilus but the one developed by Eazel.

    Opening gzipped archives? A component to display files the way Nautilus does? These features are either duplicated in Gnome libraries, or missing from Gnome libraries. I don't see the point of having a cool feature in one application while other Gnome applications are unable to use it.

    in which cases does Nautilus not use the existing framework ?

    I was wrong about the mimetypes (although I remember having read about Nautilus's mimetype database, and the article raises questions about the complicated mimetype handling of Gnome and Nautilus), but what about Bonobo? Where is the Nautilus Bonobo component? Can Nautilus display a preview of every file that has an associated application in Gnome (the way KDE does it, for example by embedding KOffice viewers into Konqueror)? Or do you need to code every file preview specifically into Nautilus?

    Did you look at the code ?

    No, but I read what the developers said about it. I quote from the article:

    you really need to be two people to maintain Nautilus [...] the new code is also vastly more readable and somewhat better performing than the old code [...] the [sidebar] code was horrible [...] The Icon view is quite integrated with the core Nautilus code at the moment, so it is very hard to do things like this [...] Right now the CVS view has to recreate the whole directory view, which is a pain. It leads to views that don't integrate well with the rest of nautilus (and more often, views that just aren't written) [...] I don't think using the nautilus codebase is such a good idea, as Nautilus has an architecture that is overly complicated for a fileselection dialog [...] The Nautilus views require to much of the Nautilus internal asynchronous machinery, which we don't export (for various reasons) [...] there isn't anyone with concrete plans for fixing the mime system. [...]

    The Nautilus 1.0 architecture seems to be very monolithic, and the code not efficient. Did _you_ look at the code?

    I use file-roller, which handles much more than just tarballs, and there's also support in gnome-vfs for reading from tarballs.

    And you try to convince me that there is no code duplication while citing two tools that perform the same action? Just joking :-), archive handling is no major feature :-) Okay, I was misled on this topic by the Nautilus script page. I still wonder why there are seven scripts there to perform an action that is part of the Gnome core libraries.

    Philippe, do a little research before talking. You won't get far on this site (and in the Free Software World) spitting flames. All you're doing is making people in the know laugh at you. Poor sod.

    This is a diary entry, not an article, hence less research. I am posting what I think, and I am happy to engage in discussion and to correct what is wrong. Looking at what Uraeus said about KDE development, I am certainly not the only one with misinformation. And I am glad to learn; do you suggest I should stay in my ignorance?

    25 Oct 2002 (updated 3 Nov 2002 at 18:10 UTC) »

    [ I rewrote this diary in a gentler manner. The original one was much more aggressive. ]

    Today, I read this interview of the Nautilus maintainers. It was the confirmation of everything I had heard about Nautilus. I can summarize it like this: the project is completely wrong. And no, I am not saying that because I am a KDE supporter. I am weighing real issues.

    The goal of desktop projects such as KDE and Gnome is to create a consistent desktop, with a consistent and predictable UI: familiar features that are accessible, work the same way, and have the same look. This makes the desktop easy to use, because each application doesn't require its own learning curve. This hasn't traditionally been the case in Free Software.

    The way to achieve that is to provide as many features as possible directly in the framework, and not in the application. Applications using the framework then automatically fit the goal and move the project forward.

    KDE has achieved this goal. Most KDE applications are coded as components that can be reused in other applications. Every file access goes through kio_slaves, which give access to files in just about any location: local, remote, photo camera, NFS and SMB shares, inside archives (zip and tgz), ... The UI of every KDE application is controlled through an XML file, which provides defaults for menu organisation, shortcuts, toolbar/menu synchronisation, custom toolbar features, menu merging for plugins, ...

    Although the Gnome framework provides the same services, it seems that they are not used by Gnome applications. Nautilus, which the article calls a "core application of Gnome", is a shining example of that.

    Nautilus was developed by Eazel, and I get the impression that integration into Gnome was not a concern for them. A big part of the current maintainers' job seems to be integrating Nautilus more deeply into Gnome. That task is not even finished today.

    Some Nautilus features seem interesting for wider usage within Gnome, but the maintainers' answer to such requests was always "yes, it would be nice, but the Nautilus architecture doesn't allow us to do that". It is a pity to have a good project whose parts can _not_ be reused in other projects. The idea behind Gnome, and especially Bonobo, is exactly the opposite. Where are the Nautilus Bonobo components?

    Most of the features requested in the interview are features that are present in Konqueror, thanks to the KDE technologies. Gnome has the equivalent technologies and yet cannot use them for this purpose. There is clearly a problem somewhere.

    I was thinking about Ada and Python. Somehow, the languages have opposite philosophies. Ada seemed the more correct one to me, until I discovered Extreme Programming.

    In Ada, you specify as much as possible when programming, so that the compiler understands better what you mean and points out mistakes. The advantage is that the compiler finds many mistakes. The drawback is that you must provide a lot of information to write code, which may seem tedious to some people. Another drawback is that it takes a lot more time to refactor your code, because it is so strictly specified. I don't know if a refactoring tool exists for Ada, but there is certainly a great need for one.

    In Python, you do not specify much. Objects are dynamically typed and you do not declare their types. There is no concept of interface. This makes programming very lightweight, because you can write things quickly. It also makes the language very versatile. You can replace classes, objects and functions dynamically. This versatility makes Python suitable even for other programming paradigms: functional programming, prototype-based programming, and aspect-oriented programming. Of course, there is a drawback. Mistakes are not caught at compile time but at runtime, if they are caught at all. Passing a wrong object is not detected unless a wrong method is called on that object. If such a mistake sits in a rarely used part of the program, it may remain undetected for a long time. Tools like pychecker try to correct this.
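    A minimal sketch of that failure mode (the class and function names are mine, purely illustrative): nothing checks the argument's type until a method is actually called on it.

```python
class Logger:
    def log(self, message):
        print("LOG: %s" % message)

def process(logger, item):
    # No declaration says 'logger' must be a Logger; the mistake
    # below only surfaces when .log() is actually called.
    logger.log("processing %s" % item)

process(Logger(), "a.txt")            # works as intended
try:
    process("not a logger", "b.txt")  # wrong object: fails only here, at runtime
except AttributeError:
    print("caught at runtime only")
```

    If the bad call sat on a rarely exercised branch, nothing would flag it until that branch finally ran; this is exactly the gap that tests (or pychecker) have to cover.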

    [ In the middle of this, C++ looks to me exactly like the wrong compromise. You never get the versatility of Python, yet you still lack the strictness of Ada. You get all the drawbacks of static typing without enough compile-time checking in return. And many things remain unspecified (what is the size of an int?). Java is C++ moving closer to the Python side and fixing some of the C++ problems. But I don't like Java, probably for some emotional reason. To me it is too much like C++ without the advantages of C++. ]

    The reason Ada is so verbose is to capture precisely what the programmer wants to express to solve a particular problem. This expressiveness looks like the theoretically right approach to me. It gives robustness to the program being written. Ada programmers will tell you they don't waste time debugging, because the compiler finds all their mistakes.

    This is very powerful, but I can't stand Ada's verbosity (even C++ is sometimes too much). I am much too impatient and I like things to work immediately. This is why Python gives me great pleasure to work with. I think, I code, it works. The time wasted when a mistake is spotted at runtime that C++ would have caught at compile time is negligible next to the huge gain in productivity and the pleasure I get. And pychecker almost solves this problem.

    Of course, it is all a matter of compromise. C++ is inherited from C. It provides OO with the compromise of staying very similar to C, and very rigid. Ada sacrificed conciseness and ease of writing to get very robust programs. Smalltalk made no compromise on object orientation. Python compromised on speed and on specification to get a very simple and versatile language. So using one language is a matter of accepting some compromises. Different compromises are acceptable in different situations, which makes each language suitable for different things. [ I know this is a trivial conclusion ]

    I know that, but still, when writing Python code, I couldn't help thinking:

    Okay, I am writing code four times faster than in C++, and the versatility of the language allows me to do fancy stuff and to refactor my code very quickly. But those Ada guys probably got it right: their code is more robust. If my application grows too much, Python's lack of contracts (interfaces, type specifications, ...) will probably create subtle bugs.
    That was until I discovered Extreme Programming. XP made my Python programs even more robust than they could be with a language like Ada. How come? The idea with Ada is that the compiler ensures that the code behaves as the programmer intends. Whether that intent is actually the one that solves the problem remains questionable. The Ada paradigm assumes that the programmer knows exactly what he wants to do, and that this is exactly the solution to the problem.

    With XP, the approach is more modest. There is no assumption about the programmer. The only assumption is that the proof of the solution can be expressed as a test, which the programmer is able to write. Programming is very often equivalent to solving a problem for which you already know at least one solution; usually you know what the answers are, they just take a long time to retrieve or compute. But enclosing the solution in a test is possible. So you write tests for all the features you want, and tests that exercise all the possible inputs of your program. Your program works when all your tests pass. This gives you a more robust approach than simply using Ada, because you actually get a proof that your program does what you want it to do. Whether your code does it the way you intended or some other way does not matter, as long as your tests pass.

    Applied to Python, this turned out to be great. There are no more hidden paths in my programs, because my tests check all the possible paths. The versatility and simplicity of Python allow me to write many tests very quickly, some of them for very complicated features. When my requirements change, I can quickly refactor my tests and my code (the Python refactoring tool, Bicycle Repair Man, helps).
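    A minimal sketch of the test-first idea using Python's standard unittest module (the function and its cases are mine, purely illustrative): the tests are written first and state what "correct" means, including the empty and unusual inputs the code might otherwise never exercise.

```python
import unittest

def word_count(text):
    """Count words, treating any run of whitespace as a separator."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Written before the implementation, these tests define
    # correctness, corner cases included.
    def test_simple(self):
        self.assertEqual(word_count("hello world"), 2)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

    def test_messy_whitespace(self):
        self.assertEqual(word_count("  one\t two \n three "), 3)

if __name__ == "__main__":
    unittest.main(exit=False)
```

    When all the cases pass, you get that "37 / 37 successful, 100%" gratification; when a requirement changes, you change the test first and refactor until it is green again.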

    Now, when writing Python code and tests, I am thinking:

    Guido van Rossum and those XP guys got it right.

    The attentive reader will object that Extreme Programming can also be applied to Ada (there is an Ada unit test framework on the XP site), and that XP alone doesn't bring all of Ada's robustness to Python: you still rely on the programmer to test every code path. This is true. But my main concern was how to get robust Python code without trading away Python's intrinsic versatility. I have it, and I am happy. Ada's heavy specification is certainly still very useful, but it no longer sounds as correct to me as it did before. I know a better way to achieve a better goal.
