Older blog entries for adulau (starting at number 99)

2009-02-22 Society Economy and Metrics

Society, Economy and Metrics : Rethinking Economy In The Society

When an idea is confronted over time, there is a high risk (but that's part of the game) that it will be destroyed. If the idea instead comes out of this confrontation process stronger and stronger, there is a possibility that something new will be created over time. Over the past few months, I have been re-reading André Gorz, especially Écologica and L'immatériel : Connaissance, valeur et capital. I was surprised by his consistency and his ability to circumscribe the important topics of the information society, and one concept keeps recurring throughout his works : the metric, and especially the lack of a universal measure (what he calls the "étalon de mesure") in the information society. Gorz pointed out the problem this poses for capital and for the operation of an economy trying to capitalize on "intangible capital". Reading his works right now is very interesting, especially because he was clearly pointing out the risk of creating economic bubbles by applying the capitalist techniques of tangible assets to the intangible.

Looking back, the idea of a "universal metric" in the information society was already on my mind in the following posts and projects : Wiki Creativity Index, Innovation Metric (especially since the clumsy patent system is the only metric in use) and Creativity Metrics Are Needed. A project like Ohloh already provides a specific answer for quantifying activity in the free software community. We are still far away(?) from a "universal metric", but when it becomes possible to link the respective activity of a human being to an exchangeable "money" (like bitcoin), we could have the possibility of growing without impacting natural resources, and of funding a society with real citizenship.

Tags: metrics creativity positivism freedom economy society

Syndicated 2009-02-22 15:15:20 from AdulauWikiDiary: RecentChanges

2009-01-02 Google Books And Europeana Are Killing Public Domain

More than two years ago, I wrote a blog entry about "Google Books Killing Public Domain", where Google adds an additional clause that pulls public domain works back into the private circle by restricting all public domain works scanned by Google to private use only.

Reading an interview (sorry, in French) of Jean-Noël Jeanneney, Mr Jeanneney is very proud of the Europeana digital library competing with Google Books. It's nice to see competition, but is it really different from Google Books? No, Europeana is also transforming public domain works into proprietary works. Just have a look at Europeana's terms of service (copying section): they make the same mistake.

I have had a lot of arguments about this, especially during a conference held by the BNF about digital libraries. Their argument is about the cost of scanning, or the "added value" of scanning those public domain works. Sorry to say, but this is pure fiction (to be polite ;-): there is nothing like "adding value" in scanning an old public domain book. If you want to create wealth for the benefit of society, please release public domain works as public domain. You'll see unexpected uses (including commercial uses) of those works, and that will benefit everyone, including the libraries doing the scanning.

If you want to be ahead (I'm talking to Europeana or even Google) and help everyone, please leave the public domain works in the public domain.

Tags: google publicdomain archiving copyright europeana

Syndicated 2009-01-02 21:53:53 from AdulauWikiDiary: RecentChanges

24 Dec 2008 (updated 25 Dec 2008 at 10:04 UTC) »

2008-12-24 Oddmuse Wiki Using Git

If you are a frequent reader of my delicious feeds, you have seen my addiction to wikis and git. But I never found a wiki similar to Oddmuse in terms of functionality and dynamism that relies on git. Before Christmas, I wanted to have something working… so I could post this blog entry from git. The process is very simple : oddmuse2git imports the raw pages from Oddmuse into the master branch of a local git repository. I use another branch, local, to make local edits (merging/rebasing it regularly with master while I make edits via HTTP) and to push the updates (a simple git rev-list --reverse against master) back to the Oddmuse wiki. The two scripts (oddmuse2git and git2oddmuse) are available. OK, it's quick-and-dirty(tm), but it works. There is room for improvement, especially in fetching the Oddmuse updates via RSS to avoid fetching all the pages.

http://www.foo.be/blog/img/git-and-oddmuse.png
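The two-branch workflow described above can be sketched as a few git commands. This is a minimal illustration, not the actual scripts: oddmuse2git and git2oddmuse are the real script names from the post, but the wiki fetch/push is simulated here with plain file edits, and the commit messages and page name are made up.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -q -b master
git config user.email "you@example.com"
git config user.name "Example"

# oddmuse2git would fetch the raw wiki pages into master; simulate one page:
echo "imported page" > HomePage
git add HomePage
git commit -q -m "import from Oddmuse"

# Local edits live on a separate branch, rebased regularly on master
# so that remote (HTTP) edits imported into master are picked up:
git checkout -q -b local
echo "local edit" >> HomePage
git add HomePage
git commit -q -m "local edit"
git rebase -q master

# git2oddmuse would then replay, oldest first, the commits master lacks
# (git rev-list --reverse master..local); count them here:
count=$(git rev-list --count master..local)
echo "commits to push: $count"
```

The point of `git rev-list --reverse master..local` is ordering: commits come out oldest first, so each local edit can be pushed to the wiki in the order it was made.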

Tags: wiki oddmuse git

Syndicated 2008-12-24 21:17:19 (Updated 2008-12-25 10:04:36) from AdulauWikiDiary: RecentChanges

21 Dec 2008 (updated 29 Dec 2008 at 17:08 UTC) »

2008-12-21 Scientific Publications and Proving Empirical Results

Reading scientific/academic publications in computer science can be frustrating for various reasons. The most frequent one is the inability to reproduce the results described in a paper, due to the lack of the software and tools used for the empirical analysis. You regularly read references in papers to internal software used for the analysis or survey, but the paper lacks a link to download that software. I have often shared this frustration with my (work and academic) colleagues, but I was always hoping for a more formal paper describing this major issue in scientific publication, especially in computer science.

By sheer luck, I hit a paper called "Empiricism is Not a Matter of Faith", written by Ted Pedersen and published in Computational Linguistics, Volume 34, Issue 3, September 2008. I want to share the conclusion of the article with you :

However, the other path is to accept (and in fact insist) that highly detailed empirical 
studies must be reproducible to be credible, and that it is unreasonable to expect that 
reproducibility to be possible based on the description provided in a publication. Thus, 
releasing software that makes it easy to reproduce and modify experiments should be 
an essential part of the publication process, to the point where we might one day only 
accept for publication articles that are accompanied by working software that allows for 
immediate and reliable reproduction of results. 

Ted Pedersen's paper is clear and concise; I couldn't explain it better. I hope it will become a requirement in any open access publication to include the free software (along with the process) used to carry out the experiments. Science at large can only gain from such disclosure. Open access should better integrate such requirements (e.g. reproducibility of the experiments) to attract more academics from computer science. Just imagine the excellent arxiv.org requiring, at paper submission, a link to the free software and the process used to perform the experiments — that would be great.

Tags: openaccess research education freesoftware
