Time for a humble browser

Posted 27 May 2002 at 12:28 UTC by Alleluia

The document object models (DOMs) used by Mozilla and Internet Explorer differ enough that any serious DHTML code needs to be written twice for every project. And Opera, being rigorously standards-based, won't render dynamic layers, which other browsers have handled elegantly for years. Making code cross-browser compliant is currently left in the hands of the DHTML programmer. But what happens if we design a browser which can handle both DOMs? Put cross-browser compliance into the browser.
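
To make the problem concrete, here is a sketch of the kind of double (really triple) coding I mean; the id "menu" is invented for illustration:

    // Moving one element requires a code path per DOM flavour:
    var el;
    if (document.getElementById) {            // W3C DOM: Mozilla, IE5+
        el = document.getElementById("menu");
        el.style.left = "100px";
    } else if (document.all) {                // IE4 proprietary DOM
        el = document.all["menu"];
        el.style.left = "100px";
    } else if (document.layers) {             // Netscape 4 LAYER DOM
        el = document.layers["menu"];
        el.left = 100;                        // NN4 layers take a bare number
    }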

In anticipation of the 1.0 release of Mozilla, coming up in a few short days, I'm now ready for the next phase:

The Humble Browser.

In order to consider the next generation of browsers, we need to look at drawing the current Browser Wars to an elegant close.

It's hard for geeks to be humble, but it's also part of the unwritten ethic of truest hacking. In fact, I believe one of the underlying principles of the whole Open Source / Free Software movement is one of humility. It is humility which enables the profound foresight to see that the small return of money-for-labor is secondary to the larger return of extending the boundaries of an intangible principle like 'freedom.'

It is a faith in the concept that if you give things away, your immediate needs will be taken care of out of the wealth of OTHER people doing the same thing. So far it has worked extraordinarily well, and we have entire Operating Systems, Browsers, Portals, Scripting Languages, and Search Engines built on this 'wealth'.

Not all people who develop free software are aware of the larger implications of their work, which is presently extending the philosophical and actual boundaries of 'freedom' into areas never before accessible by so many. Many coders do it because it simply 'makes sense' or because they've relied on free software to BECOME who they are, and want to return the favor to the free software community.

But there is a deeper principle, of which humility is a component part, being invoked when we submit our code to the GPL (or similar licenses).

Now I would like to articulate that principle, in a request for a Humble Browser. In my estimation, the folks at Opera are the most technically and philosophically capable of producing the first edition of this concept, yet I have every confidence the Mozilla coders could do so as well, although it would take longer. And be freer. :-)

However, Opera's attention to standards may keep them from doing this (although I hope not, because while I'm writing about the Humble Browser, I'm also referring to a browser which would be such a benefit to all DHTML coders--if done well--that its financial security in the Opera marketing model would be assured).

Finally, while aware they might stumble across these words, I do not think the IE crew is permitted to release their technical work as open source, so I cannot in good faith invite them to hear this plea. Although stranger things have happened.

The Humble Browser is a browser designed with the ability to unobtrusively render calls to either the IE DOM or the 'Netscape' DOM, smart enough to figure out elegantly what the developer is trying to do, even though he might be mixing 'LAYER' and 'DIV' tags in his DHTML.
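
For instance, a Humble Browser would have to make sense of a hybrid page like this sketch (ids and values invented for illustration):

    <!-- NN4-style markup and IE-style script side by side -->
    <layer id="menu" left="10" top="10">an NN4 layer</layer>
    <div id="panel" style="position: absolute; left: 10px; top: 60px;">a DIV</div>
    <script>
      document.layers["menu"].left = 50;            // Netscape DOM call
      document.all["panel"].style.left = "50px";    // IE DOM call
    </script>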

And while it can render anything that meets the strictest standards, as Opera does better than any other browser, it would also be able to render code which is well outside the standards, thus incorporating, for example, some of the amazing innovations that have come out of IE development.

I call it Humble, because most Free Software/Open Source folks I know won't even consider this notion. Not separating technology from politics, they are too proud to recognize that IE has made some useful extensions to the DOM.

I develop in DHTML quite a bit these days, and I really want to target the Opera browser, but I cannot, because that browser simply will not render dynamic layers, due to its strict adherence to standards. So I work to make my pages pleasing in Opera, but they are not very functional, since I use dynamic layers freely BECAUSE THEY'RE A BRILLIANT IDEA. I'm sure Opera programmers could technically make dynamic layers available, but they don't, because they're holding to the political position regarding 'standards.'

I understand the fragile connection between politics and technology: they're not mutually exclusive domains, but they are too often mingled much too closely, and the current Browser Wars are a perfect example.

What is technically possible should never be limited by politics, unless we're talking about actively modifying Nature, as in genetic manipulation.

And while I really like the idea of standards, I also shudder to think of where we'll be in fifty years, if Opera's position becomes dominant. Innovation will dry up because everything is 'standards-based' and the standards are ALWAYS a couple of years behind innovations. Having worked in a giant bureaucracy, and also for a small company, I prefer the latter because the standards are so much freer. Easier to navigate. In fifty years, our 'standards' will be a burden, if we do not infuse them with the freedom to grow radically as they are doing presently.

I do not want a standards-based Internet which tolerates no new extensions, and for THIS reason, I am an Open Source advocate who is really excited about IE's innovations, frustrated by Opera's simple inability to allow a drag-n-drop interface within a browser.

So, in summary, a humble browser is one which is able to incorporate new extensions to the DOM without getting into a war over it. Surely this can be done.

Please understand: the idea is not to create sloppy JavaScript programmers who intermingle DOM calls, but to bring an elegant conclusion to the Browser Wars, which have left DHTML coders writing two versions of every program for every release. There will be a period in which developers write sloppy cross-browser code, but as time goes by the most elegant DOM will coalesce--and also continually innovate--and the greater victory, allowing DHTML programmers to do their work in less time, will be accomplished.

All it takes is a little humility, and freedom moves forward elegantly.

Alleluia is a PHP, DHTML, VB, and a little bit of whatnot programmer who works for a small data capture company in the American Midwest. He's been hanging around the Internet for ten years now, since it was surfed with Veronica, Gopher, and Lynx, and humility was a virtue. He's really glad to see the EFF competently tackling such huge issues as copyright in the U.S. Constitution.


Standards are a Very Good Thing (tm), posted 27 May 2002 at 13:04 UTC by Denny » (Journeyer)

> What is technically possible should never be limited by
> politics, unless we're talking about actively modifying
> Nature, as in genetic manipulation

If you can allow one exception, then in my opinion you have to allow others. It's technically possible to kill everyone on the planet several times over... I'm quite glad that politics has so far prevented this.

Personally I'd rather see legitimate scientific investigation into genetic manipulation than see Opera and Mozilla drop their adherence to standards :) You may feel my priorities are warped, but I'm not too keen on your suggestion either.

It is my opinion that without an almost total adherence to standards, the Internet would fragment. Interestingly, all of the companies that have consistently attempted to bring about this fragmentation (I'm thinking of Microsoft, AOL, and Netscape here; there may be others) have actually had to bring their products into closer alignment with the standards over the last few years, because the inertia of the Internet is such that they cannot fragment it - their attempts to do so have only lost them market-share or mind-share, and they've been forced to reconsider. This makes me happy.

Incidentally, I think Mozilla renders pages to the letter of the standards as well, at least in the area I work in (XHTML/CSS generated with Perl), so I presume your objections are only to their approach to DHTML?

There are standards for DHTML as well - I think your energies would be better spent persuading vendors to implement the standards, and persuading standards bodies to add good new ideas to the next revision of those standards, rather than trying to persuade the open source community that it would be a good idea to attach less importance to the crucial issue of standards adherence.

The easiest and possibly the most important way to persuade vendors to implement the standards is to only use standards compliant code when writing websites... if a client doesn't understand why they can't have the latest whiz-bang non-standards-compliant mis-feature on their site, I think it would be more professional to try and explain to them the implications of using non-standards compliant code than it would be to just implement the mis-feature they think they want.

All the above is IMHO of course... but I believe in it quite strongly, so perhaps the 'H' isn't fully apparent - or even appropriate? If you really believe in what you believe, should you humbly stand by while people tell you that you're wrong? I don't think so...

Regards,
Denny

Yes, Standards are good, but so is Innovation, posted 27 May 2002 at 14:33 UTC by Alleluia » (Journeyer)

>I think your energies would be better spent persuading vendors to
>implement the standards, and persuading standards bodies to add
>good new ideas to the next revision of those standards, rather
>than trying to persuade the open source community that it would
>be a good idea to attach less importance to the crucial issue of
>standards adherence.

I agree. I stress that "persuading standards bodies to add good new ideas to the next revision of those standards" is becoming more difficult over time, and here is how to respond to that increasing difficulty.

In no way am I trying to erode the value of Standards, as I understand the principled reasons underlying them. We need standards. My concern is deeper. At the present time, the Internet is very young, and foundational standards are being created as we write. This time period will soon end, and then we will all be faced with the standards we created "back then" (which is currently now).

And here is where it becomes apparent that we always need a very robust periphery to the standards. We need to build into the standards a limit to their strength. Otherwise, the standards will become oppressive tools, used in the hands of multinational corporations to do exactly what they're trying to do now by extending standards. We have to stay ahead of them, gracefully and elegantly limiting their power before they take control of "independent" standards bodies.

Perhaps you envision that standards bodies cannot be influenced by large corporations.

What I'm talking about is the maturing of the Internet into a framework where large corporations will do all they can to control the information infrastructure, and this means they will increasingly gain control over the standards bodies until the "standards" are dictated by them for their own internally defined interests.

Then standards will be very difficult to 'extend.' MCSEs will be mandatory for everyone on a standards body... you get the picture.

I am clearly proposing a browser which implements the full standards, exactly as Mozilla already does; I clearly assume we need them. What we also need is room for innovation in a format which is never "controlled" by the standards bodies. Thus the need for a humble browser: one which somehow draws the current Browser War to a conclusion, elegantly.

Thus, I am advocating a browser which is designed to adhere to standards AND to elegantly incorporate innovations, so that Standards bodies maintain their proper position... without becoming oppressive.

Thank you.

So work to make it a standard, posted 27 May 2002 at 16:44 UTC by alan » (Master)

And while you are at it you can help figure out how to make dynamic layers accessible to blind users 8)

Alan

Adhere to standards OR extend them, posted 27 May 2002 at 20:27 UTC by Denny » (Journeyer)

> I am advocating a browser which is designed to adhere
> to standards AND to elegantly incorporate innovations

I think you can either adhere to the standards, OR you can allow people to extend them. Once someone has extended it, it's no longer a standard - it just bears an unfortunate resemblance to one.

Calling that act of extension by a name with a positive semantic value ("incorporating innovation") does not convince me that it is a good idea. I offer this alternative phrasing for the concept you seem to be advocating:
"I want a browser which will implement the standards, but which will then allow people to pollute them if they want."

Microsoft particularly are notorious for the 'embrace and extend' approach to killing open protocols - it would seem to me that a highly effective way to encourage that behaviour would be to allow extension of the standards at the application level.

Regards,
Denny

Humility? Innovation? No..., posted 28 May 2002 at 02:38 UTC by tk » (Observer)

What I have against new-fangled things such as JavaScript, VBScript, DHTML, etc. is software bloat.

RMS may not care about freeing people from software bloat -- many of the GNU utilities are stuffed with features which nobody will ever use -- but I do. Many people may not worry that Mozilla takes up several megabytes and that its startup time is on the order of minutes, but I do.

That's why I'm now using the w3m browser instead of one of the major browsers: even though it doesn't do cute Java animations or run fancy JavaScript code, it's small, and it's fast.

To stifle or not to stifle, that is the question, posted 28 May 2002 at 06:11 UTC by Ilan » (Master)

"Innovation" should be embraced when it truly adds to the user experience and stifled when it is used as a synonym for the word "feature".

cool browser, posted 28 May 2002 at 15:18 UTC by Malx » (Journeyer)

Try Links 2.0 instead of w3m. It can run both in text mode and in graphics mode. Also, it supports basic JavaScript (useful for the JS form submissions that are widely used).

If you are looking for cross-browser DHTML, try DynAPI-II. If you are looking for a browser with both IE and Mozilla support, see NeoPlanet for MS Windows - it can switch between the Mozilla and IE engines.

I would also agree with supporting standards. And... it could be possible to make a DHTML library for IE which patches it to be more DOM compliant. Or... you could choose to create a mod_ for Apache which translates the standard DOM into a custom browser's DOM, such as IE's or NN4's.
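
A sketch of how such a patch library might begin - giving an old IE a W3C-style getElementById built on its own document.all (hypothetical, and untested against a real IE4):

    // Patch: emulate the W3C DOM call on IE's proprietary DOM.
    if (!document.getElementById && document.all) {
        document.getElementById = function (id) {
            return document.all[id];
        };
    }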

Tips for Testing Websites; Standardization of Real-World Experience, posted 28 May 2002 at 23:55 UTC by goingware » (Master)

I realize this is only peripherally related to this discussion, but it seems somewhat on-topic.

After I got a small consulting job from a client whose web server application fell over from the modest user load they experienced when they put their site live on the Internet, I thought I would write up some tips for testing web sites.

Any new HTML pages I write now are validated with the W3C HTML validator and I'm gradually bringing my old pages into compliance too. I recommend it. They also provide a validator for Cascading Style Sheets.

Yes, I'm sure there are great things you can do with the web that either violate or just aren't covered by the standards. But aside from proving the validity of new standards proposals, I think it is a bad idea for most people to make use of them. There will be far more progress made if the desired functionality can be covered by new standards, because then you can be assured of interoperability - and the ability for the interoperable programs to include Free Software.

Note that the IETF prefers (if not requires) new protocols to result from studies of actual working products. Yet it is compliance with the final published protocols that makes our Internet work today. It is the use of extensions to protocols, formats, and programming languages in production by end users that makes Microsoft products such a nightmare.

tidy, posted 29 May 2002 at 12:42 UTC by dalinian » (Journeyer)

The W3C HTML validator is one thing, but they also offer HTML Tidy. Simple stuff, and it's free software too. No need to bother using a web based validator, when you can validate your stuff on your own machine.

Emacs validator didn't work for me, posted 29 May 2002 at 13:37 UTC by goingware » (Master)

Yes, I knew about HTML tidy, but I prefer making my documents valid by manually editing them. I think one gets better results, although it may be more time consuming. Can tidy be used for simple validation?

One can do validation from within emacs using psgml. The command is C-c C-v, sgml-validate. This works fine when I use it to validate DocBook XML documents, but when I tried it on HTML, the validator complained of an obscure problem with HTML's SGML declaration.

The error I get is:

nsgmls:/usr/share/sgml/declaration/html.decl:23:32:W: characters in the document character set with numbers exceeding 65535 not supported

The line it's complaining about is

                 57344   1056768 57344

which is in a section marked "CHARSET".

It would probably be quickest of all for me (and save the W3C some bandwidth) if I could just do my HTML validation with emacs.

Validating dynamically generated HTML, posted 29 May 2002 at 14:21 UTC by goingware » (Master)

It just occurred to me that another reason for preferring the W3C validator over tidy is that it can validate by URI, and so fetch the HTML via HTTP. That way you can validate programmatically generated HTML - CGIs, PHP, Java servlets, and the like.

The form submission for the URI validator uses the GET method so you can bookmark a validator for a particular page, or put a link to a validation in another page. Try validating this discussion, or if you prefer try validating Slashdot. Contrast those with validating www.debian.org.
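
For example, a bookmarkable validation of a page looks something like this (the check endpoint and uri parameter are as I recall them; check the validator's own form for the exact names):

    http://validator.w3.org/check?uri=http%3A%2F%2Fwww.example.com%2F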

It is handy, when you're developing a website, to write an HTML page that links to validations for each of the pages on the site, as I do here for the LinuxQuality site.

If you don't want to hit the Internet to validate, you can download and build the source code for both the HTML and CSS validators. This would be particularly useful for running a company validator on an intranet, behind a firewall, so you don't have to send confidential pages over the Internet for validation. It would also be more responsive.

Also I've found that some company firewalls disable the file upload feature of HTML forms (that is used for validating a file from your local machine), presumably to prevent h4x0rs from exploiting browser bugs to steal files off your desktop. Having a local validator avoids problems with firewalls.

They're testing an online XML validator that is based on Xerces.

Last night I posted the URL to Why You Should Use Encryption in a Slashdot comment, and someone wrote in to say the lower half of the page was just one big link in Konqueror. It looked fine to me in IE6 and Mozilla, but I checked it with the validator and found a couple dozen errors, which I quickly fixed. It turned out the page really didn't render correctly in either of my browsers after all, but the bug was much more subtle - both omitted a few words in the middle of a paragraph.

I am slowly bringing my sites into standards compliance. Anything new I write is validated, and periodically I update older pages so they will validate.

More HTML validation (kinda OT, but eh), posted 29 May 2002 at 15:14 UTC by piman » (Journeyer)

The WDG also has an HTML validator which does HTML1-4 and various XHTML versions (1.0 and 1.1 at least, probably Basic too). I find it more useful than the W3C one because it can recursively validate pages on a site (up to 100 at a time, iirc).

Validation, extension, posted 29 May 2002 at 20:26 UTC by jwb » (Journeyer)

I think the premise of this article is a little bogus. There is a very large subset of CSS, HTML, XML, and DOM that works in both Mozilla and IE. If you stay within that subset, you can target most users and at the same time adhere to a published standard. As long as you adhere to the standard, you can point out to whining users of second-rate browsers that the standard exists and is open, and their browser is free to implement it.

Regarding validation, the best way to ensure that your programmatically-generated markup is valid is to use the DOM. Build a DOM tree in your program and serialize it to XML. You can't really produce bad markup that way. Also, you gain a large amount of flexibility compared to print("<html>...").
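
A sketch of the idea using Mozilla's DOM and its Mozilla-specific XMLSerializer (other environments have equivalent APIs); note that escaping of special characters comes for free:

    var doc  = document.implementation.createDocument("", "html", null);
    var body = doc.createElement("body");
    var p    = doc.createElement("p");
    p.appendChild(doc.createTextNode("5 < 6 & true"));  // escaped automatically
    body.appendChild(p);
    doc.documentElement.appendChild(body);
    var markup = new XMLSerializer().serializeToString(doc);
    // markup: <html><body><p>5 &lt; 6 &amp; true</p></body></html>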

Tidy and Validators..., posted 29 May 2002 at 20:34 UTC by link » (Apprentice)

Dave Raggett's most excellent Tidy -- originally hosted by the W3C, but now moved to SourceForge -- can do a lot of things, but it is not a Validator. It's a singularly useful tool for cleaning up your HTML, and it will give you warnings about some things its developers consider a bad idea, but it doesn't "Validate". It's analogous to "lint", not "gcc -Wall".

:-)

The W3C HTML Validator, OTOH, does Validate. It uses an SGML Parser -- the Open Source OpenSP SGML Parser -- to validate its input. It combines your HTML with the proper SGML Declaration and External Subset ("DTD") and formally validates the assertion that they are internally consistent and conform to the SGML rules.

Tidy is a complementary tool to a Validator; not a replacement!

Other formal Validators that I'd recommend are the WDG HTML Validator -- built on much the same principle as the W3C Validator -- and the many excellent tools -- of which the HTML Validator is but one -- provided by Nick Kew and WebÞing Ltd. at the Site Valet.

Standards vs. Implementations, posted 29 May 2002 at 21:27 UTC by link » (Apprentice)

> And while I really like the idea of standards, I also shudder to think of
> where we'll be in fifty years, if Opera's position becomes dominant.
> Innovation will dry up because everything is 'standards-based' and the
> standards are ALWAYS a couple of years behind innovations.

Your premise is, I fear, faulty, Alleluia.

The W3C published CSS1 in 1996 and CSS2 in 1998. To date, there exists not a single complete implementation of either Recommendation. Not one!

MSIE:mac 5.0 comes very, very close to implementing all of CSS1, and Mozilla 1.0 is, IIRC, not far behind (it may even have reached 100% by now), but support for CSS2 is at best spotty. Opera and Konqueror are otherwise very fine browsers, but they lagged significantly behind the two big ones in this area last I looked.

Then try looking at the W3C DOM Level 1 and DOM Level 2 Recommendations. How much of these Recommendations do you think is implemented by the various browsers? How many bugs do you think they have?

No browser I'm aware of even supports the full HTML Recommendation (HTML 4.0, XHTML 1.0, or XHTML 1.1; take your pick), for crying out loud! Don't even think about looking at support for MathML or SVG. SMIL? Nope. Not even decent support for PNG, despite UNISYS' valiant efforts to popularize it!

Now XHTML 2.0, CSS3, and DOM Level 3 are in production. How long do you think it will be until the various browsers support these standards? How about SVG 2.0? How about the WCAG, ATAG, and UAAG from the WAI? And if you want to see real lag, try looking at any one of the 2 bazillion new XML-related specs released over the last year and see how many tools -- not just browsers, any implementation! -- implement those.

The problem isn't that Standards lag behind Implementations. It never was. The problem is and always was that Implementations lag behind the Standards or willfully disregard them in favour of their own proprietary variants.

And the reason they do is that you, the web content developers, not only let them get away with it but even encourage it!

link is who he is and does what he does and does not feel a great need to tell people about it all the time. If you're terminally curious, Google is -->that way...

Mozilla and compliance, posted 29 May 2002 at 23:50 UTC by piman » (Journeyer)

link, I'm pretty sure that Mozilla supports all of HTML 4.0 at this point (I seem to recall that <link> was the last tag left to implement, and they finished that a while ago). By extension it supports all of XHTML 1.0/1.1, at least nominally; namespaces and such may confuse it. It also, IIRC, renders some invalid XHTML content (at least, it used to render upper-case tags). Not sure what your beef with PNG is... Mozilla supports the full 8-bit alpha channel, any color depth, any palette, and has an MNG renderer. IE lacks the alpha channel, but supports a 1-bit alpha layer in paletted images. I also think Mozilla implements all of CSS1, though there may be a few bugs; I'm less sure of that than of the HTML 4 support. I don't know where DOM stands because I don't use it, but my guess is DOM1 is done or close to it.

As for the XML specs, Mozilla does decently. There's at least some SVG and MathML rendering in it (although I haven't tried it), and things like XHTML 2.0 are going to come "free" by supporting XBase/XLink/etc.

Other things, like the UAAG, are heavily based in current practice already. Those guidelines are also targeted at content developers as well as browser developers; it's as much up to them. One of the things the interface guidelines recommend, by the way, is use of the <acronym> or <abbr> tag. Since Advogato doesn't allow them, you could've at least expanded the acronyms you used, or linked to their specifications for people who don't track web standards very much.
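
For example, something like this (markup Advogato would strip):

    <abbr title="User Agent Accessibility Guidelines">UAAG</abbr>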

Aside from that, you are correct in some ways; Opera and Konqueror lag behind IE5 Mac and Mozilla, but I think they're about equivalent to IE5/6 Win. The situation can be improved a lot, but it is not nearly as awful as you make it out to be. It's only really bad if you're trying to work with DHTML. In general, it's really easy to use CSS1, CSS2, and HTML4/XHTML1 to make good pages that work in all browsers but NS4 (and work fine in NS4 if CSS is turned off).

I think we're missing the point, posted 1 Jun 2002 at 12:42 UTC by egnor » (Journeyer)

There is a question of whether standards lead innovation or innovation leads standards, and there is a question about the value of extensibility in standards, but those questions are moot here because the whole premise of the original complaint is idiotically misinformed.

Alleluia's massively ignorant mistake here is that nearly every feature he wants is, in fact, present in the standards for Web interoperability, and implemented by browsers such as Mozilla (and, increasingly, even IE). Opera may not have caught up yet, but if so the problem is that Opera is insufficiently standards-compliant, not that it is excessively standards-compliant.

Use the W3C DOM. Use W3C DHTML (or "dynamic layers", if you want to call it that). There. It works, it's everything you want, and no innovation has been squelched. It is clear from your post that you know almost nothing about these standards you are deriding. Learn them and use them first (they're well documented -- better than IE's proprietary versions thereof) and then revise your whining. Do you really think LAYER is better than CSS positioning? That's ridiculous.
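
For the record, a minimal sketch of a "dynamic layer" done the standard way (the id is invented):

    <div id="float" style="position: absolute; left: 0; top: 0;">a dynamic layer</div>
    <script>
      var layer = document.getElementById("float");
      layer.style.left = "120px";   // one code path for every W3C-DOM browser
      layer.style.top  = "40px";
    </script>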

There's a lot to complain about with the standards process, and there are a lot of interesting discussions to be had, but this is not one of them.

Denny and jwb have it mostly right. What they're missing is that there really is nearly nothing that you can do in, say, IE, that you cannot also do in a standard way. (The exceptions have to do with Windows-specific functionality, like ActiveX, which we could not implement without a full Windows emulation even if we wanted to.)

W3C anti-advocacy, posted 1 Jun 2002 at 19:13 UTC by Malx » (Journeyer)

> Do you really think LAYER is better than CSS positioning? That's ridiculous.

I do. Because absolutely positioned content is not actually part of the document (it is moved out of the normal document flow). Ideologically, it is a separate entity.
Also, you can't really compare LAYER and CSSP without comparing the DOMs.
LAYER in NN4 creates a separate document. You can think of it as a new window, and it is comparable only with <IFRAME>.
You see, there is not much difference between using <IFRAME> and <LAYER> :)
OK, you could compare it with CSSP plus scrolling, but you must type more characters to create the same object, and it will be in the same document, not in a new one. You can't load a new HTML file into it (layer.aa.src="new.html").
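
To see them side by side (a sketch; "aa" is just an id, and you'd use one element or the other):

    <layer id="aa" src="old.html"></layer>       <!-- NN4 only -->
    <iframe id="aa" src="old.html"></iframe>     <!-- HTML 4 equivalent -->

    document.layers["aa"].src = "new.html";           // NN4 DOM
    document.getElementById("aa").src = "new.html";   // W3C DOM, on the iframe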

> there really is nearly nothing that you can do in, say, IE, that you cannot also do in a standard way.

You are wrong again. You can't do filters. You can't do transition effects (where one HTML page is replaced by another with some effect, like in a PowerPoint presentation - diagonally, or with moving strips, or random-dot replacement).
There is also "data binding" - where you create an HTML page with a <table> which is directly bound to some data storage. When the DB changes, the page also changes, without reloading!!!

So do not blindly claim the standard is everything.

Still, I would not recommend using MS extensions for web sites - only for intranets and custom applications.

Transitional Effects? WTF??, posted 5 Jun 2002 at 10:38 UTC by Denny » (Journeyer)

> You can't do transition effects (where one HTML page is replaced by
> another with some effect, like in a PowerPoint presentation - diagonally,
> or with moving strips, or random-dot replacement).

I suspect this falls into the "doesn't need dignifying with a response" category, but WTF??? We're talking about webpages, not marketing presentations or children's TV programmes!!

The web is (or should be?) about CONTENT, not special effects. If you want to see pretty pictures move on your screen, go watch a cartoon...

*shaking head in disbelief*
Denny

This browser is not "humble" at all..., posted 5 Jun 2002 at 21:20 UTC by gerv » (Master)

> The document object models (DOMs) used by Mozilla and Internet Explorer
> differ enough that any serious DHTML code needs to be written twice for
> every project.

I want to stop you right there. IE's support for the W3C DOM in versions 5 and 6 is just about good enough that you can write W3C-compliant DOM manipulations and expect them to work cross-browser. If that's not the case, there are several cross-browser DHTML libraries that you can use, where someone else has done all the hard work for you. If that's not what you want, you can even emulate IE's proprietary DOM in Gecko, because Gecko is powerful enough to let you do that.
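
The emulation trick looks roughly like this - a partial sketch using Gecko's non-standard __defineGetter__ (IE's document.all does more than this):

    // Fake IE's document.all in Gecko when it is missing.
    if (!document.all && document.getElementsByTagName) {
        document.__defineGetter__("all", function () {
            return document.getElementsByTagName("*");
        });
    }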

> The Humble Browser is a browser designed with the ability to unobtrusively
> render calls to either the IE DOM or the 'Netscape' DOM, smart enough to
> figure out elegantly what the developer is trying to do, even though he
> might be mixing 'LAYER' and 'DIV' tags in his DHTML.

You say this as if it's easy to "figure out what the developer is trying to do". Standards are the things that mean you don't have to "figure". What if IE figures one thing and Netscape figures another? You are right back where you started, with two versions.

> What we also need is room for innovation in a format which is never
> "controlled" by the standards bodies. Thus the need for a humble browser;

So "The W3C standards are pants; I'll just invent my own way of doing things" is a humble attitude?

Gecko and IE still don't implement all of DOM2, CSS2 or CSS3. Let's do those, and then see what's still missing.

Gerv
