I was reading literature on the Computer Measurement Group site. One article is by Jing Zi of Keynote Systems: "Statistical analysis of Web page download time measurements suggests that some relatively simple formulae can be derived to project page download times based on Web page composition and TCP connect time for a browser/server pair."
As an active member of advogato & badvogato, I find it interesting to attempt the inverse: a formula that derives Web page composition (including weird 'illegal code') from the page download time, or a formula that derives the TCP connect time for a browser/server pair from all the other known measurements. Any further thoughts on the subject?
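For concreteness, here is a toy sketch of the kind of forward/inverse formula pair I have in mind. This is my own back-of-the-envelope model, not the formula from Zi's article: assume each of a page's N objects costs roughly one TCP connect time of setup, plus raw transfer time for its bytes. The function names and parameters are all hypothetical.

```python
def project_download_time(tcp_connect_s, n_objects, total_bytes, bandwidth_bps):
    """Toy forward model: project page download time (seconds) from
    page composition and a measured TCP connect time."""
    setup_cost = tcp_connect_s * n_objects             # per-object connection overhead
    transfer_cost = (total_bytes * 8) / bandwidth_bps  # raw byte-transfer time
    return setup_cost + transfer_cost

def derive_connect_time(download_s, n_objects, total_bytes, bandwidth_bps):
    """Inverse of the same toy model: recover the TCP connect time
    from the download time and the other known measurements."""
    transfer_cost = (total_bytes * 8) / bandwidth_bps
    return (download_s - transfer_cost) / n_objects
```

Under a model this simple, the inverse is just algebra; the interesting question is whether the statistical scatter in real measurements leaves the inverse well-posed at all.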