13 Sep 2000
(updated 13 Sep 2000 at 20:41 UTC)
The sun was born/ so it shall die/ So only shadows comfort me...
Really tired here at work. Went to see VNV Nation and Apoptygma Berzerk in concert last night. An
(almost religious) experience. The venue (the Limelight) is a huge converted church, with a main performance space
and several outlying bars. The gothic ceilings and original stained glass are impressive. There are also two levels
of catwalk above the stage for a better view of performances. I was thrilled that the concert was there, because it is
one of the few times I get to go to the Limelight without dealing with hip hop and Bridge and Tunnel people.
Both bands put in solid performances. VNV Nation had an impressive backing video, and had
"DVD Play" appear on the screen behind them before they came on. Apoptygma Berzerk had an even larger
screen with stage lights on it. It was amazing. I haven't danced so hard in quite a while, and I finally staggered
home at 2:30 am (up at 8:30 am). Thank goodness for coffee. I have to go to a seminar on XBRL this afternoon,
so I'll probably need a refill to keep from falling asleep.
My sister sent me the Job doll from TrainUpAChild.
I've never owned an action figure with open sores before...
mattbradshaw asks why the difficulty of factoring large numbers has never been proven.
Here's a quick summary. Theoretical computer science defines P as the class of problems that can be solved in
polynomial time (eg, O(n), O(n^2), etc). A more difficult class is NP: problems whose solutions can be verified in
polynomial time, but for which the best known solving algorithms take superpolynomial time (eg, O(2^n), O(n!)).
In order to prove that factoring large numbers is hard, you would have to prove that NO polynomial-time algorithm
exists for doing such an operation. Establishing existence is easy (just exhibit an algorithm), but proving
nonexistence is usually not possible. Although we haven't found a polynomial-time solution yet, there is no good
way to prove one doesn't exist. Of course, factoring is in NP but isn't known to be NP-complete, so finding a
polynomial-time solution wouldn't necessarily prove P=NP, but it would break RSA and get you the acclaim of
theoretical computer scientists everywhere.
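To make the asymmetry concrete, here's a minimal Python sketch (the numbers are just illustrative): finding a factor by trial division takes on the order of sqrt(n) steps, which is exponential in the number of digits of n, while verifying a claimed factorization is a single multiplication.

```python
def trial_division(n):
    """Find the smallest nontrivial factor of n by brute force.
    Roughly O(sqrt(n)) steps: exponential in the bit length of n."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime


def verify(n, p, q):
    """Checking a claimed factorization is one multiplication:
    clearly polynomial time."""
    return p * q == n and 1 < p < n


n = 2021
p = trial_division(n)
q = n // p
print(p, q, verify(n, p, q))  # 43 47 True
```

Solving is slow, checking is instant; that gap is exactly why factoring makes a good trapdoor for cryptography, even though nobody can prove the slow part is unavoidable.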
squiggy: The problem stems from several limits. Search engine indexes can be very big (500 million
pages). Due to limitations on bandwidth and processing, most of them only have a monthly refresh cycle for their
indexes. Unfortunately a lot can change in a month on the web. Another limitation is that you are only really
allowed to request pages from a given web server once every 20-30s (if you want to be polite and not piss them off).
This means you can only make some 2,900-4,300 requests against a given web site a day. This increases the time
it takes to index a large site, and reduces how often it can be reindexed in a given period.
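The requests-per-day figure is just the politeness interval divided into a day:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

for delay in (20, 30):
    requests_per_day = SECONDS_PER_DAY // delay
    print(f"{delay}s between requests -> {requests_per_day} requests/day")
# 20s between requests -> 4320 requests/day
# 30s between requests -> 2880 requests/day
```

So even a perfectly behaved crawler needs weeks to work through a site with a few hundred thousand pages.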
Potentially, you could write some batch-mode processes to verify that the links in the index are live. However, these
must still obey the 20-30s rule, and since you're already fetching pages, most search engines figure you might as well
just wait until the next reindexing cycle. There are some other potential solutions, but I can't really go into them now.
The XBRL symposium was boring. Too much explaining to CPAs why XML might be a Good Thing(tm) for financial reporting.