# Older blog entries for nutella (starting at number 235)

The human brain is a very weird device, and I can't tell how many background processes I have running at any one time. I will spend some time learning the boundaries of a particular problem and trying some basic solutions. Then I'll forget about it for a long time. Then I'll wake up with a hunch that I have to try. This happened this morning and netted me a somewhat obscure victory.

I have a quick script on my TI-89Ti to run linear regression on a pair of lists, spit out some statistics and then plot the data and the regression line. It can be a nice thing to have on a portable device. The one glitch that was keeping me from nirvana was that I couldn't easily limit the regression line so that it was only plotted over the region spanned by the data. In "TI-BASIC" (cough cough) the lists for a data plot have to be global (not program-local) variables, so it makes sense to call the program with pointers to those lists as the arguments:

progName("arg1","arg2")

(putting the quote marks around a variable name turns it into a pointer) and you can then dereference the pointers when you need to in the program code:

LinReg #list1, #list2

where e.g. list1 is the variable name of the first argument defined in the program prototype. You can create a function graph of the resulting regression line with:

Define y1(x)=regeq(x)

but this will give you a line that stretches through +/- infinity (or at least crosses the entire graphics window). So you can constrain the domain of that function using "WITH" limitations, e.g.:

Define y1(x)=regeq(x) | x>0

which will limit the display of the line to the region where the independent variable is greater than zero. I wanted to limit the upper end of the line to the data, and you can calculate that limit as max(#list1), so you *should* be able to use:

Define y1(x)=regeq(x) | x>0 and x<max(#list1)

but the interpreter does not parse that expression before passing it to the Y= Editor to set it up for function graph display. The ugly method I had been using was to save the x limit in a non-local variable and then refer to it for the limit in the definition. This morning I woke up with the idea of packing the graph definition into an expression that could be interpreted prior to its delivery to the Y= Editor. So this ended up being:

expr("Define y1(x)=regeq(x) | x>0 and x<max(" & list1 & ")")
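The same trick - splicing the current data maximum into the definition as text and then evaluating the text - can be sketched in Python. This is only an analogy: the slope 2.0 and intercept 1.0 stand in for regression coefficients and are not from the calculator.

```python
# Build the function definition as a string with the data's maximum baked in,
# then evaluate it -- analogous to expr("Define y1(x)=..." & list1 & ")").
xs = [1.0, 2.5, 4.2]  # stand-in for the x-data list
src = f"lambda x: (2.0 * x + 1.0) if 0 < x < {max(xs)} else None"
y1 = eval(src)        # the limit 4.2 is now a literal inside y1
```

Because max(xs) is interpolated before the evaluation happens, y1 no longer depends on any surviving variable, which is exactly what the expr() wrapper buys on the TI-89.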

Nirvana! The bizarre thing is that I hadn't known that my mind was working on this. I'm now worried that part of my brain's capacity is occupied working on optimising some difficult Lego building exercise from when I was 5. People who say that we only use a fraction of our brain's capacity should probably be forced to qualify the statement to indicate that we're only using a fraction of the brain's capacity "for what we are working on consciously at this moment in time".

Yikes, is it nearly a year since my last post? I have been reading and enjoying Advogato on an almost daily basis but haven't had the time or inclination to write. My personal life has been way more interesting than it has been for a long time, so I'll see how it all works out.

This post was partly provoked by a previous entry by someone talking about R, which I have now been using regularly for over six months. I'm still using MATLAB and Mathematica, but R has all that Free Software goodness, which means I can just send the scripts to my colleagues and let them play. The other person's post was about graphics, and I believe they were using the lattice/trellis form. That's overkill for my needs. However, one glitch I have encountered is with split.screen(): multiple runs of the script with different row and column values cause problems, and I can't find where the persistent value is stored. Things I don't particularly like about R are the sparse documentation with poor examples (better than nothing, though) and the fact that the language is big and doesn't follow the principle of Least Surprise (I am reduced to web searches to discover capabilities). One thing I do like is the Tinn-R editor.

Calculator frenzy!
I mentioned before that I have a Ti89 at work, and it was recently spotted by an HP-phile colleague. To show off, they have lent me their HP 50g on extended loan, and I have to say that I am quite jealous. I love the Forth-like goodness of the RPL option and some of the weirder (but useful) functions. I miss the spreadsheet-like Table Editor of the Ti89, but I should be able to mimic that with vector manipulations on the HP - and there are also some third-party apps for the HP. I'll play with it some more. I'm hoping that someone else has an Nspire CAS for further comparisons, but I hear that one is a bit of a disappointment.

Yes, it has been over a year since my last post. I've been reading Advogato regularly in the interim but haven't felt moved to write anything.

The trigger this time was experiencing some strange behaviour in my Advogato diary. Maybe I'm seeing the result of code designed to stop spam. Basically, most of the hyperlinks in my old diary entries are rewritten when served so as to point back to my diary. The text of the affected diary entries is unchanged: if I open an entry in edit mode the original links are there. This starts about 32 entries ago. It is not caused by bad links (most of them still work), and they are a mix of random .org, .com and even some .gov sites. I don't see anything about this in the FAQ. Feature or bug?

A fair portion of my older entries describes activities intended to keep old boxen functioning well past their compulsory retirement age. Since my last entry I moved (to a house!), which meant a cycle of packing and unpacking, which meant testing to see that everything survived (although the journey was short). Nearly everything works fine, but I've been spurred into shopping for CMOS batteries - maybe I ought to have a regular replacement schedule the way that some do for smoke detectors. One hard drive (a problem-era Western Digital) refuses to spin up, and some boxes required component jiggling to reseat memory and expansion cards. The other failure, unrelated to the move, was the LCD for my IBM T23 laptop. This had been a little flakey (a row of pixels would occasionally turn black), caused by me abusing it, often lifting it up by the open lid, and possibly by my habit of running it without the battery (a state which might have reduced structural integrity - as it did for my TP500). I grabbed it one too many times and the whole panel darkened and died (not just the backlight). Unlike the person in the Slarshdawt thread, I decided to fix it and managed to find a local supplier of refurbished (ex-RMA machine) panels - Alan Computech in Union City. It arrived by UPS. It works. They also gave me a crazy (40%) Mother's Day week discount. So far so good. This also gives me a better appreciation of the fragility of laptops, so, for the meantime at least, I am being a little more gentle with my 8-year old ThinkPad.

Weird! It seems that a reasonably high proportion of the time I visit Central Computers (San Francisco or Santa Clara) Don Marti is there. Clone? Mistaken identity? Eerie coincidence?

Zoiks! Is it just me (Firefox 2.0.0.11) or did someone (<cough> adulau <cough>) forget to close a [bold] tag in their RSS feed?

1 Dec 2007 (updated 1 Dec 2007 at 17:44 UTC) »

While using truly Free software allows totally unrestrained joy when passing on tips and tricks to others, there's still some happiness to be gained when the software is proprietary but the recipients of the tip are people with whom you work. Here in the Real World[TM] I have to deal with (non computer) hardware manufacturers who sell overpriced computers running horrible equipment control software and who refuse to give you the "administrator" password, presumably because they believe you'd immediately copy the kludgey software to a more affordable box. Argh! Mercifully there's also equipment-specific software written by Real Programmers, and they've embedded macro languages that allow you to express yourself and get the job done. Thank you, Oh Sensible Ones! Today I managed to use such a TIMTOWTDI-rich macro language in a strange way and it was clearly The Right Way. It was so beautiful. My co-workers immediately appreciated the extra stability and efficiency, if not the beauty of the code. That was reward enough.

I wanted to demonstrate to a colleague the prevalence of typos out in the interweb and so asked The Google to return hits containing "Gusty Gibbon" (one of my favourites). Alas the big G assumes that this is just a typo and returns many hits for the more boring correct title.

But what if I was working on a project on primate flatulence? How would I find the information I need?

As I mentioned earlier, I have been allowed some time to play with Mathematica at work. I tried to assess it by transliteration of some of those popPK spreadsheets, and in doing so it has grown on me. I do like the ability of the random number generator to produce real numbers over a specified range. For Excel I had been forced to use RANDBETWEEN() (which only generates integers) and scale by a large number - this led to many off-by-epsilon rounding errors. Now I can precalculate the log-normal probabilities of each of the target limits of the PK parameters with:
CDF[LogNormalDistribution[mean, cv/100*mean], Exp[value]]
and then generate a table of random parameter values for the population with:
myList = Table[Log[Quantile[LogNormalDistribution[myMedian, myCV/100*myMedian], Random[Real, {myMinProb, myMaxProb}]]], {populationSize}];
Having to use studlyCaps for variable names, and forever forgetting to use square brackets instead of parentheses and double square brackets instead of single ones, seems a small price to pay. The other major gotcha was not realising that you have to initialise an array (e.g. by setting it to Null) if you want to subsequently add values to it piecemeal (the error messages generated are way too arcane).
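The sampling step above - drawing log-normal values whose quantiles are confined to a probability band - can be sketched in Python using only the standard library. This is an analogy rather than a line-for-line translation: the CV%-to-sigma conversion, the parameter names, and returning values on the natural (rather than log) scale are my assumptions.

```python
import math
import random
from statistics import NormalDist

def sample_pk_parameters(my_median, my_cv, p_min, p_max, population_size, seed=0):
    """Draw log-normal parameter values whose quantiles lie in [p_min, p_max].

    Hypothetical parameterisation: the underlying normal has mean
    log(my_median) and a sigma derived from the coefficient of variation.
    """
    sigma = math.sqrt(math.log(1.0 + (my_cv / 100.0) ** 2))  # CV% -> log-scale sd
    normal = NormalDist(math.log(my_median), sigma)
    rng = random.Random(seed)
    # Inverse-CDF sampling over a restricted probability range truncates the
    # distribution to the desired quantile band, like Quantile[...] fed with
    # Random[Real, {myMinProb, myMaxProb}].
    return [math.exp(normal.inv_cdf(rng.uniform(p_min, p_max)))
            for _ in range(population_size)]
```

Restricting the uniform draw to [p_min, p_max] before applying the quantile function guarantees every sampled parameter falls between the corresponding distribution quantiles, with no rejection step needed.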

I also had to change my approach when switching programs as in Mathematica it is actually easier to plot a function defined symbolically than it is to generate a bunch of x,y values and use them.

Pump up the func
I've mentioned previously that I have been experimenting with some PBPK and enzymology modeling on a Ti89. This was my first exposure to anything that allows you to work digitally with symbolic mathematics. I must have been looking too awestruck, as a colleague has allowed me to play with their installations of Mathematica (not seen it before) and MATLAB (only seen very old versions previously), just to expand my horizons. They are both very cool but have way too much functionality for my needs. At first glance the capabilities of Mathematica appear to be a superset of MATLAB's (although the latter likely has the edge for matrix manipulation). I've started trying to move some of my models into the two packages. MATLAB looks as if it could handily replace some of my more clunky spreadsheets, where I'm using discrete methods, as I can turn the columns into lists and functions. Scanning Mathematica's abilities suggests that I could come up with more general analytical models, and that is very appealing. I see that there are some free alternatives, but I don't know if any of them could cope with the uses I have in mind (I was aware of Octave but have never used it).

