On the way to Madison we were trying to see what the Wisconsin ski resorts look like. Seems like the toughest runs are rated double-black-bunny.
No more overheating
Pissed off about the CPU overheating, I wrote a simple daemon. It monitors the core temperatures and sets the cpufreq governor to powersave when the temperature of either core goes above 90 degrees Celsius, then sets it back to ondemand when it drops below 75 in both cores (I pulled those numbers out of my ass; they might need tuning). It simply polls the temperature every 2 seconds. There is no configuration or anything; simply change the code and recompile. That’s all the neurons I’m willing to commit to this problem.
Yes, I know performance might suffer since the CPU could go faster, but I don’t care about performance, I care about the machine not turning off randomly. I guess ondemand is actually better power (and heat) wise when everything is normal, but when the heat is high, powersave does come to the rescue.
Here is the code for those that want to do something similar. You might need to modify it heavily. I called it noverheatd.c, and I simply compile the thing with something like “gcc -O2 -Wall noverheatd.c -o noverheatd”, place the resulting binary in /root, and then add “/root/noverheatd &” to /etc/rc.local. The parts that need modification are set_policy, where you need to set the number of CPUs your kernel thinks it has, and the main loop, where you need to set the right coretemp paths for all the coretemps you have. I had to run “sensors-detect” as root from the lm_sensors package to obtain those guys.
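For reference, here is a minimal sketch of the interesting pieces. The coretemp paths, the CPU count, and the exact thresholds below are placeholder assumptions for illustration; your sysfs tree will differ, so check it (sensors-detect helps) before trusting any of this.

```c
#include <stdio.h>

/* Thresholds in millidegrees Celsius, as the coretemp sysfs files report.
 * 90/75 are the numbers from the text; tune to taste. */
#define HOT_MDEG  90000L
#define COOL_MDEG 75000L
#define NCPUS 2  /* however many CPUs your kernel thinks it has */

/* Hypothetical coretemp paths -- look around /sys/class/hwmon for yours. */
const char *coretemps[] = {
    "/sys/class/hwmon/hwmon0/temp1_input",
    "/sys/class/hwmon/hwmon0/temp2_input",
};

/* Read one coretemp file; returns millidegrees, or -1 on error. */
long read_temp(const char *path)
{
    long t = -1;
    FILE *f = fopen(path, "r");
    if (!f)
        return -1;
    if (fscanf(f, "%ld", &t) != 1)
        t = -1;
    fclose(f);
    return t;
}

/* Hottest core right now. */
long max_coretemp(void)
{
    long m = -1;
    for (unsigned i = 0; i < sizeof coretemps / sizeof *coretemps; i++) {
        long t = read_temp(coretemps[i]);
        if (t > m)
            m = t;
    }
    return m;
}

/* The hysteresis: powersave above HOT (some core too hot), back to
 * ondemand below COOL (all cores cool), otherwise leave it alone. */
const char *pick_governor(long max_mdeg, const char *current)
{
    if (max_mdeg > HOT_MDEG)
        return "powersave";
    if (max_mdeg < COOL_MDEG)
        return "ondemand";
    return current;
}

/* Set the governor on every CPU. */
void set_policy(const char *gov)
{
    for (int i = 0; i < NCPUS; i++) {
        char path[128];
        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_governor", i);
        FILE *f = fopen(path, "w");
        if (f) {
            fprintf(f, "%s\n", gov);
            fclose(f);
        }
    }
}
```

main() is then just the obvious loop: gov = pick_governor(max_coretemp(), gov), set_policy(gov) when it changed, sleep(2), repeat forever.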
About a week ago I finally added those new extra exercises I’ve been promising on the webpage (almost 150 new exercises) to my differential equations book. These come with solutions. I did not want to add solutions to the existing exercises; I still feel that it’s better not to have solutions, but I guess having some exercises with solutions does make the students feel better. Plus it seems this was an argument against using my book in at least one department (not enough exercises and no exercises with solutions). So there. Some of the new exercises are interesting; many are just simple plug-and-play exercises to get students going. I’ve numbered all of them 101 and above so as not to change the existing numbers. I suppose the “even/odd” scheme is the common one, but my way has the added advantage that I can have fewer exercises without solutions.
As for the real analysis book: after having taught with Rosenlicht at UCSD (because my book doesn’t have metric spaces), I decided I will use my book at Madison this coming fall. This requires that I write up some metric space material, so I will be adding a Chapter 7 to the book. It will probably not be completely polished by the fall, so I might keep it separate for now and only merge it in once all the bugs are caught after teaching with it. The plan is to first do everything on the real line and then do metric spaces. I found that the metric space material was a bit too abstract for the students when I jumped right in; it might be better to do sequences and continuity with real numbers first. I will skip some other material, such as series, and cover other topics more lightly, due to time constraints. As for other news on the book: it is now (in slightly modified form) the standard book at the University of Pittsburgh.
We have to kill you to save you
This is funny in a very dark sort of way. Let’s do some numbers. Suppose that after some time about 100 people a year die from the extra radiation exposure due to all these scanners. That’s a fairly low number; it would be impossible to even notice it in the cancer statistics. Given the millions of people that get scanned every year and that spend sufficient time near those scanners, that’s a tiny percentage. Now take the last 50 years of plane travel up to and including 2001, and note that the number of people killed in terrorist attacks over that span is a little more than 3000. That’s about 60 a year. Let’s suppose that no terrorist attack ever happens again (yeah, right). We would still be killing almost twice as many people with our security measures.
I wish my granddad were still alive; this (nuclear hygiene) is what he did. It would be nice to ask him whether 100 is a reasonable estimate.
Army of math
Doing math is like being in the military, it seems, at least if you want to be in academia. You don’t really get much of a choice of where you go to work; you sort of get an assignment. Especially in this job market. The process goes like this: you send about 100 applications (if looking for research jobs); about 50 of those won’t consider you anyway, though you don’t know which ones when applying. Out of the other 50 you might get some interviews, and some of those you decide you don’t want to go to. Then you get an offer from a place you didn’t interview at, for a postdoc you didn’t really apply for (actually I got two such offers this year). The whole process takes about half a year, though once you do get an offer you get 2 weeks to decide.
My first job application round, 4 years ago when I went to Illinois, was a bit simpler. I applied to 100 places. 50 of those were tenure-track positions where I had no chance straight out of the PhD. The other 50 were postdocs, out of which I was a reasonable candidate for maybe 5, since no place will hire a postdoc unless they have a group in your area. Out of that I got two offers around the same time, and one sort of informal offer. Then you pick, and go live in the Midwest for 3 years.
So we spent 3 years in Illinois, 1 year in San Diego, next up to 3 years in Wisconsin, then …? Wisconsin will be the 3rd state Maia has lived in, and she’s only going to be 5. So when someone asks her “Where’d you grow up?”, instead of saying “We moved around, my dad was in the military,” she’ll say “We moved around, my dad was a mathematician.”
Well, at least it will make for a more interesting story. I wish the job market would get better so that I could find a permanent job I like. I know it sounds like whining, since I have a guaranteed job for the next 3 years (though it’s also guaranteed I will not have that job in 3 years’ time).
Academia is one of those careers where you’re well into your thirties before you can possibly settle down, and before your salary starts reflecting the level of education you got. Professors at top schools do get paid well, but it takes a long time to get there. As a programmer, I could have been making, 10 years ago, the amount of money I hope to be making 10 years from now as a mathematician.
Then you hear someone say “Oh this or that person couldn’t get a job in the industry so he went to academia” … yeah right … that’s the easy way out.
Hopefully I’ve solved my overheating problems with the Lenovo. First, using the nvidia blob seems to have lowered the GPU temperature, but it wasn’t enough. Turning off the “discrete graphics” and trying to run the thing with the Intel GPU led to scary kernel crashes. I’ve realized that cpufreq does not take CPU temperature into account (that’s kind of dumb, isn’t it). The few posts I found had solutions of the form “CPUs should never overheat” and “reapply thermal paste” … yeah, that’s very useful. My acpi does not output temperature for some reason, though lm_sensors seems to be working, so I guess the cpuspeed daemon won’t work. So it’s either hacking cpuspeed or the simpler solution of just lowering the maximum speed of the CPU. That seems to be working beautifully. I tried very hard to overheat it and it’s still good. I can’t really tell that it’s slower, so I don’t really care.
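Capping the clock is just a write to cpufreq’s scaling_max_freq file. A sketch of that, in the same spirit as noverheatd (the 1.2 GHz figure and the CPU count here are made-up examples, not the values from my machine):

```c
#include <stdio.h>

/* Write a frequency cap (in kHz) to a cpufreq sysfs file.
 * Returns 0 on success, -1 on error. Needs root for the real path. */
int write_freq(const char *path, long khz)
{
    FILE *f = fopen(path, "w");
    if (!f)
        return -1;
    int ok = fprintf(f, "%ld\n", khz) > 0;
    if (fclose(f) != 0)
        ok = 0;
    return ok ? 0 : -1;
}

/* Cap every CPU the kernel knows about to the same maximum. */
void cap_all(int ncpus, long khz)
{
    for (int i = 0; i < ncpus; i++) {
        char path[128];
        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_max_freq", i);
        write_freq(path, khz);
    }
}
```

Something like cap_all(2, 1200000) run as root from rc.local does the trick; the setting does not survive a reboot, which is why it has to go in a boot script.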
Still, I hoped this would have been solved long ago. I sort of assumed it was, actually.
Before I managed to “fix it,” I came up against the “run fsck manually” message, which I filed as a bug against Fedora, only to get a “what did that look like? that shouldn’t happen nowadays” response. Well, I am not about to try to replicate it, as I actually need to … you know … do work. And I don’t want to end up spending the day reinstalling the computer in case the filesystem really does get hosed.
Anyway, I’m not too happy with the Lenovo anymore. There are plenty of problems with this hardware. Given how much everyone was raving about Lenovo, I expected a lot better. Next time (which, given how this hardware seems to be working, is going to be soon) I will buy another one of those Linux-preinstalled laptops. The hardware will suck, but at least I won’t have to buy Windows.
I wish I could buy a laptop and have it for years. That doesn’t seem to be a possibility. First you buy a laptop and must install a bleeding-edge distro to get all the hardware to work. Then, by the time the version of the distro you use is reasonably mature and bug-free (or you can switch to a long-term-supported kind of distro), your hardware breaks down, forcing you to buy a new laptop. The cycle of life!
I wish people built things that are meant to last for more than 1-2 years.
Yikes, Firefox 5 is out and Firefox 4 is EOL. Each time I used Chrome (I had to use Chrome to access the WebCT gradebook at UCSD) it had a different version number. I can’t quite tell the difference between browsing 3 years ago and browsing now, except that Chrome still doesn’t do Flash on 64 bits, and Firefox does it by running the 32-bit version of Flash in a wrapper.
Whatever people are smoking, I want some!
At the same time my laptop (a Lenovo; not too satisfied anymore, not sure if I will buy one again) turns off about once a week, possibly from overheating, but it’s hard to tell.
Hell, I just want something that works! Why do people keep adding new features that break old features, so that no matter which version of software or hardware you use, you always end up with something broken?
I don’t care for the fastest hardware; as far as I can tell, it really isn’t any faster than it was a few years ago anyway. But the old hardware dies and you have to buy new hardware that requires new drivers that are broken in new ways, and before they get fixed, your hardware dies again.
Software seems the same way. What happened to quality engineering? Writing a new feature is 5% of the work; making sure it doesn’t break everything else is the other 95%. Now everyone wants to just skip the 95%.
The best example of good software is TeX and LaTeX. They have not changed in … decades. Yes, a new version of a macro does come out every once in a while, and a distribution will break the installation once in a while for some stupid reason, but the software itself is stable and mature. I can compile a document made 10 or 20 years ago without modification. I don’t have to learn anything new. It works, and it has quirks, but it has the same quirks for everybody, so they are usually well-documented quirks.
Congrats to the GNOME community on the GNOME 3 release
As the title says.
Now this is the first major GNOME release I was not part of (insert sad nostalgic face here). There is also an alarming lack of easter eggs in this release. I think those two facts are highly correlated. So no randomly floating fish to make fun of those who take life way too seriously: Wanda, may you rest in peace!
So, just in case my last post seemed too negative: I do like gnome shell, and I am using it. I am just a grumpy kind of person; I always was. So some of the GNOME 3 experience takes some getting used to, and some of it is annoying, but it is kind of cool. I think it could have been a lot nicer if there were not so many purposely annoying aspects of gnome shell. Another example: the internal microphone needs some tinkering with alsa levels. This was possible with the old GNOME alsamixer, and I probably would have figured out what was going on if I had it. The command-line alsamixer is too difficult for me (I can’t tell the difference between muted and not, and I have no clue how to move the left/right channels separately, which is what was needed to make the mic work).
My gripes are with the assumption that GNOME is running on perfect hardware and that only well-written apps run under it. That will never happen, no matter how much we wish it to. 10 years ago I thought that within a few years the Linux experience would be 100% out of the box on almost any laptop. It’s still not there, and it will never be there; that last 5% will take forever, not a couple of years. Mostly because even the Windows experience is not 100% good out of the box, even though it is preinstalled. I tried turning on Windows before wiping it, and it already had some issues even in the stock experience. I found even the iPad to be buggy (and it made me laugh that Marketa’s Vista laptop has been crashing on shutdown ever since it was new; it started doing that before we even installed anything on it). I just tend to see problems with design that other people ignore for all the Kool-Aid they are drinking (this is especially true with Apple and/or Google).
Anyway, overall, I’m fairly happy with GNOME 3. And I’m sure it will get better in a few years. I’m just hoping that more essential things also get solved.
GNOME 3 experiences
So my ZaReason notebook decided to break (actually it was breaking for a while; the case is really terrible material-wise). I’ve been looking to buy a Linux-preinstalled laptop, but finally saw a sale on a Lenovo U460 and decided to just get it. The machine is very nice and essentially everything works. I installed the newest Fedora alpha and updated to the latest bits, so I have GNOME 3 here.
The experience is not entirely positive. GNOME 3 is a solution in search of a problem. The things that GNOME 3 makes easier weren’t really all that difficult before. It doesn’t really make anything important any easier; basically it improved on one part of the desktop experience that was already “good enough.” There is nothing a user couldn’t have done before that they can do with GNOME 3, but there are things that were possible with GNOME 2 that aren’t with GNOME 3. So this improvement comes at the cost of making lots of more rarely done things much harder. If there are 100 such things, each one affecting only 1% of the users, it is entirely possible that 100% of the users are affected. I am sure that everyone will find a couple of things they need to do (not just want to do, but NEED to do) that are very hard, if not almost impossible, in GNOME 3. For example, for me, linking two computers in a temporary way with an ethernet cable was no longer possible with a GUI, and I could no longer figure out how to change the MAC address the network card uses in the new dialogs. Both were things I needed to do. It doesn’t help if someone tells me I shouldn’t have to do them if, say, the network setup (which is beyond my control) were done better.
A good UI gets out of the way. GNOME 3 more often gets in the way, making things that I need to do harder to find or outright impossible. So while much of gnome shell is nice, there are many places where it makes life harder on purpose, for whatever reason. GNOME 2.0 had the same philosophical problem.
There are many places where the linux desktop is still very deficient in a way that keeps people from using it. GNOME 3 does nothing to improve that in my opinion. It’s all nice in a perfect world, but we do not live in a perfect world where all hardware looks the same, all 3d drivers work, all people work the same way and all necessary software for linux is already written.
Someone should try to fund a study to find out “why are you not using Linux,” or more specifically, “what does Linux not do that you need it to do before you will use it?” Surely it is not fixed workspaces and starting applications from a menu.