Quick instructions for bar plots in R
Need to make bar plots quickly in R? It's easy!
What kind of object is this R variable, anyway?
Numerous ideas here, with the first being:
In other news, I've decided to just start posting all the questions I end up looking up, since I often have to find them again, and I believe in "upvoting" them for posterity.
Run a sequence of SQL statements in a script and spit the output to a file:
shell> mysql db_name < script.sql > output.tab
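The output is plain tab-separated text (with the column names on the first line), so it's easy to slice afterward with standard tools. A quick sketch, using a stand-in for the output file (the file name and columns here are hypothetical):

```shell
# Stand-in for the output.tab produced by the mysql redirect above
# (the columns are made up for illustration).
printf 'id\tname\tscore\n1\talice\t90\n2\tbob\t85\n' > output.tab

# Grab just the second column, skipping the header row.
tail -n +2 output.tab | cut -f2
```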
virtualbox -- assigning multiple cores to a guest without hardware virtualization?
Q: Hey pedro, I have an old server with multiple cores, but without AMD-v or VT-x. Can I give a guest machine access to more than one of those CPU cores?
A: No. You need AMD-v or VT-x to assign more than one core to a guest machine.
looking for volunteers to donate their web data to SCIENCE!
Do you know what network sniffers like Ethereal or Wireshark are? Do you like SCIENCE? Would you be willing to surf the web UNENCRYPTED and UNCOMPRESSED for about an hour (or so) and send me, a trustworthy individual, captures of the data you downloaded? If so, please read on.
If you don't know what Wireshark is or how to get it, if you're not really sure what I'm talking about or how it would work technically, or if you DO understand perfectly well thankyouverymuch and are horrified or unwilling because of security or privacy concerns: thank you, but you can stop reading now.
Still reading? Cool!
I'm working on a research project involving the properties of web data in the real world. It would be super convenient if I could just browse the web all day and use my own data for my experiments... unfortunately, I can't assume that the way I use the web is typical, so I need other people out in the world to surf the web and send me the logs of the data they viewed. I will perform analysis on the data, which *probably* won't require me to look at the data. (I can tell you more about my research if you are interested, but I don't want to post about it here.)
Does this sound like a privacy risk? It is. But it's not as bad as it sounds. (If I thought I was putting you at risk, I wouldn't ask.) I need the full contents of the data you view through the browser, but I don't need or want you to do anything sensitive like online banking, email, Facebook, or researching that mole on your arm on WebMD.
Chances are, there's a lot of other stuff you do online that isn't normally encrypted and doesn't fall into the category of "highly personal" (unless you think everything you do is highly personal). That's exactly the kind of data I want. Still, there is a risk in saving your web data and sending it to me, since there could always be some unexpectedly private information in there. This SHOULD make you think twice... so no hard feelings if you just don't want to do it.
If you're still willing, I promise you that I will make every effort to keep the contents of your data private. In *most* cases I should not even need to examine the contents of the data, no one else will be given copies of it, and I will destroy it after my experiments. My analysis is mostly statistical and numerical in nature -- sizes of files, how long they would take to transfer over a network, etc. NO names of websites, IP addresses, content or anything like that will be revealed in my papers or graphs. There should be no way to personally identify that you contributed to my research.
Still willing to help?
Here's the process I would need you to go through. These instructions are for Firefox only:
1. Quit all open Firefox windows and restart the browser. Open your preferences, and under Advanced | Network, clear your "web content cache".
2. Next, we need to disable encryption, because I can't analyze the properties of data if it is encrypted. (Don't worry, we'll arrange a way for you to transfer me the data in an encrypted fashion.)
In Preferences | Advanced, under Encryption, *uncheck* "Use SSL 3.0" and "Use TLS 1.0". Yes, again, I am asking you to DISABLE YOUR ENCRYPTION. This is a necessary step, but I will personally remind you to re-enable encryption when you send me your data.
3. Next, I need you to disable compression. Type "about:config" into the Firefox address bar and hit Enter. It will (probably) print a warning about how this could ruin your browser, but what we are doing is safe and reversible. In the search field, type "encoding", which will limit the configuration lines to those that match.
Right-click on the line for the preference "network.http.accept-encoding". Look for an option called "Reset", which should be grayed out if you have not changed the setting (if it is NOT grayed out, you have changed it). If the preference does not exist at all, you are using an old version of Firefox. Either way, write down the current value of the preference (the default is probably "gzip, deflate").
Double-click on the entry, erase the text, and confirm the change.
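For what it's worth, steps 2 and 3 can also be done by editing a user.js file in your Firefox profile. A sketch, with two caveats: the pref names are for Firefox versions of this era (verify them in about:config first), and the profile path varies from machine to machine:

```shell
# Sketch: write the prefs to a local user.js, then copy it into your
# Firefox profile directory (path varies, e.g. ~/.mozilla/firefox/xxxx.default/).
# Pref names are era-specific assumptions -- check them in about:config.
cat > user.js <<'EOF'
user_pref("security.enable_ssl3", false);       // step 2: disable SSL 3.0
user_pref("security.enable_tls", false);        // step 2: disable TLS 1.0
user_pref("network.http.accept-encoding", "");  // step 3: disable compression
EOF
cat user.js
```

To undo it later (step 7), just delete those lines from user.js and restart Firefox.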
4. Now, start Wireshark (or Ethereal, if that's what you have). In the upper left corner (probably under the "Edit" menu) there is an icon of a network card with a wrench on it. Click it to open the capture settings. We need to do four things: a) choose the interface, b) limit the capture to web traffic, c) limit the capture size, and d) start the capture. Here's how to do that:
a) Select the network card you are using -- probably eth0 if you are using a wired connection and wlan0 if you are using wireless.
b) Then, in the Capture Filter field, type "tcp port 80". This will limit the captured traffic to web data only.
c) Next, on the bottom left, check the box to "Stop capture ... after" a certain number of megabytes. I would prefer 25 or 50 megabytes of data.
d) Finally, click Start. Wireshark will begin logging web traffic.
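If you prefer the command line, tcpdump can do roughly the same capture as the settings above. A sketch, not a drop-in replacement: the interface name is an assumption for your machine, it needs root, and tcpdump stops on a packet count (-c) rather than a megabyte limit, so the count below is a rough stand-in for 25-50 MB:

```shell
# Rough command-line equivalent of the Wireshark settings above.
# eth0 and the packet count are assumptions -- adjust for your machine.
# -s 0 captures full packets, -c stops after 50000 packets,
# and the filter keeps only unencrypted web traffic.
CAPTURE_CMD="tcpdump -i eth0 -s 0 -c 50000 -w capture.pcap tcp port 80"
echo "Run as root: $CAPTURE_CMD"
```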
5. Start surfing! Don't do anything sensitive like online banking or anything personal, but otherwise please just use the web as you normally do. You may notice that you can't access certain sites like Gmail or Facebook with SSL disabled. That's OK. As Han said to Chewie, "Fly Casual".
6. Wireshark will stop capturing automatically once it hits the limit, but you may want to glance at it periodically to confirm it is still capturing.
Once the data has hit the limit and stopped (you can see that the packet count at the bottom of the application will stop growing when you load new pages), save the data.
To do this, click File | Save As, and save the file as "whatever_you_want.pcap". Then, contact me at firstname.lastname@example.org and I will give you instructions for delivering me the data via SSH or your delivery method of choice.
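Before transferring, it's worth compressing the capture, since pcaps of uncompressed web traffic tend to shrink a lot. A sketch, using a stand-in file (the real one comes from Wireshark's Save As):

```shell
# Stand-in for the saved capture (the real file comes from Wireshark).
printf 'stand-in capture bytes' > whatever_you_want.pcap

# Compress before transfer (-c keeps the original file intact).
gzip -9 -c whatever_you_want.pcap > whatever_you_want.pcap.gz

# Then send it over SSH, e.g. (host and path are placeholders):
#   scp whatever_you_want.pcap.gz you@example.org:incoming/
ls -l whatever_you_want.pcap.gz
```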
7. FINALLY, remember to turn encryption and compression back on:
a) Preferences | Advanced | Encryption -- enable SSL and TLS
b) Enter "about:config" in the address bar, search for encoding, and right click on "network.http.accept-encoding" and select "Reset" -- or double click and enter whatever you wrote down previously.
Thank you! Your help will hopefully make computers more efficient someday, and it will definitely make a tangible contribution towards me graduating!
home, pgdn, pgup, end buttons not working in FF? it's a feature, not a bug.
next, Borislav Ivanov confesses on Oprah...
paper cut: adding nautilus folder bookmarks in Ubuntu 12.04
You used to be able to drag and drop folders to the "bookmarks" pane in Nautilus... but you can't any more. At least, I can't. However, you can add bookmarks using control-D -- from within the folder you wish to bookmark.
hey, I should do^H^H^H^H see if someone has done that
So you're diving through folders, looking for some files or data. Of course you're using Nautilus, since you're running GNU/Linux and Gnome*. You finally find your file -- "ah ha!" -- only to realize that you really wish you had a command line handy to slice and dice it. What you'd love is to be able to just "open a terminal here". I just had that experience, and as usual (thanks to the long tail), someone else has already done this. If you're using Ubuntu (or possibly Debian), you can simply 'apt-get install nautilus-open-terminal'. Enjoy!
* No offense implied to anyone else, I'm just being silly.