Older blog entries for kgb (starting at number 359)

Microsoft Past, Present and Future

Ghost of Microsoft Past

Back in July, John Dvorak wrote an article entitled Is the party over for Microsoft? Setting aside his personal views, the article does a good job of listing many of Microsoft's attempts to be everywhere by imitating or destroying the innovations of others. There are more examples. Gaming became popular, so they launched the Xbox; iPods begat the Zune; and earlier this week saw the launch of the first Microsoft store. Microsoft plans to open one next to every major Apple store – a strategy similar to how Burger King handles McDonald's.

I can't think of anything Microsoft has truly innovated in its history, except perhaps their Surface technology. The company was even founded on pitching a DOS PC product before they ever owned it (and they later purchased it rather than wrote it). I'm old enough to remember computing before Microsoft, when VAX/VMS and Unix systems were replacing mainframes, and in all cases computers often ran for years without a failure or reboot. Microsoft moved from the consumer desktop to the data center with Windows NT, which came out just months after Linus released the first versions of Linux on Usenet (the two events are independent of each other; Microsoft did not release NT because of Linux – their competition was Digital Equipment Corporation). With MS Windows came an OS locked into a GUI, which made it more bloated, harder to script, tedious for repeated operations, and impossible to control remotely (until someone eventually came out with a tool that handled displaying the remote screen). Every patch and install required a reboot, and Windows crashed so frequently throughout its history that even today people believe you should reboot your servers once every month or so to prevent them from crashing unexpectedly. That was never true for Unix and Linux, but people do it anyway.

Most Windows users and admins today are still totally unaware that, from the beginning, Unix X Windows and Linux GNOME & KDE have all supported running displays and window managers across multiple boxes. You also have Microsoft to thank for introducing the backslash as a character you type a lot – all other OSes and the web have always used the forward slash.

Microsoft's first networking technology was poor. DECnet, TCP/IP, and Novell had replaced IBM Token Ring and SNA, and these were routeable protocols. You could design large networks where traffic was isolated to the systems that needed to listen to it. Microsoft's original network was flat, and for a while they argued it was the only way to do it, but reality eventually forced them to adopt segments and routing, and finally the TCP/IP stack. For a long while it was quite a feat to get all the protocol stacks to work together on a desktop PC; even the order you loaded them determined what worked and what was broken.

WordPerfect and Lotus 1-2-3 were the widely used word processor and spreadsheet. Microsoft countered with Word and Excel. WordPerfect was generally considered better because you could use it without ever touching the mouse; it had keyboard shortcuts and function keys for everything, which made it comfortable for fast typists. Both product suites coexisted in the enterprise for years, especially if you worked in the government, until Word finally won the war by the numbers.

Mosaic was the first browser, followed by Netscape as the first commercial browser. When Netscape's popularity soared, Microsoft fought back by creating an inferior browser and giving it away for free with the OS, essentially causing commercial enterprises to standardize on it for the financial advantage, and it put Netscape out of business. Microsoft also has a history of not adopting standards unless they can modify them in ways that break everything not made by them. Ask any xhtml/css web developer trying to create sites compatible with multiple web browsers and phones what they think of Internet Explorer. They will tell you they hit so many frustrations that they created an IE-specific CSS file. I include myself in that group.

Not until Windows 2000 or XP did experienced IT professionals begin to feel that the Windows OS had matured enough to truly belong in the data center.

Most of Microsoft's products, especially when compared to Apple Macs, have GUIs that seem designed by people who never actually asked end-users for their input. Actions often take 2 or 3 clicks to reach, and the configuration items are not in the same menus from program to program. I started using a MacBook less than 12 months ago and it's amazing how much cleaner and better thought out the GUI is compared to Windows and Linux.

Microsoft has tried to be everywhere and everything, and has mastered nothing in the process. They have a wealth of products, but in many cases they were trial-and-error reactions to the competition, or outright acquisitions. The cash cow of Microsoft Office and Windows allowed them to take a loss on products competing in a lot of areas. But this is coming to an end, and the new generation of people growing up on smart phones, Google, and social media are more familiar with Microsoft alternatives. Microsoft Vista was one of their worst releases, forcing them to accelerate the release of a replacement, and don't forget the day their music died.

Microsoft Present

Google, Apple, and Linux are all around them. Google has entered the operating system space with Android and Chrome OS, and the browser market with Chrome. Microsoft fired back with the Bing search engine. Apple continues to earn customer loyalty and consumer awards, and dominates the mobile media market. Microsoft is ahead in netbook sales, due to consumer familiarity rather than Windows being a good OS option for that platform. This was an area where Linux could have been a natural leader, but they failed to deliver.

Microsoft made multiple announcements this year that are a turn-around from previous strategies. They are embracing Web 2.0 by providing free versions of Office as web applications. They came out with Bing, a new search engine that does not look like they even wrote it. It's clean, simple in use and display, and does not follow their usual GUI style. It looks like Google. However, they were also caught manipulating search results to make themselves look better than the competition, so they still need to work on that trust factor. They also took a step toward social media by creating a bunch of Twitter accounts, but currently they all serve as PR announcement tools and do not engage their customers in dialog, sometimes cross-posting between themselves.

If you haven’t seen it, the Microsoft Surface product is very cool.

Divisions and products are going to feel the pressure to make a profit, rather than expecting that market dominance and sheer size will continue to cover losses. The Xbox is a great product, but it needs to make a profit.

Windows 7 officially launched this week and is getting good reviews from grateful people dumping Vista.

Microsoft Future

Microsoft is still using their old playbook on some fronts. While most other technology retailers are moving to the web, they are opening physical stores for the first time in order to compete with Apple. Ignoring all the hype and flash, it's not clear they understand that the success of Apple is not due to its stores, but to its customer service, ease of adoption, and better products. People go to Apple stores to see and touch very cool, artistic-looking products. Will that apply to Microsoft Stores?

They've admitted they need to participate in social media by announcing deals with Facebook and Twitter to include their data in Bing. Microsoft and Google are both trying to get into the game. Google is better positioned from a technology standpoint, with Google Connect and Profiles and maybe Wave, but social media all comes down to adoption. The users will decide, and right now that's Twitter and Facebook. Google and Bing are moving as fast as possible toward producing search results that change in real-time.

One of the coolest things to come out of Microsoft was a video showing their vision for the Xbox's future. They should be going all out to deliver on the possibilities shown in that video. If they do, they will lead in several areas of home computing. If they don't, it's a vaporware viral video.

Internet Explorer is dying. Firefox, Chrome, Safari, and Opera are taking over. Firefox is used almost everywhere the consumer is allowed to install it. If Microsoft were smart, they'd turn IE over to the Open Source public.

It's hard to compete with innovation, especially concerning Google, the Wii, and Apple's iPhone. Obviously Microsoft recognizes that social media is important, as is the search engine market, but so far they are treating them as PR tools. The public will see through that. Cloud computing and web applications are turning the data center and computing into a commodity. The playing field is getting flatter. Microsoft's biggest advantage in new computing is that they currently dominate the netbook market. Their biggest threat there is a Google netbook. There is a strong possibility that Google could end up owning that market in the long run (or Apple, if they cared enough to compete in it).

Microsoft has the resources to be a leading innovator on many fronts, so you have to ask yourself why Google, Twitter, Apple, and Facebook seem to dominate most of the coming technology change. Because those others are creating standardized APIs, allowing connectivity, promoting Open Source, and engaging the public (i.e. "crowd sourcing"). If Microsoft changes the way they engage the public and develop their products, they will be a major force. If they continue to do things as they have, they will be slow to change and will only have a small piece of the pie (assuming they survive).

Syndicated 2009-10-25 07:10:38 from Keith Barrett Online » Technology

Building A Blog

A few months ago I moved my blogs from their scattered pieces on Facebook, MySpace, Tumblr, Advogato, and LiveJournal over to a WordPress platform. This was a tedious and time-consuming project, and I had to do a lot of editing. The different platforms were geared toward different audiences: some were technology centric, some Disney World fandom, some opinionated, some humorous, and some personal. On the social platforms my audience was primarily friends and family, so the writing tended to be informal and blunt. I frequently lumped multiple smaller, unrelated topics into single articles. These had to be split into individual entries or rearranged into better groupings so they could be properly tagged and indexed.

Getting My Data Out Of LiveJournal …

Getting my data off LiveJournal was tricky, especially since I wanted my comments to survive as well. I located only one free program (and only for Windows) that allowed me to export the data into a CSV I could load into WordPress. After the import completed I had about 500 comments loaded, although they no longer had avatars. I was able to perform a mass SQL edit on the database URL column so that each comment pointed back to the URL of the author's LiveJournal webpage.
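
For anyone attempting the same migration, the edit was conceptually along these lines – a sketch only, assuming the standard wp_comments table and that the importer kept each commenter's LiveJournal username in the comment_author column (adjust to whatever your CSV importer actually produced):

    -- Hypothetical example: point each imported comment back to the
    -- commenter's LiveJournal page.
    UPDATE wp_comments
    SET comment_author_url = CONCAT('http://', comment_author, '.livejournal.com/')
    WHERE comment_author <> '';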

Finding a Host …

Finding a viable hosting platform was the most difficult part. I hoped for a free web host account that was based on Linux and provided SQL, PHP, WordPress, and a few other resources. Searching the Internet shows no lack of free options, and everyone seemed to love and hate the same sites; however, the majority of the reviews I read were fake or spammed into uselessness. Because I wanted a host that allowed my own ads, while not adding anything unreasonably visual in return, I could not use wordpress.com. Other host sites all wanted me to park my domain on their site, which by itself is not unreasonable, but they didn't give me control over the DNS or MX data, which meant a hard time managing my already extensive email addresses, plus subdomains that pointed to other sites. Most also did questionable things like strip off your www or force you to have it, which is bad if that is part of your brand or you want to handle that subdomain differently. Anytime I made a change to try something out, their DNS TTLs were often set for a week, not hours. This created some scares and took my site down a few times. In the end I tried two host sites, so that if one went down I could swap to the other. I'm not going to mention who they are because within a few weeks I was not happy with either of them. One thrust a lot of advertising tricks at me, and both looked like they could vanish overnight and take your content with them. I ended up going with my registrar, GoDaddy.

I wonder; has anyone done a trade-off analysis of whether it’s better to host your content on a free site with their bandwidth controls versus hosting at home using dyndns and your broadband provider’s limitations?

GoDaddy provided challenges of their own. They have an unpredictable implementation of .htaccess, where some things work and some don't, and they don't publish anywhere which is which. The most frustrating part is that, whether your .htaccess file pre-existed or not, a change might not take effect for hours, which made trial-and-error edits to determine what worked a nightmare. I have a working setup, but I still do not have an .htaccess file that I like.

Finding A Theme …

Once you leave the constraints of LiveJournal, MySpace, and Facebook for WordPress, you enter a massive world of free look-and-feel theme possibilities. With the push of a button your blog can resemble anything. I examined hundreds of themes and picked about 30 that I liked, but I'm not entirely happy with any single one. I'm sure every blogger out there is chuckling at this; they've all gone through it. Each theme had things I liked and disliked. I picked the one I liked the most, knowing I was going to edit the heck out of it.

UPDATE: I’ve edited the heck out of it. Told you.

Sub-domains vs Subdirectories vs Toplevel …

Why doesn't someone create a how-to for all this? Yet another decision you have to make is how people are going to reach your blog and what the URL links will look like. The choices all have pros and cons. I'm not going to exhaustively list them all here, but in the simplest of terms, as they applied to me:

  1. Unless WordPress is the only web application you will be using and your entire site is dedicated to one purpose/subject, you will want to install WordPress in a subdirectory to help keep your files organized. You may even find you want to install multiple WordPress copies to exploit multiple themes.
  2. With WordPress in a subdirectory, you may (depending on your host service) have the option of pointing a sub-domain to that location. While it is attractive to have your users type "blog.fred.com" instead of "fred.com/blog", it only has value if you want your sub-domain to be seen by search engines as an independent site apart from the main domain, and that only makes sense if each has sufficient independent content to stand on its own. At least starting off, I wanted all my search engine goodness to relate to my main web site. There are other options for hiding the sub-directory from the user…
  3. You can set up your .htaccess file to redirect visitors (including those coming in from sub-domains) to the WordPress sub-directory, thereby hiding the fact that you are using a sub-directory. WordPress further supports this by allowing you to indicate what you want your URLs to look like ("fred.com/blog/my-story" or "fred.com/my-story"). My conclusion is that there is no search engine penalty for going one directory level deep, and it will make it MUCH easier to perform a redirect to another location should you ever want to move your blog. You can set up your .htaccess file so "fred.com" goes directly to "fred.com/blog", making it very user friendly, and after that it's all link clicks.

I elected to (1) put WordPress in a sub-directory, (2) keep that sub-directory visible in my URLs so it doesn't intermix with other things I might add to the site, (3) create a sub-domain that points to my main domain, and (4) set up my .htaccess file to perform a redirect to the sub-directory.
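
For what it's worth, the redirect itself boils down to a couple of mod_rewrite lines. This is only a sketch, assuming Apache with mod_rewrite enabled and the fred.com/blog example names from above – substitute your own:

    # Hypothetical .htaccess in the site root: send requests for the bare
    # domain (or any sub-domain pointed at it) to the WordPress sub-directory.
    RewriteEngine On
    RewriteRule ^$ /blog/ [R=301,L]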

Tags Versus Categories …

I soon encountered the confusion of tags versus categories. I’ve been dying to have these features; my content covers multiple areas of interest and readers will finally be able to locate what interests them. On the surface they appear to do the same thing; they both mark content so it’s easier to group and locate. Opinions differ on how they should be used, but in general categories act like file folders to group articles together, and tags are best viewed as sticky notes of popular labels. The difference is their behavior in SEO. Categories are supersets of tags; articles should be placed in just one category to prevent the penalty of search engines viewing the content as a duplicate.

I elected to show tags for tracking down similar content, and to hide the categories from the user. I've also chosen to drive my menus via categories. Articles can reside in multiple categories and tags, but I am blocking both tags and categories from search engines to prevent duplicate data.
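
The blocking can be done with an SEO plugin, but a hand-rolled version is only a few lines in the theme's functions.php. A sketch, assuming a standard theme that calls wp_head(); the function name kb_noindex_archives is just something I made up:

    <?php
    // Hypothetical snippet for functions.php: ask search engines not to
    // index tag and category archive pages, to avoid duplicate content.
    function kb_noindex_archives() {
        if ( is_tag() || is_category() ) {
            echo '<meta name="robots" content="noindex,follow" />' . "\n";
        }
    }
    add_action( 'wp_head', 'kb_noindex_archives' );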

UPDATE: I just suffered through a lot of pain because I made many of my tags and categories the same name with the same "slug" abbreviation. This got crazy when I deleted or renamed some and the others were affected. I have since gone through all of them and given my categories slug names ending in "-topic" so they stay separate from tags.

Plugins and Widgets …

When you've been in computing as long as I have, you become numb to the fact that everyone and every piece of technology likes to use different generic words for the same thing: themes/skins, plugins/widgets/add-ons/injections, containers/boxes/groups, etc. I had to understand what WordPress meant by having both plugins and widgets. A plugin is just their term for a piece of software that adds or changes blog behavior. A plugin may or may not also create a widget. A widget is just a GUI tool for placing and configuring that logic rather than forcing the user to deal with the actual css/javascript/php code directly. You don't download widgets; it's just a term. Not every theme supports widgets or plugins well, so you can't always avoid touching the code.
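
To make the distinction concrete, here is roughly what a plugin that also provides a widget looked like in the WordPress of that era. A minimal sketch only – the class, names, and text are all invented for illustration:

    <?php
    /*
    Plugin Name: Hello Widget (hypothetical example)
    Description: A plugin that registers one trivial widget.
    */

    // The "plugin" part is simply PHP code that hooks into WordPress.
    class KB_Hello_Widget extends WP_Widget {
        // WordPress 2.8-era constructor style.
        function KB_Hello_Widget() {
            parent::WP_Widget( 'kb_hello', 'Hello Widget' );
        }

        // The "widget" part is what you drag into a sidebar in the admin GUI.
        function widget( $args, $instance ) {
            echo $args['before_widget'] . '<p>Hello from a widget!</p>' . $args['after_widget'];
        }
    }

    function kb_register_hello_widget() {
        register_widget( 'KB_Hello_Widget' );
    }
    add_action( 'widgets_init', 'kb_register_hello_widget' );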

There are thousands of plugins, and some behave badly alone or in conjunction with other plugins. Some work with some themes and some don't. There are also many duplicate plugins that do the same thing, but differently, or are just written by different authors. It's a project in itself to find ones you like that behave well. WordPress has popularity and download indicators to help, but I found most of mine by deciding I needed something and entering search terms into Google (like "WordPress plugin FriendFeed").

Note: In a future article I plan to write about what plugins I find the most useful.

I wanted to connect my blog to FriendFeed, Twitter, Facebook, and maybe MySpace, so these are the types I've been playing with the most. I've had to make manual modifications to some of these to get them to work with my blog. Some plugins assume your directory structure or host name is the web root, when mine is in a subdirectory; the FriendFeed comments plugin was one. Some plugins, like smart404 and permalink redirect, actually did nothing in my system (even after performing the required PHP edits).
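
The fixes were usually small. The typical problem is a plugin building URLs or paths by hand instead of asking WordPress – a sketch of the pattern, not any particular plugin's actual code:

    <?php
    // Fragile: assumes WordPress lives at the web root of the domain.
    $script_url = 'http://' . $_SERVER['HTTP_HOST'] . '/wp-content/plugins/some-plugin/widget.js';

    // Better: let WordPress work out the real locations, sub-directory and all.
    $script_url = plugins_url( 'widget.js', __FILE__ );  // URL of a file inside this plugin
    $login_url  = site_url( 'wp-login.php' );            // URL under the WordPress install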

UPDATE: Apparently there is an issue using multiple Facebook plugins. You get class errors when you try to activate both Facebook Connect and Socialite, for example. Also, neither of the two plugins I found that support cross-posting to MySpace works, and the Twitter Connect plugin is very unreliable. I customized a lot.

Local Comments vs Disqus …

Another debate: I wanted to use Disqus for my commenting system. This meant having Disqus import the comments I had already brought over from LiveJournal. I was excited that the Disqus web site and the WordPress plugin both had an import feature for WordPress, but when I clicked it – nothing was imported. After a few tries, 5 comments were imported. I took a close look at those 5 and discovered that they were the only ones in my database with data in the IP address column. Using SQL I set all the IP addresses in my comments table to '255.255.255.255', exported the WP data to XML, and tried the Disqus import again. It ran for 10 minutes, then I looked and it had loaded all but 79 of the comments! Unfortunately there seems to be no easy way to track down which 79 comments those were. There is also debate about whether you lose SEO benefits by outsourcing your comments. I can always change my mind and import my comments later, so for now I'm sticking with local comments.
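
That workaround was a one-liner against the database – again assuming the standard wp_comments table, and you should back the table up before trying anything like it:

    -- Hypothetical fix: give every comment a dummy IP so the importer
    -- stops skipping rows with an empty comment_author_IP column.
    UPDATE wp_comments
    SET comment_author_IP = '255.255.255.255'
    WHERE comment_author_IP = '' OR comment_author_IP IS NULL;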

UPDATE: Disqus added full Twitter, Facebook, OpenID, and Yahoo login support, so it may be time to reconsider using it. There are also new comment services that tie together commenting from other social network services, a feature I highly desire.

The Past and Future …

When I started blogging in 2000 (before the term "blog" existed; they were called online diaries then), content mostly reflected the ongoing status of work or your opinions about life and news. My first blog was on a site called advogato.org, which still exists today, though now I just feed my RSS into it. I worked for Red Hat back then, and my audience was co-workers, family, the Open Source community, and any press interested in my projects. The postings were short, and frequently technical. In the last four years I've written a lot more personal articles because I now work for Disney Information Technology, and who doesn't love sharing their unique experiences at Disney World? It generates a lot of interesting content. Beginning in late 2007 I became active in social media, but ironically I did not have a platform to share this outside of enclosed systems like Facebook and MySpace. Now that this blog has been centralized, I hope to generate a lot more discussion. Today I use tools like FriendFeed, Twitter, Tumblr, and yes, Facebook. These allow you to post in real-time and build quickly off existing content created by others, but you also tend to say what you have to say in a pithy paragraph or two.

The most enjoyable aspect of this project is that all of my content survived and is in one place. I actually located my very first blog entry, as well as a few old Usenet posts going way back to the late 1980s! The journey was fun, but it was painful to see how trivial or poorly written some of those old entries were. I did go back and clean some up, but I had to balance how I wanted it to read versus maintaining the historical message and feeling of the original post.

I hope you like this site.  I have additional design options and I’d like you, my readers, to lead me in the direction of what they should look like.

Thanks, Keith

Syndicated 2009-10-25 04:14:03 from Keith Barrett Online » Technology

New Mac versus PC Ad

Actually there are 3 new ads, but I like this one the most:

www.youtube.com/watch?v=Gk4FIIkKXdw

Syndicated 2009-10-23 03:41:50 from Keith Barrett Online » Technology

Twitter’s Growing Pains

Twitter is going through some interesting changes, and I'm not talking about the recent theft of internal materials. On the one hand they are forming partnerships and continuing to grow at an ever increasing rate, and on the other they have to deal with a wealth of service abuse and SPAMmer tricks.

GoDaddy’s domain control panels now include Twitter integration. For any domain you have registered, you can instantly look up whether that Twitter account is still available or not, and create it immediately from your GoDaddy account.

Why would they offer this?

There would have to be money or recognized value in it for GoDaddy. Perhaps Twitter is making promotional deals with companies. In the "Free" or Open Source world, income from corporate partnerships is frequently the more valuable kind, and it can be significant when you have a large audience of eyes to offer in exchange. We know from Techcrunch that Twitter held discussions with Google and Microsoft. Buying a domain is often a person's first excursion into changing from a casual Internet user to a more serious consumer of Internet technology. Another reason could be that GoDaddy offers business promotional packages and perhaps they use this as a lead-in. I suspect we are going to see more of this Twitter cross-company promotional activity. It's the logical next step in their growth – become visible everywhere.

Twitter is going through some difficult growing pains.

In addition to being a more visible target for hackers, Twitter is becoming a favorite tool of marketers and SPAMmers. Why? It's a revisit to old, familiar times. When search engines started, they derived their categorization and ranking results from a site's use of keywords and a scan of the page content. This was subject to a lot of control and eventual manipulation by the site owner, so today results are driven by a complex formula that includes links, HTML parsing, content assessment, frequency of publication, cross-site connectivity, and more. Keywords are barely a factor in any of it. However, unlike popular search engines, keywords and hashtags are all you have on Twitter. The best way to get noticed in only 140 characters is to cram them full of buzzwords, even when those words have nothing to do with your message. During the Michael Jackson funeral I saw German dating services throw every Jackson-related keyword they could into their tweets, with their only real content being a short URL to click.

Search engines, email, and web pages generally require the user to visit the message to see it. Twitter is a bulletin board service with a potential audience of … ANYONE. Have you ever watched the public stream for a while? Some companies are broadcasting ad tweets once a minute. And like the days of email before smart filters, all you need to do is include a user's @id in a message to target that person's reply inbox. There are no SPAM filters or opt-in requirements to deliver that message. People with a lot of followers are eventually forced to use a tool that filters out every reply except from specific people or topics they are interested in.

There are also issues of adult material. I received my first unsolicited adult photo as an @reply this year from a hit-and-run Twitter account. When people can quickly create an account, tweet once, and abandon it, you will get problems like this. There is no filter or rating service to prevent offensive material from reaching anyone, including minors.

Twitter has to be dealing with squatters also. While I haven't (yet) seen anyone sell or auction off a Twitter username, I have seen companies advertise the selling of their followers. It's a common problem in domain management that people quickly buy up all the 1-, 2-, 3-, and 4-letter domain names, as well as common nouns and marketing terms. This has to be a growing problem in Twitter. As favorite names are taken, they become more in demand by someone willing to negotiate for them.

Twitter is working on these issues. Earlier this year they quietly announced an upcoming "verified identity" service to help combat squatters and impostors, and they've started to register and protect aspects of their branding. These serve the company more than the users; there are many basic features commonly used by other services (rating systems, account creation verification, scanners, captcha, encryption) they could include to help fight abuse and protect their customer base. Their API also has rules preventing accounts from subscribing to too many people too quickly or sending messages too frequently, but you can circumvent that by creating disposable accounts.

It’s a difficult walk.

They need to encourage people to subscribe to other people, while preventing the issues that having a large following creates. It's in Twitter's best interest to have a large number of subscribers, each with a small, manageable following that includes several trusted accounts (such as celebrities) they can leverage.

Third-party tools such as TweetDeck are the most useful for keeping the service usable for people with large followings. If Twitter continues growing like other startups do, they will eventually purchase or invest in one of these tools and brand it.

Syndicated 2009-07-17 15:21:28 from Keith Barrett Online » Technology

Real-Time Search

The web is moving rapidly toward real-time. Real-time display of messages, real-time display of posts and comments, real-time status updates from friends, disposable chats, even real-time video.

The future and strength of the Internet is in providing and spreading information instantly. Some institutions will push to capitalize on delaying information (for example stock quotes, law enforcement, and government), but in the long term this will have limited success. When every person is armed with a pocket computer and instant access to the world, you can’t hide much.

What's not quite there yet is real-time search, because that has a dependency on real-time notification of content publishing, which is also lacking.

Traditional search engines crawl the web, meaning they start at some web page and record all the links found to other locations, along with representative content from each page. As the links are followed, more links are encountered, and so on. The end result is a massive database that can be used to return search results. Not all content is easily captured today – video files and the imagery within photographs, for example – but as technology progresses these obstacles will be overcome.
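
Stripped to its essentials, a crawler is a very small loop; everything hard about running one at scale is politeness, deduplication, and sheer volume. A toy sketch, ignoring robots.txt, rate limits, and relative links:

    <?php
    // Toy crawler: breadth-first over links, recording page titles.
    $queue   = array( 'http://example.com/' );
    $visited = array();

    while ( $queue && count( $visited ) < 100 ) {
        $url = array_shift( $queue );
        if ( isset( $visited[ $url ] ) ) {
            continue;
        }
        $html = @file_get_contents( $url );
        if ( $html === false ) {
            continue;
        }
        $visited[ $url ] = true;

        $doc = new DOMDocument();
        @$doc->loadHTML( $html );

        // The "representative content" recorded here is just the page title.
        $titles = $doc->getElementsByTagName( 'title' );
        echo $url, ' => ', $titles->length ? trim( $titles->item( 0 )->textContent ) : '(no title)', "\n";

        // Queue every absolute link found on the page.
        foreach ( $doc->getElementsByTagName( 'a' ) as $a ) {
            $href = $a->getAttribute( 'href' );
            if ( preg_match( '#^https?://#', $href ) ) {
                $queue[] = $href;
            }
        }
    }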

In real-time search, content is cataloged as it is published. Search results would include this information immediately and automatically update your display with the changes. Real-time search would probably be used concurrently with traditional crawling, but to do it at all, search engines need to know WHEN something has changed, instantly. The blogging "ping" system is a working example of a commonly used publish notification system. Services like Google Alerts and RSS feeds also publish data as quickly as their source systems want them to. To do this effectively, however, the notification needs to include details of the data being published or changed, not just a ping or a link. If everyone adopted a standard of real-time notification, the dependency on crawling would go away, but the practice would still occur because of the desire to capture "the deep web" and any data being excluded (accidentally or intentionally) from the notification network.
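
For the curious, a blog "ping" is just a tiny XML-RPC call a blog sends to a ping service whenever something is published. A sketch of the standard weblogUpdates.ping method – the blog name and URL below are placeholders, and the request would be POSTed to a ping service such as rpc.pingomatic.com:

    <?xml version="1.0"?>
    <methodCall>
      <methodName>weblogUpdates.ping</methodName>
      <params>
        <param><value><string>Keith Barrett Online</string></value></param>
        <param><value><string>http://example.com/blog/</string></value></param>
      </params>
    </methodCall>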

Currently real-time search is specialized. Twitter, Facebook, and FriendFeed all have private search options for their data. Tweetmeme is an example of an external service providing real-time search and trending results on Twitter posts, but only over a short window of about a week.

The Christian Science Monitor posted an article on upcoming real-time search engines, discussing CrowdEye, Collecta, and Google. I encourage you to read it. Google is expected to come out with their own micro-blogging service in the future, and already having the most popular search engine means that if they can sway people away from Twitter, they could be the leader in this space too. Microsoft has yet to give a clue about how they will respond, but if Bing becomes popular it would be an obvious platform to build this onto.

[Updated 7/6/2009 to include mention of the blogging ping network]

Syndicated 2009-07-06 06:45:51 from Keith Barrett Online » Technology

Microsoft Bing

Microsoft launched a search engine – Bing. They call it a "decision engine", but it really is a search engine, trying to capture and/or crawl the web with the goal of returning a good result to your query.

Google is so popular and well known that "to Google" something became a verb. Why would anyone not armed with some sort of revolutionary new technology try to compete with Google in the search engine market? Because Microsoft is big, and Google does not own that entire market. Capturing even 1% of the search engine market can produce millions in profit.

There are a fair number of search engines out there, some highly specialized. Until Microsoft, the most recent entry was Wolfram. Calling itself a "computational knowledge engine" and using artificial intelligence, its goal is to extrapolate from the data on the Internet and respond to your query with a single direct answer. It's an interesting tool but falls far short of being as helpful as Google.

In a short time, people are already calling Google, Yahoo, and Bing the "big 3" search engines. Yahoo is a busy, distracting place. Google took off because it was simple, clean, and uncluttered; results were the only thing displayed. The Bing screen is done the same way, making me believe its developers either copied or learned from Google's experience.

So how do these "big 3" compare to each other? Someone has created a blind test tool for you to decide for yourself. You can go to Blind Search, enter your query, and pick which engine returned the best responses. The results surprised me. I love Google, and have had a lot of past issues with Microsoft, but for the few selfish queries I did on my own name and blog, Bing came back with me as the top ranked result. On other queries the favorite results bounced between Bing and Google, so for a brand new search tool Bing is doing surprisingly well.

UPDATE 08/2009: Microsoft was caught manipulating search results to favor themselves over the competition. For example, searching for "why is windows so expensive" produced a first page, different from Google's, listing why Macs and Linux were more expensive than Windows. They corrected this, but shortly after another one was found (and corrected), so there is now a trust problem with Bing's results.

Syndicated 2009-06-30 00:53:31 from Keith Barrett Online » Technology

Using Technology Against The Consumer

Panasonic released a notice that their camera firmware will force consumers to use only Panasonic batteries, continuing a trend of applying technology to restrict consumer freedom.

I remember when software license keys first appeared in the early 1980s. I was working at Digital Equipment Corporation at the time and had to deal with entering these long hexadecimal number sequences to unlock functionality that existed just a day before without them. Until that point people simply bought software and owned it. If you needed it on 3 computers, you bought 3 copies. Computer networking changed that. Now you could install one copy and people could share it. Software manufacturers initially just switched from charging you per copy to per user, but then they changed their code so that only a certain total number of people could use it, and if you wanted more you had to pay more. The more aggressive companies would base it on the total number of POSSIBLE users rather than the number of SIMULTANEOUS users. It didn't matter that the code was identical and there was nothing special about 50 people versus 5 people except the result of an "if" statement; you had to pay more. You had enough hardware and processing power, but you were still stuck. It was an artificial limitation, like buying an oven with 2 working burners where, if you wanted all 4 to work, you had to send more money. Then the limitation was further restricted to EXPIRE, forcing you to send money every year or so just to keep the switch turned on (whether you used it once or 10,000 times in that period). Then copy protection was introduced, sending a "we don't trust you" message to the customer while interfering with his or her ability to back up their own purchased products. This all caused a lot of controversy back then and still does today. It's what started the Free Software Foundation and the Open Source movement.

This was eventually adopted in hardware too. CPU manufacturers used to sell different versions of their processors at different prices based simply on the results of quality control. Processors that failed tests at high speeds but worked at lower speeds could be sold as lower speed processors at a reduced price. That seemed fair, and it was better than throwing them away. But then they adopted a practice of artificially crippling a CPU in a number of ways (turning off the cache, disabling the floating point unit, etc.), sometimes to help prevent their previous product from becoming instantly obsolete. This caused a controversy similar to software licensing, because now the more expensive processor was identical to the less expensive one, except that something was simply turned off and you had to pay more to turn it on.

As digital photography and printers began eliminating film, manufacturers of consumables switched from selling film to selling ink cartridges. In the age of film cameras there were a few (such as the Instamatic 44 and Polaroid) who produced cameras that only accepted a specific type of film, but for the most part everyone adopted 35mm film. With digital technology, many printer manufacturers try to force consumers to buy only THEIR ink cartridges by adding a little processor intelligence to the printer to detect what is inserted. This was brand loyalty forced instead of earned. Consumers had no choice but to purchase a specific, higher priced cartridge that was really no better than a less expensive one, because the printer was smart enough to reject the alternative.

As technology began enabling manufacturers to override consumer choice, the technology itself was reverse-engineered and overcome time and time again. Taking things apart to learn how they work, or making them work better, has been a part of life since the beginning of curiosity and the scientific method. Thus in 1998 the DMCA was passed, making reverse engineering, or even discussion of it, a legal matter in the USA, since it was unlikely it could ever be prevented otherwise. The massive issues this created concerning publishing security vulnerabilities, scientific research, education, product cross-compatibility, product backward compatibility, international law, and more are beyond the scope of this article.

This all moved into media, where copy protection went from preventing you from making a copy of your purchase to dictating what you could watch or listen to, when, and on what equipment. Digital restrictions (called Digital Rights Management by the seller) forced consumers to re-purchase the same product over and over again, while making it much more difficult to resell their used product on the open market. Stories of CDs and DVDs not playing on some equipment were common. It won't stop here; eventually DRM will prevent people from connecting certain types of monitors to certain types of receivers using certain types of speakers (and perhaps certain brands of wire, all in the name of ensuring quality). Lawsuits have been filed by the blind, citing the Disabilities Act, because they are prevented from listening to digital books when publishers disable the text-to-speech feature in the Amazon Kindle to protect their audiobook sales.

The vendor arguments in favor of these practices concern prevention of theft and piracy, protection of assets, sustainability of business, contractual obligations, quality, safety, etc. Since I am primarily a consumer, and technology savvy, these restrictions tend to inhibit my ability to fully enjoy my product experience rather than help me control something I've produced. I can't help but wonder what today's state of technology would look like if all these legal and artificial means of control and crippling were not around: if processors were allowed to be as fast and as cheap as the market dictated, if software upgrades took place because people were attracted to new functionality instead of their license expiring, if businesses were forced to adapt to change, if consumers could freely copy their movies and music to any device they owned, and if manufacturers and sellers were forced to make a profit only on how good their products, innovations, and history were.

Syndicated 2009-06-26 02:06:46 from Keith Barrett Online » Technology

IE8 Get The Facts Campaign

Microsoft published a "fact sheet" comparing a short list of features in IE8 with other browsers. My first thought reading this was that they must think their customers are idiots. InfoWorld wrote an article saying just that. Robert Cringely also has an article warning about major issues with IE8.

Everyone who works with CSS ends up creating an IE-specific file, because IE's implementation of the standards is so broken that it's impossible to write correct code and keep IE happy at the same time. It is funny to see Microsoft list a new IE-specific security or privacy feature and then claim they are more secure and private than everyone else because nobody else has that feature. What about all the features the other browsers have that IE8 doesn't (or, more to the point, what about IE8-specific vulnerabilities)? And does anyone understand that last bullet description? Frankly I'm amazed Microsoft gives in enough to award some of their bullet points "a tie" with the other browsers.
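
For those who haven't had the pleasure, the standard workaround is an IE-only stylesheet pulled in through Microsoft's own conditional-comment syntax (ie.css here being whatever you name your file of IE fixes):

    <!-- Every browser loads the main stylesheet; only Internet Explorer
         parses the conditional comment and loads the extra fixes. -->
    <link rel="stylesheet" type="text/css" href="style.css" />
    <!--[if IE]>
      <link rel="stylesheet" type="text/css" href="ie.css" />
    <![endif]-->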

The community did not let this go by. A Mozilla developer posted this humorous response, which does mention the Firebug tool. There are other responses posted as well. There is another response chart here that is more informative and readable.

Microsoft is also promoting IE8 with offers of charity donations and a $10k cash prize (only discoverable using IE8). Techcrunch posted a response to those efforts.

The frequent question being asked is: who is the target audience for this IE8 campaign? Web site designers all know the oddities of IE8, and end-users are switching away from IE at an ever increasing rate. Corporations will be very slow to adopt IE8, especially given any pain Vista gave them, and they usually base their migrations on licensing and support needs rather than hype.

If you are going to get the facts, get ALL of the facts. What you don’t know could end up costing you a lot of time and money.

Syndicated 2009-06-21 03:25:59 from Keith Barrett Online » Technology
