Older blog entries for kgb (starting at number 354)

Real-Time Search

The web is moving rapidly toward real-time: real-time display of messages, posts, and comments; real-time status updates from friends; disposable chats; even real-time video.

The future and strength of the Internet is in providing and spreading information instantly. Some institutions will push to capitalize on delaying information (for example stock quotes, law enforcement, and government), but in the long term this will have limited success. When every person is armed with a pocket computer and instant access to the world, you can’t hide much.

What’s not quite there yet is real-time search, because it depends on real-time notification of content publishing, which is also lacking.

Traditional search engines crawl the web, meaning they start at some web page and record all the links found to other locations, along with representative content from each page. As those links are followed, more links are encountered, and so on. The end result is a massive database that can be used to return search results. Not all content is easily captured today (video files and the imagery within photographs, for example), but as technology progresses these obstacles will be overcome.
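To make that loop concrete, here is a minimal crawler sketch in Python. It is purely illustrative (no real search engine works from a dozen lines and a regular expression), but it shows the basic cycle: fetch a page, record its content, queue the links it points to.

# Illustrative crawler sketch, not any search engine's actual code.
from collections import deque
from urllib.parse import urljoin
from urllib.request import urlopen
import re

def crawl(seed_url, max_pages=10):
    index = {}                      # url -> captured page content
    queue = deque([seed_url])
    seen = {seed_url}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except Exception:
            continue                # skip pages that fail to load
        index[url] = html           # a real engine would extract and rank this
        for link in re.findall(r'href="(http[^"]+)"', html):
            absolute = urljoin(url, link)
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index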

In real-time search, content is cataloged as it is published. Search results would include this information immediately and automatically update your display with the changes. Real-time search would probably be used alongside traditional crawling, but to do it at all, search engines need to know WHEN something has changed, instantly. The blogging “ping” system is a working example of a commonly used publish-notification system. Services like Google Alerts and RSS feeds also publish data as quickly as their source systems want them to. To do this effectively, however, the notification needs to include details of the data being published or changed, not just a ping or a link. If everyone adopted a standard for real-time notification, the dependency on crawling would go away, but the practice would still occur because of the desire to capture “the deep web” and any data excluded (accidentally or intentionally) from the notification network.
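For reference, here is a rough sketch of what a blogging ping looks like from the publisher's side, assuming a Weblogs.com-style XML-RPC ping service; the endpoint URL and blog details below are placeholders. Note how little it carries, just a name and a URL, which is exactly why a crawler still has to fetch the page to learn what changed.

# Sketch of a blog "ping" sent when a post is published (placeholder endpoint).
import xmlrpc.client

def send_publish_ping(blog_name, blog_url,
                      endpoint="http://rpc.example.com/RPC2"):  # placeholder
    """Notify a ping service that a blog has new content."""
    server = xmlrpc.client.ServerProxy(endpoint)
    # weblogUpdates.ping is the method name commonly used by ping services;
    # the payload is only a name and a URL, not the changed content itself.
    return server.weblogUpdates.ping(blog_name, blog_url)

# Hypothetical usage:
# send_publish_ping("Keith Barrett Online", "http://example.com/blog")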

Currently, real-time search is specialized. Twitter, Facebook, and FriendFeed all have their own private search options for their data. Tweetmeme is an example of an external service providing real-time search and trending results for Twitter posts, but only covering a short window of about a week.

The Christian Science Monitor posted an article on upcoming real-time search engines, discussing CrowdEye, Collecta, and Google. I encourage you to read it. Google is expected to come out with its own micro-blogging service in the future, and since it already has the most popular search engine, if it can sway people away from Twitter it could lead this space too. Microsoft has yet to give a clue about how it will respond, but if Bing becomes popular it would be an obvious tool to build this onto.

[Updated 7/6/2009 to include mention of the blogging ping network]

Syndicated 2009-07-06 06:45:51 from Keith Barrett Online » Technology

Microsoft Bing

Microsoft launched a search engine: Bing. They call it a “decision engine”, but it really is a search engine, capturing and crawling the web with the goal of returning a good result for your query.

Google is so popular and well known that “to Google” something became a verb. Why would anyone without some sort of revolutionary new technology try to compete with Google in the search engine market? Because Microsoft is big, and Google does not own the entire market. Capturing even 1% of the search engine market can produce millions in profit.

There are a fair number of search engines out there, some highly specialized. Until Microsoft’s entry, the most recent was Wolfram Alpha. Calling itself a “computational knowledge engine” and using artificial intelligence, its goal was to extrapolate from the data on the Internet and respond to your query with a single direct answer. It’s an interesting tool but falls far short of being as helpful as Google.

In a short time, people are already calling Google, Yahoo, and Bing the “big 3” search engines. Yahoo is a busy, distracting place. Google took off because it was simple, clean, and uncluttered; results were the only thing displayed. The Bing screen is done the same way, making me believe its developers either copied Google or learned from Google’s experience.

So how do these “big 3” compare to each other? Someone has created a blind test tool so you can decide for yourself. You can go to Blind Search, enter your query, and pick which engine returned the best responses. The results surprised me. I love Google, and have had a lot of past issues with Microsoft, but for the few selfish queries I did on my own name and blog, Bing came back with me as the top-ranked result. On other queries the favorite results bounced between Bing and Google, so for a brand new search tool Bing is doing surprisingly well.

UPDATE 08/2009: Microsoft was caught manipulating search results to favor itself over the competition. For example, searching for “why is windows so expensive” produced a first page, different from Google’s, listing why Macs and Linux were more expensive than Windows. Microsoft corrected this, but shortly afterward another case was found (and corrected), so there is now a trust problem with Bing’s results.

Syndicated 2009-06-30 00:53:31 from Keith Barrett Online » Technology

Using Technology Against The Consumer

Panasonic released a notice that their camera firmware will force consumers to use only Panasonic batteries, continuing a trend of applying technology to restrict consumer freedom.

I remember when software license keys first appeared in the early 1980s. I was working at Digital Equipment Corporation at the time and had to deal with entering long hexadecimal sequences to unlock functionality that had existed just a day before without them. Until that point people simply bought software and owned it. If you needed it on 3 computers, you bought 3 copies. Computer networking changed that: now you could install one copy and people could share it. Software manufacturers initially just switched from charging per copy to per user, but then they changed their code so that only a certain total number of people could use it, and if you wanted more you had to pay more. The more aggressive companies would base it on the total number of POSSIBLE users rather than the number of SIMULTANEOUS users. It didn’t matter that the code was identical and there was nothing special about 50 people versus 5 people except the result of an “If” statement; you had to pay more. You had enough hardware and processing power, but you were still stuck. It was an artificial limitation, like buying an oven with only 2 of its 4 burners working and having to send more money to turn on the other 2. Then the limitation was further restricted to EXPIRE, forcing you to send money every year or so just to keep the switch turned on (whether you used it once or 10,000 times in that period). Then copy protection was introduced, sending a “we don’t trust you” message to customers while interfering with their ability to back up their own purchased products. This all caused a lot of controversy back then and still does today. It’s what started the Free Software Foundation and the Open Source movement.
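To illustrate that point with a made-up sketch (my own example, not any vendor's actual code), the entire difference between the 5-user product and the 50-user product can come down to a single comparison against a number sold separately:

# Hypothetical license check, for illustration only.
MAX_LICENSED_USERS = 5          # the cap you paid for; a higher cap costs more

def start_session(active_sessions):
    """Allow a new user session only if the purchased cap permits it."""
    if active_sessions >= MAX_LICENSED_USERS:
        # The shipped code is identical for every customer; only this
        # comparison separates the cheap license from the expensive one.
        raise RuntimeError("User limit reached; purchase a larger license")
    return True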

This was eventually adopted in hardware too. CPU manufacturers used to sell different versions of their processors at different prices based simply on the results of quality control. Processors that failed tests at high speeds but worked at lower speeds could be sold as lower-speed processors at a reduced price. That seemed fair, and it was better than throwing them away. But then manufacturers adopted the practice of artificially crippling a CPU in a number of ways (turning off the cache, disabling the floating-point unit, etc.), sometimes to keep their previous product from becoming instantly obsolete. This caused a controversy similar to software licensing, because now the more expensive processor was identical to the less expensive one, except that something was simply turned off and you had to pay more to turn it on.

As digital photography and printers began eliminating film, manufacturers of consumables switched from selling film to selling ink cartridges. In the age of film cameras there were a few (such as the Instamatic 44 and Polaroid) who produced cameras that only accepted a specific type of film, but for the most part everyone adopted 35mm film. With digital technology, many printer manufacturers try to force consumers to buy only THEIR ink cartridges by adding a little processor intelligence to the printer to detect what is inserted. This was brand loyalty forced instead of earned. Consumers had no choice but to purchase a specific, higher-priced cartridge that was really no better than a less expensive one, because the printer was smart enough to reject the alternative.

As technology began enabling manufacturers to override consumer choice, the technology itself was reverse-engineered and overcome time and time again. Taking things apart to learn how they work, or making them work better, has been part of life since the beginning of curiosity and the scientific method. Thus in 1998 the DMCA was passed, making reverse engineering, or even discussion of it, a legal matter in the USA, since it was unlikely it could ever be prevented otherwise. The massive issues this created concerning publishing security vulnerabilities, scientific research, education, product cross-compatibility, product backward compatibility, international law, and more are beyond the scope of this article.

This all moved into media, where copy protection went from preventing you from making a copy of your purchase to dictating what you could watch or listen to, when, and on what equipment. Digital restrictions (called Digital Rights Management by the seller) forced consumers to re-purchase the same product over and over again, while making it much more difficult to resell their used products on the open market. Stories of CDs and DVDs not playing on some equipment were common. This won’t stop here; eventually DRM will prevent people from connecting certain types of monitors to certain types of receivers using certain types of speakers (and perhaps certain brands of wire, all in the name of ensuring quality). Lawsuits have been filed by the blind, citing the Americans with Disabilities Act, because they are being prevented from listening to digital books when publishers disable the text-to-speech feature in the Amazon Kindle to protect their audiobook sales.

The vendor arguments in favor of these practices concern prevention of theft and piracy, protection of assets, sustainability of the business, contractual obligations, quality, safety, and so on. Since I am primarily a consumer, and technology-savvy, these restrictions tend to inhibit my ability to fully enjoy my products rather than help me control something I’ve produced. I can’t help but wonder what today’s state of technology would look like if all these legal and artificial means of control and crippling were not around: if processors were allowed to be as fast and as cheap as the market dictated, if software upgrades took place because people were attracted to new functionality instead of because their license expired, if businesses were forced to adapt to change, if consumers could freely copy their movies and music to any device they owned, and if manufacturers and sellers had to make a profit only on how good their products, innovations, and history were.

Syndicated 2009-06-26 02:06:46 from Keith Barrett Online » Technology

IE8 Get The Facts Campaign

Microsoft published a “fact sheet” comparing a short list of features in IE8 with other browsers. My first thought reading it was that they must think their customers are idiots. InfoWorld wrote an article saying just that. Robert Cringely also has an article warning about major issues with IE8.

Everyone who works with CSS ends up creating an IE-specific file, because IE’s implementation of the standards is so broken that it’s impossible to write correct code and keep IE happy at the same time. It is funny to see Microsoft list a new IE-specific security or privacy feature and then claim they are more secure or private than everyone else because the others lack it. What about all the features the other browsers have that IE8 doesn’t (or, more to the point, what about IE8-specific vulnerabilities)? And does anyone understand that last bullet description? Frankly, I’m amazed Microsoft gives in enough to award some of their bullet points “a tie” with the other browsers.

The community did not let this go by. A Mozilla developer posted a humorous response, which does mention the Firebug tool. Other responses have been posted as well, including another response chart that is more informative and readable.

Microsoft is also promoting IE8 with offers of charity donations and a $10k cash prize (only discoverable using IE8). TechCrunch posted a response to those efforts.

The frequent question being asked is: who is the target audience for this IE8 campaign? Web site designers already know IE8’s oddities, and end users are switching away from IE at an ever-increasing rate. Corporations will be very slow to adopt IE8, especially after any pain Vista gave them, and they usually base their migrations on licensing and support needs rather than hype.

If you are going to get the facts, get ALL of the facts. What you don’t know could end up costing you a lot of time and money.

Syndicated 2009-06-21 03:25:59 from Keith Barrett Online » Technology

RIP Analog TV

In case you missed it, on June 12th all television stations in the USA finally switched from analog signals to digital. As expected, there was a lot of last-minute chaos as people who were suddenly out of waiting time flooded government offices with requests for discount coupons or help-desk assistance.

For the large part of the population not located near major cities and used to receiving a weak signal, this will seem like a bad idea. Instead of being able to watch a slightly snowy picture, they will get nothing at all, or a picture that keeps cutting out. Digital signals are either received well enough to produce a picture, or there is no picture at all. Gone are the days when an unusual weather front caused excitement by letting you receive channels you had never seen before from 3 states away. And those old crystal radio projects won’t pick up the audio from TV channel 8 anymore either. No matter how you look at it, this is a historic event.

The largely unanswered question is: what about the emergency broadcast systems? The government pushed for this change in order to make money from the sale of the frequency spectrum. Will we eventually start hearing about a need for funds to upgrade all the emergency broadcast equipment? Emergency systems tend to be analog, because analog signals are simple and carry the furthest. Nature is analog, as are human eyes and ears. Until the human brain can be directly plugged in, digital data has to be converted to analog signals in order to be seen or heard by people. Analog equipment is also less vulnerable to some of the situations that may exist in a national emergency, such as electromagnetic pulses, radiation, static electricity, and extreme temperatures. Everyone’s old battery-powered portable TVs are now headed for the landfill: if you don’t have AC power to operate a TV, you also don’t have it to operate a converter.

Sounds like there might be a market for battery-operated converter boxes.

Syndicated 2009-06-15 03:29:10 from Keith Barrett Online » Technology

MegaTeraPetaBytes

The Telegraph published an article last week discussing the future application of new optical disc technology. Within 5 years it’s expected that optical discs will be able to store over 5TB of information.

The fact that all storage media are becoming massively large while remaining affordable is going to cause tremendous changes in society. It’s not just a matter of being able to store your entire music collection, your HD movie collection, or even your whole life’s worth of records and photos on a single pocket-sized or rice-grain-sized device. When storage goes away as a barrier, especially solid-state storage like memory, you can carry everything with you, anywhere, always, and never delete anything, ever. Any electronic activity (watching videos, reading, recording shows) can wait until it’s convenient because you can store it for later. Everything becomes portable. You’ll be able to keep large amounts of Internet data as cache to combat outages or bandwidth problems. All information … everywhere … online and accessible. In multiple copies to protect it. With room to spare.
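For a sense of scale, here is a quick back-of-envelope sketch; the per-item file sizes are my own rough assumptions, not figures from the article.

# Rough capacity estimates for a single 5 TB disc (assumed file sizes).
TB = 10**12                     # one terabyte, in bytes

disc = 5 * TB
song = 5 * 10**6                # ~5 MB per MP3 (assumed)
photo = 3 * 10**6               # ~3 MB per photo (assumed)
hd_hour = 4 * 10**9             # ~4 GB per hour of HD video (assumed)

print("Songs:", disc // song)                  # ~1,000,000
print("Photos:", disc // photo)                # ~1,666,666
print("Hours of HD video:", disc // hd_hour)   # ~1,250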

It’s freedom, and it’s Big Brother. Which direction will it go?

Syndicated 2009-06-13 20:47:49 from Keith Barrett Online » Technology

Losing Company Honesty

Something the Internet has been good at is helping create companies that go through tremendously fast growth. Within the span of just a few years we can observe what used to take 20 or 30 years in a company’s life cycle. When customers experience that kind of pace from a service, changes that would once have been subtle become easy to see. Some of the first things to change are openness, honesty, and accessibility.

Internet startups tend to be more open than larger companies. They have little choice: their survival depends on building a trustworthy customer relationship and word-of-mouth endorsement, and they don’t have a lot of spare resources to bury unpleasantness. You can see this when you need to report a problem or get special assistance and the information for contacting them or submitting problems is easy to find on their web site. Even more so when you can get hold of the very people responsible for supporting or creating the software or service you are using.

But unless a company has a very good product or reliable service, their growing customer base can overwhelm them with problem reports and support demands. Sometimes they’ll hide their email address in favor of an online form, or make it impossible to find out how to contact them at all. Their FAQ becomes a controlled PR tool, existing just to answer obvious questions while not even mentioning the problems the majority of users are experiencing. These are practices designed to make a company look good (perhaps to lure in a buyer) rather than sustain a long-term reputation for customer satisfaction.

I experienced this just today with Facebook. Large numbers of my photos “vanished” from my profile, probably for the 3rd time in 2 years. Their FAQ is filled with entries like “How do I do this?”, but nothing about the recurring problem of lost photos, a problem very visible to users and frequently discussed on the Internet. None of the links on their site lead to a “contact us” page. It took me 20 minutes of full-site searching to find an online form for submitting a problem. I was happy to find it, but it was so generic that I have no idea if it will be seen by anyone in a position to act. It did not assign me a request number or say that anyone would be in contact. It did not even allow me to classify the type of problem I was reporting. [UPDATE: It sent me a confirmation email that the problem was received. I never got an answer back from a person.]

Compare that with Disqus or Google. Google does have a “contact us” link on its company page, and its project pages usually have Internet groups, forums, or wikis so any knowledgeable person can help. Disqus is also known for its developers responding to requests for help, even via Twitter.

Whether a service is free or paid, customers do not appreciate being treated like an annoyance. If your company does not have the resources to respond to its customer base, set up forums and wikis and help people help each other, but keep it honest. Don’t hide problems people need help with. Appreciation for providing help is great PR. Let FAQs actually be driven by the frequently encountered problems. Prioritize fixing frequently reported problems so they disappear from public discussion on their own. If a company doesn’t provide these resources itself, the community may do it anyway, and when it does your company will have less of a voice in it.

Syndicated 2009-06-10 06:25:54 from Keith Barrett Online » Technology

DMV Bans Smiles in License Photos

Because more and more government systems are implementing face-recognition technology, they are requiring that everyone wear an unemotional expression when having photos taken so the software can match faces more reliably. A story came out today that the Virginia DMV Bans Smiles in Driver’s License Photos.

It seems to be standard practice lately for agencies and law enforcement to impose obviously inappropriate policies on citizens and wait until the policies are fought before removing them. It wasn’t until the last few years that social security numbers were removed as ID numbers. We seem to be one large beta-test population for every piece of photo and scanning technology the government buys. It may be easier to apologize than to ask permission, but it’s usually more expensive too. I hope someone challenges this in court, because it should never be illegal to smile, and you should never be refused your tax-paid government services because you do.

The Declaration of Independence says “Life, Liberty, and the Pursuit of Happiness”. We need more happiness. All government IDs are going to look like mug shots. And it’s not a matter of choice: the DMV software will actually reject your photo if you smile and ask you to take it again, so unless you cooperate you do not get a driver’s license.

Remember that as you continue to give away your freedom and privacy, you are also not allowed to smile in the process.

Syndicated 2009-06-04 06:37:22 from Keith Barrett Online » Technology

Ridiculous IT Job Posts

I received an unsolicited email today advertising an open Information Technology position. I’ve written previous articles referencing studies that report IT is the most abused job you can have, and it’s job postings like this one (all too common) that show it:

Unix and OpenVMS Administrator

Hello,

My name is Arjun and I’m a recruiter at [redacted].

We have an urgent contract for one of our direct clients:

Job Title:  Unix and OpenVMS Administrator
Location:  RICHMOND, VA

Job Description:

Required Skills : Unix and OpenVMS

1. Must live and work Richmond VA on site daily presence is required by the contract with our customer
2. Willing to work OT in a manufacturing environment
3. Willing to be on call rotation
4. Can pass background check and drug screening
5. DEPENDABLE
6. RELIABLE
7. Strong communicator
8. Excellent verbal and written skills
9. Works well in structured TEAM environment

Technical skills required
1. Excellent/proven knowledge & skills in OpenVMS 6.2 and 7.2
2. Knowledge of VMS running in Windows via Charon emulator
3. Knowledge and use of Change Management tools and process

Nice to have
1. Excellent skills in Windows 2K and above
2. RDB
3. Knowledge of LINUX and/or HP UX v11,
4. Understanding of SAN storage relating to HP – UX

If you are qualified, available, interested, planning to make a change, or know of a friend who might have the required qualifications and interest, please call me ASAP, even if we have spoken recently about a different position. If you do respond via e-mail please include a daytime phone number so I can reach you. In considering candidates, time is of the essence, so please respond ASAP.

[Redacted] is a global IT Consulting company with over 30 Fortune 500 customers.  You may visit our website to learn more about us.

The first 6 requirements listed, the most important ones, are that you (1) move here, (2) work frequent overtime, (3) carry a pager (very likely 24×7 given the OT requirement), (4) pass background and drug checks, and (5 & 6) be reliable and dependable despite all these demands on your personal life. The post then points out a few technical requirements. The entire post does not explain what the job is, the real responsibilities, or the rewards. The best it does is list a few desired technical skills for a computer system that has not been made in 25 years.

Any employer seeking skills 20+ years out of date will need to be much more flexible if they hope to attract anyone from that niche. It’s also humorous to toss in at the end that Unix, Linux, Windows, and SAN skills are a plus. Besides painting a picture of seeking only supermen willing to give up their lives for the job, the “nice to have” skills are in conflict. I actually meet the unique technical requirements of this position, and I can tell you that people who know VMS and Unix and Linux and Windows are few, and all of them would expect fair compensation for this work even before the on-call demands are considered.

It’s hard to know who’s at fault here. The company could be such a demanding, unrewarding place that this is the best way to write up the opening. Or it could be that the recruiter has no idea how to promote jobs. The most attractive thing about this job, the way it’s written, is that it’s a contract, meaning it would eventually end.

What’s more common is that IT job postings have such a long list of required skills that only supermen or extremely niche people can fill them. The usual reason is that a company is trying to find one person to do the work of several, or a long-time employee who was involved with multiple systems has left and they are trying to find an identical replacement.

From a recruiting perspective, a job posting is an advertisement. This one should have promoted the benefits of working for the company and the compensation for the demands being made. Remember, you are trying to attract the best people you can.

If this posting was written as a restaurant ad it would read like:

Come dine with us – we serve 20-year-old peanut butter sandwiches. You will be forced to stay all day to ensure you have breakfast, lunch, and dinner here, and will be required to wear a shirt advertising our business anytime you are in public. The kitchen will be noisy, and you will be required to prove you aren’t sneaking in any other food. You must smile the whole time, and the final cost of your experience will be a surprise. Experience eating lobster or working as a gourmet chef or a farmer is desirable.

Syndicated 2009-05-31 02:37:12 from Keith Barrett Online » Technology
