Older blog entries for kgb (starting at number 356)

Twitter’s Growing Pains

Twitter is going through some interesting changes, and I’m not talking about the recent theft of internal materials. On the one hand they are forming partnerships and continuing to grow at an ever-increasing rate; on the other they have to deal with a wealth of service abuse and SPAMmer tricks.

GoDaddy’s domain control panels now include Twitter integration. For any domain you have registered, you can instantly check whether the matching Twitter username is still available and create the account right from your GoDaddy account.

Why would they offer this?

There would have to be money or recognized value in it for GoDaddy. Perhaps Twitter is making promotional deals with companies. In the “Free” or Open Source world, income from corporate support is frequently the more valuable kind, and it can be significant when you have a large audience of eyes to offer in exchange; we know from TechCrunch that Twitter held discussions with Google and Microsoft. Buying a domain is often a person’s first step from casual Internet user to more serious consumer of Internet technology. Another reason could be that GoDaddy offers business promotional packages and uses this as a lead-in. I suspect we are going to see more of this Twitter cross-company promotional activity. It’s the logical next step in their growth – become visible everywhere.

Twitter is going through some difficult growing pains.

In addition to being a more visible target for hackers, Twitter is becoming a favorite tool of marketers and SPAMmers. Why? It’s a revisit to old, familiar times. When search engines started, they drove their categorization and ranking from a site’s use of keywords and a scan of the page content. That was subject to a lot of control and eventual manipulation by the site owner, so today results are driven by a complex formula that includes links, HTML parsing, content assessment, frequency of publication, cross-site connectivity, and more; keywords are barely a factor in any of it. On Twitter, however, keywords and hashtags are all you have. The best way to get noticed in only 140 characters is to cram them full of buzzwords, even when those words have nothing to do with your message. During the Michael Jackson funeral I saw German dating services throw every Jackson-related keyword they could into their tweets, with their only real content being a short URL to click.

Search engines, email, and web pages generally require the user to go to the message to see it. Twitter is a bulletin board service with a potential audience of … ANYONE. Have you ever watched the public stream for a while? Some companies broadcast ad tweets once a minute. And like the days of email before smart filters, all you need to do is include a user’s @id in a message to land in that person’s reply inbox. There are no SPAM filters or opt-in requirements to deliver that message. People with a lot of followers are eventually forced to use a tool that filters out every reply except specific people or topics they are interested in.
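Such a filter doesn’t need to be complicated. Here is a minimal sketch in Python, assuming a reply is just a record with a sender and some text; the field names, trusted accounts, and topics are illustrative placeholders, not anything from Twitter’s actual API:

    # Hypothetical data and lists, for illustration only.
    TRUSTED_SENDERS = {"friend1", "friend2"}
    TOPICS_OF_INTEREST = {"python", "conference"}

    replies = [
        {"sender": "friend1",   "text": "See you at the meetup"},
        {"sender": "spambot42", "text": "CHEAP WATCHES click here"},
        {"sender": "stranger",  "text": "Great Python tip, thanks!"},
    ]

    def keep_reply(tweet):
        """Keep an @reply only if it comes from a trusted account
        or mentions a topic the user cares about."""
        if tweet["sender"] in TRUSTED_SENDERS:
            return True
        text = tweet["text"].lower()
        return any(topic in text for topic in TOPICS_OF_INTEREST)

    inbox = [t for t in replies if keep_reply(t)]
    print(inbox)  # keeps the first and third replies, drops the spam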

There is also the issue of adult material. I received my first unsolicited adult photo as an @reply this year, from a hit-and-run Twitter account. When people can quickly create an account, tweet once, and abandon it, you will get problems like this. There is no filter or rating service to prevent offensive material from reaching anyone, including minors.

Twitter must be dealing with squatters as well. While I haven’t (yet) seen anyone sell or auction off a Twitter username, I have seen companies advertise selling their followers. It’s a common problem in domain management that people quickly buy up all the 1-, 2-, 3-, and 4-letter domain names, as well as common nouns and marketing terms, and this has to be a growing problem on Twitter. As favorite names are taken, they become more sought after by anyone willing to negotiate for them.

Twitter is working on these issues. Earlier this year they quietly announced an upcoming “verified identity” service to help combat squatters and impostors, and they’ve started to register and protect aspects of their branding. These moves serve the company more than the users; there are many basic features commonly used by other services (rating systems, account-creation verification, scanners, captchas, encryption) they could adopt to help fight abuse and protect their customer base. Their API also has rules preventing accounts from subscribing to too many people too quickly or sending messages too frequently, but you can circumvent that by creating disposable accounts.

It’s a difficult walk.

Twitter wants to encourage people to subscribe to other people while preventing the problems that having a large following creates. It’s in the company’s best interest to have a large number of subscribers, each with a small, manageable following that includes several trusted accounts (such as celebrities) Twitter can leverage.

Third-party tools such as TweetDeck are the most useful for keeping the service usable for people with large followings. If Twitter continues growing the way other startups do, they will eventually purchase or invest in one of these tools and brand it.

Syndicated 2009-07-17 15:21:28 from Keith Barrett Online » Technology

Real-Time Search

The web is moving rapidly toward real-time. Real-time display of messages, real-time display of posts and comments, real-time status updates from friends, disposable chats, even real-time video.

The future and strength of the Internet is in providing and spreading information instantly. Some institutions will push to capitalize on delaying information (for example stock quotes, law enforcement, and government), but in the long term this will have limited success. When every person is armed with a pocket computer and instant access to the world, you can’t hide much.

What’s not quite there yet is real-time search, because that has a dependency on the real-time notification of content publishing, which too is lacking.

Traditional search engines crawl the web: they start at some web page and record all the links found to other locations, along with representative content of the page. As the links are followed, more links are encountered, and so on. The end result is a massive database that can be used to return search results. Not all content is easily captured today (video files and the imagery within photographs, for example), but as technology progresses these obstacles will be overcome.
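As a rough sketch of that crawl loop in Python (the seed URL, page limit, and the “store the raw HTML” shortcut are simplifying assumptions; real engines parse, rank, and index far more carefully):

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkParser(HTMLParser):
        """Collect the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed_url, max_pages=50):
        index = {}            # url -> page content (a stand-in for the search index)
        queue = [seed_url]    # links waiting to be visited
        seen = set()
        while queue and len(index) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except Exception:
                continue      # unreachable or non-text page; skip it
            index[url] = html                      # record representative content
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                queue.append(urljoin(url, link))   # follow links to find more pages
        return index

    # Example use: index = crawl("http://example.com/")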

In real-time search, content is cataloged as it is published. Search results would include this information immediately and automatically update your display with the changes. Real-time search would probably be used alongside traditional crawling, but to do it at all means search engines need to know WHEN something has changed, instantly. The blogging “ping” system is a working example of a commonly used publish-notification system. Services like Google Alerts and RSS feeds also publish data as quickly as their source systems want them to. To do this effectively, however, the notification needs to include details of the data being published or changed, not just a ping or a link. If everyone adopted a standard of real-time notification, the dependency on crawling would go away, but the practice would still occur because of the desire to capture “the deep web” and any data excluded (accidentally or intentionally) from the notification network.
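For reference, a blogging ping is just a tiny XML-RPC call. A minimal sketch in Python, with the endpoint and blog details as placeholders (the weblogUpdates.extendedPing variant adds the changed page and feed URLs, a small step toward the richer notifications described above):

    import xmlrpc.client

    # Placeholder endpoint and blog details; weblogs.com-style ping servers
    # accept a weblogUpdates.ping call naming the blog and the URL that changed.
    server = xmlrpc.client.ServerProxy("http://rpc.weblogs.com/RPC2")
    response = server.weblogUpdates.ping(
        "Keith Barrett Online",       # human-readable blog name
        "http://example.com/blog",    # URL that was just updated
    )
    print(response)  # typically a dict like {'flerror': False, 'message': '...'}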

Currently real-time search is specialized. Twitter, Facebook, and FriendFeed all have their own search options for their data. Tweetmeme is an example of an external service providing real-time search and trending results for Twitter posts, but only over a short window of about a week.

The Christian Science Monitor posted an article on upcoming real-time search engines, discussing CrowdEye, Collecta, and Google; I encourage you to read it. Google is expected to come out with its own micro-blogging service in the future, and already having the most popular search engine means that if they can sway people away from Twitter they could be the leader in this space too. Microsoft has yet to give a clue about how they will respond, but if Bing becomes popular it would be an obvious tool to build this onto.

[Updated 7/6/2009 to include mention of the blogging ping network]

Syndicated 2009-07-06 06:45:51 from Keith Barrett Online » Technology

Microsoft Bing

Microsoft launched a search engine – Bing. They call it a “decision engine,” but it really is a search engine, trying to capture and/or crawl the web with the goal of returning a good result for your query.

Google is so popular and well known that “to Google” something became a verb. Why would anyone without some sort of revolutionary new technology try to compete with Google in the search engine market? Because Microsoft is big, and Google does not own that entire market. Capturing even 1% of the search engine market can potentially produce millions in profit.

There are a fair number of search engines out there, some highly specialized. Until Microsoft, the most recent entry was Wolfram Alpha. Calling itself a “computational knowledge engine” and using artificial intelligence, its goal was to extrapolate from the data on the Internet and respond to your query with a single direct answer. It’s an interesting tool but falls far short of being as helpful as Google.

In a short time, people are already calling Google, Yahoo, and Bing the “big 3” search engines. Yahoo is a busy, distracting place. Google took off because it was simple, clean, and uncluttered: results were the only thing displayed. The Bing screen is done the same way, making me believe its developers either copied or learned from Google’s experience.

So how do these “big 3” compare to each other? Someone has created a blind test tool so you can decide for yourself. Go to Blind Search, enter your query, and pick which engine returned the best responses. The results surprised me. I love Google, and have had a lot of past issues with Microsoft, but for the few selfish queries I did on my own name and blog, Bing came back with me as the top-ranked result. On other queries the favorite results bounced between Bing and Google, so for a brand new search tool Bing is doing surprisingly well.

UPDATE 08/2009: Microsoft was caught manipulating search results to favor themselves over the competition. For example, searching for “why is windows so expensive” produced a first page, different from Google’s, listing reasons why Macs and Linux were more expensive than Windows. They corrected this, but shortly afterward another instance was found (and corrected), so there is now a trust problem with Bing’s results.

Syndicated 2009-06-30 00:53:31 from Keith Barrett Online » Technology

Using Technology Against The Consumer

Panasonic released a notice that their camera firmware will force consumers to use only Panasonic batteries, continuing a trend of applying technology to restrict consumer freedom.

I remember when software license keys first appeared in the early 1980s. I was working at Digital Equipment Corporation at the time and had to deal with entering long hexadecimal number sequences to unlock functionality that had existed just a day before without them. Until that point people simply bought software and owned it. If you needed it on 3 computers, you bought 3 copies. Computer networking changed that: now you could install one copy and people could share it.

Software manufacturers initially just switched from charging you per copy to per user, but then they changed their code so that only a certain total number of people could use it, and if you wanted more you had to pay more. The more aggressive companies based the limit on the total number of POSSIBLE users rather than the number of SIMULTANEOUS users. It didn’t matter that the code was identical and there was nothing special about 50 people versus 5 people except the result of an “If” statement; you had to pay more. You had enough hardware and processing power, but you were still stuck. It was an artificial limitation, like buying a stove with 2 working burners where you had to send more money if you wanted all 4 to work.

Then the limitation was further restricted to EXPIRE, forcing you to send money every year or so just to keep the switch turned on (whether you used it once or 10,000 times in that period). Then copy protection was introduced, sending a “we don’t trust you” message to the customer while interfering with his/her ability to back up their own purchased products. This all caused a lot of controversy back then and still does today. It’s what started the Free Software Foundation and the Open Source movement.
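To make that “If” statement concrete, here is a hypothetical sketch (not any vendor’s actual licensing code): the program that serves 50 users is identical to the one that serves 5, apart from a single constant.

    # Hypothetical licensing check, for illustration only.
    MAX_LICENSED_USERS = 5        # pay more and the vendor ships you a bigger number

    active_users = set()

    def login(username):
        if len(active_users) >= MAX_LICENSED_USERS:
            raise PermissionError("User limit reached - purchase additional licenses")
        active_users.add(username)

    for user in ["amy", "bob", "carol", "dan", "eve", "frank"]:
        try:
            login(user)
        except PermissionError as err:
            print(user, "locked out:", err)   # frank is refused purely by policy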

This was eventually adopted in hardware too. CPU manufacturers used to sell different versions of their processors at different prices based simply on the results of quality control: processors that failed tests at high speeds but worked at lower speeds could be sold as lower-speed processors at a reduced price. That seemed fair, and it was better than throwing them away. But then they adopted the practice of artificially crippling a CPU in a number of ways (turning off the cache, disabling the floating point unit, etc.), sometimes to keep their previous product from becoming instantly obsolete. This caused a controversy similar to software licensing, because now the more expensive processor was identical to the less expensive one, except that something was simply turned off and you had to pay more to turn it on.

As digital photography and printers began eliminating film, manufacturers of consumables switched from selling film to selling ink cartridges. In the age of film cameras there were a few (such as the Instamatic 44 and Polaroid) that produced cameras accepting only a specific type of film, but for the most part everyone adopted 35mm. With digital technology, many printer manufacturers try to force consumers to buy only THEIR ink cartridges by adding a little processor intelligence to the printer to detect what is inserted. This was brand loyalty forced instead of earned. Consumers had no choice but to purchase a specific, higher-priced cartridge that was really no better than a less expensive one, because the printer was smart enough to reject anything else.

As technology began enabling manufacturers to override consumer choice, the technology itself was reverse-engineered and overcome time and time again. Taking things apart to learn how they work, or making them work better, has been a part of life since the beginning of curiosity and the scientific method. Thus in 1998 the DMCA was passed, making reverse engineering, and even discussion of it, a legal matter in the USA, since it was unlikely it could ever be prevented otherwise. The massive issues this created concerning publishing security vulnerabilities, scientific research, education, product cross-compatibility, product backward compatibility, international law, and more are beyond the scope of this article.

This all moved into media, where copy protection went from preventing you from making a copy of your purchase to dictating what you could watch or listen to, when, and on what equipment. Digital restrictions (called Digital Rights Management by the seller) force consumers to re-purchase the same product over and over again, while making it much more difficult to resell a used copy on the open market. Stories of CDs and DVDs not playing on some equipment were common. It won’t stop here; eventually DRM will prevent people from connecting certain types of monitors to certain types of receivers using certain types of speakers (and perhaps certain brands of wire, all in the name of ensuring quality). Lawsuits have been filed by the blind, citing the Americans with Disabilities Act, because they are prevented from listening to digital books when publishers disable the text-to-speech feature on the Amazon Kindle to protect their audiobook sales.

The vendor arguments in favor of these practices concern prevention of theft and piracy, protection of assets, sustainability of business, contractual obligations, quality, safety, etc. Since I am primarily a consumer, and technology savvy, these restrictions tend to inhibit my ability to fully enjoy the products I buy rather than help me control something I’ve produced. I can’t help but wonder what today’s state of technology would look like if all these legal and artificial means of control and crippling were not around; if processors were allowed to be as fast and as cheap as the market dictated, software upgrades happened because people were attracted to new functionality instead of because a license expired, businesses were forced to adapt to change, consumers could freely copy their movies and music to any device they owned, and manufacturers and sellers had to make a profit based only on how good their products, innovations, and history were.

Syndicated 2009-06-26 02:06:46 from Keith Barrett Online » Technology

IE8 Get The Facts Campaign

Microsoft published a “fact sheet” comparing a short list of features in IE8 with other browsers. My first thought on reading it was that they must think their customers are idiots. InfoWorld wrote an article saying just that. Robert Cringely also has an article warning about major issues with IE8.

Everyone who works with CSS ends up creating an IE-specific file, because IE’s implementation of the standards is so broken that it’s impossible to write correct code and keep IE happy at the same time. It is funny to see Microsoft list a new IE-specific security or privacy feature and then claim they are more secure/private than everyone else because the other browsers lack that feature. What about all the features the others have that IE8 doesn’t (or, more to the point, what about IE8-specific vulnerabilities)? And does anyone understand that last bullet description? Frankly I’m amazed Microsoft gives in enough to score some of their bullet points as “a tie” with the other browsers.

The community did not let this go by. A Mozilla developer posted this humorous response, which does mention the Firebug tool. There are other responses posted as well. There is another response chart here that is more informative and readable.

Microsoft is also promoting IE8 with offers of charity donations and a $10k cash prize (only discoverable using IE8). TechCrunch posted a response to those efforts.

The frequent question being asked is: who is the target audience for this IE8 campaign? Web site designers all know the oddities of IE8, and end-users are switching away from IE at an ever-increasing rate. Corporations will be very slow to adopt IE8, especially given any pain Vista caused them, plus they usually base their migrations on licensing and support needs rather than hype.

If you are going to get the facts, get ALL of the facts. What you don’t know could end up costing you a lot of time and money.

Syndicated 2009-06-21 03:25:59 from Keith Barrett Online » Technology

2 Jul 2010 (updated 2 Jul 2010 at 19:41 UTC)

RIP Analog TV

In case you missed it, on June 12th all television stations in the USA finally switched from analog signals to digital. As expected, there was a lot of last-minute chaos as people who suddenly had no more waiting time flooded government offices with requests for discount coupons or help desk assistance.

For much of the population not located near major cities and used to receiving a weak signal, this will seem like a bad idea. Instead of being able to watch their slightly snowy picture, they will get nothing at all or a picture that keeps cutting out; digital signals are either received well enough to produce a picture, or there is no picture at all. Gone are the days when an unusual weather front caused excitement by letting you receive channels from 3 states away that you had never seen before. And those old crystal radio projects won’t pick up the audio from TV channel 8 anymore either. No matter how you look at it, this is a historic event.

The largely unanswered question is: what about the emergency broadcast systems? The government pushed for this change in order to make money from selling off the frequency spectrum. Will we eventually start hearing about a need for funds to upgrade all the emergency broadcast equipment? Emergency systems tend to be analog, because analog signals are simple and carry the furthest. Nature is analog, as are human eyes and ears; until the human brain can be directly plugged in, you need to convert digital data to analog signals for people to see or hear it. Analog equipment is also less vulnerable to some of the conditions that may exist in a national emergency, such as electromagnetic pulses, radiation, static electricity, and extreme temperatures. And everyone’s old battery-powered portable TVs are now headed for the landfill – if you don’t have the AC power to operate a TV, you also don’t have it to operate a converter.

Sounds like there might be a market for battery operated converter boxes.

Syndicated 2009-06-15 03:29:10 from Keith Barrett Online » Technology

MegaTeraPetaBytes

The Telegraph published an article last week discussing the future application of new optical disk technology. Within 5 years it’s expected that optical discs will be able to store over 5TB of information.

The fact that all storage media are becoming massively large and still affordable is going to cause tremendous changes in society. It’s not just a matter of being able to store your entire music collection, HD movie collection, or even your whole life’s worth of records and photos on a single pocket-sized or rice-grain-sized device. When storage goes away as a barrier, especially solid-state storage like memory, you can carry everything with you, anywhere, always, and never delete anything, ever. Any electronic activity (watching videos, reading, recording shows) can wait until it’s convenient because you can store it for later. Everything becomes portable. You’ll be able to keep large amounts of Internet data as cache to combat outages or bandwidth problems. All information … everywhere … online and accessible. In multiple copies to protect it. With room to spare.
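Some quick back-of-the-envelope arithmetic shows why; the per-item sizes below are rough assumptions, not measurements:

    disc_bytes  = 5 * 10**12     # a 5TB disc
    song_bytes  = 5 * 10**6      # ~5MB per compressed song (assumption)
    movie_bytes = 4 * 10**9      # ~4GB per HD movie (assumption)
    photo_bytes = 3 * 10**6      # ~3MB per photo (assumption)

    print(disc_bytes // song_bytes)    # ~1,000,000 songs
    print(disc_bytes // movie_bytes)   # ~1,250 HD movies
    print(disc_bytes // photo_bytes)   # ~1,666,000 photos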

It’s freedom, and it’s big brother. Which direction will it go?

Syndicated 2009-06-13 20:47:49 from Keith Barrett Online » Technology

Losing Company Honesty

Something the Internet has been good at is helping create companies that go through tremendously fast growth. Within the span of just a few years we can observe what used to take 20 or 30 years in a company’s life cycle. When customers experience that kind of pace from a service, changes that were once subtle become easy to see. Some of the first things to change are openness, honesty, and accessibility.

Internet startups tend to be more open than larger companies. They have little choice: their survival depends on building trustworthy customer relationships and word-of-mouth endorsements, and they don’t have a lot of spare resources to bury unpleasantness. You can see this when you need to report a problem or get special assistance: the information for contacting them or submitting problems is easy to find on their web site, and often you can get hold of the very people responsible for supporting or creating the software/service you are using.

But unless a company has a very good product or a reliable service, its growing customer base can overwhelm it with problem reports and support demands. Sometimes they’ll hide their email address in favor of an online form, or make it impossible to find out how to contact them at all. Their FAQ becomes a controlled PR tool – existing just to give answers to obvious things while not even mentioning the problems the majority of users are experiencing. These are practices designed to make a company look good (perhaps to lure in a buyer) rather than sustain a long-term reputation for customer satisfaction.

I experienced this just today with Facebook. Large numbers of my photos “vanished” from my profile, probably for the 3rd time in 2 years. Their FAQ is filled with items like “How do I do this?”, but nothing about reporting the recurring problem of lost photos, a problem very visible to users and frequently discussed on the Internet. None of the links on their site is a “contact us”. It took me 20 minutes of full site searches to find an online form for submitting a problem. I was happy to find it, but it was so generic that I have no idea whether it will be seen by anyone in a position to act. It did not assign me a request number or say that anyone would be in contact. It did not even allow me to classify the type of problem I was reporting. [UPDATE: It sent me a confirmation email that the problem was received. I never got an answer back from a person.]

Compare that with Disqus or Google. Google does have a “contact us” on their company page, and their project pages usually have Internet groups, forums, or wikis so any knowledgeable person can help. Disqus is also known for its developers responding to requests for help, even via Twitter.

Whether a service is free or paid, customers do not appreciate being treated like an annoyance. If your company does not have the resources to respond to its customer base, set up forums and wikis and help people help each other – but keep it honest. Don’t hide problems people need help with. Appreciation for providing help is great PR. Let FAQs actually be driven by the frequently encountered problems. Give priority to fixing frequently reported problems so they disappear from public discussion on their own. If a company doesn’t provide these resources itself, the community may do it anyway, and when they do your company will have less of a voice in it.

Syndicated 2009-06-10 06:25:54 from Keith Barrett Online » Technology
