Moving back to the client

Posted 27 Jun 2002 at 07:24 UTC by mglazer

Why client-side applications are the new move for personal security, independence, and unclogging the Internet superhighway of misinformation.

There have been many extraordinary advances in the client side area of programming.

Some include: XML, XSLT, userData, persistence, XML stores, CSS, JSRS, chromeless windows, DHTML, WYSIWYG editors, XMLHTTP, bookmarklets...

Advantages are:

  1. More secure
  2. Saves load on Internet servers
  3. Quicker load times for the end user
  4. More independence and personal control of your data and information by localization

Creating dynamic HTML with JS, editable glyphs, XML namespaces, the XML object (XSLT transformations), locally storing user input in the Windows userData XML stores, JSRS for remote storage, XMLHTTP, JS GET vars (still no POST vars, understandably).

Here's a simple example: think of the silliness of running MD5 on the server after the client has already plastered their password across the Internet; only once it arrives does it get hashed. Wouldn't it make more sense to MD5 passwords on the client? JS allows you to do this as well (http://pajhome.org.uk/crypt/md5/).
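
As a rough sketch of the mechanics (assuming the hex_md5() function provided by the library linked above, and made-up field names), the plaintext is replaced with its digest just before the form goes out:

  // assumes md5.js from the page above is included, which provides hex_md5()
  function hashBeforeSubmit(form) {
    form.pw_hash.value = hex_md5(form.pw.value);  // send only the digest
    form.pw.value = "";                           // never send the plaintext
    return true;
  }
  // used as: <form onsubmit="return hashBeforeSubmit(this)"> with fields pw and pw_hash

The server then compares the submitted digest against its stored digest instead of the plaintext.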

The most secure forms of encryption are based on the client's portion of the secret never leaving the machine, such as PGP (http://pgp.com/). PGP works through a combination of public and private keys. The public key is Internet-ready and can be stored, sent, and passed around the web, but it is useless without the locally stored private key, which is never sent anywhere and only works in combination with the other key and the end user's passphrase to gain access.

The Microsoft XML parser (http://www.w3schools.com/xml/xml_parser.asp)

Remote scripting "getting information from the server without refreshing the page" (http://www.ashleyit.com/rs/)

userData behavior: XML data stores on the client, more dynamic and larger than cookies (http://msdn.microsoft.com/library/default.asp?url=/workshop/author/behaviors/reference/behaviors/userdata.asp)
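
As an illustration only (this is IE 5+ specific; the store name "prefs" is made up), the behavior is attached through CSS and then written and read back with save() and load():

  <div id="store" style="behavior:url(#default#userData)"></div>
  <script type="text/javascript">
    var store = document.getElementById("store");
    store.setAttribute("nickname", "mglazer");  // stash a value on the element
    store.save("prefs");                        // persist it under the store name "prefs"
    // ...on a later visit:
    store.load("prefs");
    alert(store.getAttribute("nickname"));
  </script>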

JS Client side HTTP GET VARS "Passing variables with a QUERY string" (http://www.sjhdesign.com/scrptwrx/scripts/variabl3.htm)
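
The idea is simple enough to sketch without the library: for a URL like page.html?user=joe&lang=en, the query string can be pulled apart on the client (the parameter names are just examples):

  function getQueryVars() {
    var vars = {};
    var query = window.location.search.substring(1);  // drop the leading "?"
    if (!query) return vars;
    var pairs = query.split("&");
    for (var i = 0; i < pairs.length; i++) {
      var parts = pairs[i].split("=");
      vars[unescape(parts[0])] = unescape(parts[1] || "");
    }
    return vars;                                       // e.g. { user: "joe", lang: "en" }
  }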

XMLHTTP, both MSIE and Mozilla specs (http://msdn.microsoft.com/library/default.asp?url=/library/en-us/xmlsdk30/htm/xmobjxmlhttprequest.asp / http://unstable.elemental.com/mozilla/build/latest/mozilla/extensions/dox/interfacensIXMLHttpRequest.html)
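
A minimal cross-browser sketch (Mozilla exposes XMLHttpRequest natively, MSIE 5+ exposes it through ActiveX; the URL fetched here is only a placeholder):

  function createRequest() {
    if (window.XMLHttpRequest) {
      return new XMLHttpRequest();                    // Mozilla
    } else if (window.ActiveXObject) {
      return new ActiveXObject("Microsoft.XMLHTTP");  // MSIE 5+
    }
    return null;
  }

  var req = createRequest();
  req.open("GET", "data.xml", false);                 // synchronous, for brevity only
  req.send(null);
  alert(req.responseText);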

This is from Bookmarklets.com (http://bookmarklets.com/tools/new.html)

Here's one of the most useful bookmarklets I have:
  Go Wayback (Explorer 4+ and Netscape 4+)
Trigger it when you run into a 404. As time goes by, there will be increasingly more documents in the past than we have in the present. So the value of a service that solves 404s, such as Wayback (which archives old webpages) will increase.
Another good Wayback bookmarklet:
  Wayback Undo (Explorer 4+ and Netscape 4+)
which is necessary because once you go into Wayback you stay in Wayback... this allows you to escape to the present.
To push this a bit further, I'll bring up the topic of XMLHTTP (Microsoft's documents and Mozilla's documents).
This allows scripts on a page to request data from other pages, so you can get a bookmarklet like:
  Wayback Analyze (Explorer 5+ version)
which, if triggered on a result of Go Wayback, will list the sizes of the archived documents. This gives some sense of the overall variation in the page over time. It would be easy to modify this to allow other analyses of the archived pages... allowing searches or summarizations, etc.
The idea of XMLHTTP could be used in other ways. For example, this bookmarklet:
  Self-Link Titles (Explorer 5+ version)
lists the titles of all pages that link off the current page into the same domain. Again, search and summarization are fairly easy. So you get a way to analyze the "cloud of meaning" around the current page.
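
For flavor, a bookmarklet in the spirit of Go Wayback fits on one line; this is only an approximation of the one on bookmarklets.com, built on the Wayback Machine's URL scheme:

  javascript:location.href='http://web.archive.org/web/*/'+location.href

Triggered on a 404, it jumps to the list of archived copies of the current URL.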

Chromeless windows, multiple drag-and-drop, resizable windows (http://www.dynamicdrive.com/dynamicindex8/chromeless.htm / http://www.dynamicdrive.com/dynamicindex11/abox2.htm)

DHTML editing control, the really cool stuff (http://msdn.microsoft.com/downloads/samples/internet/default.asp?url=/Downloads/samples/Internet/browser/editcntrl/Default.asp)
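
The linked control is an ActiveX download, but the underlying editing idea can be sketched in plain script against IE: put an iframe into design mode and drive it with execCommand (the element id and command here are only examples):

  var editor = document.getElementById("editFrame");              // an <iframe id="editFrame">
  editor.contentWindow.document.designMode = "On";                // the frame becomes a WYSIWYG surface
  editor.contentWindow.document.execCommand("Bold", false, null); // acts on the current selection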

vcXMLRPC Library JavaScript XMLRPC library (http://www.vcdn.org/Public/XMLRPC/)

WASP JavaScript SOAP Client (http://www.systinet.com/eap/jsstack/demos/headlines/index.html)

Html Editor (http://www.insidedhtml.com/tips/contents/ts12/page1.asp) and tons more (TTW WYSIWYG Editor Widgets)

Cross platform XML parsing in JavaScript (http://xmljs.sourceforge.net/)
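
The library above is a pure-JS fallback; where the built-in parsers exist, a rough cross-browser wrapper looks like this:

  function parseXml(text) {
    if (window.DOMParser) {                                        // Mozilla
      return (new DOMParser()).parseFromString(text, "text/xml");
    } else if (window.ActiveXObject) {                             // MSIE
      var doc = new ActiveXObject("Microsoft.XMLDOM");
      doc.async = false;
      doc.loadXML(text);
      return doc;
    }
    return null;
  }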

Internet connectivity should and will be limited to application handlers with the advent of `web services` such as XML-RPC, SOAP, .NET, etc., which will no doubt save considerable Internet load.

To sum up, there is a lot you can do on the client, and you should: it saves resources and energy, doesn't clog the information highway, and is generally much more secure.

I would suggest an effort to impress upon others that building websites and posting to the web are more permanent, and thus more thoughtful, activities. Many people nowadays have very powerful PCs that they don't take advantage of. I submit that shifting application goals to the client side will be beneficial to all in the long term for the three benefits mentioned: saving Internet resources, personal security, and personal independence.


Wait, is this for real?, posted 27 Jun 2002 at 07:39 UTC by tk » (Observer)

mglazer, is this for real, or do you have something up your sleeves?

Freeing up bandwidth is a lost cause. After all, many people use the Internet for such meaningful activities as reading/writing junk, downloading huge MP3s and porn, and downloading utilities to help in downloading more MP3s and porn and other utilities, etc.

Whose Bandwidth, posted 27 Jun 2002 at 07:51 UTC by mglazer » (Journeyer)

There is bandwidth, and then there is my bandwidth.

The focus of the article (though not stated explicitly) was local or personal networks where you have control. Some local networks can have 3 to 20,000 or more people, yet they are not publicly accessible. So the issue you mentioned as being futile, bandwidth, is still an issue and has a better fighting chance in the environments noted below.

Such as university blue-chip wireless LANs, 3G broadband data transactional localized buffer servers, home-run WANs, etc.

In these environments an administrator has greater control over security, bandwidth, processes, etc., so there is still hope, and effort that can be made.

A joke?, posted 27 Jun 2002 at 14:40 UTC by deekayen » (Master)

Surely as tk suggested, this is a joke.

Client-side processing is never more secure than verifying information integrity server-side. Suggesting that client-side stuff is secure is what leads to so many PHP scripts with remote exploits, because they take certain client-side processing for granted. Have you considered how many sites on the Internet are probably vulnerable to SQL injection exploits from someone modifying a cookie set by the server? It's not hard to conceive, either. Consider this line in a newbie's script:

  // the cookie value goes into the query unescaped, so a tampered cookie becomes SQL
  $result = mysql_query("SELECT * FROM tablename WHERE userid = " . $_COOKIE["userid"]);

More independence and personal control of your data and information by localization
I know you work with PHP, so you can't tell me that the following functions shouldn't be used. A few: strftime(), localeconv(), localtime(), setlocale()

Quicker load times for the client are relative. You have the tradeoff of sending gobs of JavaScript with each request to dynamically generate HTML (assuming the client can process JS at all), or you can process it server-side with Smarty. The choice there is easy for me.

If I ever visit a site that has a message like "Please visit the Microsoft homepage to download an XML parser to be able to see our content", the next thing I'll actually visit is another site, never to come back.

I suppose next I'll hear that it's OK to verify form data with just JS to preserve server cycles for something else.

RE: A joke?, posted 27 Jun 2002 at 15:17 UTC by mglazer » (Journeyer)

FYI, 'client-side applications' means something different from what you assume.

You think it means the end user on a server-side application, which it doesn't. It means application programming, such as JavaScript, performed on the client's or end user's computer (not your remote server) before anything is sent to a server-side script handler.

Basic logic dictates that encoding on the client side is more secure than not doing so, i.e. passing an unencoded password through a public network and then encoding it on the server side after it has already been `exposed`.

Re: A joke?, posted 27 Jun 2002 at 15:51 UTC by tk » (Observer)

Passing the MD5 of a password over the Internet is no more secure than passing the password in the clear. Suppose an attacker knows that the MD5 of a user X's password, P, is equal to Y. Then in order to log in as X, he just sends Y to the server. He doesn't even need to know P.

Furthermore, true PGP/GPG-style security is difficult to achieve, maybe even impossible, using things like Javascript. There are many, many problems.

For one: why should I trust that the JavaScript downloaded from some server and running on my PC won't try to upload my private key to that server? (That's why new releases of PGP and GPG need to be cryptographically signed, using older versions of PGP/GPG.)

Another: Many cryptographic algorithms need a plentiful supply of randomness in order to be effective. Random numbers for cryptographic applications need to be a lot more random than your good old pseudo-random rand() and all that. (That's what the Linux device /dev/random is for, and that's why GPG actually includes an entire program just for the purpose of gathering randomness.)

And so on.

Security is a big topic. Even the experts in the field can't get it completely right.

how about sockets, flat files, and GUIs?, posted 27 Jun 2002 at 17:01 UTC by sej » (Master)

Is anyone else amazed by the plethora of mechanisms, standards, and formats required for this new age of Internet-enabled applications? And I'm still exploring what you can get done with sockets, flat files, and GUIs. The web took off because it was an even simpler way of developing applications than that. There are so many acronyms in the above article, I have a hard time believing the process is simple or bug free. But I'm speaking from a lack of experience in that domain. Maybe it is all a joke.

secure communication, posted 27 Jun 2002 at 17:17 UTC by sye » (Journeyer)

I firmly believe the best conduit of secure communication is to invent a 'natural' language between two agents who plan to participate in a private communication. If a third party is able to decode their message, it is an indication that they have a friendly force to join the ring of communication. If a third party blocks the conduit because of its foreignness to their standard, it is an indication that there is a hostile force that couldn't tolerate, or fears, the frequency and nature of this private communication.

Re: A joke?, posted 27 Jun 2002 at 18:13 UTC by deekayen » (Master)

Right, passing md5 instead of plaintext is only marginally better. Even better, and probably the best and most widely used now, is SSL.

RE: Security, posted 27 Jun 2002 at 18:24 UTC by mglazer » (Journeyer)

You're right, security is a big issue.

But no one besides Phil Zimmermann (a native New Yorker (http://www.mccullagh.org/image/d30-22/phil-zimmermann.html)), the creator of PGP (http://pgp.com/), has been able to create a stronger encryption system, one that can't be broken in less than 20 years by any computer of today.

There is a reason why PGP wasn't allowed by the US government to even be downloadable in foreign countries until a couple of years ago, and it is still not available in most Arab and other countries that are considered a threat and might use these encryption capabilities in negative ways.

PGP is the most powerful encryption tool: it uses a two-key system, public and private, and the private key, which is needed to use PGP, never leaves the client, following well-known encryption techniques and philosophies.

Nit-picking..., posted 27 Jun 2002 at 18:41 UTC by bgeiger » (Journeyer)

Actually, Phil Zimmermann didn't create the encryption algorithms he used. He only created a popular implementation of them.

The credit actually belongs to people like Merkle, Diffie, Hellman, Rivest, Shamir, and Adleman (all of whom worked on public-key cryptography).

My apologies if I got any of those names wrong, and I know I missed quite a few...

RE: Nit-picking..., posted 27 Jun 2002 at 19:36 UTC by mglazer » (Journeyer)

Yet, Phil is the creator of PGP, the most popular and powerful encryption tool available today.

JS md5, posted 27 Jun 2002 at 20:51 UTC by ask » (Master)

tk, obviously you include a random variable with the request for the password. So it's more like "User X's password P plus V gives Y". "V" will never be the same twice, so you can't just replay the transaction.
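
In other words, something like the following sketch, again leaning on hex_md5() from the library linked in the article, with made-up field names; the server issues a fresh V with each login form and computes md5(P + V) on its side for comparison:

  var form = document.forms[0];
  var nonce = form.nonce.value;                           // V, issued fresh by the server
  form.response.value = hex_md5(form.pw.value + nonce);   // send Y = md5(P + V)
  form.pw.value = "";                                     // the password itself never goes out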

Why we should use client-only programs, posted 28 Jun 2002 at 08:33 UTC by goingware » (Master)

An important point that I hope to get across to everyone in the rush towards server-based software, is that moving applications to the server may ultimately be a disaster for the end-user.

I think server-based applications that are of any critical nature are good things when used purely internally to an organization. Also they are helpful as a way for a company to present its products and services to the public, as with traditional eCommerce.

However, for an individual or a company to use server based software for their daily work is a real problem if they don't own the server the software is running on.

Suppose a company outsources their accounting software to an ASP that provides an accounting package that runs, and stores data, on a server.

Now suppose that ASP is caught in some big financial scandal and goes belly-up all of a sudden. What happens to their user company's finances?

It would help if server hosts made the application data available for their customers to download and back up, but I suspect not all service providers will, or that all of their customers will take advantage of it.

I recently read an article about lots of people losing precious digital photographs when a graphic hosting company went out of business. They didn't warn any of their users because they didn't want to incur bandwidth charges they didn't have the cash to pay for if their users attempted to retrieve their property.

One woman lost the only copies of some photos of her brother who had died recently - understand that if you're using a digital camera, your photos may move straight from the camera to the hosting service without even being copied to your hard drive.

Alternatively, even if users have the data stored on their own disks, they may lose the ability to open it if it is stored in an undocumented format and the server operator goes out of business.

If you purchase even a proprietary application on CDROM, and the publisher ceases to support it or even disappears, at least you still have the original installation media! You can continue to use it as long as you still have a system that it is compatible with, even if you have to reformat or replace your hard drive.

This is one of my gripes with network product activation as well as network software installers like Apple's QuickTime installer. If a publisher goes out of business and you need to reinstall, you won't be able to activate the product key.

If you downloaded the QuickTime network installer, which is just a small program with a UI that allows you to select the components you want, and then a downloader that retrieves them, you won't be able to reinstall if Apple discontinues the version you want to use.

I was quite dismayed when I took a zip disk to work where I could use a high-bandwidth connection to install QuickTime on the Windows machine there, but although I was able to copy the files the network installer had downloaded to the zip disk, the installer wouldn't use them when I got home. Instead the installer forced me to download the whole thing again over a 28.8 modem connection!

It turns out that Apple has a full installer available on their site, that I think they advertise is meant for use as an Intranet server installer, but they sure don't make it obvious how to get it.

And finally, server-based applications are a disaster for the Free Software community. An application vendor can build the server side of an application by patching together and modifying all kinds of GPL'ed software, modifying it as they please, and as long as they only execute it on the server, and do not distribute the binary outside the company, they are not in any way required to distribute the source they used or their own source that is linked to it, let alone place their mods under the GPL.

The result is that a widespread public move to server-based applications would amount to a Free Software motel - GPL source checks into an ASP, but it never checks out!

I and others have brought this issue up with Richard Stallman. He replied to me that he would like to address the problem in a future revision to the GPL, but he said this could not happen soon. I imagine it would be difficult to draft a license that addresses this in a bulletproof way, especially considering that the GPL has yet to be tested in court in any way. I think also even though the FSF gets free legal advice, their pro-bono attorney's ability to donate time is limited.

And let's not forget the issue of security at the service provider's end. Some teenager in England or somewhere cracked a web server and stole Bill Gates' credit card number along with hundreds of thousands of others. I understand this enterprising kid ordered a case of Viagra delivered to Bill's home. There have been many, many cases of h4x0rs breaking into websites and stealing untold quantities of credit card numbers and other valuable personal information.

Now imagine that most of the companies in the industrialized world stored their business plans, financial records, human resource records, critical safety data (of interest to terrorists), and other important information on server sites that are administered the way most server sites are administered today - not particularly well.

If outsourced server-based computing becomes widespread, I predict an economic catastrophe that will make the dot-com bust, Enron, WorldCom and the economic aftermath of 9/11 seem trivial by comparison.

Thank you for your attention.

Server-based computing , posted 28 Jun 2002 at 11:39 UTC by Gregory » (Apprentice)

Goingware

``If outsourced server-based computing becomes widespread, I predict an economic catastrophe that will make the dot-com bust, Enron, WorldCom and the economic aftermath of 9/11 seem trivial by comparison."

I can understand some of your concerns. But with respect, aren't you being just a little bit paranoid? I am a LAMP developer, and so what if I develop server-based services, as you put it, ``by patching together and modifying all kinds of GPL'ed software"? Just how do you suggest working around this without placing restrictions on server-side developers like my good self, who, may I add, have a much better track record on sharing source code than client-side developers?

Portability is the key here; client-side apps need only really be generic GUI shells.

With the current move to broadband and infrastructure improvements, the server side in the long run is going to win out. Why do you think Big Blue and others are sponsoring this kind of development? That said, I agree with you about services going down due to bankruptcy and the like. Simply pulling the plug and walking away is just not on.

Greg

``What you cannot enforce, do not command." - Socrates

