Somewhat Chunky Client

Once again, someone thinks all your resources can live away from your local machine on the magical mystical server. Today's offender is Jay Currie, in an article at TCS. Can I just state for the record that all you distributed-network, thin-client, remote-location people need to have a nice hot cup of STFU? It is, for many technical reasons, a bad idea. It will continue to be a bad idea for the foreseeable future.

Why? Well, let's start with the obvious ones first. If I'm working on something remotely and the connection freaks out, I've lost what I was doing. Anyone who uses Blogger has surely been through this at least once. Hit the 'Publish Post' button and watch your work magically vanish! That's a paradigm I need extended to a word processor. Not. And if I have to save my docs to my hard drive first anyway, what the hell good is the storage space on the server?

Next problem: if the server goes down, I've got no apps. I work for a Fortune 500 company in the engineering department. We have, for various technical and licensing reasons, a lot of engineering software that runs from a central server. When that server craters, as it did a few months back, I literally cannot do my job. The barely competent MIS department took two days to bring the server back up. My company pulls in $2 billion a year in revenue and has a vested interest in getting mission-critical software back up. Does somebody providing you a service for free have that kind of motivation? I doubt it, and I don't want that risk on my PC at home.

Let's look at the assertion that most of us don't need graphics. Gee, I'm glad Mr. Currie is such a serious kind of guy. I don't do a whole lot of imaging work or graphic design or anything. However, I do use another kind of software that puts a hell of a load on a graphics card: games. Doom 3 does me no good if I'm trying to run it over a network model. I'm not too sure, because I'm an anti-social bastard with no friends, but it appears that a whole bunch of other people play games, too.

Really, this is just a return to dumb terminals. There are reasons the industry moved away from that model. Unless those reasons have changed, why would you go back? Especially if, as the article claims, hardware is getting cheaper. (Which it must be, seeing as how Wal-Mart will sell me a box for $278.) If the hardware is so bloody cheap, why would I want to put all of my applications and data on a server located in another time zone? Convenience? Most people do not need this model of computing, and I'm guessing they don't really want it.


Blogger Ontario Emperor said...

Personally, I alternate between both models. For example, I keep my personal e-mail on a server so that I can access it from any computer, anywhere in the world. However, I keep my business e-mail on my laptop, partly because of server storage limitations, but also because it suits the way I work.

Even if server-hosted applications do take off, this won't necessarily be the Microsoft killer. Web browsers themselves were supposed to be the Microsoft killer, but Microsoft rose to the challenge. If server-hosted applications become popular, MSN will incorporate a server-hosted version of Word within three years (one year to develop v1.0, and another two years to work out the bugs).

3:00 PM  
Blogger T said...

I didn't discuss the Microsoft killer portion of the article because I find all of those pronouncements absurd, and have for several years. Wow, a new application will knock out Microsoft! Microsoft has $60 billion in cash. That's pretty damn hard to compete against in any meaningful fashion. If the next killer app does come along, Microsoft can just buy it.

The stories about how the IE project started are instructive. Entire departments at Microsoft were told to drop what they were doing and start work on IE. Not drop as in "tidy up your projects and close them down neatly," but drop as in "format your drives and forget what you were doing."

9:06 AM  