Are we de-evolving, or is this natural evolution?

Talking around the office about a future with web-based IDEs, it was interesting to see that people are starting to get it, or at least not scoff at the idea that it's something we'll ever have to deal with. There are some good aspects to it for the tools business. At the very least, it's a great way to quickly get our products out to customers with minimal install fuss (the bane of my existence at work these days), and it's a great way to get immediate feedback on what they find valuable.

The question that needs to be answered is: why now? I remember back in the 90s we were clamoring to get away from the client/server model. Everyone wanted a PC or workstation on their desk, and former stars of the server world (DEC in particular comes to mind) faded away. Servers found a new life thanks to the web, and it seems now, about 20 years later, we're starting to climb back onto the client/server bandwagon. Why did we get away from that architecture, and what's happening to make us want to go back?

Looking back, I think one of the biggest problems with servers in the 80s and early 90s was their sheer cost. They were expensive machines; you could buy 100 PCs for the cost of one of these things. Worse yet, they didn't provide 100 times the compute power. The price/performance ratio made PCs a smart bet: they were both cheap and powerful. And they provided freedom to the user. If the server went down, they could keep working, and if they wanted to install some “forbidden” software, they could do it. It was really refreshing, come to think of it.
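
To put rough numbers on that price/performance argument, here's a back-of-the-envelope sketch. The dollar and MIPS figures are made up purely for illustration, not actual prices from that era:

    # Hypothetical figures for illustration only.
    server_cost, server_mips = 500_000, 200   # one big server
    pc_cost, pc_mips         = 5_000,   10    # one desktop PC

    # Compute per dollar: higher is better.
    print(server_mips / server_cost)  # 0.0004 MIPS per dollar
    print(pc_mips / pc_cost)          # 0.002 MIPS per dollar, 5x better

Even if the server is 20 times faster than any single PC, it loses badly per dollar, and that's before you count the freedom each desktop buys its user.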

But as any IT professional (or installer guy) will tell you, maintaining all those machines is a nightmare, both for the admin and for the bottom line. As employees of larger companies well know, there are companies making money on software that beavers away in the background, making sure all the other software is kept up to date and on the up-and-up. And, of course, some of the more rogue employees know how to uninstall that software and get it out of the way ;).

What the old server model provided was that ease of maintenance: you installed software on one machine and all your users had instant access to it. Of course, there are risks to that, as all of us tweeters had to deal with today, but with an improved focus on security and robustness for these critical server apps, like we had in the server era, those outages should become rare.

That, and server costs these days are way down; I would think the price/performance curve is turning back towards the server side. Just look around your workplace and count the number of CPUs sitting idle. It would be an interesting study to figure out what percentage of the CPU power a company owns is actually being used. It might make more sense to spend more on servers and less on desktops. You don't need that much power to run a web browser, especially with the ever-improving JavaScript VMs we're finding in them these days.
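
If you wanted to eyeball that utilization number for your own machine, a quick sketch like the following would do it. It assumes the third-party psutil package, and a real study would sample a whole fleet over weeks, not one desktop for a minute:

    # Rough sketch: sample this machine's CPU utilization once a second
    # for a minute and report the average. Requires: pip install psutil
    import psutil

    samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
    avg = sum(samples) / len(samples)
    print(f"average CPU utilization: {avg:.1f}% (so ~{100 - avg:.0f}% idle)")

I'd wager that on most office desktops the idle figure would be embarrassingly high.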

It's not really far-fetched today to see a future, say five years away, where all of our apps are running on servers, in the “cloud” if you like, and we are accessing them through “dumb” terminals running web browsers, which is what Google's Chrome OS (and I'm sure others) will provide. The economics are right. The culture, though, is something else. Are users ready to give up the freedom that traditional desktops provide? I think so, but only if the applications provide significant new value. Tools that integrate with other web apps to allow collaboration over the web could provide that value. Desktop-style apps that simply display themselves in a web browser will not.