
The Web: Welcome to the 70's!

In the 70s there was a popular style of on-line computer system promoted by IBM. There was a big mainframe in the sky surrounded by dozens, sometimes hundreds, of 3270 terminals. These terminals were quite smart, for their day, and did much of the formatting and validating.

The model of operation was simple. The terminal sent a structured request to the mainframe. The mainframe responded (eventually) with a structured response. This response included data and formatting information. The terminal would format the data nicely on the screen. Then the terminal would collect and validate the user's input. Once the user had filled in all the fields, and checked all the right boxes, they would hit enter, and the terminal would send all the entered data back in another structured packet.

Sound familiar?
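To make the parallel concrete, here is a minimal sketch of the web's version of that cycle (the URL and field names below are hypothetical, purely for illustration). First, the server sends data wrapped in formatting instructions, which the browser renders and validates:

    <!-- sent by the server: data plus formatting, much like a 3270 data stream -->
    <form action="/orders" method="POST">
      Quantity: <input type="text" name="quantity">
      <input type="checkbox" name="rush" value="yes"> Rush order
      <input type="submit" value="Enter">
    </form>

When the user fills in the fields, checks the boxes, and hits Enter, the browser sends everything back as one structured packet:

    POST /orders HTTP/1.1
    Host: example.com
    Content-Type: application/x-www-form-urlencoded

    quantity=12&rush=yes

The markup plays the role of the terminal's formatting stream, and the form-encoded POST body is the structured packet the "terminal" sends back to the mainframe in the sky.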

For a while, in the late '80s and early '90s, we got away from this model. We dabbled with distributed computing and true client/server systems. We wrote systems that had a true separation of concerns. Servers were focused and cared nothing for formatting. They dealt in the realm of pure data and pure business rules. Clients were in the business of interpreting that data for the users. Servers didn't care what kind of clients they talked to. Clients were the liaisons, knowing both parties' needs and tuning the conversation to meet them.

When the web first started to peek up above the horizon in the early '90s, it wasn't clear what was going to happen. The notion of applets was compelling. Perhaps the server could have a repository of tasks that could be downloaded to the clients to run there on behalf of the user. Though there have been some valiant attempts (and even some successes), this model has not really caught on.

Instead, like MacArthur:

we have returned

...to the '70s.


Append your comments below:

The Web model does look very similar to the mainframe model, but with one site serving hundreds of thousands of clients instead of dozens, an order-of-magnitude lower cost for the connections and the server hardware itself, and the technology so widely available that a regular student can host a web site.
I very much agree with Ryan that using XMLHttpRequest on the client is progress (in fact, this is exactly what we do for all our new intranet apps), but it doesn't mean we are back in the '80s doing client-server.

Yes, progress continually returns to old ideas, but on a different level. So instead of going in circles we are slowly ascending a spiral.
 Sat, 10 Dec 2005 23:36:45, RScott, Not quite the '60s and '70s
We have to remember that, unlike in past decades, this technology is a currency now used freely amongst the masses. And that is quite significant!

In the past, only the computational "high priests" had access to hypertext concepts or DARPA's TCP/IP schemes. Consider how hard it is to make the complex a simple thing. It's true what is said above. But thanks to Tim Berners-Lee for the kick in the pants at that point in history... and to the next steps, when simplifying a subset of SGML into HTML (now expanded to XML for the business community by proponents at MS)... and HTTP...

Now let's look towards dumping HTTP for a more security-oriented deal for transactions! (I don't want a little piece of software on my machine to perform this communications function for every company with which I wish to do business.)
 Sun, 11 Dec 2005 07:21:57, Jim Menard, Eternal pendulum
For as long as I've been in the computer business (around 25 years), the pendulum has kept swinging between central servers/thin clients and client-server/fat clients. (Hmm, maybe peer-to-peer was an extreme swing on the "fat client" side.)

There is always money to be made in changing from whatever we have (whether it works or not) to a new, shiny way of doing things. Thus the pendulum will keep swinging.