Single Page Application (SPA) and applets - what's the difference? The fat-narrow evolution.

I had the chance to read the Manning SPA book. It is a great read and I encourage you to read it.

It describes how to use the browser as a deployment platform and how to develop your application entirely on the client (the browser) using javascript. The central idea is that a SPA is really a desktop application, in that most of the logic, except data access, runs on the client.

Based on the description, there is a launcher html file that specifies most of the javascript and CSS resources your application needs. And there is a specific, or at least recommended, way to organize your javascript code for scalability. It recommends that you manage your "single page application" from the javascript. This is all good stuff, because I see most of the execution in a variety of frameworks really moving to the desktop, though some frameworks are still more server centric. Essentially, in a SPA you do not do the puts/gets etc. back to the server; the javascript merely uses socket-like communication to obtain its data from the server. In the SPA world, the server can be written in node.js, which lets you use the same language and programming approaches across the stack (a very worthy goal).
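A minimal sketch of that division of labor (the names and data here are hypothetical, not from the book): the server is only a data channel, and the client owns all the rendering logic. In a real SPA the fetch would go over a WebSocket or XHR to the node.js backend; here it is stubbed so the sketch is self-contained.

```javascript
// Hypothetical sketch: the server only supplies data; all view logic
// runs on the client. In a browser, fetchUser would open a WebSocket
// (or XHR) to the node.js server; here it is stubbed with a Promise.
function fetchUser(id) {
  // Pretend this JSON string arrived over a socket from the server.
  return Promise.resolve(JSON.stringify({ id: id, name: "Ada" }));
}

// Client-side "rendering": build the view fragment from the data.
// A real SPA would insert this into the DOM instead of returning it.
function renderUser(json) {
  const user = JSON.parse(json);
  return '<div class="user">' + user.name + ' (#' + user.id + ')</div>';
}

fetchUser(42).then(function (json) {
  console.log(renderUser(json)); // the DOM update would happen here
});
```

The point of the sketch is that no page reload or server-side templating is involved: the server hands back data, and the client decides what the page looks like.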

Based on the description, I thought that SPAs are the next applets. There is a difference, of course: a java applet requires the client to have already installed a JVM. But wait! A SPA requires the client to have a browser (essentially) to run the javascript code, and like a JVM it really needs to be a recent version to support the evolving nature of javascript. Alright, I'll admit that most computers have a more up-to-date browser than they do a java JVM, but the ideas, at an abstract level, are the same. Here are the salient commonalities between SPAs and applets at a simple level of abstraction:
  • There is a required component on the client.
  • Most of the application code is designed to be executed on the client.
  • The client accesses data resources remotely.
  • Your DOM is the same as the visual tree/scenegraph you create for your application.
  • You manipulate the DOM/scenegraph programmatically as needed.
And what is most interesting is that you still need best practices, a whole lot of developer discipline and a good design. That has not changed.

I still find javascript code a lot of work to manage as a delivered product (resources, managing the code's lifecycle). dart/coffee/typescript work to make that easier for the average developer. It's almost funny. dart/coffee/typescript promise one easy-to-use language that runs on all platforms because they compile to javascript. But wasn't that what java promised?

So we see that architecturally, we are getting back to some common architectures that have been proposed, and implemented, over the past 20 years or so. But "this time it's better!" is what I hear. What we see is a different set of choices that emerge to fulfill each architecture component as the specific hardware/delivery platforms evolve.
  1. For a long time, there was the idea that a fat client would continue to get fatter and more powerful.
  2. Applications were developed that typically had a fairly large execution footprint and a substantial number of environment dependencies in order to improve "integration".
  3. Then mobile platforms came along, they were very disruptive but their ability to run fat programs was limited.
  4. So a new set of technologies evolved (web programming was first created for sharing, not for mobile).
  5. Those new technologies were fairly limited or designed to solve a very specific problem.
  6. Over time the mobile platform will become more capable and become fatter.
  7. Those new technologies become old and their lack-of-scaling/evolving becomes evident because everything becomes costly and/or difficult again.
  8. The new plaform (wristbands, watches or headbands) will come along and be narrow again. New technologies will emerge to fit that platform.
  9. The cycle repeats itself.

This is the interesting aspect. You move from fat to narrow in the sense of capabilities on a specific platform. Then through advances in hardware platforms, you get to moving back to fat. But in the process, a variety of technologies evolve more aligned to that narrow platform than other platforms.

Those new technologies are typically narrow to start with as well and often scale horribly to something fatter. That's what I often think about javascript. Its was a good language for small web page and you can see that it was easy to write an interpreter for because of the way it was designed. Then when it comes time to scale to a fatter platform, you have to evolve javascript through the ECMA process or write a language on top of it. Blah.

If the narrow technology designs are not written well to scale, then they will create more wreckage over time and actually slow down innovation. This is the exact opposite that a technology written to support the narrow phase was designed to do--a narrow technology is designed to by flexible, nimble and speed time to market and improve innovation. They have a time in the sun then they become clouded a bit.

I guess we are stuck with fat-narrow cycle. It's not wrong, just an unfortunate cost of the landscape. Thomas Kuhn where are you?!

Popular posts from this blog

graphql (facebook), falcor (netflix) and odata and ...

React, Redux, Recompose and some simple steps to remove "some" boilerplate and improve reuse

Using wye and tee with scalaz-stream