I'm the father of the ItsNat AJAX Java web framework and of the JNIEasy, LAMEOnJ, JEPLayer and JEPLDroid libraries (and very old stuff like XPDOM), a supporter of the Single Page Interface (SPI) paradigm, writer of the SPI manifesto, and currently a native Android guy. Jose Maria has posted 28 posts at DZone.

Don't Throw Away Your Old Java Web Framework: the Short Single Page History of Twitter


Twitter.com is one of the most popular websites in the world, yet few people know that it is also one of the few Single Page Interface, stateless, SEO-compatible websites in the world.

Twitter is SPI in the sense that it avoids full page loads; each click triggers a partial change of the page without loading a new one, with the necessary data obtained through AJAX.

Twitter is SEO compatible because there are public pages designed to be accessed by search engine robots such as Google's bots, while that same page can be viewed by logged-in users. For example, http://twitter.com/jmarranz is basically the same page whether you are logged in or not. The key is JavaScript: when JavaScript is ignored (the case of crawlers) the links on the page are conventional links to other pages; when JavaScript is executed the page is SPI (and if you're logged in it is fully functional).

Crawlers such as Google's bots do not interpret JavaScript, and therefore they see the web as "paged": no AJAX runs in the page a robot loads, and the robot will not process AJAX-loaded states. This is not a problem for Twitter, which provides alternative conventional pages.

It is "stateless" in the sense that Twitter's servers keep no information about the state of the page a user has loaded, that is, no web session data. This allows requests to reach any node of a server cluster without shared sessions or server affinity. Looking at the AJAX requests, Twitter sends an id representing the temporary state of the user's page, saying something like "the previous items are already loaded, I want new ones".
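The shape of such a stateless poll can be sketched as a pure function building the request URL. The endpoint and the `since_id` parameter name below are illustrative assumptions for this article, not a description of Twitter's actual internals:

```javascript
// Hypothetical sketch of a stateless incremental fetch: the client keeps
// the id of the newest tweet it has already rendered and sends it with
// every poll, so any server node can answer without session state.
function buildTimelineUrl(baseUrl, newestLoadedId) {
  // "since_id" tells the server: "only return items newer than this one".
  var query = newestLoadedId
    ? '?since_id=' + encodeURIComponent(newestLoadedId)
    : '';
  return baseUrl + query;
}

// First poll carries no state; later polls carry the client's cursor.
buildTimelineUrl('/timeline.json', null);    // → "/timeline.json"
buildTimelineUrl('/timeline.json', '98765'); // → "/timeline.json?since_id=98765"
```

Because all the "session" information travels in the request itself, the server can answer from any cluster node.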

As we will see later, this SPI approach is server-centric or hybrid (even though it involves a lot of client programming), but Twitter did not reach the current implementation on the first attempt; there was previously a client-centric SPI implementation.

First version: client-centric

The first version of the Twitter Single Page Interface used a currently trendy and hot approach: client pages rendered with JavaScript from data retrieved from the server through REST APIs. Lately, many see this approach as "the path to follow".

We all know Twitter's REST API, which returns user activity data in JSON format. This API was very popular in alternative Twitter clients until the company introduced limitations that harmed the popularity of these readers. By then the Twitter website itself was a consumer of its own REST API, so the browser was a real Twitter client for logged-in users...

Twitter pages were mainly empty of data on initial load and the browser rendered the page via JavaScript requesting JSON data in successive AJAX requests. 

The label "client-centric" means that the HTML is rendered from data in the browser. Where and when HTML is rendered from server data is a major architectural decision in a web application; in this case, it is the client.
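As a rough sketch of what client-side rendering from JSON looks like, consider the following. The tweet fields and markup are invented for illustration; a real implementation would escape user data and likely use a template engine:

```javascript
// Client-centric rendering: the page arrives nearly empty and JavaScript
// turns JSON obtained from a REST API into markup.
function renderTweet(tweet) {
  return '<li><b>@' + tweet.user + '</b> ' + tweet.text + '</li>';
}

function renderTimeline(tweets) {
  return '<ul>' + tweets.map(renderTweet).join('') + '</ul>';
}

// In a browser the result would be injected into the page, e.g.:
//   container.innerHTML = renderTimeline(jsonFromAjax);
renderTimeline([{ user: 'jmarranz', text: 'hello' }]);
// → "<ul><li><b>@jmarranz</b> hello</li></ul>"
```

The architectural point is that the template logic lives entirely in the browser, while the server only ships data.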

In summary, Twitter was an SPI website when users were logged in. For bots ignoring JavaScript and for public pages, Twitter offered alternative SEO-compatible web pages. At the time, hashbangs (#!) were intensively used. Hashbangs allow "SPI-compatible" links while still allowing bookmarks.

Hashbangs are also SEO compatible because Google has supported them for some years.


When Google sees: http://twitter.com/#!jmarranz

Google will load: http://twitter.com/?_escaped_fragment_=jmarranz

To offer an SEO version of public pages, Twitter generated alternative pages for bots, while the SPI behaviour rendered markup in the client when JavaScript was enabled.
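The URL mapping above can be sketched in a few lines of JavaScript. This is a minimal illustration of the `_escaped_fragment_` rewriting described in Google's AJAX-crawling scheme, not Twitter's actual code:

```javascript
// Rewrite a "#!" (hashbang) URL into the "_escaped_fragment_" URL that
// the crawler fetches as a normal, server-rendered page.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang, nothing to rewrite
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  // Append with '?' or '&' depending on whether a query string exists.
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

toEscapedFragmentUrl('http://twitter.com/#!jmarranz');
// → "http://twitter.com/?_escaped_fragment_=jmarranz"
```

The server detects the `_escaped_fragment_` parameter and returns the fully rendered HTML snapshot for that state.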

Second (and current) version: server-centric (or hybrid)

In early 2012 a change occurred in Twitter's web engineering which could be described as a conservative revolution: an apparent return to pages, to the first pre-SPI Twitter website. A retro-revolution led by Dan Webb, a principal engineer at Twitter.

At the same time one of the key developers of the client-centric SPI Twitter left the company for a startup. 

Everything seemed to point to a return to a classic paging system, lightly spiced with some JavaScript and AJAX, but far from the radical and "ahead of its time" client-centric SPI model of the version at the time, which looked as if it was going to be thrown away almost completely.

Dan Webb seemed an avowed enemy of the Single Page Interface, judging by an article on his blog against hashbangs, a cornerstone idea for providing SPI, bookmarking and SEO compatibility in any browser: http://danwebb.net/2011/5/28/it-is-about-the-hashbangs

But at the end of the Twitter blog entry there seems to be a light of hope for SPI:

"What’s next? We’re currently rolling out this new architecture across the site. Once our pages are running on this new foundation, we will do more to further improve performance. For example, we will implement the History API to allow partial page reloads in browsers that support it, and begin to overhaul the server side of the application."

The key words are "History API". I myself was alarmed to read a frontal attack on hashbangs by the principal engineer of the Twitter website, one of the major drivers of SPI, and tried to "dissuade" Dan:



I was crazy enough to make this proposal: "In JSON and AJAX requests avoid your own REST API; in the server, render your page chunks and inject the markup into the page with innerHTML as much as possible."

Dan Webb's response suggests that some partial updates would indeed be made via AJAX, but without using hashbangs: the History API would be used instead (not available, for instance, in IE 6-8).

The last tweet says:

"we made our perf decisions based on data. It's not about liking or not liking a technique. It's about what we prove is fastest."

I thought he was talking about full page loading vs. pages rendered in JavaScript. To my surprise, the focus of the "new Twitter", started a while before our conversation, was basically the same as "my proposal" (although I have always defended hashbangs):


The current server-centric (or hybrid) SPI approach of Twitter.com

At the time the previous talks took place, the "new" Twitter.com was just being born, gradually changing from the pure client-centric approach based on the REST API to the new, more server-centric approach, rendering again on the server. Today Twitter.com is basically an SPI website for a logged-in user with a modern, JavaScript-enabled browser.

The main motivation for the new hybrid architecture was performance:

"That architecture broke new ground by offering a number of advantages over a more traditional approach, but it lacked support for various optimizations available only on the server"

The main new features of this approach are:
  • Any publicly loaded page is initially the same for all users, logged in or not (and for bots). This ends the dual-model website for SEO support.
  • It follows a Single Page Interface approach but avoids hashbangs; the History API is used instead. The History API is not available in older AJAX-capable browsers such as IE 6-8; in these minority browsers it is accepted that navigation falls back to plain paging.
  • Partial page changes are rendered on the server, which can dramatically reduce and simplify the JavaScript code needed for page management. It may also improve rendering speed and decrease the number of requests, because a single AJAX request can return page chunks rendered with different sets of data, where a REST API would need several requests.
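The features above can be sketched as a small navigation helper: use the History API when available, and fall back to a full page load otherwise. All names here (`page-container`, `fetchChunk`) are hypothetical, not Twitter's actual code; the browser objects are passed in explicitly so the logic is easy to test:

```javascript
// Hybrid SPI navigation sketch: the server renders an HTML chunk, the
// client injects it with innerHTML and records the navigation with the
// History API; browsers without pushState (e.g. IE 6-8) get a full load.
function navigate(win, path, fetchChunk) {
  if (win.history && win.history.pushState) {
    // SPI path: the URL changes without a page reload.
    win.history.pushState({ path: path }, '', path);
    fetchChunk(path, function (html) {
      // The chunk was rendered on the server; just inject it.
      win.document.getElementById('page-container').innerHTML = html;
    });
    return 'partial';
  }
  // Fallback: classic paged navigation.
  win.location.href = path;
  return 'full';
}
```

Note that, unlike the hashbang approach, the same real URL works for SPI users, legacy browsers and bots alike.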
What is the point of this article for a Java (or back end in general) developer?
Quite a lot, considering the trend towards 100% client-centric applications accessing the server via REST APIs that return JSON data. So don't throw away your old web framework, especially your template processor; maybe you're going to need it again :)
Published at DZone with permission of its author, Jose Maria Arranz.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Cyberiax Mega replied on Fri, 2013/07/19 - 9:12am

Very much agree with this!

Many people jump too fast to conclusions, thinking any new method is immediately the one and only method and the clear future.

Another example: a few years ago dynamic typing became more popular and static typing less so. After a year "everyone" was convinced: dynamic typing was the new silver bullet. All statically typed languages would disappear and dynamically typed ones would dominate. But then dynamic's upward trend stabilised and even declined! Now, many years later, static is at 70% and dynamic at 30%. Where are all those people who were totally sure in 2003 that static == ancient, and that starting any project in a static language was stupid since such languages would be history very soon?

It's the same with the all-client-side approach! There is a trend and everyone thinks it's the absolute future. But the trend is already slowing down and the problems are becoming obvious. From server logs it looks like responses are much faster, but there is significant overhead on the client! Naive people "forget" to measure this.

I'll stick with JSF (Java) and RoR (Ruby) for sure!

Jose Maria Arranz replied on Fri, 2013/07/19 - 10:50am in response to: Cyberiax Mega

Thanks for your comment

There's a concept from electronics that is also valid for many other areas of science: the signal-to-noise ratio,

and it can also be applied to the Internet as an amplifier of new ideas and, sometimes, an amplifier of... noise.

The Internet is excellent at amplifying new ideas; this is the "noise". If this noise doesn't become a strong signal it will vanish, or in the best case settle at the real value it provides.

The problem is that most people think the noise will ALWAYS become a strong signal, and "Internet history" proves that is not always true: it is the hype cycle. In the real world the "silver bullet" is not so nice and perfect, but while the hype is strong you will rarely hear about "the dark side", usually out of some kind of shyness, or because "the bearer of bad news" is rarely "the popular guy in college", or because it is not easy to be "the negative guy on the block". Years later, when the hype vanishes and only the good parts and uses survive, you will sometimes hear the crazy opposite: "everything was wrong in X".

Another analogy is the tale of The Emperor's New Clothes: it works fine in the tale, but in the real world the child would be called "stupid" or worse before anyone recognized the child was right... years later, when the Emperor is dethroned :)

When the Internet propagates a lot of noise about something through repetition, repetition, repetition, it is hard not to follow, hard not to say "me too". In that case it is worth ignoring the noise and reading or searching for the *very rare* criticism of the "new stuff", because most of the time the "new stuff" brings something interesting to the table, but it is far, far from a silver bullet, and it usually comes with some serious caveats.

Just one example: I've always said that those Spring XML files were *code* instead of "configuration", and that you should seriously ask whether 1000+ lines of XML is good engineering or just a pile of crap able to kill the Gang of Four guys. In the Spring hype days, when everybody was repeating the mantra "put all dependencies in XML configuration files" and generating this kind of monster, saying something like that was an invitation to be thrown on the bonfire. Now it is not so rare (here too).

Or, thinking of the future: be ready to read horror stories of data corruption in NoSQL databases with no ACID and no transactions, applied to problems that need ACID and transactions, or something similar to compensate.

Freedom of thought is very hard... and it is even harder when you're really wrong; this "fear of freedom, fear of being wrong" is a very strong force driving us to be "another clone, another parrot".

Peter Booth replied on Wed, 2013/07/24 - 1:12pm

Thanks for writing a thought-provoking and optimistic article. Having struggled with the performance and capacity implications of designs that depend on session affinity for over a decade, I can see the value in architectures that avoid it. You mention the XML configuration files. I can still recall sitting in a huge conference room at JavaOne and hearing the first presentation describing the EJB spec. "Either I'm too stupid to see the value in this, or this is the dog's breakfast it appears to be", I thought. My distaste for XML became revulsion. When Spring appeared, as the "dependency injecting solution to J2EE", I was baffled: "WTF, more of the same?" and fled to the RoR camp. I didn't like what was happening to Java.

As a consumer I love the experience of web apps that have an SPI model. Kayak and Gmail work the way that I want them to work. Page-based user interfaces feel clunky and unfriendly. As a programmer who specializes in performance work, I've had pain with the only SPI app I have worked on, because of its reuse of the same JSP to render whole pages, receive forms, and send/receive in-page AJAX requests. This might have simplified coding but reduced the transparency of web logs and also made (server-side reverse proxy) caching difficult to impossible to implement safely. I'm pretty confident that was an implementation detail, not a feature of this model.

Anyway, thanks again for the post, and I look forward to working on a great example of an SPI architecture that has no need for session affinity.


Jose Maria Arranz replied on Thu, 2013/07/25 - 1:24pm in response to: Peter Booth

Thanks for your comment.

I'm not "against XML"; XML is nice for "static tree descriptions": for instance, think of Android UI "static definitions" based on XML. If your problem is not "static", you can go the procedural route.

I'm against "XML imperative programming": any sane person can see Spring XML is just Java imperative programming in XML form. It's fine if your XML is just a small handful of lines, but I've heard horror stories of thousands of lines of XML code just because some "brilliant mind" said something like "to get the Java code free of dependencies, put all your dependencies in XML 'configuration'".

This approach has been the "standard" in "enterprise Java development", which is why I seriously question the sanity of mainstream trends.

Maybe the pure client centric trend based on the clumsy JavaScript world is becoming one of these insane approaches.

Regarding "a great example" of SPI: it is not a great example, but it works. Take a look at this example and how-to.
