I'm the father of the ItsNat AJAX Java web framework and of the JNIEasy, LAMEOnJ, JEPLayer and JEPLDroid libraries (and very old stuff like XPDOM), a supporter of the Single Page Interface (SPI) paradigm, writer of the SPI Manifesto and currently an Android native guy.

Can Your 300 Page Web Site Be Represented on One Page?

05.29.2010

As you probably know, ItsNat is strongly oriented toward the development of web sites (not only web applications) based on a single web page with no reloading. This is done by following the Single Page Interface (SPI) paradigm, with no loss of the typical characteristics of a standard page-based web site (SEO, accessibility with JavaScript disabled, bookmarking, back/forward buttons, visit counters and so on).

I am a firm believer that SPI is the future of web sites, maybe Web 3.0. In fact some big players like Twitter or Facebook are already mainly SPI web sites. Of course we do not need to be too purist; there may be two or three pages per web site :)

The problem with applying SPI to web sites is avoiding web site duplication: an SPI version plus a page-based version just for SEO. Google is aware of this serious problem and is trying to provide solutions. The hardest part is how to crawl AJAX content, and the approaches proposed by Google are, in my opinion, too "hand made", costly or problematic.

The SPI Manifesto

Trying to push web development forward toward SPI, The Single Page Interface Manifesto was published to show how SPI can become real in web sites. To lead by example, my company's web site, innowhere.com, was converted to SPI with the same aesthetic and, of course, with SEO, bookmarking and visit counters.

The SPI Tutorial

The next step was an SPI tutorial showing how to build a simple SPI web site with ItsNat, pursuing the objectives of the Manifesto. This tutorial showed how a web site can be SPI and page based at the same time. In SPI the "state" concept replaces the "page", and states can be designed with plain HTML markup, very much like the traditional page-based approach. The main difference is that you only design the fragments to be inserted, avoiding the typical repetition of headers and footers on every page (the "include" hell); and in the case of ItsNat, no server logic is included in the HTML markup of the templates, because the view logic is Java code calling the Java W3C DOM API. Of course, alongside this tutorial there is the expected online demo. In the latest version of this tutorial/demo the Back/Forward buttons are simulated by updating the page without reloading, that is, even more SPI (the first version reloaded the page to the expected state).
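To make that concrete, here is a minimal, hypothetical sketch of what "view logic is Java calling the W3C DOM API" means in practice. It uses only the standard JAXP/W3C DOM classes; the class name, the markup strings and the missing ItsNat wiring (template loading, event handling, browser synchronization) are assumptions, not the real tutorial code.

// Minimal sketch: swapping the "state" fragment of a single page using only
// the standard Java W3C DOM API. The ItsNat-specific machinery is left out.
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;

public class SpiStateSketch {

    // Parses a small XHTML snippet; in a real site this would be a template file.
    static Document parse(String xhtml) throws Exception {
        return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xhtml.getBytes(StandardCharsets.UTF_8)));
    }

    public static void main(String[] args) throws Exception {
        // The single page: one placeholder whose content changes per "state".
        Document page = parse("<html><body><div id='content'></div></body></html>");
        Element placeholder = (Element) page.getElementsByTagName("div").item(0);

        // A state template: only the fragment, no repeated header/footer markup.
        Document productsState = parse("<div><h2>Products</h2><ul><li>Item A</li></ul></div>");

        // Change state: clear the placeholder and graft in the imported fragment.
        while (placeholder.hasChildNodes())
            placeholder.removeChild(placeholder.getFirstChild());
        Node fragment = page.importNode(productsState.getDocumentElement(), true);
        placeholder.appendChild(fragment);
    }
}

The point is that a state is just a fragment grafted into the always-loaded page; headers and footers are never repeated.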

The next step seems obvious...

Why not convert a real-world big site to SPI?

Yes, it is done... but not on the original one: I have partially cloned, as SPI, the conventional page-based e-commerce web site of a very big Spanish retailer, following techniques and code similar to those shown in the tutorial. Spanish law says that the web sites of big companies must be fully accessible following the old WAI approach, that is, fully working with JavaScript disabled; and as anyone can understand, "works with JavaScript disabled" and "SEO friendly" are two very similar objectives. The Manifesto and the tutorial showed how an SPI web site can be page based at the same time, and those ideas and that technical background were applied to this new challenge.
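To see why "works with JavaScript disabled" and "SEO friendly" come almost for free together, consider this hedged, hypothetical sketch (a plain servlet with invented parameter names and helper methods, not the cloned site's code or the ItsNat API): the same state template is delivered as a complete page to crawlers and JavaScript-disabled visitors, and as a bare fragment to SPI navigation.

// Hypothetical sketch: one set of state templates serving both audiences.
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SpiOrPagesServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String state = req.getParameter("st");                   // e.g. ?st=products, bookmarkable and crawlable
        boolean ajax = "true".equals(req.getParameter("ajax"));  // set by the SPI client code

        String fragment = loadStateFragment(state);              // hypothetical helper: reads the plain HTML template

        resp.setContentType("text/html;charset=UTF-8");
        if (ajax) {
            // SPI navigation: only the state fragment travels, the page is never reloaded.
            resp.getWriter().print(fragment);
        } else {
            // First load, JavaScript disabled, or a crawler: deliver a complete page.
            resp.getWriter().print(wrapInFullPage(fragment));     // hypothetical helper: adds the layout once
        }
    }

    private String loadStateFragment(String state) { /* read the template for 'state' */ return ""; }
    private String wrapInFullPage(String fragment)  { /* surround with the single page's layout */ return fragment; }
}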

The web site resulting from this has:

  • Navigation with no reload.
  • Bookmarking of states, the same as pages.
  • SEO compatible (try disabling JavaScript to see how the site is "seen" by web crawlers).
  • Back/Forward support (browser's history navigation in general) with NO reload.
  • Fully working with JavaScript disabled.
  • Layout" exactly the same as the original site.
  • Remote view/control of other users using the web site (typical "free" bonus of ItsNat).

Read the information on the overview page, including the terms of use and which part has been cloned as SPI.

Link to the DEMO.

Can your web site be SPI?

Is SPI the future of the web?

 

Published at DZone with permission of its author, Jose Maria Arranz.


Comments

Stephane Vaucher replied on Sun, 2010/05/30 - 2:06pm

I'm confused by this post. You are preaching about how sites should be implemented as single pages. Fine, but you have not provided substantial evidence as to why. Since you now converted part of a large site, will you now be able to show us quantitative evidence as to why SPI is good? You basically reproduced "traditional requirements" in SPI. Now why is it better? Are the sites easier to maintain, more scalable?

Alessandro Santini replied on Sun, 2010/05/30 - 5:04pm in response to: Stephane Vaucher

It is indeed a valid point, Stephane. Jose Maria is in a privileged position to let us know:

  • Improvements in server CPU/bandwidth utilization;
  • Improvements in response time / scalability;
  • Improvements in the overall user experience.

Jose Maria Arranz replied on Mon, 2010/05/31 - 1:44am in response to: Stephane Vaucher

Have you tried to disable JavaScript and navigate through the same web site?

 

Jose Maria Arranz replied on Mon, 2010/05/31 - 1:51am in response to: Alessandro Santini

  • Improvements in server CPU/bandwidth utilization;
  • Improvement in response time / scalability;
  • Improvements in the overall user experience.

  • Get rid of the weird and absurd page-based paradigm in development. On the desktop NO ONE develops applications using a page-based approach (no, wizards are not page based; there is only one page whose content and transitions are fully controlled by the container).

Alessandro Santini replied on Mon, 2010/05/31 - 2:57am in response to: Jose Maria Arranz

Jose, I guess comparing web and desktop applications in this case is like comparing apples and oranges.

Desktop applications:

  • Do not serve thousands of users at the same time;
  • Do not suffer from the same latency problems as web applications;
  • Do not use a document-oriented protocol (that is, HTTP).

Besides being a nice marketing pitch - what value does SPI bring to the table? Sorry for being so direct, but you already know my skepticism toward trendy technologies :)

Jose Maria Arranz replied on Mon, 2010/05/31 - 3:07am in response to: Alessandro Santini

Jose, I guess comparing web and desktop in this case is like comparing apples and oranges.

No longer true: if you remove HTTP and embed a web browser in your desktop application, HTML/JavaScript is just another GUI technology. HTTP just adds remoting.

  • Do not serve thousands of users at the same time;

Correct.

  • Do not suffer from the same latency problems as web applications;

Any client/server application suffers from these problems.

  • Do not use a document-based protocol (that is, HTTP)

 HTTP + session cookie = stateful protocol

HTTP + HTML is document based only during page loading, and in an SPI application that is just the first request. ItsNat AJAX events return JavaScript code, that is, behavior (most of the time transporting view changes in DOM form), and SPI applications are usually event based (not page based).
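As a rough illustration of "events returning behavior" (a generic W3C DOM listener, not ItsNat's exact API; the element and class names are invented): the handler mutates the server-side DOM of the already loaded page, and it is that mutation, not a new document, that needs to reach the browser as JavaScript.

// Rough sketch: an event handler that only touches the affected server-side DOM node.
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.events.Event;
import org.w3c.dom.events.EventListener;

public class SortProductsListener implements EventListener {

    private final Document page;   // the server-side DOM of the single loaded page

    public SortProductsListener(Document page) {
        this.page = page;
    }

    @Override
    public void handleEvent(Event evt) {
        // React to the user changing the sort order: assumes the page has at least one <h2>.
        Element title = (Element) page.getElementsByTagName("h2").item(0);
        title.setTextContent("Products sorted by price");
        // No new page is built here; only this DOM change needs to travel to the browser.
    }
}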

Besides being a nice marketing pitch - what value does SPI bring to the table? Sorry for being so direct, but you already know my skepticism toward trendy technologies :)

Being a bit skeptical is good.

You have answered this question for me :)

  • Improvements in server CPU/bandwidth utilization;
  • Improvements in response time / scalability;
  • Improvements in the overall user experience.
  • Get rid of the weird and absurd page-based paradigm in development (yeah, this one is mine).

 

Think for a while about why almost every modern web technology, server centric and client centric alike, is more and more SPI focused (there are many). SPI is already mainstream in web applications; the problem is that most of those technologies (all?) cannot be applied to web sites (SEO, JavaScript disabled, freedom of layout design...).

 

Alessandro Santini replied on Mon, 2010/05/31 - 6:57am in response to: Jose Maria Arranz

No longer true: if you remove HTTP and embed a web browser in your desktop application, HTML/JavaScript is just another GUI technology. HTTP just adds remoting.

Not entirely. HTTP is still the protocol that provisions the content, be it whole pages or fragments. Remoting (read: Remote Procedure Call) protocols have been created and adapted to HTTP.

Any client/server application suffers from these problems.

This is true, but the significance of delay in web applications is much higher: desktop applications suffer delay only during calls to a remote system (if any), while web applications suffer delay during every interaction phase (page/fragment loading, remote call).

HTTP + session cookie = stateful protocol

Document-oriented protocol != stateful/stateless protocol

HTTP + HTML is document based only during page loading, and in an SPI application that is just the first request. ItsNat AJAX events return JavaScript code, that is, behavior (most of the time transporting view changes in DOM form), and SPI applications are usually event based (not page based).

You will definitely agree with me that the advantage introduced by AJAX in any of its incarnations is inversely proportional to the page delta (that is, the bigger the change, the lesser the gain). And that is without counting the payload/traffic generated by all the HTTP request/response headers that AJAX produces at each interaction. I know this is about page design, but it is still an additional design effort to be taken into account.

What I fail to understand, Jose, is why you can't give us some metrics in terms of bandwidth/CPU or decreased response time for an AJAX vs. classic HTML approach.

Let me take it from another angle: let's imagine I am a manager or any other kind of decision maker and you are the pre-sales engineer. How would you actually convince me that AJAX, SPI (and, I presume, ItsNat in your case) are a much better approach to web development? By expecting that I will stare jaw-dropped at the geekness of the framework, or by providing me with some real *facts*?

Thanks.

Jose Maria Arranz replied on Mon, 2010/05/31 - 7:22am in response to: Alessandro Santini

What I fail to understand, Jose, is why you can't give us some metrics in terms of bandwidth/CPU or decreased response time for an AJAX vs. classic HTML approach.

A very simple performance test:

Load the demo in this state, open Firebug, enable the Console and make sure "Show XMLHttpRequests" is enabled, then change "Listar por:" from "Novedad" to "Nombre" to "Precio" to "Más vendidos". Firebug shows around 650 ms per request (of course this value depends on how much load the server is handling, your connection, your computer, the Firebug overhead, etc.). Now disable JavaScript and do the same; I roughly measure around 2.5-3 seconds per request (again, it depends on many things).

Think of any of your favorite AJAX-intensive web applications (pick any Google web app) and imagine it as page based.

From a development point of view:

 - Templates only include the HTML fragments to be inserted (and removed) when you want and where you want. Compare this approach with the page-based way of development, with tons of typical parameterized includes repeated again and again in every page, usually mixing different states in the same template (switching them with view logic inside the template) to avoid too many template files.

 - Event-based development, avoiding the typical problems of coordination between pages and the hell of back/forward/history navigation and unexpected reloads.

 

Jose Maria Arranz replied on Mon, 2010/05/31 - 7:30am

You will definitely agree with me that the advantage introduced by AJAX in any of its incarnations is inversely proportional to the page delta (that is, the bigger the change, the lesser the gain)

There will ALWAYS be some gain, because ItsNat tries to use innerHTML as much as possible. innerHTML pushes plain HTML markup to the native code of your browser; this native code parses and processes the markup at the same speed as markup loaded with a page, but without the overhead introduced by page loading (because, in DOM terminology, constructing a new Document object is always heavier than fully changing the content of a Document that is already created and attached to a view).

 

Mclaughlan Craig replied on Tue, 2013/12/03 - 2:58am

Search engines have become really intelligent at identifying websites that deliver real content and information rather than just duplication. That is why I strongly urge those who are revamping their website to keep it simple yet content rich, and to run website marketing campaigns that direct traffic to it. It is also important to build your website navigation so that robots can crawl it easily.
