Nicolas Frankel is an IT consultant with 10 years' experience in Java / JEE environments. He likes his job so much that he writes technical articles on his blog and reviews technical books in his spare time. He also tries to find other geeks like him in universities, as a part-time lecturer. Nicolas is a DZone MVB.

Critical Analysis of Frameworks Comparison

12.06.2010

Let me first say I was not at Devoxx 2010. Yet, I heard about Matt Raible's Comparing JVM Web Frameworks. Like many, when I read the final results, I was very surprised that my favorite framework (Vaadin for me) was not ranked first. I passed through all the stages of grief, then finally came to realize the presentation itself was much more interesting than the matrix. The problem lies not in the matrix, but in the method used to create it. Do not misunderstand me: Matt is very courageous to step into the light and launch the debate. However, IMHO, there are several points I would like to raise. Note that even though Matt's work was the spark for this article, the same can be said of every matrix whose aim is to rank frameworks.

Numbers

The matrix uses plain and cold numbers to calculate ranks. As such, there's a scientific feeling to the results. But it's only that, a feeling, because each and every grade can be challenged. How do you assign them? Let's take the multi-language criterion: it seems the maximum grade (1) is given when the framework supports Java, Grails and Scala. From what I understand, Struts is available in JAR format and can be called from any JVM language. So why the 0.5 for Struts?

On the other hand, I personally would assign brand new frameworks (like Play and Lift) a 0 for the degree-of-risk criterion. I would also give a flat 0 to JSF 2. It all depends on your vision.

Perimeter

This one is short: what's the difference between the 'Developer availability' and the 'Job trends' criteria? The former is the snapshot and the latter the trend? Then why don't the 'Plugins/addons' or 'Documentation' criteria get the same treatment?

Criterion weight

Why in God's name are all criteria assigned the same weight? What if I don't care whether my application can easily be localized? What if I don't have to support mobile? What if scalability is not an issue? Giving each criterion its own weight would yield entirely different results. Now the problem lies in assigning the right weights, and we're back to point 1.
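To make this concrete, here is a minimal sketch in Java, with made-up frameworks, criteria, grades and weights (none of which come from Matt's actual matrix), showing how the very same grades produce a different winner once the weights reflect a project's priorities:

import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical example: same per-criterion grades, different winners
// depending on how the criteria are weighted.
public class WeightedMatrix {

    // Grades between 0 and 1 for two made-up frameworks on three criteria.
    static final Map<String, double[]> GRADES = new LinkedHashMap<>();
    static {
        //                           scalability, i18n, mobile
        GRADES.put("Framework A", new double[] {1.0, 0.5, 0.0});
        GRADES.put("Framework B", new double[] {0.5, 1.0, 1.0});
    }

    static double score(double[] grades, double[] weights) {
        double total = 0;
        for (int i = 0; i < grades.length; i++) {
            total += grades[i] * weights[i];
        }
        return total;
    }

    public static void main(String[] args) {
        double[] equalWeights = {1, 1, 1};    // every criterion counts the same
        double[] intranetWeights = {2, 1, 0}; // internal app: mobile is irrelevant

        for (Map.Entry<String, double[]> e : GRADES.entrySet()) {
            System.out.printf("%s: equal=%.1f, intranet=%.1f%n",
                    e.getKey(),
                    score(e.getValue(), equalWeights),
                    score(e.getValue(), intranetWeights));
        }
        // Equal weights favour Framework B (2.5 vs 1.5); the intranet
        // weighting flips the ranking (Framework A 2.5 vs B 2.0).
    }
}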

Context, context and more context

My previous article advised thinking in terms of context. This stays true: if you're located in Finland, I bet 'Job trends' or 'Developer availability' shouldn't be a concern for managers wanting to start a Vaadin project. In contrast, in Switzerland, I don't know many people mastering (or even knowing anything about) JSF 2 or Spring MVC.

Requirements

I don't think anyone should choose a single framework and be done with it. Think about it: if you choose Flex, you won't be able to run on the iPhone, whereas if you choose a traditional approach, you won't be able to run your application offline. Different requirements mean you should have a typology of possible use-cases and have a framework ready for each one.

My own experience

I was confronted with the same task when we had to choose a JavaScript framework. We wrote a document on the candidate frameworks, listing the perceived pros and cons of each, but deliberately did not rank them. IMHO, this is more than enough to let you choose the right application framework, depending on your requirements and your context.

Conclusion

On the persistence layer, Hibernate and EclipseLink are the leaders. For Dependency Injection, Spring is the de facto standard. For the presentation layer, the problem is not a lack of choice but the plethora of alternatives you've got, because many of them have strong pros and cons. While waiting for the perfect framework that will solve each and every one of our problems, we should settle for the one best matched to the requirements and context at hand.

 

From http://blog.frankel.ch/critical-analysis-of-frameworks-comparison

Published at DZone with permission of Nicolas Frankel, author and DZone MVB.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Comments

J Szy replied on Mon, 2010/12/06 - 3:48am

Why in God’s name are all criteria assigned the same weight?

There is a tab with weighted results, but the weights are, to say the least, insane. E.g. support for mobile (which may well be unnecessary) has a weight of 10, but i18n/l10n has 0 (yes, zero). Support for languages hardly anyone uses is weighted 10, but the degree of risk is deemed irrelevant, and so are the job trends.

Raveman Ravemanus replied on Fri, 2010/12/10 - 3:39am

Good points, I only disagree about Spring MVC and JSF 2.

 

Spring MVC is used in a lot of places. If you think about it, it's one of the best frameworks because of its name: you have Spring (it used to be very cool) and the MVC design pattern in one (it makes explaining MVC at job interviews easier; I hope more frameworks will use design patterns, do you remember the Bridge design pattern? I don't).

 

JSF is the standard and that's the reason why many people choose it (that is a good criterion; I would be happy to never learn another web framework, but remember that when you look for a job you should know all web frameworks). I see JSF 2 being like JPA 2: the same thing, just a little better, and it will gain market share.
