Software Dev&QE, Author, Pilot... Software engineer interested in bleeding-edge technologies, always craving new toys to play with, especially in the Java world. Works as a supervisor in the JBoss division of Red Hat, propagating JBoss projects and products in the Czech Republic by teaching courses at local universities and writing blog posts. Occasional freelance consultant and trainer. Likes to capture moments of everyday life with a camera and has a great passion for flying high in the sky.

ESB Performance Pitfalls

08.05.2010

An Enterprise Service Bus (ESB) is at the heart of many Service Oriented Architecture (SOA) solutions, a technology that is being widely adopted nowadays. Among the many ESB offerings, how do you choose the right one, the one that best suits your needs? Let's consider various attributes that might be of interest to you: ease of development of services, ease of deployment, features for message manipulation, transactional processing, message persistence, supported endpoints, memory consumption, license/support cost, available sources, documentation, and performance.

How can I measure performance?

Performance seems to me to be one of the most frequently discussed attributes of available ESBs. Vendors keep creating scenarios that make their implementation look the best, new comparative studies are published, and so on. There is no mystery about that, as companies want to know what hardware is required for their applications. But how do we measure ESB performance for real? It is not that easy to answer this question. Only for a relatively short time have we had the SOA Manifesto, which standardizes the definition of what SOA is. Now we also want to define standardized means to measure its performance.

In the past, there have been multiple attempts at defining SOA benchmarks. In 2007 there was a project to measure the most widely used ESBs. WSO2 performed ESB Performance Testing Round 1, Round 2 and Round 3. However, it is somewhat outdated now and limited to Web Service scenarios only.

Recently, these tests were revitalized, this time promoted by AdroitLogic and adapted to their ESB. But this has not become a widely adopted ESB performance measurement technique. Some of the possible reasons for this are discussed later in this article.

At the end of 2009, Standard Performance Evaluation Corporation (SPEC) entered the scene by founding a new SOA Subcommittee. This Subcommittee took over the initiative that had previously been driven mainly by IBM. The subcommittee's goal is to develop a new industry standard benchmark for measuring performance for typical middleware, database and hardware deployments of applications based on the SOA. From their early findings, they are fully aware of the risk of not having a standardized SOA benchmark. The risks are mainly:

  1. promoting Web Service specific benchmarks as general SOA benchmarks,
  2. creating vendor dependent benchmarks,
  3. failing to create a SOA benchmark at all, which would hinder wide SOA adoption.

One of the basic requirements for this performance benchmark is that multiple vendors must agree on it. The benchmark will consist of three parts. The first part, called Services, will be composed of several Web Services handling some automated tasks. The second part, called Choreography, will contain some business processes (both fully automated and with human tasks). The last part, called Integration, will be assembled from core ESB features including service virtualization and message routing, transformation and modification.

Common pitfalls of current benchmarks

Let's discuss common drawbacks that existing SOA benchmarks suffer from. The list is by no means comprehensive, but it touches on the most visible issues that might prevent wide adoption of the current benchmark scenarios.

Web Services

Web Services are definitely important for a SOA solution. But bear in mind that Web Services are not equal to SOA, even though they are often related. Many of today's SOA solutions do not primarily use Web Services. They use transports like messaging (JMS), FTP, files, and databases. Also, many of today's Web Service applications are not even SOA solutions. They are just simple remote procedure calls. A SOA performance benchmark should not stick only to Web Services.

SOAP

Web Services often use SOAP messages that are XML files of a specific format. The XML manipulation is an Achilles heel of Web Services' performance. First you need to convert your data to XML on the client side. An ESB parses this XML, performs an operation on it, and creates response XML that the client must parse again. Parsing XML documents is an expensive operation.
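
To make that cost concrete, here is a minimal, self-contained Java sketch (the class and envelope are my own illustration, not taken from any particular ESB) that parses a tiny SOAP-style envelope with the standard DOM API. An ESB does this for every message, usually on far larger payloads:

```java
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class SoapParseCost {
    // A minimal SOAP-style envelope; real payloads are usually much larger.
    static final String SOAP =
        "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
      + "<soap:Body><getQuote><symbol>RHT</symbol></getQuote></soap:Body>"
      + "</soap:Envelope>";

    /** Parses the envelope into a DOM tree and pulls one value out of the body. */
    static String parseSymbol(String xml) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        return doc.getElementsByTagName("symbol").item(0).getTextContent();
    }

    public static void main(String[] args) throws Exception {
        long start = System.nanoTime();
        String symbol = parseSymbol(SOAP);
        long micros = (System.nanoTime() - start) / 1_000;
        System.out.println(symbol + " parsed in " + micros + " us");
    }
}
```

Even this toy round trip builds a whole object tree just to read one element; multiply that by message size and message rate and the profiler results mentioned below are no surprise.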

It is important to compare how ESBs deal with SOAP messages to show that the ESB is not the bottleneck. I tried JProfiler on some ESBs, and it showed that creating and parsing XML is the real performance blocker. Now, my question is: how much of the communication in your ESB could use something other than SOAP?

Fortunately, it is possible to use REST (Representational State Transfer) as a communication style with Web Services. However, you then need to define your own message format.

Transactions

In many applications transactional processing is of major importance. Either all operations must succeed or none of them. All operations must be executed exactly once. This is what you would expect when transferring money or closing a car insurance deal online. Obviously, there is overhead involved in meeting those natural expectations.

Accordingly, it is important to see how transactions influence performance by measuring the time it takes for several resources in separated processes to participate in a transaction driven by a shared transaction manager.
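
The all-or-nothing behaviour can be illustrated with a toy in-memory sketch of my own making. This is illustrative only; a real deployment would use a transaction manager (e.g. JTA) coordinating the participating resources, not a hand-rolled rollback:

```java
import java.util.HashMap;
import java.util.Map;

public class AtomicTransfer {
    final Map<String, Integer> balances = new HashMap<>();

    AtomicTransfer() {
        balances.put("alice", 100);
        balances.put("bob", 0);
    }

    /** Either both balance updates happen, or neither does. */
    synchronized boolean transfer(String from, String to, int amount) {
        int oldFrom = balances.get(from);   // snapshots kept for rollback
        int oldTo = balances.get(to);
        try {
            balances.put(from, oldFrom - amount);
            if (oldFrom - amount < 0) {
                throw new IllegalStateException("insufficient funds");
            }
            balances.put(to, oldTo + amount);
            return true;                    // "commit"
        } catch (RuntimeException e) {
            balances.put(from, oldFrom);    // "rollback": restore both sides
            balances.put(to, oldTo);
            return false;
        }
    }

    public static void main(String[] args) {
        AtomicTransfer bank = new AtomicTransfer();
        System.out.println(bank.transfer("alice", "bob", 30));  // true
        System.out.println(bank.transfer("alice", "bob", 500)); // false, rolled back
        System.out.println(bank.balances);
    }
}
```

The benchmark-relevant point is that the snapshot-and-restore bookkeeping here stands in for the logging, locking and coordination a real transaction manager does across processes, which is exactly the overhead worth measuring.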

Security

In addition to transactions, security is a major aspect of a production environment. But how much does it slow things down? This should be measured for various transports because of their different security implementations. Both authentication and authorization should be tested.

Currently, there is a suggested scenario about securing an unsecured service using an ESB. This is useful when you want to make an internal service publicly available. You might not require authentication behind a company firewall, but you need it for the outside world. What about the opposite scenario, where an unsecured gateway is created for a secured service?
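
As a minimal sketch of one thing such a gateway adds per request, here is how HTTP Basic credentials could be attached before forwarding to a secured back end. The class and method names are hypothetical, and real deployments would more likely use WS-Security or TLS client certificates, with correspondingly higher overhead:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthGateway {
    /** Builds the HTTP Basic Authorization header value a gateway could
     *  attach before forwarding a request to a secured back-end service. */
    static String authorizationHeader(String user, String password) {
        String credentials = user + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // The gateway would set this header on the outgoing request it proxies.
        System.out.println(authorizationHeader("client", "secret"));
    }
}
```

Even this cheapest form of authentication is per-message work on the ESB's hot path, which is why its cost should show up in a benchmark rather than be switched off for it.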

Concurrent Clients

While showing great performance for a single client, an ESB may be totally unusable for many concurrent clients. For stable and fair results, an ESB must be isolated from clients as much as possible. Clients must be on their own machine(s), the ESB should have its own dedicated server, and any possible proxied service (see the Virtualization scenario) must have its own machine as well. This configuration corresponds to a typical production environment.

Having the clients, the ESB server, and the proxied service on a single machine shows significant fluctuations in results. It is logical - how many cores does your server have? Let's count. We want to test with 1,000 concurrent clients. The ESB should create 1,000 threads to serve the clients quickly. And the proxied service should have the same number of threads to serve the ESB's requests. This requires a total of 3,000 threads. Switching a thread context is not a cheap operation. Even if the ESB and the proxied service used 200 threads each, we would still end up with 1,400 threads. It is a fight for resources, not a serious performance measurement, since you are far beyond the server's saturation point.
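
The arithmetic above can be written out directly (a toy model of my own; it assumes a thread-per-request design in every tier, whereas an NIO-based ESB would need far fewer threads):

```java
public class ThreadBudget {
    /** Total threads competing for CPU when the clients, the ESB, and the
     *  proxied service all run on one machine (thread-per-request model). */
    static int collocatedThreads(int clientThreads, int esbThreads, int serviceThreads) {
        return clientThreads + esbThreads + serviceThreads;
    }

    public static void main(String[] args) {
        // 1,000 clients, thread-per-request in every tier:
        System.out.println(collocatedThreads(1000, 1000, 1000)); // 3000
        // Capped pools of 200 threads in the ESB and in the service:
        System.out.println(collocatedThreads(1000, 200, 200));   // 1400
    }
}
```

Either way, the thread count dwarfs the core count of a typical server, so the measurement reflects scheduler contention rather than ESB performance.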

Conclusion

Any ESB vendor can create a scenario, with suitable conditions, that beats all other ESBs. I know that because I can fine-tune JBoss ESB to perform better in the existing scenarios. But it does not prove much until there is a valid set of standard scenarios for performance measurement. Even if there were such a set, developers would have to take other aspects into account as well. Great performance numbers are useless if an ESB can only run some simple scenarios, or if there is not a wide range of supported endpoints. Stability and the absence of memory leaks are also important if your server is not to crash every other day.

I have been dealing with ESB performance for almost two years now, and there are still more questions than answers. I would like to collect all the requirements for such objective performance measurement scenarios for ESBs. I hope to hear your opinions here, because I do not know the answers to my questions yet. If there are scenarios widely used in production environments, then they are definitely good for performance testing. I plan to publish my findings here later based on your input.

Do you run performance tests of your SOA solutions? What scenarios do you use? Do you believe a standardized benchmark is a must? What is the right size of a test message? How many concurrent clients should be used?

Acknowledgements

Special thanks to Jiri Pechanec and Len DiMaggio for their help.

Published at DZone with permission of its author, Martin Vecera.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Asankha Perera replied on Thu, 2010/08/05 - 9:13pm

Hello Martin

This is a good article! One small typo in the text is about the name of the company "AdroitLogic" being written differently :)

Your comment "We want to test with 1,000 concurrent clients. The ESB should create 1,000 threads to serve the clients fast" does NOT apply to an ESB using Java Non-Blocking IO, although it certainly does apply to "some" ESBs out there that will choke on such a load.

See how the UltraESB handles thousands of concurrent users with only a very few threads [http://bit.ly/bu6FYf ]. The ESB performance tests at http://esbperformance.org use 20, 40, 80, ... 2560 concurrent users with messages of size 512 bytes up to 100K and record performance for each scenario. You could run the complete performance suite and notice that the maximum thread count never exceeds somewhere around one hundred for the UltraESB.

It is good to see SPEC starting to define a SOA benchmark - although they are quite late in doing so! However, what I do not like about this approach is that when I reached out to them, as probably the person who has run most of the ESB performance testing you've mentioned above, they asked me to first pay up and get a "full SPEC membership"! I think an ESB, an ESB performance benchmark, as well as contributing to either of these, should be free of any cost and open to any individual.

I fully agree with your suggestions on including non-XML based performance tests, and REST/JSON is a very good candidate. I am specifically staying out of JMS performance benchmarking as it's highly dependent on the JMS provider's performance and the message persistence chosen, and thus reflects the performance of "something else" other than the ESB itself.

The esbperformance.org suite will be updated with REST/JSON shortly and any other scenarios suggested are most welcome!

cheers

asankha

Founder and CTO AdroitLogic

Chris Haddad replied on Mon, 2013/05/06 - 8:06am

ESB Performance benchmark 6.5 results are available

