
Anthony Goubard is a freelance Senior Software Engineer from Amsterdam. He has developed in Java since 1995, has written many software applications, and is the main developer of the Web Services framework XINS. Anthony is a DZone MVB (not an employee of DZone) and has posted 33 posts at DZone.

Top Ten Performance Problems and Their Solutions


Whether you're the developer or the user of a Java application you would like to see run faster, here are the top ten tips.

10) GregorianCalendar

This class is slow, and it is not thread-safe either. If possible, avoid it and use Joda-Time instead.
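For instance, if all you actually need is the current time in milliseconds, you can skip the calendar object entirely. A minimal sketch (the class and method names here are mine, for illustration only):

```java
import java.util.GregorianCalendar;

public class TimestampExample {
    // Expensive: allocates and initializes a full calendar just to read the clock.
    static long viaCalendar() {
        return new GregorianCalendar().getTimeInMillis();
    }

    // Cheap: reads the clock directly, no calendar arithmetic involved.
    static long direct() {
        return System.currentTimeMillis();
    }

    public static void main(String[] args) {
        // Both read the same clock, so the two values are close together.
        System.out.println(Math.abs(viaCalendar() - direct()) < 1000);
    }
}
```

Reach for GregorianCalendar (or Joda-Time) only when you really need calendar arithmetic, such as time zones or day-of-week calculations.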

9) No time based animation

All animation should be time based, so that an animation meant to take 1 second takes 1 second even if it only gets 4 frames per second.
I hate the "all downloads have finished" window that takes 10 seconds and uses all the remaining CPU. Look at the Timing Framework.
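The idea, sketched minimally (class and method names are mine): compute the animation's progress from elapsed wall-clock time, never from a per-frame counter.

```java
public class TimeBasedAnimation {
    // Returns the animation progress (0.0 to 1.0) from elapsed time,
    // independent of how many frames have actually been painted.
    static double progress(long startMillis, long nowMillis, long durationMillis) {
        double fraction = (nowMillis - startMillis) / (double) durationMillis;
        return Math.max(0.0, Math.min(1.0, fraction)); // clamp to [0, 1]
    }

    public static void main(String[] args) {
        long duration = 1000L; // a 1-second animation
        // Halfway through the second, the object is at 50% of its path,
        // whether we managed to paint 4 frames or 60.
        System.out.println(progress(0L, 500L, duration));  // 0.5
        System.out.println(progress(0L, 1500L, duration)); // clamped to 1.0
    }
}
```

Each repaint then draws the object at `progress * distance`, so a slow machine simply drops frames instead of stretching the animation.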

8) No cache

If you compute the same thing again and again, try to identify where it happens and cache the latest results.
Be careful not to create a memory problem by never releasing objects from the cache.

  private static final int LIMIT = 100;
  private final Map<String, Object> cache =
      new LinkedHashMap<String, Object>(LIMIT, 0.75f, true) {
          @Override
          protected boolean removeEldestEntry(Map.Entry<String, Object> eldest) {
              return size() > LIMIT;
          }
      };

7) No feedback

If you don't provide a progress bar or an hourglass, the user will think the application is hanging or slow. Provide some feedback: either the user won't notice the wait, or at least he'll know what the application is doing. Use a JProgressBar or setCursor(new Cursor(Cursor.WAIT_CURSOR));
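A minimal sketch of the progress-bar side (the class and helper names are mine; in a real application the value would be updated from a background thread such as a SwingWorker):

```java
public class ProgressFeedback {
    // Maps work done to the 0-100 range a JProgressBar expects.
    static int percent(int done, int total) {
        if (total <= 0) {
            return 0;
        }
        return Math.max(0, Math.min(100, (int) (100L * done / total)));
    }

    public static void main(String[] args) {
        // In a Swing application you would wire it up roughly like this:
        //   JProgressBar bar = new JProgressBar(0, 100);
        //   frame.setCursor(Cursor.getPredefinedCursor(Cursor.WAIT_CURSOR));
        //   bar.setValue(percent(done, total)); // called as work progresses
        System.out.println(percent(3, 12)); // 25
    }
}
```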

6) Logging

Logging is slow because it uses the disk and is synchronized (each thread needs to wait its turn).
A first fix is to log to a different disk from the one the application runs on.
You should also look at what you're logging and reduce it to only what is needed.

5) Premature optimization

What? Yes, premature optimization is the root of all evil.
Moreover, you may make things worse with incorrect assumptions.
I once profiled a framework where 40% of the time was spent in code that had been prematurely "optimized".
Don't assume: measure, then act. Measure and optimize during the late beta phase of your project.
Profilers are your friends here.

4) Database

Databases are often the weakest link in an application.
Measure, create indexes where needed, optimize queries, and if it's still a problem, change databases.

3) Network

The network is slow. Two things are slow: the connection to the server and the download of the information.
Cache whatever comes from the network when possible. Compress the information sent and received using GZIPOutputStream or a GZIP servlet filter.
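A minimal round-trip sketch with the JDK's own java.util.zip classes (class and method names are mine):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipExample {
    static byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(bos);
        gzip.write(data);
        gzip.close(); // flushes and writes the GZIP trailer
        return bos.toByteArray();
    }

    static byte[] decompress(byte[] data) throws IOException {
        GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(data));
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = gzip.read(buf)) != -1) {
            bos.write(buf, 0, n);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Repetitive text compresses well, like most HTML or XML payloads.
        byte[] original = new String(new char[200]).replace('\0', 'x').getBytes("UTF-8");
        byte[] packed = compress(original);
        System.out.println(packed.length < original.length); // true
    }
}
```

On the server side, a GZIP servlet filter does the same thing transparently for every response whose client sends Accept-Encoding: gzip.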

2) Software

Use the latest software available, especially Java 6 update 6.
Application servers and databases also compete with each other to be the fastest, so the more recent your version, the faster it will be.
Another good example of performance improvement is FreeBSD 7.

1) Hardware

Who doesn't know Moore's law?
I was once asked to produce a benchmark for a toolkit. After writing and running the code, I presented my results with the conclusion: for better results, run it on better hardware. And they did.

Published at DZone with permission of Anthony Goubard, author and DZone MVB.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Jess Holle replied on Thu, 2008/06/26 - 9:47am

You forgot to mention "bad algorithms".

This is really the converse of "premature optimization".  You shouldn't contort your code to prematurely optimize things that may not need it, but you should take care in crafting/selecting an appropriate algorithm.  Caching is part of this, but can itself be a premature optimization.  Little things like avoiding linear searches (unless you are absolutely certain your data is going to stay really, really small) and using appropriate data structures for your task can make a huge difference -- so huge that Moore's law won't make up the difference quickly enough :-)
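[Editor's sketch of the point above, with illustrative names: the same membership test against a List and a HashSet.]

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class LookupExample {
    static List<Integer> asList(int n) {
        List<Integer> list = new ArrayList<Integer>(n);
        for (int i = 0; i < n; i++) {
            list.add(i);
        }
        return list;
    }

    static Set<Integer> asSet(int n) {
        return new HashSet<Integer>(asList(n));
    }

    public static void main(String[] args) {
        int n = 100000;
        // list.contains is O(n): it compares element by element until it matches.
        System.out.println(asList(n).contains(n - 1)); // true, after n comparisons
        // set.contains is O(1) on average: one hash computation, one bucket probe.
        System.out.println(asSet(n).contains(n - 1));  // true, after ~1 probe
    }
}
```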

Also with both database and network, it should be noted that round-trips in series (as opposed to in parallel) need to be minimized -- as does the amount of data sent both ways on the pipe.  This (and SQL indexing and query optimization, of course) are critical if you actually care about performance :-)

Anthony Goubard replied on Thu, 2008/06/26 - 11:11am in response to: Jess Holle

Well, funny, because I've done linear search on a big data set using the old Hashtable and Vector as data structures.

The data set is all existing words (or most words) of several languages, and it goes really fast.

Everybody is free to test it, as it's available as an applet (use the text field in the toolbar). Because I kept to the KISS principle, it was easy to optimize.

Of course when moving to Java ME I used the binary search method. 
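[Editor's sketch of that last point: the JDK's built-in binary search, which needs sorted input and runs in O(log n) -- the kind of saving that matters on a constrained Java ME device.]

```java
import java.util.Arrays;

public class BinarySearchExample {
    public static void main(String[] args) {
        // Arrays.binarySearch requires a sorted array.
        String[] words = { "apple", "banana", "cherry", "date" };
        Arrays.sort(words);
        System.out.println(Arrays.binarySearch(words, "cherry")); // 2
        System.out.println(Arrays.binarySearch(words, "fig") < 0); // true: not found
    }
}
```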

Peter Karussell replied on Thu, 2008/06/26 - 12:35pm

Just one small tip:

One should only trust one's own performance benchmarks! But this can be somewhat difficult (especially with the JIT enabled).

Jeroen Wenting replied on Fri, 2008/06/27 - 12:14am

10) Joda Time is a disaster. I've used it; it may be faster, but it's so obfuscated and convoluted that it's not worth the effort. Using Joda Time would thus fall under "5) premature optimisation" more often than not, and is therefore a worse choice than GregorianCalendar!

 8) Would depend on the cache used and the thing you're calculating. I've seen plenty of examples where recalculating was faster than the whole cache lookup and retrieval process.

4) while the database may sometimes be slow, most often that's a result of "3) the network" rather than the database itself. Databases are generally faster than most other persistence schemes you can come up with.

3) The network is unreliable and potentially slow, but it's more often than not faster than the user interface (which the vast majority of programmers don't know how to code efficiently).

2) Java 6 is a disaster. The same performance benefits have been backported to 1.5 while the stability problems have not.

1) The eternal fallacy, and the cause of most complaints about software. Just telling people they should invest in new hardware to run your application when their current hardware is relatively new is NOT good. If your software can't run on hardware that's a year or two behind the current market average in performance (certain heavyweight applications like games and simulation systems excluded), YOU have a problem, YOU did a poor job.
Shoving that problem onto your client, forcing him to spend hundreds to tens of thousands of Euros to run your application, is utterly unacceptable.

Brian S O'Neill replied on Fri, 2008/06/27 - 10:45am in response to: Jeroen Wenting

Re: "Joda Time is a disaster."  Can you elaborate a bit please? Given that JSR-310 will borrow heavily from Joda Time, perhaps you should voice your concerns.

Re: "while the database may sometimes be slow,"  The proposed solution was to optimize aspects of the database itself, not to write your own database.

Re: "Java 6 is a disaster."  Again, can you elaborate a bit? Your choice of words is a bit strong, so you're going to need strong arguments to back this up.

Re: Hardware vs. software:  Buying new hardware vs rewriting software is an economic choice. Software development is not cheap. Granted, the initial design should be good enough on its own, but it often isn't. When it's wrong, which is cheaper? Hardware upgrade or software development? It depends, and so any decision must be made carefully.

Adam Malter replied on Mon, 2008/06/30 - 11:22am in response to: Brian S O'Neill

Re Re: "Java 6 is a disaster."  Again, can you elaborate a bit? Your choice of words is a bit strong, so you're going to need strong arguments to back this up.

I like being the contrarian also, but there is definitely some evidence to back this up. Try running Eclipse on anything north of Sun JDK 6b4 with the -server HotSpot JIT. (Just make sure you save often!) You'll soon come upon the dreaded compiler-thread core dump.

I think anyone who has a project of decent size has seen their own share of errata in the new JDK. My guess is that they are becoming more aggressive about backporting all the cool gee-whiz optimizations in the OpenJDK project. But this is nothing new: Java has had good builds and bad builds for years. If I remember correctly, 1.4.0 and 1.4.1 were disasters, while 1.4.2 was a rock-solid dreadnought battleship.

What would be nice is a place to share this backroom secret info that generally gets passed around between project leads over beers during conventions. Like "The Drudge Report" but for Java rumors :-)

Jeroen Wenting replied on Tue, 2008/07/01 - 12:27am

Joda Time is unintuitive and doesn't interface well with the standard classes. That makes it very hard to use unless you decide never to use the standard classes at all. In any realistic application that's quite impossible, as almost everything else you use will use the standard classes.
For pretty much everything we need to do, the standard classes are quite good enough. About the only thing missing, IMO, is an Interval and/or Duration, and there's no need to completely uproot the entire system to introduce those.

Java 6 is unstable, a grab bag of "ME 2!!!" "features" included for no other reason than that competing technology has them as well (with no thought as to whether doing so would be useful at all).
And worse yet, it's opening the door for the even worse product that's going to be Java 7.
We evaluated 6 for our servers because of its supposedly superior performance over 5, but quickly had to abandon it because they kept crashing and entering ever-longer garbage collection cycles (effectively killing performance).
