As an Agile Coach, Miško is responsible for teaching his co-workers to maintain the highest level of automated testing culture, allowing frequent releases of applications with high quality. He is very involved in the Open Source community and is the author of several open source projects. Recently his interest in Test Driven Development turned into http://TestabilityExplorer.org, which he hopes will change the testing culture of the open source community.

Managing Object Lifetimes

04.16.2009

There is a myth out there that creating objects is expensive. The result is that many applications have objects whose lifetimes are too long. Let's take a web app as an example. Most web apps I have seen have too many long-lived objects and not enough request-scoped objects, which has implications for how data flows through the application. A web app has at least three scopes: application, session, and request. These scopes have decreasing lifetimes, and most of the time they do not overlap (i.e., request-scoped objects die before the session-scoped objects do).
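
For concreteness, here is a rough mapping of those three scopes onto the Servlet API; this is an illustration of the idea, not code from the article, and it assumes we are inside a servlet's doGet/doPost where a request object is available:

// Illustrative mapping of the three web-app scopes to the Servlet API,
// as seen from inside a servlet's doGet/doPost.
HttpSession session = request.getSession();                // session scope: spans many requests by one user
ServletContext application = session.getServletContext();  // application scope: one per web app, alive until shutdown
// request scope: the HttpServletRequest itself; it is done once the response has been written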

Now I believe that each object should only hold references to other objects whose lifetime is equal to or greater than its own. In other words, a request can know about the session, but it is not a good idea for the session to know about a request. (By 'know' I mean holding a field reference; the session can still receive a temporary reference to the request through the stack, i.e., as a method parameter.) The way a long-lived object gets hold of a short-lived object is through the stack (passed in through a method parameter). We could generalize this rule to say: pass in objects of equal or greater lifetime through the constructor, and objects of shorter lifetime through the stack.
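
A minimal sketch of that rule; the class names here are illustrative assumptions, not from the article:

// Hypothetical classes illustrating the rule.
class Request { /* request-scoped data */ }

class UserSession {                         // long lived (session scope)
    // No Request field here: the session must not hold on to a request.
    void recordVisit(Request request) {     // shorter-lived object arrives via the stack
        // use 'request', then let it die when the method returns
    }
}

class RequestHandler {                      // short lived (request scope)
    private final UserSession session;      // equal-or-greater lifetime: injected via the constructor
    RequestHandler(UserSession session) { this.session = session; }
    void handle(Request request) { session.recordVisit(request); }
}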

Now the interesting part is what happens when you start to break this rule. There are two ways to break it; let's look at them in turn.

When the rule is reversed (a short-lived object is passed into a long-lived object's constructor, i.e., the session knows about the request) you have a recipe for bugs, since the garbage collector can no longer clean up objects that are clearly out of scope. There is not much you can do with a request object after the HTTP connection is closed. I see this mistake often, and usually developers solve it with some kind of clean-up code. Recently, for example, I came across a game of Go which had an App class and a Game class. App lives for the lifetime of the application, but Game lives only for the duration of a single game. In the code, every time the user wanted to play a new game the App had to run a cleaning process over the Game object, which was a source of errors: your first game would behave correctly, but your second game might not. The solution is to simply throw away the Game and instantiate a new one. The App is then responsible for instantiating the Game object, but it does not keep a reference to it (beyond the local reference on the stack).
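
Here is a sketch of what that might look like; the App and Game names come from the article, but the method names are illustrative assumptions:

// App is application scoped; Game is scoped to a single game.
class App {
    void playNewGame() {
        Game game = new Game();   // fresh per-game state every time: nothing to clean up
        game.play();
        // no field keeps 'game' alive; it becomes garbage once this method returns
    }
}

class Game {
    // all per-game state lives here and dies with this instance
    void play() { /* ... */ }
}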

A more common violation is not having an object of the appropriate lifetime in the first place. For example, let's say you have a web app in which none of the classes are meant to be request scoped; your entry point is a servlet, and a servlet is application scoped, so you cannot inject the request/response into its constructor. Instead they have to be passed in through the stack, which is exactly what happens.

Not all of your code lives in the servlet, so the servlet needs to collaborate with other objects, but most of them are application scoped as well. Now let's say you have three application-scoped objects A, B and C, such that A calls B, B calls C, and A does not know about C directly (i.e. A->B->C). Let's say that C needs the HTTP cookie. The only way to get the cookie into C from the servlet is to pass it as an argument to A, which then passes it to B, which then passes it to C. Sooner or later you will realize that this is a real pain, since every time C needs a new object reference you have to modify all of these unrelated classes. So you create a context object which knows about the cookie and all of the other objects, and you are proud of yourself for how clever you are. But a context is solving the wrong problem. The real problem is that C's lifetime is wrong. If C's lifetime were that of the cookie (request scope), then the servlet could just pass it in directly, like this:

HttpRequest request = ...;
C c = new C(request.getCookie());  // C is request scoped: it receives the cookie directly
B b = new B(c);                    // B and A are wired up fresh for this request
A a = new A(b);
a.doWork();                        // neither A nor B ever sees the cookie

This is great, since A and B are no longer involved in passing the cookie along; after all, neither A nor B cares about the cookie. It also means that if at some later point C needs an additional object, such as the request path, C can simply ask for it in its constructor without anything having to change in A or B.
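
For example, continuing the snippet above with the article's pseudo HttpRequest API (getPath() here is an illustrative stand-in for however you would read the request path), only C's constructor and this wiring code change; A and B stay untouched:

HttpRequest request = ...;
C c = new C(request.getCookie(), request.getPath());  // only C and this wiring code change
B b = new B(c);
A a = new A(b);
a.doWork();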

The servlet becomes the place where the long-lived object (the servlet) meets the short-lived objects (our A-B-C object graph). This means that whenever you have to cross the lifetime boundary in reverse (a long-lived object needing shorter-lived collaborators), you need a factory, which is exactly what our servlet is.
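
As a rough sketch of that factory role (the servlet class name and the cookie lookup are assumptions, C is assumed to take the cookie value as a String, and the real Servlet API returns a Cookie[] rather than a single cookie, so a little unpacking is needed), it might look like this:

import javax.servlet.http.*;

// The long-lived servlet acts as a factory for the short-lived A->B->C graph.
public class WorkServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) {
        Cookie[] cookies = request.getCookies();   // may be null if the client sent no cookies
        String cookieValue = (cookies == null || cookies.length == 0)
                ? null : cookies[0].getValue();
        C c = new C(cookieValue);                  // a brand-new request-scoped graph per request
        B b = new B(c);
        A a = new A(b);
        a.doWork();                                // nothing here outlives the request
    }
}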

Therefore, whenever you find yourself passing objects through many layers, or creating a context and then passing the context through the layers, your layers are probably too long-lived and need to have their lifetimes adjusted. You can generalize this and say: whenever a class holds a reference to another object but never directly dispatches methods on that object, the collaborator that actually needs the object has the wrong lifetime.

From http://misko.hevery.com/

Published at DZone with permission of Misko Hevery, author and DZone MVB.



Comments

Mark Thornton replied on Thu, 2009/04/16 - 6:23am

Not so much a myth as ancient history. In the early days of Java, creating objects was expensive, and techniques like object pooling did produce significant gains.

 

Thomas Nagel replied on Thu, 2009/04/16 - 8:45am

Another important area to keep in mind is garbage collection, or heap management, since it has consequences not only for your application's memory footprint but also for its responsiveness.

If you read the articles by the programmers working on garbage collectors, it's easy to see that creating and destroying objects often simplifies memory management drastically.

Guillaume Jeudy replied on Thu, 2009/04/16 - 8:53am

Excellent post! Beyond the memory-management aspect, properly managing object lifecycles also gives you code that is more test-friendly and less error prone. That point is unlikely to become obsolete for the foreseeable future.

Alessandro Santini replied on Thu, 2009/04/16 - 9:05am

Hello Misko,

In principle I agree with the rule of thumb you provided; however, I would like to submit a couple of pointers to your attention:

  • Weak/soft references have been available since v1.4.2. In summary, you can create a reference to an object while still allowing it to be finalized and garbage collected (see the short sketch after this list). A nice tutorial can be found at http://weblogs.java.net/blog/enicholas/archive/2006/05/understanding_w.html
  • I did not fully understand the App/Game example. I assume that the ongoing games have to be stored somewhere (using a strong reference) in order to let the game be found by potential players; in the context you are presenting, the garbage collector may pass and reclaim an instance of a presently-running game. At this point, you will agree with me that cleanup is still necessary.
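
A minimal illustration of that weak-reference behaviour (not from the original comment):

import java.lang.ref.WeakReference;

// A weak reference lets the referenced object be garbage collected
// even while the WeakReference itself is still strongly reachable.
Object payload = new Object();
WeakReference<Object> ref = new WeakReference<>(payload);
payload = null;                    // drop the only strong reference
System.gc();                       // only a hint; the JVM may or may not collect now
Object maybePayload = ref.get();   // may be null if the object was reclaimed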

Ciao.

Alessandro
