
Introducing Caching for Java Applications (Part 1)

07.17.2008

Common Cache Use Scenarios

Common cache use scenarios include an application cache, a second-level (L2) cache and a hybrid cache.

Application Cache

An application cache is a cache that an application accesses directly. An application benefits from a cache by keeping its most frequently accessed data in memory.
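The pattern can be sketched as direct cache access, with the application itself handling misses. The class and method names below are illustrative, not from any particular caching library:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// A minimal application cache: the application reads and writes it directly
// and is itself responsible for populating it on a miss.
public class AppCacheExample {
    private static final Map<String, String> cache = new ConcurrentHashMap<>();

    // Stand-in for an expensive lookup the cache is shielding.
    static String loadFromDatabase(String key) {
        return "value-for-" + key;
    }

    static String get(String key) {
        String value = cache.get(key);
        if (value == null) {               // cache miss: the application
            value = loadFromDatabase(key); // fetches the data itself
            cache.put(key, value);         // and populates the cache
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(get("user:42")); // first call hits the "database"
        System.out.println(get("user:42")); // second call is served from memory
    }
}
```

Note that the miss-handling logic lives in application code; this is what distinguishes the scenario from the hybrid cache described below.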

The following communication diagram illustrates using an application cache:

 

 

Level-2 Cache

One of the major use scenarios for a cache is a level-2 (L2) cache. An L2 cache provides caching services to an object-relational mapping (ORM) framework, such as Hibernate, or to a data mapping (DM) framework, such as iBatis. An L2 cache hides the complexity of the caching logic from the application.

An L2 cache improves the performance of an ORM or DM framework by reducing unnecessary trips to the database.

The following communication diagram illustrates using an L2 cache:

 

The application does not access the cache directly in this scenario. Instead, it works through the high-level interface provided by the ORM or DM framework. The framework uses the cache to store its internal data structures, such as mapped objects and database query results. If the cached data is not available, the framework retrieves it from the database and puts it into the cache.
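Because the framework owns the caching logic, enabling an L2 cache is typically a matter of configuration rather than code. As a sketch, for Hibernate 3 with the Ehcache provider the relevant properties looked roughly like this (the provider class is one example; other cache providers plug in the same way):

```properties
# Enable Hibernate's second-level cache and pick a provider
hibernate.cache.use_second_level_cache=true
hibernate.cache.provider_class=net.sf.ehcache.hibernate.EhCacheProvider

# Optionally cache query results as well as mapped objects
hibernate.cache.use_query_cache=true
```

The application code that calls the ORM does not change; the framework consults the cache transparently.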

Hybrid Cache

A hybrid cache is a cache that uses an external data source to retrieve data that is not present in the cache. An application using a hybrid cache benefits from simpler cache-access code.

This scenario differs from the application cache and the L2 cache, where the application or the data access framework is responsible for populating the cache on a miss; a hybrid cache populates itself.
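A minimal sketch of this read-through behavior, assuming the external data source is modeled as a simple function (the class name and API are hypothetical; modern Java's computeIfAbsent is used here for brevity):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// A hybrid (read-through) cache: the cache itself consults an external
// data source on a miss, so callers never write miss-handling code.
public class HybridCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> dataSource; // external source, e.g. a DAO

    public HybridCache(Function<K, V> dataSource) {
        this.dataSource = dataSource;
    }

    public V get(K key) {
        // On a miss, computeIfAbsent loads the value through the data source
        // and stores it; on a hit, it returns the cached value directly.
        return store.computeIfAbsent(key, dataSource);
    }

    public static void main(String[] args) {
        HybridCache<String, Integer> lengths = new HybridCache<>(String::length);
        System.out.println(lengths.get("cache")); // loaded from the data source
        System.out.println(lengths.get("cache")); // served from the cache
    }
}
```

Compare this with the application-cache sketch above: the null-check-then-load logic has moved from the caller into the cache itself.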

The following communication diagram illustrates using a hybrid cache:

 

 

Caching Anti-Patterns

Caching provides such a great improvement in performance that it is often used without limit. The Cache Them All anti-pattern is characterized by caching all data, without regard to the temporal or spatial locality of data access. Cache Them All degrades application performance instead of improving it: the application pays the overhead of maintaining the cache without benefiting from cheaper access to frequently used data.

To avoid the Cache Them All pitfall, cache only data that is expensive to obtain and shows temporal or spatial locality of access.

Caching Products for Java

While it takes only about ten lines of code to write your own cache in Java, developing a usable cache is a challenging task. The simple ten-line cache is missing many features required for use in a real application, such as concurrent access, configuration, eviction to disk and statistics.
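Such a ten-line cache can be built on java.util.LinkedHashMap. Note the three-argument constructor with accessOrder set to true, which makes eviction least-recently-used rather than least-recently-inserted (a subtlety discussed in the comments below):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A bare-bones LRU cache in roughly ten lines. It lacks the features a
// production cache needs: safe concurrent access (wrap it with
// Collections.synchronizedMap for that), configuration, eviction to disk
// and statistics.
public class SimpleLRUCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public SimpleLRUCache(int maxSize) {
        super(16, 0.75f, true); // accessOrder=true -> true LRU ordering
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize; // evict the least recently used entry
    }
}
```

With a capacity of two, putting a third entry evicts whichever of the first two was touched least recently, not simply the one inserted first.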

Fortunately, several projects have already developed working caching APIs, so there is no need to reinvent the wheel.

A commercial caching API may be a safe bet if your organization wants to be sure that its requests for help are addressed in time and that the caching API is developed professionally.

Often an organization cannot afford commercial software and is prepared to take some risk. In such a situation a free caching API may be the solution.

The following sections outline some commercial and free products supporting local caching.

Commercial Products


Free Products

 

About Author

Slava Imeshev is a president and CEO of Cacheonix Systems. You can reach Slava at simeshev@cacheonix.com.

Attachments:
- hybrid_cache.gif (10.44 KB)
- level_2_cache.gif (9.82 KB)
- application_cache.gif (10.44 KB)
Published at DZone with permission of its author, Slava Imeshev.


Comments

Alex Miller replied on Thu, 2008/07/17 - 3:40pm

Your SimpleLRUCache is not LRU.  The constructor needs to call the special LinkedHashMap constructor that takes a boolean arg (set to true for "last access order") instead of by insertion order.  As is, this is really a Least Recently Inserted cache.  Which might also be useful, but isn't LRU. 

François Ostyn replied on Thu, 2008/07/17 - 4:00pm

Hello,

Congratulations on your article.
Managing an "in-memory" cache in the JVM is a good way to improve performance.
Personally, I used this method for a critical project at my last company and the results were excellent.
(FYI, it was a business web site with 30K users...)
An "in-memory" cache in the JVM is very good for a small application.
To replicate caches, I generally use JMS (with OpenMQ)...
Currently, I'm testing memcached (another free cache server), used by LinkedIn for example.
If you use Hibernate (or not), I advise using EhCache, a very good product ;)

This is a very interesting subject (and Java application tuning in general), deserving of an entire book...

Cheers
François OSTYN
J2EE Architect

Slava Imeshev replied on Thu, 2008/07/17 - 5:38pm in response to: Alex Miller

Alex,

[quote=puredanger]Your SimpleLRUCache is not LRU. The constructor needs to call the special LinkedHashMap constructor that takes a boolean arg (set to true for "last access order") instead of by insertion order. As is, this is really a Least Recently Inserted cache. Which might also be useful, but isn't LRU. [/quote]

 

Yes, this is a bug. Thanks for pointing it out; it has been fixed.

 

Slava

 

Alex Miller replied on Thu, 2008/07/17 - 8:00pm in response to: Slava Imeshev

Yep, that fixed it.  Although you presized the map to 1, which probably isn't big enough. :)  The default for LinkedHashMap is 16. 

Also, I should mention that Terracotta provides an easy way to distribute many of these open-source caches across multiple JVMs. 

Slava Imeshev replied on Thu, 2008/07/17 - 8:24pm in response to: Alex Miller

Alex,

The second and last article in the series will cover distributed caching and data grids :)

Regards,

Slava Imeshev

Kode Ninja replied on Fri, 2008/07/18 - 5:18am in response to: Slava Imeshev

How about generifying your cache interface:

public interface Cache<K, V> {
    V get(final K key);
    V put(final K key, final V value);
}

That's a type-safe cache interface!

Slava Imeshev replied on Fri, 2008/07/18 - 6:28am

 

Yes, that would definitely make sense. Strongly typed interfaces are the way to go, no doubt about it.

Unfortunately, there are a lot of shops still using 1.4, and being forced to support it did affect my thinking when writing that piece. Not sure if this counts as an excuse :-)

 

Regards,

 

Slava Imeshev

François Ostyn replied on Fri, 2008/07/18 - 6:47am in response to: Slava Imeshev

Excellent ;)
And you can speak about cacheonix... your "baby" !
Thanks
François OSTYN

Dmitry Namiot replied on Fri, 2008/07/18 - 6:48am

for LRU cache right in your JSP (or Coldfusion) applications you can use this component: LRU taglib

Alex(JAlexoid) ... replied on Sun, 2008/07/20 - 7:30pm in response to: Alex Miller

[quote=Alex] As is, this is really a Least Recently Inserted cache[/quote]

Last time I checked, "Least Recently Inserted" is called a FIFO-eviction cache, or FIFO cache.

Slava Imeshev replied on Sun, 2008/07/20 - 8:23pm in response to: Alex(JAlexoid) Panzin

[quote=jalexoid]

[quote=Alex] As is, this is really a Least Recently Inserted cache[/quote]

Last time I checked "Least Recently Inserted" is called FIFO eviction algorithm based cache or FIFO cache.

[/quote]

 

In the case of FIFO eviction, cache elements do not change their position and are evicted in the order they were added. So, "Least Recently Inserted" is not FIFO.

 

Regards,

Slava Imeshev

 

 
