Computers have been my hobby since I was 12. Now I'm a freelance Java developer. Like many other developers, I am working on various private projects. Some are open source components (Butterfly Components: DI container, web UI, persistence API, mock test API, etc.). Some are tutorials. Yet others are web projects. I hold a bachelor's degree in computer science and a master's degree in IT focused on P2P networks.

In-memory databases: Use a standard product, or roll your own?


Here are a few questions I would like to ask the community:

As memory gets cheaper, more and more application datasets can be kept fully in memory, with only changes flushed to disk. No reads from disk (except at startup), only writes.

Some of the applications I think could benefit from keeping all data in memory, never reading from disk but only flushing changes to it (perhaps asynchronously), are:

*) Multiplayer game servers
*) Search engine live indexes
*) Live analysis systems
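To make the idea concrete, here is a minimal sketch of such a setup: an in-memory map serves all reads, while a background thread asynchronously appends change records to a log. The class and field names (`WriteBehindStore`, the key=value log format) are made up for illustration, not taken from any particular product.

```java
import java.io.IOException;
import java.io.Writer;
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// All reads hit the in-memory map; every write also enqueues a change
// record that a background thread appends to a log on disk.
public class WriteBehindStore {

    private final Map<String, String> data = new ConcurrentHashMap<>();
    private final BlockingQueue<String> changeLog = new LinkedBlockingQueue<>();
    private final Thread flusher;
    private volatile boolean running = true;

    public WriteBehindStore(Writer logWriter) {
        this.flusher = new Thread(() -> {
            try {
                // Keep draining until shutdown AND the queue is empty,
                // so no change record is lost.
                while (running || !changeLog.isEmpty()) {
                    String change = changeLog.poll(100, TimeUnit.MILLISECONDS);
                    if (change != null) {
                        logWriter.write(change + "\n"); // async flush to disk
                        logWriter.flush();
                    }
                }
            } catch (InterruptedException | IOException e) {
                Thread.currentThread().interrupt();
            }
        });
        this.flusher.start();
    }

    public void put(String key, String value) {
        data.put(key, value);              // in-memory update, visible immediately
        changeLog.add(key + "=" + value);  // change record, written to disk later
    }

    public String get(String key) {
        return data.get(key);              // never touches disk
    }

    public void shutdown() throws InterruptedException {
        running = false;
        flusher.join(); // wait for the remaining changes to be flushed
    }
}
```

Reads are always served at memory speed; the only disk I/O is the sequential append in the flusher thread, which is exactly the "only writes" pattern described above.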


1) Have you had any experiences with this kind of setup? All data in memory for your application?

2) Did you use a standard database product?

3) If yes, which product did you use, and what were the benefits?

4) Did you end up just keeping the data as objects in memory, and creating your own indexes, etc., to search through it?

5) If yes, what are your experiences with that?

6) Does anyone have any speed comparisons?
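As a sketch of what question 4 describes, hand-rolled indexes over plain objects can be as simple as a hash map for exact lookups plus a sorted map for range queries. The `Player` class and its fields here are invented purely for illustration:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Plain objects held in memory, with two hand-rolled indexes:
// a hash index for exact matches and a tree index for range queries.
class Player {
    final String name;
    final int score;
    Player(String name, int score) { this.name = name; this.score = score; }
}

class PlayerIndex {
    private final Map<String, Player> byName = new HashMap<>();
    private final TreeMap<Integer, List<Player>> byScore = new TreeMap<>();

    void add(Player p) {
        byName.put(p.name, p);
        byScore.computeIfAbsent(p.score, k -> new ArrayList<>()).add(p);
    }

    Player findByName(String name) {
        return byName.get(name); // O(1) exact match
    }

    List<Player> findScoreAtLeast(int min) {
        List<Player> result = new ArrayList<>();
        // tailMap gives the sub-range >= min in O(log n), then we walk the matches.
        byScore.tailMap(min, true).values().forEach(result::addAll);
        return result;
    }
}
```

The catch, of course, is that every index must be updated by hand on every mutation, which is exactly the bookkeeping a database product would otherwise do for you.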


I remember working for a data warehouse once that exported part of its index to KDB, a lightning-fast in-memory database. The indices were rebuilt every night and remained unchanged for 24 hours. The speedup was roughly 1,000x compared to searching in MS SQL Server at the time (2001).

Published at DZone with permission of its author, Jakob Jenkov.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Célio Cidral Junior replied on Thu, 2011/09/08 - 9:38pm

Take a look at Prevayler; you may find it interesting:

Martin Thompson replied on Fri, 2011/09/09 - 3:15am

We implemented our own in-memory transaction system at LMAX when building a low-latency financial exchange.  Martin Fowler wrote a good article going into the details of our experience.

I gave a presentation on what we learned at QCon last year. The comments below go into a lot of detail.
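The core idea behind an in-memory transaction system of the kind mentioned above is often event sourcing: keep all working state in memory, journal every input event, and rebuild state after a restart by replaying the journal. A much-simplified sketch of that pattern (the `Account` and `Journal` names are illustrative, not LMAX's actual design):

```java
import java.util.ArrayList;
import java.util.List;

// Event-sourcing sketch: working state lives entirely in memory;
// durability comes from journaling every input event. On restart,
// replaying the journal rebuilds the exact in-memory state.
class Account {
    long balance = 0;
    void apply(long delta) { balance += delta; }
}

class Journal {
    // Stand-in for an append-only file on disk.
    private final List<Long> events = new ArrayList<>();

    void record(long delta, Account state) {
        events.add(delta);   // journal first (on disk this would be an fsync'd append)
        state.apply(delta);  // then mutate the in-memory state
    }

    Account replay() {
        Account rebuilt = new Account();
        events.forEach(rebuilt::apply); // recovery = replaying the event log
        return rebuilt;
    }
}
```

Because the journal is the only thing that must hit disk, writes are sequential appends, and reads never leave memory.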
