
I'm a Java developer at heart, a Java team leader of a great agile team by profession, and I'm passionate about JSF, JBoss RichFaces, JBoss Seam, Spring, Selenium, Maven and others. Over the last 10+ years I have had the opportunity to work in many roles on many projects, mostly in the banking area in Germany, but also in Poland, Denmark, Sweden and the Netherlands. Some of the projects I joined at the very first draft of the business idea, and stayed until they were live and needed to adapt to changing business requirements.

Still Using Those Old-School Log Files? — Let’s Use a Log Server Instead!


Logging to files seems easy.

When you start with a small app, you usually log to the console while you are developing it. Of course you use a proper logging framework, typically log4j or logback underneath. As soon as you are done with development, you add a few lines to your log4j.xml to also log to a file, so operations is happy.
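For a log4j 1.x application, that starting point might look like the following sketch (appender names, the file path and the pattern are just examples, not the article's actual config):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- console output while developing -->
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d %-5p [%t] %c - %m%n"/>
    </layout>
  </appender>
  <!-- the few lines added later so operations gets a file -->
  <appender name="file" class="org.apache.log4j.RollingFileAppender">
    <param name="File" value="logs/app.log"/>
    <param name="MaxFileSize" value="10MB"/>
    <param name="MaxBackupIndex" value="5"/>
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d %-5p [%t] %c - %m%n"/>
    </layout>
  </appender>
  <root>
    <priority value="info"/>
    <appender-ref ref="console"/>
    <appender-ref ref="file"/>
  </root>
</log4j:configuration>
```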

But this leads to manual work later.

When operations knocks on your door with a problem at hand, you ask them for the production log files. They might return to you the next day with an email that contains some attached log files. You start analyzing the files, then the testers knock on your door and report a problem in the test environment. This means you need more log files from operations.

This is a very manual process, and it doesn’t make development, test, and operations happy. Also, someone is going to have to explain to business why it takes so long to investigate problems.

Log server to the rescue!

Wouldn’t it be nice to have immediate access to all log files of development, test, and production systems? Could there be a way to trace a user’s session through all layers, and over cluster nodes? Could everything also be in a nice GUI that's easy and fun to use?

Two years ago we changed the situation by adding a log server to our infrastructure. We decided to deploy logFaces, which we found easy to install, easy to integrate, and easy to use.

Setup is easy.

Instead of logging only to local files, all of our applications now also log to the central log server. We already use log4j in our applications, so it's as easy as adding 11 lines to our log4j.xml files. As the transmission of log messages happens in a separate thread, response time is not affected.
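logFaces ships its own log4j appender, so the exact lines differ, but the idea can be sketched with the standard log4j classes: a network appender wrapped in an AsyncAppender, which queues events and ships them on a background thread. Host and port here are placeholders:

```xml
<!-- illustrative only: logFaces provides its own appender class;
     the standard SocketAppender/AsyncAppender pair shows the pattern -->
<appender name="logserver" class="org.apache.log4j.net.SocketAppender">
  <param name="RemoteHost" value="logserver.example.com"/>
  <param name="Port" value="55200"/>
  <param name="ReconnectionDelay" value="10000"/>
</appender>
<appender name="async" class="org.apache.log4j.AsyncAppender">
  <appender-ref ref="logserver"/>
</appender>
<root>
  <priority value="info"/>
  <appender-ref ref="file"/>
  <appender-ref ref="async"/>
</root>
```

Because the AsyncAppender hands events off to its own dispatcher thread, a slow or unreachable log server does not block the request thread that produced the log message.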

The log server stores all log messages in a database. With logFaces, you have the choice of a SQL or a NoSQL database. We decided to use MongoDB to store the data, as log messages can be of arbitrary length and the total amount of log messages can be limited by size at the database level (this makes housekeeping easy). The log server tags all log messages with the originating server and the domain (for us, the application name plus a code for the environment and the country).
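The size limit the paragraph mentions maps naturally onto MongoDB's capped collections, which are fixed-size and silently overwrite the oldest documents once full. logFaces manages its own schema, so the collection name and size below are purely illustrative:

```javascript
// mongo shell sketch: a fixed-size collection for log events.
// Once the 10 GB cap is reached, the oldest entries are
// overwritten automatically -- no housekeeping jobs needed.
db.createCollection("logEvents", { capped: true, size: 10 * 1024 * 1024 * 1024 });
db.logEvents.stats().capped;   // true
```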

And you have the data at your fingertips.

Then you start the local logFaces client, which connects to the server. It's an RCP application with an Eclipse look'n'feel. You can watch the global trace in real time, or you can specify custom filters by environment, application, severity and other criteria. The full information of a log4j message is retained, including exception stack traces, class names and context information like MDC/NDC (the diagnostic contexts that let you attach values such as a session ID to every message a thread logs).
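Conceptually, log4j's MDC is just a per-thread map whose entries the layout can stamp onto every log line, which is what makes tracing one user's session across messages possible. A minimal self-contained sketch of that mechanism (the key `sessionId` is a hypothetical example):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of what log4j's MDC does under the hood: a thread-local
// map of context values. In real code you would call
// org.apache.log4j.MDC.put("sessionId", ...) in a servlet filter
// and reference it as %X{sessionId} in the PatternLayout.
public class MdcSketch {

    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    static void put(String key, String value) { CONTEXT.get().put(key, value); }
    static String get(String key)             { return CONTEXT.get().get(key); }

    public static void main(String[] args) {
        put("sessionId", "4711-abc");   // set once when the request enters
        // every subsequent log call on this thread can include it:
        System.out.println("[" + get("sessionId") + "] user clicked 'save'");
    }
}
```

Because the map is thread-local, two concurrent requests never see each other's session ID, yet neither has to pass the ID through every method signature.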

When you look for a historical log event, you can use the query editor to find what you need. Once you have found it, you can look up additional log messages leading up to that event, and export some or all of them to the clipboard or to a file.

You can use the history data to generate statistics, or you can easily browse and filter it to find errors in your production logs before your users report them.

It’s a commercial product after all.

You can test it free for 30 days, after that you’ll have to license it. The log server license is 499 USD per server or 1299 USD per site, with an unlimited number of servers for the site. I consider it a fair price, for it allows you to have an unlimited number of applications logging to the server, a log file size that is only restricted by your hardware, and an unlimited number of clients accessing the log server.

Practical things to consider:

We set up one server for the production environment, and another one for all development and test environments. This way, development and test don't fill up the log space reserved for production. Also, the access restrictions are easier to manage if you set them up per server instead of per application.

Polyglot logging is more than just Java!

We started out with linking our most important Java applications to the log server. After a while, more applications were linked.

The .NET world was also linked to the log server, using NLog. And a Java class enabled even our PL/SQL stored procedures in the database to log to the log server. As the wire protocol is quite simple (it's basically XML over TCP), you can link almost any source to the log server.

What’s next?

From my experience, I can say: Give it a try, even for a small application. If your application has multiple layers that you need to trace, and if you need to handle multiple staging environments, it becomes invaluable to have a log server solution that is easy to access.

Thanks to the direct log4j integration, it is also a valuable tool during development: for the application you have running locally, you no longer need to watch consoles.

Last thing: I'd love to hear from you. Please feel free to share your thoughts and approaches in the comments!

Published at DZone with permission of its author, Alexander Schwartz.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)


Peter Butkovic replied on Wed, 2012/08/01 - 1:23pm

I believe this type of app is one of the extremely useful ones for the maintenance of most products. However, I'm still dreaming of a similar solution, but an open-source one :)

For our product we created a viewer enabling sorting and filtering based on different log message parts, and we also have a solution that runs extracts of messages related to a particular event/user action. I can see that without these it would be very difficult to analyze logs from production systems.

However, this is all closed and not for public eyes.

Brian Luby replied on Wed, 2012/08/01 - 3:25pm

There are several open source log consolidation products out there. Logstash and Graylog2 both come to mind. Personally I am a fan of Splunk, which is somewhat pricey, but the interface and add-ons are well worth it.

Curt Cox replied on Wed, 2012/08/01 - 5:34pm

Still Using Those Old-School Log Servers? — Let’s Use a Cloud Based Log Server Instead!

Alexander Schwartz replied on Thu, 2012/08/02 - 2:33pm


Thanks for sharing your solutions!

I agree that there might be others more appropriate when you have syslog-like sources, or if you have to parse log files, e.g. from an Apache httpd server.

logFaces has its strengths in preserving all the semantics of a log4j message, and it has a nice live console...

Best regards,

Den Fletcher replied on Mon, 2013/04/29 - 6:38am

I have been a developer for over seven years now, and I've always had to deal with different log files that took forever to obtain. A lot of time is lost in the process of asking for and receiving those log files, not to mention the time needed to parse them. To avoid this and make the process a lot faster, I implemented an automatic process that uploads all these log files to a dedicated server on a regular basis. This way, anyone with the necessary clearance can access them, and they can never be deleted or lost. Although it seems like an obvious idea, it hasn't been widely implemented yet, but I am sure that's going to change in the near future.
