Sr. Software Architect, passionate about open-source tools and technologies. Expert in enterprise web application development on the Java/J2EE platform. Has worked significantly on enterprise data grids, transactional data management, B2B integration, and performance and code instrumentation. Senthil has posted 8 posts at DZone.

Unit Tests - "Written Once and Forgotten Forever"

05.27.2010

I have come across many unit tests that are written once and forgotten forever, with all their dataset and environmental dependencies baked in. Why do the dataset and environment matter? Let's take an example; observe the test case below.

public void testEmpFinder() {
    // Weird test case, for fun!
    String expected = "JOHN";
    // Passing an employee id returns an Employee object;
    // verify that the name matches.
    Employee emp = EmpFinder.find(1);
    assertEquals(expected, emp.getName());
}

What's wrong? The developer has made an assumption that querying with employeeId = 1 will return the employee named "JOHN". The data could be coming from a database table, EMPLOYEE. It's evident that this test case will fail if run in any environment where the row with employeeId = 1 doesn't exist. This makes such test cases obsolete the moment they are written.

Approach

1. Maintain insert/delete data scripts that can be run in the test's setUp() to create the data, and delete the test data on completion of the test.

2. You could use something like DBUnit (http://www.dbunit.org/), which exports and imports database data as an XML dataset. (I am not sure of DBUnit's support for modern-day ORMs such as Hibernate.)
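Approach 1 can be sketched in plain Java. The EmployeeTable map below is a stand-in for the real EMPLOYEE table; in a real test, setUp() and tearDown() would execute the insert/delete scripts through JDBC. The class and method names here are illustrative, not from the article:

```java
import java.util.HashMap;
import java.util.Map;

// In-memory stand-in for the EMPLOYEE database table.
class EmployeeTable {
    static final Map<Integer, String> ROWS = new HashMap<>();
}

class EmpFinderTest {
    // setUp(): create exactly the data this test depends on.
    void setUp() {
        EmployeeTable.ROWS.put(1, "JOHN");
    }

    // tearDown(): remove it, so no state leaks into other tests.
    void tearDown() {
        EmployeeTable.ROWS.remove(1);
    }

    // The test no longer assumes the environment already contains the row.
    boolean testEmpFinder() {
        String expected = "JOHN";
        String actual = EmployeeTable.ROWS.get(1); // stands in for EmpFinder.find(1)
        return expected.equals(actual);
    }
}
```

The point is the lifecycle, not the map: the test creates its own fixture and cleans it up, so it passes in any environment.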

Conclusion:

Unit tests should match and sustain the lifetime of the source; after all, they are a guarantee card for your source code. We just looked at the database dependency as an example; what about dependencies such as JMS, content repositories, LDAP, etc.? Writing a test case with all its data and environmental dependencies mocked and externalized is an art.

Published at DZone with permission of its author, Senthil Balakrishnan.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Mladen Girazovski replied on Thu, 2010/05/27 - 2:56am

DBUnit is agnostic of the ORM, since DBUnit is only concerned with the DB, and yes, it would be good to use it in this case. Have a look at this article:

http://xunitpatterns.com/Back%20Door%20Manipulation.html

So, I disagree with your options 1 & 2, since they are both the same ;)

Senthil Balakrishnan replied on Thu, 2010/05/27 - 7:01am in response to: Mladen Girazovski

I am thinking of writing a DBUnit wrapper for an ORM (JPA) to use the underlying store/fetch mechanism of DBUnit. Setting up data for every test is not a feasible option for a project with a 15+ team size; rather, it's easier to import the current development data and update it on every release, if the ORM wrapper works :)...

Thanks for the xunitpatterns link!!! I will give it a detailed reading :)

Jonathan Fullam replied on Thu, 2010/05/27 - 7:08am

I've been using Mockito to mock my fixture's dependencies and it's been working out great. It's very easy to learn and use and provides very readable test cases.
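Mockito stubs a dependency in one line (e.g. when(repo.findNameById(1)).thenReturn("JOHN")). For illustration, here is a hand-rolled equivalent in plain Java, assuming EmpFinder is refactored to receive its data source as a dependency; EmployeeRepository and findNameById are made-up names, not from the article:

```java
// The dependency the article's EmpFinder would otherwise reach through
// the database for; a test can swap in any implementation.
interface EmployeeRepository {
    String findNameById(int id);
}

// EmpFinder refactored to depend on the interface instead of the DB.
class EmpFinder {
    private final EmployeeRepository repo;

    EmpFinder(EmployeeRepository repo) {
        this.repo = repo;
    }

    String findName(int id) {
        return repo.findNameById(id);
    }
}
```

In a test, a lambda (or a Mockito mock) supplies the data, so no database or dataset script is needed at all.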

Senthil Balakrishnan replied on Thu, 2010/05/27 - 8:26am in response to: Jonathan Fullam

"Mockito doesn't give you hangover" - sounds cool, let me try it!!!

Mladen Girazovski replied on Thu, 2010/05/27 - 9:13am in response to: Senthil Balakrishnan

Setting up data for every test is not a feasible option for a project with a 15+ team size

It has nothing to do with the team size. It is an integration test, which is slow by nature and should not be run by each developer after each change.

If you're not setting up the test data for each test, you'll end up with test-data leakage and therefore with interacting tests, which in turn leads to fragile tests that are no longer deterministic. That's really bad.

Walter Bogaardt replied on Fri, 2010/05/28 - 12:43am

Yes, it's true developers may do this. But is a unit test that exercises interactions with a database even a unit test? You're now going outside the method boundary and hitting another application, in this case a database. This is more of an integration test. DBUnit is a better fit for this problem, but one area that becomes a bigger issue is that with too many DBUnit tests your unit tests end up running longer and longer for each build.

One reason I like using TestNG is that you can mark the areas you'd like to test in various group configurations: unit, integration, system, broken, and so forth. This way your continuous integration builds can be customized to either run the full suite of tests or run subsets.

Test-driven development is great, but it has to be flexible and realistic.

Raveman Ravemanus replied on Fri, 2010/05/28 - 3:45am

I have used DbUnit in a commercial app and it's not as easy as you guys think. For a simple select with 4 joins you need to create at least 4 inserts. It's a lot of work and it's very easy to mess it up.

Mladen Girazovski replied on Fri, 2010/05/28 - 4:24am in response to: Raveman Ravemanus

I have used DbUnit in a commercial app and it's not as easy as you guys think. For a simple select with 4 joins you need to create at least 4 inserts. It's a lot of work and it's very easy to mess it up.

That's weird; I have used it in at least 4 commercial apps, and I never had to generate inserts. All I had to do was create a dataset and use it (INSERT, DELETE, CLEAN_INSERT, etc.).

It is not an easy task to create integration tests for the persistence layer itself; after all, you want to test your DAOs/repositories, which involves a couple of technologies.
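For reference, a DBUnit flat XML dataset for the article's EMPLOYEE table might look like the fragment below (the file name and column names are assumptions); DBUnit reads one row per element, with the element name matching the table and attributes matching columns:

```xml
<!-- employee-dataset.xml -->
<dataset>
    <EMPLOYEE EMPLOYEE_ID="1" NAME="JOHN"/>
</dataset>
```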

 

Andrea Del Bene replied on Fri, 2010/05/28 - 5:17am

If you use Spring's test classes and an ORM (Hibernate, for example), you have a third, easier solution. With Spring you can use AbstractTransactionalDataSourceSpringContextTests as the test case base class and define your test data instances in a context file.

Then you can save your test data before running your tests inside a common database transaction that is rolled back by default at the end of test execution. In this scenario Spring itself ensures that tests don't modify your data in any way.
Obviously you must declare a transaction manager inside the Spring context :)
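The rollback idea behind Spring's transactional tests can be sketched with a toy in-memory store; RollbackStore below is a made-up illustration of the mechanism, not a Spring API:

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of transaction rollback: snapshot the data when the
// "transaction" begins, restore it when the test finishes.
class RollbackStore {
    private final Map<Integer, String> rows = new HashMap<>();
    private Map<Integer, String> snapshot;

    // Begin a "transaction" by remembering the current state.
    void begin() {
        snapshot = new HashMap<>(rows);
    }

    void put(int id, String name) {
        rows.put(id, name);
    }

    // Roll back: discard everything written since begin().
    void rollback() {
        rows.clear();
        rows.putAll(snapshot);
    }

    int size() {
        return rows.size();
    }
}
```

Spring does the equivalent with a real database transaction: the test writes whatever it needs, and the automatic rollback guarantees the shared data is untouched afterwards.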

Josh Marotti replied on Fri, 2010/05/28 - 9:31am in response to: Mladen Girazovski

Mladen makes the point I want to make. The reason your code is a bad unit test is that it isn't a unit test; it is an integration test. You should mock out all dependencies except for the specific piece of code you are testing.

Alexander Ashitkin replied on Sat, 2010/05/29 - 2:57am

I disagree with the author. Creating data scripts for every test leads to a huge amount of duplicated data because of SQL constraints. In my practice they very soon become unmanageable, and when it becomes necessary to add a column or some data to the scripts for 50 DAOs, you'll be very glad: it will all have to be done by hand. Even a single data script for all the data is not convenient: you need a clean-up script and you have to wrap all the tests in a suite, so a single test can't be run on its own anyway, although this solution is much better than the first one. So my suggestion: create a test schema, document the data in that schema, and make a test assembly of the application deployed on top of it. With that, the excerpt at the beginning of the article is perfect for me. It would solve most of my problems; in such an environment it would be much easier to create and maintain tests.
