Fabrizio Giudici is a Senior Java Architect with long experience of Java in the industrial field. He runs Tidalwave, his own consultancy company, and has contributed to Java success stories in a number of fields, including Formula One. Fabrizio often appears as a speaker at international Java conferences such as JavaOne and Devoxx, and is a member of JUG Milano and the NetBeans Dream Team. Fabrizio is a DZone MVB.

Detecting Multiple Errors in a Single Test with JUnit

08.29.2009

You can say "thanks!" to Rules. Rules are objects with a specific preset behaviour, annotated with @Rule. For instance, ErrorCollector is a container for the multiple errors that can occur in a single test. It must be declared as a field of the test class, as follows:

public class MyTest
{
    @Rule
    public final ErrorCollector errors = new ErrorCollector();

    @Nonnull
    private final ExpectedResults expectedResults;

    // test follows as usual
    @Test
    public void testImage() { ... }
}

Now you can accumulate errors with code such as the following:

for (int t = 0; t < thumbnailCount; t++)
  {
    try
      {
        final ExpectedResults.Image expectedThumbnail = expectedResults.getThumbnail(t);
        final Dimension size = expectedThumbnail.getSize();
        assertLoadThumbnail(ir, t, size.width, size.height);
      }
    catch (Throwable e)
      {
        errors.addError(e);
      }
  }

...that is, you wrap each assertion in a try/catch for Throwable and pass whatever is caught to ErrorCollector.addError(). You could also construct and add a Throwable directly, guarded by an if, but it's much more convenient to stay with the Assert framework, so you enjoy its self-describing error messages.

If the ErrorCollector is not empty at the end of the test, JUnit considers the test failed and reports details about *all* the errors that occurred, as you can see in the following excerpt from a Maven Surefire report. Note that you could do something similar with standard stuff, accumulating exceptions in a Collection and asserting at the end that it is empty - I've tried that - but it would be harder to get all the exception dumps correctly into the report.
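For comparison, the hand-rolled alternative just mentioned can be sketched as follows. This is a minimal, dependency-free illustration: the class and method names are made up, and a simulated failure on odd indices stands in for real assertions.

```java
import java.util.ArrayList;
import java.util.List;

public class ErrorCollectingExample
{
    // Stand-in for one assertion step; fails on odd indices to simulate broken thumbnails.
    static void checkThumbnail (final int index)
      {
        if (index % 2 == 1)
          {
            throw new AssertionError("thumbnail " + index + " mismatch");
          }
      }

    // Run all the checks, collecting failures instead of stopping at the first one.
    static List<Throwable> collectErrors (final int thumbnailCount)
      {
        final List<Throwable> errors = new ArrayList<Throwable>();

        for (int t = 0; t < thumbnailCount; t++)
          {
            try
              {
                checkThumbnail(t);
              }
            catch (Throwable e)
              {
                errors.add(e);
              }
          }

        return errors;
      }

    public static void main (final String[] args)
      {
        // With 4 thumbnails, indices 1 and 3 fail, so two errors are collected.
        final List<Throwable> errors = collectErrors(4);
        System.out.println(errors.size() + " error(s) collected");
      }
}
```

The drawback is visible here: a single final assertion on the list surfaces only one summary message in the report, while ErrorCollector dumps a full stack trace per failure.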

-------------------------------------------------------------------------------
Test set: it.tidalwave.imageio.raf.RAFImageReaderImageTest
-------------------------------------------------------------------------------
Tests run: 16, Failures: 15, Errors: 0, Skipped: 0, Time elapsed: 60.184 sec <<< FAILURE!
testImage[[https://imaging.dev.java.net/nonav/TestSets/peterbecker/Fujifilm/FinePixS9500/RAF/DSCF3756.RAF]](it.tidalwave.imageio.raf.RAFImageReaderImageTest) Time elapsed: 34.917 sec <<< FAILURE!
org.junit.ComparisonFailure: expected:<[8c256e68fe9897a4fac12a06f1a07fb4]> but was:<[f53e19fbd1f512ddf052e13097735383]>
at org.junit.Assert.assertEquals(Assert.java:123)
at org.junit.Assert.assertEquals(Assert.java:145)
at it.tidalwave.imageio.ImageReaderTestSupport.assertRaster(ImageReaderTestSupport.java:282)
at it.tidalwave.imageio.NewImageReaderTestSupport.testImage(NewImageReaderTestSupport.java:136)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.FailOnTimeout$1.run(FailOnTimeout.java:28)

testImage[[https://imaging.dev.java.net/nonav/TestSets/peterbecker/Fujifilm/FinePixS9500/RAF/DSCF3756.RAF]](it.tidalwave.imageio.raf.RAFImageReaderImageTest) Time elapsed: 34.919 sec <<< FAILURE!
java.lang.AssertionError: metadata.fujiRawData.table1Offset expected:<650246> but was:<611914>
at org.junit.Assert.fail(Assert.java:91)
at org.junit.Assert.failNotEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:126)
at it.tidalwave.imageio.NewImageReaderTestSupport.testImage(NewImageReaderTestSupport.java:231)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.FailOnTimeout$1.run(FailOnTimeout.java:28)

testImage[[https://imaging.dev.java.net/nonav/TestSets/peterbecker/Fujifilm/FinePixS9500/RAF/DSCF3756.RAF]](it.tidalwave.imageio.raf.RAFImageReaderImageTest) Time elapsed: 34.919 sec <<< FAILURE!
java.lang.AssertionError: metadata.fujiRawData.fujiTable1.fujiLayout expected:<false> but was:<true>
at org.junit.Assert.fail(Assert.java:91)
at org.junit.Assert.failNotEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:126)
at it.tidalwave.imageio.NewImageReaderTestSupport.testImage(NewImageReaderTestSupport.java:231)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.FailOnTimeout$1.run(FailOnTimeout.java:28)

testImage[[https://imaging.dev.java.net/nonav/TestSets/peterbecker/Fujifilm/FinePixS9500/RAF/DSCF3756.RAF]](it.tidalwave.imageio.raf.RAFImageReaderImageTest) Time elapsed: 34.931 sec <<< FAILURE!
metadata.fujiRawData.fujiTable1.coefficients: arrays first differed at element [0]; expected:<336> but was:<320>
at org.junit.internal.ComparisonCriteria.arrayEquals(ComparisonCriteria.java:54)
at org.junit.Assert.internalArrayEquals(Assert.java:414)
at org.junit.Assert.assertArrayEquals(Assert.java:260)
at it.tidalwave.imageio.NewImageReaderTestSupport.testImage(NewImageReaderTestSupport.java:203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.FailOnTimeout$1.run(FailOnTimeout.java:28)

 

Published at DZone with permission of Fabrizio Giudici, author and DZone MVB.



Comments

Eugen Paraschiv replied on Sun, 2009/08/30 - 10:15am

Cool solution, but isn't this kind of an antipattern for JUnit and unit testing in general? Shouldn't each of these assertions be in its own method? That way you would see each individual failure, not worry about one call affecting the state of the object, etc. Just a thought.

 

Alessandro Santini replied on Sun, 2009/08/30 - 12:24pm in response to: Eugen Paraschiv

Each single test should reset the whole test scenario, and there are cases in which this is particularly expensive (e.g., performing a very expensive calculation, logging in to an application, etc.).

As an example, the Selenium test framework uses a similar approach, while still letting the tester choose which style suits best: the assert* methods stop the execution of the test method whenever an assertion fails; the verify* methods let the test method run to the end, reporting the test as failed if at least one verification failed.

Ciao

 Alessandro

Fabrizio Giudici replied on Sun, 2009/08/30 - 3:31pm in response to: Eugen Paraschiv

Isn't this kind of an antipattern ... unit testing in general?

Right: in fact this example is a functional test.

Aaron Digulla replied on Tue, 2009/09/01 - 10:11am

In my own tests, I usually concatenate all results into a big, multi-line string and then compare that against the expected result (see this blog entry). This allows me to check several values at once, and the diff tool of the IDE helps me figure out what went wrong.
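Aaron's technique might look like this minimal sketch (the dumped field names and values are made up for illustration, not taken from his blog entry):

```java
public class MultiLineComparisonExample
{
    // Render every value under test into one multi-line dump.
    static String dump (final int width, final int height, final String format)
      {
        return "width: " + width + "\n"
             + "height: " + height + "\n"
             + "format: " + format + "\n";
      }

    public static void main (final String[] args)
      {
        final String expected = "width: 3456\nheight: 2304\nformat: RAF\n";
        final String actual = dump(3456, 2304, "RAF");

        // One assertion covers all the values; on failure, the IDE's diff view
        // shows every differing line at once.
        if (!expected.equals(actual))
          {
            throw new AssertionError("expected:\n" + expected + "but was:\n" + actual);
          }

        System.out.println("dumps match");
      }
}
```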

Fabrizio Giudici replied on Sun, 2009/09/06 - 9:14am

Another possibility, which I'm using in a different project to assert that a bunch of data is correctly imported into a database, is an extension of Aaron's idea: instead of concatenating a string, I create a dump in a text file, and then compare the file with a reference by means of diff.
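A minimal sketch of this idea, with made-up record contents and an in-process line comparison standing in for an external diff tool (in a real test, the reference file would be a previously blessed dump kept under version control):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;
import java.util.List;

public class DumpComparisonExample
{
    // Write a dump of the imported records to a temporary text file.
    static Path writeDump (final List<String> records) throws IOException
      {
        final Path file = Files.createTempFile("dump", ".txt");
        file.toFile().deleteOnExit();
        return Files.write(file, records);
      }

    public static void main (final String[] args) throws IOException
      {
        final List<String> records = Arrays.asList("id=1 name=foo", "id=2 name=bar");

        // Here both files are written on the fly, so they necessarily match.
        final Path reference = writeDump(records);
        final Path actual = writeDump(records);

        // Comparing the files line by line plays the role of diff.
        if (!Files.readAllLines(reference).equals(Files.readAllLines(actual)))
          {
            throw new AssertionError("dump differs from reference");
          }

        System.out.println("dump matches reference");
      }
}
```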

Thomas Mauch replied on Tue, 2009/12/15 - 7:24pm

Maybe you are interested in reading an article I wrote about testing which introduces a new visual approach to testing and which covers quite a few points mentioned in this thread.

Have a look at

http://magicwerk.blogspot.com/2009/12/magictest-automated-visual-approach-for.html

 

 
