Fabrizio Giudici is a Senior Java Architect with long experience applying Java in the industrial field. He runs Tidalwave, his own consultancy company, and has contributed to Java success stories in a number of fields, including Formula One. Fabrizio often appears as a speaker at international Java conferences such as JavaOne and Devoxx, and is a member of JUG Milano and the NetBeans Dream Team. Fabrizio is a DZone MVB.

Detecting Multiple Errors in a Single Test with JUnit

08.29.2009
In my previous post, I described some useful features of the latest JUnit, together with the way I'm using it with the test suite of jrawio.

Now it's the turn of another small but useful feature. Recalling my test examples, here is one with more detail (it tests some metadata values):

package it.tidalwave.imageio.raf;

import javax.annotation.Nonnull;
import java.util.Collection;
import it.tidalwave.imageio.ExpectedResults;
import it.tidalwave.imageio.NewImageReaderTestSupport;
import org.junit.runners.Parameterized.Parameters;

public class RAFImageReaderImageTest extends NewImageReaderTestSupport
{
    public RAFImageReaderImageTest (final @Nonnull ExpectedResults expectedResults)
    {
        super(expectedResults);
    }

    @Nonnull
    @Parameters
    public static Collection<Object[]> expectedResults()
    {
        return fixed
          (
            // S9500
            ExpectedResults.create("https://imaging.dev.java.net/nonav/TestSets/peterbecker/Fujifilm/FinePixS9500/RAF/DSCF3756.RAF").
                image(4292, 4291, 3, 16, "f53e19fbd1f512ddf052e13097735383").
                thumbnail(160, 120).
                thumbnail(1600, 1200).
                issues("JRW-252").
                metadata("metadata.fujiRawData.header", "FUJIFILMCCD-RAW 0201FF393101FinePix S9500 \u0000\u0000\u0000\u0000\u0000").
                metadata("metadata.fujiRawData.version", "\u0000\u0000\u0000\u0000").
                metadata("metadata.fujiRawData.b1", new byte[]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 48, 50, 54, 57}).
                metadata("metadata.fujiRawData.b2", new byte[]{0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0}).
                metadata("metadata.fujiRawData.JPEGImageOffset", 148).
                metadata("metadata.fujiRawData.JPEGImageLength", 611759).
                metadata("metadata.fujiRawData.table1Offset", 611914).
                metadata("metadata.fujiRawData.table1Length", 3958).
                metadata("metadata.fujiRawData.CFAOffset", 615872).
                metadata("metadata.fujiRawData.CFALength", 18528512).
                metadata("metadata.fujiRawData.unused1", 0).
                metadata("metadata.fujiRawData.unused2", 18528512).
                metadata("metadata.fujiRawData.unused3", 0).
                metadata("metadata.fujiRawData.unused4", 0).
                metadata("metadata.fujiRawData.unused5", 0).
                metadata("metadata.fujiRawData.unused6", 0).
                metadata("metadata.fujiRawData.unused7", 0).
                metadata("metadata.fujiRawData.unused8", 0).
                metadata("metadata.fujiRawData.unused9", 0).
                metadata("metadata.fujiRawData.unused10", 0).
                metadata("metadata.fujiRawData.fujiTable1.width", 2448).
                metadata("metadata.fujiRawData.fujiTable1.height", 3688).
                metadata("metadata.fujiRawData.fujiTable1.rawWidth", 2512).
                metadata("metadata.fujiRawData.fujiTable1.rawHeight", 3688).
                metadata("metadata.fujiRawData.fujiTable1.fujiLayout", true).
                metadata("metadata.fujiRawData.fujiTable1.coefficients", new short[]{320, 502, 320, 422})
          );
    }
}

...you can see that there are a lot of things that can fail: the image size, the raster, the thumbnail sizes, any metadata item; and there can be multiple failures at the same time. But with the usual style of handling test errors (that is, plain assertions) you only see the first one. You fix it, just to discover the next, and so on. For instance, if the image size differs from the expected one, the whole section of metadata assertions won't be executed.

In my experience with jrawio, many errors with a new sample file could be fixed at the same time; wouldn't it be much better if you could see *all* the failures immediately?
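JUnit 4.7 ships a built-in facility for exactly this pattern: the org.junit.rules.ErrorCollector rule, whose checkThat() records a failure and lets the test keep running. As a dependency-free illustration of the same idea (this is a minimal sketch with invented names, not jrawio's actual code), a collector can accumulate assertion failures and report them all at the end:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a "soft assertion" collector: each check records its
// failure instead of aborting, and verifyAll() reports everything at once.
// This mirrors the idea behind JUnit 4.7's ErrorCollector rule; the names
// here are illustrative only.
class FailureCollector {
    private final List<String> failures = new ArrayList<String>();

    public void checkEquals(String what, Object expected, Object actual) {
        if (expected == null ? actual != null : !expected.equals(actual)) {
            failures.add(what + ": expected <" + expected + "> but was <" + actual + ">");
        }
    }

    // Throws a single AssertionError listing every recorded failure.
    public void verifyAll() {
        if (!failures.isEmpty()) {
            throw new AssertionError(failures.size() + " failure(s):\n  "
                    + String.join("\n  ", failures));
        }
    }

    public static void main(String[] args) {
        FailureCollector collector = new FailureCollector();
        collector.checkEquals("image width", 4292, 4291);   // fails, but we keep going
        collector.checkEquals("image height", 4291, 4291);  // passes
        collector.checkEquals("bands", 3, 4);               // fails too
        try {
            collector.verifyAll();
        } catch (AssertionError e) {
            System.out.println(e.getMessage()); // both failures reported together
        }
    }
}
```

With plain assertEquals calls, only the width mismatch would surface; here the bands mismatch is reported in the same run.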

Published at DZone with permission of Fabrizio Giudici, author and DZone MVB.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Eugen Paraschiv replied on Sun, 2009/08/30 - 10:15am

Cool solution, but isn't this kind of an antipattern for JUnit and unit testing in general? Shouldn't each of these assertions be in its own method? That way you would see each individual failure, not worry about one call affecting the state of the object, etc. Just a thought.

 

Alessandro Santini replied on Sun, 2009/08/30 - 12:24pm in response to: Eugen Paraschiv

Each single test should reset the whole test scenario. There are cases in which this is particularly expensive (e.g., performing a very expensive calculation, logging in to an application, etc.).

As an example, the Selenium test framework uses a similar approach, while still letting the tester choose which style suits best: the assert* methods stop the execution of the test method whenever an assertion fails; the verify* methods let the test method run to the end, reporting the test as failed if at least one verification failed.

Ciao

 Alessandro

Fabrizio Giudici replied on Sun, 2009/08/30 - 3:31pm in response to: Eugen Paraschiv

Isn't this kind of an antipattern ... unit testing in general?

Right: in fact this example is a functional test.

Aaron Digulla replied on Tue, 2009/09/01 - 10:11am

In my own tests, I usually concatenate all results into a big, multi-line string and then compare that against the expected result (see this blog entry). This allows me to check several values at once, and the diff tool of the IDE helps me figure out what went wrong.
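This pattern needs no framework support at all: build one newline-separated string of actual values and compare it to an expected block with a single assertion, so one failing comparison carries every mismatch. A minimal sketch (the field names are invented for illustration):

```java
// Sketch of the "compare one big multi-line string" approach: every value
// goes into a single string, so one comparison surfaces all differences and
// an IDE diff viewer can show them side by side. Field names are invented.
class StringDumpCheck {
    static String dump(int width, int height, int bands) {
        return "width=" + width + "\n"
             + "height=" + height + "\n"
             + "bands=" + bands + "\n";
    }

    public static void main(String[] args) {
        String expected = dump(4292, 4291, 3);
        String actual = dump(4291, 4291, 4); // two values wrong
        if (!expected.equals(actual)) {
            // A test framework's assertEquals(expected, actual) would fail here;
            // the IDE then offers a side-by-side diff of the two blocks.
            System.out.println("Mismatch:\nexpected:\n" + expected + "actual:\n" + actual);
        }
    }
}
```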

Fabrizio Giudici replied on Sun, 2009/09/06 - 9:14am

Another possibility, which I'm using in a different project to assert that a bunch of data gets correctly imported into a database, is an extension of Aaron's idea: instead of concatenating a string, I create a dump into a text file, and then compare the file with a reference by means of diff.
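A minimal sketch of that variant: the test writes its dump to a file, a previously reviewed copy serves as the reference, and a line-by-line comparison reports every difference at once. This uses java.nio.file (available since Java 7), and the file contents below are made up:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Sketch of the dump-and-diff approach: write the test's output to a text
// file and compare it against a reviewed reference file, reporting every
// differing line (not just the first). File contents are illustrative.
class DumpDiff {
    // Returns a human-readable report of line-level differences, empty if equal.
    static String diff(List<String> reference, List<String> actual) {
        StringBuilder report = new StringBuilder();
        int max = Math.max(reference.size(), actual.size());
        for (int i = 0; i < max; i++) {
            String ref = i < reference.size() ? reference.get(i) : "<missing>";
            String act = i < actual.size() ? actual.get(i) : "<missing>";
            if (!ref.equals(act)) {
                report.append("line ").append(i + 1)
                      .append(": expected '").append(ref)
                      .append("' but was '").append(act).append("'\n");
            }
        }
        return report.toString();
    }

    public static void main(String[] args) throws IOException {
        Path actualFile = Files.createTempFile("dump", ".txt");
        Files.write(actualFile, "width=4291\nheight=4291\n".getBytes(StandardCharsets.UTF_8));
        // In a real test the reference would be a checked-in file.
        List<String> reference = List.of("width=4292", "height=4291");
        List<String> actual = Files.readAllLines(actualFile, StandardCharsets.UTF_8);
        System.out.print(diff(reference, actual)); // one line differs
    }
}
```

In practice an external diff tool (or the IDE's) can replace the hand-rolled comparison; the key point is that the whole dump is compared in one step.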

Thomas Mauch replied on Tue, 2009/12/15 - 7:24pm

Maybe you are interested in reading an article I wrote about testing, which introduces a new visual approach and covers quite a few of the points mentioned in this thread.

Have a look at

http://magicwerk.blogspot.com/2009/12/magictest-automated-visual-approach-for.html

 

 
