Measurement

A common task for SQA teams is to measure things, and then to report the results. But in the course of taking a measurement, sometimes you need more than just your standard tools: sometimes you need a very specific tool to first capture the necessary data, then compare the actual data with the expected results. This topic – how do you know what you know? – came up recently. I was working as a consultant at a scientific instrument company, Labsphere, which makes sophisticated hardware and software for measuring light. Light is not easily measured, at least in my experience so far. A developer and I were looking at some light measurement results in Labsphere's software application, Illumia Pro, and we got onto the topic of data validity: how do we know the measurements made reflect reality?

This was a problem I encountered when I worked at Visioneer, a scanner and software company in northern California. In this case the engineering team wanted to measure not light, but color fidelity.

The problem

At Visioneer we were building the first color sheet-fed scanner, later to be called the Strobe. Visioneer had previously released a black-and-white sheet-fed scanner, but this new one was a color scanner. A key part of the process was evaluating the quality of the color captured by the contact image sensor (CIS), the critical component in the scanner. Several companies submitted their color CIS modules for evaluation, among them Toshiba and Dyna.

Engineering had two tasks: (1) select a vendor whose CIS delivered the truest color fidelity, and (2) using color fidelity as the primary benchmark, evaluate the Strobe scanner against other popular, comparable scanners.

The team

The people involved in the evaluations were:

Rich Pasco – a senior software developer in the hardware group; Rich had a strong background in image processing algorithms.
Ron van Os – a senior software developer in charge of the Strobe driver.
Me – senior SQA Manager for the Windows products.

The plan

Our plan for testing each CIS was to use an IT8 kit as our baseline for color fidelity. The kit, from Fuji, consisted of a chart (or target) that could be scanned in. The scanned data was then processed and compared to a set of reference digital values, known as the LAB file. The LAB file was included in the IT8 kit from Fuji.

Fuji IT8 chart used to test color values.
Full IT8 kit from Fuji.

The process for testing each CIS was:

1.  Scan in the IT8 chart using a vendor-supplied CIS. We scanned at 100 dpi.
2.  Save the scanned image as a TIFF.
3.  Run the TIFF image file through a proprietary program written by Rich (a minimal sketch of the core comparison follows this list). The outputs included:

  • an RMS value (root mean square), which we used as an overall score for the fidelity of the device.
  • a scatter plot for each of the three primary colors. The x-axis represented the Fuji-supplied LAB data, and the y-axis the values from the scanned image.

4.  Chart the results, comparing the digital output from the CIS with the expected values supplied by Fuji.
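
Rich's program was proprietary, so I can't reproduce it here, but a minimal sketch of the core calculation – the RMS deviation between scanned patch values and the vendor-supplied reference values – might look like the Python below. The patch values and names are hypothetical, purely for illustration.

    import math

    def rms_error(scanned, reference):
        """Root-mean-square deviation between scanned patch values
        and the reference values shipped with the IT8 kit."""
        assert len(scanned) == len(reference)
        total = sum((s - r) ** 2 for s, r in zip(scanned, reference))
        return math.sqrt(total / len(scanned))

    # Hypothetical per-patch values for one channel (0-255 scale):
    # one list read from the scanned TIFF, the other from the reference file.
    scanned_red   = [201.3, 188.7, 96.2, 44.5]
    reference_red = [204.0, 190.0, 92.0, 41.0]

    print(f"red channel RMS: {rms_error(scanned_red, reference_red):.2f}")

The lower the score, the closer the scanned values stayed to the reference values across all the patches.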

The goal was to ensure the CIS degraded the color fidelity of what was scanned as little as possible (a perfect score was impossible, of course). After the IT8 chart was scanned and processed, we wanted to know how true the colors were. For example, how closely did the scanned orange (cell C8 on the IT8 chart) match the expected digital value supplied by Fuji? Ideally there was little deviation from the LAB values, so the lower the RMS score, the better.
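
The scatter plots made that deviation visible at a glance: with the reference value on the x-axis and the scanned value on the y-axis, a perfect sensor would land every patch on the diagonal. Here is a minimal matplotlib sketch of such a plot, again with hypothetical patch values:

    import matplotlib.pyplot as plt

    # Hypothetical per-patch values for one channel:
    # reference values on x, scanned values on y.
    reference = [204.0, 190.0, 92.0, 41.0]
    scanned = [201.3, 188.7, 96.2, 44.5]

    plt.scatter(reference, scanned, label="IT8 patches")
    plt.plot([0, 255], [0, 255], linestyle="--", label="perfect fidelity")
    plt.xlabel("reference value (Fuji LAB data)")
    plt.ylabel("scanned value")
    plt.legend()
    plt.show()

Points drifting away from the dashed line show exactly which patches – and which colors – a given sensor reproduced poorly.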

This process was used not only to choose a CIS vendor, but also to compare the color Strobe scanner to competitive products made by Hewlett-Packard (HP) and Logitech.

Is red really red – the IT8 tests

Below are some of the records and documents from our testing. As always, keeping detailed records of everything was essential.

Other tests and results

At the request of the VP of Engineering we also ran a series of user evaluations: scanning images with the Strobe scanners, then asking users to comment on and rate the results, both on screen and printed out. While these tests were more anecdotal and subjective than those done using the IT8 chart, the user tests did provide a second set of evaluations of the selected CISes, and of how the Strobe compared with other scanners' results.

Results

Because our methods for determining the quality of the contact image sensors were objective – programmatically comparing the image data from the Strobe scanner against the expected values in the Fuji LAB file – we were able to select the best CIS vendor. Visioneer selected Toshiba to supply the contact image sensors for the Strobe scanner; our testing indicated their sensors were consistently better than those supplied by other vendors.

The Strobe scanner itself was a huge success once it was released, winning many industry and consumer awards.
