The Role of Uncertainty of Measurement in CT
March 22, 2021 | Dirk Steiner
One might think a measurement using the same calibrated tool would have the same accuracy regardless of the object being measured. But metrology is more than that, as the whitepaper “Uncertainty of measurement” will show.
In a CT system, X-ray photons emitted from an X-ray tube travel through the object and hit a detector, which senses the dose at each pixel of the detector area. This creates digital images while the object is rotated. The reconstruction software calculates virtual slices from the dose measured at each location from different angles. The result is called a tomogram. Each volume element of the tomogram, a voxel, represents the local density of the material.
Figure 1: CT process
In order to measure features, it is necessary to give the object a surface. This is typically done by looking at an area of material and an area of air: the gray value in the middle between the two is used as the threshold (ISO 50 method) to set points on the surface.
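The ISO 50 rule described above can be sketched in a few lines. This is an illustrative snippet, not code from any CT software; the function name and gray values are assumptions.

```python
# Sketch of the ISO 50 threshold rule: the surface threshold is the
# gray value midway between the air peak and the material peak of the
# gray-value histogram. Names and values here are illustrative only.

def iso50_threshold(air_gray: float, material_gray: float) -> float:
    """Return the gray value halfway between air and material."""
    return (air_gray + material_gray) / 2.0

# Example: air peak at gray value 10, material peak at 240
threshold = iso50_threshold(10, 240)
print(threshold)  # 125.0
```

In practice the two peaks are taken from the scan's gray-value histogram rather than chosen by hand.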
Figure 2: Subvoxel accurate surface determination
The leftmost red point lies exactly between the black voxel (gray value 0) and the white one below it (gray value 255). For simplicity, the example shows the point placed only in the vertical direction; in real life this is done in all three directions. The next point to the right sits on a voxel with gray value 127, which places the surface in the middle of that voxel. With gray values closer to black or white, the surface moves accordingly in that direction.
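The subvoxel placement described above amounts to linear interpolation of gray values between neighboring voxel centers. The following is a minimal sketch under that assumption; the function name and the 0–255 gray scale are illustrative.

```python
# Subvoxel surface placement by linear interpolation, assuming the
# gray value changes linearly between two neighboring voxel centers.

def subvoxel_surface(g_first: float, g_second: float,
                     threshold: float = 127.5) -> float:
    """Fractional position (0..1) of the surface between two voxels:
    0 = center of the first voxel, 1 = center of the second."""
    return (threshold - g_first) / (g_second - g_first)

# Example: black voxel (0) next to white voxel (255) -> surface is
# exactly halfway between the two voxel centers
print(subvoxel_surface(0, 255))  # 0.5
```

Gray values closer to one end of the scale push the interpolated position toward that voxel, as in the figure.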
Under ideal conditions, the accuracy of the surface would theoretically be 1/256 of the voxel size, assuming the material's coverage of a pixel creates a signal (gray value) change proportional to the area covered. In reality, scan data are never perfect due to physical effects such as scattered radiation, pixel noise and approximations in the reconstruction algorithms. A good rule of thumb is that the surface accuracy of a decent CT scan is about 1/10 of the voxel size. There are many unknowns that make it difficult to state a hard number. Fortunately, there are ways to find out what the uncertainty of measurement is, and there are standards that can be used.
The primary standard for this is the “GUM,” JCGM 100:2008 Evaluation of measurement data — Guide to the expression of uncertainty in measurement. It is a general guide and can be applied to any measurement and unit. It is a somewhat complex mathematical standard; however, there are more hands-on standards that have the GUM built into them.
To understand how the measurement results are distributed, the measurement process is modeled. In most cases, the measurements can be assumed to follow a bell curve (a normal distribution) and cluster around the mean value.
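Modeling the results as a bell curve boils down to estimating the mean and spread of repeated readings. A minimal sketch with Python's standard library follows; the readings are hypothetical.

```python
import statistics

# Hypothetical repeated measurements of a diameter, in mm
readings = [10.012, 10.008, 10.015, 10.010, 10.009, 10.013, 10.011, 10.007]

mean = statistics.mean(readings)   # center of the bell curve
s = statistics.stdev(readings)     # sample standard deviation (spread)
print(f"mean = {mean:.4f} mm, s = {s:.4f} mm")
```

The mean estimates the value the readings huddle around; the standard deviation quantifies how wide the bell curve is.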
GUM provides two evaluation options, called “Type A” and “Type B”. Type B is based on experience and previous data: if a Type A evaluation has already been done a hundred times for similar measurements, judgement may be used to come up with a number.
For a new application, Type A must be used. To do this, one or more objects of the same type are measured repeatedly and the observations are analyzed statistically. The measurement equipment is only one contributor to the uncertainty of measurement; the standard provides a reliable method to arrive at an objective “hard” number for the overall uncertainty of measurement.
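The core of a Type A evaluation per GUM is small: the standard uncertainty of the mean is the sample standard deviation divided by the square root of the number of observations, and an expanded uncertainty is obtained with a coverage factor (commonly k = 2 for roughly 95 % coverage). A sketch with hypothetical readings:

```python
import math
import statistics

# Hypothetical repeated measurements (mm) of one feature on one part
readings = [25.402, 25.398, 25.405, 25.400, 25.399,
            25.403, 25.401, 25.397, 25.404, 25.400]

n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)     # sample standard deviation

u_a = s / math.sqrt(n)             # Type A standard uncertainty of the mean
U = 2 * u_a                        # expanded uncertainty, coverage factor k = 2

print(f"mean = {mean:.4f} mm")
print(f"u_A = {u_a:.5f} mm, U (k=2) = {U:.5f} mm")
```

A full uncertainty budget would combine this with Type B contributions (e.g. scale calibration, temperature) before expansion, which is where the rest of the GUM comes in.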
For more information, download the free white paper.