George Gayton, Rong Su, Richard Leach, and Liam Bradley


CMSC

Uncertainty Evaluation of Fringe Projection Based on the Linear Systems Theory

Groundbreaking project offers new framework

Published: Thursday, August 29, 2019 - 21:22

Fringe projection techniques offer fast, noncontact measurements of the surface form of manufactured parts, and have been successfully implemented in the automotive, aerospace, and medical industries. Recently, advances in fringe projection have reduced the sensitivity of the measurement system to effects such as multiple surface reflections and projector defocus. Typically, the measurement method is altered to optimize the system for specific measurement conditions, without quantifying the effects of influence factors. Furthermore, there is no standardized calibration framework for fringe projection systems, and uncertainty evaluation of surface measurements is rarely carried out in practice, which restricts the use of this technique in the manufacturing industry.

Fringe projection systems detect the intensity of a projected fringe pattern that is reflected from the surface to be measured. Any process that alters the intensity of this pattern received by the camera will change the measurement outcome. Therefore, fringe projection systems typically have many influence factors that affect the measurement outcome, including the surface characteristics (e.g., optical properties and topography), imaging optics (e.g., defocus and aberrations), and external factors (e.g., ambient light intensity level, mechanical vibration, and temperature). The complexity of the measurement model makes current calibration methods given in ISO 15530 (for contact coordinate measuring machines) unsuitable for fringe projection. Additionally, it is unclear how to apply the calibration method in ISO 25178 for areal surface topography measuring instruments. A calibration framework for estimating spatial-frequency-dependent measurement uncertainty built on solid theoretical foundations is required.

To move towards a traceable surface measurement using fringe projection techniques, we are developing a measurement model to accurately predict the captured image and include all major uncertainty contributors. The first step of the model is to describe the optical field distribution within the projection volume of the projector by considering its point spread function in three dimensions. The optical field distribution is sampled at surface locations, using a ray tracing algorithm to map intensity values to corresponding camera pixels. The results are validated by comparing to an experimental fringe projection system with carefully controlled parameters. The intention is to use this simulation within a Monte Carlo framework to create an uncertainty map of the phase image that can be used to estimate the uncertainty at each point-cloud data point. Additionally, the model will give insights into the relationships between influence factors, allowing the implementation of improvements to fringe projection systems.

Introduction

Fringe projection is a three-dimensional optical measurement technique that measures the surface topography and geometrical dimensions of a part, and it is increasing in popularity in the aeronautics, automotive, and medical industries.1,2 Fringe projection uses a camera and a projector to take relatively quick measurements of surfaces, describing them as high-density point clouds. A pattern is projected onto a measurement surface. The camera, offset from the projector, records the image of the projected pattern, which is distorted by the surface geometry. The camera image is decoded to give the correspondence between the camera and projector reference frames, allowing points to be triangulated between the images. Many fringe projection techniques exist, each projecting a different pattern optimized for a specific measurement scenario.3,4
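The decoding step described above can be sketched with a standard four-step phase-shifting decode, one of the many fringe projection techniques in use. The function name and synthetic data below are illustrative, not the authors' implementation:

```python
import numpy as np

def decode_four_step(images):
    """Recover the wrapped phase from four fringe images shifted by
    pi/2 each (standard four-step phase-shifting)."""
    i1, i2, i3, i4 = images
    # Wrapped phase in (-pi, pi]; phase unwrapping is a separate step.
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic check: a known phase ramp and an ideal intensity response.
phase_true = np.linspace(-3.0, 3.0, 256)   # stays within (-pi, pi)
shots = [1.0 + 0.5 * np.cos(phase_true + k * np.pi / 2) for k in range(4)]
phase_est = decode_four_step(shots)
print(np.max(np.abs(phase_est - phase_true)))  # residual at numerical noise level
```

In a real system the decoded phase map, rather than a synthetic ramp, provides the camera-projector correspondence used for triangulation.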

Unlike other coordinate measurement systems, such as contact coordinate measuring machines (CMMs), fringe projection systems have no standardized calibration framework, and this restricts the use of the technique in the manufacturing industry. Fringe projection measurements depend on surface characteristics, e.g., optical properties and topography, making the current calibration methods in parts one to three of ISO 15530 (for contact CMMs)5 unsuitable for fringe projection. Performance verification standards, such as VDI/VDE 2634 or the draft ISO 10360-13,6,7 currently assume Lambertian reflection at the surface. Also, it is unclear how to apply the calibration approach for areal surface topography measuring instruments given in ISO 25178.8 Given an accurate measurement model, an uncertainty evaluation using ISO 15530-4 is promising,9 but it is difficult to create a rigorous model.

Sensitivity to small changes in intensity, a complicated image-decoding process, and the large number of fringe projection configurations make quantifying individual influence factors difficult in fringe projection systems. Fringe projection systems can operate from millimeter-range surface topography measurements10,11 up to scales of meters and above.12 Current methods to evaluate fringe projection uncertainty are limited to geometrical optics, ignoring diffraction and surface effects.13 Fringe projection measurement models generally assume specific surface optical characteristics, yet it is well understood that these characteristics strongly influence the measurement outcome.14–16

Any rigorous measurement model that accurately describes fringe projection must take into account both surface characteristics and diffraction effects. Recently, there has been some progress in using linear systems theory to understand and model optical surface topography instruments.17–19 These instruments tend to work at smaller scales, where the numerical aperture (NA) is large enough that diffraction effects are significant within the working measurement volume. Previously, a 2D instrument transfer function was used to characterize the spatial resolution of a fringe projection system.16 The instrument transfer function was valid under the assumptions that the change in surface height was much smaller than the working distance of the camera, and that the amplitudes of the surface spatial frequency components were smaller than the surface-height linearity limits. In this paper, a measurement model based on geometrical optics and linear systems theory is used to model a fringe projection system, with the aim of later using the model within a Monte Carlo simulation.
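The intended Monte Carlo use of such a model can be sketched as follows: sample the influence factors, push each sample through a forward model, and take the spread of the outputs as the uncertainty. The toy forward model, gain variation, and noise level below are hypothetical placeholders for the real simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(phase, gain, noise_sigma):
    """Toy stand-in for a fringe projection forward model: maps a true
    phase to a measured phase under a camera gain error and additive
    intensity noise (both hypothetical influence factors)."""
    i_cos = gain * np.cos(phase) + rng.normal(0.0, noise_sigma)
    i_sin = gain * np.sin(phase) + rng.normal(0.0, noise_sigma)
    return np.arctan2(i_sin, i_cos)

# Monte Carlo propagation: sample the influence factors, re-run the
# model, and report the spread of the outputs as the phase uncertainty.
true_phase = 0.7
samples = np.array([
    forward_model(true_phase,
                  gain=rng.normal(1.0, 0.02),   # assumed gain variation
                  noise_sigma=0.01)             # assumed intensity noise
    for _ in range(5000)
])
print(samples.mean(), samples.std())
```

In the framework the paper envisages, the forward model would be the full image-formation simulation, and the spread would be evaluated per point-cloud point rather than for a single phase value.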

Theory

The projector in a fringe projection system forms an image of, e.g., a 2D sinusoidal pattern on the surface of an object. For objects that cannot be placed entirely within the depth of focus, it is important to model the entire output intensity distribution. In this work, we attempt to use 3D imaging theory to model the imaging process in a fringe projection system. Under linear systems theory, the intensity distribution Iout(r) at position r in the projection volume can be described as:

      Iout(r) = ∫ Iin(r′) h(r − r′) d³r′       (Equation 1)

where Iin(r) and h(r) are the input intensity distribution and the point spread function, respectively. The Fourier transform of Equation 1 gives:

      Ĩout(k) = Ĩin(k) H(k)       (Equation 2)

where k is the spatial frequency vector. Assuming the projector is a shift-invariant and incoherent system, its optical transfer function (OTF), given by H(k), is similar to that of a microscope.20
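Equations 1 and 2 can be illustrated numerically in one dimension: convolving the input fringe pattern with the PSF in the spatial domain is equivalent to multiplying its spectrum by the OTF. The Gaussian PSF used here is an assumption for illustration, not the projector's actual PSF:

```python
import numpy as np

n = 512
x = np.arange(n)
i_in = 1.0 + 0.5 * np.cos(2 * np.pi * x / 32)   # input fringe pattern
psf = np.exp(-0.5 * ((x - n / 2) / 3.0) ** 2)   # assumed Gaussian PSF
psf /= psf.sum()                                # unit area: mean intensity preserved

otf = np.fft.fft(np.fft.ifftshift(psf))         # Equation 2: the OTF is F{PSF}
i_out = np.real(np.fft.ifft(np.fft.fft(i_in) * otf))

# The blur lowers the fringe contrast but leaves the mean level unchanged.
print(np.ptp(i_in), np.ptp(i_out))
```

This contrast attenuation as a function of spatial frequency is exactly the information the OTF carries, which is why the paper proposes it as a performance metric.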

If the volumetric intensity distribution of the fringe pattern can be predicted using Equation 1 and the location of the surface is given, light reflected from the surface can be calculated. To simplify the surface-scattering problem, a Lambertian surface that scatters uniformly in all angles is assumed. By further assuming that the surface is located within the depth of focus of the camera, the image of the fringe pattern recorded by the camera can be calculated by considering the camera modulation transfer function (MTF) and the perspective of the camera.
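The Lambertian assumption above can be made concrete: the reflected radiance scales with the cosine of the incidence angle and is independent of the viewing direction. The vectors and albedo value below are illustrative:

```python
import numpy as np

def lambertian_radiance(incident_intensity, albedo, normal, light_dir):
    """Radiance of an ideal Lambertian surface: proportional to the
    cosine of the incidence angle, independent of viewing direction."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    cos_i = max(np.dot(n, l), 0.0)   # no contribution from behind the surface
    return albedo * incident_intensity * cos_i / np.pi

# The same radiance is seen from any camera position (illustrative values).
r = lambertian_radiance(1.0, 0.9, np.array([0.0, 0.0, 1.0]),
                        np.array([0.0, 0.0, 1.0]))
print(r)
```

Because the radiance does not depend on the camera direction, the projected intensity at each surface point can be mapped straight to a camera pixel, which is what makes the simplification useful here.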

A geometrical transformation Ξ is applied to the output field distribution, which first expands the distribution uniformly and then compresses or stretches it in the lateral coordinates (x, y) as a function of the axial position z. The transformation Ξ is chosen so that Iout(r) resembles the output of a projection system. The expansion is given by:

r′ = Ξr,                                          (Equation 3)

with Ξ given by:

    (Equation 4)

where M is the global magnification of Iin. For the total intensity of Iout(r) across successive planes perpendicular to the optical axis to remain constant, Iout(r) decreases in proportion to the increase in plane area. With the assumptions that the scene remains fully within the depth of focus of the camera and that scattering is Lambertian, the surface intensity values can be mapped directly to camera pixels to create Icam(u, w). A simple ray-tracing algorithm maps each surface intensity value to the corresponding pixel. The same algorithm is used to establish occlusions in the projector field of view.
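The scaling and intensity behavior described above can be sketched with a pinhole-style model in which lateral coordinates grow linearly with z and the per-area intensity falls as 1/z², so the total power per cross-section stays constant. The linear scaling law and the reference distance z0 are illustrative assumptions, not the exact form of Ξ:

```python
import numpy as np

def expand(xy, z, magnification, z0):
    """Map lateral pattern coordinates into the projection volume at
    depth z: the pattern spreads linearly with distance (assumed law)."""
    return magnification * (z / z0) * xy

def intensity_at(z, i_ref, z0):
    """Per-area intensity falls as 1/z^2, so intensity times the scaled
    cross-section area (which grows as z^2) stays constant."""
    return i_ref * (z0 / z) ** 2

z0 = 1.0
xy = np.array([0.1, 0.2])
for z in (0.5, 1.0, 2.0):
    print(z, expand(xy, z, 1.0, z0), intensity_at(z, 1.0, z0))
```

This is why, in the simulated scenes below, surfaces closer to the projector appear both brighter and laterally compressed.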

Method

A number of images were taken using a simple fringe projection system. Fringes were generated on a Raspberry Pi, projected using an Optoma HD142X projector, and captured using a Nikon D3500 camera in a thermally stable environment. The laboratory wall was used as a flat-plane target, as it is a relatively flat, white, Lambertian surface that fills the projector field of view. A set of 20-mm-diameter optical spheres was also used as a target. The optical spheres are Al2O3 balls with a matte finish, designed to be Lambertian. All images were manually cropped to remove scenery and converted to grayscale.
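The cropping and grayscale conversion mentioned above amount to a couple of array operations. The crop box and the BT.601 luminance weights here are assumptions for illustration; the actual processing pipeline is not specified in the text:

```python
import numpy as np

def preprocess(rgb, top, bottom, left, right):
    """Crop a frame to the region of interest and convert to grayscale
    using ITU-R BT.601 luma weights (an assumed convention)."""
    roi = rgb[top:bottom, left:right, :]
    weights = np.array([0.299, 0.587, 0.114])
    return roi @ weights

frame = np.zeros((100, 120, 3))
frame[..., 1] = 1.0                       # pure green test frame
gray = preprocess(frame, 10, 90, 20, 100)
print(gray.shape, float(gray.max()))      # (80, 80) 0.587
```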

Two scenes were simulated to recreate the features found in the experimental images. For the projector, NA = 0.2, and the projector image contains 800 × 450 pixels. The intensity distribution I(r) sampling resolution is 800 × 450 × 512, and the sample distance for Iin(r) is set to 2 nm in (x, y). The model was run in MATLAB R2018b on a computer with 32 GB of memory and an Intel Xeon W-2123 CPU with a 3.6-GHz clock speed. Each simulation took approximately 30 s.

Results and discussion

Figure 1 shows fringes projected onto a sphere artifact. The fringes appear curved in the camera perspective due to the form of the spheres, and there is some specular reflection on the spheres. Parts of the spheres are occluded from the view of the projector, and a sphere located in the back-right corner is out of focus of the camera. The simulation of a similar scene is shown in figure 2, where a single sphere is simulated with its radius increasing from 0.08 m to 0.12 m and 0.16 m. The curved fringes found in figure 1 are also present in figure 2. The specular reflection is not present because the surface is assumed to be Lambertian. Figure 2 provides evidence that the model is capable of simulating the change in pattern due to surface topography.


Figure 1: Image of a collection of 20-mm-diameter optical spheres placed in the measurement volume of a fringe projection system (a); configuration shown graphically (b)


Figure 2: Model scene of a sphere of increasing radius of 0.08 m, 0.12 m, and 0.16 m for (a), (b), and (c), respectively; configuration is shown graphically in (d). The dashed line represents the optical axis and the dotted line is the projector focal plane. Scale bar in (a–c) is 0.62 m.

Figure 3 shows images taken with the projector placed at distances of 1.23 m, 0.89 m, and 0.66 m from the surface, with the surface and camera positions fixed. As the projector-to-surface distance changes, the pattern undergoes a magnification, with blurring visible in figure 3(c). Figure 4 shows a simulation of a scene that includes a surface that is both in and out of focus. The surface closest to the projector (right) appears brighter and compressed compared with the surface furthest away (left). Figure 4 shows that the model presented here can qualitatively recreate the effects of focus and magnification found in a fringe projection system, which cannot be done using a two-dimensional linear model or a geometrical model alone.



Figure 3: Images of the projector moved relative to the static camera and surface (a–c), with (c) being the furthest projector-to-surface distance and (a) the closest; configuration is shown graphically in (d). The scale bar represents a length of 64 cm on the surface.

Figure 4: Model scene of a surface placed at 45° to the optical axis of the projector. The downward arrow and scale bar in (a) represent the projector's focal plane location and a distance of 0.66 m, respectively. Configuration is shown graphically in (b), with the dotted line representing the focal plane of the projector.

Conclusion

We have modeled fringe projection by combining 3D imaging theory with a geometrical transformation. The preliminary results show that this model can describe the distortion of the projected image caused by surface topography and diffraction effects, and qualitative agreement with experiment was achieved. The potential advantages of this modeling approach are the inclusion of surface scattering and diffraction effects, and the use of the OTF as a metric to evaluate fringe projection performance. In future work, surface-light interactions will be included within the model, and the model will continue to be developed for use in a framework that can evaluate uncertainty in fringe projection systems.

Acknowledgements

We would like to thank the Engineering and Physical Sciences Research Council (Grants EP/L01534X/1 and EP/M008983/1) and the Manufacturing Technology Centre (Coventry, UK).

References

1 Chatterjee A., Singh P., Bhatia V., and Prakash S., Ear biometrics recognition using laser biospeckled fringe projection profilometry, Opt. Laser. Technol., 112: p. 368-378, 2019.

2 He W., Zhong K., Li Z., Meng X., Cheng X., Liu X., and Shi Y., Accurate calibration method for blade 3D shape metrology system integrated by fringe projection profilometry and conoscopic holography, Opt. Laser. Eng., 110: p. 253-261, 2018.

3 Zhang S., High-speed 3D shape measurement with structured light methods: A review, Opt. Laser. Eng., 106: p. 119-131, 2018.

4 Feng S., Zhang Y., Chen Q., Zuo C., Li R., and Shen G., General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique, Opt. Laser. Eng., 59: p. 56-71, 2014.

5 ISO 15530: Geometrical product specifications (GPS) - Coordinate measuring machines (CMM): Technique for determining the uncertainty of measurement, 2013

6 VDI/VDE 2634: Optical 3D Measuring Systems, 2002

7 ISO 10360: Geometrical Product Specifications (GPS) - Acceptance and reverification tests for coordinate measuring machines (CMM), 2005

8 ISO 25178: Geometrical product specifications (GPS) - Surface texture: Areal, 2019

9 ISO 15530: Geometrical product specifications (GPS) - Coordinate measuring machines (CMM): Technique for determining the uncertainty of measurement, Part 4: Evaluating task-specific measurement uncertainty using simulation 2008

10 Southon N., Stavroulakis P., Goodridge R., and Leach R.K., In-process measurement and monitoring of a polymer laser sintering powder bed with fringe projection, Mater. Des., 157: p. 227-234, 2018.

11 Inanç A., Kösoğlu G., Yüksel H., and Inci M.N., 3-d optical profilometry at micron scale with multi-frequency fringe projection using modified fibre optic lloyd's mirror technique, Opt. Laser. Eng., 105: p. 14-26, 2018.

12 Du H., Chen X., Xi J., Yu C., and Zhao B., Development and Verification of a Novel Robot-Integrated Fringe Projection 3D Scanning System for Large-Scale Metrology, Sensors, 17(12): p. 2886, 2017.

13 Zhou D., Wang Z., Gao N., Zhang Z., and Jiang X., Virtual fringe projection system with nonparallel illumination based on iteration, Meas. Sci. Technol., 28(6): p. 065201, 2017.

14 Ribo M. and Brandner M., State of the art on vision-based structured light systems for 3D measurements, ROSE, IEEE, 2005.

15 Vukašinović N., Bračun D., Možina J., and Duhovnik J., The influence of incident angle, object colour and distance on CNC laser scanning, Int. J. Adv. Manuf. Tech., 50(1-4): p. 265-274, 2010.

16 Zhang B., Davies A., Evans C., and Ziegert J., Validity of the instrument transfer function for fringe projection metrology, Appl. Opt., 57(11): p. 2795-2803, 2018.

17 Su R., Thomas M., Leach R.K., and Coupland J., Effects of defocus on the transfer function of coherence scanning interferometry, Opt. Lett., 43(1): p. 82-85, 2018.

18 Coupland J.M. and Lobera J., Holography, tomography and 3D microscopy as linear filtering operations, Meas. Sci. Technol., 19(7): p. 074012, 2008.

19 Su R., Wang Y., Coupland J., and Leach R.K., On tilt and curvature dependent errors and the calibration of coherence scanning interferometry, Opt. Express, 25(4): p. 3297-3310, 2017.

20 Streibl N., Three-dimensional imaging by a microscope, JOSA A, 2(2): p. 121-127, 1985.


About The Authors


George Gayton, Rong Su, Richard Leach, and Liam Bradley

George Gayton is a Ph.D. researcher at the University of Nottingham; Rong Su is a research fellow at the University of Nottingham; Richard Leach is a professor at the University of Nottingham; and Liam Bradley is a senior research engineer at the Manufacturing Technology Centre.