Abstract
Inspection in confined spaces, for instance inside aircraft engines, is currently performed manually, since existing inspection approaches cannot be sufficiently automated. Using a novel sensor system based on the borescopic fringe projection method, such small installation spaces can be inspected with high-precision 3D measurements. This provides a basis for standardizing the inspection processes during maintenance cycles. In order to automate the inspection process, an approach to planning measurement strategies based on ray tracing simulations of the optical measurement is presented. By taking multiple reflections and the corresponding reconstruction failures into account, suitable measurement poses are identified. Finally, an in-situ measurement approach to assess the condition of (aero engine) turbine blades and derive damages is presented.
Keywords
- Borescopic fringe projection
- Turbine blade inspection
- In-situ measurement
- Ray tracing
- Multiple reflections
- Virtual fringe projection
1 Introduction
Project C2 is located in the area of early fault detection within the process chain of turbine maintenance of CRC 871. Over the course of the previous two funding periods, research was conducted on a novel type of sensor technology for inspection in confined spaces. The measurement method of fringe projection was transferred to new scales and applications by means of a borescopic structure (Schlobohm et al. 2015, 2016; Pösch et al. 2017). This enables optical 3D measurement in areas that are difficult to access with limited space for movement. In particular, the current advancement in the field of smartphone cameras has made the implementation of miniature sensors possible. The use of these sensors for high-precision geometric component characterization will be demonstrated in this paper using the example of aero engine inspection.
2 Objective
The aim of this subproject is to develop a fast inspection approach for the inspection of complex geometries. For this purpose, the inverse fringe projection method was initially researched and used to perform rapid condition assessments of aircraft engine blades (Pösch et al. 2012; Schlobohm et al. 2017a, b). However, a precise metric derivation of defect sizes is not possible based on a single inverse pattern. Furthermore, precise knowledge of the orientation and geometry of the measured object is required. Due to rapid technical developments in the fields of cameras, projectors and computing power, the fringe projection measurement method has caught up with the inverse fringe projection approach for rapid inspection. Today, high-speed fringe projection measuring systems can be built that perform 3D measurements within one second for high-precision damage analysis. With these, it is even possible to perform handheld data acquisition at reduced accuracy (Matthias et al. 2018).
Significant advancements in the field of miniaturization of camera sensors have also been made due to the needs of the smartphone industry. In addition to the inspection of complex components, this also enables the development of 3D sensors for inspections in confined spaces. In particular, the early fault detection of complex capital goods such as blisks and turbine blades has been identified as an application area. Precise 3D inspection during a maintenance interval on assembled and disassembled aircraft engines is needed to investigate safety-critical aspects and prevent unnecessary and expensive engine disassembly and repair. Accordingly, the objective is to develop miniaturized 3D measuring systems based on the fringe projection method for the purpose of early fault detection. Two different inspection tasks within the maintenance process were defined for this specific application. On the one hand, the particularly challenging inspection of the assembled engine via maintenance openings has to be targeted. On the other hand, blisks and turbine blades of the partially disassembled engine have to be inspected. The two inspection tasks present different challenges for the development of the measuring systems. Therefore, measuring head sizes of less than 10 mm in diameter are required, and the working distance of the sensor is within the range of 10–20 mm or 40–60 mm. Miniaturized camera sensors in particular are more sensitive to malfunctions than industrial camera sensors and require a suitable measurement strategy with appropriate measurement poses when used within optical sensors. To achieve intelligent measurement pose planning, the reflective properties of the measurement object must be taken into account in addition to the sensor-specific requirements. Especially multiple reflections on highly reflective (shiny) components can lead to faulty reconstructed points within the measurement.
For this purpose, a GPU-based simulation approach for determining low-reflection measurement poses and a compensation approach for the error-causing reflections are presented.
This article is structured as follows: First, the concept of a borescopic fringe projection sensor and its challenges during miniaturization are presented. Then a simulation approach to plan suitable measurement strategies is introduced and finally the measurement capabilities and results of the in-situ inspection are demonstrated.
3 Borescopic Fringe Projection Sensor
3.1 Design
For the adaptation of the fringe projection measuring technique to another scale range and the application in confined spaces, the classic camera-projector design is adapted. In order to obtain such small measuring heads, miniaturized cameras in the “Chip-on-the-Tip” design are used instead of industrial cameras. To enable the projection of fringe patterns into small installation spaces, a borescope including a lens is used within the projection path. By projecting sinusoidal patterns through a borescope, the sensor head can be spatially separated from the projector and from the frame grabber board of the camera. Two iterations of borescopic sensors were developed within this subproject. First, the proof of concept for this device class of measurement systems with a measuring head diameter of about 10 mm was designed (Fig. 1 right). As the project progressed, technical advances in the camera sector enabled additional miniaturization of the borescope sensor to a measuring head diameter of about 6.5 mm (Fig. 1 left).
Figure 2 shows a schematic of the measuring head of both sensors. Here the green cone visualizes the field of view of the projector and the pyramid the field of view of the camera. The camera is fixed to the borescope shaft by a customized 3D-printed clip. This can be flexibly adapted according to the required triangulation basis and thus adjusts the working distance of the measuring system. The two developed measurement systems are based on the components presented in
Table 1. Both camera sensors were manufactured by OmniVision Technologies, Inc. (Santa Clara, United States). For the projection of the sinusoidal patterns, an evaluation module 4500 from Texas Instruments Inc. (Dallas, United States) was used. The micromirror device forms the fringe pattern by binary tilting of each individual micromirror.
3.2 Miniaturization
In addition to inspection in confined spaces, a miniaturized measuring head has been developed to enable the inspection through maintenance openings in engines. However, with the continuously shrinking size of electrical components and sensors, the optical imaging quality also decreases, and thus the suitability for use in optical measurement systems. In order to investigate the most relevant properties of the sensors and the influence of miniaturization, analyses of the signal to noise ratio, the edge spread function and the spatial frequency response of the cameras are performed below. In addition, the illumination homogeneity, the distortions of the camera-projector pair and the measurement uncertainty of the two measurement systems are presented.
Camera Noise
The signal to noise ratio (SNR) of a camera indicates the ability to differentiate the phase information retrieved from the sinusoidal patterns from noise. The noise of an image sensor results from photon shot noise, sensor read noise, fixed pattern noise, thermal noise, pixel response non-uniformity and quantization noise. In order to determine a corresponding SNR in practical application, Eq. 1 is used to describe the SNR in the context of read and shot noise. Following the approach of Martinec (2008), the summed intensity of two images (µ_summed image) is related to the standard deviation of the difference of the two images (σ_difference image):

SNR = µ_summed image / (√2 · σ_difference image)   (1)
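The two-image estimate described above can be sketched as follows; a minimal numpy version assuming two defocused frames of the same scene, with illustrative function and variable names rather than the project's actual implementation:

```python
import numpy as np

def snr_from_image_pair(img_a: np.ndarray, img_b: np.ndarray) -> float:
    """Estimate the SNR from two frames of the same (defocused) scene.

    Following the two-image idea of Martinec (2008): the mean of the
    summed image carries the signal, while the standard deviation of the
    difference image isolates the temporal noise (fixed-pattern noise
    and scene structure cancel in the difference).
    """
    summed = img_a.astype(np.float64) + img_b.astype(np.float64)
    diff = img_a.astype(np.float64) - img_b.astype(np.float64)
    # Summing doubles the signal; differencing scales the noise by
    # sqrt(2), so the ratio reduces to mean/std of a single frame.
    return summed.mean() / (np.sqrt(2.0) * diff.std())

# Synthetic check: mean level 1000, noise sigma 10 -> SNR near 100.
rng = np.random.default_rng(0)
a = 1000 + 10 * rng.standard_normal((512, 512))
b = 1000 + 10 * rng.standard_normal((512, 512))
print(f"SNR ≈ {snr_from_image_pair(a, b):.1f}")
```

In practice the exposure sweep described above would repeat this estimate per exposure step and per region of interest.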
This test requires the cameras to be out of focus and to acquire multiple images under constant illumination conditions. The exposure time is increased incrementally during the test to use the full intensity range of the sensor. In order to create comparable conditions and compensate for the inhomogeneous illumination, regions of interest (ROI) were analyzed in each image. Based on Fig. 3, it can be shown that miniature sensors (OV5647 and OV2740) with one-piece injection-molded lenses are significantly more susceptible to noise than industrial cameras with high-quality lenses (CB120).
Spatial Frequency Response
The spatial frequency response (SFR) is one of the most important quality metrics in the camera sector, since it quantifies the extent to which a camera and lens system can resolve image details. The slanted edge method (SEM) according to the implementation of van den Bergh (2018, 2019) is used for this analysis. This approach is robust against distortions, which is particularly relevant for miniaturized systems. For the evaluation according to van den Bergh, the cameras were aligned in their focal plane and the intensity gradients at the edges of black rectangles on a white background were examined (test target USAF1951 from Thorlabs, Inc. (Newton, United States)). Figure 4a shows the edge spread function (ESF) and Fig. 4b the SFR of the miniaturized camera sensors.
The plot of the ESF depicts the intensity curve across an edge (of a black rectangle), normalized to the average of the white and black areas. The sensor OV5647 overshoots and undershoots at the edge, which resembles an unsharp masking effect. This behavior is not expected; the ESF of the OV2740 and of other industrial sensors follows a smooth curve and therefore behaves approximately ideally. Internal signal processing within the sensor (OV5647) probably explains this effect. The plot of the SFR shows the resulting modulation transfer function (MTF) as the contrast over the spatial frequency related to the sensor pixel. The MTF50 is usually used to compare optical systems regarding the spatial frequency up to which a sensor images sharply. For the sensor OV2740, the MTF50 is about 0.15 cycles per pixel, which corresponds to an imageable frequency of 288 across its pixel count of 1920. In the context of fringe projection, the OV5647 is able to display higher frequencies than the OV2740, but the OV2740 behaves more predictably and reproduces intensity transitions during a fringe projection measurement more accurately.
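The relationship between ESF and MTF50 used above can be illustrated with a simplified numpy sketch (differentiate the ESF to the line spread function, Fourier transform, search for the 50% contrast frequency); this is only the core idea, not van den Bergh's full slanted-edge implementation:

```python
import numpy as np

def mtf_from_esf(esf):
    """ESF -> MTF: differentiate to the line spread function (LSF),
    window it, and take the normalized Fourier magnitude."""
    lsf = np.gradient(esf)
    lsf = lsf * np.hanning(lsf.size)          # suppress spectral leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                             # contrast 1 at DC
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)  # cycles per pixel
    return freqs, mtf

def mtf50(freqs, mtf):
    """First frequency at which the MTF falls below 0.5 (linear interp.)."""
    i = int(np.argmax(mtf < 0.5))
    return freqs[i - 1] + (0.5 - mtf[i - 1]) * \
        (freqs[i] - freqs[i - 1]) / (mtf[i] - mtf[i - 1])

# Synthetic edge blurred by a Gaussian of sigma = 1.5 px; the analytic
# MTF50 of such a blur is sqrt(ln 2 / 2) / (pi * sigma) ≈ 0.125 cyc/px.
x = np.arange(-64, 64, dtype=float)
esf = np.cumsum(np.exp(-x**2 / (2 * 1.5**2)))
esf /= esf[-1]
freqs, mtf = mtf_from_esf(esf)
print(f"MTF50 ≈ {mtf50(freqs, mtf):.3f} cycles/pixel")
```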
Homogeneity
The homogeneity of the combined sensor systems is also analyzed, since strong differences in exposure are caused by different lenses, working distances and borescopes. In particular, the need for high dynamic range approaches can be evaluated. The homogeneity evaluation is carried out using a white photographic target which is illuminated by a solid field pattern of the projector.
A strong radial intensity gradient starting from a certain center can be observed in Fig. 5. Both sensors show contrary behavior due to the larger measuring volume of the OV2740 and a different alignment of the borescope. When the working distance of the sensor is varied, the centers of the rotationally symmetric intensity drops shift. In order to precisely adjust the projection of the light into the C-mount lens of the borescope, a more flexible design of the digital micromirror device must be used.
Calibration
In the context of sensor miniaturization, it is meaningful to verify whether the distortion of the smallest sensor can be sufficiently corrected. Distortion correlates inversely with the aperture size and is not an inherent property of a sensor. Therefore, camera and projector are calibrated according to the pinhole camera approach of Zhang (2000). With respect to the radial and tangential distortion of camera and projector, the distortion is modeled via the polynomial approach of Conrady and Brown (Brown 2002). For the determination of the extrinsic system parameters, a final stereo calibration of camera and projector is performed. Figure 6 shows the resulting distortion plots. Considering the direction of pixel displacement (arrow directions within the image), it can be concluded that the camera exhibits pincushion distortion, while the projector has barrel distortion.
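The Conrady/Brown polynomial model mentioned above can be written down compactly. The following numpy sketch applies the radial and tangential terms to normalized image coordinates and illustrates the pincushion vs. barrel behavior seen in the distortion plots; the coefficient values are illustrative, not calibrated ones:

```python
import numpy as np

def brown_conrady(xn, yn, k1, k2, k3, p1, p2):
    """Apply the Brown-Conrady distortion model to normalized image
    coordinates: radial terms k1..k3 plus tangential terms p1, p2
    (the same model used here for both camera and projector)."""
    r2 = xn**2 + yn**2
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = xn * radial + 2 * p1 * xn * yn + p2 * (r2 + 2 * xn**2)
    yd = yn * radial + p1 * (r2 + 2 * yn**2) + 2 * p2 * xn * yn
    return xd, yd

# Pincushion (k1 > 0) pushes points outward, barrel (k1 < 0) pulls
# them inward, matching the arrow directions in the distortion plots:
xd_pin, _ = brown_conrady(0.5, 0.0, +0.2, 0, 0, 0, 0)
xd_bar, _ = brown_conrady(0.5, 0.0, -0.2, 0, 0, 0, 0)
print(xd_pin > 0.5, xd_bar < 0.5)  # True True
```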
Camera and projector are subject to strong radial distortion, while the camera also shows tangential distortion. The tangential distortion is probably caused by the lens tilting in the thread of the camera package. The corners of the camera image are subject to strong field curvature, so that features extracted there are erroneous and are neglected. The high pixel displacements within the distortion plot of the projector result from an artificially extrapolated resolution of the projector.
Reconstruction Quality
The probing error with respect to form is often used to classify the reconstruction quality. Using a borescopic sensor with a measuring head at 30 mm working distance, it was determined to be 20 µm according to VDI/VDE 2634-2 (Deutsches Institut für Normung e. V. 2012) for a cylindrical feature of a calibrated micro contour standard. Additionally, the probing error with respect to size was calculated following the guideline JCGM 100:2008 (International Organization for Standardization 2008). The probing error with respect to size on this feature is 40 µm within 20 repeated measurements. Please refer to Matthias et al. (2017) for further accuracy and measurement uncertainty investigations and supplementary explanations. These specifications apply to surfaces with good optical cooperativity; for surfaces with limited optical cooperativity, the known physical limits of triangulating optical measurement principles apply.
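The probing error with respect to size and its GUM Type A uncertainty over repeated measurements can be sketched as follows; all numbers below are hypothetical placeholders, not the values reported above:

```python
import numpy as np

# Hypothetical repeated diameter measurements of a calibrated cylinder
# feature (mm); the calibrated reference value is assumed as 3.000 mm.
calibrated = 3.000
rng = np.random.default_rng(2)
measured = calibrated + 0.040 + 0.005 * rng.standard_normal(20)

# Probing error (size): deviation of the mean measured diameter from
# the calibrated reference value.
error_size = measured.mean() - calibrated

# GUM Type A standard uncertainty of the mean over n repeats:
# experimental standard deviation divided by sqrt(n).
u_a = measured.std(ddof=1) / np.sqrt(measured.size)

print(f"probing error (size): {error_size * 1000:.0f} µm, "
      f"u_A = {u_a * 1000:.1f} µm")
```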
4 Planning a Measurement Strategy Using Ray Tracing Simulations
During the optical measurement of complex geometries, especially during the measurement of optically non-cooperative surfaces (or glossy surfaces), multiple reflections caused by the shape of the specimen occur frequently. This is critical for fringe projection measurements, since multiple reflections can lead to incorrect phase information which is unwrapped from the camera images. An example of this can be seen in Fig. 7. Here, a fringe projection measurement was performed on the concave surface of a highly reflective compressor blade. Due to multiple reflections, false points are reconstructed outside the actual geometry. The use of such measurement data leads to erroneous damage derivations and prevents the automated data evaluation.
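To make explicit why multiple reflections corrupt the reconstruction, the standard N-step phase-shift evaluation can be sketched; it assumes each pixel observes a single sinusoid, which is exactly the assumption that multiple reflections violate. This is a generic textbook formulation, not the project's specific pipeline:

```python
import numpy as np

def phase_from_shifts(images: np.ndarray) -> np.ndarray:
    """Recover the wrapped phase from an N-step phase-shift sequence.

    `images` has shape (N, H, W); frame n was captured under a
    sinusoidal pattern shifted by 2*pi*n/N. Superimposed light from
    multiple reflections breaks the single-sinusoid model, so the
    recovered phase (and hence the triangulated point) becomes wrong.
    """
    n = images.shape[0]
    deltas = 2 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(deltas), images, axes=1)
    den = np.tensordot(np.cos(deltas), images, axes=1)
    return np.arctan2(-num, den)

# Synthetic noiseless 4-step sequence with a known phase of 1.0 rad:
phi_true = 1.0
frames = np.stack([
    0.5 + 0.4 * np.cos(phi_true + 2 * np.pi * k / 4) * np.ones((4, 4))
    for k in range(4)
])
print(np.allclose(phase_from_shifts(frames), phi_true))  # True
```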
In order to gain a deeper understanding of this problem and to develop a compensation approach for these influences, an optical simulation of the measurement is performed. Since these kinds of simulations can last up to several days on a CPU basis, a near real-time GPU-based approach has been implemented. Thus, a physically based high resolution simulation of the measurements can be carried out within one second. After explaining the simulation pipeline, an approach to identify low-reflection measurement poses and a method to compensate for erroneous phase information based on the ray tracing simulations is presented.
4.1 GPU-Based Ray Tracing Simulation
Modern ray tracing algorithms rely on the rendering equation of James Kajiya (1986). This equation describes the energy conservation of light rays in space and provides a physically based description of light based on radiometric quantities to simulate an image. To render an image, the following equation has to be solved for each pixel of the image:

L_o(p, ω_o) = ∫_{H²} f(p, ω_o, ω_i) · L_i(p, ω_i) · cos θ_i dω_i
The amount of outgoing radiance L_o(p, ω_o) at a surface point p is integrated over all incident light ray directions dω_i of the corresponding hemisphere H² as a function of the incoming radiance L_i(p, ω_i) and the reflection properties of the object surface f(p, ω_o, ω_i). During the simulation of a fringe projection measurement, cos θ_i is the cosine of the angle between the optical axis of the camera ω_o and the optical axis of the projector ω_i. The bidirectional reflectance distribution function (BRDF) f(p, ω_o, ω_i) of an object surface describes the distribution of the reflected light. In order to simulate reflections physically based and to take the surface roughness into account, the BRDF model according to Torrance and Sparrow (Torrance and Sparrow 1967) is applied. The rendering of a gray-scale image of the projection of a sinusoidal pattern of the measurement sequence is depicted in Fig. 8a. For a more detailed mathematical breakdown, refer to Middendorf et al. (2021b). To calculate the occurrence of multiple reflections and their reflection locations efficiently, an inverse ray tracing approach is used. Since a large number of the light rays emitted by the projector are not reflected into the camera, the inverse approach reduces the computational effort significantly. As a consequence, interactions of the light rays with the specimen surface are traced from the camera origin to the projector origin. To further limit the computational effort of tracing multiple reflections, a ray tracing approach according to Whitted (1980) is used. Starting from the camera origin, a primary ray is traced, and a secondary ray is generated for each intersection of a light ray with an object. This creates a path structure, where the secondary rays are calculated as the specular reflection of the incident ray about the surface normal. In order to perform ray tracing efficiently, the algorithm was implemented using OptiX, a ray tracing engine developed by NVIDIA® (Parker et al. 2010).
To parallelize the rendering on the graphics card, the ray tracing application is based on NVIDIA’s Compute Unified Device Architecture (CUDA) (NVIDIA et al. 2020). This recursive ray tracing is performed until a self-defined reflection depth is reached. In this application, it can be assumed that light rays from the 4th reflection onwards have only a minor effect on the resulting camera image. Figure 9 shows an exemplary reflection map calculated using the measurement pose from Fig. 8a. In this figure, the maximum reflection depth per camera pixel is color-coded.
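The Whitted-style, depth-limited recursion can be illustrated with a small numpy sketch; here `intersect` is a hypothetical stand-in for the scene query that OptiX provides, and the two-mirror toy scene (loosely reminiscent of light bouncing between blade surface and blade root) is purely illustrative:

```python
import numpy as np

MAX_DEPTH = 4  # contributions beyond the 4th reflection are neglected

def reflect(d, n):
    """Mirror direction d about the unit surface normal n."""
    return d - 2 * np.dot(d, n) * n

def trace_depth(origin, direction, intersect, depth=1):
    """Count ray segments until the ray leaves the scene or the depth
    cap is reached -- the per-pixel quantity the reflection map encodes.
    Rays start at the camera (inverse to the physical light path).
    `intersect(o, d)` returns (hit_point, normal) or None."""
    hit = intersect(origin, direction)
    if hit is None or depth >= MAX_DEPTH:
        return depth
    point, normal = hit
    return trace_depth(point + 1e-6 * normal,   # offset avoids self-hit
                       reflect(direction, normal), intersect, depth + 1)

# Toy scene: two mirrors forming a concave corner (planes x = 0, y = 0).
def corner(o, d):
    hits = []
    for axis, n in ((0, np.array([1.0, 0.0, 0.0])),
                    (1, np.array([0.0, 1.0, 0.0]))):
        if d[axis] < -1e-9:
            t = -o[axis] / d[axis]
            if t > 1e-6:
                hits.append((t, o + t * d, n))
    if not hits:
        return None
    _, p, n = min(hits, key=lambda h: h[0])
    return p, n

d = np.array([-1.0, -0.4, 0.0])
d /= np.linalg.norm(d)
print(trace_depth(np.array([2.0, 1.0, 0.0]), d, corner))
```

A real reflection map repeats this per camera pixel, which is what makes the GPU parallelization worthwhile.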
4.2 Evaluation of Suitable Measurement Poses
To identify suitable measurement poses, a consistent evaluation metric is first defined. Based on the sum of all maximum reflection depths per camera pixel, a representative value per measurement pose is determined. To position the sensor according to a targeted field of view and its working distance, a surface point on the specimen is aligned in the focal plane of the camera sensor. This enables the comparison of the reflectivity of a region of interest around the object point. To identify a low-reflectivity measurement pose, a spherical scanning of possible measurement poses around the defined surface point is performed. Figure 10 shows 1024 different measurement poses with respect to the reflectivity of the measurement pose on the sphere. The yellow star shows the center of the sphere and represents the observed surface point. The green star indicates the measurement pose with the lowest reflectivity and the red star represents the measurement pose with the highest reflectivity. The renderings of both measurement poses are shown in Fig. 11a and b. In Fig. 11a, the fringe pattern is projected into the blade and in the direction of the blade root, causing the light to be reflected multiple times. This leads to incorrect phase information recorded by the camera. In contrast, the fringe pattern in Fig. 11b is projected towards the blade tip, which avoids reflections.
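The spherical scanning of candidate poses can be sketched as follows; a Fibonacci lattice is used here as one way to distribute poses approximately uniformly on the sphere (the article does not specify the sampling scheme), and the reflectivity score is a stand-in for the summed per-pixel reflection depths from the simulation:

```python
import numpy as np

def fibonacci_sphere(n: int) -> np.ndarray:
    """n approximately uniform viewing directions on the unit sphere."""
    i = np.arange(n)
    golden = (1 + 5**0.5) / 2
    z = 1 - 2 * (i + 0.5) / n
    theta = 2 * np.pi * i / golden
    r = np.sqrt(1 - z**2)
    return np.column_stack((r * np.cos(theta), r * np.sin(theta), z))

def best_pose(surface_point, work_dist, reflectivity, n=1024):
    """Scan n candidate sensor positions on a sphere around the observed
    surface point (radius = working distance) and return the one with
    the lowest reflectivity score. `reflectivity` is a hypothetical
    callable standing in for the ray tracing simulation."""
    candidates = surface_point + work_dist * fibonacci_sphere(n)
    scores = np.array([reflectivity(c) for c in candidates])
    return candidates[np.argmin(scores)], scores

# Toy score: poses towards +z (say, the blade tip side) reflect less.
point = np.array([0.0, 0.0, 0.0])
pose, scores = best_pose(point, 30.0, lambda c: 1.0 - c[2] / 30.0)
print(pose[2] > 29.0)  # best pose lies near the top of the sphere
```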
4.3 Compensation of Multiple Reflections
In order to reduce the influence of the faulty phase information within the reconstruction, a masking approach is presented below. Based on the calculated reflection locations and the reflection depth in the camera image, a binary mask can be created. The masking is applied to the images of the entire measurement sequence so that the areas of multiple reflections are suppressed everywhere. An application of this approach can be seen in Fig. 12. To apply this approach to real measurements and to identify reflections in them, a pose estimation of the measurement object has to be performed first. This yields the relative pose of the specimen in the camera coordinate system. Using the calculated pose, a ray tracing simulation of the measurement can be performed and a mask can be calculated. The real measurement can subsequently be filtered with the simulated mask. In addition, it is also possible to calculate a mask for the projection, which reduces the reflections during the measurement. To apply this approach to real measurements, some assumptions have to be made. For example, the real position and size of the measured object differ from the simulated ones and thus cause a certain uncertainty budget. This is currently taken into account in a simplified manner by eroding the mask to allow for small errors. In addition, possible defects and other machining operations that change the shape and surface of the component cannot be taken into account, resulting in possible uncompensated reflection effects. To validate this simulation approach, three different simulations were performed and reconstructed according to the normal phase shift pipeline used for fringe projection measurement. Subsequently, a deviation analysis of the reconstructed simulation was performed in comparison to the CAD file. Simulation one from Fig. 13a represents an ideal measurement with a reflection depth of one, which avoids erroneous multiple reflection effects.
The deviation analysis proves that the simulation was successfully reconstructed despite the optical properties of the measurement system. In the second simulation, a fringe projection measurement was performed with a reflection depth of four. Significant deviations can be seen in the reconstructed point clouds (Fig. 13b). Especially in the area of the leading edge and the blade root, multiple reflections occur. By means of the masking approach, the deviations from the second simulation were compensated. The masked image areas and reduced measurement deviations are shown in Fig. 13c.
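The masking step can be sketched with a few lines of numpy/scipy; the depth map stands in for the simulated per-pixel reflection depth, and the erosion margin models the simplified uncertainty handling described above:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def reflection_mask(depth_map: np.ndarray, margin: int = 2) -> np.ndarray:
    """Build a binary validity mask from the simulated per-pixel
    reflection depth: only pixels that see direct (depth 1) light are
    kept. Eroding the valid region by a few pixels absorbs small pose
    and geometry errors between simulation and real measurement."""
    valid = depth_map <= 1
    structure = np.ones((2 * margin + 1, 2 * margin + 1), dtype=bool)
    return binary_erosion(valid, structure=structure)

# Toy depth map: direct light everywhere except a multi-reflection patch.
depth = np.ones((8, 8), dtype=int)
depth[4:, 4:] = 3
mask = reflection_mask(depth, margin=1)
# The camera frames of the sequence would be multiplied by `mask`
# before the phase evaluation, suppressing the corrupted regions.
print(int(mask.sum()), "valid pixels remain")
```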
5 Inspection of Turbine Blades in Confined Spaces
For the study of measurements in confined spaces, a geometric blade arrangement from the mounted aircraft engine was reproduced. Using five blades in different states of wear, investigations were carried out on damages at the leading edge. Besides wear at the blade tip, this is one of the main places of wear, as erosion, cracks, nicks, dents, burns, leading edge burn-through, coating damages and blocked film-cooling holes occur there (IAE. International Aero Engine 2000). Figure 14 shows an image of the borescopic sensor with two blades in different states of wear. The left-hand blade is heavily worn and only slightly visible due to a burnt coating, while the right-hand blade is clearly visible, as only the coating at the leading edge is missing. In addition to the missing coating, a considerable amount of material is missing at the leading edge. An exemplary measurement pose for the borescopic engine inspection is depicted here. Particularly noticeable is the inhomogeneous illumination caused by the miniaturized measuring system. Depending on the measurement pose, up to three blades are within the field of view of a borescopic measurement. Due to the rotational degree of freedom of the engine shaft, the position of the turbine blades relative to the measuring system is initially unknown. In order to clearly identify the blades and assign their position in the engine, the unknown pose must first be determined. The identified measurement pose can then be used to perform wear and deviation analyses. Given the highly variable measurement and wear conditions within the aero engine, a feature segmentation approach is used in addition to the common iterative closest point (ICP) registration approach. The film-cooling holes of the turbine blades have proved to be unique features that can be detected even after increased wear. These can be segmented both in the two-dimensional image via color gradients and in the three-dimensional point cloud via their geometric shape.
To register the measured point cloud to the reference geometry of the CAD models, the film-cooling holes in the CAD reference are first identified. A detailed description of this approach can be found in Middendorf et al. (2022a). Using an equivalent segmentation approach, the film-cooling holes within the measured point cloud are segmented using a clustering approach (DBSCAN). To identify the features, the coordinates of the centroid of each segmented cluster are used. Based on the identified film-cooling holes, a random sample consensus (RANSAC)-based numerical optimization approach is used to find the closest possible match between the set of film-cooling holes from the measurement and the model. The segmented film-cooling holes in the reference geometry are shown in Fig. 15. Based on the estimated pose of the turbine blades, a subsequent fine registration based on an ICP approach can be used to align the entire measurement to the reference geometry. To evaluate the condition of the measured specimen and derive damages, a surface comparison to the reference geometry is performed. For this purpose, the deviation of the point cloud from the CAD geometry is determined in polygonal normal direction. When assigning the respective (polygon) planes to the corresponding 3D points, a 1-Nearest-Neighbor (NN) classification of the reference point cloud (generated from all polygons) and the reconstructed point cloud is calculated. Subsequently, the Euclidean distance of each point to the nearest polygon is determined. The right-hand turbine blade from Fig. 14 is damaged along the leading edge. Deviations of more than 1 mm compared to the reference geometry can be measured, see Fig. 16. Based on the knowledge of the exact geometric shape and location of the damage, the damage can be classified and a disassembly decision for a particular engine can be made.
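The 1-NN deviation analysis can be sketched as follows, with scipy's k-d tree as a generic nearest-neighbor backend; the planar toy reference here replaces the point cloud sampled from the CAD polygons:

```python
import numpy as np
from scipy.spatial import cKDTree

def deviations_to_reference(measured, ref_points, ref_normals):
    """Signed deviation of each measured point to the reference surface.

    As in the text: each measured point is assigned to its nearest
    reference point (1-NN on a cloud sampled from the CAD polygons) and
    the deviation is taken along that polygon's normal direction.
    """
    tree = cKDTree(ref_points)
    _, idx = tree.query(measured)
    return np.einsum('ij,ij->i',
                     measured - ref_points[idx], ref_normals[idx])

# Toy reference: the z = 0 plane sampled on a grid, all normals +z.
gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
ref = np.column_stack((gx.ravel(), gy.ravel(), np.zeros(gx.size)))
nrm = np.tile([0.0, 0.0, 1.0], (ref.shape[0], 1))

# Measured points hover 0.05 above the plane; one "nick" deviates 1.2,
# i.e. beyond a hypothetical 1 mm damage threshold.
meas = ref[:5].copy()
meas[:, 2] = 0.05
meas[0, 2] = 1.2
dev = deviations_to_reference(meas, ref, nrm)
print(dev.round(2))
```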
With this metric and high-precision measurement data, the subjective and error-prone assessment of the normal borescope process can be extended (Drury et al. 1997; See 2012; Aust and Pons 2022). In combination with damage classification approaches based on neural networks, such as that implemented by Aust et al. (2021), very fast, efficient and reliable inspection becomes possible. With an appropriate mechanical connection to the engine and a positioning mechanism, an automation of the inspection process based on borescopic fringe projection sensors can be realized in near future.
6 Conclusions
Within this research project, a borescopic inspection approach for confined spaces was developed. Using the example of aero engine inspections, the successful miniaturization and suitability of the measurement system for the intended measurement task could be demonstrated. For the design and development of miniaturized 3D measurement systems, it became evident that especially the camera and the size of the borescope are decisive. Compared to industrial cameras, strong noise influences, non-linearities in the intensity response, possibly implemented data processing pipelines on the sensor and strong distortions due to small working distances have to be considered. Concerning the borescopes, bending effects, which have an influence on the optical properties of the sensors, have to be taken into account in addition to the oscillation-sensitive design. A reduction of the borescope diameter also leads to a loss of intensity within the measurement scene and a drop in intensity within an image. This means high dynamic range measurements must be carried out. By means of a GPU-based ray tracing simulation, an approach for automated measurement pose planning could be developed. This approach takes the sensor-specific properties and reflective characteristics of the measurement objects into account. In addition, the measurement pose planning should be extended by further influential factors such as the sensor noise as well as geometry and pose-dependent measurement uncertainties. Furthermore, the influence of production-specific variances of the measurement objects and their effect on the masking and compensation approach should be investigated. For the application of fringe projection measurements on shiny components, a compensation approach for multiple reflections was developed. This enables the examination of highly reflective surfaces, which could previously only be measured with an anti-reflection spray. 
However, the subsequent compensation of multiple reflections in real measurements requires a successful pose estimation of the measured object and a precise reference geometry. The inspection in confined spaces was also successfully implemented in an academic environment. In particular, navigation and orientation within the aero engine could be addressed with a rigid endoscopic sensor. The pose estimation of the turbine blades within the engine was realized using a feature segmentation approach. Finally, the condition assessment and damage derivation of worn turbine blades could be demonstrated using exemplary damages with impacts at the leading edge. For future tasks, it is possible to bring the developed sensors to a level of industrial maturity so that they can be tested on real aircraft engines outside the laboratory.
References
Aust, J. and Pons, D. (2022). Comparative analysis of human operators and advanced technologies in the visual inspection of aero engine blades. Applied Sciences, 12(4):2250.
Aust, J., Shankland, S., Pons, D., Mukundan, R., and Mitrovic, A. (2021). Automated defect detection and decision-support in gas turbine blade inspection. Aerospace, 8(2):30.
Brown, D. (2002). Close-range camera calibration. Photogramm. Eng., 37.
Deutsches Institut für Normung e. V. (2012). VDI/VDE 2634-2 Optical 3-D measuring systems, Optical systems based on area scanning. Beuth Verlag GmbH.
Drury, C. G., Spencer, F. W., and Schurman, D. L. (1997). Measuring human detection performance in aircraft visual inspection. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 41(1):304–308.
IAE. International Aero Engine (2000). V2500 maintenance manual, borescope inspection, standard practices ata 70–00–03 2000. Technical report. Available online: https://www.slideshare.net/RafaelHernandezM/v2500-bsi-issue-01 (accessed on 26 May 2022).
International Organization for Standardization (2008). ISO/IEC GUIDE 98–3:2008, Uncertainty of measurement – Part 3: Guide to the expression of uncertainty in measurement (GUM:1995). Beuth Verlag GmbH.
Kajiya, J. T. (1986). The rendering equation. ACM SIGGRAPH Computer Graphics, 20(4):143–150.
Martinec, E. (2008). Noise, dynamic range and bit depth in digital SLRs.
Matthias, S., Schlobohm, J., Kästner, M., and Reithmeier, E. (2017). Fringe projection profilometry using rigid and flexible endoscopes. tm - Technisches Messen, 84.
Matthias, S., Kästner, M., and Reithmeier, E. (2018). A 3-d measuring endoscope for hand-guided operation. Measurement Science and Technology, 29.
Middendorf, P., Hedrich, K., Kästner, M., and Reithmeier, E. (2021a). Miniaturization of borescopic fringe projection systems for the inspection in confined spaces: a methodical analysis. In Ehmke, J. and Lee, B. L., editors, Emerging Digital Micromirror Device Based Systems and Applications XIII, volume 11698, pages 151–164. International Society for Optics and Photonics, SPIE.
Middendorf, P., Kern, P., Melchert, N., Kästner, M., and Reithmeier, E. (2021b). A GPU-based ray tracing approach for the prediction of multireflections on measurement objects and the a priori estimation of low-reflection measurement poses. In Beyerer, J. and Heizmann, M., editors, Automated Visual Inspection and Machine Vision IV, volume 11787, pages 86–96. International Society for Optics and Photonics, SPIE.
Middendorf, P., Blümel, R., Hinz, L., Raatz, A., Kästner, M., and Reithmeier, E. (2022a). Pose estimation and damage characterization of turbine blades during inspection cycles and component-protective disassembly processes. Sensors, 22(14).
Middendorf, P., Rothgänger, M., Peddinghaus, J., Brunotte, K., Uhe, J., Behrens, B. A., Quentin, L., Kästner, M., and Reithmeier, E. (2022b). In situ wear measurement of hot forging dies using robot aided endoscopic fringe projection. Key Engineering Materials, 926:1211–1220.
NVIDIA, Vingelmann, P., and Fitzek, F. H. (2020). CUDA, release 10.2.89.
Parker, S. G., Bigler, J., Dietrich, A., Friedrich, H., Hoberock, J., Luebke, D., McAllister, D., McGuire, M., Morley, K., Robison, A., and Stich, M. (2010). Optix: A general purpose ray tracing engine. ACM Trans. Graph., 29(4).
Pösch, A., Vynnyk, T., and Reithmeier, E. (2012). Using inverse fringe projection to speed up the detection of local and global geometry defects on free-form surfaces. In Bones, P. J., Fiddy, M. A., and Millane, R. P., editors, Image Reconstruction from Incomplete Data VII, volume 8500, pages 91–97. International Society for Optics and Photonics, SPIE.
Pösch, A., Schlobohm, J., Matthias, S., and Reithmeier, E. (2017). Rigid and flexible endoscopes for three dimensional measurement of inside machine parts using fringe projection. Optics and Lasers in Engineering, 89:178–183. 3DIM-DS 2015: Optical Image Processing in the context of 3D Imaging, Metrology, and Data Security.
Schlobohm, J., Pösch, A., Kästner, M., and Reithmeier, E. (2015). On the development of a low-cost rigid borescopic fringe projection system. In Photonics, Devices, and Systems VI. SPIE.
Schlobohm, J., Pösch, A., and Reithmeier, E. (2016). A Raspberry Pi based portable endoscopic 3D measurement system. Electronics, 5(3).
Schlobohm, J., Bruchwald, O., Frackowiak, W., Li, Y., Kästner, M., Pösch, A., Reimche, W., Maier, H. J., and Reithmeier, E. (2017a). Advanced characterization techniques for turbine blade wear and damage. Procedia CIRP, 59:83–88. Proceedings of the 5th International Conference in Through-life Engineering Services Cranfield University, 1st and 2nd November 2016.
Schlobohm, J., Li, Y., Pösch, A., Kästner, M., and Reithmeier, E. (2017b). Multiscale measurement of air foils with data fusion of three optical inspection systems. CIRP Journal of Manufacturing Science and Technology, 17:32–41. SI: Advanced M&T for TES.
See, J. (2012). Visual inspection: a review of the literature. Technical report.
Suffern, K. and Hu, H. H. (2014). Ray Tracing from the Ground Up. A. K. Peters, Ltd., Natick, MA, USA, 2nd edition.
Torrance, K. and Sparrow, E. (1967). Theory for off-specular reflection from roughened surfaces. Journal of the Optical Society of America (JOSA), 57(9):1105–1114.
van den Bergh, F. (2018). Deferred slanted-edge analysis: a unified approach to spatial frequency response measurement on distorted images and color filter array subsets. J. Opt. Soc. Am. A, 35(3):442–451.
van den Bergh, F. (2019). Robust edge-spread function construction methods to counter poor sample spacing uniformity in the slanted-edge method. J. Opt. Soc. Am. A, 36(7):1126–1136.
Whitted, T. (1980). An improved illumination model for shaded display. Communications of the ACM, 23(6):343–349.
Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11):1330–1334.
Acknowledgements
Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)—SFB 871/3—119193472.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2025 The Author(s)
Cite this chapter
Middendorf, P., Kästner, M., Reithmeier, E. (2025). Fast Measurement of Complex Geometries Using Inverse Fringe Projection. In: Seume, J.R., Denkena, B., Gilge, P. (eds) Regeneration of Complex Capital Goods. Springer, Cham. https://doi.org/10.1007/978-3-031-51395-4_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-51394-7
Online ISBN: 978-3-031-51395-4
eBook Packages: Engineering, Engineering (R0)