Roadmap on Industrial Imaging Techniques

Summary

This roadmap provides a critical overview of thirteen industrial imaging techniques, categorized by target type (solid, fluid, or both). It aims to highlight challenges and offer perspectives for next-generation imaging systems. The common challenges, improving coverage, speed, resolution, accuracy, and robustness, have remained largely unchanged over the years, while reliance on data-driven or AI approaches is increasing. The document also emphasizes the need for trustworthy, explainable AI supported by shared benchmarking data and physics-based techniques.

Full Transcript


Measurement Science and Technology, Accepted Manuscript (Open Access)

To cite this article before publication: Marco Jose da Silva et al 2024 Meas. Sci. Technol. in press, https://doi.org/10.1088/1361-6501/ad774b

Manuscript version: Accepted Manuscript, that is, "the version of the article accepted for publication including all changes made as a result of the peer review process, and which may also include the addition to the article by IOP Publishing of a header, an article ID, a cover sheet and/or an 'Accepted Manuscript' watermark, but excluding any other editing, typesetting or other changes made by IOP Publishing and/or its licensors". This Accepted Manuscript is © 2024 The Author(s), published by IOP Publishing Ltd. As the Version of Record of this article is going to be / has been published on a gold open access basis under a CC BY 4.0 licence, this Accepted Manuscript is available for reuse under a CC BY 4.0 licence immediately; everyone is permitted to use all or part of the original content in this article, provided that they adhere to all the terms of the licence (https://creativecommons.org/licences/by/4.0). Although reasonable endeavours have been taken to obtain all necessary permissions from third parties to include their copyrighted content within this article, their full citation and copyright line may not be present in this Accepted Manuscript version; before using any content from this article, please refer to the Version of Record on IOPscience once published for full citation and copyright details, as permissions may be required. All third-party content is fully copyright protected and is not published on a gold open access basis under a CC BY licence, unless that is specifically stated in the figure caption in the Version of Record.

Roadmap

Roadmap on Industrial Imaging Techniques

Jung-Ryul Lee1*, Hongki Yoo2, Chia Chen Ciang1, Young-Jin Kim1,2, Daehee Kim2, Teow Wee Teo3, Zeinab Mahdavipour3, Azizi Abdullah4, Bee Ee Khoo5, Mohd Zaid Abdullah5, Dimitris K. Iakovidis6, Panagiotis Vartholomeos6, Andrew Yacoot7, Tao Cai8, Mirae Kim8, Kyung Chun Kim8, Jiamin Ye9, Xiao Liang9, Lidan Cao10, Xingwei Wang10, Jianqing Huang11,12, Weiwei Cai11, Yingchun Wu13, Marco J. da Silva14, Chao Tan9, Sayantan Bhattacharya15, Pavlos Vlachos15, Christian Cierpka16, Massimiliano Rossi17

1 Department of Aerospace Engineering, Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea
2 Department of Aerospace Engineering, Korea Advanced Institute of Science and Technology (KAIST), 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea
3 TT Vision Technologies Sdn. Bhd., Plot 106, Hilir Sungai Keluang 5, Bayan Lepas Industrial Zone, Phase 4, Penang 11900, Malaysia
4 Faculty of Information Sciences and Technology, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia
5 School of Electrical and Electronics Engineering, Engineering Campus, Universiti Sains Malaysia, 14300 Nibong Tebal, Penang, Malaysia
6 Department of Computer Science and Biomedical Informatics, University of Thessaly, Papasiopoulou 2-4, Lamia, Greece
7 National Physical Laboratory, Hampton Road, Teddington, Middlesex, TW11 0LW, UK
8 School of Mechanical Engineering, Pusan National University, Busan 46241, Republic of Korea
9 School of Electrical and Information Engineering, Tianjin University, Tianjin 300072, China
10 Electrical and Computer Engineering Department, University of Massachusetts Lowell, One University Ave, Lowell, MA 01854, USA
11 Key Lab of Education Ministry for Power Machinery and Engineering, School of Mechanical Engineering, Shanghai Jiao Tong University, People's Republic of China
12 Department of Electrical and Electronic Engineering, The University of Hong Kong, People's Republic of China
13 State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou 310027, People's Republic of China
14 Institute of Measurement Technology, Johannes Kepler University Linz, Austria
15 School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907, USA
16 Institute of Thermodynamics and Fluid Mechanics, Technische Universität Ilmenau, Helmholtzring 1, 98693 Ilmenau, Germany
17 Department of Industrial Engineering, Alma Mater Studiorum Università di Bologna, Via Fontanelle 40, 47121 Forlì, Italy

* Author to whom any correspondence should be addressed.

Abstract

Imaging plays a vital role in enabling the visualization and analysis of objects and phenomena across various scientific disciplines and industrial sectors, spanning a wide range of length and time scales. This roadmap presents a critical overview of thirteen industrial imaging techniques, which are organized into three thematic sections according to their applicability to either solid, fluid, or both solid and fluid targets. The objectives of this roadmap are to highlight challenges and provide perspectives for next-generation imaging systems, which can serve as a guide to researchers and funding agencies in identifying new prospects. It has been found that the common challenges of imaging techniques have remained fundamentally unchanged over the years, including improving coverage, speed, resolution, accuracy, and robustness; however, there is an increasing reliance on data-driven or artificial intelligence (AI) approaches. Addressing these challenges necessitates easy access to high-performance computing resources. Notably, the trustworthiness and traceability of AI approaches should be enhanced through the sharing of benchmarking data, balancing with physics-based techniques, and the adoption of more Explainable Artificial Intelligence (XAI).
Keywords: signal processing, image processing, computer vision, machine learning, nondestructive testing and evaluation, imaging informatics, multimodal imaging

Contents

1. Introduction
2. Imaging techniques applicable to solid targets
   2.1 LIDAR Imaging
   2.2 Ultrasound Propagation Imaging
   2.3 Luminescence Imaging
   2.4 Endoscopy
   2.5 Confocal Microscopy
   2.6 Atomic Force Microscopy
3. Imaging techniques applicable to both solid and fluid targets
   3.1 Phosphor Thermometry
   3.2 Electrical Tomography
   3.3 Photoacoustic Imaging
   3.4 Holography
4. Imaging techniques applicable to fluid targets
   4.1 Multiphase Flow Process Imaging
   4.2 Particle Image Velocimetry
   4.3 Microscopic Particle Image Velocimetry

1. Introduction

A picture is worth a thousand words. Imaging technology has come a long way since the first camera obscura was invented in ancient Greece. Since then, imaging technology has been continually evolving, driven by human curiosity to better comprehend and record the world around us, and by the need to find solutions to the challenges we face. Imaging technology offers a plethora of techniques, each with its unique characteristics and capabilities. They range from generating visualizations directly through mechanical waves to reconstructing images indirectly via electromagnetic interference. While some techniques excel in producing detailed 2D still images, others bring scenes to life through animated videos. Additionally, the versatility of these methods is evident as some are optimized for examining vast macroscopic landscapes, whereas others excel in the intricate world of nanoscale phenomena. Their development has supported vivid, detailed visualization of targets and provided insights that would have been impossible to obtain just a few decades ago.

Among all imaging techniques, some have been identified as having significant potential to revolutionize our ways of life through improved healthcare, easier daily navigation, and access to high-quality products; yet some of them have received comparatively less attention than their substantial potential for industrial enhancement in the relevant sectors warrants. Against this backdrop, this roadmap presents a critical overview of thirteen industrially relevant techniques, aiming to highlight challenges and provide perspectives for upgrades that may profoundly alter the future landscape. Nevertheless, it is important to clarify that the selection is certainly not exhaustive, and there is no intention to suggest that those not included are of lesser potential or significance.

The remaining sections of this roadmap present the selected imaging techniques according to the physical forms of the targets to which they can be applied.
Specifically, with reference to Figure 1, Section 2 is dedicated to techniques that can image targets in solid form; Section 3 to techniques capable of imaging both solid and fluid targets; while Section 4 is reserved for techniques suitable for targets in fluid form only. Note that the term 'fluids' encompasses both liquids and gases. It is important to clarify that the classification is based on the general applicability of the techniques, without being confined to the examples provided or the specific application fields focused on in the respective subsections. For instance, while photoacoustic imaging (Subsection 3.3) is presented primarily in the context of biomedical imaging of biological tissues, which are effectively solids, this technique is classified under Section 3 due to its suitability for flow-field imaging as well. Similarly, electrical tomography (Subsection 3.2), which is presented leaning more towards fluid-flow applications, is classified under Section 3 because of its ability to image property inhomogeneity or damage in solids. To give a rough idea of where a technique stands relative to the others, the presentation order of the techniques within their respective sections is generally arranged by the descending physical size of the targets. It is therefore easy to see that the techniques can be broadly divided into two groups in each section: one for macroscopic targets and the other for microscopic targets, as depicted in Figure 1. Exceptions are the photoacoustic imaging (Subsection 3.3) and holography (Subsection 3.4) techniques, which are applicable to both macroscopic and microscopic targets.

Figure 1. Classification of selected imaging techniques based on their applicability to solid, fluid, or both solid and fluid targets.

Discussions on the thirteen imaging techniques, presented in the subsequent sections of this roadmap, reveal that the common challenges encountered by these techniques today are not significantly different from those faced decades ago; namely, the need to improve spatial and/or temporal resolution, enhance accuracy, increase imaging speed, and ensure robustness in highly variable environments. While delving into the technical specifics of each technique is beyond the scope of this Introduction, with detailed information available in the respective sections or cited references, it is noteworthy that these challenges and inherent possibilities converge at several key junctures. For instance, many techniques are still limited to two-dimensional visualizations at specific moments, despite the fact that the targets exist in three-dimensional physical space and evolve over time. This limitation in coverage is particularly apparent for techniques like multiphase flow process imaging (Subsection 4.1), which restricts the visualization to phase fraction distribution at specific locations within a larger system. Addressing this constraint could significantly enhance our understanding of the targets, thereby unlocking new possibilities.
For example, in the context of multiphase flow process imaging, achieving comprehensive coverage in both spatial and temporal dimensions could enable precise automatic control of complex systems.

Secondly, imaging speed restricts the overall throughput and temporal resolution of many imaging techniques. For instance, phosphorescence activities at high temperatures can occur within microseconds or nanoseconds, necessitating frame rates for phosphor thermometry (Subsection 3.1) in the MHz or GHz range; yet the highest frame rate currently stands at approximately 100 kHz. Additionally, spatial point-scanning methods, such as photoacoustic imaging (Subsection 3.3) and ultrasound propagation imaging (Subsection 2.2), are typically limited in terms of throughput. This limitation is especially pronounced in ultrasound propagation imaging, where the targets, such as thin-walled engineering structures like airplanes, require larger imaging regions of interest (ROI) compared to the smaller biomedical targets used in photoacoustic imaging. Overcoming the speed limitation by developing faster hardware or implementing compressive sensing for reduced measurement points would enable imaging of more targets or larger ROIs in a shorter time and enhance our ability to understand and address more dynamic problems.

Thirdly, as highlighted in some subsections, imaging accuracy, robustness, and contextual understanding can be significantly improved by fusing complementary information from more than one set of data, each acquired using different measurement modalities or parametric settings. This approach is applicable even to techniques beyond the thirteen selected ones. For example, distortion in the imaging plane of ultrasound propagation imaging (Subsection 2.2) can be reduced or corrected by incorporating depth information from LIDAR imaging (Subsection 2.1); a comprehensive temperature-velocity history of a flow field can be obtained by merging temperature data from phosphor thermometry (Subsection 3.1) with velocity data from particle image velocimetry (Subsection 4.2); imaging accuracy can be enhanced by integrating electrical tomography (Subsection 3.2) data with data from other techniques such as radiometric and ultrasonic imaging; and contextual understanding of scanning electron microscopy can be improved by amalgamating topological information from atomic force microscopy (Subsection 2.6). Potential fusions, applicable both to the selected techniques and to other techniques not covered in this roadmap, remain to be discovered. This includes possibilities within the existing scope of example applications and in applications beyond those outlined in the subsections. For instance, although endoscopy is described in the context of the biomedical industry in Subsection 2.4, it is also commonly employed for the inspection of airplane turbine engines and industrial pipelines. In these broader applications, combining endoscopy (Subsection 2.4), which provides visual information, with techniques like luminescence imaging (Subsection 2.3), which can provide information regarding material degradation, might enhance understanding of the target through an information-rich, fused image.
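As a concrete, if simplified, illustration of such fusion, the following sketch (ours, not a method proposed in any subsection of this roadmap) combines two co-registered images of the same region by per-pixel weighted averaging after intensity normalization. The modality names, weights, and synthetic data are purely illustrative assumptions; real pipelines would add registration, calibration, and modality-specific preprocessing.

    import numpy as np

    def normalize(img):
        # Rescale to [0, 1] so modalities with different dynamic ranges are comparable.
        img = img.astype(float)
        span = img.max() - img.min()
        return (img - img.min()) / span if span > 0 else np.zeros_like(img)

    def fuse(img_a, img_b, w_a=0.5):
        # Per-pixel weighted fusion of two co-registered images; w_a weights modality A.
        assert img_a.shape == img_b.shape, "fusion assumes co-registered images"
        return w_a * normalize(img_a) + (1.0 - w_a) * normalize(img_b)

    # Synthetic stand-ins: modality A (e.g. an endoscopic visual frame) and modality B
    # (e.g. a luminescence map highlighting a degraded stripe). Purely illustrative.
    rng = np.random.default_rng(0)
    visual = rng.normal(0.5, 0.1, (64, 64))
    luminescence = np.zeros((64, 64))
    luminescence[30:34, 10:50] = 1.0
    fused = fuse(visual, luminescence, w_a=0.6)
    print(fused.shape, float(fused.min()), float(fused.max()))

The weighted-combination idea shown here is the common core of pixel-level fusion; more elaborate schemes differ mainly in how the weights are chosen and how the modalities are aligned.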
Fourthly, artificial intelligence (AI) technologies, particularly deep learning (DL) and machine learning (ML), are revolutionizing imaging techniques through improved image reconstruction, interpretation, and measurement accuracy. However, their trustworthiness is often compromised by the notorious scarcity of benchmarking data, which is crucial for training and validating AI models, as well as for fair comparisons and quantification of advancements. Therefore, the sharing of diverse and well-annotated benchmarking datasets is essential, as exemplified commendably by authors such as Barnkob et al (2021; cited below). Moreover, while AI algorithms vary in transparency, DL networks are particularly noted as 'black-box' algorithms, with their input-output relationships being largely opaque. Thus, it is imperative for all stakeholders, particularly imaging technologists, researchers, and the broader AI community, to enhance transparency and traceability by directing more efforts toward Explainable Artificial Intelligence (XAI). Last but not least, the significant increase in data-driven approaches due to AI should be complemented by physics-based approaches, ensuring that they support one another for a holistic understanding.
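To make the notion of explainability more tangible, the sketch below implements occlusion sensitivity, one of the simplest model-agnostic XAI probes: patches of the input are masked and the drop in the model's score is recorded as a saliency map. The toy model and all names are our own illustrative assumptions, not a method proposed by the roadmap authors.

    import numpy as np

    def occlusion_map(score, image, patch=16):
        # Model-agnostic occlusion sensitivity: mask one patch at a time and record
        # how much the model's score drops; large drops mark regions the model uses.
        base = score(image)
        h, w = image.shape
        heat = np.zeros((h // patch, w // patch))
        for i in range(0, h - h % patch, patch):
            for j in range(0, w - w % patch, patch):
                masked = image.copy()
                masked[i:i + patch, j:j + patch] = image.mean()  # neutral fill
                heat[i // patch, j // patch] = base - score(masked)
        return heat

    # Toy "model": scores an image by the mean of one region it implicitly relies on;
    # the saliency map should light up only over that region.
    toy_score = lambda img: img[16:32, 16:32].mean()
    img = np.random.default_rng(1).random((64, 64))
    print(np.round(occlusion_map(toy_score, img), 3))

Such maps do not make a black-box model transparent, but they expose which input regions drive its output, which is one practical step toward the traceability argued for above.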
To fundamentally address the previously delineated challenges and fully leverage the potential of imaging techniques, access to high-performance computing resources, such as parallel graphics processing units (GPUs) and tensor processing units (TPUs), has become more of a requirement than an option. Consequently, the development of scalable computing architectures and next-generation processing technologies should progress concurrently with advancements in imaging techniques. Furthermore, revisiting and learning from more mature techniques by defining performance metrics, ensuring traceability, and establishing standardization is beneficial. The discussions on traceability in Subsection 2.6 'Atomic Force Microscopy' and on measurement uncertainty elsewhere in this roadmap serve as good examples. Only when these aspects are defined and established can a technique be considered a measurement tool rather than merely a visualization tool, thereby paving the way for it to be widely embraced by industries.

In conclusion, it is hoped that this roadmap, constituted by the overall discussion presented in this Introduction and the status and challenges of all techniques detailed in the respective subsections, will catalyse cross-disciplinary collaborations and dismantle existing barriers. The aim is to foster a unified approach that synergizes diverse expertise and state-of-the-art technological resources for the comprehensive advancement of imaging techniques, thereby aiding the continuous and sustainable progress of global society.

References

Ma S, Yang S and Xing D 2010 Photoacoustic imaging velocimetry for flow-field measurement Opt. Express 18 9991–10000, https://doi.org/10.1364/OE.18.009991

Zhao S, Miao Y, Chai R, Zhao J, Bai Y, Wei Z and Ren S 2022 High-precision electrical impedance tomography for electrical conductivity of metallic materials Advances in Materials Science and Engineering 2022 e3611691, https://doi.org/10.1155/2022/3611691

Barnkob R, Cierpka C, Chen M, Sachs S, Mäder P and Rossi M 2021 Defocus particle tracking: a comparison of methods based on model functions, cross-correlation, and neural networks Meas. Sci. Technol. 32 094011, https://doi.org/10.1088/1361-6501/abfef6

Barredo Arrieta A, Díaz-Rodríguez N, Del Ser J, Bennetot A, Tabik S, Barbado A, Garcia S, Gil-Lopez S, Molina D, Benjamins R, Chatila R and Herrera F 2020 Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI Information Fusion 58 82–115, https://doi.org/10.1016/j.inffus.2019.12.012

2.1. LIDAR Imaging

Young-Jin Kim, Daehee Kim
Korea Advanced Institute of Science and Technology (KAIST), South Korea
[[email protected], [email protected]]

Status

LIDAR (LIght Detection and Ranging) is one of the most discussed 3D machine vision technologies, owing to growing demand in autonomous vehicles [1-3], object detection in smart manufacturing [4-6], human-machine interaction for collaborative robots [6-9], and automated guided vehicles in smart factories, all of which commonly require machine intelligence [10,11]. LIDAR has also been a prerequisite in other broad applications: geodetic surveying, formation flying of smart satellites, environmental monitoring of chemical leakages, and the understanding of global warming effects. General machine vision technologies are split into 2D and 3D imaging; LIDAR technologies currently span a large part of 3D machine vision, covering passive-type stereovision, active-type optical triangulation, and active-type interferometry, as shown in Fig. 1. The principle of LIDAR is strongly dependent on the required measurement range and precision. In most LIDAR applications requiring mm-level precision, fast and simple geometrical-optics and time-of-flight detection principles are readily applied. However, when the target precision approaches the sub-millimetre or even the few-micrometre level, the time-of-flight detection resolution, or optical diffraction arising from the wave nature of light, starts to limit the attainable precision. Therefore, another wave property of light, optical interference, should be introduced to realize sub-micrometre-level precision.

Figure 1. LIDAR as a 3D machine vision technology. LIDARs currently span a large part of 3D machine vision, covering passive stereovision, active-type optical triangulation, and active-type interferometry. LIDAR image courtesy USGS. Machine vision image courtesy Cognex Corp.
The basic principle of LIDAR was first demonstrated in 1961 [12], shortly after the invention of the laser; 3D imaging LIDARs determine the absolute distances from the LIDAR to the target objects by measuring the time-of-flight of the reflected or back-scattered photons from the objects and multiplying the time-of-flight (t) by the speed of light (c = 299,792,458 m/s), one of the fundamental physical constants, as shown in Fig. 2(a) [13, 14]. The time-of-flight can be detected using ultrashort light pulses with durations of nanoseconds (10^-9 s), picoseconds (10^-12 s), or even femtoseconds (10^-15 s). State-of-the-art photodetectors and event timers support picosecond-level time resolution, which converts into 0.15 mm-level precision considering the double-path configuration.

Figure 2. Schematics of ToF-based LIDAR 3D imaging techniques. The figure represents the system configuration, signal-processing procedure, and distance-calculation method of (a) direct ToF, (b) FMCW (Frequency-Modulated Continuous Wave) ToF, and (c) AMCW (Amplitude-Modulated Continuous Wave) ToF LIDAR, respectively.

For higher, sub-millimetre-level precision, optical interference can be introduced, as demonstrated in FMCW (Frequency-Modulated Continuous Wave) LIDAR [16, 17] and shown in Fig. 2(b). A fast wavelength-swept laser is used as the light source in the FMCW LIDAR system. The laser beam is split into two parts: one works as the measurement beam that is transmitted to the target (Tx) and returns from the target (Rx), while the other works as the reference (local oscillator, LO) without any travel. These two beams are recombined so as to form an optical beat frequency, f_beat, at the high-speed photodetector. Because the time-of-flight to the target makes f_Rx and f_LO different, FMCW can detect f_beat as the difference between f_Rx and f_LO; f_beat can therefore be converted to the time-of-flight t for the distance measurement. Because FMCW utilizes the coherent nature of the laser beam, the measurement range and attainable precision are strongly dependent on the spectral linewidth and signal-to-noise ratio (SNR) of the f_beat signal, as shown in Fig. 2(b). The spectral linewidth and SNR are governed by the degree of coherence of the swept laser in use and by other system background noise.

For time-of-flight detection with lower timing jitter, high-frequency amplitude modulation can be utilized, using internal injection-current modulation or external electro-optic modulators, instead of short laser pulses with a fast photodetector, as shown in Fig. 2(c). Periodic sinusoidal waves (with a modulation frequency of f_AM) can be generated with a simple amplitude modulator in front of the laser; the phase difference (φ) between the reflected or back-scattered beam (Rx) and the local reference beam (Tx) can then be measured at the phase meter. The resulting phase difference can be converted to the time-of-flight t for the distance measurement. In AMCW (Amplitude-Modulated Continuous Wave) LIDAR, however, phase ambiguity hinders the absolute determination of target distances; therefore, some objects with complicated, large-stepped surface structures or fast motions cannot be measured.
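The three range equations implied above can be summarized in a short numeric sketch. It is a minimal illustration assuming ideal, noise-free detection; the sweep and modulation parameters are arbitrary example values, not a specific LIDAR design.

    import math

    C = 299_792_458.0  # speed of light (m/s)

    def distance_direct_tof(t):
        # Direct ToF: d = c * t / 2 (double path), so 1 ps of timing resolution
        # corresponds to about 0.15 mm of range precision, as stated above.
        return C * t / 2.0

    def distance_fmcw(f_beat, sweep_bandwidth, sweep_period):
        # FMCW: a linear chirp of slope k = B/T, delayed by tau = 2d/c, beats
        # against the local oscillator at f_beat = k * tau, hence d = c * f_beat / (2k).
        k = sweep_bandwidth / sweep_period
        return C * f_beat / (2.0 * k)

    def distance_amcw(phase, f_am):
        # AMCW: the envelope phase shift phi = 2*pi*f_am*(2d/c) gives
        # d = c * phi / (4*pi*f_am), unambiguous only within c / (2*f_am),
        # which is the phase-ambiguity limit noted above.
        return C * phase / (4.0 * math.pi * f_am)

    print(distance_direct_tof(1e-12) * 1e3, "mm")   # ~0.15 mm per picosecond
    print(distance_fmcw(2e6, 1e12, 1e-3), "m")      # example: 1 THz sweep in 1 ms
    print(distance_amcw(math.pi / 2, 100e6), "m; ambiguity range",
          C / (2 * 100e6), "m")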
In AMCW (Amplitude-Modulated Continuous Wave) LIDAR, however, the phase 60 AUTHOR SUBMITTED MANUSCRIPT - MST-120593 Page 10 of 76 1 2 3 ambiguity hinders the absolute determination of the target distances; therefore, some objects having 4 5 complicated large-stepped surface structures or having fast motions cannot be measured. 6 7 Current and Future Challenges 8 The measurement range, precision, and speed are the three key performances of 3D LIDAR pt 9 10 imaging technologies. Therefore, we will discuss four critical challenges in these three key 11 performances for state-of-the-art LIDAR systems based on direct TOF, FMCW TOF and AMCW TOF 12 detections. The first challenge is in the beam scanning mechanism. All LIDAR systems are based on the 13 TOF detection of a single point at a time; therefore, a beam scanning mechanism should be involved cri 14 15 for reconstructing the 3D information of the object. Wide-angular mechanical scanning with rotating 16 mirrors, small-angular mechanical scanning with MEMS (Micro-electro-mechanical system) mirrors 17 and OPA (Optical Phased Array) scanners are the available scanning mechanisms. An ideal 18 19 scanning system should have a wide scanning range, high angular resolution, high scanning speed, high 20 reliability, and low production cost at the same time, which cannot be realized with a single system. us 21 The second challenge is in multi-path interference (MPI). Because the laser beam has a physical beam 22 23 size and the beam is later expanded during the propagation, the returned beam can come from 24 different sections of the target objects or even from intermediate obstacles. This MPI effect should be 25 suppressed by post-processing, whose algorithms are different for different LIDAR principles. The third 26 27 28 29 an challenge is in 3D object detection. The LIDAR provides the object information in the form of 3D point clouds only. Therefore, the 3D point data should be processed for easier object detection with background noise removal. However, the distance information may not be enough for object detection 30 because one cannot tell whether the target object is a human being or a static obstacle. The fourth 31 32 challenge lies in weather and environmental sensitivity. The water vapours can absorb, scatter, or dM 33 reflect the laser beam in different directions. Therefore, the LIDARs can be strongly influenced by wet 34 targets, heavy rain or heavy fog. For the more ubiquitous application of LIDARs, the above-mentioned 35 four challenges should be overcome. 36 37 38 Advances in Science and Technology to Meet Challenges 39 Regarding the four technological challenges of LIDAR, there have been a series of important 40 41 advances in the last two decades. Regarding the first beam scanning challenge, the advanced OPA 42 concepts have been realized with the aid of semiconductor lithography processes; one concept 43 pte realized the active 2D phase-shifter array by combining the MEMS actuator layer with sub-wavelength 44 grating layer, and the other concept by realizing the on-chip optical-waveguide OPA. Regarding the 45 46 second challenge on MPI and the third challenge of 3D object detection, the introduction of 47 reinforcement learning or other artificial intelligence technologies could resolve the key issues in 48 distinguishing the unwanted multi-path information from the target information or in recognizing the 49 50 3D objects out of the numerous point clouds. 
Advances in Science and Technology to Meet Challenges

Regarding the four technological challenges of LIDAR, there have been a series of important advances in the last two decades. Regarding the first challenge of beam scanning, advanced OPA concepts have been realized with the aid of semiconductor lithography processes: one concept realizes an active 2D phase-shifter array by combining a MEMS actuator layer with a sub-wavelength grating layer, and another realizes an on-chip optical-waveguide OPA. Regarding the second challenge of MPI and the third challenge of 3D object detection, the introduction of reinforcement learning or other artificial intelligence technologies could resolve the key issues of distinguishing unwanted multi-path information from the target information and of recognizing 3D objects among the numerous point clouds. Regarding the fourth challenge of weather/environmental sensitivity, other types of sensors that are immune to environmental conditions can be used in parallel for making appropriate decisions; GPS, RADARs, ultrasonic sensors, and infrared sensors can serve this purpose. Partial data loss in LIDAR due to environmental issues can also be reconstructed with the aid of machine learning technology.

Concluding Remarks

LIDARs have received extreme attention in the last decade owing to the solid technological trends in machine intelligence for autonomous vehicles, Industry 4.0, smart factories, collaborative robots, and artificial intelligence. Accordingly, unprecedented research efforts and investments have been directed toward system hardware, software, computing power, and the search for business opportunities. These have resulted in consistent performance improvements in 3D LIDAR imaging, with real market opportunities. Many LIDAR market leaders are now waiting for the tipping point at which real business opportunities open. The three key requirements demanded by the business model, namely measurement range, precision, and speed, will determine who becomes the first market leader; other groups will then try to catch up with advanced concepts. Through this market competition, we will be able to enjoy the benefits of LIDAR technologies in the coming decade.

Acknowledgements

This work was supported by the National Research Foundation of Korea (2020R1A2C210233813, 2021R1A4A1031660) and the KAIST UP program.

References

1. Santiago R. and Maria B.-G., 'An overview of LIDAR imaging systems for Autonomous Vehicles', Applied Sciences, vol. 9, no. 19, MDPI, 4093, Sep. 2019. doi: 10.3390/app9194093.
2. Joshua R., Julian T., Yoann A., Stephen M., and Vivek K. G., 'Advances in single-photon Lidar for autonomous vehicles: Working Principles, Challenges, and Recent Advances', IEEE Signal Processing Magazine, vol. 37, no. 4, IEEE, pp. 62-71, June 2020. doi: 10.1109/MSP.2020.2983772.
3. Liang W., Yihuan Z., and Jun W., 'Map-based localization method for autonomous vehicles using 3D-LIDAR', IFAC-PapersOnLine, vol. 50, no. 1, Elsevier, pp. 276-281, Oct. 2017. doi: 10.1016/j.ifacol.2017.08.046.
4. Niclas V., Ozan U., Ke L., Luc V. G., and Dengxin D., 'End-to-End optimization of LiDAR beam configuration for 3D object detection and localization', IEEE Robotics and Automation Letters, vol. 7, no. 2, IEEE, pp. 2242-2249, Jan. 2022. doi: 10.1109/LRA.2022.3142738.
5. Ulrich W. and Peter B., 'Plant detection and mapping for agricultural robots using a 3D LIDAR sensor', Robotics and Autonomous Systems, vol. 59, no. 5, Elsevier, pp. 265-273, May 2011. doi: 10.1016/j.robot.2011.02.011.
6. Shuting W., Liquan J., Jie M., Yuanlong X., and Han D., 'Training for smart manufacturing using a mobile robot-based production line', Frontiers of Mechanical Engineering, vol. 16, Springer, pp. 249-270, Apr. 2021. doi: 10.1007/s11465-020-0625-z.
7. Zheng L., Fu Z., and Xiaoping H., 'Low-Cost Retina-Like Robotic Lidar Based on Incommensurable Scanning', IEEE/ASME Transactions on Mechatronics, vol. 27, no. 1, IEEE, Feb. 2021. doi: 10.1109/TMECH.2021.3058173.
8. Zhi Y., Li S., Tom D., and Nicola B., 'Multi-sensor Online Transfer Learning for 3D LiDAR-Based Human Detection with a Mobile Robot', 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Jan. 2019. doi: 10.1109/IROS.2018.8593899.
9. Kai-Tai S., Chien-Wei C., Li-Ren K., Yu-Xuan S., and Ching-Hao M., 'Autonomous Docking in a Human-Robot Collaborative Environment of Automated Guided Vehicles', 2020 International Automatic Control Conference, IEEE, Dec. 2020. doi: 10.1109/CACS50047.2020.9289713.
10. Di S., Guang-Mao T., and Jiaqi L., 'Real-time localization measure and perception detection using multi-sensor fusion for Automated Guided Vehicles', 2021 40th Chinese Control Conference, IEEE, Oct. 2021. doi: 10.23919/CCC52363.2021.9550235.
11. Seigo I., Shigeyoshi H., Mitsuhiko O., Hiroyuki M., and Masaru O., 'Small imaging depth LIDAR and DCNN-based localization for automated guided vehicle', Sensors, vol. 18, no. 1, MDPI, Jan. 2018. doi: 10.3390/s18010177.
12. 'New Radar System', Odessa American, Feb. 1961.
13. Joohyung L., Young-Jin K., Keunwoo L., Sanghyun L., and Seung-Woo K., 'Time-of-flight measurement with femtosecond light pulses', Nature Photonics, vol. 4, Springer, pp. 716-720, Aug. 2010. doi: 10.1038/nphoton.2010.175.
14. Robert A. Lamb, 'A technology review of time-of-flight photon counting for advanced remote sensing', 2010 SPIE Defense, Security, and Sensing, vol. 7681, SPIE, 768107, Apr. 2010. doi: 10.1117/12.852138.
15. Ivan Prochazka, 'Semiconducting single photon detectors: the state of the art', Physica Status Solidi (c), Wiley, Mar. 2005. doi: 10.1002/pssc.200460834.
16. F. R. Giorgetta, E. Baumann, K. Knabe, I. Coddington, and N. R. Newbury, 'High-resolution Ranging of a Diffuse Target at Sub-Millisecond Intervals with a Calibrated FMCW Lidar', CLEO 2012, IEEE, pp. 1-2, May 2012. doi: 10.1364/CLEO_SI.2012.CF3C.2.
17. Fumin Z., Lingping Y., and Xinghua Q., 'Simultaneous measurements of velocity and distance via a dual-path FMCW lidar system', Optics Communications, vol. 474, no. 1, Elsevier, Nov. 2020. doi: 10.1016/j.optcom.2020.126066.
18. John P. Godbaz, Adrian A. Dorrington, and Michael J. Cree, 'Understanding and Ameliorating Mixed Pixels and Multipath Interference in AMCW Lidar', TOF Range-Imaging Cameras, Springer, pp. 91-116, Jan. 2013. doi: 10.1007/978-3-642-27523-4_5.
19. Dingkang W., Lenworth T., Sanjeev K., Yingtao D., and Huikai X., 'A Low-Voltage, Low-Current, Digital-Driven MEMS Mirror for Low-Power LiDAR', IEEE Sensors Letters, vol. 4, no. 8, IEEE, 5000604, July 2020. doi: 10.1109/LSENS.2020.3006813.
20. Christopher V. Poulton, Peter Russo, Erman Timurdogan, Michael Whitson, Matthew J. Byrd, Ehsan Hosseini, Benjamin Moss, Zhan Su, Diedrik Vermeulen, and Michael R. Watts, 'High-Performance Integrated Optical Phased Arrays for Chip-Scale Beam Steering and LiDAR', CLEO 2018, OSA, May 2018. doi: 10.1364/CLEO_AT.2018.ATu3R.2.

2.2. Ultrasound Propagation Imaging

Chen Ciang Chia1 and Jung-Ryul Lee2
Dept. Aerospace Eng., KAIST, Daejeon, Republic of Korea
[1 [email protected]; 2 [email protected]]

Status

Since the magnetostriction excitation of elastic waves by James Prescott Joule in 1847 [1], ultrasound has rapidly become a major modality for nondestructive testing and evaluation (NDT&E) of aero-mechanical structures. The continuous use and adoption of thin-walled designs for these safety-critical structures has led to active research on ultrasound propagation imaging (UPI) of thin-walled structures using guided waves and bulk waves (see Figure 1 and [2] for the differences). In the recent decade, we have witnessed the imaging of large (about 1 m2) regions of interest (ROI) at high speed (4 m2 per minute or higher with compressive sensing) thanks to the availability of galvanometric laser scanners. The great applicability of UPI is demonstrated by the successful visualization of all common flaw types associated with conventional materials and with the hybrid use of new functional materials, including delamination and crushed honeycomb core. Research focus has also shifted from geometrically simple specimens, such as isotropic metallic plates, to in-situ inspection of full-scale structures, such as in-service airplanes and wind turbines. Aside from flaw visualization, UPI also has great potential for material evaluation and wave-physics studies. It is used in topological acoustics to visualize wave propagation in metamaterials with a specific acoustic property, such as directionality. Similar studies for 'natural' materials, i.e., materials without repetitive features like metamaterials, have also been validated using UPI, for example for acoustic superlensing, trapping, and cloaking.

Despite significant advances, material degradation that does not involve an abrupt material discontinuity, such as the weakening of interfacial adhesion, the onset of material fatigue, and incipient thermal degradation, remains an open problem for UPI. With the widespread use of UPI in research laboratories worldwide, improving UPI for greater industrial acceptance would logically be the next step. Potential targets include silicon wafers at one end of the physical scale, with aircraft, petrochemical tanks, and bridges at the other end. More focus could be directed toward large-scale structures, because the micro/nanomanufacturing environment is laboratory-like. Along this line, to minimize the system footprint and allow flexible operation without any scanning scaffolding, an ideal UPI system should be an angular-scanning pump-probe system. With a bolder imagination, it could ultimately be a mobile cobot-borne or airborne system equipped with a high-speed camera for ultrasound measurement. Diverse challenges originating from different aspects of UPI can be identified, and those related to hardware are addressed in the next section.
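The speed gains quoted above for scanning with reduced measurement points can be illustrated with a toy reconstruction: sample a synthetic wavefield snapshot at a random 20% of grid points and interpolate the rest. Plain cubic interpolation is used here only as a stand-in for a genuine sparsity-based compressive-sensing solver, and the field and sampling ratio are our own assumptions.

    import numpy as np
    from scipy.interpolate import griddata

    # Dense reference snapshot of a synthetic radial wavefield on a 100 x 100 grid.
    x, y = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
    field = np.sin(2 * np.pi * 8 * np.hypot(x - 0.5, y - 0.5))

    # Measure only 20% of the grid points, as a reduced-measurement-point scan would.
    rng = np.random.default_rng(3)
    mask = rng.random(field.shape) < 0.2
    samples = np.column_stack([x[mask], y[mask]])

    # Reconstruct the full field from the sparse samples; cubic interpolation stands
    # in for a proper compressive-sensing reconstruction exploiting sparsity.
    recon = griddata(samples, field[mask], (x, y), method="cubic")
    print("mean abs error:", float(np.nanmean(np.abs(recon - field))))

Whatever the reconstruction method, the scanning-time saving is directly proportional to the fraction of grid points that can be skipped, which is what makes such approaches attractive for large ROIs.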
Figure 1. Measurement principle of ultrasound propagation imaging (UPI) using (a) guided waves and (b) bulk waves. Application (a) shows hidden wall-thinning damage in an aluminium plate (reprinted, copyright (2020), with permission from Elsevier). Application (b) shows hidden corrosion in a steel plate (reprinted, copyright (2022), with permission from Techno-Press).

Current and Future Challenges

Currently, there are many variants of UPI, distinguishable according to the use of guided or bulk waves; angular, translational, or robotic scanning; and functionality for flaw detection, material evaluation, or wave study. Their common challenge is the limited signal-to-noise ratio (SNR) when measuring surface displacement (or velocity) from attenuative materials, such as thick composites and additively manufactured parts. Even with measurement averaging, getting a clear image for such materials above 10 mm thickness is often difficult. For Lamb-wave-based UPI, frequency optimization is another challenge affecting both data acquisition and result processing. Theoretical or numerical analysis can be done if the material properties of the specimen are known; otherwise, it becomes an educated-guessing process. Furthermore, when inspecting geometrically complex structures, it is difficult to effectively extract useful information from the complicated interference of multimodal and dispersive Lamb waves. For angular-scanning UPI, large and complex three-dimensional (3D) imaging planes may suffer from pincushion distortion. In addition, if the measurement is performed by angular scanning of a laser Doppler vibrometer (LDV), the signal strength can become inconsistent or be lost with increasing laser incident angle. Increasing the stand-off distance (SoD) could reduce the incident angle, but the advantage gained would be offset by the reduction in numerical aperture.
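The slow payoff of averaging noted above follows from the fact that coherent averaging of N repeated A-scans improves amplitude SNR only by sqrt(N). The sketch below demonstrates this scaling on a synthetic, heavily attenuated echo; the waveform and noise level are illustrative assumptions, not data from any UPI system.

    import numpy as np

    rng = np.random.default_rng(4)
    t = np.linspace(0, 10e-6, 2000)                      # 10 us record
    echo = 0.05 * np.sin(2 * np.pi * 1e6 * t) * np.exp(-((t - 4e-6) / 1e-6) ** 2)

    def snr_after_averaging(sig, noise_sigma, n_avg):
        # Coherent averaging: the signal adds linearly while noise grows as sqrt(n),
        # so amplitude SNR improves only by sqrt(n_avg); 256 averages buy ~24 dB
        # in power SNR but cost 256x the measurement time per grid point.
        shots = sig + rng.normal(0.0, noise_sigma, (n_avg, sig.size))
        avg = shots.mean(axis=0)
        noise_power = avg[t < 1e-6].var()                # pre-arrival window: noise only
        return 10 * np.log10(np.abs(sig).max() ** 2 / noise_power)

    for n in (1, 16, 256):
        print(n, "averages ->", round(snr_after_averaging(echo, 0.05, n), 1), "dB")

Because each extra 6 dB of SNR quadruples the scan time at a fixed laser power, better transducers and excitation schemes, rather than brute-force averaging, are the realistic route for thick, attenuative materials.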
We envision that UPI will advance according to the timeline given in Figure 2 and, in the near future, will let us visualize flaws and the distribution of material properties in large and complex structures by synchronously scanning both the pump and probe lasers using an angular scanner in an autonomous fashion. The above-stated challenges of SNR, image-plane distortion, and inconsistent LDV signal strength remain for this future UPI, and some of the challenges are even greater. To image material properties, the distortion of the scanner must be reduced, and a confocal state or a micron-scale displacement between the pump and probe laser foci must be maintained, at least within a few metres of SoD, regardless of the angular displacement. To inspect 3D curved or multifaceted structures, keeping a normal incident angle for the lasers is critical. Even if the ultrasound were successfully measured, a subsequent challenge is to geometrically morph the imaging plane according to the specimen surfaces so that an intuitive interpretation of the result is possible. Ultimately, the UPI system will be mobile cobot-borne or airborne by unmanned aerial vehicles. Foreseeable challenges include the miniaturization of system packaging suitable for deployment in the industrial environment, considering confined spaces, extreme temperatures, vibration, dust, precipitation, etc. Measurement must also be completed at photographic speed, perhaps without any time-consuming grid-point scanning.

Figure 2. Roadmap of the key developments related to UPI in the past and in the near future. As recoverable from the timeline: discovery of the Doppler effect (1842); discovery of the magnetostriction effect (1847); discovery of the piezoelectric effect (1880); establishment of the Rayleigh wave equation (1885); establishment of the Lamb wave equation (1917); ultrasound inspection of solids (1929); first laser (1958); pulsed-light generation of waves in solids (1961); laser Doppler vibrometry (1964); visualization of ultrasound wavefields and adoption of the galvanometric scanner (1972); adoption of the robotic arm for scanning (1985); laser-based UPI inspection of large structures (2004); autonomous navigation of robots and resurgence of artificial intelligence (2010); machine-learning-assisted UPI analysis (2018); UPI inspection of large structures with complex 3D surfaces (2025?); autonomous UPI inspection cobots (2030?).

Advances in Science and Technology to Meet Challenges

Increasing the SNR is currently performed using continuous steady-state excitation and by optimizing the pump laser's spatial and/or temporal profiles. These technologies come with some limitations; thus, there is still a craving for better transducers. More research to improve the SNR, for example by fully exploiting existing or new functional materials or by developing a laser with an arbitrarily tunable wavelength for optimum depth penetration into specimens, is very much welcomed. For frequency optimization without material information, spectroscopy imaging could be done during result processing, provided that the ultrasound is measured using a broadband transducer. Otherwise, with appropriate material information, it is good to optimize the imaging frequency over a narrow band, or, perhaps better still, at one selected ultrasound mode for easier image processing. Note that the frequency linearity of the transducer and related equipment is important, because the sensitivity of UPI could be increased for material evaluation through nonlinear ultrasound, similar to the approach used for fatigue evaluation.

Distortion of the imaging plane due to large angular displacement in the scanner can be corrected through better control. Distortion caused by 3D specimen surfaces must be corrected using depth-of-field information, which could be measured using a depth sensor such as a LIDAR or a surface profiler. It appears that inconsistency or loss of signal at large laser incident angles will remain the biggest challenge. Before a solution emerges, it is necessary to demarcate the 3D surfaces into multiple smaller ROIs according to the limitation of the laser incident angle and then perform imaging for each ROI after matching the normal angle of the imaging plane with that of the ROI. An algorithm is needed to collect the results from all ROIs and 'stitch' them into the final result (a minimal sketch follows this paragraph). Considering the large number of ROIs for large 3D structures, delegating the laborious imaging work to a mobile cobot-borne or airborne UPI system must be considered. To realize this, we need at least to miniaturize the system components, such as the LDV and pump laser. A high-precision position and attitude reference system must be established so that the UPI platform has sufficient environmental awareness to perform ROI demarcation and subsequent imaging. For the reduction of imaging time, a multi-sensing-point LDV or a high-speed camera could be used to measure the ultrasound wavefield, but the sampling/frame rate, spatial resolution, sensitivity, and vibration immunity must be significantly improved.
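A minimal sketch of the ROI-stitching step might look as follows, assuming each ROI image has already been distortion-corrected and its offset in a global coordinate frame is known; overlapping pixels are simply averaged. All names are hypothetical, and a real system would add registration refinement and seam blending.

    import numpy as np

    def stitch(tiles, canvas_shape):
        # Place each ROI image at its known (row, col) offset in a global canvas and
        # average wherever tiles overlap; assumes tiles are already undistorted.
        acc = np.zeros(canvas_shape)
        cnt = np.zeros(canvas_shape)
        for img, (r, c) in tiles:
            h, w = img.shape
            acc[r:r + h, c:c + w] += img
            cnt[r:r + h, c:c + w] += 1
        out = np.full(canvas_shape, np.nan)              # NaN marks unimaged pixels
        covered = cnt > 0
        out[covered] = acc[covered] / cnt[covered]
        return out

    rng = np.random.default_rng(5)
    roi_a, roi_b = rng.random((50, 50)), rng.random((50, 50))
    mosaic = stitch([(roi_a, (0, 0)), (roi_b, (0, 40))], (50, 90))  # 10-px overlap
    print(mosaic.shape, int(np.isnan(mosaic).sum()))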
Concluding Remarks

UPI is perhaps the most powerful ultrasonic imaging tool available for flaw visualization and material evaluation of plate-like specimens in the laboratory. The main challenges in getting UPI into the field for imaging large-scale, thick-walled 3D structures have been identified: increasing the measurement SNR, ensuring the frequency linearity of the measurement system, and achieving mode selectivity of the transducers. Perhaps the game-changing technology, however, will be the realization of remote ultrasound measurement at large incident angles, or wireless ultrasonic propagation sensing. These advancements, together with growth in related fields including robotics, signal processing, and deep learning, would prime the evolution of next-generation autonomous UPI.

Acknowledgements

This research was supported by the Brain Pool program funded by the Ministry of Science and ICT through the National Research Foundation of Korea (2021H1D3A2A01100011) and by the Defense Acquisition Program Administration and the Agency for Defense Development of the Republic of Korea under grant no. 99-402-805-032 (The integration of wireless sensing into aerial vehicles for structural damage identification).

References

J. P. Joule, 'XVII. On the effects of magnetism upon the dimensions of iron and steel bars', Lond. Edinb. Dublin Philos. Mag. J. Sci., vol. 30, no. 199, pp. 76–87, Feb. 1847, doi: 10.1080/14786444708645656.

C. C. Chia, S. Y. Lee, M. Y. Harmin, Y. Choi, and J.-R. Lee, 'Guided Ultrasonic Waves Propagation Imaging: A Review', Meas. Sci. Technol., vol. 34, no. 5, p. 052001, Feb. 2023, doi: 10.1088/1361-6501/acae27.

H. Ahmed, A. Mohsin, S.-C. Hong, J.-R. Lee, and J.-B. Ihn, 'Robotic laser sensing and laser mirror excitation for pulse-echo scanning inspection of fixed composite structures with non-planar geometries', Measurement, vol. 176, p. 109109, May 2021, doi: 10.1016/j.measurement.2021.109109.

D.-Y. Bae and J.-R. Lee, 'Development of single channeled serial-connected piezoelectric sensor array and damage visualization based on multi-source wave propagation imaging', J. Intell. Mater. Syst. Struct., vol. 27, no. 13, pp. 1861–1870, Nov. 2015, doi: 10.1177/1045389X15615969.

E. B. Flynn and G. S. Jarmer, 'High-Speed, Non-Contact, Baseline-Free Imaging of Hidden Defects Using Scanning Laser Measurements of Steady-State Ultrasonic Vibration', in Proc. IWSHM 2013, Stanford, CA, USA: Destech Publications, Sep. 2013, pp. 1186–1193.

H. Xue, Y. Yang, and B. Zhang, 'Topological acoustics', Nat. Rev. Mater., vol. 7, no. 12, pp. 974–990, Dec. 2022, doi: 10.1038/s41578-022-00465-6.

M. Schaeffer, G. Trainiti, and M. Ruzzene, 'Optical Measurement of In-plane Waves in Mechanical Metamaterials Through Digital Image Correlation', Sci. Rep., vol. 7, no. 1, p. 42437, Feb. 2017, doi: 10.1038/srep42437.

F. Legrand et al., 'Cloaking, trapping and superlensing of lamb waves with negative refraction', Sci. Rep., vol. 11, no. 1, p. 23901, Dec. 2021, doi: 10.1038/s41598-021-03146-6.

L. Li, J. Jiao, X. Gao, Z. Jia, B. Wu, and C. He, 'A review on nondestructive testing of bonding interface using nonlinear ultrasonic technique', Chin. Sci. Bull., vol. 67, no. 7, pp. 621–629, 2022, doi: 10.1360/TB-2021-0677.
G. Yan, S. Raetz, N. Chigarev, J. Blondeau, V. E. Gusev, and V. Tournat, 'Cumulative fatigue damage in thin aluminum films evaluated non-destructively with lasers via zero-group-velocity Lamb modes', NDT E Int., vol. 116, p. 102323, Dec. 2020, doi: 10.1016/j.ndteint.2020.102323.

A. Shin, J. Park, H. Lee, Y. Choi, and J.-R. Lee, 'Corrosion visualization under organic coating using laser ultrasonic propagation imaging', Smart Struct. Syst., vol. 29, no. 2, Feb. 2022.

A. D. Abetew, T. C. Truong, S. C. Hong, J. R. Lee, and J. B. Ihn, 'Parametric optimization of pulse-echo laser ultrasonic system for inspection of thick polymer matrix composites', Struct. Health Monit., vol. 19, no. 2, pp. 443–453, Jan. 2020, doi: 10.1177/1475921719852891.

J. Chen et al., 'A Review of UltraHigh Frequency Ultrasonic Transducers', Front. Mater., vol. 8, 2022, doi: 10.3389/fmats.2021.733358.

O. Saito, E. Sen, Y. Okabe, N. Higuchi, H. Ishizuki, and T. Taira, 'Laser Wavelengths Suitable for Generating Ultrasonic Waves in Resin-Coated Carbon Fiber Composites', J. Nondestruct. Eval. Diagn. Progn. Eng. Syst., vol. 3, no. 3, Apr. 2020, doi: 10.1115/1.4046719.

W. Wang and Y. Yang, 'Generation of selective single-mode guided waves by d36 type piezoelectric wafer', Appl. Phys. Lett., vol. 120, no. 21, p. 214101, May 2022, doi: 10.1063/5.0091284.

H. Miao and F. Li, 'Shear horizontal wave transducers for structural health monitoring and nondestructive testing: A review', Ultrasonics, vol. 114, p. 106355, Jul. 2021, doi: 10.1016/j.ultras.2021.106355.

M. A. A. Shahrim, C. C. Chia, H. R. Ramli, M. Y. Harmin, and J.-R. Lee, 'Adaptive Mode Filter for Lamb Wavefield in the Wavenumber-Time Domain Based on Wavenumber Response Function', Aerospace, vol. 10, no. 4, Apr. 2023, doi: 10.3390/aerospace10040347.

H. Yun, R. Rayhana, S. Pant, M. Genest, and Z. Liu, 'Nonlinear ultrasonic testing and data analytics for damage characterization: A review', Measurement, vol. 186, p. 110155, Dec. 2021, doi: 10.1016/j.measurement.2021.110155.

B.-L. Qi, C.-H. Wang, D.-B. Guo, and B. Zhang, 'A scanning distortion correction method based on X-Y galvanometer Lidar system', Chin. Phys. B, vol. 30, no. 4, p. 044206, Apr. 2021, doi: 10.1088/1674-1056/abcf42.

Y. Li, E. Dieussaert, and R. Baets, 'Miniaturization of Laser Doppler Vibrometers: A Review', Sensors, vol. 22, no. 13, p. 4735, Jan. 2022, doi: 10.3390/s22134735.

J. M. Kilpatrick and V. B. Markov, 'Full-Field Laser Vibrometer for Instantaneous Vibration Measurement and Non-Destructive Inspection', Key Eng. Mater., vol. 437, pp. 407–411, 2010, doi: 10.4028/www.scientific.net/KEM.437.407.

H.-Y. Chang and F.-G. Yuan, 'Visualization of hidden damage from scattered wavefield reconstructed using an integrated high-speed camera system', Struct. Health Monit., vol. 20, no. 5, p. 1475921720940805, Oct. 2020, doi: 10.1177/1475921720940805.

N. V. Doan, N. S. Goo, Y. Ko, S. Seo, and M. Chung, 'Design and Analysis of Micro-Vibration Isolation System for Digital Image Correlation System-Based Structural Health Monitoring', Int. J. Aeronaut. Space Sci., vol. 23, no. 4, pp. 711–722, Sep. 2022, doi: 10.1007/s42405-022-00455-6.
J. Y. Jeon, D. Kim, G. Park, E. Flynn, T. Kang, and S. Han, '2D-wavelet wavenumber filtering for structural damage detection using full steady-state wavefield laser scanning', NDT & E International, vol. 116, p. 102343, 2020, doi: 10.1016/j.ndteint.2020.102343.

2.3. Luminescence Imaging

Teow Wee Teo1, Zeinab Mahdavipour1, Azizi Abdullah3, Bee Ee Khoo2 and Mohd Zaid Abdullah2
1 TT Vision Technologies Sdn. Bhd., Plot 106, Hilir Sungai Keluang 5, Bayan Lepas Industrial Zone, Phase 4, Penang 11900, Malaysia
2 School of Electrical and Electronics Engineering, Engineering Campus, Universiti Sains Malaysia, 14300 Nibong Tebal, Penang, Malaysia
3 Faculty of Information Sciences and Technology, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia
[[email protected], [email protected], [email protected], [email protected], [email protected]]

Status

One key challenge encountered in the photovoltaic (PV) industry is the removal of defective crystalline silicon solar cells before they are assembled into PV modules. Among the different types of defects, the micro-crack is one of the most common, occurring during the various stages of manufacturing. The trend of manufacturing thinner silicon wafers makes PV cells increasingly vulnerable to micro-cracks. This type of defect is typically invisible to the naked eye and therefore requires specialised tools for detection. The optical transmission (OT) method is one of the popular instruments used in the visual inspection of PV modules, particularly during the bare silicon wafer manufacturing stage. However, the new diamond-wire slicing technology [2-3] leaves saw marks that can camouflage micro-cracks, thus reducing the effectiveness of OT. Emerging techniques such as the transflection (TF) system are very promising, since this method is less vulnerable to the interference caused by these saw marks. Another two popular methods, especially for the downstream segments of PV manufacturing, are the photoluminescence (PL) and electroluminescence (EL) imaging techniques. In addition to micro-cracks, these technologies can also be used to detect other types of defects, such as dark regions, finger interruptions, stains, and scratches. The important components of OT, EL, PL, and TF are shown schematically in Figures 1(a-d)(i), respectively; a good perspective overview of these imaging technologies is published elsewhere. Meanwhile, Figures 1(a-d)(ii) show examples of images captured by the OT, EL, PL, and TF techniques, respectively. Close examination of these figures reveals that the images produced by PL and EL are visually very complex and relatively noisy. This is most prevalent for polycrystalline cells, where different crystals produce luminescence of different intensities [8-9]. This is the main drawback of the PL and EL techniques; hence, these technologies require very sophisticated algorithms for image analysis and data automation. For this reason, TF offers a good alternative solution, as this modality is not influenced by the differing crystalline structure. Unlike EL and PL, unfortunately, TF is sensitive to micro-cracks only.
Hence, EL and PL remain popular with PV manufacturers, as these technologies can map micro-cracks together with other types of manufacturing defects. Nevertheless, the difficulty of interpreting EL and PL images remains an issue, and this constitutes an active area of research. One recent development is the application of artificial intelligence (AI) and deep learning (DL), such as the convolutional neural network (CNN). AI-driven software can help enhance decision making and refine predictive analytics, both of which are essential for efficient and accurate image interpretation.

Current and Future Challenges

To date there is no single ideal modality suitable for inspecting PV modules in a manufacturing environment. OT, EL, PL and TF all provide valuable information about solar cells or silicon wafers, but these imaging technologies are largely complementary: each has its own characteristics, strengths and weaknesses. Future trends may therefore lead to the development of a hybrid inspection system combining different luminescence modalities. For micro-cracks, TF is very promising, since this modality produces clean, low-noise images. By contrast, PL offers a good solution for other types of defects, since it has a speed advantage over OT and EL. The fusion of TF and PL therefore has many advantages compared with the fusion of other modalities. The development of a line-scan PL imaging system is a prerequisite, as most currently deployed PL systems are based on area scanning. While line-scan PL imaging for solar cells already exists, it would be significantly more challenging for silicon wafers, whose PL emission is considerably less intense. This poses a speed limitation, especially for on-the-fly inspection, since a much longer exposure time is required to compensate for the weak PL signal. For a hybrid TF and PL system to work harmoniously, speed is a crucial factor, because the two modalities have to share the same inspection station, or space, in the manufacturing line. While it may be possible to use a single camera for both TF and PL, the two techniques require very different illumination set-ups: TF requires a light source with a wavelength longer than 1200 nm, whereas PL operates at a much shorter wavelength, preferably below 950 nm, in order to avoid interference from silicon emission, which peaks at around 1040 nm. The design of such a system is currently under active development; a simple feasibility check along these lines is sketched after Figure 1.

[Figure 1, panels (a)-(d), appears here; images not reproduced in this text version.]
Figure 1. The schematics of (a) the optical transmission, (b) the electroluminescence, (c) the photoluminescence and (d) the transflection methods. The important components are shown in (i), while (ii) shows one example of a luminescence image produced by each imaging set-up. The square box in each image marks the area containing the defect, in this case a micro-crack.
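As a rough illustration of these constraints, the following minimal Python sketch checks whether a candidate pair of illumination bands respects the wavelength limits quoted above, and estimates the maximum wafer transport speed a line-scan PL camera could support for a given exposure time. Only the three wavelength limits (TF above 1200 nm, PL below 950 nm, silicon emission peaking near 1040 nm) come from the text; the function names, example bands, exposure time and pixel pitch are illustrative assumptions, not specifications of any deployed system.

# Minimal feasibility sketch for a hypothetical hybrid TF/PL inspection station.
# The wavelength limits follow the text above; all other numbers are assumed.

SI_EMISSION_PEAK_NM = 1040.0   # silicon emission peak (from text)
TF_MIN_NM = 1200.0             # TF illumination must be longer than this (from text)
PL_MAX_NM = 950.0              # PL illumination should stay below this (from text)

def bands_compatible(pl_band, tf_band):
    """Check that the PL and TF illumination bands respect the limits and
    are separated by the silicon emission peak."""
    pl_lo, pl_hi = pl_band
    tf_lo, tf_hi = tf_band
    return (pl_hi <= PL_MAX_NM
            and tf_lo >= TF_MIN_NM
            and pl_hi < SI_EMISSION_PEAK_NM < tf_lo)

def max_line_scan_speed_mm_s(exposure_s, pixel_pitch_um, readout_s=0.0):
    """Upper bound on wafer transport speed for a line-scan camera:
    one object line of height pixel_pitch_um per (exposure + readout)."""
    line_rate_hz = 1.0 / (exposure_s + readout_s)
    return line_rate_hz * pixel_pitch_um * 1e-3  # um/s -> mm/s

if __name__ == "__main__":
    # Assumed example bands: 800-940 nm PL excitation, 1250-1350 nm TF source.
    print(bands_compatible((800.0, 940.0), (1250.0, 1350.0)))  # True
    # Assumed 2 ms exposure for weak wafer PL, 50 um object pixel pitch:
    print(f"{max_line_scan_speed_mm_s(2e-3, 50.0):.1f} mm/s")  # 25.0 mm/s

The throughput bound makes the point raised above explicit: halving the available PL signal forces a doubling of the exposure time and therefore halves the achievable inspection speed.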
Advances in Science and Technology to Meet Challenges

The availability of highly sensitive sensors in the near-infrared (NIR) range, such as indium gallium arsenide (InGaAs) cameras, could potentially enable high-speed acquisition of PL and TF emissions. However, the limited resolution and high cost of such cameras are a hindrance to large-scale installation. It is envisioned that high-resolution but cheaper NIR sensors will become commercially available, as many major camera producers are actively investing in mass-producing such devices. This would speed up the deployment of PV inspection systems integrating the TF and PL modalities. Despite this positive outlook, the challenge of processing PL images remains, especially for polycrystalline solar cells. While micro-cracks can be dealt with effectively using TF, other artefacts may not appear distinctly, since they are camouflaged by the complex polycrystalline structure. A polycrystalline solar cell image is inherently more difficult to process because it contains many interfering artefacts, such as crystalline patterns and dislocation clusters. Recent developments in AI, in particular DL frameworks, have the potential to solve this complicated image-processing problem, as results from current research in the area suggest. For instance, Rahman and Chen [13] designed a multi-attention U-Net for the classification of polycrystalline solar cells; using a 5-fold cross-validation technique, these authors reported an accuracy of 99.1% when distinguishing good from defective cells. In another study, Wang et al. [14] developed a hybrid neural-network-based defect detection model that combines the advantages of the ResNet152 and Xception networks, reporting an accuracy of 96.2% on a binary classification problem, which dropped slightly to 92.1% when a multiclass problem was attempted. More recently, Binomairah et al. [15] compared the performance of various CNN algorithms based on the You Only Look Once (YOLO) framework, designed to inspect several types of defects in monocrystalline cells. Their results suggest that the heavy-weighted YOLO is the best-performing algorithm, with an accuracy of 98.8%; it is, however, relatively slow, with a runtime of approximately 62.9 ms per image (roughly 16 frames per second). By contrast, the tiny-weighted YOLO with spatial pyramid pooling achieved an accuracy of 91.0% with a runtime of 28.2 ms (roughly 35 frames per second). Hence, a trade-off between speed and accuracy is essential in order to arrive at an acceptable solution. Results from this and other studies indicate the potential of applying AI with deep-learning capabilities to complex EL or PL images, and the combined improvements in image-acquisition systems and software should yield much improved inspection accuracy. A minimal sketch of the kind of lightweight CNN classifier involved is given below.
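To make the discussion concrete, the following minimal PyTorch sketch shows the general shape of a small binary good/defective classifier for greyscale EL image crops. It is a toy illustration only: the class name, the layer sizes and the assumed 128 x 128 input resolution are choices made for this sketch and do not reproduce the U-Net, ResNet152/Xception or YOLO architectures cited above.

import torch
import torch.nn as nn

class TinyELClassifier(nn.Module):
    """Toy CNN for binary good/defective classification of single-channel
    EL image crops. Architecture and sizes are illustrative assumptions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # EL images are greyscale
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64 -> 32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                     # global average pooling
        )
        self.classifier = nn.Linear(64, 2)               # good vs defective

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

if __name__ == "__main__":
    model = TinyELClassifier()
    batch = torch.randn(4, 1, 128, 128)  # four assumed 128x128 EL crops
    logits = model(batch)
    print(logits.shape)                  # torch.Size([4, 2])

Scaling the depth and width of such a backbone is precisely what separates the heavy and tiny YOLO variants discussed above, and is where the speed-accuracy trade-off is negotiated.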
Even though polycrystalline cells are less efficient than monocrystalline ones, they are relatively cheap to produce. In future, however, it is predicted that monocrystalline cells, particularly Passivated Emitter and Rear Cell (PERC) devices, will dominate the silicon used in assembling PV modules, possibly reaching more than 95% of total PV production worldwide. As this trend continues, the challenges associated with processing polycrystalline solar cells will diminish as more manufacturers shift to producing monocrystalline cells. Hence, future CNN-driven solutions will target monocrystalline rather than polycrystalline silicon wafers and solar cells.

Concluding Remarks

In summary, the primary modalities used for luminescence imaging of silicon wafers and PV cells are EL, PL and, very recently, TF. Each of these modalities has technical advantages and disadvantages. PL has slightly better merit than EL, since the former enables fully contactless inspection and rapid in-line instrumentation. However, both modalities produce complex images, which constitutes their main drawback. By contrast, TF produces much cleaner images with less noise, but it is suitable only for inspecting mechanically induced fractures, especially micro-cracks; this is the principal weakness of that modality. In future, such inspection is best conducted by a combination of different luminescence modalities in a hybrid system, and in this regard the integration of PL and TF offers distinct advantages over other available combinations. Advances in AI and DL, together with hardware breakthroughs, further fuel this interest, enabling high-throughput, high-resolution image capture at unprecedented performance and speed. For this reason, many leading PV processors continue to escalate their investment in, and support for, AI across different applications.

Acknowledgements

The authors acknowledge TT Vision Technologies Sdn. Bhd. for funding this research (304.PELECT.6050420.T148) and a matching grant from Universiti Sains Malaysia (1001.PELECT.8070009).

References

[1] Y. C. Chiou and J. Z. Liu, “Recent Crack Detection of Multi-Crystalline Silicon Solar Wafer Using Machine Vision Techniques,” Sensor Review, vol. 31, pp. 154–165, March 2011.
[2] A. Bidiville, K. Wasmer, J. Michler, P. M. Nasch, M. Van der Meer, and C. Ballif, “Mechanisms of Wafer Sawing and Impact on Wafer Properties,” Prog. Photovoltaics Res. Appl., vol. 18, pp. 563–572, November 2010.
[3] Z. Li, P. Ge, W. Bi, C. Li, C. Wang, and J. Meng, “Influence of Silicon Anisotropy on Surface Shape Deviation of Wafer by Diamond Wire Saw,” Materials Science in Semiconductor Processing, vol. 133, p. 105981, October 2021.
[4] T. W. Teo, Z. Mahdavipour, and M. Z. Abdullah, “Design of an Imaging System for Characterizing Micro-Cracks in Crystalline Silicon Solar Cells Using Light Transflection,” IEEE J. Photovoltaics, vol. 9, pp. 1097–1104, May 2019.
[5] C. G. Zimmermann, “Photoluminescence-Based Detection of Mechanical Defects in Multijunction Solar Cells,” Journal of Applied Physics, vol. 126, p. 044503, July 2019.
[6] W. Tang, Q. Yang, X. Hu, and W. Yan, “Convolution Neural Network Based Polycrystalline Silicon Photovoltaic Cell Linear Defect Diagnosis Using Electroluminescence Images,” Expert Systems with Applications, vol. 202, 2022.
[7] Y. Fu, X. Ma, and H. Zhou, “Automatic Detection of Multi-Crossing Crack Defects in Multi-Crystalline Solar Cells Based on Machine Vision,” Machine Vision and Applications, vol. 32, no. 60, pp. 1–14, March 2021.
[8] H. C. Sio, Z. Xiong, T. Trupke, and D. Macdonald, “Imaging Crystal Orientations in Multicrystalline Silicon Wafers via Photoluminescence,” Applied Physics Letters, vol. 101, p. 082102, August 2012.
[9] T. W. Teo, Z. Mahdavipour, and M. Z. Abdullah, “Recent Advancements in Micro-Crack Inspection of Crystalline Silicon Wafers and Solar Cells,” Measurement Science and Technology, vol. 31, p. 081001, May 2020.
[10] X. Zhang, Y. Hao, H. Shangguan, P. Zhang, and A. Wang, “Detection of Surface Defects on Solar Cells by Fusing Multichannel Convolution Neural Networks,” Infrared Phys. Technol., vol. 108, p. 103334, 2020.
[11] I. Zafirovska, M. K. Juhl, J. W. Weber, J. Wong, and T. Trupke, “Detection of Finger Interruptions in Silicon Solar Cells Using Line Scan Photoluminescence Imaging,” IEEE Journal of Photovoltaics, vol. 7, pp. 1496–1502, November 2017.
[12] D. Feldman, K. Dummit, J. Zuboy, J. Heeter, K. Xu, and R. Margolis, “Spring 2022 Solar Industry Update,” National Renewable Energy Laboratory (NREL/PR-7A40-82854), Golden, United States, May 2022.
[13] M. R. U. Rahman and H. Chen, “Defects Inspection in Polycrystalline Solar Cells Electroluminescence Images Using Deep Learning,” IEEE Access, vol. 8, pp. 40547–40558, 2020.
[14] J. Wang, L. Bi, P. Sun, X. Jiao, X. Ma, X. Lei, and Y. Luo, “Deep-Learning-Based Automatic Detection of Photovoltaic Cell Defects in Electroluminescence Images,” Sensors, vol. 23, no. 1, pp. 297–318, 2022.
[15] A. Binomairah, A. Abdullah, B. E. Khoo, Z. Mahdavipour, T. W. Teo, N. S. M. Noor, and M. Z. Abdullah, “Detection of Microcracks and Dark Spots in Monocrystalline PERC Cells Using Photoluminescence Imaging and YOLO-Based CNN with Spatial Pyramid Pooling,” EPJ Photovoltaics, vol. 13, p. 27, 2022.

2.4. Endoscopy

Dimitris K. Iakovidis and Panagiotis Vartholomeos
Department of Computer Science and Biomedical Informatics, University of Thessaly, Papasiopoulou 2-4, Lamia, Greece
[[email protected], [email protected]]

Status

Endoscopy is a primarily optical imaging technique for screening internal structures of the human body. It appeared in the early 1800s and has since been preferred because it enables direct visual examination of tissues and minimally invasive interventions. Today, the application of endoscopy spans various systems of the human body, including the digestive, respiratory, cardiopulmonary, urinary, reproductive and musculoskeletal systems. It can also be complemented by emerging imaging technologies offering enhanced information about the examined tissues, in terms of detail and/or tissue composition.
These include optical coherence tomography, near-infrared fluorescence imaging, confocal laser endomicroscopy, endocytoscopy, and multispectral/hyperspectral imaging technologies [1,2].
Endoscopic devices are evolving towards an improved trade-off between endoscope size and image fidelity, enhanced safety, reduced invasiveness, and convenience for both patients and operators. Rigid Endoscopes (REs) and Flexible Endoscopes (FEs) have a sufficiently large tip to accommodate high-resolution image sensors; e.g., the diameter of a standard colonoscope is ~13 mm. Ultrathin FEs enable transnasal esophagogastroduodenoscopy and bronchoscopy with a very small diameter (~3-5 mm); such FEs are typically based on bundles of optical fibers for image and light transmission. High-resolution images can also be obtained by miniature FEs with diameters down to the sub-millimetre level, using a single fiber and scanning laser light. Wireless Capsule Endoscopes (WCEs) appeared in the early 2000s. They are swallowable alternatives for examination of the gastrointestinal (GI) tract, with the size of a large vitamin pill. Most of them have a length of 24 mm – 27.9 mm, a diameter of 10.8 mm – 13 mm, and a wide field of view ranging between 140° and 170°. The image resolution of such endoscopes is limited not only by the size of the sensor, but also by their energy requirements, e.g., for wireless image transmission. Various miniature robotic endoscopic systems have been proposed to address the lack of navigation, biopsy and targeted drug-delivery capabilities of conventional WCE devices [3,5]. Robotic mechanisms have already been integrated into commercially available RE/FE solutions, mainly offering enhanced precision and convenience with respect to motion control and navigation [3,6]. Virtual Endoscopy (VE), performed by 3D reconstruction of internal body structures from Computed Tomography (CT) images, offers non-invasive screening; however, although studies indicate that VE can provide a sensitivity comparable with optical endoscopy for lesion detection, it involves radiation exposure and, unlike optical endoscopy, allows neither biopsy nor surgical intervention. Artificial Intelligence (AI) provides tools for Clinical Decision Support (CDS) through the detection and recognition of findings and through in vivo visual measurements. Virtual/Augmented Reality (VR/AR) offers immersive visualizations that can contribute to easier endoscope navigation, lesion recognition, and clinical training.

[Figure 1 appears here; image not reproduced in this text version.]
Figure 1. Design concept of a wirelessly powered WCE for colonoscopy. Reproduced from. © IOP Publishing Ltd. All rights reserved.

Current and Future Challenges

The miniaturization of endoscopes to millimetre or even sub-millimetre diameters, while maintaining imaging fidelity sufficient for diagnostic purposes, remains one of the major challenges. Miniaturization concerns not only the imaging sensors, but also the tools required for endoscopic interventions and the design of the robotic systems that support safe endoscopic procedures.
Despite increasing interest in minimally invasive surgical techniques and related developments in flexible endoscopes and catheters, follow-the-leader motion remains elusive. Following the path of least resistance through a tortuous environment requires the control of many degrees of freedom, which typically results in large-diameter instruments (a minimal kinematic sketch of the follow-the-leader constraint is given at the end of this subsection). Furthermore, the capability of miniaturised flexible endoscopes to apply sufficient lateral forces is currently limited by the low stiffness of their mechanical structure. This is a critical challenge which, when solved, will enable effective physical interaction between the endoscopic tool and the tissue, paving the way for a broad range of interventional procedures. Advancements in this direction open perspectives for less invasive screening procedures and interventions, as well as for navigating narrow anatomical regions, such as those encountered
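For readers unfamiliar with the term, the follow-the-leader condition mentioned above requires every point of the instrument body to travel along the path already traced by the tip. The following minimal Python sketch, under the simplifying assumption of an idealised planar device with rigid segments of equal length, computes where the joints must sit on a stored tip trajectory; the path, segment count and segment length are illustrative assumptions, and the sketch deliberately ignores the actuation and stiffness limits that make this behaviour hard to realise in practice.

import math

def resample_at_arclengths(path, distances):
    """Return points at the given arc-length distances measured back from
    the end of `path` (a list of (x, y) way-points visited by the tip)."""
    # cumulative arc length from the start of the path
    cum = [0.0]
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    points = []
    for d in distances:
        s = max(total - d, 0.0)  # arc length of the query point from start
        # find the segment containing s and interpolate linearly
        i = max(j for j, c in enumerate(cum) if c <= s)
        if i == len(path) - 1:
            points.append(path[-1])
            continue
        seg = cum[i + 1] - cum[i]
        t = (s - cum[i]) / seg if seg > 0 else 0.0
        (x0, y0), (x1, y1) = path[i], path[i + 1]
        points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return points

def follow_the_leader(path, n_segments=4, seg_len=1.0):
    """Joint positions of an n-segment device whose tip sits at the end of
    `path` and whose body lies on the path itself (the FTL condition)."""
    return resample_at_arclengths(path, [k * seg_len for k in range(n_segments + 1)])

if __name__ == "__main__":
    # Assumed tip trajectory: a quarter-circle of radius 3 (a 'tortuous' path).
    tip_path = [(3 * math.sin(a), 3 * (1 - math.cos(a)))
                for a in [i * math.pi / 40 for i in range(41)]]
    for x, y in follow_the_leader(tip_path):
        print(f"({x:+.2f}, {y:+.2f})")  # tip first, then joints along the path

The sketch also makes the dimensionality problem visible: in a real device, every joint position returned here must be imposed by its own actuated degree of freedom, which is why follow-the-leader instruments tend to grow in diameter.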
