DOCTORAL DISSERTATION

Robustness of Feature Based Calibration in New Age 3D Applications

DEEPAK DWARAKANATH

July 2017

Submitted to the Faculty of Mathematics and Natural Sciences at the University of Oslo in partial fulfilment of the requirements for the degree of Philosophiae Doctor

© Deepak Dwarakanath, 2017

Series of dissertations submitted to the Faculty of Mathematics and Natural Sciences, University of Oslo, No. 1877
ISSN 1501-7710

All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, without permission.

Cover: Hanne Baadsgaard Utigard.
Print production: Reprosentralen, University of Oslo.

Abstract

There has been an increasing demand for multimedia systems in various areas. An innovative vision for multimedia systems has paved the way for the advent of several interesting and useful applications, especially in the 3D arena. Such current and new age 3D applications based on single or multiple camera images can, for example, be found in the fields of vision based inspection, mixed reality art performance, sports analytics, augmented reality and image metrology. For high quality performance in these applications, the underlying focus is on the quality of image based 3D reconstruction.

To achieve high quality image based 3D reconstruction, an accurate camera calibration is necessary. Camera calibration provides a priori knowledge of the camera's intrinsic parameters (such as focal length, principal point, skew and lens distortion) and extrinsic parameters (such as spatial position and orientation). In certain application scenarios, the traditional checkerboard calibration process (which requires a checkerboard target) and the marker-based calibration process (which requires an identifiable marker) are impossible or inconvenient. In such cases, 3D systems have to rely on an alternative solution, i.e., Feature Based Calibration (FBC), where interesting feature points in the camera images are extracted and used for the calibration process. Therefore, the accuracy of FBC is an important factor defining the quality of single or multiple camera 3D systems.

Although FBC can be integrated in 3D systems, several practical issues are involved, e.g., (1) misalignment, arrangement and changes in the properties of one or more cameras; (2) misalignment of the captured object scene; and (3) noisy feature points extracted from the images. Therefore, the aim of this thesis is to explore the challenges in designing FBC to achieve high accuracy and robustness in 3D systems.

In order to explore the influence of practical issues on FBC, relevant evaluation procedures that relate to specific application scenarios were set up. Extensive tests were carried out using both real and virtual datasets and simulations. The effects of camera misalignment, the adoption of FBC, and the characteristics of state-of-the-art feature extractors and camera pose estimators were studied with the goal of obtaining an accurate and robust 3D reconstruction. The results are discussed by assessing the accuracy and robustness of FBC against practical issues. Consequently, tolerances for camera misalignment, operational limits of state-of-the-art feature extractors and an estimation of the camera density needed to capture the scene are presented. Finally, recommendations are given for researchers and system developers to design better 3D systems that take the practical issues of their application scenarios into account.
Acknowledgements

The completion of this thesis was possible with the support of many people, and I would like to take this opportunity to thank all of them for this journey.

With lots of gratitude, I would like to thank my advisors, Carsten Griwodz, Pål Halvorsen and Alexander Eichhorn, for providing constant inspiration, guidance and supervision during my research work. All of them have been co-authors of some of the papers presented in this thesis. My sincere thanks to Jacob Lildballe of Image House PantoInspect A/S, Denmark, who was also a co-author of some of the papers presented in the thesis.

I thank Simula Research Laboratory, the University of Oslo and the Norwegian Research Council for having given me a platform to start my research career as a PhD candidate. It was great to work with Sabita Maharjan, Rajwinder Panesar-Walawege, Shaukat Ali, Ahmed Elmokashfi and Thomas Kupka. I greatly appreciate the collaborative work with the master students Kjetil Endal and Steffan Gullichsen.

At PantoInspect A/S, Denmark, I wish to thank Lars Baunegaard With, Morten Langschwager and Claus Hoelgaard Olsen for their valuable discussions and encouragement. The PantoInspect system was used as an application scenario to conduct research and publish the results.

I thank Jan Friis Jorgensen, Ander Kühle and Ole Brydensholt of Image Metrology A/S, Denmark, and Christophe Mignot of Digital Surf, France, for their support and encouragement in the process of my research. The product from Digital Surf was used to validate a certain part of the results of the research.

I have had opportunities to work in the related area of vision systems through small projects, and hence I would like to thank A-Star Research and Development, Singapore, and Qtechnology A/S, Denmark.

I especially thank my parents, Sudharani A. and Dwarakanath G.R., for all their love and support in my life. I greatly thank Sphoorthi S.P., who took care of all the family affairs and provided invaluable support in order to help me finish my research. I have received sincere support from the family in India; thanks to Chetak, Rashmi, Saanvi, Puttaswamaiah, Shashikala, Sujatha, Savitha, Shyla and many more. I would also like to thank the family in Denmark, Sri Sai Das, Kalpana Das, Raj Ponnambalam, Shilpa Kanakraj and Jayanth, for their constant support and encouragement. Finally, I once again thank Sri Sai Das for thorough proofreading of this thesis.

Contents

I Overview xvii

1 Introduction 1
1.1 Application Scenarios . . . 2
1.2 Practical Challenges . . . 3
1.3 Goal and Scope . . . 5
1.4 Problem Statement . . . 5
1.5 Research Methods . . . 9
1.6 Main Contributions . . . 9
1.6.1 Publications . . . 11
1.6.2 Software Development . . . 12
1.7 Limitations . . . 12
1.8 Thesis Outline . . . 13

2 Preliminary Concepts 15
2.1 Image Based 3D Systems . . . 15
2.1.1 Camera Calibration . . . 16
2.1.2 3D Reconstruction . . . 22
2.2 Deeper Look Into Application Scenarios . . . 23
2.2.1 Virtually Enhanced Real-life synchronizeD Interaction - ON the Edge (VERDIONE) . . . 23
2.2.2 An Integrated System for Soccer Analysis (BAGADUS) . . . 28
2.2.3 PantoInspect Train Monitoring System (PTMS) . . . 31
2.2.4 Previz for On-set Production - Adaptive Real-time Tracking (POPART) . . . 35
2.2.5 Scanning Electron Microscopes Reconstruction (SEMRECON) . . . 38
2.3 Conclusions for Preliminary Concepts . . . 39

3 Feature Based Calibration (FBC) 41
3.1 Misalignment in Single Camera System . . . 42
3.1.1 Evaluation . . . 43
3.1.2 Error Analysis . . . 46
3.1.3 Error Modeling . . . 49
3.1.4 Discussions . . . 54
3.2 Adoption of Feature Based Calibration . . . 55
3.2.1 Proposed Re-calibration Methodology . . . 57
3.2.2 Evaluation . . . 60
3.2.3 State-of-the-art FBC Algorithms . . . 62
3.2.4 Accuracy of Measurements . . . 62
3.2.5 Error Distribution . . . 64
3.2.6 Resilience . . . 64
3.2.7 Discussions . . . 67
3.3 Misalignment in Stereo Camera System . . . 68
3.3.1 Evaluation . . . 68
3.3.2 Pure Translation Misalignment . . . 71
3.3.3 Pure Rotation Misalignment . . . 71
3.3.4 Combined Misalignment . . . 74
3.3.5 Variable Object Size . . . 74
3.3.6 Discussions . . . 76
3.4 Conclusions for FBC . . . 77

4 Feature Extraction 79
4.1 State-of-the-art Feature Extractors . . . 80
4.2 Robustness against Camera Intrinsics . . . 83
4.2.1 Evaluation . . . 84
4.2.2 Accuracy vs. Speed . . . 89
4.2.3 Image Blur . . . 90
4.2.4 Lens Distortion . . . 90
4.2.5 Sensor Noise . . . 90
4.2.6 Discussions . . . 97
4.3 Robustness against Camera Extrinsics . . . 97
4.3.1 Evaluation . . . 101
4.3.2 2D Pixel Error . . . 105
4.3.3 Camera Pose Error . . . 106
4.3.4 Penalty . . . 106
4.3.5 3D Estimation Error . . . 111
4.3.6 Comparative Performance . . . 111
4.3.7 Discussions . . . 115
4.4 FBC using SIFT for Wide Baseline . . . 119
4.4.1 Proposed Algorithm . . . 120
4.4.2 Evaluation . . . 123
4.4.3 Performance of Proposed Algorithm . . . 124
4.4.4 Discussions . . . 127
4.5 Conclusions for Feature Extraction . . . 129

5 Pose Estimation 131
5.1 Sensitivity of Pose Estimation . . . 132
5.1.1 Performance Metrics . . . 133
5.1.2 Evaluation . . . 134
5.1.3 Number of Feature Correspondences . . . 135
5.1.4 Noise in Feature Correspondences . . . 136
5.1.5 Sparsity of Feature Correspondences . . . 139
5.1.6 3D Reconstruction Metric Evaluation . . . 142
5.1.7 Discussions . . . 142
5.2 Conclusions for Pose Estimation . . . 145

6 Conclusions 147
6.1 Main Contributions . . . 147
6.2 Practical Implications . . . 149
6.3 Practical Insight . . . 149
6.4 Future Work . . . 150

II Research Papers 159

7 Paper I: Faster and More Accurate Feature-Based Calibration for Widely Spaced Camera Pairs 161
8 Paper II: Evaluating Performance of Feature Extraction Methods for Practical 3D Imaging Systems 169
9 Paper III: Study the Effects of Camera Misalignment on 3D Measurements for Efficient Design of Vision-Based Inspection Systems 177
10 Paper IV: Online Re-calibration for Robust 3D Measurement Using Single Camera - PantoInspect Train Monitoring System 193
11 Paper V: Robustness of 3D Point Positions to Camera Baselines in Markerless AR Systems 209
12 Poster I: 3-D Video Processing for Mixed Reality Art Performances 223
13 Poster II: 3D Multi-view Acquisition and Rendering System 225
14 Poster III: Multiple Camera Arrays for Real-time 3D Rendering Systems 227

List of Figures

2.1 Typical 3D system illustrating two different workflows for 3D applications, using the knowledge of camera calibration. . . . 15
2.2 Pin-hole camera model . . . 16
2.3 Radial lens distortion: undistorted image (left), barrel distortion (center), pincushion distortion (right) . . . 18
2.4 Tangential lens distortion . . . 19
2.5 Epipolar geometry of a stereo camera setup . . . 20
2.6 Feature point correspondences in stereo images. . . . 21
2.7 Illustration of image rectification. . . . 23
2.8 Illustration of depth estimation. . . . 23
2.9 World opera distributed stage performance. . . . 24
2.10 Illustration of VERDIONE capture and render subsystems. . . . 24
2.11 Multiple camera acquisition subsystem for VERDIONE. . . . 25
2.12 Overall BAGADUS architecture. . . . 28
2.13 Camera setup in Alfheim soccer stadium. . . . 29
2.14 PTMS: inspects defects on the pantographs mounted on electric trains. . . . 31
2.15 PTMS defects illustrated and shown on a pantograph. . . . 32
2.16 User interface of the PTMS inspection system. . . . 33
2.17 Pantograph image analysis. . . . 34
2.18 POPART system. . . . 36
2.19 The 3D point cloud of the real filming set. . . . 36
2.20 The 3D reconstruction and surface analysis - waviness and roughness surfaces and ISO 25178 height parameters. . . . 38
3.1 PTMS inspection scenario: world coordinates (Xw, Yw, Zw) and camera coordinates (Xc, Yc, Zc). . . . 44
3.2 Simulation procedure. . . . 45
3.3 Variation of error in 3D width measurements of the defects, due to changes in camera position and orientation about its camera center. . . . 46
3.4 Variation of error in 3D depth measurements of the defects, due to changes in camera position and orientation about its camera center. . . . 47
3.5 Projective geometric effects of camera tilt angle in PTMS. . . . 48
3.6 Linear model fitting and residual plots for variation of width error with camera translations. . . . 50
3.7 Linear model fitting and residual plots for variation of depth error with camera translations. . . . 51
3.8 Curvilinear model fitting and residual plots for variation of width error with camera rotations. . . . 52
3.9 Curvilinear model fitting and residual plots for variation of depth error with camera rotations. . . . 53
3.10 Proposed feature based calibration for the PTMS. . . . 57
3.11 Profile image with a representation of camera and world coordinates. . . . 58
3.12 Defect identification and measurement. . . . 59
3.13 Evaluation of feature based calibration for PTMS. . . . 60
3.14 Mean difference of width and depth measurements for the two schemes. . . . 63
3.15 Cumulative distribution function (CDF) for schemes 1 and 2. . . . 64
3.16 Resilience over pixel noise. . . . 65
3.17 Resilience over pantograph vertical displacement (uplift). . . . 66
3.18 Resilience over pantograph angular displacement (yaw). . . . 66
3.19 Resilience over pantograph angular displacement (roll). . . . 66
3.20 Resilience over pantograph angular displacement (pitch). . . . 67
3.21 Evaluation procedure for stereo camera misalignment. . . . 69
3.22 Camera axes and rotations: X-tilt, Y-pan, Z-roll. . . . 69
3.23 Objects and their projected stereo images. . . . 70
3.24 Variation of 3D error versus camera misalignment in terms of pure translations . . . 72
3.25 Variation of 3D error versus camera misalignment in terms of pure rotations . . . 73
3.26 Variation of total 3D error versus camera misalignment in terms of translations and rotations . . . 74
3.27 Variation of total 3D error, averaged over the range of misalignment, versus object sizes. . . . 75
4.1 Illustrating the feature extraction process - detection, description and matching - between stereo pairs. . . . 80
4.2 Evaluation pipeline . . . 85
4.3 Stereo images from various datasets, low resolution 320x240. . . . 87
4.4 Illustration of epipolar geometry. . . . 88
4.5 Accuracy vs. computational time. The suffixes refer to the size of the images: L - low resolution (320x240), M - medium resolution (640x480), H - high resolution (1280x960) . . . 89
4.6 Feature extraction on blurred (radius level 5) stereo images from the Tromsø dataset with wide lens, at low resolution 320x240. . . . 91
4.7 Performance of feature extractors for simulation of blur levels over various resolutions . . . 92
4.8 Feature extraction on barrel distorted (level 40%) stereo images from the Microsoft dataset, at low resolution 320x240. . . . 93
4.9 Performance of feature extractors for simulation of distortion levels over various resolutions . . . 94
4.10 Feature extraction on noisy (15 dB) stereo images from the Tromsø dataset with narrow lens, at low resolution 320x240. . . . 95
4.11 Performance of feature extractors for simulation of noise levels over various resolutions . . . 96
4.12 Scatterplots of matched feature points and 2D pixel error against 3D accuracy. . . . 100
4.13 Experimental setup. . . . 102
4.14 Cameras arranged in a circular configuration around the 3D model. . . . 103
4.15 The 3D models used for the experiment. From each model, 50 stereo image pairs are generated, corresponding to various baselines. . . . 103
4.16 The 2D error (squared Sampson) based on the epipolar constraint over varied baselines . . . 105
4.17 Rotation error of the stereo camera over varied camera baselines. . . . 107
4.18 Translation error of the stereo camera over varied camera baselines. . . . 108
4.19 Penalty values for all feature extractors. . . . 109
4.20 Mean 3D estimation error, categorized based on feature descriptors, over varied camera baselines . . . 110
4.21 Standard deviation of 3D estimation over varied baselines. . . . 111
4.22 Mean 3D estimation error, categorized based on feature detectors (SIFT, SURF and BRISK), over varied camera baselines . . . 112
4.23 Mean 3D estimation error, categorized based on feature detectors (ORB, KAZE, AKAZE), over varied camera baselines . . . 113
4.24 System overview. . . . 121
4.25 Process of outlier detection: outliers (solid), inliers (dotted) . . . 121
4.26 Illustration of the setup used by Microsoft to produce the multiview dataset. . . . 124
4.27 Epipolar error (Ep) computed for three different methods . . . 125
4.28 Re-projection error (Rp) computed for different algorithms . . . 126
4.29 Execution time of various algorithms relative to FullSIFT-RANSAC. . . . 127
4.30 Deduction of the relationship between the object distance (D) and the baseline distance between the cameras (B). . . . 128
5.1 The extended 3D performance metric explained. . . . 133
5.2 Experimental setup . . . 134
5.3 Mean 3D error for different numbers of total feature correspondences. . . . 135
5.4 Measure of camera rotation and translation error over various noise levels with different numbers of feature points for 3 different camera baselines. . . . 137
5.5 Measure of 3D rotation accuracy and 3D position accuracy over various noise levels with different numbers of feature points for 3 different camera baselines. . . . 138
5.6 Measure of 3D orthogonality over various noise levels with different numbers of feature points for 3 different camera baselines. . . . 139
5.7 Measure of 3D rotation accuracy and 3D position accuracy over various noise levels with different sparsity (dispersion of points in 2D space) and various camera baselines. . . . 140
5.8 Measure of 3D orthogonality over various noise levels with different sparsity (dispersion of points in 2D space) and various camera baselines. . . . 141
5.9 Comparison of 2D error and 3D error variation with noise, for N = 75 points, sparsity = 100% and three baselines. . . . 143
5.10 SEM reconstruction with Mountains software. . . . 145

List of Tables

1.1 Outline of the research questions and hypotheses that reflect the problem statement . . . 8
3.1 Model parameters estimated for translations . . . 54
3.2 Model parameters estimated for rotations . . . 55
3.3 Tolerances for camera misalignment, given a system inaccuracy limit of 0.5 mm. . . . 55
3.4 Reference measurements of defects of two pantograph types. . . . 61
3.5 Single camera pose estimation algorithms and their description. . . . 61
3.6 Absolute angular difference in degrees between CBC and FBC - scheme 1. . . . 62
3.7 Kullback-Leibler divergence values for total (width + depth) error. . . . 65
4.1 Overview of the state-of-the-art feature extractors. . . . 83
4.2 Quality - accuracy, reliability and execution time - of 24 feature extractors, providing practical recommendations for 3D applications (Section 4.3.7). Here, "Rotation" is the mean 3D rotational change (expressed in degrees) and "Position" is the mean 3D positional shift (expressed in model units) of all the estimated 3D unit vectors that represent a model in 3D space. . . . 117
4.3 Comparison of known and estimated camera rotational parameters. . . . 127