Underwater Video Mosaics as Visual Navigation Maps
Cited by (145)
Method for detecting surface defects of underwater buildings: Binocular vision based on sinusoidal grating fringe assistance
2023, Alexandria Engineering Journal

Autonomous Underwater Vehicle navigation: A review
2023, Ocean Engineering

High precision underwater 3D imaging of non-cooperative target with frequency comb
2022, Optics and Laser Technology
Citation Excerpt: 3D imaging forms sampling points by emitting laser beams at the target and then receiving the echoes to obtain depth (distance) and coordinate information. In underwater application scenarios, such as seabed mapping, 3D shape detection, and pipeline inspection, high-precision 3D imaging is increasingly indispensable [27–29]. With the growing focus on ocean exploration worldwide, high-precision underwater 3D imaging is becoming more critical.
MARESye: A hybrid imaging system for underwater robotic applications
2020, Information Fusion
Citation Excerpt: On the contrary, optical systems can provide dense and accurate texture information updated at high speed, which is crucial for tasks such as underwater manipulation, oceanographic studies and marine biology, shipwreck surveys, and object identification. Although optical systems can increase the feasibility of missions that can be carried out by AUVs [10], they still need to overcome the photometric limitations of sub-sea conditions, namely non-uniform lighting and colour filtering, and poor visibility due to light attenuation and scattering caused by suspended particles in the medium or the abundance of marine life [1,7]. Generic and conventional systems for 3D reconstruction are usually classified into passive or active imaging techniques.
Considering the rates of growth in two taxa of coral across Pacific islands
2020, Advances in Marine Biology
Citation Excerpt: As a result, there is still much to be learned regarding the geographic and temporal variability of community-level coral demography. For over a decade, marine scientists interested in questions at scales relevant to community ecology have taken advantage of the proliferation of digital photography, developing software platforms that blend thousands of individual images together to create detailed two-dimensional orthorectified imagery (Gracias and Santos-Victor, 2000; Lirman et al., 2007) and 3-dimensional reconstructions of benthic habitats using platforms such as Structure from Motion (SfM) software (Burns et al., 2015; Ferrari et al., 2016; Pizarro et al., 2009). More recently, the robust software platforms and advanced computational infrastructure necessary to generate these products for large areas have become more accessible.
Direct linear and refraction-invariant pose estimation and calibration model for underwater imaging
2019, ISPRS Journal of Photogrammetry and Remote Sensing
The work described in this paper has been supported by the Portuguese Foundation for Science and Technology under grant PRAXIS XXI BD/13772/97 and Esprit-LTR Proj. 30185, NARVAL.