The eight-point algorithm of Hartley occupies an important place in computer vision, notably as a means of providing an initial value of the fundamental matrix for use in iterative estimation methods. In this paper, a novel explanation is given for the improvement in performance of the eight-point algorithm that results from using normalised data. A first step is singling out a cost function that the normalised algorithm acts to minimise. The cost function is then shown to be statistically better founded than the cost function associated with the non-normalised algorithm. This augments the original argument that improved performance is due to the better conditioning of a pivotal matrix. Experimental results are given that support the adopted approach. This work continues a wider effort to place a variety of estimation techniques within a coherent framework.
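To make the algorithm the abstract refers to concrete, the following is a minimal sketch of the normalised eight-point method for estimating the fundamental matrix, assuming NumPy. The function names (normalise, eight_point) and the exact normalisation details are illustrative assumptions, not taken from the paper, and the sketch does not reproduce the paper's cost-function analysis.

import numpy as np

def normalise(pts):
    # Translate points so the centroid is at the origin and scale so the
    # mean distance from the origin is sqrt(2); return homogeneous points
    # and the 3x3 transform used (Hartley-style normalisation).
    centroid = pts.mean(axis=0)
    scale = np.sqrt(2) / np.linalg.norm(pts - centroid, axis=1).mean()
    T = np.array([[scale, 0.0, -scale * centroid[0]],
                  [0.0, scale, -scale * centroid[1]],
                  [0.0, 0.0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(x1, x2):
    # Estimate F with x2^T F x1 = 0 from N >= 8 correspondences
    # (x1, x2 are Nx2 arrays of image points), using normalised data.
    p1, T1 = normalise(x1)
    p2, T2 = normalise(x2)
    # Each correspondence gives one row of the design matrix.
    A = np.column_stack([p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
                         p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
                         p1[:, 0], p1[:, 1], np.ones(len(p1))])
    # Total least squares: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint by zeroing the smallest singular value.
    U, s, Vt = np.linalg.svd(F)
    s[2] = 0.0
    F = U @ np.diag(s) @ Vt
    # Undo the normalising transforms and fix the scale.
    F = T2.T @ F @ T1
    return F / np.linalg.norm(F)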
A statistical rationalisation of Hartley's normalised eight-point algorithm
2003-01-01
351,856 bytes
Conference paper
Electronic Resource
English
A Statistical Rationalisation of Hartley's Normalised Eight-Point Algorithm | British Library Conference Proceedings | 2003
Better value through rationalisation | British Library Online Contents | 2005
From 3D Point Clouds to Pose-Normalised Depth Maps | British Library Online Contents | 2010
Rationalisation of cargo handling | Engineering Index Backfile | 1965
The potential for rationalisation | British Library Conference Proceedings | 1993