Spatial Layout and Surface Reconstruction from Omnidirectional Images
Date
2016-10-09
Authors
Posada, Luis Felipe
Velásquez, Alejandro
Publisher
IEEE
Abstract
This paper presents a spatial layout recovery approach from single omnidirectional images. Vertical structures in the scene are extracted via classification from heterogeneous features computed at the superpixel level. Vertical surfaces are further classified according to their main orientation by fusing oriented line features, floor-wall boundary features, and histograms of oriented gradients (HOG) with a Random Forest classifier. Oriented line features are used to build an orientation map that accounts for the main vanishing points. The floor-wall boundary feature attempts to reconstruct the scene shape as if it were observed from a bird's-eye view. Finally, the HOG descriptors are aggregated per superpixel and summarize the gradient distribution within regions of homogeneous appearance. Compared with existing methods that rely only on corners or lines, our approach gains statistical support from multiple cues aggregated per superpixel, providing greater robustness against noise, occlusion, and clutter.
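The abstract's per-superpixel HOG aggregation can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a grayscale image and a precomputed superpixel label map (e.g. from SLIC), and simply bins magnitude-weighted gradient orientations into one normalized histogram per superpixel; the function name `superpixel_hog` and the toy two-superpixel example are hypothetical.

```python
import numpy as np

def superpixel_hog(image, labels, n_bins=9):
    """Aggregate a gradient-orientation histogram per superpixel.

    A minimal sketch of per-superpixel HOG aggregation; `labels`
    is assumed to hold one integer superpixel id per pixel.
    """
    gy, gx = np.gradient(image.astype(float))   # image gradients
    mag = np.hypot(gx, gy)                      # gradient magnitude
    # Fold orientations into [0, pi) and bin them into n_bins buckets.
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)

    feats = np.zeros((labels.max() + 1, n_bins))
    for sp, b, m in zip(labels.ravel(), bins.ravel(), mag.ravel()):
        feats[sp, b] += m                       # magnitude-weighted vote
    # L1-normalize so superpixel size does not dominate the descriptor.
    sums = feats.sum(axis=1, keepdims=True)
    return feats / np.maximum(sums, 1e-12)

# Toy example: a step edge split into two superpixels.
img = np.zeros((8, 8))
img[:, 4:] = 1.0                      # vertical step -> horizontal gradient
labels = np.zeros((8, 8), dtype=int)
labels[:, 4:] = 1
F = superpixel_hog(img, labels)
print(F.shape)                        # (2, 9): one 9-bin histogram per superpixel
```

In the full pipeline these histograms would be stacked with the oriented line and floor-wall boundary features before being fed to the Random Forest classifier.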