Show simple item record

dc.contributor.author: Zúñiga-Noël, David
dc.contributor.author: Ruiz-Sarmiento, José Raúl
dc.contributor.author: Gómez-Ojeda, Rubén
dc.contributor.author: González-Jiménez, Antonio Javier
dc.date.accessioned: 2024-10-04T10:23:26Z
dc.date.available: 2024-10-04T10:23:26Z
dc.date.issued: 2019-06-12
dc.identifier.citation: D. Zuñiga-Noël, J.-R. Ruiz-Sarmiento, R. Gomez-Ojeda and J. Gonzalez-Jimenez, "Automatic Multi-Sensor Extrinsic Calibration For Mobile Robots," in IEEE Robotics and Automation Letters, vol. 4, no. 3, pp. 2862-2869, July 2019.
dc.identifier.uri: https://hdl.handle.net/10630/34342
dc.description.abstract: In order to fuse measurements from multiple sensors mounted on a mobile robot, they need to be expressed in a common reference system through their relative spatial transformations. In this paper, we present a method to estimate the full 6DoF extrinsic calibration parameters of multiple heterogeneous sensors (Lidars, Depth and RGB cameras), suitable for automatic execution on a mobile robot. Our method computes the 2D calibration parameters (x, y, yaw) through a motion-based approach, while for the remaining 3 parameters (z, pitch, roll) it requires the observation of the ground plane for a short period of time. What sets this proposal apart from others is that: i) all calibration parameters are initialized in closed form, and ii) the scale ambiguity inherent to motion estimation from a monocular camera is explicitly handled, enabling the combination of these sensors with metric ones (Lidars, stereo rigs, etc.) within the same optimization framework. We provide a formal definition of the problem, as well as of the contributed method, for which a C++ implementation has been made publicly available. The suitability of the method has been assessed in simulation and with real data from indoor and outdoor scenarios. Finally, improvements over state-of-the-art motion-based calibration proposals are shown through experimental evaluation. (An illustrative sketch of the two-group parameterization follows the record below.)
dc.description.sponsorship: This work was supported by the research projects WISER (DPI2017-84827-R), funded by the Spanish Government and financed by the European Regional Development Fund (FEDER), and MoveCare (ICT-26-2016b-GA-732158), funded by the European H2020 program, by the European Social Fund through the Youth Employment Initiative for the promotion of young researchers, and by a contract from the I-PPIT program of the University of Malaga.
dc.language.iso: eng
dc.publisher: IEEE
dc.subject: Robótica
dc.subject: Detectores
dc.subject.other: Calibration and Identification
dc.subject.other: Sensor Fusion
dc.subject.other: Service robots
dc.subject.other: Wheeled robots
dc.title: Automatic Multi-Sensor Extrinsic Calibration For Mobile Robots
dc.type: journal article
dc.centro: E.T.S.I. Informática
dc.identifier.doi: 10.1109/LRA.2019.2922618
dc.type.hasVersion: AM
dc.departamento: Ingeniería de Sistemas y Automática
dc.rights.accessRights: open access
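The abstract splits the 6DoF sensor extrinsics into a planar group (x, y, yaw), recovered through the motion-based approach, and a group (z, pitch, roll) recovered from observation of the ground plane. The minimal C++/Eigen sketch below only illustrates how a single rigid-body extrinsic transform can be composed from those two parameter groups; the function name, the ZYX Euler convention and the example values are assumptions for illustration, not taken from the authors' public implementation.

    // Minimal sketch (assumed naming and conventions), not the authors' code.
    #include <cmath>
    #include <iostream>
    #include <Eigen/Geometry>

    // Compose the sensor-to-robot extrinsic transform from the planar
    // parameters (x, y, yaw), observable from planar motion, and the
    // remaining parameters (z, pitch, roll), observable from the ground plane.
    Eigen::Isometry3d buildExtrinsic(double x, double y, double yaw,
                                     double z, double pitch, double roll) {
      Eigen::Isometry3d T = Eigen::Isometry3d::Identity();
      T.translation() = Eigen::Vector3d(x, y, z);
      // ZYX (yaw-pitch-roll) rotation convention, assumed here for illustration.
      T.linear() = (Eigen::AngleAxisd(yaw,   Eigen::Vector3d::UnitZ()) *
                    Eigen::AngleAxisd(pitch, Eigen::Vector3d::UnitY()) *
                    Eigen::AngleAxisd(roll,  Eigen::Vector3d::UnitX()))
                       .toRotationMatrix();
      return T;
    }

    int main() {
      // Hypothetical example values only: a camera 0.2 m ahead of and 0.5 m
      // above the robot base frame, rotated 90 degrees in yaw.
      const Eigen::Isometry3d T = buildExtrinsic(0.2, 0.0, M_PI / 2.0, 0.5, 0.0, 0.0);
      std::cout << T.matrix() << std::endl;
      return 0;
    }

In the formulation described in the abstract, the first three parameters are initialized in closed form from the robot's motion and the last three from the observed ground plane; the sketch only shows how the two groups combine into one SE(3) extrinsic.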

