Calibrating an outdoor distributed camera network using laser range finder data
Document type: Conference report
Publisher: IEEE Press. Institute of Electrical and Electronics Engineers
Rights access: Open Access
Outdoor camera networks are becoming ubiquitous in critical urban areas of large cities around the world. Although current applications of camera networks are mostly limited to video surveillance, recent research projects are exploiting advances in outdoor robotics technology to develop systems that combine networks of cameras and mobile robots in people-assistance tasks. Such systems require robot navigation in urban areas with a precise calibration of the distributed camera network. Although camera calibration has been an extensively studied topic, the calibration (intrinsic and extrinsic) of large outdoor camera networks with non-overlapping fields of view, and likely to require frequent recalibration, poses novel challenges in the development of practical methods for user-assisted calibration that minimize intervention times and maximize precision. In this paper we propose using Laser Range Finder (LRF) data covering the area of the camera network to support the calibration process, and we develop a semi-automated methodology allowing quick and precise calibration of large camera networks. The proposed methods have been tested in a real urban environment and have been applied to create direct mappings (homographies) between image coordinates and world points in the ground plane (walking areas) to support person and robot detection and localization algorithms.
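The abstract does not detail how the image-to-ground-plane homographies are computed; as a rough illustration only, the sketch below estimates such a homography from point correspondences using the standard Direct Linear Transform (DLT). The function names and the use of NumPy are assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_homography(img_pts, world_pts):
    """Estimate the 3x3 homography H mapping image pixels to ground-plane
    world coordinates via the Direct Linear Transform (DLT).
    img_pts, world_pts: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (u, v), (x, y) in zip(img_pts, world_pts):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([-u, -v, -1, 0, 0, 0, x * u, x * v, x])
        A.append([0, 0, 0, -u, -v, -1, y * u, y * v, y])
    # The least-squares solution is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalise so H[2, 2] == 1

def image_to_world(H, u, v):
    """Project an image point to ground-plane world coordinates."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice a robust estimator (e.g. RANSAC over the correspondences) and coordinate normalisation would be used on real data; the DLT above is the minimal core of the mapping.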
Citation: Ortega, A.A. [et al.]. Calibrating an outdoor distributed camera network using laser range finder data. A: IEEE/RSJ International Conference on Intelligent Robots and Systems. "2009 IEEE/RSJ International Conference on Intelligent Robots and Systems". St. Louis, MO: IEEE Press. Institute of Electrical and Electronics Engineers, 2009, p. 303-308.