ICA Vol.6 No.1, February 2015
Dual Dynamic PTZ Tracking Using Cooperating Cameras
Abstract: This paper presents a real-time, dynamic system that uses high-resolution gimbals and motorized lenses with position encoders on their zoom and focus elements to "recalibrate" the system as needed to track a target. Systems that initially calibrate a mapping between pixels of a wide field-of-view (FOV) master camera and the pan-tilt (PT) settings of a steerable narrow-FOV slave camera assume that the target is travelling on a plane. As the target travels through the FOV of the master camera, the slave camera's PT settings are adjusted to keep the target centered within its FOV. In this paper, we describe a system we have developed that allows both cameras to move and extracts the 3D coordinates of the target. This is done with only a single initial calibration between pairs of cameras and high-resolution pan-tilt-zoom (PTZ) platforms. Using the PT settings of the PTZ platforms together with the precalibrated settings of a preset zoom lens, the 3D coordinates of the target are extracted and their accuracy is compared against measurements from a laser range finder and from a static-dynamic camera pair.
Cite this paper: Eslami, M., Rzasa, J., Milner, S. and Davis, C. (2015) Dual Dynamic PTZ Tracking Using Cooperating Cameras. Intelligent Control and Automation, 6, 45-54. doi: 10.4236/ica.2015.61006.
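The core geometric idea in the abstract, recovering a target's 3D coordinates from the pan-tilt readings of two cooperating cameras, can be illustrated with a minimal triangulation sketch. This is not the authors' implementation; it assumes each camera's position is known in a common frame, that pan/tilt encoder readings convert directly to a pointing ray, and it estimates the target as the midpoint of the shortest segment between the two rays:

```python
import numpy as np

def pt_to_direction(pan, tilt):
    """Convert pan/tilt angles (radians) to a unit pointing vector.
    Convention assumed here: pan about the vertical (z) axis, tilt up from
    the horizontal plane."""
    return np.array([np.cos(tilt) * np.cos(pan),
                     np.cos(tilt) * np.sin(pan),
                     np.sin(tilt)])

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2.
    With noisy encoder readings the rays are skew, so the midpoint of the
    common perpendicular is used as the 3D target estimate."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # zero only if the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0

# Example: cameras 1 m apart, both pointing at a target at (0.5, 1, 0).
cam1, cam2 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
ray1 = pt_to_direction(np.arctan2(1.0, 0.5), 0.0)
ray2 = pt_to_direction(np.arctan2(1.0, -0.5), 0.0)
target = triangulate(cam1, ray1, cam2, ray2)   # ~ [0.5, 1.0, 0.0]
```

In the paper's setting the zoom-lens precalibration additionally refines where the optical axis points at each zoom preset; the sketch above omits that and treats the encoder angles as the final ray directions.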