This paper introduces a method to calibrate a wide area system of
unsynchronized cameras with respect to a single global coordinate
system. The method is simple and does not require the physical
construction of a large calibration object. The user need only
move an identifiable point in front of all cameras. The method
computes a rough estimate of camera pose by first performing pair-wise
structure-from-motion on the observed points, and then combining the
pair-wise registrations into a single coordinate frame. Using the
initial camera pose, the moving point can be tracked in world coordinates.
The path of the point defines a "virtual calibration object" that
can be used to improve the initial estimates of camera pose. Iterating
the above process yields a more precise estimate of both camera pose
and the point path. Experimental results show that the method performs as
well as calibration from a physical target in cases where all
cameras share some common working volume. We then demonstrate its
effectiveness in wide area settings by calibrating a system of cameras
having non-overlapping fields of view, a situation where existing
methods cannot be applied directly.
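To make one step of the pipeline concrete, the sketch below illustrates how pair-wise registrations might be combined into a single coordinate frame by chaining relative poses from a reference camera. The 4x4 transform representation, the `make_pose` and `chain_registrations` helpers, and the chain topology are illustrative assumptions, not the paper's actual notation or algorithm.

```python
import numpy as np

def make_pose(rotation_deg, translation):
    """Build a 4x4 rigid transform from a rotation about the z-axis
    and a 3-vector translation (toy parameterization for illustration)."""
    a = np.deg2rad(rotation_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    T[:3, 3] = translation
    return T

def chain_registrations(pairwise):
    """Compose relative poses T_{i,i+1} (mapping camera i+1's frame into
    camera i's frame) into global poses expressed in camera 0's frame."""
    globals_ = [np.eye(4)]  # camera 0 defines the world frame
    for T_rel in pairwise:
        globals_.append(globals_[-1] @ T_rel)
    return globals_

# Toy example: three cameras, each rotated and shifted relative to the
# previous one, registered into camera 0's coordinate frame.
T01 = make_pose(30.0, [1.0, 0.0, 0.0])
T12 = make_pose(-15.0, [0.5, 0.2, 0.0])
poses = chain_registrations([T01, T12])
# poses[2] now maps camera 2's coordinates into the common world frame.
```

In the method described above, such an initial combination would then be refined by tracking the moving point in world coordinates and re-estimating camera poses against the resulting "virtual calibration object".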