Neural Computation

April 2015, Vol. 27, No. 4, Pages 925-953
(doi: 10.1162/NECO_a_00720)
© 2015 Massachusetts Institute of Technology
Visual Tracking Using Neuromorphic Asynchronous Event-Based Cameras

This letter presents a novel, computationally efficient, and robust pattern tracking method based on time-encoded, frame-free visual data. Recent interdisciplinary developments, combining inputs from engineering and biology, have yielded a novel type of camera that encodes visual information into a continuous stream of asynchronous, temporal events. These events encode temporal contrast and intensity locally in space and time. We show that this sparse yet accurately timed information is well suited as a computational input for object tracking. In this letter, visual data processing is performed for each incoming event at the time it arrives. The method provides a continuous and iterative estimation of the geometric transformation between the model and the events representing the tracked object. It can handle isometries, similarities, and affine distortions, and it allows for unprecedented real-time performance at equivalent frame rates in the kilohertz range on a standard PC. Furthermore, by using the dimension of time, which is currently underexploited by most artificial vision systems, the method we present is able to solve ambiguous cases of object occlusion that classical frame-based techniques handle poorly.
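The per-event, iterative estimation described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: it is a simplified, translation-only illustration (the paper handles full isometric, similarity, and affine transforms) in which each incoming event nudges the current transform estimate toward agreement with the nearest model point. All names (`track_events`, the learning rate `lr`, the toy model) are hypothetical.

```python
import numpy as np

def track_events(model, events, lr=0.1):
    """Illustrative per-event tracker: maintain a 2-D translation estimate
    and update it incrementally for each incoming event, rather than
    waiting to accumulate a frame.

    model  : (N, 2) array of model points
    events : iterable of (x, y) event locations, in arrival order
    lr     : step size of the incremental update (hypothetical parameter)
    """
    t = np.zeros(2)                       # current translation estimate
    for ev in events:                     # events processed one by one, as they arrive
        p = np.asarray(ev, dtype=float)
        shifted = model + t               # model under the current transform
        # nearest transformed model point to this event
        i = np.argmin(np.sum((shifted - p) ** 2, axis=1))
        # gradient-style incremental update toward the event
        t += lr * (p - shifted[i])
    return t

# Toy usage: a square model observed through an event stream shifted by (3, -2)
model = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
true_shift = np.array([3.0, -2.0])
events = [m + true_shift for m in model] * 50   # repeated stream of 200 events
est = track_events(model, events)
```

Because every event triggers only a nearest-neighbor lookup and a constant-time parameter update, the cost per event is small, which is the property that lets this style of processing reach kilohertz-equivalent rates; extending the state from a translation vector to a full affine matrix keeps the same event-driven structure.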