
Neural Computation

June 2019, Vol. 31, No. 6, Pages 1114-1138
(doi: 10.1162/neco_a_01191)
© 2019 Massachusetts Institute of Technology
Asynchronous Event-Based Motion Processing: From Visual Events to Probabilistic Sensory Representation
In this work, we propose a two-layered descriptive model of motion processing from the retina to the cortex, driven by event-based input from the asynchronous time-based image sensor (ATIS) camera. Spatial and spatiotemporal filtering of visual scenes by motion energy detectors is implemented in two steps: a simple layer modeling the lateral geniculate nucleus, followed by a set of three-dimensional Gabor kernels, which together form a probabilistic population response. The high temporal resolution of the independent, asynchronous pixels of the ATIS provides realistic stimulation for studying biological motion processing, as well as for developing bio-inspired motion processors for computer vision applications. Our study combines two significant ideas in neuroscience: event-based stimulation and probabilistic sensory representation. We model how this might be achieved at the level of vision and suggest the framework as a generic computational principle across sensory modalities.
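To make the filtering stage concrete, the following is a minimal sketch of a spatiotemporal (three-dimensional) Gabor quadrature pair and the classic motion energy computation it supports. This is an illustrative toy, not the authors' implementation: the kernel parameterization, filter sizes, and frequency values are assumptions chosen for demonstration, and the input is a synthetic drifting grating rather than ATIS event data.

```python
import numpy as np

def gabor_3d(size, f_spatial, f_temporal, theta, sigma):
    """Build an even/odd (quadrature) pair of 3D Gabor kernels.

    The pair is tuned to a spatial frequency f_spatial (cycles/pixel)
    oriented at angle theta, drifting at temporal frequency f_temporal
    (cycles/frame). Parameterization is a simplified assumption.
    """
    half = size // 2
    x, y, t = np.meshgrid(np.arange(-half, half + 1),
                          np.arange(-half, half + 1),
                          np.arange(-half, half + 1), indexing="ij")
    # Spatial coordinate along the filter's preferred orientation.
    u = x * np.cos(theta) + y * np.sin(theta)
    # Isotropic Gaussian envelope over space and time.
    envelope = np.exp(-(x**2 + y**2 + t**2) / (2.0 * sigma**2))
    phase = 2.0 * np.pi * (f_spatial * u + f_temporal * t)
    even = envelope * np.cos(phase)   # even-symmetric filter
    odd = envelope * np.sin(phase)    # odd-symmetric quadrature partner
    return even, odd

def motion_energy(volume, even, odd):
    """Phase-invariant motion energy: sum of squared quadrature responses."""
    r_even = np.sum(volume * even)
    r_odd = np.sum(volume * odd)
    return r_even**2 + r_odd**2

# A drifting grating matched to the filter's preferred direction yields
# much higher energy than the same grating drifting in the opposite
# (null) direction, which is the basic direction-selectivity property.
size, sigma, fs, ft = 15, 3.0, 0.15, 0.15
even, odd = gabor_3d(size, fs, ft, theta=0.0, sigma=sigma)
half = size // 2
x, y, t = np.meshgrid(np.arange(-half, half + 1),
                      np.arange(-half, half + 1),
                      np.arange(-half, half + 1), indexing="ij")
preferred = np.cos(2.0 * np.pi * (fs * x + ft * t))
null = np.cos(2.0 * np.pi * (fs * x - ft * t))
assert motion_energy(preferred, even, odd) > motion_energy(null, even, odd)
```

In a full pipeline, a bank of such kernels spanning orientations and speeds would be applied to the event stream, and the normalized responses across the bank read out as a probabilistic population code over motion direction.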