Event-based cameras use in-pixel analog processing to respond to changes in illumination (events). Pixels report events asynchronously, enabling very fast response and reduced data volumes compared with conventional frame-based arrays. The asynchronous event-reporting circuit timestamps events to 1 microsecond resolution, but circuit latency and the serial nature of the output produce variable latency when many pixels are stimulated simultaneously. To characterize this variability, three iniVation cameras and one Prophesee camera were exposed to single step-function flashes of varying amplitude and diameter. The Median Absolute Deviation of pixel response times ranged from 0 to 6086 μs, increasing with the fraction of the array exposed. The number of events generated per pixel generally decreased with the percentage of pixels stimulated, with all cameras producing fewer than 59 events per pixel. In three cameras, increased stimulus amplitude caused increased event generation, while the fourth camera generated fewer events with increasing stimulus amplitude, down to 0.32 events per stimulus. Instantaneous event throughput exceeded manufacturer specifications for 3 of 4 cameras, though average throughput over longer time scales was lower than specified. While individual pixels may accurately detect microsecond-scale changes, data bottlenecks can cause missed events or erroneous timestamps.
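The dispersion metric reported above, the Median Absolute Deviation (MAD) of pixel response times, can be sketched as follows. This is a minimal illustration of the standard MAD definition, not the study's analysis code; the function name and the example timestamps are hypothetical.

```python
from statistics import median

def median_abs_deviation(timestamps_us):
    """MAD of pixel response timestamps (microseconds):
    the median of the absolute deviations from the median."""
    m = median(timestamps_us)
    return median(abs(t - m) for t in timestamps_us)

# Hypothetical response times for five pixels, in microseconds:
print(median_abs_deviation([100, 101, 101, 102, 110]))  # → 1
```

A MAD of 0 (as observed for some small-diameter stimuli) means at least half the stimulated pixels reported identical timestamps; large values indicate timestamps spread across milliseconds.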