Abstract
Despite the success of neural networks in computer vision tasks, digital
'neurons' are only a loose approximation of their biological counterparts. Today's
learning approaches are designed to function on digital devices with digital
data representations such as image frames. In contrast, biological vision
systems are generally much more capable and efficient than state-of-the-art
digital computer vision algorithms. Event cameras are an emerging sensor
technology that imitates biological vision with asynchronously firing pixels,
eschewing the concept of the image frame. To leverage modern learning
techniques, many event-based algorithms are forced to accumulate events back
into image frames, somewhat squandering the advantages of event cameras.
We follow the opposite paradigm and develop a new type of neural network
that operates closer to the original event data stream. We demonstrate
state-of-the-art performance in angular velocity regression and competitive
optical flow estimation, while avoiding difficulties related to training
spiking neural networks (SNNs).
Furthermore, the processing latency of our proposed approach is less than one
tenth that of any other implementation, and continuous inference increases this
improvement by another order of magnitude.