Neuromorphic vision sensors based on an in-sensor computing paradigm
Abstract
Conventional digital imaging systems consist of physically separated image sensors and image-processing units. The data transfer between them introduces time delays and high power consumption. In addition, digital imaging systems operate in a frame-based manner, which causes the loss of important information or produces redundant data. The human visual system, by contrast, processes information in an efficient and parallel way. Neuromorphic vision sensors can emulate the functions of the human retina by sensing light signals, storing them, and performing information preprocessing. This paradigm greatly simplifies the circuit complexity of artificial vision systems, improves the efficiency of information processing, and reduces system power consumption. In this work, we summarize the problems of conventional digital image sensors, introduce several important artificial neural networks, and discuss the research progress and remaining challenges of emerging neuromorphic vision sensors.