This paper presents new kinds of image sensors based on TFS (Time to First Spike) pixels and DVS (Dynamic Vision Sensor) pixels, which take advantage of non-uniform sampling and redundancy suppression to reduce the data throughput. DVS pixels only detect luminance variations, while TFS pixels quantize luminance by measuring the time required to cross a threshold. Such image sensors output requests through an Address Event Representation (AER), which helps to reduce the data stream. The resulting event bitstream is composed of time, position, polarity, and magnitude information. Such a bitstream offers new possibilities for image processing, such as event-by-event object tracking. In particular, we propose processing steps to cluster events, filter noise, and extract other useful features, such as a velocity estimate.
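To make the event-by-event processing concrete, the following is a minimal Python sketch of an AER-style event record (time, position, polarity, magnitude) and an incremental clustering step with a crude velocity estimate. The `Event` and `Cluster` classes, the `process_event` function, and the `radius`/`alpha` parameters are illustrative assumptions and not the algorithms actually proposed in the paper.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Event:
    t: float       # timestamp (s)
    x: float       # pixel column
    y: float       # pixel row
    polarity: int  # +1 / -1 direction of luminance change (DVS-style)
    magnitude: float = 0.0  # quantized luminance (TFS-style), if available

@dataclass
class Cluster:
    cx: float      # centroid column
    cy: float      # centroid row
    last_t: float  # timestamp of last assigned event
    vx: float = 0.0
    vy: float = 0.0
    count: int = 1

def process_event(ev: Event, clusters: List[Cluster],
                  radius: float = 10.0, alpha: float = 0.1) -> Cluster:
    """Assign an event to the nearest cluster within `radius` pixels,
    updating its centroid and a simple velocity estimate; otherwise
    start a new cluster. Parameters are purely illustrative."""
    best: Optional[Cluster] = None
    best_d2 = radius * radius
    for c in clusters:
        d2 = (ev.x - c.cx) ** 2 + (ev.y - c.cy) ** 2
        if d2 <= best_d2:
            best, best_d2 = c, d2
    if best is None:
        best = Cluster(cx=ev.x, cy=ev.y, last_t=ev.t)
        clusters.append(best)
        return best
    dt = ev.t - best.last_t
    # exponential moving average of the centroid position
    new_cx = (1 - alpha) * best.cx + alpha * ev.x
    new_cy = (1 - alpha) * best.cy + alpha * ev.y
    if dt > 0:
        # velocity estimated from centroid displacement between events
        best.vx = (new_cx - best.cx) / dt
        best.vy = (new_cy - best.cy) / dt
    best.cx, best.cy, best.last_t = new_cx, new_cy, ev.t
    best.count += 1
    return best
```

In this kind of pipeline, isolated events that never accumulate into a cluster (e.g. clusters whose `count` stays at 1) can be discarded as noise, which is one simple way the noise filtering mentioned above could be realized.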