S2N: Suppression-Strengthen Network for Event-Based Recognition under Variant Illuminations

Zengyu Wan, Yang Wang, Ganchao Tan, Yang Cao, Zheng-Jun Zha

Abstract


"The emerging event-based sensors have demonstrated out-standing potential in visual tasks thanks to their high speed and high dynamic range. However, the event degradation due to imaging under low illumination obscures the correlation between event signals and brings uncertainty into event representation. Targeting this issue, we present a novel suppression-strengthen network (S2N) to augment the event feature representation after suppressing the influence of degradation. Specifically, a suppression sub-network is devised to obtain intensity mapping between the degraded and denoised enhancement frames by unsupervised learning. To further restrain the degradation’s influence, a strengthen sub-network is presented to generate robust event representation by adaptively perceiving the local variations between the center and surrounding regions. After being trained on a single illumination condition, our S2N can be directly generalized to other illuminations to boost the recognition performance. Experimental results on three challenging recognition tasks demonstrate the superiority of our method. The codes and datasets could refer to https://github.com/wanzengy/S2N-Suppression-Strengthen-Network."
