Temporal salience-based human action recognition

IEEE ICASSP 2019

Salah Al-Obaidi and Charith Abhayaratne

This paper proposes a new approach for human action recognition that exploits temporal salience. Features are extracted over temporal saliency maps using a local dense descriptor to learn the action representation. This automatically guides the descriptor towards the most interesting content, i.e., the salient regions, so the action representation is obtained from the saliency information alone.
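The paper's exact saliency computation and dense descriptor are not reproduced here. The sketch below is only an illustration of the general idea of restricting the representation to salient content: it uses simple frame differencing as a stand-in temporal saliency map and flattened salient patches with mean pooling as a stand-in local dense descriptor and encoding. All function names and parameters (temporal_saliency, patch, thresh) are illustrative assumptions, not from the paper.

```python
import numpy as np


def temporal_saliency(frames):
    """Approximate per-pixel temporal saliency as absolute frame differences.

    frames: (T, H, W) grayscale video, float in [0, 1].
    Returns a (T-1, H, W) array of per-frame saliency maps in [0, 1].
    """
    diffs = np.abs(np.diff(frames, axis=0))
    # Normalise each map so the threshold is comparable across frames.
    maxima = diffs.reshape(diffs.shape[0], -1).max(axis=1)
    return diffs / np.maximum(maxima, 1e-8).reshape(-1, 1, 1)


def salient_patch_descriptors(saliency_maps, patch=8, thresh=0.3):
    """Collect simple dense patch descriptors restricted to salient regions.

    Patches whose mean saliency exceeds `thresh` are flattened into
    descriptors; everything else is ignored, so the representation is
    driven only by the salient (moving) content.
    """
    descriptors = []
    for smap in saliency_maps:
        h, w = smap.shape
        for y in range(0, h - patch + 1, patch):
            for x in range(0, w - patch + 1, patch):
                block = smap[y:y + patch, x:x + patch]
                if block.mean() > thresh:
                    descriptors.append(block.ravel())
    return np.array(descriptors)


def action_representation(frames, patch=8, thresh=0.3):
    """Aggregate salient-patch descriptors into one fixed-length vector
    (mean pooling here; the paper's actual encoding may differ)."""
    desc = salient_patch_descriptors(temporal_saliency(frames), patch, thresh)
    if desc.size == 0:
        return np.zeros(patch * patch)
    return desc.mean(axis=0)


if __name__ == "__main__":
    # Toy example: a bright square moving across an otherwise static scene.
    T, H, W = 16, 64, 64
    video = np.zeros((T, H, W))
    for t in range(T):
        video[t, 20:36, 4 + 3 * t:20 + 3 * t] = 1.0
    rep = action_representation(video)
    print(rep.shape)  # one pooled 64-D descriptor for the whole clip
```

In a full pipeline, a representation like this would be fed to a standard classifier (e.g., an SVM) trained per action class; the key point illustrated is that non-salient, static background contributes nothing to the descriptor.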

Results on the Weizmann, DHA and KTH datasets confirm the effectiveness of the proposed approach compared to state-of-the-art methods, in terms of accuracy and robustness to intra-action variations and inter-action similarities. The proposed method improves accuracy by 2.7% on DHA and 1% on KTH, and is comparable to the state of the art on Weizmann.

S. Al-Obaidi and C. Abhayaratne, "Temporal salience-based human action recognition," in Proc. IEEE ICASSP 2019, pp. 2017–2021.
