Action Search: Learning to Search for Human Activities in Untrimmed Videos

Handle URI:
http://hdl.handle.net/10754/626462
Title:
Action Search: Learning to Search for Human Activities in Untrimmed Videos
Authors:
Alwassel, Humam; Heilbron, Fabian Caba; Ghanem, Bernard (ORCID: 0000-0002-5534-587X)
Abstract:
Traditional approaches for action detection use trimmed data to learn sophisticated action detector models. Although these methods have achieved great success at detecting human actions, we argue that substantial information is discarded by ignoring the process through which this trimmed data is obtained. In this paper, we propose Action Search, a novel approach that mimics the way people annotate activities in video sequences. Using a Recurrent Neural Network, Action Search can efficiently explore a video and determine the time boundaries during which an action occurs. Experiments on the THUMOS14 dataset reveal that our model is not only able to explore the video efficiently but also accurately find human activities, outperforming state-of-the-art methods.
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division; Computer Science Program; Electrical Engineering Program; Visual Computing Center (VCC)
Publisher:
arXiv
Issue Date:
13-Jun-2017
arXiv ID:
arXiv:1706.04269
Type:
Preprint
Additional Links:
http://arxiv.org/abs/1706.04269v1; http://arxiv.org/pdf/1706.04269v1
Appears in Collections:
Other/General Submission; Computer Science Program; Electrical Engineering Program; Visual Computing Center (VCC); Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field: Value (Language)
dc.contributor.author: Alwassel, Humam (en)
dc.contributor.author: Heilbron, Fabian Caba (en)
dc.contributor.author: Ghanem, Bernard (en)
dc.date.accessioned: 2017-12-28T07:32:11Z
dc.date.available: 2017-12-28T07:32:11Z
dc.date.issued: 2017-06-13 (en)
dc.identifier.uri: http://hdl.handle.net/10754/626462
dc.description.abstract: Traditional approaches for action detection use trimmed data to learn sophisticated action detector models. Although these methods have achieved great success at detecting human actions, we argue that substantial information is discarded by ignoring the process through which this trimmed data is obtained. In this paper, we propose Action Search, a novel approach that mimics the way people annotate activities in video sequences. Using a Recurrent Neural Network, Action Search can efficiently explore a video and determine the time boundaries during which an action occurs. Experiments on the THUMOS14 dataset reveal that our model is not only able to explore the video efficiently but also accurately find human activities, outperforming state-of-the-art methods. (en)
dc.publisher: arXiv (en)
dc.relation.url: http://arxiv.org/abs/1706.04269v1 (en)
dc.relation.url: http://arxiv.org/pdf/1706.04269v1 (en)
dc.rights: Archived with thanks to arXiv (en)
dc.title: Action Search: Learning to Search for Human Activities in Untrimmed Videos (en)
dc.type: Preprint (en)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division (en)
dc.contributor.department: Computer Science Program (en)
dc.contributor.department: Electrical Engineering Program (en)
dc.contributor.department: Visual Computing Center (VCC) (en)
dc.eprint.version: Pre-print (en)
dc.identifier.arxivid: arXiv:1706.04269 (en)
kaust.author: Alwassel, Humam (en)
kaust.author: Heilbron, Fabian Caba (en)
kaust.author: Ghanem, Bernard (en)
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.