Show simple item record

dc.contributor.author: Zhang, Yongqiang
dc.contributor.author: Ding, Mingli
dc.contributor.author: Bai, Yancheng
dc.contributor.author: Ghanem, Bernard
dc.date.accessioned: 2019-05-20T08:58:15Z
dc.date.available: 2019-05-20T08:58:15Z
dc.date.issued: 2019-05-15
dc.identifier.citation: Zhang Y, Ding M, Bai Y, Ghanem B (2019) Detecting Small Faces in the Wild Based on Generative Adversarial Network and Contextual Information. Pattern Recognition. Available: http://dx.doi.org/10.1016/j.patcog.2019.05.023.
dc.identifier.issn: 0031-3203
dc.identifier.doi: 10.1016/j.patcog.2019.05.023
dc.identifier.uri: http://hdl.handle.net/10754/652909
dc.description.abstract: Face detection techniques have been developed for decades, yet detecting small faces in unconstrained conditions remains an open challenge: tiny faces are often blurry and lack detailed information. In this paper, we propose an algorithm that directly generates a clear high-resolution face from a small blurry one using a generative adversarial network (GAN). Basic GAN formulations achieve this by super-resolving and refining sequentially (e.g., SR-GAN and Cycle-GAN); in contrast, we design a novel network that addresses super-resolution and refinement jointly. Moreover, we introduce new training losses (i.e., a classification loss and a regression loss) that encourage the generator network to recover fine details of small faces and guide the discriminator network to simultaneously distinguish face vs. non-face and refine locations. Additionally, given the importance of contextual information for detecting tiny faces in crowded scenes, the context around face regions is incorporated when training the proposed GAN-based network to mine very small faces from unconstrained scenarios. Extensive experiments on the challenging WIDER FACE and FDDB datasets demonstrate the effectiveness of the proposed method in restoring a clear high-resolution face from a small blurry one, and show that its performance outperforms previous state-of-the-art methods by a large margin.
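The abstract describes a discriminator trained with an adversarial term plus two new losses: a face/non-face classification loss and a location-regression loss. The following is a minimal, hedged sketch (not the authors' code; all function names and loss weights are assumptions) of how such a joint objective can be combined, using binary cross-entropy for the adversarial and classification terms and Smooth L1 for box regression:

```python
import numpy as np

def bce(p, y):
    """Binary cross-entropy: used for the adversarial (real vs. generated)
    and classification (face vs. non-face) terms."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    return float(-(y * np.log(p) + (1 - y) * np.log(1 - p)).mean())

def smooth_l1(pred, target):
    """Smooth L1 loss for bounding-box refinement (as popularized by
    Fast R-CNN); an assumed choice for the paper's regression loss."""
    d = np.abs(pred - target)
    return float(np.where(d < 1.0, 0.5 * d ** 2, d - 0.5).mean())

def discriminator_loss(adv_p, adv_y, cls_p, cls_y, box_p, box_t,
                       w_cls=1.0, w_reg=1.0):
    """Joint discriminator objective: adversarial + classification +
    box-regression terms. The weights w_cls and w_reg are hypothetical."""
    return (bce(adv_p, adv_y)
            + w_cls * bce(cls_p, cls_y)
            + w_reg * smooth_l1(box_p, box_t))

# Toy usage with dummy predictions and targets:
loss = discriminator_loss(
    adv_p=np.array([0.9, 0.2]), adv_y=np.array([1.0, 0.0]),
    cls_p=np.array([0.8]),      cls_y=np.array([1.0]),
    box_p=np.array([0.5, 2.0]), box_t=np.array([0.0, 0.0]),
)
```

The design point the abstract makes is that the classification and regression gradients flow back through the generator as well, so super-resolved faces are optimized to be detectable, not merely photorealistic.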
dc.description.sponsorship: The majority of this work was done while Yongqiang Zhang was a visiting PhD student at King Abdullah University of Science and Technology (KAUST), and it was continued at Harbin Institute of Technology (HIT). This work was supported by the Natural Science Foundation of China, Grant No. 61603372.
dc.publisher: Elsevier BV
dc.relation.url: https://www.sciencedirect.com/science/article/pii/S0031320319301980
dc.rights: NOTICE: this is the author's version of a work that was accepted for publication in Pattern Recognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition (2019-05-15), DOI: 10.1016/j.patcog.2019.05.023. © 2019. This manuscript version is made available under the CC-BY-NC-ND 4.0 license: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Face detection
dc.subject: Tiny faces
dc.subject: Super-resolution
dc.subject: Generative adversarial network
dc.subject: Contextual information
dc.title: Detecting Small Faces in the Wild Based on Generative Adversarial Network and Contextual Information
dc.type: Article
dc.contributor.department: Visual Computing Center (VCC)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Electrical Engineering Program
dc.identifier.journal: Pattern Recognition
dc.eprint.version: Post-print
dc.contributor.institution: School of Electrical Engineering and Automation, Harbin Institute of Technology, Harbin 15001, China
kaust.person: Bai, Yancheng
kaust.person: Ghanem, Bernard
dc.date.published-online: 2019-05-15
dc.date.published-print: 2019-10


Files in this item

Name: 1-s2.0-S0031320319301980-main.pdf
Size: 27.76 MB
Format: PDF
Description: Accepted Manuscript
