
dc.contributor.author: Yang, Yufeng
dc.contributor.author: Ma, Yixiao
dc.contributor.author: Zhang, Jing
dc.contributor.author: Gao, Xin
dc.contributor.author: Xu, Min
dc.date.accessioned: 2020-09-27T13:21:25Z
dc.date.available: 2020-09-27T13:21:25Z
dc.date.issued: 2020-09-23
dc.date.submitted: 2020-08-14
dc.identifier.citation: Yang, Y., Ma, Y., Zhang, J., Gao, X., & Xu, M. (2020). AttPNet: Attention-Based Deep Neural Network for 3D Point Set Analysis. Sensors, 20(19), 5455. doi:10.3390/s20195455
dc.identifier.issn: 1424-8220
dc.identifier.pmid: 32977508
dc.identifier.doi: 10.3390/s20195455
dc.identifier.uri: http://hdl.handle.net/10754/665326
dc.description.abstract: The point set is a major 3D structure representation format, characterized by its data availability and compactness. Most previous deep learning-based point set models pay equal attention to different point set regions and channels, and thus have limited ability to focus on the small regions and specific channels that are important for characterizing the object of interest. In this paper, we introduce a novel model named Attention-based Point Network (AttPNet). It uses an attention mechanism for both global feature masking and channel weighting to focus on characteristic regions and channels. Our model has two branches. The first branch calculates an attention mask for every point. The second branch uses convolution layers to abstract global features from point sets, where a channel attention block is applied to focus on important channels. Evaluations on the ModelNet40 benchmark dataset show that our model outperforms the existing best model on classification by 0.7% without voting. In addition, experiments on augmented data demonstrate that our model is robust to rotational perturbations and missing points. We also construct an Electron Cryo-Tomography (ECT) point cloud dataset and further demonstrate our model's ability to handle fine-grained structures on it. (An illustrative sketch of this two-branch design follows the metadata record below.)
dc.description.sponsorship: This work was supported in part by U.S. National Institutes of Health (NIH) grants P41GM103712, R01GM134020, and K01MH123896; U.S. National Science Foundation (NSF) grants DBI-1949629 and IIS-2007595; Mark Foundation for Cancer Research grant 19-044-ASP; and King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research (OSR) under Award Nos. URF/1/2602-01 and URF/1/3007-01.
dc.publisher: MDPI AG
dc.relation.url: https://www.mdpi.com/1424-8220/20/19/5455
dc.rights: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: AttPNet: Attention-Based Deep Neural Network for 3D Point Set Analysis
dc.type: Article
dc.contributor.department: Computational Bioscience Research Center (CBRC)
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Structural and Functional Bioinformatics Group
dc.identifier.journal: Sensors
dc.eprint.version: Publisher's Version/PDF
dc.contributor.institution: Computational Biology Department, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213, USA.
dc.contributor.institution: Department of Computer Science, University of California, Irvine, CA 92697, USA.
dc.identifier.volume: 20
dc.identifier.issue: 19
dc.identifier.pages: 5455
kaust.person: Gao, Xin
kaust.grant.number: URF/1/2602-01
kaust.grant.number: URF/1/3007-01
dc.date.accepted: 2020-09-08
refterms.dateFOA: 2020-09-27T13:22:30Z
kaust.acknowledged.supportUnit: Office of Sponsored Research (OSR)
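
The abstract above describes a two-branch design: one branch predicts a per-point attention mask, and the other extracts global features with a channel attention block. Below is a minimal, hypothetical PyTorch sketch of those two ideas; the module names, layer sizes, and pooling choice are illustrative assumptions, not the authors' implementation (the published paper and its code are the authoritative reference).

# Illustrative sketch only: per-point attention masking plus channel
# weighting, as described in the abstract. All names are hypothetical
# and do NOT come from the authors' released code.
import torch
import torch.nn as nn

class PointAttentionMask(nn.Module):
    """Scores every point with a scalar attention weight in (0, 1)."""
    def __init__(self, in_channels: int = 3, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv1d(in_channels, hidden, 1),
            nn.ReLU(),
            nn.Conv1d(hidden, 1, 1),
            nn.Sigmoid(),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, channels, num_points) -> mask: (batch, 1, num_points)
        return self.mlp(points)

class ChannelAttention(nn.Module):
    """Re-weights feature channels (squeeze-and-excitation style)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (batch, channels, num_points)
        weights = self.fc(feats.mean(dim=-1))   # squeeze over points
        return feats * weights.unsqueeze(-1)    # re-weight each channel

# Usage: mask raw points, extract features, re-weight channels, pool.
points = torch.randn(8, 3, 1024)                       # a batch of point clouds
mask = PointAttentionMask()(points)                    # (8, 1, 1024)
feats = nn.Conv1d(3, 128, 1)(points * mask)            # masked feature extraction
global_feat = ChannelAttention(128)(feats).max(dim=-1).values  # (8, 128)

The sigmoid mask lets the network suppress uninformative points before global feature extraction, while the channel block emphasizes the feature channels most useful for the downstream classifier; max-pooling over points is one common way to obtain an order-invariant global descriptor.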


Files in this item

Name: sensors-20-05455-v2.pdf
Size: 6.255 MB
Format: PDF
Description: Published version

