Show simple item record

dc.contributor.author: Chen, Xiuying
dc.contributor.author: Alamro, Hind
dc.contributor.author: Li, Mingzhe
dc.contributor.author: Gao, Shen
dc.contributor.author: Zhang, Xiangliang
dc.contributor.author: Zhao, Dongyan
dc.contributor.author: Yan, Rui
dc.date.accessioned: 2022-04-05T06:28:04Z
dc.date.available: 2022-04-05T06:28:04Z
dc.date.issued: 2021
dc.identifier.citation: Chen, X., Alamro, H., Li, M., Gao, S., Zhang, X., Zhao, D., & Yan, R. (2021). Capturing Relations between Scientific Papers: An Abstractive Model for Related Work Section Generation. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). https://doi.org/10.18653/v1/2021.acl-long.473
dc.identifier.isbn: 9781954085527
dc.identifier.doi: 10.18653/v1/2021.acl-long.473
dc.identifier.uri: http://hdl.handle.net/10754/676121
dc.description.abstract: Given a set of related publications, related work section generation aims to provide researchers with an overview of a specific research area by summarizing these works and introducing them in a logical order. Most existing related work section generation models follow an inflexible extractive style, directly extracting sentences from multiple original papers to form a related work discussion. Hence, in this paper, we propose a Relation-aware Related work Generator (RRG), which generates an abstractive related work section from multiple scientific papers in the same research area. Concretely, we propose a relation-aware multi-document encoder that relates one document to another according to their content dependency in a relation graph. The relation graph and the document representations interact and are refined iteratively, complementing each other during training. We also contribute two public datasets composed of related work sections and their corresponding papers. Extensive experiments on the two datasets show that the proposed model brings substantial improvements over several strong baselines. We hope that this work will promote advances in the related work section generation task.
dc.description.sponsorship: We would like to thank the anonymous reviewers for their constructive comments. This work was supported by the National Key Research and Development Program of China (No. 2017YFC0804001), the National Science Foundation of China (NSFC No. 61876196 and NSFC No. 61672058). Rui Yan is partially supported as a Young Fellow of Beijing Institute of Artificial Intelligence (BAAI).
dc.publisher: Association for Computational Linguistics
dc.relation.url: https://aclanthology.org/2021.acl-long.473
dc.rights: Archived with thanks to Association for Computational Linguistics
dc.title: Capturing relations between scientific papers: An abstractive model for related work section generation
dc.type: Conference Paper
dc.contributor.department: King Abdullah University of Science and Technology, Thuwal, Saudi Arabia
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division
dc.conference.date: 2021-08-01 to 2021-08-06
dc.conference.name: Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, ACL-IJCNLP 2021
dc.conference.location: Virtual, Online
dc.eprint.version: Publisher's Version/PDF
dc.contributor.institution: Wangxuan Institute of Computer Technology, Peking University, Beijing, China
dc.contributor.institution: Center for Data Science, Peking University, Beijing, China
dc.contributor.institution: Computer Science Department, Umm Al-Qura University, Makkah, Saudi Arabia
dc.contributor.institution: State Key Laboratory of Media Convergence Production Technology and Systems
dc.contributor.institution: Gaoling School of Artificial Intelligence, Renmin University of China
dc.identifier.pages: 6068-6077
kaust.person: Chen, Xiuying
kaust.person: Alamro, Hind
kaust.person: Zhang, Xiangliang
dc.identifier.eid: 2-s2.0-85118936787
refterms.dateFOA: 2022-04-05T06:29:19Z


Files in this item

Name: 2021.acl-long.473.pdf
Size: 674.3Kb
Format: PDF
Description: Publisher's Version/PDF
