
dc.contributor.author: Tu, Jingzheng
dc.contributor.author: Yu, Guoxian
dc.contributor.author: Wang, Jun
dc.contributor.author: Domeniconi, Carlotta
dc.contributor.author: Zhang, Xiangliang
dc.date.accessioned: 2020-01-15T07:15:14Z
dc.date.available: 2020-01-15T07:15:14Z
dc.date.issued: 2019-12-24
dc.identifier.citation: Tu, J., Yu, G., Wang, J., Domeniconi, C., & Zhang, X. (2020). Attention-Aware Answers of the Crowd. Proceedings of the 2020 SIAM International Conference on Data Mining, 451–459. doi:10.1137/1.9781611976236.51
dc.identifier.doi: 10.1137/1.9781611976236.51
dc.identifier.uri: http://hdl.handle.net/10754/661042
dc.description.abstract: Crowdsourcing is a relatively economical and efficient way to collect annotations from the crowd through online platforms. Answers collected from workers with different expertise may be noisy and unreliable, so the quality of the annotated data needs to be maintained. Various solutions have been proposed to obtain high-quality annotations, but they all assume that a worker's label quality is stable over time (always at the same level whenever the worker performs tasks). In practice, a worker's attention level changes over time, and ignoring this change can harm the reliability of the annotations. In this paper, we focus on a novel and realistic crowdsourcing scenario involving attention-aware annotations. We propose a new probabilistic model that takes workers' attention into account to estimate label quality. Expectation propagation is adopted for efficient Bayesian inference of our model, and a generalized Expectation Maximization algorithm is derived to estimate both the ground truth of all tasks and the attention-dependent label quality of each individual crowd worker. In addition, the number of tasks best suited for a worker is estimated according to changes in attention. Experiments against related methods on three real-world datasets and one semi-simulated dataset demonstrate that our method quantifies the relationship between workers' attention and label quality on the given tasks, and improves the aggregated labels.
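The paper's attention-aware model is not reproduced here, but the EM-style label aggregation the abstract builds on can be sketched as a plain Dawid-Skene-style baseline: alternate between estimating per-task label posteriors and per-worker confusion matrices, without the attention component. The function name and data layout below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def em_label_aggregation(answers, n_classes, n_iter=50):
    """Aggregate noisy crowd labels with a simple EM loop
    (a Dawid-Skene-style baseline; the paper additionally
    models worker attention, which is omitted here).

    answers: iterable of (task_id, worker_id, label) triples.
    Returns posterior over true labels, shape (n_tasks, n_classes).
    """
    tasks = sorted({t for t, _, _ in answers})
    workers = sorted({w for _, w, _ in answers})
    T, W = len(tasks), len(workers)
    ti = {t: i for i, t in enumerate(tasks)}
    wi = {w: i for i, w in enumerate(workers)}

    # Initialize posteriors from per-task vote fractions.
    post = np.zeros((T, n_classes))
    for t, w, l in answers:
        post[ti[t], l] += 1
    post /= post.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # M-step: per-worker confusion matrices and class prior,
        # weighted by the current posteriors (with smoothing).
        conf = np.full((W, n_classes, n_classes), 1e-6)
        for t, w, l in answers:
            conf[wi[w], :, l] += post[ti[t]]
        conf /= conf.sum(axis=2, keepdims=True)
        prior = post.mean(axis=0)

        # E-step: recompute label posteriors from worker reliabilities.
        log_post = np.tile(np.log(prior), (T, 1))
        for t, w, l in answers:
            log_post[ti[t]] += np.log(conf[wi[w], :, l])
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
    return post
```

For example, with two reliable workers and one who always answers the opposite class, the EM loop learns the third worker's inverted confusion matrix and still recovers the true labels, whereas plain majority voting would only narrowly outvote the adversary.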
dc.publisher: arXiv
dc.relation.url: https://arxiv.org/pdf/1912.11238
dc.rights: Archived with thanks to arXiv
dc.title: Attention-Aware Answers of the Crowd
dc.type: Preprint
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Machine Intelligence & kNowledge Engineering Lab
dc.eprint.version: Pre-print
dc.contributor.institution: Southwest University
dc.contributor.institution: George Mason University
dc.identifier.arxivid: 1912.11238
kaust.person: Zhang, Xiangliang
refterms.dateFOA: 2020-01-15T07:15:44Z


Files in this item

Name: Preprintfile1.pdf
Size: 475.5 KB
Format: PDF
Description: Pre-print
