
dc.contributor.author: Shang, Shuo
dc.contributor.author: Chen, Lisi
dc.contributor.author: Wei, Zhewei
dc.contributor.author: Jensen, Christian S.
dc.contributor.author: Zheng, Kai
dc.contributor.author: Kalnis, Panos
dc.date.accessioned: 2017-09-21T09:25:34Z
dc.date.available: 2017-09-21T09:25:34Z
dc.date.issued: 2017-09-07
dc.identifier.citation: Shang S, Chen L, Wei Z, Jensen CS, Zheng K, et al. (2017) Trajectory similarity join in spatial networks. Proceedings of the VLDB Endowment 10: 1178–1189. Available: http://dx.doi.org/10.14778/3137628.3137630.
dc.identifier.issn: 2150-8097
dc.identifier.doi: 10.14778/3137628.3137630
dc.identifier.uri: http://hdl.handle.net/10754/625506
dc.description.abstract: The matching of similar pairs of objects, called similarity join, is fundamental functionality in data management. We consider the case of trajectory similarity join (TS-Join), where the objects are trajectories of vehicles moving in road networks. Thus, given two sets of trajectories and a threshold θ, the TS-Join returns all pairs of trajectories from the two sets with similarity above θ. This join targets applications such as trajectory near-duplicate detection, data cleaning, ridesharing recommendation, and traffic congestion prediction. With these applications in mind, we provide a purposeful definition of similarity. To enable efficient TS-Join processing on large sets of trajectories, we develop search space pruning techniques and take into account the parallel processing capabilities of modern processors. Specifically, we present a two-phase divide-and-conquer algorithm. For each trajectory, the algorithm first finds similar trajectories. Then it merges the results to achieve a final result. The algorithm exploits an upper bound on the spatiotemporal similarity and a heuristic scheduling strategy for search space pruning. The algorithm's per-trajectory searches are independent of each other and can be performed in parallel, and the merging has constant cost. An empirical study with real data offers insight into the performance of the algorithm and demonstrates that it is capable of outperforming a well-designed baseline algorithm by an order of magnitude.
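The abstract describes the TS-Join skeleton at a high level: independent per-trajectory searches that can run in parallel, pruning via an upper bound on the spatiotemporal similarity, and a cheap final merge. The sketch below is a rough, self-contained illustration of that skeleton only, not the paper's algorithm; the `similarity` and `upper_bound` functions (here a toy Jaccard measure over vertex sets and a length-based bound) are hypothetical stand-ins chosen solely to make the sketch runnable.

```python
# Minimal sketch of a threshold similarity join with per-trajectory searches,
# upper-bound pruning, and a simple merge. Illustration only; not the TS-Join
# algorithm or similarity measure defined in the paper.
from concurrent.futures import ThreadPoolExecutor
from itertools import chain


def ts_join(set_p, set_q, theta, similarity, upper_bound):
    """Return all pairs (p, q) from set_p x set_q with similarity above theta."""

    def search_one(p):
        matches = []
        for q in set_q:
            # Pruning: skip the exact computation when an optimistic
            # estimate already fails to exceed the threshold.
            if upper_bound(p, q) <= theta:
                continue
            if similarity(p, q) > theta:
                matches.append((p, q))
        return matches

    # Per-trajectory searches are independent, so they can be dispatched
    # in parallel; merging is a simple concatenation of result lists.
    with ThreadPoolExecutor() as pool:
        per_trajectory = pool.map(search_one, set_p)
    return list(chain.from_iterable(per_trajectory))


if __name__ == "__main__":
    # Toy trajectories given as tuples of road-network vertex ids.
    P = [(1, 2, 3, 4), (5, 6, 7)]
    Q = [(2, 3, 4, 8), (9, 10)]

    def jaccard(p, q):
        a, b = set(p), set(q)
        return len(a & b) / len(a | b)

    def jaccard_upper_bound(p, q):
        # |A ∩ B| <= min(|A|, |B|) and |A ∪ B| >= max(|A|, |B|),
        # so this ratio is a valid (loose) upper bound on Jaccard similarity.
        a, b = set(p), set(q)
        return min(len(a), len(b)) / max(len(a), len(b))

    print(ts_join(P, Q, theta=0.4,
                  similarity=jaccard, upper_bound=jaccard_upper_bound))
```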
dc.description.sponsorship: This work is partially supported by KAUST, the National Natural Science Foundation of China (61402532, 61532018), Beijing Nova Program (xx2016078), and by the DiCyPS center, funded by Innovation Fund Denmark.
dc.publisher: VLDB Endowment
dc.relation.url: http://dl.acm.org/citation.cfm?doid=3137628.3137630
dc.rights: This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/4.0/. For any use beyond those covered by this license, obtain permission by emailing info@vldb.org.
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title: Trajectory similarity join in spatial networks
dc.type: Article
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Computer Science Program
dc.identifier.journal: Proceedings of the VLDB Endowment
dc.eprint.version: Publisher's Version/PDF
dc.contributor.institution: HKBU
dc.contributor.institution: Renmin University of China
dc.contributor.institution: Aalborg University
dc.contributor.institution: Soochow University
kaust.person: Shang, Shuo
kaust.person: Kalnis, Panos
refterms.dateFOA: 2018-06-14T03:31:22Z


Files in this item

Name: p1178-shang.pdf
Size: 904.7 KB
Format: PDF
Description: Main article


