When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System

Handle URI:
http://hdl.handle.net/10754/623935
Title:
When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System
Authors:
Liu, Xiao; Liu, An; Zhang, Xiangliang (0000-0002-3574-5665); Li, Zhixu; Liu, Guanfeng; Zhao, Lei; Zhou, Xiaofang
Abstract:
Privacy risks of recommender systems have attracted increasing attention. Users’ private data are often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed both to hide users’ private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has a limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with an acceptable loss of recommendation accuracy, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice.
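The two noise layers described in the abstract can be sketched as follows. This is an illustrative example only, not the paper's algorithm: the function names, the uniform RP noise, the rating range, and the released statistic (an item's mean rating via the Laplace mechanism) are all assumptions. It does show the key interaction the abstract mentions: the DP sensitivity is analysed on the perturbed data, whose value range is widened by the RP noise bound.

```python
import math
import random

def randomized_perturbation(ratings, scale=0.5):
    """RP layer: a user disguises ratings with zero-mean uniform noise
    before handing them to the (possibly untrusted) recommender."""
    return [r + random.uniform(-scale, scale) for r in ratings]

def laplace_noise(b):
    """Draw one sample from Laplace(0, b) by inverse-CDF sampling."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -b * sign * math.log(1.0 - 2.0 * abs(u))

def dp_item_average(perturbed, epsilon, r_min=1.0, r_max=5.0, rp_scale=0.5):
    """DP layer: release an item's mean rating under epsilon-DP.
    Sensitivity is computed on the *perturbed* data, so the rating
    range [r_min, r_max] is widened by the RP noise bound rp_scale."""
    n = len(perturbed)
    sensitivity = ((r_max + rp_scale) - (r_min - rp_scale)) / n
    return sum(perturbed) / n + laplace_noise(sensitivity / epsilon)

ratings = [4.0, 5.0, 3.0, 4.0]
disguised = randomized_perturbation(ratings)
print(dp_item_average(disguised, epsilon=1.0))
```

Under this sketch, a stronger RP disguise (larger `rp_scale`) enlarges the sensitivity and therefore the DP noise for the same epsilon, which is why the paper's sensitivity analysis on the perturbed data matters for keeping accuracy loss acceptable.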
KAUST Department:
King Abdullah University of Science and Technology, Thuwal, Kingdom of Saudi Arabia
Citation:
Liu X, Liu A, Zhang X, Li Z, Liu G, et al. (2017) When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System. Lecture Notes in Computer Science: 576–591. Available: http://dx.doi.org/10.1007/978-3-319-55753-3_36.
Publisher:
Springer Nature
Journal:
Database Systems for Advanced Applications
Issue Date:
21-Mar-2017
DOI:
10.1007/978-3-319-55753-3_36
Type:
Book Chapter
ISSN:
0302-9743; 1611-3349
Sponsors:
This work was done while the first author was a visiting student at King Abdullah University of Science and Technology (KAUST). Research reported in this publication was partially supported by KAUST and Natural Science Foundation of China (Grant Nos. 61572336, 61572335, 61632016, 61402313).
Additional Links:
http://link.springer.com/chapter/10.1007/978-3-319-55753-3_36
Appears in Collections:
Book Chapters

Full metadata record

DC Field | Value | Language
dc.contributor.author | Liu, Xiao | en
dc.contributor.author | Liu, An | en
dc.contributor.author | Zhang, Xiangliang | en
dc.contributor.author | Li, Zhixu | en
dc.contributor.author | Liu, Guanfeng | en
dc.contributor.author | Zhao, Lei | en
dc.contributor.author | Zhou, Xiaofang | en
dc.date.accessioned | 2017-05-31T11:23:14Z | -
dc.date.available | 2017-05-31T11:23:14Z | -
dc.date.issued | 2017-03-21 | en
dc.identifier.citation | Liu X, Liu A, Zhang X, Li Z, Liu G, et al. (2017) When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System. Lecture Notes in Computer Science: 576–591. Available: http://dx.doi.org/10.1007/978-3-319-55753-3_36. | en
dc.identifier.issn | 0302-9743 | en
dc.identifier.issn | 1611-3349 | en
dc.identifier.doi | 10.1007/978-3-319-55753-3_36 | en
dc.identifier.uri | http://hdl.handle.net/10754/623935 | -
dc.description.abstract | Privacy risks of recommender systems have attracted increasing attention. Users’ private data are often collected by a possibly untrusted recommender system in order to provide high-quality recommendations. Meanwhile, malicious attackers may utilize recommendation results to make inferences about other users’ private data. Existing approaches focus either on keeping users’ private data protected during recommendation computation or on preventing the inference of any single user’s data from the recommendation result. However, none is designed both to hide users’ private data and to prevent privacy inference. To achieve this goal, we propose in this paper a hybrid approach for privacy-preserving recommender systems that combines differential privacy (DP) with randomized perturbation (RP). We theoretically show that the noise added by RP has a limited effect on recommendation accuracy and that the noise added by DP can be well controlled based on the sensitivity analysis of functions on the perturbed data. Extensive experiments on three large-scale real-world datasets show that the hybrid approach generally provides more privacy protection with an acceptable loss of recommendation accuracy, and surprisingly sometimes achieves better privacy without sacrificing accuracy, thus validating its feasibility in practice. | en
dc.description.sponsorship | This work was done while the first author was a visiting student at King Abdullah University of Science and Technology (KAUST). Research reported in this publication was partially supported by KAUST and the Natural Science Foundation of China (Grant Nos. 61572336, 61572335, 61632016, 61402313). | en
dc.publisher | Springer Nature | en
dc.relation.url | http://link.springer.com/chapter/10.1007/978-3-319-55753-3_36 | en
dc.subject | Recommender systems | en
dc.subject | Privacy-preserving | en
dc.subject | Differential privacy | en
dc.subject | Randomized perturbation | en
dc.title | When Differential Privacy Meets Randomized Perturbation: A Hybrid Approach for Privacy-Preserving Recommender System | en
dc.type | Book Chapter | en
dc.contributor.department | King Abdullah University of Science and Technology, Thuwal, Kingdom of Saudi Arabia | en
dc.identifier.journal | Database Systems for Advanced Applications | en
dc.contributor.institution | Soochow University, Suzhou, China | en
dc.contributor.institution | University of Queensland, Brisbane, Australia | en
kaust.author | Liu, An | en
kaust.author | Zhang, Xiangliang | en
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.