
    On Sparse Linear Regression in the Local Differential Privacy Model

    Type: Article
    Authors: Wang, Di; Xu, Jinhui
    KAUST Department: Division of Computer, Electrical and Mathematical Sciences and Engineering, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
    Date: 2020-11-25
    Online Publication Date: 2020-11-25
    Print Publication Date: 2021-02
    Submitted Date: 2005-04-19
    Permanent link to this record: http://hdl.handle.net/10754/666352
    
    Abstract
    In this paper, we study the sparse linear regression problem under the Local Differential Privacy (LDP) model. We first show that polynomial dependency on the dimensionality p of the space is unavoidable for the estimation error in both non-interactive and sequential interactive local models, if the privacy of the whole dataset needs to be preserved. Similar limitations also exist for other types of error measurements and in the relaxed local models. This indicates that differential privacy in high dimensional space is unlikely achievable for the problem. With the understanding of this limitation, we then present two algorithmic results. The first one is a sequential interactive LDP algorithm for the low dimensional sparse case, called Locally Differentially Private Iterative Hard Thresholding (LDP-IHT), which achieves a near optimal upper bound. This algorithm is actually rather general and can be used to solve quite a few other problems, such as (Local) DP-ERM with sparsity constraints and sparse regression with non-linear measurements. The second one is for the restricted (high dimensional) case where only the privacy of the responses (labels) needs to be preserved. For this case, we show that the optimal rate of the error estimation can be made logarithmically dependent on p (i.e., log p) in the local model, where an upper bound is obtained by a label-privacy version of LDP-IHT. Experiments on real world and synthetic datasets confirm our theoretical analysis.
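
    The LDP-IHT procedure summarized in the abstract lends itself to a compact sketch: in each round every user clips the gradient of its own squared loss at the current iterate, randomizes it locally with Gaussian noise, and the server averages the reports, takes a gradient step, and hard-thresholds to keep the iterate sparse. The Python below is a minimal illustration under stated assumptions, not the authors' algorithm: the function names, the naive per-round splitting of the privacy budget, and all parameter defaults (T, eta, clip) are choices made here for readability, and the noise calibration is the textbook Gaussian mechanism rather than the paper's analysis.

import numpy as np

def hard_threshold(v, s):
    """Keep the s largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    out[keep] = v[keep]
    return out

def ldp_iht(X, y, s, eps, delta, T=50, eta=0.1, clip=1.0, seed=None):
    """Illustrative locally private iterative hard thresholding loop.

    Each "user" i holds a single example (X[i], y[i]). Per round, the user
    clips its gradient and adds Gaussian noise before reporting; the server
    averages the noisy reports, takes a gradient step, and hard-thresholds.
    The noise scale is a crude (eps/T, delta/T)-per-round Gaussian-mechanism
    calibration, used only to make the sketch concrete.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    theta = np.zeros(p)
    # L2 sensitivity of a clipped report is at most 2 * clip.
    sigma = 2.0 * clip * np.sqrt(2.0 * np.log(1.25 * T / delta)) * T / eps
    for _ in range(T):
        # Per-user gradient of 0.5 * (x_i^T theta - y_i)^2.
        grads = (X @ theta - y)[:, None] * X
        # Clip each user's gradient to L2 norm at most `clip`.
        scale = np.maximum(np.linalg.norm(grads, axis=1) / clip, 1.0)
        grads = grads / scale[:, None]
        # Each user randomizes its own report (the "local" step).
        noisy = grads + rng.normal(0.0, sigma, size=grads.shape)
        # Server side: average, gradient step, hard threshold.
        theta = hard_threshold(theta - eta * noisy.mean(axis=0), s)
    return theta

# Hypothetical usage on synthetic data (a 5-sparse signal in 200 dimensions):
# X = np.random.default_rng(0).normal(size=(5000, 200))
# theta_hat = ldp_iht(X, X[:, :5].sum(axis=1), s=5, eps=2.0, delta=1e-5)

    Hard thresholding after the averaged gradient step is what keeps the iterate s-sparse, while the privacy cost is paid entirely in per-user noise; this is consistent with the abstract's point that the estimation error must grow polynomially with the dimension p when the whole dataset is protected.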
    Citation: Wang, D., & Xu, J. (2020). On Sparse Linear Regression in the Local Differential Privacy Model. IEEE Transactions on Information Theory, 1–1. doi:10.1109/tit.2020.3040406
    Publisher: Institute of Electrical and Electronics Engineers (IEEE)
    Journal: IEEE Transactions on Information Theory
    DOI: 10.1109/TIT.2020.3040406
    Additional Links: https://ieeexplore.ieee.org/document/9269994/
    Collections: Articles
