On Privacy-Utility Tradeoffs for Constrained Data Release Mechanisms

Summary

A privacy-preserving data release mechanism aims to simultaneously minimize the leakage of sensitive data and the distortion of useful data. Because the sensitive and useful data are statistically dependent, there is an inherent privacy-utility trade-off, which is closely related to the generalized rate-distortion problem.

Introduction

In this work, we study how directly constraining the data available as input to the release mechanism affects the optimal privacy-utility trade-off region. Such restrictions may arise in applications where the mechanism cannot directly observe the sensitive or the useful data. For example, the useful data may be an unobserved attribute that can only be inferred from the sensitive data. Alternatively, the constraints can capture the limitations of a particular class of methods, such as output-perturbation release mechanisms that take only the useful data as input and ignore the remaining sensitive data.
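A minimal numerical sketch of this constrained setting may help. The sketch below is not from the paper: the joint distribution p_xy, the flip probability eps, and the choice of I(X;Z) as the leakage measure and Pr[Z != Y] as the distortion measure are all illustrative assumptions. It traces how an output-perturbation mechanism that sees only the useful data Y trades leakage about the sensitive data X against distortion of Y:

```python
import numpy as np

def mutual_information(joint):
    """I(A;B) in bits for a joint pmf given as a 2-D array."""
    pa = joint.sum(axis=1, keepdims=True)   # marginal of the row variable
    pb = joint.sum(axis=0, keepdims=True)   # marginal of the column variable
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (pa @ pb)[mask])).sum())

# Hypothetical toy source: X (sensitive) and Y (useful) are correlated bits.
# Rows index x, columns index y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

# Output-perturbation mechanism: Z depends on Y only, flipping Y with
# probability eps, i.e. p(Z|Y) is a binary symmetric channel.
for eps in [0.0, 0.1, 0.25, 0.5]:
    p_z_given_y = np.array([[1 - eps, eps],
                            [eps, 1 - eps]])
    # Since Z sees only Y, X -> Y -> Z is a Markov chain, so
    # p(x, z) = sum_y p(x, y) p(z | y).
    p_xz = p_xy @ p_z_given_y
    p_yz = np.diag(p_xy.sum(axis=0)) @ p_z_given_y  # p(y, z) = p(y) p(z|y)
    leakage = mutual_information(p_xz)              # privacy leakage I(X;Z)
    distortion = 1.0 - np.trace(p_yz)               # utility loss Pr[Z != Y]
    print(f"eps={eps:.2f}  I(X;Z)={leakage:.3f} bits  Pr[Z!=Y]={distortion:.3f}")
```

As eps grows from 0 to 1/2, the released Z becomes independent of X (zero leakage) at the cost of maximal distortion of Y; sweeping eps traces out the trade-off curve available to this constrained class of mechanisms.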

Naive attempts to anonymize data have led to widely publicized leaks of sensitive information, see e.g. [2], [3]. These failures later inspired a variety of statistical formulations and techniques for protecting privacy, such as k-anonymity [4], l-diversity [5], t-closeness [6], and differential privacy [7]. Our work builds on the non-asymptotic information-theoretic treatment of this problem, as in [1], [8], where the sensitive and useful data are modeled as random variables X and Y, respectively, and the mechanism design is cast as a channel design problem of achieving the optimal privacy-utility trade-off.
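For concreteness, one common way to state this channel design problem (the notation here is a paraphrase; the precise leakage and distortion measures used in [1], [8] may differ) is as an optimization over release mechanisms $p_{Z \mid X, Y}$:

$$
\min_{p_{Z \mid X, Y}} \; I(X; Z)
\quad \text{subject to} \quad
\mathbb{E}\big[\, d(Y, Z) \,\big] \le D,
$$

where $d$ is a distortion measure and $D$ the allowed utility loss. Input constraints of the kind studied here shrink the feasible set of mechanisms: for example, the output-perturbation restriction corresponds to imposing the Markov chain $X \to Y \to Z$, so that $p_{Z \mid X, Y} = p_{Z \mid Y}$.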
