Overview

GroupPriv is a privacy-aware data disclosure scheme that considers group privacy requirements of individuals in bipartite association graph datasets (e.g., graphs that represent associations between entities such as customers and products bought from a pharmacy store).

The key features include the new notion of εg-Group Differential Privacy, which protects sensitive information of groups of individuals at various defined group protection levels, enabling data users to obtain the level of information they are entitled to. It includes a suite of differentially private mechanisms that protect group privacy in bipartite association graphs at different group privacy levels based on specialization hierarchies.

Motivation

Conventional differential privacy mechanisms are designed to protect the privacy of an individual's information. In certain situations, even aggregate (statistical) information about individuals may not be safe for disclosure, as the aggregate information itself can be sensitive and may need protection. In general, sensitive information may arise either as:

  • an individual sensitive value indicating an individual’s private information (e.g., did buyer ‘Bob’ purchase the drug ‘insulin’?) in a dataset; or
  • a statistical value representing some sensitive statistics about a group/sub-group of individuals (e.g., the total number of ‘Psychiatric’ drugs purchased by buyers in a given neighborhood represented by a zipcode).



Group Differential Privacy

Group Differential Privacy extends the conventional differential privacy model to protect the privacy of groups of individuals at various group granularity levels. We focus on scenarios where one needs to protect group-level privacy in addition to individual privacy, where a group consists of a set of individuals. We define the proposed notion of εg-group differential privacy by considering adjacent datasets from a group privacy perspective.
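One plausible formalization, following the standard differential privacy inequality (the precise adjacency definition used by GroupPriv is given in the publications below), is the following: a randomized mechanism M satisfies εg-group differential privacy if, for all group-adjacent datasets D and D′ (datasets differing in the records of a single group) and all sets of outputs S,

\[
\Pr[M(D) \in S] \;\le\; e^{\varepsilon_g}\,\Pr[M(D') \in S].
\]

Intuitively, replacing an entire group's records changes the output distribution by at most a factor of e^{εg}, so an observer cannot confidently infer the group's aggregate contribution.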


Mechanisms

The approach consists of two parts:

  • The first component, namely DiffPar, hierarchically partitions and groups the nodes and edges of the given association graph into different levels of disclosure granularity in terms of group size, taking into account the sensitivity of the formed groups.
  • The second component, namely DiffAggre, performs bottom-up aggregation and noise injection to guarantee εg-group differential privacy in the published dataset.
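The bottom-up aggregation step can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes a specialization hierarchy given as a parent-to-children map, adds Laplace noise to true association counts at the leaf groups, and sums the already-noisy values upward so that every level's published count is derived from the noisy leaves. All function and parameter names (`laplace_noise`, `diff_aggre`, `epsilon_g`) are hypothetical.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse-CDF method.
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def diff_aggre(group, children, leaf_counts, epsilon_g):
    """Bottom-up noisy aggregation (sketch inspired by DiffAggre).

    group       -- name of the group to publish a count for
    children    -- dict: parent group -> list of child groups
    leaf_counts -- dict: leaf group -> true association count
    epsilon_g   -- group privacy budget
    """
    kids = children.get(group, [])
    if not kids:
        # Leaf group: perturb the true count with Laplace noise
        # calibrated to the privacy budget (sensitivity assumed 1).
        return leaf_counts[group] + laplace_noise(1.0 / epsilon_g)
    # Internal group: sum the already-noisy child counts, so no
    # additional budget is spent at coarser levels.
    return sum(diff_aggre(c, children, leaf_counts, epsilon_g)
               for c in kids)
```

A design note: because internal nodes reuse the noisy leaf values rather than querying the data again, the noise is injected only once per leaf, and coarser (larger) groups naturally receive more accurate counts as the leaf noise partially cancels in the sum.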



Publications

  • Balaji Palanisamy, Chao Li and Prashant Krishnamurthy, "Group Differential Privacy-preserving Disclosure of Multi-level Association Graphs", Proc. of 37th IEEE International Conference on Distributed Computing Systems (ICDCS 2017), Atlanta, USA. [poster] [PDF]
  • Balaji Palanisamy, Chao Li and Prashant Krishnamurthy, "Group Privacy-aware Disclosure of Association Graph Data", in submission.


Slides