
Details of Grant 

EPSRC Reference: EP/M022358/1
Title: PrivInfer - Programming Languages for Differential Privacy: Conditioning and Inference
Principal Investigator: Gaboardi, Dr M
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Computing
Organisation: University of Dundee
Scheme: First Grant - Revised 2009
Starts: 01 August 2015
Ends: 31 December 2015
Value (£): 91,961
EPSRC Research Topic Classifications:
Fundamentals of Computing
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
Panel History:
Panel Date  | Panel Name                                 | Outcome
27 Jan 2015 | EPSRC ICT Prioritisation Panel - Jan 2015  | Announced
Summary on Grant Application Form
An enormous amount of individuals' data is collected every day. These data could potentially be very valuable for scientific and medical research or for business applications. Unfortunately, privacy concerns restrict the way this huge amount of information can be used and released. Several techniques have been proposed with the aim of making the data anonymous. These techniques, however, lose their effectiveness when attackers can exploit additional knowledge.

Differential privacy is a promising approach to the privacy-preserving release of data: it offers a strong, guaranteed bound on the increase in harm that a user incurs as a result of participating in a differentially private data analysis, even under worst-case assumptions.

A standard way to ensure differential privacy is by adding some statistical noise to the result of a data analysis. Differentially private mechanisms have been proposed for a wide range of interesting problems, such as statistical analysis, combinatorial optimization, machine learning, and distributed computation. Moreover, several programming language verification tools have been proposed with the goal of assisting a programmer in checking whether a given program is differentially private.
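As a minimal illustration of the noise-addition idea (not part of this project's tooling), the sketch below implements the classic Laplace mechanism: noise with scale sensitivity/epsilon is added to a query result, which is known to give epsilon-differential privacy for a query of the stated sensitivity. The dataset, query, and parameter values are invented for the example.

```python
import random

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Release true_answer with Laplace noise of scale sensitivity/epsilon.

    For a query whose output changes by at most `sensitivity` when one
    individual's record is added or removed, this gives
    epsilon-differential privacy.
    """
    scale = sensitivity / epsilon
    # The difference of two i.i.d. Exponential(1) variables is Laplace(0, 1).
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_answer + noise

# Example: a counting query has sensitivity 1 (adding or removing one
# person changes the count by at most 1), so Laplace(1/epsilon) noise
# suffices.
ages = [23, 35, 41, 29, 52]
true_count = sum(1 for a in ages if a > 30)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller values of epsilon give stronger privacy but noisier answers; the released `private_count` is unbiased around the true count.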

These tools have proved successful in checking differentially private programs that use standard mechanisms. However, they offer only limited support for reasoning about differential privacy when it is achieved using non-standard mechanisms. One limitation comes from the simplified probabilistic models that are built into those tools. In particular, these simplified models provide no support (or only very limited support) for reasoning about explicit conditional distributions and probabilistic inference. From the verification point of view, dealing with explicit conditional distributions is difficult because it requires finding a manageable representation, in the internal logic of the verification tool, of events and probability measures. Moreover, it requires a set of primitives to handle them efficiently.
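To make concrete what "explicit conditional distributions" means here, the following sketch (an illustration only, with an invented coin example, not the project's internal representation) computes a conditional distribution by enumeration: a finite distribution is a map from outcomes to probabilities, and conditioning on an event filters the outcomes and renormalises.

```python
from fractions import Fraction

def condition(dist, event):
    """Condition a finite distribution {outcome: probability} on an event
    (a predicate over outcomes): keep consistent outcomes, renormalise."""
    kept = {x: p for x, p in dist.items() if event(x)}
    total = sum(kept.values())
    if total == 0:
        raise ValueError("conditioning on a zero-probability event")
    return {x: p / total for x, p in kept.items()}

# Toy inference: prior over whether a coin is fair or biased, joint with
# the result of one flip, using exact probabilities.
prior = {
    ("fair",   "heads"): Fraction(1, 2) * Fraction(1, 2),
    ("fair",   "tails"): Fraction(1, 2) * Fraction(1, 2),
    ("biased", "heads"): Fraction(1, 2) * Fraction(3, 4),
    ("biased", "tails"): Fraction(1, 2) * Fraction(1, 4),
}
# Observing heads yields the posterior over hypotheses.
posterior = condition(prior, lambda outcome: outcome[1] == "heads")
# posterior[("biased", "heads")] == Fraction(3, 5)
```

Enumeration is only tractable for small finite spaces; the verification challenge the paragraph describes is precisely that real programs manipulate events and measures that have no such direct finite representation.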

In this project we aim to overcome these limitations by extending the scope of verification tools for differential privacy to support explicit reasoning about conditional distributions and probabilistic inference. Such support is crucial for reasoning about machine learning algorithms, which are essential tools for achieving efficient and accurate data analysis over massive collections of data. The goal of the project is thus to provide novel programming language technology for enhancing privacy-preserving data analysis based on machine learning.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
Impacts
Sectors submitted by the Researcher
Project URL:  
Further Information:  
Organisation Website: http://www.dundee.ac.uk