 # Details of Grant

EPSRC Reference: EP/N004221/1
Title: Algorithms that count: exploring the limits of tractability
Principal Investigator: Jerrum, Professor M
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Sch of Mathematical Sciences
Organisation: Queen Mary University of London
Scheme: Standard Research
Starts: 01 November 2015
Ends: 30 April 2019
Value (£): 361,972
EPSRC Research Topic Classifications:
- Fundamentals of Computing
- Logic & Combinatorics
- Statistics & Appl. Probability
EPSRC Industrial Sector Classifications:
 No relevance to Underpinning Sectors
Related Grants:
Panel History:

| Panel Date | Panel Name | Outcome |
| --- | --- | --- |
| 14 Apr 2015 | EPSRC ICT Prioritisation Panel - Apr 2015 | Announced |
## Summary on Grant Application Form
Computational Complexity is the name given to the rigorous study of the resources required to solve specified computational problems. These are often decision problems (does there exist a structure satisfying certain properties?) or optimisation problems (what is the optimal structure?), but in this project we concentrate on a third class, namely counting problems. The phrase "counting problem" here is widely interpreted; examples of computational problems in this category include: (i) "find the number of satisfying assignments to a CNF Boolean formula", (ii) "evaluate the partition function of the Ising model (an extensively studied model in statistical physics) with interactions specified by a weighted graph", and (iii) "find the volume of a convex body". Example (i) is a straightforward counting problem, (ii) asks for the evaluation of a weighted sum over spin configurations, and (iii) is a definite integral, which is the limiting case of a summation.
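To make examples (i) and (ii) concrete, here is a small brute-force sketch in Python (purely illustrative, not part of the proposed research): it counts the satisfying assignments of a CNF formula and evaluates the Ising partition function by summing over all 2^n configurations, which is exactly the exponential cost that motivates the search for better algorithms.

```python
import math
from itertools import product

def count_sat(cnf, n):
    """Example (i): count satisfying assignments of a CNF formula over
    n Boolean variables. A clause is a list of signed literals: 3 means
    x3, -3 means NOT x3. Brute force, so time grows like 2^n."""
    count = 0
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in cnf):
            count += 1
    return count

def ising_partition_function(edges, n, beta=1.0):
    """Example (ii): the Ising partition function
    Z = sum over spin configurations s in {-1,+1}^n of
        exp(beta * sum_{(i,j,w) in edges} w * s_i * s_j),
    again evaluated by exhaustive summation over all 2^n configurations."""
    Z = 0.0
    for spins in product([-1, 1], repeat=n):
        Z += math.exp(beta * sum(w * spins[i] * spins[j] for i, j, w in edges))
    return Z

# (x1 OR x2) AND (NOT x1 OR x3) has 4 satisfying assignments out of 8
print(count_sat([[1, 2], [-1, 3]], 3))            # -> 4
# a single unit-weight edge: Z = 2e^beta + 2e^(-beta)
print(ising_partition_function([(0, 1, 1.0)], 2))
```

Both functions are weighted sums over exponentially many configurations, which is why efficient exact evaluation is out of reach in general and approximation becomes the central question.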

Crudely put, there are two objectives in computational complexity, namely lower bounds and upper bounds. Establishing a lower bound amounts to proving that a certain amount of resource, say, time or space, is required to achieve a certain computational goal. Generally, lower bounds can only be established under some complexity-theoretic assumption, the most famous being P not equal to NP. In this project we concentrate on the more optimistic activity of establishing upper bounds, i.e., designing and analysing algorithms for a computational task that provably require only a certain amount of resource. Part of the rationale for the stress on upper bounds is that substantial progress has been made in recent years on lower bounds, and it is important now to see how far these can be matched from the other direction.

As the vast majority of counting problems are intractable when exact solutions are sought, it will be necessary to investigate various escape routes: efficient approximation algorithms with guaranteed error bounds, algorithms that are provably efficient on restricted classes of problem instances, and parameterised algorithms whose bad behaviour can be controlled by a parameter that is assumed to be small in problem instances of practical interest. Another strand of the proposed research is to narrow the gap between what is efficient in theory and what is efficient in practice. In the classical theory, an algorithm is deemed efficient if it runs in time polynomial in the instance size. This theoretical notion of efficiency often corresponds to tractability in practice, but the correspondence is less good in the context of counting problems, where Markov chain Monte Carlo is the most common solution technique.
## Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
## Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
## Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:
Further Information:
Organisation Website: