EPSRC Reference: |
EP/V026607/1 |
Title: |
UKRI Trustworthy Autonomous Systems Node in Governance and Regulation |
Principal Investigator: |
Ramamoorthy, Professor S |
Other Investigators: |
Marsden, Professor C |
Crabtree, Professor A |
Bundy, Professor A |
Lascarides, Professor A |
Williams, Professor RA |
Schafer, Professor B |
Chockler, Dr H |
Urquhart, Dr LD |
Vallor, Professor S |
Rajan, Dr A |
Li, Dr P |
Anderson, Professor S |
Ireland, Professor A |
Jackson, Dr PB |
Miller, Professor AA |
Innes, Mr C |
Belle, Dr V |
|
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Sch of Informatics |
Organisation: |
University of Edinburgh |
Scheme: |
Standard Research |
Starts: |
01 November 2020 |
Ends: |
31 July 2024 |
Value (£): |
2,671,812 |
|
EPSRC Research Topic Classifications: |
Artificial Intelligence |
Common Law inc. Commercial Law |
Design Engineering |
Ethics |
Robotics & Autonomy |
|
|
EPSRC Industrial Sector Classifications: |
|
Related Grants: |
|
Panel History: |
|
Summary on Grant Application Form |
How can we trust autonomous computer-based systems? Autonomous means "independent and having the power to make your own decisions". This proposal tackles the issue of trusting autonomous systems (AS) by building experience of regulatory structure and practice, notions of cause, responsibility and liability, and tools that create evidence of trustworthiness into modern development practice. Modern development practice includes continuous integration and continuous delivery, which allow operational experience to be gathered continuously, amplified through the use of simulators, and folded into development decisions. This, combined with notions of anticipatory regulation and incremental trust building, forms the basis for new practice in the development of autonomous systems, in which regulation, systems, and evidence of dependable behaviour co-evolve incrementally to support our trust in those systems.
This proposal brings together a multi-disciplinary consortium from Edinburgh, Heriot-Watt, Glasgow, KCL, Nottingham and Sussex, combining computer science and AI specialists, legal scholars and AI ethicists with experts in science and technology studies and design ethnography. Together, we present a novel software engineering and governance methodology that includes:
1) New frameworks that help bridge the gap between legal and ethical principles (including emerging questions around privacy, fairness, accountability and transparency) and an autonomous systems design process that entails rapid iterations driven by emerging technologies (including, e.g., machine-learning-in-the-loop decision-making systems).
2) New tools for an ecosystem of regulators, developers and trusted third parties that address not only functionality and correctness (the focus of many other Nodes) but also how systems fail, and how the evidence associated with failure can be managed to facilitate better governance.
3) An evidence base drawn from full-cycle case studies of taking AS through regulatory processes, as experienced by our partners, to inform policy discussion of reflexive regulation practices.
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
http://www.ed.ac.uk |