EPSRC Reference: |
EP/W005573/1 |
Title: |
Using Robust Graph Clustering to Detect Fake News |
Principal Investigator: |
Mallmann-Trenn, Dr F |
Other Investigators: |
|
Researcher Co-Investigators: |
|
Project Partners: |
|
Department: |
Informatics |
Organisation: |
King's College London |
Scheme: |
New Investigator Award |
Starts: |
07 January 2022 |
Ends: |
26 March 2025 |
Value (£): |
295,373
|
EPSRC Research Topic Classifications: |
Fundamentals of Computing |
|
|
EPSRC Industrial Sector Classifications: |
No relevance to Underpinning Sectors |
|
|
Related Grants: |
|
Panel History: |
Panel Date | Panel Name | Outcome |
13 Sep 2021 | EPSRC ICT Prioritisation Panel September 2021 | Announced |
|
Summary on Grant Application Form |
Misinformation and fake news threaten society on numerous levels, from inciting violence to promoting racism. The modern era and the rise of social networks have contributed to the rapid spread of misinformation. This is partly because stopping fake news is a delicate matter: deciding whether a piece of information is fake news often requires human intervention, which does not scale to large (social) networks. It therefore seems necessary to rely on algorithms to make decisions, or at least to assist in the decision process.
The goal of this project is to develop an algorithmic framework to help prevent fake news from spreading. We aim to use recent advances in hierarchical graph clustering to achieve this. To see why this is promising, consider one of our two applications: Wikipedia. Wikipedia relies on users worldwide to edit the content of articles in order to build an encyclopaedia that contains information on all branches of knowledge. It is inevitable that some of the edits are factually incorrect --- intentionally or unintentionally. This occurs in particular when the articles are contentious (e.g., politicians, vaccination, etc.).
The result is often that so-called 'edit wars' break out and users change contested information over and over. In the process, Wikipedia can be used as a weapon of misinformation and propaganda. The main tool Wikipedia admins use to prevent this is restricting editing to a limited range of users.
Our goal is to predict which articles should be restricted before edit-wars take place in order to avoid the spread of misinformation. To achieve this, we propose to use hierarchical graph clustering algorithms.
Framing the problem as a hierarchical graph clustering problem is natural: the applications we will focus on, Twitter and Wikipedia, are both graphs. In the case of Wikipedia, the nodes of this graph are the articles, and there is a directed edge from one article to another if the first article links to the second. The hierarchical structure stems from the categories an article belongs to. For example, the articles on Barack Obama and Donald Trump are both restricted. Both belong to the category "21st-century Presidents of the United States", which in turn is part of "Presidents of the United States". It turns out that all articles concerning presidents are restricted, illustrating the influence of the underlying hierarchy.
The project consists of two parts. In the first part, we aim to analyse graph clustering algorithms in more general settings with the aim of finding provable guarantees and limitations of practically relevant algorithms such as the Louvain algorithm. In the second part, we aim to apply these results to finding misinformation and fake news.
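To make the clustering step concrete, the sketch below runs label propagation, a simpler community-detection heuristic than the Louvain algorithm named above, on a toy link graph. The article names and edges are invented for illustration only and are not project data.

```python
from collections import Counter

# Toy undirected graph: two densely linked article groups joined by one bridge edge.
# Node names and edges are hypothetical.
edges = [
    ("Obama", "Trump"), ("Obama", "Biden"), ("Trump", "Biden"),      # presidents
    ("Paris", "London"), ("Paris", "Berlin"), ("London", "Berlin"),  # cities
    ("Biden", "Paris"),                                              # bridge edge
]

adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def label_propagation(adj, rounds=10):
    """Repeatedly give each node the label held by most of its neighbours."""
    labels = {v: v for v in adj}       # start: every node is its own community
    for _ in range(rounds):
        changed = False
        for v in sorted(adj):          # fixed order keeps the run deterministic
            counts = Counter(labels[u] for u in adj[v])
            # Break ties by the lexicographically larger label.
            best = max(counts, key=lambda l: (counts[l], l))
            if labels[v] != best:
                labels[v], changed = best, True
        if not changed:
            break
    return labels

labels = label_propagation(adj)
communities = {}
for v, l in labels.items():
    communities.setdefault(l, set()).add(v)
print(sorted(map(sorted, communities.values())))
# → [['Berlin', 'London', 'Paris'], ['Biden', 'Obama', 'Trump']]
```

On this toy graph, the two densely linked groups are recovered as separate communities despite the bridge edge; Louvain-style methods pursue the same goal by greedily optimising modularity instead.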
|
Key Findings |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Potential use in non-academic contexts |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Impacts |
Description |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk |
Summary |
|
Date Materialised |
|
|
Sectors submitted by the Researcher |
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
|
Project URL: |
|
Further Information: |
|
Organisation Website: |
|