- A Charles Sturt University PhD student’s research has created a valuable tool in the fight against social media misinformation
- The newly developed MAPX prototype offers a robust and explainable approach to false information detection by effectively combining content and context features and adapting to the quality of information
- Extensive experiments on benchmarked ‘fake news’ datasets demonstrate that MAPX consistently outperforms state-of-the-art models
A framework developed by a Charles Sturt University PhD research student and colleagues could significantly assist the detection of false information on social media, making it a valuable tool in the fight against misinformation.
PhD student Ms Sarah Condran (pictured above, right) said the newly developed ‘Model-agnostic Aggregation Prediction eXplanation’ (MAPX) tool is a prototype from her PhD work at Charles Sturt University.
“The automated detection of false information has become a fundamental task in combating the spread of ‘fake news’ on online social media networks (OSMN), as it reduces the need for manual discernment by individuals,” Ms Condran said.
“Existing models often use content or context features in isolation, which limits their effectiveness: the dynamic and temporal nature of social media content is often overlooked, and the quality of document features and their impact on prediction trustworthiness is not adequately considered.”
Ms Condran explained that MAPX supports the integration of multiple false information detection models through Dynamic Adaptive Prediction Aggregation (DAPA), a novel algorithm that dynamically combines base models according to their reliability and the quality of document features. This is used in combination with Hierarchical Tiered eXplanation (HTX), which provides granular explanations to improve the trustworthiness and explainability of predictions.
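The published DAPA algorithm is not reproduced here, but its core idea, weighting each base detector's prediction by its estimated reliability and the quality of the features available to it, can be illustrated with a minimal sketch. The class names, weighting scheme and example scores below are assumptions for illustration only, not the MAPX implementation.

```python
# Illustrative sketch only: a simple dynamic weighted aggregation of base-model
# predictions, in the spirit of DAPA as described above. Names, weights and
# quality scores are assumptions, not the published MAPX algorithm.
from dataclasses import dataclass


@dataclass
class BasePrediction:
    prob_false: float       # base model's probability that the post is false
    reliability: float      # estimated reliability of this base model (0..1)
    feature_quality: float  # quality/completeness of the features it used (0..1)


def aggregate(predictions: list[BasePrediction]) -> float:
    """Combine base-model outputs, down-weighting unreliable models and
    models whose input features were of poor quality."""
    weights = [p.reliability * p.feature_quality for p in predictions]
    total = sum(weights)
    if total == 0:
        # No trustworthy signal: fall back to a plain unweighted average.
        return sum(p.prob_false for p in predictions) / len(predictions)
    return sum(w * p.prob_false for w, p in zip(weights, predictions)) / total


# Example: a content-based model working from degraded features contributes
# less to the final score than a context-based model with complete features.
score = aggregate([
    BasePrediction(prob_false=0.9, reliability=0.8, feature_quality=0.3),
    BasePrediction(prob_false=0.4, reliability=0.7, feature_quality=0.9),
])
print(f"Aggregated probability of false information: {score:.2f}")
```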
Extensive experiments on benchmarked ‘fake news’ datasets demonstrate that MAPX consistently outperforms state-of-the-art models and maintains high performance even when the quality of prediction features deteriorates, unlike other models whose performance drops significantly.
“Thus, MAPX offers a robust and explainable approach to false information detection by effectively combining content and context features and adapting to the quality of information,” she said.
PhD supervisor and co-author Dr Michael Bewong in the Charles Sturt School of Computing, Mathematics and Engineering said this research and framework take on greater significance with the newly proposed Communications Legislation Amendment (Combating Misinformation and Disinformation) Bill 2024 in Australia.
“This Bill seeks to provide the Australian Communications and Media Authority (ACMA) with powers to hold digital platforms accountable while preserving the fundamental right to freedom of speech,” Dr Bewong said.
“This will require social media platforms to rethink how ‘false news’ is detected and mitigated, and it unequivocally implies technological solutions that are effective, efficient and transparent; this is where MAPX fits in.”
Ms Condran said she was excited that their paper, ‘MAPX: An explainable model-agnostic framework for the detection of false information on social media networks’, has been accepted for presentation at the 25th International Web Information Systems Engineering Conference (WISE2024), held from Monday 2 to Thursday 5 December 2024 in Doha, Qatar. Ms Condran will deliver the presentation in Doha.
She thanked her co-authors Dr Michael Bewong, Dr Selasi Kwashie, Professor Zahid Islam, Associate Professor Irfan Altas, and Mr Joshua Condran for their contributions to this work. Dr Bewong is also a Research Acceleration Fellow, and Dr Kwashie is a Senior Lecturer and Teaching Lead in Executive Education, both in the Charles Sturt Artificial Intelligence and Cyber Futures Institute.