In Collaboration with Northeastern University London
We are experiencing a massive, dangerous “infodemic”, fuelled by social media. For example, myths and false remedies about coronavirus can cost lives. Our project aims to explain how ignorance mushrooms even in groups of ideally rational individuals, and how groups can combat it collectively. We ask two questions: how do group knowledge and belief (collectively, “attitudes”) form? And can “higher-order” information (e.g. knowing there is a misinformant in the group) improve individual judgement?
We are the first to explore group attitudes in relation to structural properties of social networks as well as sensitivity to higher-order evidence. Our basic model, stemming from economics, is that individuals learn from neighbours. We extend it in two ways. First, we build a multi-level, weighted group hierarchy, informed by algorithms for identifying influencers and authorities in social networks. Second, we model learning from evidence beyond that which is shared by direct neighbours.
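To give a flavour of the basic economic model of learning from neighbours, here is a minimal sketch in the style of DeGroot-type repeated averaging, in which each individual revises their belief towards a weighted average of their neighbours' beliefs. This is an illustration only, not the project's actual model: the network, the trust weights, and the update rule below are assumptions chosen for simplicity.

```python
# Illustrative sketch (assumed, not the project's model): DeGroot-style
# neighbour learning. Each agent's belief is a number in [0, 1], and each
# round it becomes a weighted average of neighbours' beliefs (self included).

def degroot_step(beliefs, weights):
    """One round of learning. weights[i][j] is the trust agent i
    places in agent j; each row of weights sums to 1."""
    n = len(beliefs)
    return [
        sum(weights[i][j] * beliefs[j] for j in range(n))
        for i in range(n)
    ]

# A tiny fully connected network of three agents (hypothetical weights).
weights = [
    [0.50, 0.25, 0.25],
    [0.25, 0.50, 0.25],
    [0.25, 0.25, 0.50],
]
beliefs = [1.0, 0.0, 0.0]  # only agent 0 initially believes the claim

for _ in range(50):
    beliefs = degroot_step(beliefs, weights)
# On a connected network like this one, repeated averaging drives the
# group to a consensus belief.
```

In this toy network the trust matrix is symmetric, so the consensus lands at the average of the initial beliefs; the project's extensions (multi-level weighted hierarchies, higher-order evidence) go well beyond this baseline.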
Our interdisciplinary project employs computer simulations of large, realistic social networks. A few pioneering philosophers now advocate computer simulations as a tool for philosophical investigation. But our proposed simulation workload is not trivially encoded in modern graph analytics engines and requires substantial computational resources to scale. In philosophy, our results will yield new evidence about the metaphysics of group attitudes and shed light on current debates on the use of higher-order evidence. In turn, our philosophical simulations may: provide insights into the deployment of existing economic models of information sharing; challenge assumptions about computational workloads on graphs; and ultimately inform company and government policy.
For more information, please go to the Polygraphs Webpage.