Automated Decision-Making and Its Impact on News and Media

An interview with Brooke Myler and Professor Jean Burgess from the ARC Centre of Excellence for Automated Decision-Making and Society.

Brooke: Welcome to the Automated Decision-Making and Society podcast.

My name is Brooke Myler and today we are discussing what automated decision-making is, and how it affects our news and media. Joining me in this episode is Professor Jean Burgess from the ARC Centre of Excellence for Automated Decision-Making and Society. Jean is a professor of digital media in the Digital Media Research Centre here at the Queensland University of Technology. She is also the Associate Director of the ADM+S and is here to discuss automated decision-making in news and media.

Intro: The use of automated decision-making technologies promises to address challenges in many fields, but it also creates the potential for misuse. In news and media, automated and algorithmic decision-making systems are used intensively in search engines, personalised newsfeeds, content moderation systems, and programmatic advertising. This means that computers are increasingly making decisions that impact us and our society.

Some researchers have made it their mission to investigate the effect of automated decision-making technologies on news and media in society, and the ARC Centre of Excellence for Automated Decision-Making and Society does just that.

Brooke: Welcome to the podcast, Jean, and thank you for joining us today.

Prof. Jean Burgess: Thanks for having me. Great to be here.

Brooke: Firstly, Jean, lately we’ve been hearing the term automated decision-making. Can you explain what that term actually means?

Prof. Jean Burgess: We’ve heard a lot about automation in society more generally, you know, the idea of machines making more and more things and then maybe even taking our jobs because of that. But automated decision-making is when the machines start making decisions; in particular, when we use computational processes to displace normal, human organisational processes of decision-making. So, we rely on computers throughout everyday life to process data, to make predictions, to apply rules, to choose actions and to determine the outcomes of quite significant processes.

So, by automated decision-making we mean a range of technologies that are used in social systems to make decisions. Those technologies could be deep learning or machine learning, or blockchain and cryptocurrency. These are technologies which do promise to solve a lot of problems across sectors from health care and social services to transport and media. But they can also have societal problems and challenges associated with them.

Brooke: So how is this term different from artificial intelligence?

Prof. Jean Burgess: Yeah, that’s a great question. Artificial intelligence, I guess, we would see as actually a slightly narrower term. As I mentioned before, we’re hearing a lot about machine learning, about deep learning and about algorithms, and those terms are usually used more or less interchangeably. They are part of the mix of automated decision-making. But automated decision-making is, I guess, a broader concept that can include AI. A lot of the hype is about AI, and that’s certainly a part of what we are looking at.

Brooke: What is the ARC Centre of Excellence for Automated Decision-Making and Society, and what does it do?

Prof. Jean Burgess: The ARC Centre of Excellence for Automated Decision-Making and Society, or the ADM+S centre for short, is funded by the Australian Government through the Australian Research Council for seven years. It brings together universities, industry, government and community organisations to investigate and support the development of more responsible, ethical and inclusive automated decision-making.

It’s a large centre spanning a number of universities and involving a number of different knowledge domains and disciplines, from the humanities and social sciences to the technological sciences, across Australia and around the world. We have a number of international research partners involved as well.

Brooke: What areas of automated decision-making does the centre cover?

Prof. Jean Burgess: So, automated decision-making is already being used in a lot of different areas of everyday life. In this centre we have chosen four particular areas where ADM, as we call it, is really playing a role and where important issues and challenges are emerging. We’ve organised these under four focus areas: news and media, which we are going to be talking about today; transport and mobilities; health; and social services.

Brooke: Of these focus areas, how are automated decision-making systems used in news and media?

Prof. Jean Burgess: Yeah, so automated decision-making sounds like a very big thing, and in fact, at the big end of the scale, automated decision-making processes can have big effects. Think about Centrelink’s robo-debt fiasco, for example. But there are lots of little decision-making processes that take place in software that govern and help curate our news and media environment and everyday life as well: from search engines, to the way that digital media platforms suggest or recommend content to us, to the way they moderate content and make some content invisible or remove it from their systems, and also to various processes of news gathering and news production. So journalists will encounter automated decision-making technologies in their everyday work as well.

Brooke: So Jean, what are some of the issues that could arise from the use of automated decision-making systems in the news and media?

Prof. Jean Burgess: In journalism, automated decision-making systems enable new forms of reporting that are computer assisted or data driven, sometimes talked about as data journalism. They also enable really sophisticated audience engagement that draws on metrics from social media data, for example. But the flip side of the same coin, I guess, is that these technologies also support the automated mining of personal information, and they enable the algorithmic amplification of disinformation and misinformation, or even of hate speech. And this poses substantial risks to individuals, to communities and to our democracy. There are also issues around unfair content moderation. So, for example, some platforms might ban nudity, making it impossible for a breastfeeding association to reach its audience, but allow subtle forms of white supremacy and racism.

Brooke: What are the researchers at the centre doing to address these issues?

Prof. Jean Burgess: The centre is looking at how ADM systems help or hinder users in finding balanced and diverse news and media content, and to what extent such systems foster civil discourse or spread partisan propaganda.

We have a number of projects that investigate the use of ADM in news and media along those lines. One project is going to look at how automation is used in the newsroom, from generating news to news distribution, and how it is affecting journalists and their everyday work practices on the ground. Another project is looking at how platforms operationalise ideas about safety in the way they use automated decision-making to keep their users safe, including on dating apps, for example. We have a couple of other projects using citizen science models and an exciting new method of gathering data from the public called ‘data donation’ to investigate the impact of ADM on the news and media experiences of real users.

So, for example, one of these projects is called the Australian Search Experience Project, and it investigates the extent to which recommendations by major search engines are personalised to address users’ interests. We work with one of our international partner organisations there, AlgorithmWatch, who are based in Germany, and working with them will provide the first really independent assessment of how search algorithms perform for Australian users. And building on that, we really hope to work directly with policy makers, with educators, and with news and media organisations themselves to help mitigate any negative effects that might result from that. We also have a Facebook ad tracker project, which will examine how advertising on Facebook targets specific user demographics.

Building on research that has already been conducted by the US initiative ProPublica, we’re examining who sees which ads on Facebook and investigating whether targeted advertising like this is used to exploit vulnerable populations or push problematic political messages. So, you know, this kind of project will really enable us to shine more light on the largely opaque and unregulated world of Facebook advertising, which is kind of a black box to ordinary users and to citizens.

Brooke: So, Jean, how can listeners get involved in this project?

Prof. Jean Burgess: As I’ve said, some of these projects really depend directly on data donations from the general public in order to develop as detailed and accurate a picture as possible of how ADM systems are affecting ordinary people in their everyday media use. We’ll ask participants to just install a simple browser plugin that will observe their experience with search engines and social media and contribute anonymous data to our research. So, this is a really exciting and powerful way that ordinary citizens can help hold platforms to account. These platforms have often actually tried to avoid such independent critical scrutiny from researchers and policy makers, so this is an opportunity for citizens to play a role in building far greater transparency around what these platforms are doing.

Brooke: Where can listeners find more information on these projects?

Prof. Jean Burgess: Yep, they can visit our website, admscentre.org.au, to read about the centre more generally, or about the people involved in these projects, as well as plenty of other fantastic projects across the centre.

Brooke: What does the centre plan to ultimately achieve with these projects?

Prof. Jean Burgess: The ADM+S centre really is aiming to enhance public understanding, to inform policy and practice, and to support the practical development of more responsible, ethical and inclusive automated decision-making, now and in the future. So, our basic fundamental research is a huge part of that, but so is really directly engaging the public and getting them involved in our work.

Brooke: It sounds like the use of automated decision-making technologies promises to address challenges in many fields, but at the same time has the potential for misuse. Professor Jean Burgess, thank you for helping us better understand what automated decision-making is, how it affects our news and media, and what the ARC Centre of Excellence for Automated Decision-Making and Society is doing to address potential issues in this field.

Prof. Jean Burgess: Thanks for having me.

Brooke: You’ve been listening to a podcast from the ARC Centre of Excellence for Automated Decision-Making and Society. For more information on the Centre, go to www.admscentre.org.au.