How Do Search Engines Decide What We See Online?

An interview with Brooke Myler and the following talent:

  • Professor Jean Burgess – Associate Director, ARC Centre of Excellence for Automated Decision-Making and Society
  • Dr Verity Trott – Lecturer in digital media research at Monash University
  • Matthias Spielkamp – Co-founder and executive director of AlgorithmWatch
  • Abdul Obeid – Data Engineer, ARC Centre of Excellence for Automated Decision-Making and Society

Brooke: Welcome to the Automated Decision-Making and Society podcast. My name is Brooke Myler, and today we’ll be discussing the impact of search algorithms on our search results and the information we are exposed to online. We’ll be talking with various guests involved in the Australian Search Experience Project being conducted by the ARC Centre of Excellence for Automated Decision-Making and Society.

Intro: Have you ever wondered how search engines decide what search results you see? Or have you ever noticed that other people see different results from you?

Search engine algorithms have a large impact on our online lives. Every topic you search online is considered by the search engine, and the algorithm chooses results for you to see. This can cause issues in news and media, as users are only receiving tailored content that might leave out important information, bias opinions, or even spread mis- and disinformation.

The Australian Search Experience, engineered by the ARC Centre of Excellence for Automated Decision-Making and Society, aims to determine whether search engines are creating ‘filter bubbles’ by inviting individuals to become citizen scientists and contribute their search result data to this project. The project will be carried out via a plugin that records search result data on your computer. The plugin is secure and does not collect any personal data. With billions of searches conducted online per day, the plugin is necessary to uncover the impact algorithms have on users.

To investigate what the public really knows about search algorithms, we asked students at the Queensland University of Technology what they thought.

Brooke: What do you think of search engine personalisation?

Student 1: I think it’s pretty cool how it can know what type of things you like so it can give you advertisements of the product you like. But it’s also creepy with how much it knows.

Student 2: I don’t really know what it is to be honest, but I feel like it might be a good thing maybe?

Student 3: It’s a bit scary. It’s a bit like is my phone actually listening to me? I don’t understand the algorithms and what goes on behind it. It’s just a little bit off-putting.

Brooke: How much do you think a search engine knows about you from what you do online?

Student 1: What kind of shows I like and my social media profiles.


Student 2: Definitely far too much, but I guess it can be useful sometimes.

Student 3: Oh, my goodness, they know that I’m obsessed with Kristen Stewart and raw vegan treats. From what I gather from what I get on my search engine and on Instagram, but is that just the frequency of what you search? Cause I assume so, or what you’ve recently searched?

Brooke: There is a lot of speculation about the impact that search engines have on the information we encounter. But we know very little about how they order and display information. We need a way to independently assess the information search engines recommend, because they are so central to our daily lives. This is where The Australian Search Experience can help.

The Australian Search Experience Project studies the personalisation of search results for critical news and information, across key search, news and video platforms.

Brooke: To begin, I’m here with Professor Jean Burgess, the Associate Director of the ARC Centre of Excellence for Automated Decision-Making and Society, or ADM+S. Jean, what do we know about search algorithms and how do they influence our search results?

Jean: With a huge amount of information available on the web, search engines quite some time ago introduced various ways of ranking the content that comes up for a particular search, sorting through billions of web pages in an index of possible results to find the most relevant and useful ones. But of course, these algorithmic systems also need to serve the search engines’ business interests, various policy considerations and so on. So, there is a really complicated recipe that goes into presenting the results of any search you might enter.

So, we have to understand search engines as being made up of a complicated series or combination of algorithms. The algorithms take into account various elements or factors, including your search terms, the relevance of particular pages to those search terms, the relative popularity of those pages with other users and so on; but as well as that, they take into account personalisation elements that are specific to you as an individual user, or that are specific to, say, your location in the world.

Brooke: In order to study algorithms, researchers need search data. The best way to get accurate results is from people who use the internet, which is almost everyone. Researchers are encouraging people to ‘donate their data’ to assist in algorithm studies. However, don’t be alarmed: ‘donating data’ is simply a term researchers use.

Joining me to discuss the importance of data donation is Doctor Verity Trott. Verity is a lecturer in digital media research in the School of Media, Film and Journalism at Monash University.

Thank you for joining me today Verity.

Verity: No worries! It’s a pleasure to be here.

Brooke: So, Verity, can you tell us about the sort of research you have been doing with data donation?

Verity: One of the projects that I’ve been part of has been trying to understand dark and targeted patterns in advertising on social media. Dark ads are only visible to those whom they are targeted toward, so I can’t see the ads you receive and you can’t see the ads I receive, making it tricky for us to identify patterns. To study this, we’ve been asking users to essentially ‘donate’ their ad data. That is, to install a plugin that collects the ads they uniquely see as they scroll through their social media. Through this process we will be able to see the end result of how ads are targeted, which is influenced by several different factors, and begin to make visible this very obscured type of media content.

Brooke: So, why do you use data donation methods for this research?

Verity: So, platforms themselves aren’t super transparent about the algorithms they use and how they use our data. And if we were to set up something like artificial user accounts and search browsers, and use those, we would end up with pretty artificial results. So, the only way we can really understand what is happening is to invite users to join the project and donate the data surrounding what they see. To truly understand the user’s experience, we need to connect with users, and this is why we use the data donation method for several different research projects.

Brooke: Verity, the term data donation sounds like people are giving you their data, is this what is happening?

Verity: Not really, the term can be a bit misleading. We’re not actually collecting personal data about users, rather we’re asking users to install a plugin which then collects or records the targeted content that they might see from certain platforms as they are browsing. So, we really just see the content that they are seeing rather than collecting their own personal data and information.

Brooke: In a similar project overseas, AlgorithmWatch, a partner of the ARC Centre of Excellence for Automated Decision-Making and Society, conducted a data donation project to gather search results surrounding an election. Here to talk about the plugin is Matthias Spielkamp, the co-founder and executive director of AlgorithmWatch.

So, Matthias, what is AlgorithmWatch and what are your aims?

Matthias: AlgorithmWatch is an evidence-based civil society organisation. That means that we work with some of the world’s best academics, researchers and scientists to better understand what we call automated decision-making systems, meaning algorithmic systems we delegate decisions to. Examples are systems that decide whether someone will receive welfare support or a loan, but also those that decide what content we see on the internet.

Our aim with that is two-fold. We want people to better understand how extremely relevant these systems have become and what impact they have on individual freedom, but also on democratic society as a whole, and then we work to control them in a democratic fashion. There need to be clear rules about where these systems can be used and where they should be forbidden, but also about what transparency and accountability measures need to be in place when they are used.

Brooke: Why did the team at AlgorithmWatch use a data donation model for your research?

Matthias: For many of these systems there are no clear rules in place yet to audit them. For example, by experts getting access to the algorithm and the data to assess what they are doing and whether they are doing a good job. So, we decided to ask users to donate their data to us with the help of a software program they would download onto their computer. That allows us to see search requests to Google and collect the results that were returned with a focus on the German parliamentary elections.

Brooke: So, what was the response?

Matthias: The response was incredible. Almost 4,000 users downloaded the plugin and, in the end, they donated around 3 million data sets for us to analyse.

Brooke: What search terms did you use?

Matthias: The search terms we used were names of politicians and political parties, which turned out to be a little bit of a problem because we should have used some more politically controversial terms, for example, relating to migration or the German pension system. Also, because of technical restrictions we were not able to adjust search terms dynamically to the development of the political discussion, and legal restrictions kept us from collecting a lot of demographic data.

Brooke: Matthias what did your research find?

Matthias: So, the level of personalisation of search results we found was low, but it was an experiment and we learned a lot from it. Most importantly, it was successful as a campaign. People are willing to participate and donate data. So, we continued similar experiments and are now even building a platform we can offer to other researchers and civil society organisations to run their own experiments on, without having to create the technical infrastructure from scratch every time.

Brooke: The technical infrastructure developed by AlgorithmWatch is now being used here in Australia for the Australian Search Experience. We talk to Data Engineer Abdul Obeid, who has been building on this research to develop the search plugin.

Abdul: Well, to answer your question, when the research initiative was originally undertaken, it was undertaken in Germany, as you mention, by a partner organisation of the ADM+S, AlgorithmWatch. The Australian Search Experience builds on the research undertaken by AlgorithmWatch, which looked at static search terms relating directly to their election; our project, on the other hand, uses a broader set of search terms relating to generic topics and current events. Our plugin will also operate across a broader set of platforms, including common search engines relating to news and video.

Brooke: So, Abdul, can you tell us more about the plugin you’ve developed?

Abdul: Well, firstly, the plugin does not gather any information about you. It de-identifies the accumulated data and only gathers search experience data. The plugin periodically runs searches on a number of leading search engines using configurable keywords and dynamic searches, then reports the results to our project. All of this happens in the background with minimal disruption to participating users. So, with the help of our citizen scientists, as I like to call them, this approach enables us to gain valuable and otherwise unavailable insights into how search engines are shaping their search results to suit Australian users’ interests.
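[Editor’s note: the workflow Abdul describes, periodic background searches with configurable keywords, de-identified before reporting, can be sketched roughly as below. This is an illustrative Python sketch only, not the actual plugin code; the keyword list, function names and placeholder URLs are invented for the example.]

```python
import hashlib
import time

# Hypothetical keyword list; the real plugin fetches its configurable
# keywords from the project server and updates them over time.
KEYWORDS = ["covid vaccine", "federal election", "climate change"]

def de_identify(participant_id: str) -> str:
    # One-way hash, so reports cannot be traced back to a participant.
    return hashlib.sha256(participant_id.encode()).hexdigest()[:16]

def run_search(keyword: str) -> list:
    # Placeholder: the real plugin queries live search engines and
    # records the ranked result URLs from the results page.
    return ["https://example.com/%s/%d" % (keyword.replace(" ", "-"), rank)
            for rank in range(1, 4)]

def collect_report(participant_id: str) -> dict:
    # Run periodically in the background: search each configured keyword
    # and bundle the rankings with an anonymous identifier only.
    return {
        "participant": de_identify(participant_id),
        "timestamp": int(time.time()),
        "results": {kw: run_search(kw) for kw in KEYWORDS},
    }
```

The key design point is that the report contains only the hashed identifier and the rankings the participant saw, never the participant's own browsing history.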

Brooke: You mentioned the plugin runs searches using configurable keywords and dynamic searches, what does this mean?

Abdul: Well, this means that over time the searches are going to change with respect to the research project itself. A dynamic search is a search that is updated based on a topic that’s given in real time; these then allow us to track current events with the plugin.

Brooke: So, what search terms will you be using?

Abdul: Well, this depends, and I would base it on the current events that are being most searched for in search engines. So, for example, COVID is a much talked about issue at the moment, and something that people are regularly searching in their browsers. This is something that, ideally, we’d want to acknowledge within our research program and study.

Brooke: So, what goes into developing a plugin like this?

Abdul: The plugin itself attempts to model the search experience, and this is really interesting because ideally what we want to do is understand what goes on behind the scenes of a search engine when you are attempting to search the web.

Brooke: So now that the plugin has been developed, how can listeners get involved and what does it actually do on our computers?

Abdul: Well, if you are interested in participating as a citizen scientist, you can install the plugin from your browser’s extension store. Instructions and links are available on our webpage. The plugin periodically searches for search rankings. These rankings may update depending on how the studies progress over time. We monitor that closely, and your data is anonymised. The reason we do this is because we are more interested in seeing how a search engine personalises the searches to the user.

Brooke: Abdul, what exactly are these search rankings?

Abdul: Search rankings are the main data that we grab from the plugin. For example, I don’t know anything about you or your demographic; surprisingly, your demographic doesn’t actually affect the search rankings at all. We are not using personal information to inform the research, but if I gave you a term such as ‘conspiracy theories’, the search rankings would be different on your computer compared to someone else’s. So, what we are trying to understand is whether there is a differentiation and, if so, what it means.
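[Editor’s note: one simple way to quantify the “differentiation” Abdul mentions is to measure how much two users’ result lists for the same query disagree. The sketch below is an illustrative Python example, not the project’s actual analysis code; it uses a basic set-overlap measure rather than whatever metric the researchers apply.]

```python
def rank_difference(results_a, results_b):
    # Fraction of URLs appearing in only one user's top-N result list:
    # 0.0 means the two users saw identical results, 1.0 means the
    # two lists share no URLs at all.
    a, b = set(results_a), set(results_b)
    union = a | b
    if not union:
        return 0.0
    return len(union - (a & b)) / len(union)
```

Applied across many participants and queries, a score consistently near 0.0 would suggest little personalisation, while higher scores flag queries where users are being shown different information.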

Brooke: Will users know when the plugin is running a search on their computer?

Abdul: Many data analysis programs aren’t visible on your computer, but we purposely designed The Australian Search Experience plugin so it can be seen on your desktop. We wanted the plugin to be completely transparent and for users to see everything that it is doing. The plugin was also designed to use the absolute minimum processing power needed, and it operates within the Chrome web browser’s extension policies.

Brooke: So, how do users know that the plugin is private?

Abdul: The plugin does not gather any personal information about you. It is restricted in what it collects and is designed to comply with ethics laws. We’ve gone to a great extent to ensure that this doesn’t impact your web experience.

Brooke: So, will the researchers see people’s search history?

Abdul: Fortunately, no, we will not see your search history. The search results are anonymous but will allow us to determine which factors might influence the weightings of search results.

Brooke: So, what happens to the results when everything is compiled from the plugin?

Abdul: Well, after the project we’re hoping to make the data set publicly accessible. We are all for open data sets, and this is very valuable information for the betterment of understanding social experiences.

Brooke: What was your favourite part of developing the plugin?

Abdul: Really being able to contribute to the improvement of search experiences here in Australia.

Brooke: Thank you for joining me today Abdul.

Abdul: Thank you for having me.

Brooke: So, Jean, how many participants are you looking for to join this project?

Jean: To understand the potential social impacts of search engine personalisation, we need at least 1,000, ideally 3,000, well, in fact, ideally everyone in Australia to join the project.

Brooke: How can listeners actually get involved in this project?

Jean: If people are interested in taking part in the project, they can visit our webpage for further info about it and instructions on how to install the browser plugin. The address is

Brooke: Jean thank you so much for joining us on the podcast today.

Jean: My pleasure, thanks for having me.

Brooke: So, it seems like online algorithms control a lot more than we originally thought. It is promising that, with the development of The Australian Search Experience plugin, users will be able to help investigate algorithmic personalisation, which is usually hidden and unknown.

Please visit our website if you would like to become a citizen scientist and install the plugin on your computer. Thank you everyone for joining me on the podcast today.

End: You’ve been listening to a podcast from the ARC Centre of Excellence for Automated Decision-Making and Society. For more information on the Centre go to