An investigation conducted by Human Rights Watch indicated that 89 per cent of the educational technology (EdTech) products it reviewed may have put children’s privacy at risk by tracking their online activity and sharing their data with advertisers.
Many commentators have already argued that online surveillance of minors is unacceptable, calling for tighter regulation to protect children’s privacy.
Professor Ganna Pogrebna, Executive Director of the Cybersecurity and Data Science Institute at Charles Sturt University, together with her colleagues in the Charles Sturt School of Computing, Mathematics and Engineering, Dr Rafiqul Islam and Dr Arash Mahboubi, outlines the concept of algorithmic decision manipulation and four main dangers confronting children online.
What is ‘Algorithmic Decision Manipulation’?
Algorithmic decision-making enters our lives through multiple channels. Broadly speaking, it is the automation we have become accustomed to in the era of Industry 4.0, and the ever-growing impact of artificial intelligence on various business processes.
These processes require us to make decisions faster. As a result, our brains do not always have time to analyse all the information relevant to our decisions.
Contemporary decision-making often involves close interaction between humans and algorithms.
Common examples include securing a loan, picking a movie, choosing a school for your children, or even reading this article – all depend heavily on algorithms within the technology you use, as algorithms often determine the order in which people receive different pieces of information.
As a result, in today’s fast-paced world, we tend to value speed and convenience, and often pay little attention to the information that contributes to our decision-making. We also rarely think about where this information comes from.
Dr Rafiqul Islam, Associate Professor in Computing at Charles Sturt University, explains that this is a problem because machines do not think as we do; despite all the discussion about ‘neural’ networks in machine learning, machines’ thought processes are quite different from ours.
Specifically, human neural networks develop differently from person to person, depending on age and circumstance. This is vastly different from machines’ ‘neural’ networks, which develop via linear patterns based on the data they are given to absorb – there is no emotion or empathy factored into machine decision-making.
‘Business’ thinking further exacerbates this basic difference between machine and human reasoning. Companies develop many algorithms for business purposes, and the symbiotic relationship between business and machine thinking influences much of day-to-day human decision-making.
How is this a bigger problem for our children?
While both children and adults experience these algorithmic influences, Dr Islam argues that children are particularly vulnerable because they are under tremendous pressure from study, peers, communication tools, and materialistic demands.
In addition, a growing body of cross-cultural and acculturation studies suggests that these pressures can significantly impact the normal protective and resilience factors involved in children’s decision-making.
For example, why does certain content go viral on some channels while other content goes unnoticed on platforms such as YouTube?
The quality of the information is not the sole deciding factor. Higher-quality videos will usually perform better, but the YouTube algorithms also favour other factors, such as posting frequency and engagement.
For example, a teenager who films one video a day in their bedroom, with likes and comments remaining consistent from video to video, will fare far better on YouTube than a professional make-up artist who posts one video per month to underwhelming or fluctuating engagement, as the sketch below illustrates.
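To make the intuition concrete, here is a toy scoring sketch in Python. It is not YouTube’s actual ranking algorithm, whose details are proprietary; every factor and weight below is invented purely to illustrate how posting frequency and consistent engagement can outweigh production quality.

```python
# Toy illustration of an engagement-weighted ranking score.
# NOT YouTube's actual algorithm: the factors and weights are
# invented to show how frequency and consistent engagement
# can dominate production quality in a recommender's eyes.
from dataclasses import dataclass

@dataclass
class Channel:
    name: str
    quality: float              # production quality, 0-1
    posts_per_month: int        # posting frequency
    avg_engagement: float       # likes/comments per view, 0-1
    engagement_variance: float  # consistency: lower is better, 0-1

def toy_ranking_score(c: Channel) -> float:
    """Hypothetical score: frequency and steady engagement dominate."""
    frequency_factor = min(c.posts_per_month / 30, 1.0)  # daily posting maxes out
    consistency_factor = 1.0 - c.engagement_variance
    # Invented weights: quality matters, but far less than the other signals.
    return (0.2 * c.quality
            + 0.4 * frequency_factor
            + 0.4 * c.avg_engagement * consistency_factor)

teenager = Channel("bedroom vlogger", quality=0.4, posts_per_month=30,
                   avg_engagement=0.6, engagement_variance=0.1)
professional = Channel("make-up artist", quality=0.9, posts_per_month=1,
                       avg_engagement=0.5, engagement_variance=0.6)

for ch in (teenager, professional):
    print(f"{ch.name}: {toy_ranking_score(ch):.2f}")
# bedroom vlogger: 0.70, make-up artist: 0.27 -- despite far lower quality
```

Under these assumed weights, the daily vlogger scores roughly 0.70 against the professional’s 0.27, even though the professional’s production quality is more than double.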
“The issue, however, goes deeper than that,” Dr Arash Mahboubi said.
He argues that much of algorithmic decision-making is opaque to the human eye and not easy to explain to a person. This leads to people being influenced, often without noticing it.
Specifically, we often gradually switch from watching one form of content, such as educational National Geographic films, to looking at photos of cute cats, or from watching detailed statistical analyses of electoral campaigns to watching a promotional video for one political party.
Adults may have the ability to recognise the influences they are subjected to. However, children are often unaware of the substantial impact machines have on their day-to-day behaviour.
In this regard, Professor Pogrebna, Dr Islam, and Dr Mahboubi identify four main dangers associated with tracking children’s data through EdTech:
1. Lack of informed consent and ‘no privacy’ defaults
Collecting and sharing children’s data raises the issue of informed consent: children do not have the experience to fully appreciate the risks of their data being shared with advertisers, nor to make informed decisions about their privacy.
Furthermore, children are getting used to ‘no privacy’ defaults because their data is shared by default from an early age. By the time they are adults, they may value their privacy far less than older generations do, simply by accepting the default. This is hardly the optimal way forward for Australian society, as we want our children to be able to defend their rights and civil liberties, of which privacy is an important component. The sketch below shows how such a default works in practice.
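A minimal sketch in Python, assuming an entirely invented settings model (no real EdTech product or API is implied), of how a ‘no privacy’ default silently decides the outcome for most accounts:

```python
# Minimal sketch of a 'no privacy' default, using an invented
# settings model; no real EdTech product or API is implied.
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # 'No privacy' defaults: tracking and sharing are ON
    # unless someone actively opts out.
    track_activity: bool = True
    share_with_advertisers: bool = True

@dataclass
class StudentAccount:
    name: str
    settings: PrivacySettings = field(default_factory=PrivacySettings)

# Most accounts are created without anyone reviewing the settings,
# so the default silently decides what happens to the child's data.
account = StudentAccount("student_001")
print(account.settings.share_with_advertisers)  # True: opted in by default

# A privacy-by-default design would simply flip both defaults to False,
# requiring explicit, informed opt-in before any data is shared.
```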
2. Lack of choice ‘not to engage’ with technology
The ‘business’ argument is that the customer (a human) always has the freedom to ‘switch off’, ‘push the turn-off button’ or ‘delete the app’ and simply not engage with the service. But is this really so? Especially for children?
Unfortunately, the argument about ‘well-informed’ consumers making ‘conscious’ and ‘independent’ decisions is no longer valid.
Even without the influence of technology, our decisions are shaped by context, by decision architecture (a feature of the decision environment) and by many behavioural biases.
Recently, the long list of these influences has extended to technology, which surrounds people to the extent that many develop technology addictions. For example, many of us fall asleep looking at our smartphones, and the first thing we do when we wake up is check our social media.
This problem is even more pronounced in children and young adults, whose lives are deeply connected to digital platforms such as TikTok, Instagram, and many others. We simply cannot ‘switch off’ anymore.
This is, of course, particularly true for EdTech as children do not have a choice – they cannot say “no” to the tools their schools impose on them. Through the EdTech tools, their online activity is tracked and passed on to third parties without them even realising that this is happening.
As a result, many children and young adults become subjected to unsolicited advertisements, encouraging them not only to buy things they do not need but also to give away more of their valuable data to third parties with unknown privacy protection policies.
Consequently, many children and young adults make these purchases and share further personal information without thinking much about it.
3. Lack of ability to stop and reflect
In this information age, we have somehow lost the ability to stop and reflect on the information we receive from the various technology platforms and on how this information impacts our decisions.
As a result, we often fall into a recursive pattern of clicking on whatever the algorithms offer us, accepting algorithmic manipulation without even trying to understand how the algorithms work.
The extreme outcome of this vicious circle was once captured in Pixar’s animated film WALL-E, in which humankind has turned into a mass of identical, ‘typical’ utility-maximising organisms making ‘typical’ decisions – or, rather, not making any decisions at all.
These processes impact both adults and children alike, yet for children, the problem is far more severe.
4. Adverse cybersecurity implications – the privacy paradox
The privacy paradox refers to the observation that when people are asked about their attitude towards privacy, they tend to say that they deeply care about and value it. However, when faced with trading some of that privacy for a digital service, they sacrifice it without much reflection.
Professor Pogrebna said that Charles Sturt University has conducted multiple cross-national tests of how people perceive various cyber risks. “Downloading apps without reading terms of service documentation” is often cited by many respondents (from all over the world, including Australia) as something they do very often, and, most importantly, as something they are not very concerned about.
Concerningly, if technology already dominates the lives of adults to this extent, one would expect algorithmic manipulation of decision-making to be even more pronounced in children.
A silver lining: turning ‘Big Brother’ into ‘collaboration’
Academics at Charles Sturt University agree that tighter regulation of how children’s data is collected, processed, and used by EdTech is overdue.
Yet it is also clear that such regulation should not only concentrate on privacy but also address all four dangers of algorithmic manipulation of child decision-making.
Furthermore, the regulation should cover all child data, not just the data collected through EdTech.
Professor Pogrebna said, “There is much talk about regulation and governance in the technological domain, yet we should not rely on regulation alone.
“Each of us should ‘unlearn’ the habit of blindly trusting technology and train ourselves and our children to reflect on the various inputs that digital technology and algorithms feed into our decision-making process.
“Only through regaining this ability to stop and reflect will we ever be able to regain our independence as human decision-makers.”
ENDS