Colman Noctor: Algorithms shaping childhood byte-by-byte 

"Adults might be dismissive and think of themselves as ‘too old’ to be concerned about such things - but the metaverse is not for 40-somethings like you or me, but for our children."
Colman Noctor: Algorithms shaping childhood byte-by-byte 

A And Also Emotionally Choice Shaping Charged Architecture Public Can More Creating " Opinion "algorithms Polarised Extreme Or Amplify Views,

On the way to school last Wednesday, I had the radio on in the car, listening to the US election results. My sleepy-headed 12-year-old daughter said: “Oh no, he won. I was hoping Kamala would have gotten it.”

I had no idea she was keeping up with American political issues. She explained how she hoped Kamala Harris would win because she ‘stands up for women’ and ‘Trump does not’.

I asked where she had come across the information about the US election. “Online,” she said with a shrug. She was unable or unwilling to elaborate but added, “It’s been popping up everywhere”.

She then asked: “Dad, if everyone sees the information about Trump, why would they vote for him?”

Her simple but poignant question rests on an assumption worth correcting: not everyone sees the same information online.

Our exchange got me thinking about ‘media literacy’ and how we must design programmes for children and young people to help them understand how the internet works. 

It is not enough to tick a box and say we have covered online safety; it has to be woven into our educational curriculum and everyday conversations.

When I talk about algorithms, I am often met with a glazed-over expression and a dismissive response such as ‘Sure, I don’t care who sees my Google search history’. 

However, far more significant than a social media feed packed with advertisements for dog food are the algorithms shaping the choices we make and the worldview we form. 

With advancements in AI, algorithms are becoming much more sophisticated, while our corresponding knowledge of how they work falls further and further behind.

AI algorithms enable machines to learn, analyse data, and make decisions based on that knowledge. 

These sophisticated algorithms perform tasks typically requiring human intelligence, such as recognising patterns, understanding natural language, problem-solving, and decision-making.

They also make social media, streaming platforms, and advertising algorithms far more effective.

Adults might be dismissive and think of themselves as ‘too old’ to be concerned about such things. We may look at the use of avatars in the metaverse and think, ‘That’ll never take off’. 

Similar attitudes existed when texting was first introduced. But the metaverse is not for 40-somethings like you or me but for our children. 

They don’t see the use of avatars as unusual or odd as we do. Because of video games like Fortnite or Roblox, they have been using avatars all their lives, so the leap into that world will be more seamless. 

It’s as if the avatars and skins in video games are training grounds for our migration to the metaverse, which will further isolate children from the real world.

Personalised algorithms

Our online lives are underpinned by personalised algorithms that create ‘filter bubbles’ by repeatedly showing users content similar to what they’ve already engaged with, based on data about clicks, likes, and time spent on specific posts or pages. 

This limits exposure to a narrow range of information, subtly nudging users toward certain choices.
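To make that mechanism concrete, here is a minimal, hypothetical Python sketch. The topics, starting weights, and numbers are invented for illustration, and real recommender systems are vastly more complex; this only shows the shape of engagement-weighted ranking.

```python
import random
from collections import Counter

# Illustrative only: each click raises the weight of that topic, so the
# feed drifts toward what the user has already engaged with - a filter
# bubble in miniature.
TOPICS = ["sport", "music", "politics", "skincare", "gaming"]
engagement = Counter({topic: 1 for topic in TOPICS})  # start unbiased

def rank_feed(n=5):
    """Sample a feed, weighting topics by past engagement."""
    weights = [engagement[t] for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

def simulate_sessions(preferred="gaming", sessions=50):
    for _ in range(sessions):
        for item in rank_feed():
            if item == preferred:      # the user clicks what they like...
                engagement[item] += 1  # ...and the algorithm remembers

simulate_sessions()
print(rank_feed(10))  # after 50 sessions, the feed is mostly "gaming"
```

The feedback loop is the point: nothing in the code ‘wants’ to narrow the user’s world, yet narrowing is exactly what optimising for engagement produces.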

Most people know how algorithms analyse users’ online activity to create highly targeted ads, tailoring messages to appeal to individual interests, needs, or desires. 

However, when they think of ‘ads’, they usually associate them with products or consumer items. 

Online ‘advertising’ is much broader, including pointing users toward political ideologies and lifestyle choices.

Recent examples of how algorithms can influence young people’s ‘choice architecture’ include a rise in misogyny among teenage boys and the explosion of skincare routines among preteens. 

These trends are fuelled by algorithms that recognise this content as appealing and bombard children with videos promoting them.

Algorithms use personalised incentives to steer users toward specific products, services, beliefs, or actions. For instance, they might show more sales or limited-time offers to people who respond to urgency, directly influencing their purchasing decisions. 

Or an algorithm might nudge users toward ‘far right’ ideologies if it perceives them as disenfranchised or dissatisfied with the political status quo. 

It’s not explicit like traditional advertising trying to get you to a particular toy store at Christmas; it is far more implicit, which is why we need to help children recognise how algorithms work.

Selling harvested data

AI algorithms collect data from the many apps on our handheld devices. They also gather information about us beyond our social media or smartphone use. 

Every smart-enabled device collects information about our habits, and that information is harvested and sold. Whether it’s a virtual assistant or a smart doorbell, we are providing it with information that could be used to profile us.

Even a smart air fryer made the news recently when it was found to be gathering data about users.

If you habitually check your phone late at night, the algorithm will use that information to sell your details to companies offering pillows, sleep aids, or relaxation programmes.

The algorithm does not have a conscience, and there is no ethical ethos in the online world to limit or curtail content unsuitable for young users. Its sole objective is to show content it believes will increase user engagement.

Algorithms predict what a user will like based on their previous interactions. Over time, this reinforcement can create a ‘choice tunnel’, narrowing users’ preferences to a small subset of topics or viewpoints and shaping future choices.

Certain content can be strategically positioned in prominent places on a user’s screen, effectively priming them to consider these options more seriously.

For example, your teenager’s music streaming service might recommend a particular playlist based on prior listening, or a news feed might prioritise specific stories with high engagement, shaping what they listen to or read.

Algorithms can also amplify extreme or emotionally charged views, shaping public opinion and creating a more polarised choice architecture. 

Politics is not confined to elections; it is sociocultural, so content that steers young people toward misogynistic or anti-immigration opinions must also be deemed political.

Narrowing our views

Traditionally, decision-making included a certain degree of serendipity, where people encountered options they were not initially looking for. 

For example, you might be in a friend’s house and hear them play a song by an artist you like, buy the album, and become a big fan. You may never have been exposed to that artist without this chance encounter.

Algorithms influence social connections by recommending friends, groups, or events that align with users’ previous behaviours or demographic profiles. The result is echo chambers, where people are primarily exposed to like-minded individuals, shaping their social choices and reinforcing pre-existing beliefs. 

With more young people socialising online without parental supervision, the likelihood of developing biased views is much higher.

Children need to know that algorithms narrow the range of options and present choices that align with what the algorithm surmises their preferences to be.

They also need to be aware that this steering often happens without users realising the extent of the guidance they are receiving. The pervasive influence of algorithms impacts our autonomy and can subtly shape our personal beliefs and decisions.

Awareness of these influences and conscious efforts to seek diverse sources of information and perspectives can help young people maintain greater agency over their choices, which is surely pivotal to a healthy society.

I attempted to explain to my daughter in the car last week that politicians and political campaigns use algorithms to micro-target messages to individuals based on psychographic profiles.

Algorithms allow for highly tailored messaging, where different population segments receive different appeals based on their specific anxieties, interests, or behaviours. The algorithm steers voter preferences by making certain choices feel more personal or relevant.
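To illustrate the shape of that idea, here is a deliberately simplified, hypothetical Python sketch. The segments, scores, and slogans are invented, and real campaign tooling is proprietary and far more elaborate; this only shows how one candidate can be pitched differently to different psychographic profiles.

```python
# Hypothetical segments and slogans, for illustration only: the same
# candidate is marketed with whichever appeal is predicted to resonate
# with each user's inferred profile.
MESSAGES = {
    "economically_anxious": "Protect your family's income.",
    "security_focused": "Keep our communities safe.",
    "young_progressive": "Stand up for women's rights.",
}

def pick_message(profile: dict) -> str:
    """Choose the appeal matching the user's dominant inferred trait."""
    segment = max(profile, key=profile.get)
    return MESSAGES.get(segment, "Vote on polling day.")

# Two users see two entirely different campaigns for the same candidate.
alice = {"economically_anxious": 0.7, "security_focused": 0.2, "young_progressive": 0.1}
bob = {"economically_anxious": 0.1, "security_focused": 0.1, "young_progressive": 0.8}

print(pick_message(alice))  # Protect your family's income.
print(pick_message(bob))    # Stand up for women's rights.
```

Neither user ever learns what the other was shown, which is precisely why my daughter’s question is so hard to answer.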

In answer to her question, ‘If everyone sees the information, why would they still vote that way?’, I explained that not everyone sees the same information, so nobody is choosing from a full presentation of the facts.

While much of this explanation went over her head, I will return to it when she’s a little older. She will undoubtedly have the same glazed-over or eye-rolling response, but I will persist and say it anyway. Why? Because it is too important not to have this conversation.

  • Dr Colman Noctor is a child psychotherapist
