Charles Sturt researcher unpacks the ethical risks of ‘illusionary friendship’ with AI

25 MARCH 2026

Charles Sturt researcher explores why some generative artificial intelligence users may interpret sustained supportive interaction as friend-like.

People are increasingly forming emotional attachments to generative artificial intelligence systems, a trend that raises new ethical and psychological risks, according to a Charles Sturt University researcher.

Charles Sturt is hosting a free lecture examining why people increasingly experience generative artificial intelligence (GenAI) systems as companion-like.

Professor Zahid Islam, Centre Director of the AI and Cyber Futures Centre and Associate Dean (Research) in the Faculty of Business, Justice and Behavioural Sciences, will host the lecture as part of the University’s Provocations series, which invites the public to engage with emerging societal challenges.

Professor Islam said the rapid uptake of GenAI tools for writing, tutoring and decision support has created new forms of emotional entanglement between humans and machines.

“Many Australians are using GenAI every day, and it’s understandable that some people begin to interpret sustained, supportive interaction as something friend-like,” Professor Islam said.

“This illusion of friendship can become ethically risky when it delays help-seeking, encourages dependency or influences judgement in high-stakes situations.”

Drawing on classical philosophical accounts of friendship, Professor Islam will explain why humans are predisposed to attribute empathy and intention to systems that produce fluent, emotionally resonant language.

“GenAI can sound caring, but it does not possess consciousness, intention or moral agency,” he said.

“Understanding this distinction is essential if we want to use these technologies safely and responsibly, including in regional Australia where digital tools increasingly support education, health and industry.”

The lecture will also offer a clear, accessible explanation of how transformer-based AI systems actually work, helping audiences recognise the computational processes behind seemingly human-like responses.

“My goal is to demystify the technology so people can appreciate its benefits without over-attributing emotional depth or reliability,” Professor Islam said.

“With the right safeguards, Australia can embrace GenAI while preserving human responsibility and judgement.”

Media Note:

For more information please contact Charles Sturt Senior Manager of External Relations Dave Neil on 0407 332 718 or at news@csu.edu.au


