IT and AI – benefits and dangers abound

22 MARCH 2024


A Charles Sturt University computing expert hails the burgeoning opportunities of careers in information technology (IT) to maximise the benefits and minimise the potential harms of artificial intelligence (AI).

By Dr Jason Howarth, Course Director and Senior Lecturer in Computing in the Charles Sturt School of Computing, Mathematics and Engineering.

How pervasive is technology? You need only look at the mobile phone to see a small-scale example of how widely technology is used in our daily lives and how big a role it plays in society.

One device, the mobile phone, can now be used for making payments, checking our vaccination status, booking flights and accommodation, authenticating access to programs, serving as a flashlight, connecting to the radio, and so much more. There is not much our tech devices can’t do these days.

How far has the IT industry come over the years?

When we think back to the first iPhone, released in June 2007, it might seem like a long time ago to many people, but it was only 17 years ago. Since then, smartphones have become ubiquitous; everyone has one.

There are critical, pivotal moments in IT where things just get better; phones get faster, computers get faster, the graphics become more realistic.

But then we reach ‘watershed’ moments on top of this when entirely new things happen. In my lifetime there have been many of these – for example, GPS, the Internet, social media applications, drone technologies and smartphones.

The watershed moment today is artificial intelligence, and we’re seeing the emergence of AI as a real game changer in terms of not just benefits but also threats.

If we were to go back to 2007 and think, ‘Well, how far could mobile phones go?’ we would never have guessed half the capabilities they have today. So, imagine how far things can go with AI, or what the next big watershed moment will be.

Technology certainly has come a long way, and the big question now is, how far can or will it go?

The sky is the limit

Looking ahead 20 or 30 years, how the world works and looks will be vastly different from today.

There are always risks with trying new things, so as technology becomes more advanced, there are some key threats to users, to the industry and to the world, not least cybersecurity.

As we increasingly entrust our data to digital devices and digital networks, clearly cybersecurity becomes more and more important.

In an ideal world, it’s more about knowing when ‘enough is enough’, or only using technology where it’s needed.

But one thing humans aren’t very good at is restraint: if we can do something, we tend to do it, even though it might not always be the sensible thing to do.

We might say that the development of nuclear weapons was a bridge too far, but we know how that panned out; nuclear weapons are here to stay. I don’t think it’s going to stop with AI.

I think there will be regulations around what data can be fed into AI, and I think regulatory authorities might put constraints around how much we can trust it and what we can use it for.

But there’s a good chance that most of those constraints will be ignored by innovators, and there will be job losses for sure, or at least job shifts, as recently demonstrated by the four-month Hollywood screenwriters’ and actors’ strikes in late 2023.

The genie is out of the bottle

I think that question has already been answered by the Silicon Valley AI developers, who have said the genie is already out of the bottle and AI is only going to get more and more powerful.

We face societal dilemmas; for example, what are the implications for defence and military applications? We are already witnessing the deployment of AI-guided ‘drones’ (unmanned aerial and maritime vehicles) in the Russia-Ukraine war. What are the dangers of increasingly autonomous weaponry being deployed on the battlefield?

In the realm of biology and medicine, what miracles and monsters will AI create? A cure for cancer, or a new class of super-intelligent humans, perhaps from one nation or corporation, who dominate and rule the other 99.99 per cent of humanity?

Will AI help to alleviate and overcome environmental strains and climate threats, or accelerate interplanetary exploration and settlement, leading to a disregard for, and abandonment of, Earth?

IT jobs abound

As these threats continue to grow, Charles Sturt University is training the next generation of IT experts to alleviate some of the pressures facing the industry, not least the growing demand for skilled IT professionals.

We make no assumptions about people’s IT knowledge when they enrol in a degree with Charles Sturt. You don’t need an IT background, and for the foreseeable future, jobs abound.

For more information about Charles Sturt IT and AI courses see below or visit www.csu.edu.au or call 1800 275 278.


Media Note:

To arrange interviews with Dr Jason Howarth who is based in Port Macquarie, contact Bruce Andrews at Charles Sturt Media on mobile 0418 669 362 or news@csu.edu.au

Charles Sturt University offers three undergraduate courses:

The Undergraduate Certificate in Information Technology comprises four core subjects about computer systems and how they work. It’s a great starting point from which to transfer into one of our main bachelor programs with full credit for the subjects passed.

The Bachelor of Information Technology, offered online and on campus at Port Macquarie and Bathurst, has two main components. The first is core subjects in areas such as programming, networking and project management; the second is a specialisation in a dedicated area of technology, such as cyber security, network engineering or software development. To ensure real-world experience, students also undertake a group project as well as a work placement during the course.

The Bachelor of Computer Science is more focused on programming, mathematics and specialisations in areas like web development and cybersecurity. Students also get the opportunity to build and deploy a real-world project, as well as undertake a work placement in a relevant role.
