Australia’s AI Reckoning

Opinion | Australia Can’t Afford to Sleepwalk Into the Age of AI

By Maddison McCoy

When I closed a recent panel on artificial intelligence at UNSW, I told the audience this:

“In the depths of our strategic void, we find ourselves navigating in this new world of synthetic data, job displacement, and the discrepancies between thinking about AI and acting on what is a closing window of competitive advantage. But there is hope. AI is leveling the playing field for startups. So we say fair game. But for us to advance our position in this game, we need a structural change and a cultural change.”

This was my honest reflection after listening to a spirited discussion led by Dr Sue Keay, CEO of the UNSW AI Institute and one of Australia’s most respected voices in robotics and AI. She steered a conversation between Michelle Gilmore, CEO of Juno; Distinguished Professor Fang Chen, Executive Director of the Data Science Institute at UTS; and a room full of people eager to know whether Australia is ready for AI.

What I heard convinced me that we’re at a turning point. We can allow fear to stall us, or we can acknowledge that fear is no strategy at all.

Fear Is Our Default

Sue set the tone early by putting the uncomfortable fact on the table: Australians are the most fearful of AI in the world. Her question to the panel was simple: why?

Michelle Gilmore cut to the heart of it:

“There’s a difference between what Australia is on paper and how we market ourselves to the world and what it’s actually like to live here… our government certainly encourages a fear-led mindset as it relates to AI and so does the Australian media.”

That rang true. Fear has become a reflex in our national conversations, whether it’s automation, immigration, or climate change. But AI will not wait for us to resolve our anxieties. Other nations are already setting the standards, shaping the markets, and creating the jobs. If we hesitate too long, we will inherit other people’s decisions.

The “No-Brainer” Uses We Ignore

Sue also asked the panel whether too much of the national debate gets stuck in abstractions — fairness, bias, “doomsday” speculation — at the expense of practical adoption.

Fang Chen responded with a down-to-earth example from her own work: sewer inspections.

“People wear the protective gear, drop through the manhole, and take pictures… I would just say this job is not sexy. Not at all. And then after, of course, we created image recognition… by doing that we prevent the safety issues for human beings and also provide the better solution for maintaining the civil infrastructure,” she explained.

This isn’t science fiction; it’s basic public works. We should recognise the clear opportunities AI offers, like sparing workers from dangerous, unpleasant jobs while improving infrastructure reliability. If we can’t embrace these obvious, low-risk opportunities, how will we ever be ready for the more complex ones?

Jobs Will Change. They Always Do.

Sue pressed the panel on the public’s greatest concern: jobs. What about entry-level workers? What about the new graduates in the room?

Michelle was direct:

“This is real. I mean, this concern is absolutely real… of course from enterprise and government perspectives it’s how to provide the pathways to those people to get different jobs or different training.”

Then she added the candor we need more of:

“Anyone that says that job displacement won’t happen is just flat-out lying.”

History proves her right. Displacement has come with every technological shift, from the Industrial Revolution to the rise of the personal computer. The challenge is not to prevent it — that’s impossible — but to manage it. That means reinvesting productivity gains into reskilling, retraining, and preparing people for the jobs of tomorrow.

Corporates, Startups, and the Competitive Landscape

Sue noted that without forward planning, job displacement “is not necessarily going to end well.” That observation applies just as much to organisations as it does to individuals. The landscape is shifting. Startups now have access to tools and insights once reserved for the largest firms.

Michelle reminded us of this reality:

“I raised $2.8 million last year… I’m still here. I don’t believe that the playbook to start here, get a little bit of funding and then go straight to the US is true anymore.”

It was an encouraging note. But Sue added an important caveat:

“It’s very unusual for these [Australian startup] companies to list on the stock exchange… the only path for investors to get a return is for the company to be acquired and there are no Australian companies to acquire them.”

She’s right. Too many Australian startups sell early, sending intellectual property and profits overseas. That’s not about a lack of talent — it’s about the structure of our economy. Changing that will require deliberate effort from universities, industry, and policymakers alike.

Ethics, Data, and National Responsibility

On ethics, Sue asked perhaps the hardest question of the night: should Australia build its own ethically trained AI model?

Fang Chen emphasised the urgency of data governance:

“How to secure data probably is one of the really important ones… and I worry even more about AI-generated data, how to differentiate the generated data versus the real data.”

Michelle voiced another reality check:

“We aren’t great at shared collective understanding and agreements. And a great foundational model needs to be built from that.”

That tension captures Australia’s challenge. We may not build a foundational model tomorrow — but we can lead in setting the ethical standards, frameworks, and applied research that will define how AI is used responsibly.

An Awakening, Not a Reckoning

Which brings me back to my closing words. Australia is teetering on the edge of an AI reckoning. If fear continues to dominate, if we delay investment in skills and research, if we shy away from practical adoption, then we will remain passive consumers of technologies built elsewhere.

But that isn’t inevitable. If we nurture our startups, embrace obvious “no-brainer” applications, and build ethical frameworks that reflect our values, we can shift from reckoning to awakening.

This is where the UNSW AI Institute plays a crucial role. Under Sue Keay’s leadership, it is not just advancing world-class research; it is convening industry, government, and the startup community to wrestle with the hard questions and test solutions in practice. The AI Institute is building the connective tissue between researchers, founders, and decision-makers that Australia needs if we want to shape our own AI future.

AI is fair game. The real question is whether Australia will choose to play, and whether we will back the people and institutions already leading the way.
