I don’t fear AI, but I fear what some people might do with it

Posted on 18 Mar 2026

By David Crosbie, CEO, Community Council for Australia

The question is not whether the artificial intelligence itself is good or bad. It's about the people using it. Pic: Shutterstock

Like many of you, I’m slowly starting to incorporate more AI tools into my work. I claim no particular expertise in AI, but as an amateur user I have had opportunities to participate in many discussions about how AI is now being used across our sector.

Most of the AI use across our sector appears to be both intuitive and useful. According to Infoxchange’s Digital Technology in the Not-For-Profit Sector Report, “AI use has become mainstream across NFP operations. Sixty-seven per cent now use generative AI, with 43% of organisations using ChatGPT. One-third are using Copilot. Three out of four (75%) are deploying AI for content creation, reporting, writing and editing. The use of applications with built-in AI has risen from 12% in 2024 to 27% in 2025.”

The major challenges to AI adoption in our sector appear to be cost (training, time and software) and the way AI challenges the existing knowledge base, data systems and privacy settings within organisations.


There is also a major gap between AI use and the development and adoption of AI policies across our sector: fewer than 15 per cent of charities and NFPs have clear AI policies.

It’s important to note that charities and NFPs are invariably seeking to use AI to better serve their communities, strengthen resilience and capacity, and support flourishing individuals, families, communities and our world. Charities and NFPs use AI for good.

What concerns me is not the use of AI in our sector, but the application of AI by people who use it to increase their power and wealth.

We’ve seen this pattern of technological advancement before.

I may occasionally miss the feel and smell of sets of encyclopedias on the bookshelves, but I enjoy being able to go online and have access to almost instant knowledge wherever I am (subject to Telstra’s misleading coverage maps). The internet is now an invaluable part of daily life for me and most people, but it comes at a price.

An increasing and uncomfortable level of personal surveillance has been normalised in our lives. We are all subject to ongoing manipulation designed to get us to click, watch another clip, read another line, buy that product; even the inevitable attempts to scam us have become an accepted daily reality. AI already drives the algorithms that shape many of our interactions, much of our information gathering, and a lot of our economic and social activity.


The development of the attention economy has also effectively undermined truth and distorted our information systems to the benefit of a very small number of international billionaires. These super-powerful oligarchs are, by their own admission, not humanists. They see empathy as counterproductive to their goals, and they oppose and undermine any form of government or collective approach to wellbeing. They thrive where fear and outrage are both magnified and monetised.

This loss of privacy and the constant outrage marketing can feel like a loss of agency. But the price we pay for the wonders of the internet is not just personal. One of the most concerning potential outcomes of the creeping application of AI, driven by a small number of cashed-up, powerful commercial interests, is a rise in inequality.

Or as Bernard Marr wrote in “The 15 Biggest Risks of Artificial Intelligence”, published in Forbes in 2023: “The risk of AI development being dominated by a small number of large corporations and governments could exacerbate inequality and limit diversity in AI applications.”

If the development and application of AI is largely controlled by a small number of powerful people, it poses a risk to our way of life. The risk is not AI itself; it is the people developing and applying AI to maximise their personal power and wealth.

As many who are far more qualified than I am have highlighted, regulation is needed to ensure transparency and strong, effective guardrails around AI and its application. The European Union’s AI Act is a good example.

But regulation of AI will only go so far if power and control remain concentrated and centralised. And that is where charities and NFPs can and do play a critical role.

First, we can support our communities in holding AI companies to account, challenging their social licence and demanding a fair return on the profit they make from our use of AI.

Perhaps more importantly, if we work proactively with the communities we serve to develop and apply AI-based responses to their issues and needs, AI becomes another significant tool for building flourishing communities.

As a judge of the Australian NFP Technology Awards, I have seen charities and NFPs using AI tools in partnership with their communities to improve engagement and provide more targeted and effective services. This use of AI benefits both the organisations and the communities they serve.

It’s clear that we need more regulation of AI and better safeguards, but we can also negate some of the negatives of AI if we stand by our values against the dehumanising, exploitative use of AI, and continue to adopt and adapt AI in partnership with our communities.

As many charities and NFPs are already demonstrating, AI can be a powerful tool to empower communities and help organisations better achieve their purpose.

I’m not afraid of AI, but I am afraid of what some people might do with AI if we let them.

PS: Entries for the Australian NFP Technology Awards are still open.

David Crosbie has been CEO of the Community Council for Australia for the past decade and has spent more than a quarter of a century leading significant not-for-profit organisations, including the Mental Health Council of Australia, the Alcohol and Other Drugs Council of Australia, and Odyssey House Victoria.
