Posted on 13 May 2026
By Doug Taylor, CEO, The Smith Family
Doug Taylor, CEO of children’s education charity The Smith Family and a teacher of governance for social impact, argues that what really matters in adopting AI is how you plan to use it, and whether it will serve your mission.
Ask most not‑for‑profits that haven’t adopted artificial intelligence what’s holding them back and you’ll hear concerns about data security, privacy, data sovereignty and ethical risk. That’s what Infoxchange’s 2025 survey of more than 820 organisations across Australia and New Zealand found.
These are serious issues and should guide every decision about the use of AI. Trust is the currency of our sector – hard won, easily lost – and the people and communities we support have little margin for error in their lives if we get this wrong.

It can be tempting to think the safest option is to stop and wait. That hesitation shows up in the data: 25 per cent of not‑for‑profits told Infoxchange they don’t consider AI a priority.
But waiting is like pulling the family car over to the side of the road. Everyone else keeps moving and when you eventually rejoin the traffic, you’re behind with fewer choices. That risk isn’t just to our organisations, but most importantly to the very communities we exist to serve.
Using AI is not about being fast, or about how much work we can automate. It’s not about cutting costs, boosting the bottom line or doing more with less. It’s not about making things less human – quite the opposite.
AI is a means to an end. And that end is our purpose: to create better outcomes for the communities we exist to support. The technology matters only if it helps us do that more effectively and efficiently.
Responsible adoption goes beyond how we use AI tools in our organisations.
As AI becomes an increasingly common part of everyday life – shaping how students learn, how the job market works and how people book appointments or access services – we can’t lose sight of the broader social impacts.
For people and communities already shut out of the digital world – without affordable, reliable internet, digital devices or digital skills – AI risks becoming another source of exclusion, widening the inequalities that already exist.
At The Smith Family, over the past few years we’ve trialled software with inbuilt AI, custom tools and generative AI to learn where they add value and where they don’t. We’ve done that with clear rules, a strong sense of purpose and a healthy dose of caution.
What’s important isn’t the tools, but how they’re used, the boundaries we draw and the judgements we make.
Used well, AI has the potential to make our organisations more human, not less, giving teams more time to focus on people.
AI is already changing how work gets done across almost every industry, from agriculture to accounting to zoology. Two-thirds of not-for-profits that took part in Infoxchange’s survey said they’re already using generative AI, and almost a third are using tools and platforms with AI built in.
But it’s concerning that governance hasn’t kept pace.
Of the organisations using AI, 85 per cent don’t have a related policy, a basic building block of responsible governance. More than a third say they either don’t see the need for a policy or lack the means to put one in place.
Even if your organisation hasn’t formally approved the use of AI, AI is almost certainly already in the building.
Staff are likely to be using it to draft emails, summarise reports and sift through data. When that happens without guidance, it’s risky.
To be useful, AI needs to be supported by good governance, safe workflows and human skill.
Designed well and used wisely, it can reduce unnecessary complexity. It can help us to be more effective and efficient, whether that’s supporting researchers studying dementia or young Australians facing barriers to learning because of poverty.
Our sector’s success is built on relationships – in the case of The Smith Family, with students, families, schools, donors and communities. We’re using AI to strengthen those connections, not to replace people or diminish professional judgement. This gives our people more time to focus on what only humans can do.
That’s the view we’ve taken as we’ve looked at where AI can help solve specific challenges. How can it improve outcomes for students? Where can it reduce administrative overload? When can it enable us to make better decisions for families? Can those solutions be applied widely?
From the outset, we set up an internal governance group to guide all decisions, ensuring our approach to AI was responsible, ethical and aligned with our purpose and mission. That has meant answering three questions every not-for-profit must consider:
What are we willing to use AI for?
What are we not willing to use it for?
And what would need to be true for us to broaden its use?
One example of how we’re using AI is through a digital personal assistant used by our family partnership coordinators. It summarises past interactions with families and information about their child’s educational progress, so our teams can be well prepared for conversations.
The assistant doesn’t make decisions or act alone, but it gives people better context, and more time, for quality one-on-one conversations with families.
Another AI digital assistant helps our call centre teams who speak with donors and supporters. This tool collates information, such as past enquiries and giving history, so staff can understand the context of a call straightaway. That helps them respond more quickly and have better conversations. And behind the scenes, AI is also improving payroll processes to help staff spot issues early and reduce errors, lowering organisational risk.
Then there’s the AI tutor we’ve trialled with some high school students we work with. In a trial led by the not‑for‑profit High Resolves and funded by the Paul Ramsay Foundation, students were paired with an AI tutor that adapted lessons to their interests and learning styles. Early results are promising, offering a glimpse of the benefits and possibilities.
Good governance doesn’t need to be complex, but it must be explicit. That means detailing which AI tools are approved, what uses are off‑limits, what information should never be entered into public tools, where human judgement must remain central, and how issues are handled when something goes wrong.
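One way to make such a policy explicit is to write it down as a simple checklist that staff (or an intake form) can test requests against. The sketch below is purely illustrative – the tool names, rules and contact address are hypothetical, not The Smith Family’s actual policy:

```python
# Hypothetical sketch of an explicit AI-use policy as a data structure.
# Every name and rule here is illustrative, not a real organisation's policy.

POLICY = {
    "approved_tools": {"internal_assistant", "office_copilot"},
    "off_limits_uses": {
        "automated_decisions_about_families",
        "targeting_individual_donors",
    },
    # Information that must never be entered into public AI tools.
    "restricted_data": {"client_names", "case_notes", "donor_records"},
    "incident_contact": "ai-governance@example.org",
}

def is_permitted(tool: str, use: str, data_items: set) -> bool:
    """Allow a request only if the tool is approved, the use is not
    off-limits, and no restricted data would be entered."""
    if tool not in POLICY["approved_tools"]:
        return False
    if use in POLICY["off_limits_uses"]:
        return False
    if data_items & POLICY["restricted_data"]:
        return False
    return True

# Example: summarising a public report with an approved tool is fine;
# the same task involving case notes is not.
print(is_permitted("internal_assistant", "summarise_report", set()))
print(is_permitted("internal_assistant", "summarise_report", {"case_notes"}))
```

The point isn’t the code – it’s that a policy concrete enough to be written this way is also concrete enough for staff to apply consistently.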
We’re still learning, as is the whole sector, and that’s why governance is vital – learning without rules and guidelines is unlikely to succeed.
Infoxchange’s research shows that among organisations that have implemented AI, 80 per cent report improvements in productivity and results, and that’s encouraging.
If you are wondering what the first step is, start small. Set up one low-risk trial and put guidelines in place so teams know what’s okay and what’s not. Run it for 90 days and measure the results. That’s how you discover what works and can make informed choices about where next.
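Measuring the results of a trial like this needn’t be elaborate. A minimal sketch, with made-up numbers, might simply compare an everyday metric – say, minutes spent drafting a report – before and during the 90-day pilot:

```python
# Minimal sketch of measuring a 90-day AI pilot: compare one baseline
# metric before and during the trial. All figures below are illustrative.

def percent_change(baseline, trial):
    """Percentage change in the average metric from baseline to trial."""
    before = sum(baseline) / len(baseline)
    after = sum(trial) / len(trial)
    return (after - before) / before * 100

minutes_before = [50, 45, 55, 60]   # minutes per report, pre-trial
minutes_during = [30, 35, 25, 30]   # minutes per report, with the AI tool

change = percent_change(minutes_before, minutes_during)
print(f"Average drafting time changed by {change:.1f}%")  # -42.9%
```

A single number like this won’t capture everything, but it turns “did the trial work?” into an informed decision rather than a hunch.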
The Smith Family will be sharing more of what it’s learned from using AI at a forthcoming webinar, “AI in action: practical lessons to advance purpose and impact”, on Thursday, May 21.
RSVP: https://thesmithfamily.zoom.us...
Doug Taylor is the CEO of The Smith Family, a children’s education charity. He also teaches Governance for Social Impact at the Australian Graduate School of Management with The Centre for Social Impact.