Posted on 28 Oct 2025
By Nick Place, journalist, Community Directors
We all understand that artificial intelligence (AI) has arrived, like a shiny new car, but how many of us actually know how to drive it? With this question in mind, the National Artificial Intelligence Centre (NAIC) has published a “driving manual” in the form of guidance to help Australian businesses, not-for-profits and other users safely incorporate AI into their plans.
Guidance for AI Adoption outlines six practices to help organisations plan, manage and use AI in ways that build trust and deliver benefits. It is intended both for organisations just starting to adopt AI and for those already building complex systems.
The guide offers practical tools and templates, such as an AI policy template and an AI register template, with handy starting points for charities that might not know where to begin.

The guide is based on national and international ethics principles. The NAIC says that by following it, organisations can build trust with stakeholders, realise the benefits of AI while managing its risks, strengthen public confidence in AI adoption, and set a roadmap for navigating a complex government landscape.
There are two versions of the guidance: “Foundations”, for those just starting to adopt and incorporate AI, and “Implementation practices”, for governance professionals and technical experts.
“Foundations” covers low-risk uses of AI and is aimed at professionals who are new to AI or looking for general guidance on best practice for using AI in business, not-for-profit and for-purpose contexts.
The executive director of NAIC, Lee Hickin, told the Community Advocate that creating the guidelines was an important part of the centre’s role in helping Australian industry to adopt AI safely and responsibly.
“We have heard from industry that they need more than principles. They need practical tools that unlock the economic benefit of AI,” he said. “This guidance delivers exactly that – clear practices, templates and pathways to help organisations embed responsible AI into their operations.”
Before taking up his role as NAIC leader, Hickin was chief technology officer for Microsoft ANZ and then led Microsoft’s global Responsible AI team for the Asia region. He also worked independently in the AI assurance field for several international governments. His background left him well equipped to lead the NAIC’s mission of building “consistent and appropriate responsible safety mechanisms to support AI, balanced with the need to empower Australians with confidence and trust in the positive and transformative potential of AI.”
Hickin said good AI governance was essential. “Responsible AI starts with strong governance. This guidance gives organisations the structure they need to manage AI risks, build trust and ensure their use of AI aligns with community expectations and business goals,” he explained.
“The guidance is designed to support both new adopters, often smaller organisations taking their first steps with AI, and those further along in their adoption journey. It provides foundational practices to help newcomers build capability and confidence, while offering more advanced implementation strategies for experienced users to strengthen governance, manage risk and align with best practice,” he said.
The six practices that form the backbone of the guidance have been streamlined since the previous Voluntary AI Safety Standard (VAISS) was published, and they incorporate feedback received from industry consultations. “The inclusion of practical templates, such as an AI risk register and AI policy, makes it easier for organisations to take action, not just understand the theory,” Hickin said.
“We’re aiming to reach decision-makers, risk managers, and technical leads across sectors who are responsible for deploying AI safely. The guidance is a step toward building national consistency and confidence in AI adoption.”
More information