Not-for-profits getting smart about artificial intelligence: ICDA survey

Posted on 07 Feb 2024

By Matthew Schulz, journalist, Institute of Community Directors Australia

Nearly all not-for-profits are using artificial intelligence, yet just a handful have adopted AI governance principles to match.

Not-for-profit leaders believe there’s potential to make greater use of AI in their organisations to make data-driven decisions (21%), write better grant applications (20%), analyse data more effectively (15%), and improve customer engagement (9%).

These are some of the early findings from the latest Institute of Community Directors Australia (ICDA) Pulse Poll.

Not surprisingly, the most popular AI system in use was ChatGPT, used by nearly one-third of respondents. Some even used ChatGPT to complete the survey, asking the bot to suggest potential uses for AI and to flesh out other answers.

Popular uses for AI systems – according to human respondents – include:

  • writing presentations, preparing speeches and developing communications (26%)
  • customer engagement (9%)
  • data analytics (6%)
  • customer services (6%).

Almost all not-for-profits are using artificial intelligence, with two-thirds of respondents saying they could experiment more to make better use of it.

"There is a massive scope for wider implementation of AI across all NFP organisations as knowledge of capabilities and application increases." – Survey respondent

At the same time, only a handful of organisations have adopted AI governance principles, with many nervous, risk-averse and confused about how best to deploy the new technology.

Only three of the 185 organisations that responded to the poll had not used artificial intelligence systems, with 2% saying the AI systems they’d tried had been “too risky” to continue using.

According to the study, relatively few frontline staff (12%) and tech support staff (9%) were using AI for work. Instead, managers and leaders (32%) had taken the lead, while 13% of board members were using the new tech.

As a respondent from a small NT-based organisation put it: “There is a massive scope for wider implementation of AI across all NFP organisations as knowledge of capabilities and application increases. For small NFPs in niche sectors such as ours, this relies on keeping younger memberships active and involved at the board level to keep use of digital technology and AI progressing rather than reverting back to the paper-based (snail mail) communications and memberships we had in 2022 when our new board were voted in!”

Another leader in a small NFP declared in their comments: “AI should be looked at by the NFP sector for the obvious efficiencies it can bring. Its ability to do this and also link to other apps and applications has the opportunity to save money and hours in administration and reporting analysis work.

"We can still keep sensitive customer data out of the model, and train it to be of use at an operational, strategic and governance level. It can also be fed information, so it has the ability to learn about your sector and the organization more broadly. You can develop "personas" through prompt engineering to automate customer experiences, assist with report and proposal writing, and even some top CEOs use it as a personal assistant/sounding board for strategy ideas.”

[Image: ChatGPT imagined in humanoid form – somewhat futuristic and non-threatening, yet hinting at demi-god status. Generated with a Dall-E prompt.]

AI enthusiasm tempered by risk concerns

More than half of the survey respondents (55%) were personally enthusiastic about AI, although a much lower 27% believed their organisations were AI enthusiasts.

Overall, NFPs are slow adopters of AI, with nearly three quarters of respondents describing their organisations as “neutral”, “cautious” or “very cautious” about its use.

That sense of caution also appears to extend to the types of data organisations are willing to let AI systems use: most are neutral about, or have little appetite for, allowing AI to run chatbots on their websites, generate predictive answers, or read certain kinds of text data.

The reticence about fully activating AI is reflected in the high level of awareness of AI risks. Leaders nominated concerns about:

  • privacy and data security, with fears about breaches and securing sensitive information such as client data, financial details, and proprietary information
  • fear and a lack of understanding of AI within their organisations, including general workplace anxiety about AI and how it should be used
  • legal and ethical issues, such as copyright, privacy, the misuse of AI, and the potential breakdown of human connection in human-driven services
  • inaccuracy and misinformation, with the effect it could have on reputation, client services and legal compliance
  • human resources worries, including concerns about losing volunteers and workers
  • technological risks of relying too much on AI, potential bias, and the loss of authenticity and trust
  • the internal risks posed by staff and others using AI tools without approval or understanding, as well as board-level “inertia” about the adoption of AI.

The slow take-up also appeared to be related to the fact that more than 80% of NFPs had limited, very limited or no capacity to pay for AI products.

One respondent summed up the feelings of many: “I think people in the NFP sector (in general) are overworked and underpaid, and as a result of this they have not invested any time or energy into learning about AI and accepting it into their lives. I see a lot of fear-based resistance to AI in the NFP sector.”

Organisations must up their AI governance game

According to the study, more than 45% of organisations did not routinely identify AI risks, with just 16% having done a formal risk assessment of their use of AI.

Just a handful of organisations (5.8%) had already adopted AI governance principles, with 28% of respondents unaware of how AI was governed in their organisations.

And just 17% had policies, procedures or guides that mentioned AI risks and privacy.

Some organisations said they wanted to adopt policies and assess risks before implementing AI.

[Image: ICDA general manager Adele Stowe-Lindner]

ICDA general manager Adele Stowe-Lindner said the study reinforced the great potential for the sector to deploy AI more widely, but also highlighted the lack of governance principles being applied to steer that change.

“The survey results show that too few board members are making key decisions about how AI is being rolled out in organisations, with that task currently falling to senior staff or the roll-out happening simply by default,” Ms Stowe-Lindner said.

“The poll shows that the use of AI is currently mainly staff-led (43%), which is positive in the sense that they’re showing that initiative, but at the same time less than five per cent of organisations are adopting AI products via board recommendations.

“This raises significant governance implications, with the potential consequences of board members being unaware of how artificial intelligence applications are permeating through their organisations.”

“I would like to see NFP leaders be more confident about how and when they should, and should not, employ artificial intelligence.

“For example, NFPs realise there’s great potential for deploying AI to help them with grant applications, yet the figures show there’s some reticence to do that. Some of the comments indicate this could be because of fears that it might be considered ‘cheating’, or too risky with high-stakes projects. Our data scientists say good prompt engineering – essentially asking your favourite chatbot the right questions – can ensure AI outputs can be verified and trusted.

“So, knowing the right questions to ask, maintaining ethical behaviour and avoiding privacy breaches – alongside strategic use of AI – is all a matter of good governance.

“We know that NFPs are resource poor, and the proper use of AI can significantly save time and improve their effectiveness. You don’t want to miss out on this opportunity by being affected by a lack of action triggered by decision paralysis or ‘overwhelm’. People need to understand what’s possible with artificial intelligence, but importantly governance must come with that.”
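
To make the “right questions” point concrete: one common prompt-engineering tactic is to ask for output a human can verify, by instructing the model to separate supplied facts from anything it cannot support. The sketch below is a hypothetical example only (the grant details are invented, and it reuses the ask() helper from the earlier sketch).

    # An illustrative verification-oriented prompt; all details are invented.
    # The aim is output a human can check: unsupported claims and assumptions
    # must be flagged rather than silently blended into the draft.
    GRANT_PROMPT = """
    Draft a one-page case for support for a youth mentoring program.

    Rules for your answer:
    1. Use only the facts listed below; do not invent statistics.
    2. Mark every claim not supported by those facts as [NEEDS SOURCE].
    3. End with a bullet list of assumptions a human reviewer must verify.

    Facts:
    - 40 volunteer mentors and 120 participants in 2023
    - funded by council grants and local donations
    """

    print(ask(GRANT_PROMPT))  # ask() as defined in the earlier sketch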

Ms Stowe-Lindner said ICDA had produced a suite of help sheets, guidance and training to help community directors, including an AI governance help sheet and framework.

She said the significant gap between personal enthusiasm for AI (55% of respondents) and organisational enthusiasm (27%) was a red flag.

“The advent of AI isn’t something that can be ignored in an organisation, and for those engaging in and leading cultural change, this can be a big challenge.”

Nevertheless, it appears that the potential is there for NFPs to lift their AI game, with most organisations (51%) describing themselves as having been successful in adopting new technology in the past. Groups also rated themselves a healthy three stars out of five for their tech abilities.

The ICDA findings provide further insight into the broader findings of the recent Infoxchange Digital Technology in the Not-for-profit Sector report, which found that the use of generative AI tools had doubled since 2022.

Infoxchange chief executive David Spriggs said about AI: “These new tools require careful consideration, guidelines and training to ensure they are being used with data protection and information security at the forefront”.
