Artificial intelligence (AI) opens new avenues for efficiency in writing anything from marketing materials to survey questions to op-ed essays. For not-for-profits (NFPs), one particularly onerous task is writing funding applications. Applicants are typically asked repetitive, slightly differently worded questions by a range of funders in a range of formats. We find ourselves needing to explain again and again what problem our program addresses and how it will solve it, covering the same answer from different angles – youth, CALD communities, people with disability, rural Australians.
AI tools such as ChatGPT, Copilot and Gemini offer the ability to streamline the grant-writing process, but they also present significant challenges. Understanding both the opportunities and the pitfalls of employing AI to write grant applications is crucial for NFPs that want to save time without sacrificing authenticity.
AI can automate routine tasks associated with grant writing, such as drafting standard sections, formatting documents, and conducting preliminary research (see also ‘Inaccurate information’, below). This automation reduces the administrative burden, freeing up staff to focus on strategy and relationships, or to generate more grant applications.
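To make the time-saving concrete, the repetitive-questions problem described above can be sketched as a small script that prefills each funder's question from a library of core answers. This is a minimal illustration, not a real tool – every keyword, question and answer below is a hypothetical placeholder, and a human still edits every draft:

```python
# Minimal sketch: reuse a library of core answers to prefill the
# repetitive, differently worded questions that funders ask.
# All keywords, questions and answers are hypothetical placeholders.

CORE_ANSWERS = {
    "problem": "Our community faces <problem>, which affects <group> ...",
    "solution": "Our program addresses this by <approach> ...",
    "impact": "Last year we supported <n> participants, with <result> ...",
}

# Map each funder's wording back to a core answer via simple keywords.
KEYWORDS = {
    "problem": ["problem", "need", "issue"],
    "solution": ["solution", "program", "approach", "deliver"],
    "impact": ["impact", "outcome", "result"],
}

def draft_answer(question: str) -> str:
    """Return the core answer whose keywords appear in the question."""
    q = question.lower()
    for topic, words in KEYWORDS.items():
        if any(w in q for w in words):
            return CORE_ANSWERS[topic]
    return "[No matching core answer - write this one from scratch]"

def draft_application(questions: list[str]) -> dict[str, str]:
    """Prefill a funder's question set; a human edits every draft."""
    return {q: draft_answer(q) for q in questions}
```

The same idea is what an AI tool does with more sophistication: it recognises that "What need are you addressing?" and "Describe the problem" are the same question in different clothes.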
AI can assist in brainstorming and refining language, offering ways to articulate complex ideas clearly and to improve grammar.
AI can come up with program-naming ideas, and even project ideas for you to play with and make your own. Many people find it easier to start with some AI-generated ideas than with a blank page. For instance, if your homelessness service is applying for funding, AI can provide insights into existing local services and suggest ways to uniquely position your approach.
AI's ability to analyse large datasets enables NFPs to identify trends in their own work as well as public work and measure the impact of programs more effectively. You could anonymise all the qualitative data collected in a survey through free-text questions, enter it into an AI program, and ask it to identify important themes. Taking a data-driven approach can strengthen funding applications by providing concrete evidence of success and also spark future project development.
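That anonymise-then-analyse workflow can be sketched in a few lines of Python: strip obvious identifiers from the free-text responses first, then bundle them into a single theme-extraction prompt for whichever AI tool your organisation has approved. The redaction patterns here are illustrative assumptions only; real de-identification needs more care and a human review of the output:

```python
import re

# Minimal sketch of the workflow above: redact obvious identifiers
# from free-text survey responses, then build one theme-extraction
# prompt. The patterns are illustrative only - check output by hand
# before entering anything into an external AI platform.

PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[email]"),                 # email addresses
    (re.compile(r"\b(?:\+?61|0)4\d{2}\s?\d{3}\s?\d{3}\b"), "[phone]"),   # AU mobile numbers
    (re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.?\s+[A-Z][a-z]+\b"), "[name]"),    # titled names
]

def anonymise(text: str) -> str:
    """Replace emails, phone numbers and titled names with placeholders."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

def theme_prompt(responses: list[str]) -> str:
    """Bundle anonymised responses into one theme-extraction prompt."""
    cleaned = "\n".join(f"- {anonymise(r)}" for r in responses)
    return (
        "Identify the most important recurring themes in these "
        "anonymised survey responses:\n" + cleaned
    )
```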
AI can, and regularly does, generate information that is inaccurate and then communicate it clearly and confidently. It will sometimes ‘fess up if you ask “What is your source?” or “Is that a direct quote?”, telling you that the information it just offered was no more than a guess. It is essential to check the accuracy of information generated by AI, using a human fact checker.
While AI can generate text, it does not understand the nuance of an organisation's mission or values, or the communities it serves. Even if you upload this information, it will often generate text whose tone is not quite right: it sounds robotic, or too perfect, or just not you. Grant applications require a personal touch that reflects genuine commitment to the cause. Over-reliance on AI-generated content can result in applications that feel impersonal or generic, and this can diminish their impact.
Artificial intelligence is inherently biased. It is built by humans, so it shares the human biases of those who created it. The potential for AI systems to perpetuate biases by generating biased information – for example, by skewing towards dominant cultural narratives – could see a grant applicant submit an inaccurate or unethical application.
Another potential form of bias lies with grant assessors. Some grant assessors feel passionately that AI levels the playing field and enables anyone to write an application, even if their English or written skills are not strong. Others strongly oppose AI, on the basis that it produces inauthentic work.
The safest way to avoid both these forms of bias is to have humans write, or at least very carefully check and edit, your organisation’s grant applications.
Using AI sometimes involves entering sensitive organisational information into digital platforms, which carries data privacy and confidentiality risks. For example, the Australian Research Council has cautioned against the use of AI by peer reviewers and pointed out that entering a research grant application into a platform such as ChatGPT would represent a breach of confidentiality.
NFPs must ensure that they have robust data protection measures in place and consider the ethical implications of AI use in funding applications, particularly when those applications include the organisation's data and intellectual property.
Some organisations are concerned that relying on AI for drafting funding applications may lead to complacency, with staff neglecting the development and maintenance of their grant-writing skills. Storytelling, empathy and strategic thinking are critical in conveying an organisation's vision and impact, and these are human skills and qualities that AI cannot replicate.
Humans need to consider carefully how they prompt any AI model to ensure that the resulting narrative is accurate and aligns with the organisation's voice. Moreover, while AI can be called on to assist in drafting content or ideas for grant applications, it is imperative that experienced professionals edit and review them. AI should be seen as a "thought partner" rather than a replacement for grant writers.
Develop clear guidelines for AI usage that address data privacy, consent and transparency. Establish protocols to protect sensitive information and ensure that AI tools are used responsibly and ethically. See this policy template: https://www.communitydirectors.com.au/policies/use-of-artificial-intelligence-policy-2024
Encourage ongoing training for staff in grant writing and storytelling to complement the use of AI tools. Balancing technological assistance with human creativity and insight helps to ensure that applications remain compelling and authentic.
AI offers both opportunities and pitfalls when it comes to writing funding applications that match the funder’s interests. On the one hand, you can upload your funder’s form and ask AI to adapt your programs and draft text to the interests of the funder (an opportunity). On the other, there is a significant risk that AI will generate text that is merely okay and does not precisely meet the requirements the funder has outlined (a pitfall). Without human oversight and input, this application will not be a winning ticket.