Funding in the era of AI 

By Lani Evans. Lani is currently on parental leave from her role as Head of Te Rourou, One Aotearoa Foundation. She is an honorary member of PNZ.
 

AI creator tools have arrived, and they bring with them a range of possibilities and problems. Essentially super-charged chatbots, tools like ChatGPT, Bard, Bing and others can respond to almost any prompt - confidently, fluently and sometimes wildly incorrectly. 

So what can these tools be used for? Does ChatGPT have the potential to create efficiencies in philanthropy and decrease the funding burden for communities? Can it write a high-quality funding application?  

To test this, I took (with permission) applications written by community organisations, and re-wrote them, generating answers through ChatGPT. A group of generous funders reviewed both submissions and compared the quality, accuracy and “fundability”.  

When comparing like for like, the AI responses were relatively easy to spot. Most of them read like a first-year university paper - overly formal, wordy and a little clunky. The human responses offered more nuance, richer language, cultural sensitivity and more appropriate contextual information. ChatGPT struggled to include meaningful narratives, stories of impact and lived experience. 

The AI tool did a better job writing applications on behalf of larger organisations, which have comparatively more information available in the public sphere. But even then, there were occasional stark inaccuracies interwoven amongst the facts.  

What ChatGPT appears to excel at is writing a great first draft. The AI-generated answers offered good high-level data, a clear structure and flow of information, and they were incredibly fast: on average it took nine minutes to “write” each funding application. ChatGPT’s performance could be improved with practice at crafting the right prompts, so instead of spending time drafting applications, staff time could be spent revising and refining - fact-checking, and adding empathy, storytelling and nuance. 

There are additional use cases for both funders and fund-seekers: the CEO of Whanganui Community Trust is using ChatGPT to summarise funding applications and review policy documents; the team at Hui E! Community Aotearoa are using it to draft accountability reports to funders. Others are using the tools to write blog posts, develop social media strategies and complete desktop research. In the future we may be able to use ChatGPT to complete due diligence reports, or to search for strategically aligned funding partners. 

The benefits are clear - but they come at a cost. Our technological developments may be outpacing our ethics. In March, a group of high-profile technology leaders released an open letter calling for a pause on AI advancements so that we can better understand and mitigate the risks. There are the apocalyptic, Terminator-style scenarios - but there are also more immediate risks: disinformation, privacy violations, and the danger (certainty?) of AI exacerbating societal biases.  

Community and philanthropic organisations need to decide where they stand on the use of AI. We need to understand the implications, ethics and the appropriate application of these tools. We need to have governance level conversations that address transparency, bias, privacy, human oversight, legal compliance, reputational risk, data storage, principles of Māori data sovereignty and a range of other considerations. We need to ensure that the use of AI aligns with our organisational values and serves the broader social good. 

For philanthropy, there are some more specific questions to consider. How do we feel about receiving applications generated by AI? Do we understand our own implicit biases? If organisations use ChatGPT, will we critique them for being lazy or celebrate their efficient use of time and resources? Do we need to know when work is AI generated? What is our tolerance for inaccuracies and plagiarism? 

And if we decide we’re on board the AI waka, how can we use our funding and our privilege to make sure access is equitable? For these tools to live up to their promise of levelling the playing field, we must support AI education, develop and share knowledge, and ensure equitable access to technology and resources for underserved and disadvantaged communities. 

Regardless of our opinions, AI is here, and creator tools like ChatGPT and DALL-E are likely to proliferate and become increasingly sophisticated. Like all technology, AI can be used for good and bad. The onus is on us to help shape the direction of travel.  

Huge thanks to The Gattung Foundation, Perpetual Guardian, The Selwyn Foundation, DV Bryant Trust and the Department of Internal Affairs | Te Tari Taiwhenua for reviewing applications. Thank you to the community organisations that generously lent their funding applications. 
