We are publishing the strategic research we are using in preparation for our next generation AI to help inform sector thinking and create debate. Suggestions and critique are very welcome and should be sent to ian@charityexcellence.co.uk.
Part 3 covers the strategic factors that underpin the above, for anyone who wants more detail.
Nobody knows what will happen, but how well we respond is up to us.
There is now general agreement that:
AI will fundamentally change our world, and will do so very quickly.
For a desperately hard-pressed and hugely underfunded sector,
it offers charities the opportunity to increase impact, reduce workload and improve fundraising effectiveness.
It could do that by deepening our engagement with technology to boost productivity, augmenting our capabilities and reducing the admin burden. However:
AI comes with numerous, very real and growing risks, and
we are, as yet, very ill prepared.
We believe that, by acting collectively, quickly and well, the sector could deliver measurable impact in tackling the crisis. Equally, not doing so would expose us to substantial risk of further harm.
Sector bodies must act collaboratively, quickly and well to enable us all to exploit the opportunities and adequately mitigate the risks.
Our Benchmarking Survey found that the use of AI in charities has surged, from 35% last year to 60%. There was little evidence of charitable grant makers investing in AI yet. It also found that charities were using a wide range of AI systems, with the most popular reflecting the wider population: ChatGPT (57%), followed by Copilot (23%) and Gemini (14%). Worryingly, on average, half of charities are extremely unprepared in each of the 9 key areas, with only 1 in 20 extremely well prepared. More than half were extremely unprepared to manage AI risk, including in the key areas of cyber security and data protection. By far, what charities most wanted was AI to support their fundraising and, to a lesser extent, to reduce the admin burden, including using text creation AI. What they wanted was help to integrate AI into their ways of working, manage data and understand what the future will look like. This article was written in response to that.
We see a fork in the road approaching in 2025
...with medium to large charities moving to AI that offers integration, and very small charities, who need it less, can't afford paid systems and lack IT skills, sticking with stand-alone systems such as ChatGPT. But charities remain woefully unprepared either to exploit the benefits or to manage the risks. What the sector really needs is 'industry standards' for charity AI, with bodies coming together at sector level to create and promote best practice in charity AI. We think the likelihood of that happening in the near term is very slim, so we will continue developing these standards ourselves and making them available to everyone.
We think the use of AI will be commonplace by the end of 2025 for businesses and society, and that the charity sector will reflect this too.
We see AI being used pretty much everywhere, from stand-alone AI systems to enterprise IT, apps, phones, browsers and social media.
We think increasing numbers of charities will be building their own AI systems and that the very large charities will be rolling out powerful AI applications.
We think the most widespread deployment of charity AI will be for fundraising.
In 2021, we first raised the risk of digital moats: very large charities using their significant resources and capabilities to exploit advanced tech to race ahead of the sector and become super dominant in fundraising. Whether or not this materialises should begin to become clearer by 2025.
We do not see the full benefit of AI being realised within the sector for 2 to 5 years,
as this will be predicated not just on implementing the tech but also on changing ways of working, and that will take much longer. Done well, it will deepen our relationship with technology to boost productivity, reduce admin, augment capabilities and improve job satisfaction, but delivering that is by no means certain. The strategic deployment of AI to create a single sector gateway has the potential to significantly drive efficiencies, and the use of AI (including predictive analytics) will substantially improve sector level decision making in areas such as policy, resource and training provision. Equally, AI brings significant risks, particularly around cyber security and data protection. Our 2024 sector benchmarking survey showed that (in mid 2024) we are woefully unprepared for this, and AI will increase both the volume and sophistication of scams and cyber attacks.
There will be a significant and growing risk of major data/security breaches and harm to charities and their beneficiaries.
AI will massively change our world, but whether it's an opportunity or a threat depends largely on what you and your charity do about it, and what we do now will set our course for the future. Our aim is to support charities in making sure that it's an opportunity for everyone. Join us on the journey by registering now and become one of the tens of thousands of non-profits that make up the UK’s largest and fastest growing charity community.
Scenario planning can be used when there are too many factors to be able to predict an outcome. The cases below are not what we think will happen but what potentially could happen. For all of the factors to come together in one way is extremely unlikely. For either scenario to actually happen is possible but very unlikely, and how good or bad it turns out to be is likely to fall somewhere between the two. The point I am making is that, if we act quickly and well, we can influence where that 'somewhere' is, to make a good outcome even better or a bad outcome not as bad as it would otherwise have been.
For good or bad, we think that by 2025 AI will be in almost every charity, partly driven by ever increasing use by individuals and partly by the fact that it will soon be embedded in almost all enterprise IT (such as CRMs and client databases), websites, apps, phones and social media. There will be a huge surge in AI spam, and in the volume and effectiveness of scams and disinformation, with Big Tech and Governments in an arms race to hold back malign actors and organised crime.
Global. Global cooperation results in effective legislation and partnership with Big Tech to tackle AI abuse and make substantial progress in making AI safe, effective and accessible to everyone, including the global south. Global productivity significantly increases, benefitting everyone.
UK. Sector bodies collaborate and work with regulators to create charity AI best practice, with core elements made regulatory and jointly communicated to everyone. Charities make informed choices about which systems to use and employ these to support, not replace, staff, with appropriate safeguarding and oversight. Scams and data and security breaches do occur but, by and large, are managed, and AI is used well. Charities begin to build on this by developing new ways of working and exploiting data.
Working collaboratively, sector bodies create a recovery and growth plan for all charities, which is accepted by Government. Government recognises the huge damage to Civil Society in recent years and that this is not only damaging to society but also holding back economic growth and driving public sector demand.
Impact. Funding remains very tight but targeted investment is used to exploit AI strategically and further support sector recovery, which is then boosted by the recovering economy. Charity sector impact and productivity increase, as does fundraising effectiveness. By 2026, the sector has not only fully recovered and become resilient again but is also growing, and is even able to provide growing support to our still underfunded public services. CRUK, BHF and others use the advances in AI in medicine to deliver stunning successes in the fight against disease.
Global. Collaboration melts away, to be replaced by competition between individual countries and Big Tech, each wanting to be the winner. In the absence of coordinated regulation and effective push back, AI spam, scams and misinformation rocket, damaging economies and, often, targeting the most vulnerable. The already significant AI gap between the advanced economies and the global south grows, driving increased inequality.
UK. Government gives in to Departmental demands for shiny new toys and to lobbying by commerce, and its light touch regulation of AI fails. Within a few years, the first major public scandals occur and productivity gains fall far short of what they could have been. The public sector funding black hole predicted by the Resolution Foundation materialises and Government departments repeat their mistake of 2023 by slashing charity grants and contracts. The sheer volume and sophistication of AI spam, scams and misinformation increasingly harm public confidence and largely drown out charities' attempts to engage their communities and fundraise online. Economic growth slows, exacerbating already falling income.
Charity Sector. AI use accelerates as predicted but without effective oversight or controls, with many charities not adapting to AI, and others failing both to realise the benefits and to deal with the risks. Many buy the wrong AI and/or install and integrate it badly. We throw the baby out with the bathwater by using AI to replace help lines and junior staff, rather than to augment their abilities; quality of service and morale fall. A string of serious security and data breaches further damages public confidence. Charity regulators respond with draconian regulation but, desperately hard pressed and underfunded, charities struggle to reverse the now intractable problems and the scandals continue. The super large charities deploy extremely sophisticated fundraising AI that uses social engineering to deliver hugely compelling messages tailored to individuals. Medium and large charities are increasingly outcompeted by them.
CRUK has spent an extra £24m, reflecting an investment in supporter-focused digital transformation, which will lead to future income growth. (Civil Society, 13 Sep 24)
Impact. The combined factors above drive rapidly growing numbers of charities to insolvency. Those that remain cut services to survive and struggle to retain staff, who increasingly leave the sector for less stressful, better paid and more secure jobs. Public confidence falls even further and, with it, donations and volunteering numbers. Civil society, not least the most vulnerable, is increasingly harmed, and lost charity capacity to support public services increasingly transfers work to the public sector, but at far higher unit cost, impacting the already badly overstretched services.
A huge public outcry finally compels sector bodies, regulators and Government to act, but the damage has been done. A small but influential group of MPs drives through a 'charity begins at home' agenda to divert IDO funding to charities providing public services. The Lottery launches an emergency strategy that does basically the same, diverting funding from 'lower priority' areas to public sector priorities, such as social care, health and education.
Longer term, and taking a lesson from history, we think the immediate changes are probably over-hyped and the long-term changes under-estimated or, at best, as yet unknown. However, we think there are emerging strategic areas and issues within these.
Hybrid Working. We think that AI has real power to deepen our relationship with technology rather than replace us: augmenting our capabilities to enable us to achieve more, more quickly, and reducing the admin burden to ease pressure on people and make jobs more satisfying. No doubt some will use it simply to monitor and control, but we think that will cause problems and not achieve its real potential. Some jobs, such as repetitive data entry, will go, more likely in the private sector; new roles will be created and the vast majority will change, many fundamentally in due course. However, achieving the full benefits isn't just about the tech but also about how we change our ways of working to exploit it, and that is likely to take a lot longer than just implementing a new AI system. Without a sector-wide approach, there is a risk that too many will not achieve this.
Customisation. We think the large charities will build their own AI, although whether that creates the digital moats resulting in fundraising super dominance remains to be seen. More generally, across society, business and charities, we think the creation of custom AI for organisations and the personalisation of AI for its users will become widespread. We think there are real opportunities here to support small charities far better, and that the potentially much greater accessibility offered can be used to better support small and marginalised groups. We are designing the first next generation AI for charities and beginning to move towards realising our concept of an AI concierge service to act as a single gateway to the sector for everyone, delivering holistic impact.
Holistic Impact. In our view, there is no shortage of charity sector resources, but too many can't find what they need, or don't even know it exists. Charity Excellence was built in response to that, and we think that our concept of 'holistic impact' offers significant opportunity to improve both efficiency and effectiveness. The sector comprises 0.5m UK non-profits, many tiny, with duplication and overlap, all individually promoting themselves and often competing. The result is that many beneficiaries and charities simply cannot find and connect to the support they need. We have trained our AI bots and will migrate our remaining non-AI systems to create a single concierge service acting as a clearing house that sends both charities and individuals to the help they need.
Trust. The risks of AI are well known and accepted, the tech is far ahead of the law and regulation, and its development is accelerating. We will all soon be using it and be subject to decisions made by it, so there is good reason to be wary. Moreover, as the tide of AI generated spam, scams and disinformation grows, people will inevitably be less and less trusting of what they see and read online, of how organisations are using their data and of whether or not they can trust them. Trust is fundamental for charities, not only for our beneficiaries to trust us but also in our campaigning and fundraising. Our 2024 AI survey shows that charities are very poorly prepared for managing the risks and, if we get this wrong and there is a whole series of sector scandals, we risk significantly disenfranchising ourselves from our constituencies. Building and maintaining trust will be critical for charities in an AI enabled world. We believe that what we need is charity 'industry standards'.
Big Data. AI is fundamentally about data. We think there is a real need to create sector best practice guidance to ensure data is complete, clean, has consent and is compliant, and on how that data is integrated and used with AI to ensure it is ethical and not biased. We have produced initial advice on cyber security and managing data, but we need much more than this, and everyone needs to make the changes. Moreover, it is now widely accepted that sector data in general is very poor and that this badly impacts our ability to make decisions at sector level. A major part of the problem is that individual organisations collect different data in different ways, using different definitions. For AI to work effectively at sector level, we need a sector data lake, so we're growing our Big Data set to create one. It offers significant opportunity to improve the effectiveness of sector level decision making and far more efficient use of limited resources in grant making, creating policy, training programmes, webinars and so on. However, it is likely there will be resistance to using it, with some continuing to use surveys because they want 'their' data, making sector level AI use less effective. Moreover, it's probably not unreasonable to say that everybody wants data but nobody wants data driven decision making. Without data, leaders have considerable latitude to make whatever decisions they wish to. Whilst data driven decision making significantly improves effectiveness, it also constrains leaders' scope to do what they want to.
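To make 'complete, clean, has consent and is compliant' a little more concrete, the sketch below shows the kind of simple record checks a charity might run before using its data with AI or contributing it to a shared data set. It is purely illustrative: the field names and rules (name, email, consent, retention_expiry) are hypothetical assumptions, not a sector standard or a description of our own systems.

```python
# Hypothetical sketch only: basic data-quality checks before using records with AI.
# The field names and rules below are illustrative assumptions, not a sector standard.
from datetime import date

REQUIRED_FIELDS = ["name", "email", "consent", "retention_expiry"]

def check_record(record: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    problems = []
    # Complete: every required field is present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append(f"missing {field}")
    # Clean: a minimal format check on the email address.
    email = record.get("email", "")
    if email and "@" not in email:
        problems.append("malformed email")
    # Consent: only use records where consent was explicitly recorded.
    if record.get("consent") is not True:
        problems.append("no recorded consent")
    # Compliant: respect the retention period before using the record.
    expiry = record.get("retention_expiry")
    if isinstance(expiry, date) and expiry < date.today():
        problems.append("retention period expired")
    return problems

if __name__ == "__main__":
    sample = {"name": "A. Donor", "email": "donor@example.org",
              "consent": True, "retention_expiry": date(2026, 1, 1)}
    print(check_record(sample))  # an empty list means the record passed all checks
```

In practice, the rules would need to reflect each organisation's own data protection policies, retention schedules and consent records, which is exactly why shared best practice guidance matters.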
A registered charity ourselves, the CEF works for any non-profit, not just charities.
Plus, 100+ downloadable funder lists, 40+ policies, 8 online health checks and the huge resource base.
Quick, simple and very effective.
Find Funding, Free Help & Resources - Everything Is Free.
To access help and resources on anything to do with running a charity, including funding, click the AI Bunny icon in the bottom right of your screen and ask it short questions, including key words. Register, then log in, and the in-system AI Bunny can also write funding bids and download 40+ charity policy templates.
This Article Is Not Professional Advice
I launched Charity Excellence in 2018 and have been designing and building charity AI since 2022. This article is based on significant research and analysis, but nobody knows what will happen, not least myself. It represents my own views, is for general interest only and does not constitute professional advice. I'm neither a lawyer nor an accountant, so I am not able to provide this, and I cannot write guidance that covers every charity or eventuality. In using this resource, you accept that I have no responsibility whatsoever for any harm, loss or other detriment that may arise from your use of my work. If you need professional advice, you must seek this from someone else. To do so, register, then log in and use the Help Finder directory to find pro bono support. Everything is free.