Charity AI Sector Best Practice and Regulation

A practical guide to the AI challenges facing the charity sector and how we might develop the best practice and regulations we will need

Charity AI Regulation

AI will change our world far more fundamentally, and far more quickly, than previous technology leaps.  The technology is already well ahead of the law and regulatory framework, is developing extremely quickly, and no one disputes the very real and growing risks it brings: not only for charities themselves but, for many, for their vulnerable beneficiaries too, as well as for wider civil society.  This thought piece outlines current AI regulation, the specific challenges for charities and how the sector might best respond to maintain public trust and keep everyone safe.  Suggestions and critique are very welcome and should be sent to ian@charityexcellence.co.uk.

Current AI Regulation

Have a look at EY's The Artificial Intelligence (AI) global regulatory landscape (Policy trends and considerations to build confidence in AI).  I'm sure you'd find it a riveting read, but you're busy, so here's my much cruder layman's understanding, as at June 2024.

  • United States.  The US is currently developing its own AI guidelines, with major legislation relating to AI regulation in force or coming into force, including rules on recommendation algorithms, synthetically generated content and generative AI.
  • Europe.  The EU has agreed rules covering AI systems such as ChatGPT and facial recognition. The European Parliament approved the AI Act in early 2024, with its provisions expected to take effect in stages from 2025.

The UK Government's AI white paper appears to see AI primarily as an economic opportunity, with a potentially 'light touch' approach to regulation.  The ICO issued updated AI guidance in 2023.  You can find both, and a lot more from Government, at the bottom of this article.  However, I'm not aware of any AI regulatory guidance from the charity sector regulators, as at mid-June 2024.

The Need for Charity AI 'Industry Standards'

The Challenge.  It's difficult to issue charity sector regulatory guidance in the absence of wider legislation or guidance from Government and, to be honest, I'm not sure anyone really understands what to regulate, or how to regulate AI effectively, anyway.  Besides, AI development and capability are accelerating so quickly that, by the time guidance is issued, it may well be out of date or there will be new issues it does not cover.

However, the risks are very real and growing rapidly, charities are using AI right now, and our AI Benchmarking Survey June 2024 found the sector to be hugely unprepared in each of the 9 key areas surveyed (see the table below).  Half of charities rated how well prepared they are as 1 out of 10, and only 1 in 10 said they are very well prepared.

Key Area                           % rating 1/10   % rating 10/10   Charity Excellence Resources
Policies & procedures              60%             5%               Charity AI Policy & Downloadable Policies
Managing AI risk                   58%             5%               Charity AI Risk Register & Threat Toolkit
Ethical & responsible guidelines   57%             4%               Charity AI Ethics & Governance Framework
Trustee oversight                  48%             6%               Charity AI Ethics & Governance Framework
Data protection                    43%             7%               Charity Data Protection
Cyber security                     42%             7%               Charity AI: Cyber Security
Choosing the right AI              39%             5%               Best Charity AI and the AI Design Toolkit
Skills and experience              34%             4%               AI & Charity Sector Jobs
Changes in ways of working         34%             4%               AI & Charity Sector Jobs

Moreover, this is not simply about regulating AI but also about what changes we must make to our existing policies and procedures in response to the impact it will have.  Issues such as maintaining public trust, ethical fundraising and safeguarding beneficiaries, staff and volunteers won't change, but the scale of risk, and what needs to be done about it and how, will be different, possibly very different.  This is not just about how we use ChatGPT but a recognition that AI will be embedded in virtually every enterprise IT system, website, app, browser and social media platform.  It will impact every area of charity operations, bringing new challenges ranging from how we ensure that charity 'data lakes' are cyber secure and data protection compliant to how we use AI-generated imagery.  These issues are here now.

Responding to That.  The approach we have taken is to use our expertise in building charity AI systems, and our research, to create flexible toolkits: living documents that work for any non-profit and are updated on an ongoing basis.  There is no 'correct answer', but the risks are real, substantial, here now, will grow significantly and cannot be ignored.  The best we can do is to use the best information we have to make the best decisions we are able to.  Developing and promoting that on a sector-wide basis, through collaboration, would unquestionably be the best outcome, but may not happen.

Delivering a Solution.  Here is one way in which we might create a solution.

  • Best Practice.  The sector collaborates to create best practice guidance, or adopts or amends ours, as we had originally planned, and collectively promotes that to all UK non-profits.  That would ensure the sector not only provides the best advice we are able to but, as importantly, gets it to everyone.
  • Regulation.  As our collective understanding of 'what good looks like' grew, charity regulators would be able to extract from it the 'must do' and 'should do' elements and frame these to fit within any wider AI and other legislation or guidance issued by Government, forming robust and effective charity AI regulatory guidance.

We will continue to seek to encourage debate and to take this forward, either in collaboration with others or, if we have to, on our own.

Wider Regulation - The Need to Campaign

The Challenge.  Nobody denies the huge risks AI will bring, including bias, scams and disinformation.  The challenge is that Government is desperate to promote economic growth, organisations always seem to suffer from 'shiny new toy' syndrome, and those building AI want to maximise the potentially eye-watering profits to be had.  Imposing a greater regulatory burden would have to be done well, but would still delay AI roll-out, increase costs and reduce profits and, as everyone knows, if they don't do it, someone else will.  With Big Tech pouring billions into development, there is huge pressure to capitalise on the opportunity AI offers by moving as fast as possible.  Expect lots of lobbying, and the risk of moral outsourcing materialising.  To paraphrase Rumman Chowdhury, who coined the term:

Moral outsourcing is blaming the machine: applying the logic of sentience and choice to AI and, in doing so, allowing those creating AI to effectively reallocate responsibility for the products they build onto the products themselves, rather than taking responsibility.

The issues with AI arise from the data sets we choose, how we train models on them and how we use the AI itself.  We need to own the problem.
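To make that point concrete, here is a minimal, hypothetical sketch in Python. The data, group names and the train/predict helpers are all invented for illustration; it is not how any real charity system works. It simply shows that a 'model' which learns historical approval rates per group will repeat whatever imbalance sits in the data we chose, which is why the responsibility lies with the people choosing and checking that data, not with the machine.

```python
from collections import defaultdict

# Hypothetical historical decisions (group, approved). The imbalance
# sits in the data we chose, not in the algorithm that follows.
training_data = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def train(data):
    """Learn the historical approval rate seen for each group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approvals, total]
    for group, approved in data:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {group: approvals / total for group, (approvals, total) in counts.items()}

def predict(model, group):
    """'Approve' whenever the learned rate for the group exceeds 50%."""
    return model.get(group, 0.0) > 0.5

model = train(training_data)
print(model)                      # {'group_a': 0.75, 'group_b': 0.25}
print(predict(model, "group_a"))  # True  - the historical skew is simply repeated
print(predict(model, "group_b"))  # False - and so is the disadvantage
```

Real AI systems are vastly more complex, but the principle is the same: the data set, the training choices and the way the output is used are all human decisions, and owning those decisions is what owning the problem means.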

Why That Matters.  Cutting corners would result in society more widely, and marginalised groups in particular, bearing the cost.  Moreover, there are already well-known problems around bias and discrimination and, as the tide of AI-generated spam, scams and disinformation surges, we will all suffer in the absence of robust regulation, but the most vulnerable will, as usual, suffer the most.

What Should be Done.  There is nothing wrong with public policy that is pro AI innovation, but it must include safeguards around social equity and consumer protection.  The sector should seek to:

  • Engage policy makers to promote this, because there will almost certainly be very well-funded lobbying and, quite possibly, vested interests within the public sector pushing for fewer restrictions.
  • Campaign to ensure we and our beneficiaries know how to keep ourselves safe and well informed in an AI-enabled world.
  • Decide what we will do to maintain the public trust in us that is critical to our work and to raising funds.  Ensuring that we use AI well and safely should be an urgent first step in doing that.

Work By Others

Professors Longoni and Capraro published an interesting piece in The Conversation in June 2024, which is well worth reading.  Their suggestions for effective regulation include equitable tax structures, empowering workers, controlling consumer information, supporting human-complementary AI research, and implementing robust measures against AI-generated misinformation.

AI Resources - UK Government & Regulatory

This Article Is Not Professional Advice

I have been researching, designing and building AI since 2022, but I am a man of meagre talents and, importantly, neither a lawyer nor a public policy expert.  This article represents my own views, is for general interest only and does not constitute professional advice.  I am not able to provide legal or accounting advice, and I cannot write guidance that covers every charity or eventuality.  I have included links to relevant regulatory guidance, which you must check to ensure that whatever you create correctly reflects your charity's needs and your obligations.  In using this resource, you accept that I have no responsibility whatsoever for any harm, loss or other detriment that may arise from your use of my work.  If you need professional advice, you must seek this from someone else.  To do so, register, then login and use the Help Finder directory to find pro bono support.  Everything is free.
