Charity AI Best Practice – Imagery Ethics

Charity AI best practice standard – a practical guide to ensuring that your use of AI and other imagery is managed in a legal, ethical and safe way.

The advent of AI has raised new issues for charities regarding the safe and ethical use of imagery, but the manipulation and faking of imagery has existed at least since the advent of photography.  Stock photos are not ‘authentic’, cropping can be used to unethically change the context of an image, and photoshopping images to create unrealistic expectations in young people causes harm.  This guidance applies to all aspects of our use of imagery, including video.  It does not cover the use of CCTV, as detailed guidance on this has already been made available by the ICO (link below).

Make Your Voice Heard. This draft guide is one in a series of new Charity AI Best Practice and Safety Standards that will be co-created with others to support charities in adopting AI quickly and safely. It is part of the Charity Excellence AI Strategy Phase 2, titled What Does Good Look Like?  All contributions to improve this guidance would be very welcome and should be sent to ian@charityexcellence.co.uk.

Charity Imagery Ethics - Principles

This charity imagery ethics policy is based on the principles that imagery of any kind should only be created by AI or modified where it:

  • Is essential or adds significant value to the charitable purpose for which the image is being used.
  • Meets legal and regulatory requirements.
  • Is ethical and minimises risk and harm to others.

The guidelines below provide a framework that individual charities can use to create an ethical imagery policy for their charity or other nonprofit.  An Imagery Use Policy can be downloaded by logging in to Charity Excellence and asking the in-system AI bunny to download it for you in Word.

Charity Imagery Ethics - Practice

Management

  • Responsibility.  The Board will retain responsibility for the legal and safe use of any imagery.
  • Oversight.  The Board will ensure that management oversight and procedures appropriate to our needs are in place and consistently applied.
  • Risk.  We will identify and assess the key risks relating to imagery use and take appropriate measures to ensure that risk remains within acceptable limits.

Systems and Procedures

  • Perception of AI. Even if an image meets our requirements below, we will not use it where the fact we have used AI may be perceived negatively. For example, we will:
    • Use AI for routine tasks and issues where its use would be more effective than using an existing 'real' image.
    • Not use AI where our stakeholders may view its use as not appropriate or bad actors may see it as an opportunity to undermine our message.
  • Imagery Sharing.
    • If we share imagery with other organisations, we will put in place a data sharing agreement and ensure this is reflected in our consent process.
  • Data Sets.  Insofar as we are able to ascertain, we will use AI systems and data sets that have strong legal and ethical protections in place, including in how data is obtained and used for training.
  • Large Language Models. LLMs are the models underpinning generative AI systems such as ChatGPT and DALL·E. Charity images may not be shared with any LLM.
  • Take Down. We will respond promptly to any take-down requests received, and we will take down any imagery posted on our social media that we reasonably believe to be offensive to a reasonable number of our stakeholders, a deepfake, misinformation or illegal.
  • AI Imagery. AI imagery, or imagery which has been manipulated by AI, will always be annotated as AI, unless it would be obvious to our stakeholders and any other reasonable person that the image is AI generated, making annotation unnecessary.
  • Deepfakes.  We will not clone anyone’s voice or create a deepfake image or video.

Dignity and Respect for Others

We will portray people in the way they would wish to be portrayed, with respect and dignity.

  • Exploitation. We do not use images that could reasonably be seen as exploitative or dehumanising.
  • Stereotypes. We are mindful to avoid stereotypes and always do our best to represent diverse groups fairly and accurately.
  • Cultural Sensitivity. We do our best to be aware of cultural sensitivities and norms and recognise that what might be acceptable in one culture could be offensive in another.
  • Objectification. We will not use imagery to sexualise or objectify individuals.
  • Accuracy.  We will ensure that anything portrayed in an image, such as medical equipment or cultural dress, is accurate, particularly if using AI, so that we do not cause offence.
  • Recognition.  Where we use imagery that has been created by others, or is in the style of a particular creator and we have their permission to use it, we will aim to provide appropriate recognition.
  • Human Creativity. We respect the importance of human creativity and artistic expression, and we will only use AI to complement and enhance this, not replace it.

Imagery Content

  • Context and Accuracy. We ensure images are used in a context that accurately represents the situation and the message we are using them to convey. We do not use misleading or sensationalised content.
  • Transparency. We are transparent about the source of images and the purpose for which they are being used.

Minimising Potential Harm

  • Harm and Offence. We may use imagery that is provocative or challenging for some.  We will only do so where there is a clear need, there is no other reasonable alternative that would meet this need, and any harm that might reasonably be caused by doing so is acceptable in terms of the charitable benefit of using such imagery.
  • Personal Safety. We will not include personal or other information in imagery data either where we do not have consent or where doing so may pose a risk to an individual.
  • Deepfake Risks.  We will not upload high-quality, high-resolution images in which individuals’ faces can be seen from various angles, and we will take all other reasonable steps to mitigate the risk of imagery being used for deepfakes – Charity AI Best Practice and Safety – Deepfakes.

Legal and Regulatory Compliance  

  • Data Protection Consent. We will ensure that we obtain proper consent for any imagery we use.
  • Retention.
    • Imagery will be kept secure.
    • Imagery will be retained for no longer than necessary for the purposes for which it was collected.
      • A common practice is to retain personal data for up to six years.
    • The consent form must include the period of retention.
  • Legal and Regulatory. We will take reasonable steps to ensure that we do not breach copyright or other IP rights.
    • Where imagery is created by volunteers, we will ensure that our volunteering agreement states that all intellectual property rights, including copyright, are owned by the charity.
    • In the event that imagery IP is retained by another organisation or individual, this will be made clear as part of obtaining consent.
  • Politics and Law. We do not use imagery in a way that portrays support, or which might reasonably appear to portray support for a politician or political party, or which might be reasonably seen to be illegal.

Regulatory Guidance

ICO: Consent.

ICO: Installing CCTV? Things you need to do first.

Gov.UK: Attributing images.  This is guidance for Government but is quite useful.

Gov.UK: Data protection and CCTV.

AI Imagery Resources

NSPCC: Creating a photo and video policy statement.

Harvard Business Review – Generative AI has an Intellectual Property Problem.

Arts Marketing Association - AI Sector Support Example AI Policy.

This Charity AI Imagery Guidance is not Professional Opinion

This charity AI imagery best practice guidance may be used by non-profits but may not be used on a commercial basis without our prior written approval.  Copyright and all other intellectual property rights in this and any derivatives of this document are retained by us to the fullest extent possible in law.

We are neither lawyers nor accountants, so are unable to offer professional advice.  Even if we were, we could not offer advice that would adequately cover all charities or all circumstances.  This draft policy is an example only and is not intended to be adopted as is.  If you have a regulator other than the Charity Commission, such as the Advertising Standards Authority, there may be other requirements that are not included in this example policy.

In using this imagery guidance, you accept that you will take all necessary steps, including seeking professional advice, to ensure the policy you approve fully meets your charity’s needs and complies with all regulatory and legal guidance; that you will keep it under regular review and ensure it remains up to date; and that we have no responsibility whatsoever for any loss or detriment that may arise from using it.  I have included links to regulatory guidance, and you can find pro bono support using the Charity Excellence Help Finder.

Ethics note: AI was used in part in researching this guide.

Charity Excellence Framework CIO

14 Blackmore Gate
Buckland
Buckinghamshire
United Kingdom
HP22 5JT
charity number: 1195568
Copyrights © 2016 - 2024 All Rights Reserved by Alumna Ltd.