ISM-2103 | ASD Information Security Manual (ISM)

Use Organisational Data in AI Models with Consent

Don't use organisational data for AI training without the data owners' consent.


Plain language

This control requires you to get permission from data owners before using your organisation's data to train AI models. If you don't get consent, you could misuse someone's data, leading to trust issues and possibly legal trouble.

Framework

ASD Information Security Manual (ISM)

Control effect

Preventative

Classifications

NC, OS, P, S, TS

ISM last updated

Mar 2026

Control Stack last updated

24 Mar 2026

E8 maturity levels

N/A

Official control statement

Organisational data generated, collected or processed by artificial intelligence applications is not used for training, fine-tuning or improving artificial intelligence models unless informed and explicit consent has been obtained from data owners in advance.

Why it matters

Without proper consent, using organisational data in AI models risks breaching trust and legal standards, leading to reputational damage and potential penalties.


Operational notes

Regularly review consent processes to ensure they remain compliant with evolving privacy laws and maintain organisational trust.


Implementation tips

  • Managers should work with their IT team to identify all the data currently used by AI models. Make a clear list of what data is used, where it comes from, and who owns it.
  • The IT team should set up a consent process for using data in AI models. This might involve creating a consent form that explains what the data will be used for and why it is needed.
  • Procurement should ensure that contracts with AI service providers clearly state the need for data owner consent before using data for training models.
  • Human Resources can help by training staff on the importance of data consent and how to obtain it properly. Provide workshops or online resources explaining the consent process.
  • Legal departments should review data usage policies to ensure they align with the Australian Privacy Principles and other relevant legal requirements for data consent.

Audit / evidence tips

  • Ask for consent records: Ensure there is a formal record of consent for data used in AI models. Look for signatures or formal agreements from data owners. Good records include dates and purposes of consent.
  • Look for clear descriptions of the steps taken to identify data owners. Good documentation shows owner identification is consistent and thorough.
  • Look for records of staff training sessions on the importance of data consent. Ensure the records list attendee names and dates. Good evidence includes regular training with high attendance.
  • Review policy documents: Verify there are policies in place regarding data use in AI training. Check for references to consent and legal compliance. Good policies explicitly state consent requirements and legal guidelines.
  • Ask for contract templates used with AI providers: Check that these templates include clauses on data owner consent. Look for terms that specify when and how consent must be obtained. A good contract makes clear that consent is non-negotiable.
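The record checks above can be partly automated with a small completeness audit that flags consent records missing the evidence auditors look for (owner, purpose, date, signature). This is a sketch under assumptions: the field names and sample records are illustrative, not mandated by the ISM.

```python
def audit_consent_record(record: dict) -> list[str]:
    """Return a list of findings for one consent record; an empty list
    means the record carries the expected evidence. Field names here
    are illustrative placeholders."""
    findings = []
    for field in ("owner", "purpose", "granted_on", "signature"):
        if not record.get(field):
            findings.append(f"missing {field}")
    return findings

# Hypothetical sample records, as might be exported from a consent register.
sample_records = [
    {"owner": "Finance Lead", "purpose": "model fine-tuning",
     "granted_on": "2026-01-15", "signature": "scanned form on file"},
    {"owner": "HR Lead", "purpose": "", "granted_on": "2026-02-01",
     "signature": None},
]
findings = [audit_consent_record(r) for r in sample_records]
```

Running this over an exported register gives a finding list per record, so reviewers can focus on the incomplete entries rather than re-reading every form.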

Cross-framework mappings

How ISM-2103 relates to controls across ISO/IEC 27001, Essential Eight, and ASD ISM.

ISO 27001

Control: ISO 27001 Annex A 5.34
Relationship: Supports
Notes: ISM-2103 requires that organisational data produced or handled by AI applications is not used to train, fine-tune, or improve AI models u...

These mappings show relationships between controls across frameworks. They do not imply full equivalence or certification.
