ISM-2103 ASD Information Security Manual (ISM)

Use Organisational Data in AI Models with Consent

Don't use organisational data for AI training without data owners' consent.

🏛️ Framework

ASD Information Security Manual (ISM)

🧭 Control effect

Preventative

🔐 Classifications

NC, OS, P, S, TS

🗓️ ISM last updated

Mar 2026

✏️ Control Stack last updated

23 Mar 2026

🎯 E8 maturity levels

N/A

Official control statement
Organisational data generated, collected or processed by artificial intelligence applications is not used for training, fine-tuning or improving artificial intelligence models unless informed and explicit consent has been obtained from data owners in advance.

Source: ASD Information Security Manual (ISM)

Plain language

This control requires you to get permission from data owners before using your organisation's data to train AI models. Without consent, you risk misusing someone's data, which can erode trust and create legal exposure.

Why it matters

Without proper consent, using organisational data in AI models risks breaching trust and legal standards, leading to reputational damage and potential penalties.

Operational notes

Regularly review consent processes to ensure they remain compliant with evolving privacy laws and maintain organisational trust.

Implementation tips

  • Managers should work with their IT team to identify all the data currently used by AI models. Make a clear list of what data is used, where it comes from, and who owns it.
  • The IT team should set up a consent process for using data in AI models. This might involve creating a consent form that explains what the data will be used for and why it is needed.
  • Procurement should ensure that contracts with AI service providers clearly state the need for data owner consent before using data for training models.
  • Human Resources can help by training staff on the importance of data consent and how to obtain it properly. Provide workshops or online resources explaining the consent process.
  • Legal departments should review data usage policies to ensure they align with the Australian Privacy Principles and other relevant legal requirements for data consent.
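The consent process the tips above describe can be sketched as a simple gate in a data pipeline: check for a matching consent record before a dataset is allowed into a training run. This is a minimal illustration, not tied to any specific tooling — the `ConsentRecord` structure and its field names are assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ConsentRecord:
    # Hypothetical record structure; field names are illustrative only.
    owner: str        # data owner who granted consent
    dataset: str      # dataset the consent covers
    purpose: str      # stated purpose, e.g. "model fine-tuning"
    granted_on: date  # when informed, explicit consent was obtained

def can_use_for_training(dataset: str, purpose: str,
                         consents: list[ConsentRecord]) -> bool:
    """Allow a dataset into an AI training run only if a consent
    record exists for that exact dataset and purpose."""
    return any(c.dataset == dataset and c.purpose == purpose
               for c in consents)

consents = [
    ConsentRecord("Finance team", "q3-invoices", "model fine-tuning",
                  date(2026, 3, 1)),
]

# Consent on file for this dataset and purpose -> allowed.
print(can_use_for_training("q3-invoices", "model fine-tuning", consents))
# No consent record for this dataset -> blocked.
print(can_use_for_training("hr-records", "model fine-tuning", consents))
```

Keying the check on both dataset and purpose mirrors the control's requirement that consent be informed and explicit: permission given for one use does not carry over to another.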

Audit / evidence tips

  • Ask for consent records: ensure there is a formal record of consent for each dataset used in AI models.

    Look at: signatures or formal agreements from data owners. Good records include the date and purpose of each consent.

  • Ask how data owners were identified.

    Look at: clear descriptions of the steps taken to identify owners. Good documentation shows owner identification is consistent and thorough.

  • Ask for staff training records.

    Look at: records of training sessions on the importance of data consent; ensure the records list names and dates.

    Good: regular sessions with high attendance.

  • Review policy documents: verify there are policies in place governing data use in AI training. Check for references to consent and legal compliance. Good policies explicitly state consent requirements and legal guidelines.
  • Ask for contract templates used with AI providers: check that these templates include clauses on data owner consent.

    Look at: terms that specify when and how consent must be obtained.

    Good: the contract makes clear that consent is non-negotiable.
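Part of the evidence check above — that each consent record carries an owner, a date, and a purpose — could be automated. A sketch, assuming records are kept as dictionaries with `owner`, `date`, and `purpose` keys (the field names are assumptions for illustration):

```python
def audit_consent_records(records):
    """Return a finding for every record missing the evidence auditors
    look for: the data owner, the date, and the purpose of consent."""
    findings = []
    required = ("owner", "date", "purpose")
    for i, rec in enumerate(records):
        missing = [field for field in required if not rec.get(field)]
        if missing:
            findings.append(f"record {i}: missing {', '.join(missing)}")
    return findings

records = [
    {"owner": "Finance team", "date": "2026-03-01", "purpose": "fine-tuning"},
    {"owner": "HR", "date": "", "purpose": "model training"},  # no date
]
print(audit_consent_records(records))  # flags the incomplete HR record
```

A script like this only flags structurally incomplete records; an auditor still has to verify that the consent itself was informed and given by the right person.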

Cross-framework mappings

How ISM-2103 relates to controls across ISO/IEC 27001, Essential Eight, and ASD ISM.

These mappings show relationships between controls across frameworks. They do not imply full equivalence or certification.

ISO 27001

Supports (1)

  • Annex A 5.34 — ISM-2103 requires that organisational data produced or handled by AI applications is not used to train, fine-tune, or improve AI models unless informed and explicit consent has been obtained from data owners in advance.
