The Australian AI privacy landscape
Australia doesn't have dedicated AI legislation yet. But that doesn't mean AI is unregulated. The Privacy Act 1988 and the Australian Privacy Principles (APPs) apply to any system that handles personal information — including AI systems.
The OAIC (Office of the Australian Information Commissioner) has made it clear: existing privacy obligations apply to AI. If your system collects, uses, or stores personal data, you need to comply.
The Privacy Act 1988
The Privacy Act covers organisations with annual turnover above $3 million (and some smaller organisations in specific sectors). Key principles relevant to AI:
- Collection limitation: Only collect personal information that is reasonably necessary for your function.
- Purpose limitation: Use personal information only for the purpose it was collected for.
- Data quality: Take reasonable steps to ensure personal information is accurate and up-to-date.
- Security: Protect personal information from misuse, interference, and loss.
- Cross-border disclosure: If you send data overseas, you remain accountable for its handling.
Australian Privacy Principles (APPs)
Of the 13 APPs, these matter most for AI projects:
- APP 3 (Collection): Don't feed personal data into an AI system unless you have a legitimate reason and have told people.
- APP 5 (Notification): Tell individuals how their data will be used — including by AI systems.
- APP 6 (Use/Disclosure): Don't use data for AI training or inference beyond what you told people it was for.
- APP 8 (Cross-border): If your AI provider processes data overseas, you're responsible for ensuring equivalent protections.
- APP 11 (Security): Implement appropriate technical and organisational safeguards for any data processed by AI.
Cross-border trap: Using OpenAI's API, Google Gemini, or other US-hosted services means your data likely crosses borders. Under APP 8, you remain accountable for how that data is handled overseas.
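One lightweight safeguard against accidental offshore disclosure is a pre-flight check that refuses to send personal data to any endpoint outside an approved allowlist. A minimal sketch, assuming an allowlist keyed on hostname (the host list is illustrative; real residency depends on the provider's contract and architecture, not just the URL — verify with your vendor):

```python
from urllib.parse import urlparse

# Illustrative allowlist: API hosts believed to process data in Australia.
# This is an assumption for the sketch, not a compliance determination.
AU_HOSTS = {
    "bedrock-runtime.ap-southeast-2.amazonaws.com",
}

def check_residency(endpoint_url: str) -> None:
    """Raise before any personal data leaves for a non-approved host."""
    host = urlparse(endpoint_url).hostname
    if host not in AU_HOSTS:
        raise ValueError(
            f"{host} is not on the Australian-residency allowlist; "
            "sending personal data here may trigger APP 8 obligations."
        )

# An onshore endpoint passes silently:
check_residency("https://bedrock-runtime.ap-southeast-2.amazonaws.com/invoke")

# A US-hosted endpoint is blocked before data leaves:
try:
    check_residency("https://api.openai.com/v1/chat/completions")
except ValueError as err:
    print("blocked:", err)
```

Wiring a check like this into the one code path that calls your AI provider is cheap insurance: it turns a policy ("data stays in Australia") into something the deployment actually enforces.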
AI-specific considerations
- Training data: If you fine-tune models on personal data, that data becomes embedded in model weights — effectively impossible to delete, which complicates correction and erasure obligations.
- RAG systems: Data stays in your vector database (not in the model), making deletion and access requests easier to handle.
- Automated decision-making: If AI makes decisions that significantly affect individuals, you may need to provide explanations and human review.
- Employee monitoring: Using AI to monitor staff (productivity tracking, sentiment analysis) raises additional privacy concerns under workplace laws.
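The contrast between fine-tuning and RAG above is worth making concrete: because RAG keeps personal data in a store you control, an access or erasure request is an ordinary query or delete. A minimal sketch of a vector store with per-record subject tags (the store and field names are illustrative assumptions, not a specific product's API):

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    doc_id: str
    subject_id: str          # the individual the data relates to
    text: str
    embedding: list[float]   # produced by your embedding model

@dataclass
class VectorStore:
    records: list[Record] = field(default_factory=list)

    def add(self, record: Record) -> None:
        self.records.append(record)

    def export_subject(self, subject_id: str) -> list[str]:
        """Handle an access request: return everything held about one person."""
        return [r.text for r in self.records if r.subject_id == subject_id]

    def erase_subject(self, subject_id: str) -> int:
        """Handle an erasure request: drop every record about one person."""
        before = len(self.records)
        self.records = [r for r in self.records if r.subject_id != subject_id]
        return before - len(self.records)

store = VectorStore()
store.add(Record("d1", "cust-42", "Jane's support ticket", [0.1, 0.2]))
store.add(Record("d2", "cust-99", "Unrelated document", [0.3, 0.4]))

print(store.export_subject("cust-42"))  # access request -> ["Jane's support ticket"]
print(store.erase_subject("cust-42"))   # erasure request -> 1 record removed
```

The design point is the `subject_id` tag: if you record whose data each chunk relates to at ingestion time, individual rights requests stay tractable; retrofitting that mapping later is far harder.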
Practical steps for compliance
- Data mapping: Document what personal data flows through your AI system — collection, processing, storage, deletion.
- Privacy impact assessment: Conduct a PIA before deploying any AI system that handles personal data.
- Data minimisation: Only include the personal data your AI actually needs. Anonymise or pseudonymise where possible.
- Residency: Keep data in Australia where possible. On AWS, use the ap-southeast-2 (Sydney) region; Amazon Bedrock processes prompts in-region and does not use your inputs to train the underlying models.
- Access controls: Implement document-level permissions so users only see data they're authorised to access.
- Vendor assessment: Review your AI vendor's data handling practices. Check where data is processed and stored.
- Update your privacy policy: Disclose your use of AI and how it processes personal information.
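The data-minimisation step can be enforced at the boundary: pseudonymise direct identifiers before a record ever reaches the AI service, and keep the key onshore. A minimal sketch using a keyed hash (the field list and key handling are illustrative assumptions; real PII detection needs more than a fixed set of field names):

```python
import hashlib
import hmac

# Direct identifiers to pseudonymise before any record reaches an AI API.
# Illustrative list only -- extend for your actual schema.
PII_FIELDS = {"name", "email", "phone"}

# Keep this key in a secrets manager, never alongside the pseudonymised data;
# whoever holds the key can link tokens back to individuals.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymise(record: dict) -> dict:
    """Replace direct identifiers with stable keyed-hash tokens."""
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[key] = "pid_" + digest.hexdigest()[:12]
        else:
            out[key] = value
    return out

record = {"name": "Jane Citizen", "email": "jane@example.com", "issue": "billing"}
safe = pseudonymise(record)
# 'issue' survives for the model; 'name' and 'email' become opaque tokens.
```

Because the keyed hash is deterministic, the same person always maps to the same token, so the AI system can still correlate records without ever seeing the underlying identity.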
Common pitfalls
- Using free-tier ChatGPT with customer data (data may be used for training)
- Assuming "cloud AI" means data stays private (it depends on the provider and plan)
- Not conducting a privacy impact assessment before deployment
- Feeding personal data into AI "for testing" without proper safeguards
- Forgetting that AI outputs can contain personal information from the input data
Key takeaways
- The Privacy Act 1988 and APPs apply to AI just like any other data processing system.
- Key areas: consent for data collection, purpose limitation, data security, and cross-border transfer.
- Using cloud AI services may constitute a cross-border data transfer — check your provider's data residency.
- Keep data in Australia (AWS Sydney), minimise what you collect, and document your AI data flows.