By John Pane
The Privacy and Other Legislation Amendment Act 2024 (POLA Act) represented the first tranche of the federal government’s phased approach to implementing the proposed changes to the Privacy Act 1988 (Cth) (Privacy Act) set out in the Government’s response to the Privacy Act Review Report.
One of the deceptively ‘simple’ changes made to the Privacy Act appears in Part 15 of Schedule 1 to the POLA Act, which introduced new transparency requirements for Automated Decision-Making (ADM) undertaken by APP entities and government agencies. These reforms, which come into effect from 10 December 2026, specifically:
target how organisations use algorithms, AI and computer programs to make decisions that impact individuals;
seek to prevent ‘black box’ decision-making by forcing organisations to be open about when and how they use these technologies; and
require organisations to uplift their privacy policies to reflect these changes.
What does Part 15 of the POLA Act say?
Part 15 of the POLA Act states organisational privacy policies must be amended for greater transparency under APP 1 where:
the organisation uses a ‘computer program’ to either:
perform decision-making functions in a fully automated manner (without a human decision-maker); or
substantially and directly assist human staff to make decisions;
the computer program uses personal information about that individual to perform its function (i.e. to either make the decision, or to assist the human decision-maker); and
the decision could ‘reasonably be expected to significantly affect the rights or interests of an individual’.
Defining the scope: What is a ‘computer program’?
The term ‘computer program’ is not restricted to emerging technologies like Generative AI or Large Language Models (LLMs). Instead, the POLA 2024 Bill Explanatory Memorandum clarifies that it encompasses a wide spectrum of automation, ranging from sophisticated machine learning to basic rule-based logic.
While these reforms have obvious implications for emerging technologies such as autonomous AI agents, they also have the potential to capture a broad range of simpler automation use cases that are already widely used, such as:
software that assesses input data against pre-defined objective criteria and then applies business rules based on those criteria (e.g. whether to approve or reject an application);
software that processes data to generate evaluative ratings or scorecards, which are then used by human decision makers (e.g. predictive analytics); and
robotic process automation (which uses software to replace human operators for simple and repetitive rule-based tasks, such as data entry, data extraction and form filling).
If a tool (such as predictive analytics, algorithmic sorting or legacy macros) leverages personal information to either make a final determination or provide ‘substantial and direct assistance’ to a human decision-maker, it may fall within the scope of Part 15.
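To make the breadth of this definition concrete, the following is a minimal, hypothetical sketch (in Python) of the kind of simple rule-based program that could fall within scope: it uses personal information as inputs and either makes, or substantially assists, a decision with potentially significant consequences for the individual. The field names, thresholds and rules are illustrative assumptions only and are not drawn from the legislation.

```python
# Illustrative only: a hypothetical rule-based eligibility check.
# No machine learning is involved, yet the program uses personal information
# and its output could significantly affect the individual, so it may still
# fall within the scope of Part 15.

from dataclasses import dataclass


@dataclass
class Applicant:
    # Personal information inputs (hypothetical fields)
    age: int
    annual_income: float
    existing_defaults: int


def assess_application(applicant: Applicant) -> str:
    """Apply pre-defined business rules to personal information and return a decision."""
    if applicant.age < 18:
        return "reject"            # eligibility rule
    if applicant.existing_defaults > 2:
        return "reject"            # credit-risk rule
    if applicant.annual_income < 30_000:
        return "refer_to_human"    # borderline cases routed to a human decision-maker
    return "approve"               # fully automated approval, no human involved


print(assess_application(Applicant(age=34, annual_income=25_000, existing_defaults=0)))
# -> refer_to_human
```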
The materiality threshold: Assessing ‘significant effect’
Not every automated process requires disclosure in the organisational privacy policy. The obligation is triggered when a decision could ‘reasonably be expected to significantly affect the rights or interests of an individual.’
The POLA Act emphasises that the impact must be more than trivial and have the potential to significantly influence an individual’s circumstances. Management must conduct case-by-case assessments to determine if their use cases meet this threshold. The legislation provides a non-exhaustive list of high-impact domains:
Legal and Statutory Benefits: Decisions regarding the granting or revocation of benefits under law.
Contractual Rights: Decisions affecting insurance policies, credit facilities, or service agreements.
Access to Essential Services: Impacting an individual’s ability to access healthcare, education, or significant social supports.
For organisations operating in regulated sectors – such as financial services, healthcare, and law enforcement – the majority of automated processes are likely to meet this threshold. However, general corporate functions, such as automated fraud detection or facial recognition for security, must also be evaluated under the same materiality lens.
Mandatory updates to privacy policies
The most immediate operational requirement arising from the ADM reforms is ensuring your APP Privacy Policy is ready for the new APP 1.7-1.9 obligations, which commence on 10 December 2026. From that date, an APP entity must include additional information in its privacy policy if it has arranged for a computer program to use personal information to make (or substantially and directly support) decisions that could reasonably be expected to significantly affect an individual’s rights or interests.
1. What must be included in the privacy policy
Where the threshold is met, the privacy policy must describe the kinds of:
Data Inputs: What categories of personal information are fed into the relevant computer programs?
Fully Automated Decisions: Which processes are handled entirely by technology without human intervention?
Assisted Decision-Making: Which processes involve human-in-the-loop models where a computer program provides the primary evaluative data?
2. Practical drafting tip
The framing adopted by the Office of the Australian Information Commissioner (OAIC) in its APP 1 Guidelines is deliberately ‘kinds of’ rather than ‘every single model or rule’. For many organisations, the cleanest approach is to add an ‘Automated decisions’ (or similar) subsection in the privacy policy that:
lists the key decision areas that meet the ‘significant effect’ threshold (e.g. onboarding/eligibility, credit/insurance decisions, fraud outcomes that affect access, service access/termination), and
for each, summarises the kinds of personal information used and whether it is fully automated or substantially and directly assists a human decision-maker.
Failure to update organisational privacy policies carries direct legal, financial and reputational risk. The OAIC has the authority to issue infringement notices for non-compliant policies.
Strengthening internal procedures and controls
Beyond reviewing and uplifting their privacy policies, impacted APP entities (including government agencies) should ensure their internal procedures and controls align with these new obligations. By way of example, this may include the following activities:
1. Automation mapping and inventory
Organisations should conduct a comprehensive audit of their ‘automation footprint.’ This involves identifying all instances where computer programs interact with personal information to influence outcomes. This inventory should distinguish between ‘back-office’ efficiencies (like data entry) and ‘evaluative’ functions (like credit scoring, tenancy applications, talent acquisition screening) that meet the materiality threshold.
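As an illustration only, an automation inventory entry might capture fields along the following lines. This is a minimal Python sketch; the field names and example values are assumptions and are not prescribed by the POLA Act.

```python
# Illustrative only: a hypothetical structure for an automation inventory entry.
# The aim is to capture, per system, the personal information used, the decision
# it influences, and whether it is fully automated or assists a human decision-maker.

from dataclasses import dataclass, field
from typing import List


@dataclass
class AutomationRecord:
    system_name: str
    business_owner: str
    decision_made: str                              # e.g. "approve or reject loan applications"
    personal_information_inputs: List[str] = field(default_factory=list)
    fully_automated: bool = False                   # False = substantially assists a human
    back_office_only: bool = False                  # e.g. data entry with no evaluative function
    significant_effect: bool = False                # outcome of the materiality assessment


inventory = [
    AutomationRecord(
        system_name="CreditDecision v2",
        business_owner="Retail Lending",
        decision_made="approve or reject personal loan applications",
        personal_information_inputs=["income", "repayment history", "employment status"],
        fully_automated=True,
        significant_effect=True,
    ),
]

# Entries that are evaluative and meet the materiality threshold are the ones
# that need to be reflected in the privacy policy under APP 1.7-1.9.
in_scope = [r for r in inventory if r.significant_effect and not r.back_office_only]
```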
2. Risk Impact Assessments
In line with existing practices for conducting a Privacy Impact Assessment (PIA), organisations should incorporate automated decision-making into their analysis. This would evaluate the logic of the program, the quality of the data inputs, and the potential for biased or inaccurate outputs, particularly for vulnerable groups such as children or individuals with disabilities.
3. Human-in-the-loop verification
Where computer programs or applications provide ‘substantial and direct assistance’ to human decision-makers, internal controls must verify that the human involvement is meaningful rather than a ‘rubber-stamping’ exercise. Documenting the level of human oversight is essential for demonstrating compliance with the assisted decision-making disclosure requirements.
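One way to evidence meaningful human involvement is to record, for each assisted decision, what the program recommended, what the human decided, and why. The following is a hypothetical sketch only; the format and fields are assumptions, not requirements of the POLA Act.

```python
# Illustrative only: a hypothetical log of human review of program-assisted decisions,
# kept so the organisation can evidence that oversight is meaningful rather than
# rubber-stamping. Fields and format are assumptions, not prescribed by the POLA Act.

import csv
from datetime import datetime, timezone


def log_human_review(path, case_id, program_recommendation, human_decision, reasons):
    """Append one row documenting the human decision-maker's review of the program output."""
    overridden = program_recommendation != human_decision
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),
            case_id,
            program_recommendation,
            human_decision,
            overridden,   # an override rate of exactly zero may warrant closer scrutiny
            reasons,      # free-text rationale recorded by the reviewer
        ])


log_human_review("review_log.csv", "CASE-1042", "reject", "approve",
                 "income verified manually; program relied on stale data")
```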
Navigating ambiguities and future privacy reforms
While the POLA Act provides a clearer transparency framework, certain areas remain subject to interpretation. The ‘second tranche’ of Privacy Act reforms may further impact these more recent changes. For instance, the current exemption for employee records and the evolving definition of ‘personal information’ (regarding metadata, de-identified data or biometric templates) may impact how ADM rules apply to internal workplace monitoring.
Furthermore, while the POLA Act does not currently mandate changes to APP 5 Collection Notices, the existing requirements of APP 5.2 may already necessitate disclosures if personal information is shared with third-party AI platforms or automation vendors. Management therefore should take a holistic view of their privacy compliance obligations across all organisational touchpoints of the personal information lifecycle.
IIS can help
IIS can help you achieve compliance and best practice in the ADM space, including by helping you:
Undertake automation mapping and inventory activities.
Assess whether ADM processes meet the materiality threshold.
Conduct PIAs that assess ADM-related projects.
Prepare updates to privacy policies to enhance ADM transparency.
More broadly, we help organisations:
Navigate the complexity of the privacy, cyber security, and digital regulatory landscape.
Get the basics right and comply with current and incoming requirements, to satisfy customer expectations and to avoid regulator scrutiny and enforcement.
Move beyond compliance to performance and resilience that builds trust and achieves business objectives in a fast-changing world.
Why? Because as we have said at IIS for two decades, ‘It is just good business.’
Please contact us if you have any questions about the Privacy Act reforms and how they may affect your organisation. You can also subscribe to receive regular updates from us about key developments in the privacy and security space.
