By Chong Shao

On Monday, 4 May 2026, IIS joined other IAPP members for the Sydney launch of Privacy Awareness Week, where Privacy Commissioner Carly Kind gave the keynote address.

Commissioner Kind opened by voicing what many in the room were probably thinking: it’s Privacy Awareness Week again; is there anyone left who isn’t aware that privacy matters? She floated the heretical thought that PAW might have achieved its purpose. Awareness is no longer the gap. The harder question – the one organisations should be sitting with – is whether that awareness is being converted into something real.

That framing set up the rest of her address, which was organised around three ideas: action, agency and alternatives. It also gave the speech a different feel from previous PAW addresses. As Olga Ganopolsky (General Counsel, Privacy and Data at Macquarie Group Limited) observed during a later fireside chat, where past years have been heavy on law reform, this year was striking for how much was being done under the regime that already exists.

That observation also captures the IIS view of where things stand: no Tranche 2, no problems. Commissioner Kind is proceeding full steam ahead – and, unlike certain other ‘full steam ahead’ projects in Australian public life (ahem, AUKUS), she has actual progress to show for it.

Twelve months on from last year’s PAW, the four themes we identified in the Commissioner’s stance have gathered pace. Commissioner Kind is still working with a holistic view of privacy grounded in power imbalance. She is still using the full regulatory toolkit, including pursuing matters in court. She is still interested in fresh, purposive readings of the existing Privacy Act. What is new this year is that there are now concrete cases and determinations to point to – and a clearer picture of where the OAIC is heading.

Action: mere compliance is not enough

The first theme was the move from awareness to action. The Commissioner’s organising question was a practical one: what does ‘good’ actually look like? Three recent matters illustrate her answer.

The Federal Court’s decision in Australian Information Commissioner v Australian Clinical Labs Limited deserves close reading not only for its technical security findings but for the governance failures sitting alongside them. As the Commissioner put it in the fireside, the breach occurred in a business ACL had acquired (Medlab Pathology), and post-acquisition the organisational measures were never properly embedded into that business. Key personnel had not been trained on the relevant policies and processes. There was over-reliance on a technical consultant whose advice turned out to be inadequate.

The Court found upwards of 200,000 contraventions on the OAIC’s preferred reading – each affected individual counting as a separate contravention. The Commissioner indicated that this will be the OAIC’s position going forward. With the maximum penalty per contravention now at $50 million rather than the pre-reform $2.2 million, the arithmetic speaks for itself: 200,000 contraventions at up to $50 million each is a theoretical maximum exposure in the trillions of dollars.

The takeaway is not really about cyber security. It is that APP 11 has both a technical limb and an organisational one, and the latter does a great deal of work in practice. Acquisitions, integrations, restructures and outsourcing arrangements are exactly the moments when gaps start to show. A privacy policy is one thing. A privacy program – funded, properly governed, reflected in training, surviving an M&A event – is another.

The Vinomofo Pty Ltd investigation makes the same point from the other direction. The policies existed. The training, as the Commissioner described it, was nominal. Privacy was not embedded.

The third matter – the Bunnings review decision – extends the point from culture and training into process. APP 1.2’s requirement to take reasonable steps to implement practices, procedures and systems is not satisfied by scattered internal enquiries and informal sign-offs. For new, invasive or high-risk practices, the baseline is a formal, structured, documented assessment. Olga’s framing – ‘to avoid the Death Star, do a PIA’ – drew a chuckle from the room.

Bunnings is also worth reading for what the OAIC won and what it lost. The Tribunal departed from the Commissioner on proportionality and necessity, which the OAIC has acknowledged and has addressed in updates to the APP 3 collection guidelines (since published). But on the points that matter most the OAIC won decisively. Collection-is-collection-no-matter-how-transient is the holding that will persist and make a difference. As the Commissioner noted, future collection events will look nothing like a paper form. They will be milliseconds long, mediated by AI, embedded in pixels, layered through brief and opaque encounters. Dispensing with the temporal threshold for ‘collection’ now matters enormously for how the Privacy Act applies later.

Agency: privacy as power, not paperwork

The second theme picked up the Commissioner’s continuing concern with power and information asymmetries. The question, she suggested, is not whether an individual could in principle have made a different choice. It is whether the individual was ever in a meaningful position to do so. Two areas stand out.

The first was AI. The Commissioner has clearly been mapping the AI landscape over the past year – engaging with developers, providers, agencies and civil society on the use of personal information to train AI models, and on the rollout of AI scribe technology in clinical settings. The iMed investigation closed without findings; others are ongoing and likely to produce decisions next year. The 2026 community attitudes survey, the results of which the Commissioner previewed, will show that 93% of Australians do not think it is fair and reasonable for organisations to use personal information to train AI systems. That figure will inform how the OAIC interprets ‘purpose’, ‘use’ and ‘disclosure’ in this space.

The second was excessive collection. The 2Apply / InspectRealEstate determination is a striking application of APP 3. The factual setting matters: in a rental market with severe power imbalance and limited alternatives, a prospective tenant has little real say in what information they hand over or how the request is put to them. The OAIC found that the platform’s collection practices breached APP 3.3 (collection of sensitive information) and, more interestingly, breached APP 3.5 (lawful and fair collection), on the basis that the design of the application flow was unfair. Drawing on the UK ICO’s work on online choice architecture, the OAIC identified specific design patterns – ‘confirmshaming’ and biased framing – that contravened the fair-and-lawful-means requirement.

This is APP 3 doing more work than most organisations have assumed it does. The question is no longer just ‘can we identify a business reason for asking?’ It is whether each piece of information being collected is genuinely necessary – particularly sensitive or high-risk information that may carry more risk than value – and whether the way the request is put to the individual is fair on its own terms. Choice architecture has now arrived as a privacy concept, not just a consumer law one.

The thread connecting AI and rental applications (and, in forthcoming investigations, tracking pixels) is the one the Commissioner drew explicitly. These are all practices that are passive, opaque, or offer false choices. They are not legible to the people they affect. The OAIC’s regulatory interest is concentrating in exactly those places.

Alternatives: the Children’s Online Privacy Code as proof of concept

The third theme was the most forward-looking, and the most interesting departure from where one might have expected the speech to go.

There is a natural reading of action and agency together – fines are getting bigger, the OAIC is more active, the law is being read more purposively – that is essentially enforcement-focused. The Commissioner’s third move was to step out of that frame and ask a different question: what if the regulator did not just enforce against bad practice, but demonstrated what good practice could look like?

This is where the Children’s Online Privacy Code comes in. The exposure draft was published earlier this year. Three features stand out that make the Code structurally different from ordinary APP compliance.

First, the Code regulates at the service level, not the entity level. This follows the model used in online safety regulation. It also reflects a recognition that the entity is often not the right unit of analysis for digital services, where the same company might run multiple services with quite different risk profiles.

Second, data minimisation is the default starting position. Collection settings are switched off unless the child opts in. Consent must be genuine, not bundled or guilt-tripped, and where the child is under the age of digital consent, they must still be brought into the conversation in age-appropriate language. There is a right to erasure, not just de-identification.

Third, the best interests of the child is the primary consideration. This is not a familiar concept in Australian privacy law. It draws from international children’s rights law and changes the orientation of the entire framework. Compliance is no longer principally about whether the organisation has acted reasonably from its own perspective. It is about whether the design of the service is in the interests of the children using it.

These are not incremental adjustments; they change the starting point. Commissioner Kind described feeling ‘something close to excitement’ about the Code’s potential. She also framed it as a proof of concept: ‘the aspiration is to build the alternative, then extend it to everyone else.’ If a digital ecosystem with stronger defaults, more honest design and meaningful user agency can be made workable for children, it becomes harder to argue that the same is impossible for others.

That is the part worth watching. There is an emerging regulatory pattern here – the eSafety Commissioner’s Social Media Minimum Age framework, and now the OAIC’s Children’s Online Privacy Code – in which Australia is taking a more design-forward and structurally interventionist approach to digital regulation than comparable jurisdictions. The Children’s Online Privacy Code is the most ambitious example yet because, as the Commissioner indicated, the aspiration is to use it as a stepping stone: first prove the model with children, then extend the same defaults, design standards and user controls to digital services more broadly.

What this means for organisations

The clearest message of PAW 2026 is that waiting for Tranche 2 is not a compliance strategy. The Commissioner is using the Act she has, using it well, and signalling that she will continue to explore new understandings and applications of its existing terms.

The concepts that Commissioner Kind is seeking to clarify in the coming year include the definition of personal information, and purpose, use and disclosure under APP 6. These terms are especially pertinent when it comes to how the Act applies to AI training, profiling and connected devices.

For organisations, the practical implications follow from each of the three themes.

On action, paper compliance is no longer a safe place to sit. Privacy needs to be funded, embedded, reinforced in training and reflected in how the organisation actually makes decisions about new technologies. Acquisitions and integrations are where this can fall over in practice. High-risk and novel practices should be supported by formal, structured, documented assessments. The Commissioner has now made clear that anything less is unlikely to satisfy APP 1.2.

On agency, the orientation has shifted. The question is no longer whether the organisation’s privacy practices can survive a narrow legal review. Rather, the lens should be about trust: how do these practices hold up when looked at from the perspective of the person on the other side of the form, the screen or the AI model? Excessive collection, opaque processing and dark-pattern design are in the Commissioner’s crosshairs, and they will not be defended by pointing to a privacy policy.

On alternatives, the Children’s Online Privacy Code is worth paying attention to, including by organisations that are not directly captured by it. The design choices in the Code reflect a regulatory view about what good looks like across the board. The closer an organisation’s own practices are to those defaults, the less exposed it will be if (or when!) the model is extended.

Conclusion

PAW 2026 was a challenge as much as a celebration. There is more work to be done to promote privacy and win trust.

The regulator is doing its part. I am genuinely impressed at how much the OAIC has been able to pull off, given all the things on its plate and the (limited) resources it has to work with.

As Commissioner Kind noted at the outset, the Australian community is already privacy-aware. The question now is whether regulated entities are paying attention – and what they intend to do about it.

If you have any questions about how these developments might affect your organisation, or would like assistance with privacy program uplift, PIAs or any of the practical implications above, please contact us.
