From awareness to action – Reflections on PAW 2026

By Chong Shao

On Monday, 4 May 2026, IIS joined other IAPP members for the Sydney launch of Privacy Awareness Week, where Privacy Commissioner Carly Kind gave the keynote address.

Commissioner Kind opened by alluding to the thing that many in the room were probably thinking: It’s Privacy Awareness Week again; is there anyone left who isn’t aware that privacy matters? She floated the heretical thought that PAW might have achieved its purpose. Awareness is not the gap anymore. The harder question – the one organisations should be sitting with – is whether that awareness is being converted into something real.

That framing set up the rest of her address, which was organised around three ideas: action, agency and alternatives. It also gave the speech a different feel from previous PAW addresses. As Olga Ganopolsky (General Counsel, Privacy and Data at Macquarie Group Limited) observed during a later fireside chat, where past years have been heavy on law reform, this year was striking for how much was being done under the regime that already exists.

That observation also captures the IIS view of where things stand: no Tranche 2, no problems. Commissioner Kind is proceeding full steam ahead – and, unlike certain other ‘full steam ahead’ projects in Australian public life (ahem, AUKUS), she has actual progress to show for it.

Twelve months on from last year’s PAW, the four themes we identified in the Commissioner’s stance have gathered pace. Commissioner Kind is still working with a holistic view of privacy grounded in power imbalance. She is still using the full regulatory toolkit, including pursuing matters in court. She is still interested in fresh, purposive readings of the existing Privacy Act. What is new this year is that there are now concrete cases and determinations to point to – and a clearer picture of where the OAIC is heading.

Action: mere compliance is not enough

The first theme was the move from awareness to action. The Commissioner’s organising question was a practical one: what does ‘good’ actually look like? Two recent matters illustrate her answer.

The Federal Court’s decision in Australian Information Commissioner v Australian Clinical Labs Limited deserves close reading not only for its technical security findings but for the governance failures sitting alongside them. As the Commissioner put it in the fireside, the breach occurred in a business ACL had acquired (Medlab Pathology), and post-acquisition the organisational measures were never properly embedded into that business. Key personnel had not been trained on the relevant policies and processes. There was over-reliance on a technical consultant whose advice turned out to be inadequate.

The Court found upwards of 200,000 contraventions on the OAIC’s preferred reading – each affected individual counting as a separate contravention. The Commissioner indicated that this will be the OAIC’s position going forward. With the maximum penalty per contravention now at $50 million rather than the pre-reform $2.2 million, the arithmetic speaks for itself.
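To make that arithmetic concrete, here is a back-of-envelope sketch using only the figures above. Actual penalties are set at the court’s discretion and would fall well below the theoretical ceiling:

```python
# Back-of-envelope sketch using the figures cited in the text.
# Courts set actual penalties at their discretion; this is only the ceiling.

affected_individuals = 200_000   # contraventions on the OAIC's preferred reading
pre_reform_max = 2_200_000       # pre-reform maximum penalty (AUD)
post_reform_max = 50_000_000     # current maximum penalty per contravention (AUD)

# Worst case if the conduct were a single contravention under the old regime
print(f"Pre-reform, single contravention: ${pre_reform_max:,}")

# Worst case under the per-individual reading and the new maximum
print(f"Post-reform theoretical ceiling:  ${affected_individuals * post_reform_max:,}")
```

Even a single contravention at the new maximum dwarfs the old regime; multiplied across 200,000 affected individuals, the exposure becomes existential.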

The takeaway is not really about cyber security. It is that APP 11 has both a technical limb and an organisational one, and the latter does a great deal of work in practice. Acquisitions, integrations, restructures and outsourcing arrangements are exactly the moments when gaps start to show. A privacy policy is one thing. A privacy program – funded, properly governed, reflected in training, surviving an M&A event – is another.

The Vinomofo Pty Ltd investigation makes the same point from the other direction. The policies existed. The training, as the Commissioner described it, was nominal. Privacy was not embedded.

The third matter – the Bunnings review decision – extends the point from culture and training into process. APP 1.2’s requirement to take reasonable steps to implement procedures, processes and systems is not satisfied by scattered internal enquiries and informal sign-offs. For new, invasive or high-risk practices, the baseline is a formal, structured, documented assessment. Olga’s framing – ‘to avoid the Death Star, do a PIA’ – drew a chuckle from the room.

Bunnings is also worth reading for what the OAIC won and what it lost. The Tribunal departed from the Commissioner on proportionality and necessity, which the OAIC has acknowledged and will address in forthcoming updates to the APP 3 collection guidelines (now published). But on the points that matter most the OAIC won decisively. Collection-is-collection-no-matter-how-transient is the holding that will persist and make a difference. As the Commissioner noted, future collection events will look nothing like a paper form. They will be milliseconds long, mediated by AI, embedded in pixels, layered through brief and opaque encounters. Dispensing with the temporal threshold for ‘collection’ now matters enormously for how the Privacy Act applies later.

Agency: privacy as power, not paperwork

The second theme picked up the Commissioner’s continuing concern with power and information asymmetries. The question, she suggested, is not whether an individual could in principle have made a different choice. It is whether the individual was ever in a meaningful position to do so. Two areas stand out.

The first was AI. The Commissioner has clearly been mapping the AI landscape over the past year – engaging with developers, providers, agencies and civil society on the use of personal information to train AI models, and on the rollout of AI scribe technology in clinical settings. The iMed investigation closed without findings; others are ongoing and likely to produce decisions next year. The 2026 community attitudes survey, when it lands, will show that 93% of Australians do not think it is fair and reasonable for organisations to use personal information to train AI systems. That figure will inform how the OAIC interprets ‘purpose’, ‘use’ and ‘disclosure’ in this space.

The second area was excessive collection. The 2Apply / InspectRealEstate determination is a striking application of APP 3. The factual setting matters: in a rental market with severe power imbalance and limited alternatives, a prospective tenant has little real say in what information they hand over or how the request is put to them. The OAIC found that the platform’s collection practices breached APP 3.3 (collection of sensitive information) and, more interestingly, breached APP 3.5 (lawful and fair collection), on the basis that the design of the application flow was unfair. Drawing on the UK ICO’s work on online choice architecture, the OAIC identified specific design patterns – ‘confirmshaming’ and biased framing – that contravened the fair-and-lawful-means requirement.

This is APP 3 doing more work than most organisations have assumed it does. The question is no longer just ‘can we identify a business reason for asking?’ It is whether each piece of information being collected is genuinely necessary – particularly sensitive or high-risk information that may carry more risk than value – and whether the way the request is put to the individual is fair on its own terms. Choice architecture has now arrived as a privacy concept, not just a consumer law one.
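To make the pattern concrete, here is a minimal, hypothetical sketch of the kind of copy review a product team might run over its decline options. The phrases and heuristic are illustrative only and are not drawn from the 2Apply determination:

```python
# Hypothetical sketch: flag 'confirmshaming' copy in decline options.
# The marker phrases are illustrative, not taken from any determination.

GUILT_MARKERS = ("don't care", "risk my", "hurt my", "miss out")

def is_confirmshaming(decline_label: str) -> bool:
    """Heuristic: a decline option should be neutral, not self-deprecating."""
    label = decline_label.lower()
    return any(marker in label for marker in GUILT_MARKERS)

options = [
    "No thanks",                              # neutral framing
    "No, I don't care about my application",  # biased framing - flag for review
]
for label in options:
    print(f"{label!r}: {'review' if is_confirmshaming(label) else 'ok'}")
```

No heuristic substitutes for a proper review, but the exercise forces the question APP 3.5 now asks: is the way the request is framed fair on its own terms?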

The thread connecting AI and rental applications (and, in forthcoming investigations, tracking pixels) is the one the Commissioner drew explicitly. These are all practices that are passive, opaque, or offer false choices. They are not legible to the people they affect. The OAIC’s regulatory interest is concentrating in exactly those places.

Alternatives: the Children’s Online Privacy Code as proof of concept

The third theme was the most forward-looking, and the most interesting departure from where one might have expected the speech to go.

There is a natural reading of action and agency together – fines are getting bigger, the OAIC is more active, the law is being read more purposively – that is essentially enforcement-focused. The Commissioner’s third move was to step out of that frame and ask a different question: what if the regulator did not just enforce against bad practice, but demonstrated what good practice could look like?

This is where the Children’s Online Privacy Code comes in. The exposure draft was published earlier this year. Three features stand out that make the Code structurally different from ordinary APP compliance.

First, the Code regulates at the service level, not the entity level. This follows the model used in online safety regulation. It also reflects a recognition that the entity is often not the right unit of analysis for digital services, where the same company might run multiple services with quite different risk profiles.

Second, data minimisation is the default starting position. Collection settings are switched off unless the child opts in. Consent must be genuine, not bundled or guilt-tripped, and where the child is under the age of digital consent they must still be brought into the conversation in age-appropriate language. There is a right to erasure, not just de-identification.

Third, the best interests of the child is the primary consideration. This is not a familiar concept in Australian privacy law. It draws from international children’s rights law and changes the orientation of the entire framework. Compliance is no longer principally about whether the organisation has acted reasonably from its own perspective. It is about whether the design of the service is in the interests of the children using it.
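A minimal sketch of what those defaults might look like in practice follows. The field names and structure are hypothetical – the exposure draft speaks in obligations, not schemas:

```python
# Hypothetical sketch of Code-style defaults: collection off unless the child
# opts in, one unbundled consent per purpose, and erasure rather than mere
# de-identification. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ChildAccountSettings:
    personalised_ads: bool = False    # off by default
    location_sharing: bool = False    # off by default
    profiling: bool = False           # off by default
    consents: dict = field(default_factory=dict)

    def opt_in(self, purpose: str) -> None:
        """Record a specific, age-appropriate consent for a single purpose."""
        self.consents[purpose] = True
        if hasattr(self, purpose):
            setattr(self, purpose, True)  # the setting turns on only after consent

class ChildDataStore(dict):
    def erase(self, child_id: str) -> None:
        """Right to erasure: delete the record, not merely de-identify it."""
        self.pop(child_id, None)
```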

These are not incremental adjustments; they change the starting point. Commissioner Kind described feeling ‘something close to excitement’ about the Code’s potential. She also framed it as a proof of concept: ‘the aspiration is to build the alternative, then extend it to everyone else.’ If a digital ecosystem with stronger defaults, more honest design and meaningful user agency can be made workable for children, it becomes harder to argue that the same is impossible for others.

That is the part worth watching. There is an emerging Australian regulatory pattern here – the eSafety Commissioner’s Social Media Minimum Age framework, and now the OAIC’s Children’s Online Privacy Code – in which Australia is taking a more design-forward and structurally interventionist approach to digital regulation than comparable jurisdictions. The Children’s Online Privacy Code is the most ambitious yet because, as the Commissioner indicated, the aspiration is to use it as a stepping stone: first prove the model with children, then extend the same defaults, design standards and user controls to digital services more broadly.

What this means for organisations

The clearest message of PAW 2026 is that waiting for Tranche 2 is not a compliance strategy. The Commissioner is using the Act she has, using it well, and signalling that she will continue to explore new understandings and applications of its existing terms.

The concepts that Commissioner Kind is seeking to clarify in the coming year include the definition of personal information, and purpose, use and disclosure under APP 6. These terms are especially pertinent when it comes to how the Act applies to AI training, profiling and connected devices.

For organisations, the practical implications follow from each of the three themes.

On action, paper compliance is no longer a safe place to sit. Privacy needs to be funded, embedded, reinforced in training and reflected in how the organisation actually makes decisions about new technologies. Acquisitions and integrations are where this can fall over in practice. High-risk and novel practices should be supported by formal, structured, documented assessments. The Commissioner has now made clear that anything less is unlikely to satisfy APP 1.2.

On agency, the orientation has shifted. The question is no longer whether the organisation’s privacy practices can survive a narrow legal review. Rather, the lens should be about trust: how do these practices hold up when looked at from the perspective of the person on the other side of the form, the screen or the AI model? Excessive collection, opaque processing and dark-pattern design are in the Commissioner’s crosshairs, and they will not be defended by pointing to a privacy policy.

On alternatives, the Children’s Online Privacy Code is worth paying attention to, including by organisations that are not directly captured by it. The design choices in the Code reflect a regulatory view about what good looks like across the board. The closer an organisation’s own practices are to those defaults, the less exposed it will be if (or when!) the model is extended.

Conclusion

PAW 2026 was a challenge as much as a celebration. There is more work to be done to promote privacy and win trust.

The regulator is doing its part. I am genuinely impressed at how much the OAIC has been able to pull off, given all the things on its plate and the (limited) resources it has to work with.

As Commissioner Kind noted at the outset, the Australian community is already privacy-aware. The question now is whether regulated entities are paying attention – and what they intend to do about it.

If you have any questions about how these developments might affect your organisation, or would like assistance with privacy program uplift, PIAs or any of the practical implications above, please contact us.

COVIDSafe - A turning point for privacy?

By Malcolm Crompton and Chong Shao

The Australian Government’s COVIDSafe app has been met by both widespread scrutiny and widespread adoption. Is the app safe? Is the public’s response revealing the true Australian character? Are the privacy fears overblown? The picture is fascinating when you step back and look at what this app says about privacy in Australia both now and going forward.

Making the grade

Let’s address the most important thing upfront: the app appears to be mostly sound from a privacy and security perspective. Contrary to the FUD (fear, uncertainty and doubt) swirling around – no, the app does not collect any location information; no, it does not “track and monitor” you at all times (unlike some apps in other countries). Here is a good explainer on how the app actually works.

The Australian Government commissioned a Privacy Impact Assessment from a law firm and has published it. From our perspective, the key privacy protections are:

  • The layers of opt-in consent and control built into the app, from registration to uploading information to the National COVIDSafe Data Store

  • Access to the information in the Data Store will be strictly limited to health officials in the States and Territories, and the purpose will be strictly limited to COVID-19 contact tracing and notification – these restrictions will be backed by federal legislation

  • All data held in the Data Store will be deleted at the end of the pandemic – this is very important because retaining information is a necessary feature of the centralised model (as opposed to the decentralised model proposed by the Apple-Google partnership), which could lead to potential misuse or compromise of the information. A simplified sketch of the centralised flow follows this list.
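Here is that simplified sketch, based only on the public descriptions above. The names are hypothetical, and the real app’s identifiers, cryptography and schema all differ:

```python
# Simplified sketch of the centralised model described above. Names are
# hypothetical; the real app's identifiers, crypto and schema all differ.
import time

class CovidSafeStyleApp:
    def __init__(self, temp_id: str):
        self.temp_id = temp_id   # rotating ID issued by the server - no location
        self.encounters = []     # stored only on the handset

    def on_bluetooth_handshake(self, other_temp_id: str) -> None:
        # Record who was nearby and when - never where.
        self.encounters.append((other_temp_id, time.time()))

    def upload(self, national_data_store: list, user_consents: bool) -> None:
        # Encounters leave the phone only if a diagnosed user consents; access
        # to the store is then limited to State/Territory contact tracers.
        if user_consents:
            national_data_store.extend(self.encounters)
```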

There are some remaining issues where more clarification would be welcome:

  • What will be the arrangements that govern how State and Territory officers use the gathered information? What will be the mechanisms for oversight, enforcement and responding to failure in those jurisdictions?

  • The government has stated that it will introduce regulations to prevent police and other government agencies from accessing the information collected by the app. This is a good move to increase trustworthiness, but will it extend to national security agencies (as it should)? Will it extend to State and Territory police forces?

  • Why the delay in the promised release of the source code, and will the source code of the inevitable updates also be released? Has the code been sufficiently security tested?

  • Can we be sure about the assurances that Amazon Web Services will abide by Australian law rather than US laws should the US demand (secret) access to the data?

  • Why hasn’t there been wider consultation with interested parties beyond the chosen federal agencies? Will there be such consultations from now on?

The big missing piece

While the app’s privacy protections are commendable, as always, the proof of the pudding is in the eating. A recent post by the UK Information Commissioner, summarising the discussions of more than 250 participants from the privacy domain on the use of technology to combat the pandemic, highlighted the importance of governance and accountability processes.

This is where we think the government’s current implementation is lacking. For example: how will we know that only the right people are accessing the information and using it for the right reasons? How will we know that the information will be deleted once the pandemic is over? How secure is the system – in the exchange of Bluetooth signals, the information in transit to and from the Data Store, and information at rest in the Data Store?

The PIA recommends additional independent assurance and testing from security experts, with the results made publicly available. This should extend to all aspects of data handling by participants in the ecosystem, including Commonwealth, State and Territory agencies as well as private sector participants such as Amazon.

To maximise privacy and trust, the government should not only make the right promises, but also (i) explain how it will keep them and (ii) demonstrate, via expert and independent validation, that they are indeed being kept.

The creation and the creator

We have observed an interesting dichotomy in the responses to the COVIDSafe app. There is widespread recognition, even from usually sceptical voices, that the app is not especially problematic from a privacy perspective. At the same time, there is a general sense of concern about a new method of data collection by the Australian Government. The problem is not with the creation, but with the creator.

It would be an understatement to say that the government has a chequered past with respect to privacy and data handling (see here for a recent history lesson). This has resulted in a trust deficit where anything it proposes is subject to negative publicity. So far, adoption rates indicate that many Australians are willing to try the app notwithstanding the government’s track record. 

Is this because of the objectively strong privacy measures implemented and promoted by the government? And/or is this because of the extraordinary circumstances we are in, with Australians doing their part to help combat the pandemic and hasten the reopening of our society? It may be too soon to tell, although it is fair to hypothesise that both are playing a role.

Our hope is that this augurs well for future government initiatives – that the Australian Government will take lessons from the positive response to the app, achieved through a combination of taking privacy seriously (including legislatively) and appealing to public solidarity. This represents a break from its past behaviour and could set a new and better precedent going forward.

Privacy Awareness Week 2020: A message from IIS

By Mike Trovato and Eugenia Caralt

IIS is a proud supporter of 2020 Privacy Awareness Week (PAW), 4-10 May, an annual event to raise awareness of privacy issues and the importance of protecting personal information. 

Australian privacy regulators are leading the effort to increase privacy awareness in the midst of a unique and uncertain time as we face the COVID-19 pandemic. Because of the challenges presented by the pandemic, compliance and risk to personal information in government, industry, education, and non-profits are front of mind. 

Regulatory themes

This year the Office of the Australian Information Commissioner’s (OAIC) theme is “Reboot Your Privacy”. As Information and Privacy Commissioner Angelene Falk indicates, this year’s theme is in line with the current challenges that Australian entities are facing to adapt to the new demands of remote working and online interactions. To access the Commonwealth and state-based PAW information, events and resources, click on the links below: 

Office of the Australian Information Commissioner – Reboot Your Privacy

Office of the Victorian Information Commissioner – Privacy – Protect Yours and Respect Others’ 

Office of the Information Commissioner Queensland – Be Smart About Privacy

Information and Privacy Commission New South Wales – Prevent, Detect, Protect

IIS and partner events

In addition to being a PAW partner, IIS is supporting efforts to raise privacy awareness through the following activities:

Privacy Masterclass – Data and Privacy with Malcolm Crompton and Lyria Bennett Moses as part of the Australian Computer Society’s NSW Privacy Summit

When: Wednesday, April 29, 4:00 PM AEST

Theme: Why is there so much debate about the trustworthiness of government uses of data? 

This session will explore the ways in which existing law and its implementation are not meeting the needs of citizens or the needs of government seeking to retain citizen trust. 

To pre-register for the free webinar, click here (the link will be posted 2 hours before the event commences).

OneTrust webinar – Privacy in a Pandemic with the Privacy Commissioners from Australia and New Zealand and IDCare’s Managing Director 

When: Wednesday, May 6, 2:00 PM AEST

Theme: As the world rapidly changes to address the COVID-19 pandemic, what’s at stake for privacy? 

Panel discussion of issues and practical advice for maintaining privacy during the pandemic.

To pre-register for the free webinar, click here.

 

IIS’ PAW 2020 message

The OAIC’s theme is Reboot your Privacy using Ctrl+Alt+Del. What does Ctrl+Alt+Del practically look like?

1) Ctrl – OAIC message: Check and update your privacy and security controls; IIS view: Undertake privacy and security health checks – Know where you stand and take action!

At IIS, we are often asked by current and potential clients seeking to improve their privacy practice: “Where should we start?” or “What should we do?” We find that this question is best answered by more questions! For example:

  • When did you last review your entity’s privacy and security practices?

  • Does your management and board of directors have a clear view of where the entity stands in terms of personal information as an asset? Are the current culture and practice appropriate to the entity’s strategy, risk appetite and privacy stance?

  • Are your management and board of directors aware of the risks and do you have their support (including financially) to address them? 

As you are all aware, the Privacy Act requires entities to take reasonable steps to protect the personal information they hold, considering, among other things, the nature of the entity and the amount and sensitivity of the information it holds. If your entity’s privacy management and governance are insufficient in light of the above, both your entity and your customers are at risk.

A ‘privacy and security health check’ helps entities assess the extent to which their current practices, procedures and systems comply with the law, are vulnerable to privacy and security risks, and/or meet privacy and security best practice. It provides a point-in-time view to assist entities in deciding where they want to be.

Entities that do not understand their position and have not taken appropriate action could be deemed deficient by regulators and will likely be subject to enforceable undertakings after the inevitable breach.

2) Alt – OAIC message: Consider the alternative when giving or asking for personal information; IIS view: Implement Privacy by Design!

What can you do with less? How can you cut unnecessary collection of personal information, or even creatively achieve the same goal without any personal information? These practices are best implemented by embedding Privacy by Design (PbD) from the very start.
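As a minimal illustration of doing the same with less, consider this hypothetical example: verifying age eligibility without ever persisting a date of birth.

```python
# Hypothetical data-minimisation example: the business goal is an eligibility
# check, so store the answer to the question, not the date of birth itself.
from datetime import date
from typing import Optional

def is_adult(dob: date, today: Optional[date] = None) -> bool:
    today = today or date.today()
    years = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return years >= 18

# Collect the DOB transiently, keep only the derived boolean.
record = {"customer_id": "c-123", "adult": is_adult(date(1990, 5, 4))}
# No 'dob' field is ever persisted - there is nothing to breach later.
```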

Applying PbD strategically helps entities internalise user-centric practices that are key to building trust with customers and reducing risk to the entity over the long run. Furthermore, it heads off the often costly and time-consuming process of ‘bolting on’ privacy fixes at the end of a project, or finding a project has to be shelved altogether due to privacy concerns.

PbD should be actively adopted in contexts where the value of the data and the associated privacy risks are high, for example: big data, especially involving personal information; mobile location analytics; biometrics, including facial recognition; and customer loyalty programs.

IIS believes that now more than ever entities cannot hit the PAUSE button on thinking and doing privacy. Rather, they should adapt to this current moment, such as by using short-form Privacy Impact Assessments, as Australian privacy regulators have recently indicated.

3) Delete – OAIC message: Delete any data from old devices and securely destroy or deidentify personal information if it’s no longer needed for a legal purpose; IIS view: develop a data retention policy, enforce it and prove it!

Data is a liability because of the risk of a privacy or security breach and the resulting toxic effects. Security and privacy are related but distinct: an entity can have the world’s best security practices yet still hold personal information it should never have collected in the first place, or have used it for an unexpected purpose. To highlight this point, consider tech giants like Google and Facebook. Presumably they have industry-leading security practices, but this has not stopped them from getting into privacy mishaps over the years.

To minimise both privacy and security failures, entities should have a retention policy in place for all types of data, including personal information. They should be familiar with their legal requirements and transparent about their data handling practices. When data is no longer needed, they should act to ensure that the appropriate steps are carried out (such as deletion or deidentification) – this includes thinking about their supply chain and external service providers. 
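A minimal sketch of what ‘enforce it and prove it’ can look like in code follows. The schema is hypothetical, and a real implementation would also have to reach backups and external service providers:

```python
# Hypothetical retention sweep: delete expired records and keep an audit
# trail so the entity can prove deletion, not just promise it.
from datetime import datetime, timedelta

RETENTION = {
    "marketing": timedelta(days=365),
    "support_ticket": timedelta(days=730),
}

def sweep(records: list[dict], now: datetime) -> tuple[list[dict], list[dict]]:
    kept, audit_log = [], []
    for record in records:
        if now - record["created"] > RETENTION[record["category"]]:
            audit_log.append({"id": record["id"], "deleted_at": now})  # evidence
        else:
            kept.append(record)
    return kept, audit_log
```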

More and more we are seeing the policy and best practice landscape shift towards favouring stronger assurance. Entities that are able to prove what they say (including data deletion) will be in a much stronger position with respect to building trust and credibility with individuals, clients and regulators.

Summing up: The importance of governance and directors’ key role in driving privacy and security

Privacy awareness should not only lead to better compliance but also contribute to valued business and strategic goals. Reflecting on this year’s OAIC theme, and given the growing importance of personal information as a mission-critical asset, IIS encourages entities seeking to leverage awareness into better practice to start with a privacy and security health check.

As we look ahead to 2020 and beyond, the governance of personal information will be a growing area of interest for regulators (not just in privacy, but specific sectors as well). A board that is not asking relevant questions of management, or is unable to assure itself of how personal information is being handled and protected, is demonstrating a failure of governance that could compromise the entity’s mission and potentially open it up to external scrutiny and consequences.

It has been just over a year since the launch of “The New Governance of Data and Privacy: Moving beyond compliance to performance”, co-authored by Mike Trovato (Managing Director) and Malcolm Crompton AM (Lead Privacy Advisor) and published by the Australian Institute of Company Directors (AICD).

The book discusses why privacy governance is a top-line strategic and compliance issue for boards and sets out a framework for boards to lead and direct privacy governance in their entity. The main themes of the book have also been adapted into the Data and privacy governance director tool, jointly published by the AICD and the Australian Information Security Association (AISA), available here.