Viewing entries tagged: Children's privacy

From awareness to action – Reflections on PAW 2026

By Chong Shao

On Monday, 4 May 2026, IIS joined other IAPP members for the Sydney launch of Privacy Awareness Week, where Privacy Commissioner Carly Kind gave the keynote address.

Commissioner Kind opened by voicing what many in the room were probably thinking: It’s Privacy Awareness Week again; is there anyone left who isn’t aware that privacy matters? She floated the heretical thought that PAW might have achieved its purpose. Awareness is no longer the gap. The harder question – the one organisations should be sitting with – is whether that awareness is being converted into something real.

That framing set up the rest of her address, which was organised around three ideas: action, agency and alternatives. It also gave the speech a different feel from previous PAW addresses. As Olga Ganopolsky (General Counsel, Privacy and Data at Macquarie Group Limited) observed during a later fireside chat, where past years have been heavy on law reform, this year was striking for how much was being done under the regime that already exists.

That observation also captures the IIS view of where things stand: no Tranche 2, no problems. Commissioner Kind is proceeding full steam ahead – and, unlike certain other ‘full steam ahead’ projects in Australian public life (ahem, AUKUS), she has actual progress to show for it.

Twelve months on from last year’s PAW, the four themes we identified in the Commissioner’s stance have gathered pace. Commissioner Kind is still working with a holistic view of privacy grounded in power imbalance. She is still using the full regulatory toolkit, including pursuing matters in court. She is still interested in fresh, purposive readings of the existing Privacy Act. What is new this year is that there are now concrete cases and determinations to point to – and a clearer picture of where the OAIC is heading.

Action: mere compliance is not enough

The first theme was the move from awareness to action. The Commissioner’s organising question was a practical one: what does ‘good’ actually look like? Two recent matters illustrate her answer.

The Federal Court’s decision in Australian Information Commissioner v Australian Clinical Labs Limited deserves close reading not only for its technical security findings but for the governance failures sitting alongside them. As the Commissioner put it in the fireside, the breach occurred within a business ACL had acquired (Medlab Pathology), and post-acquisition the organisational measures were never properly embedded into that business. Key personnel had not been trained on the relevant policies and processes. There was over-reliance on a technical consultant whose advice turned out to be inadequate.

The Court found upwards of 200,000 contraventions on the OAIC’s preferred reading – each affected individual counting as a separate contravention. The Commissioner indicated that this will be the OAIC’s position going forward. With the maximum penalty per contravention now at $50 million rather than the pre-reform $2.2 million, the arithmetic speaks for itself.
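
To make that arithmetic concrete, here is a back-of-envelope sketch using the figures quoted above. It is illustrative only – courts set penalties with discretion, and a real outcome would never be a simple multiplication:

```python
# Back-of-envelope only: theoretical ceilings under the per-individual
# reading, using the figures quoted above. Actual penalties are
# discretionary and would never be a straight multiplication.
affected_individuals = 200_000        # contraventions on the OAIC's preferred reading
pre_reform_cap = 2_200_000            # pre-reform maximum per contravention ($)
post_reform_cap = 50_000_000          # post-reform maximum per contravention ($)

print(f"Pre-reform theoretical ceiling:  ${affected_individuals * pre_reform_cap:,}")
print(f"Post-reform theoretical ceiling: ${affected_individuals * post_reform_cap:,}")
# Pre-reform theoretical ceiling:  $440,000,000,000
# Post-reform theoretical ceiling: $10,000,000,000,000
```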

The takeaway is not really about cyber security. It is that APP 11 has both a technical limb and an organisational one, and the latter does a great deal of work in practice. Acquisitions, integrations, restructures and outsourcing arrangements are exactly the moments when gaps start to show. A privacy policy is one thing. A privacy program – funded, properly governed, reflected in training, surviving an M&A event – is another.

The Vinomofo Pty Ltd investigation makes the same point from the other direction. The policies existed. The training, as the Commissioner described it, was nominal. Privacy was not embedded.

The third matter – the Bunnings review decision – extends the point from culture and training into process. APP 1.2’s requirement to take reasonable steps to implement procedures, processes and systems is not satisfied by scattered internal enquiries and informal sign-offs. For new, invasive or high-risk practices, the baseline is a formal, structured, documented assessment. Olga’s framing – ‘to avoid the Death Star, do a PIA’ – drew a chuckle from the room.

Bunnings is also worth reading for what the OAIC won and what it lost. The Tribunal departed from the Commissioner on proportionality and necessity, which the OAIC acknowledged and has since addressed in updates to the APP 3 collection guidelines (now published). But on the points that matter most the OAIC won decisively. Collection-is-collection-no-matter-how-transient is the holding that will persist and make a difference. As the Commissioner noted, future collection events will look nothing like a paper form. They will be milliseconds long, mediated by AI, embedded in pixels, layered through brief and opaque encounters. Dispensing with the temporal threshold for ‘collection’ now matters enormously for how the Privacy Act applies later.

Agency: privacy as power, not paperwork

The second theme picked up the Commissioner’s continuing concern with power and information asymmetries. The question, she suggested, is not whether an individual could in principle have made a different choice. It is whether the individual was ever in a meaningful position to do so. Two areas stand out.

The first was AI. The Commissioner has clearly been mapping the AI landscape over the past year – engaging with developers, providers, agencies and civil society on the use of personal information to train AI models, and on the rollout of AI scribe technology in clinical settings. The iMed investigation closed without findings; others are ongoing and likely to produce decisions next year. The 2026 community attitudes survey, when it lands, will show that 93% of Australians do not think it is fair and reasonable for organisations to use personal information to train AI systems. That figure will inform how the OAIC interprets ‘purpose’, ‘use’ and ‘disclosure’ in this space.

The second was excessive collection. The 2Apply / InspectRealEstate determination is a striking application of APP 3. The factual setting matters: in a rental market with severe power imbalance and limited alternatives, a prospective tenant has little real say in what information they hand over or how the request is put to them. The OAIC found that the platform’s collection practices breached APP 3.3 (collection of sensitive information) and, more interestingly, breached APP 3.5 (lawful and fair collection), on the basis that the design of the application flow was unfair. Drawing on the UK ICO’s work on online choice architecture, the OAIC identified specific design patterns – ‘confirmshaming’ and biased framing – that contravened the fair-and-lawful-means requirement.

This is APP 3 doing more work than most organisations have assumed it does. The question is no longer just ‘can we identify a business reason for asking?’ It is whether each piece of information being collected is genuinely necessary – particularly sensitive or high-risk information that may carry more risk than value – and whether the way the request is put to the individual is fair on its own terms. Choice architecture has now arrived as a privacy concept, not just a consumer law one.

The thread connecting AI and rental applications (and, in forthcoming investigations, tracking pixels) is the one the Commissioner drew explicitly. These are all practices that are passive, opaque, or offer false choices. They are not legible to the people they affect. The OAIC’s regulatory interest is concentrating in exactly those places.

Alternatives: the Children’s Online Privacy Code as proof of concept

The third theme was the most forward-looking, and the most interesting departure from where one might have expected the speech to go.

There is a natural reading of action and agency together – fines are getting bigger, the OAIC is more active, the law is being read more purposively – that is essentially enforcement-focused. The Commissioner’s third move was to step out of that frame and ask a different question: what if the regulator did not just enforce against bad practice, but demonstrated what good practice could look like?

This is where the Children’s Online Privacy Code comes in. The exposure draft was published earlier this year. Three features stand out that make the Code structurally different from ordinary APP compliance.

First, the Code regulates at the service level, not the entity level. This follows the model used in online safety regulation. It also reflects a recognition that the entity is often not the right unit of analysis for digital services, where the same company might run multiple services with quite different risk profiles.

Second, data minimisation is the default starting position. Collection settings are switched off unless the child opts in. Consent must be genuine, not bundled or guilt-tripped, and where the child is under the age of digital consent they must still be brought into the conversation in age-appropriate language. There is a right to erasure, not just de-identification.

Third, the best interests of the child is the primary consideration. This is not a familiar concept in Australian privacy law. It draws from international children’s rights law and changes the orientation of the entire framework. Compliance is no longer principally about whether the organisation has acted reasonably from its own perspective. It is about whether the design of the service is in the interests of the children using it.

These are not incremental adjustments; they change the starting point. Commissioner Kind described feeling ‘something close to excitement’ about the Code’s potential. She also framed it as a proof of concept: ‘the aspiration is to build the alternative, then extend it to everyone else.’ If a digital ecosystem with stronger defaults, more honest design and meaningful user agency can be made workable for children, it becomes harder to argue that the same is impossible for others.

That is the part worth watching. There is an emerging Australian regulatory pattern here – the eSafety Commissioner’s Social Media Minimum Age framework, and now the OAIC’s Children’s Online Privacy Code – in which Australia is taking a more design-forward and structurally interventionist approach to digital regulation than comparable jurisdictions. The Children’s Online Privacy Code is the most ambitious yet because, as the Commissioner indicated, the aspiration is to use it as a stepping stone: first prove the model with children, then extend the same defaults, design standards and user controls to digital services more broadly.

What this means for organisations

The clearest message of PAW 2026 is that waiting for Tranche 2 is not a compliance strategy. The Commissioner is using the Act she has, using it well, and signalling that she will continue to explore new understandings and applications of its existing terms.

The concepts that Commissioner Kind is seeking to clarify in the coming year include the definition of personal information, and purpose, use and disclosure under APP 6. These terms are especially pertinent when it comes to how the Act applies to AI training, profiling and connected devices.

For organisations, the practical implications follow from each of the three themes.

On action, paper compliance is no longer a safe place to sit. Privacy needs to be funded, embedded, reinforced in training and reflected in how the organisation actually makes decisions about new technologies. Acquisitions and integrations are where this can fall over in practice. High-risk and novel practices should be supported by formal, structured, documented assessments. The Commissioner has now made clear that anything less is unlikely to satisfy APP 1.2.

On agency, the orientation has shifted. The question is no longer whether the organisation’s privacy practices can survive a narrow legal review. Rather, the lens should be about trust: how do these practices hold up when looked at from the perspective of the person on the other side of the form, the screen or the AI model? Excessive collection, opaque processing and dark-pattern design are in the Commissioner’s crosshairs, and they will not be defended by pointing to a privacy policy.

On alternatives, the Children’s Online Privacy Code is worth paying attention to, including by organisations that are not directly captured by it. The design choices in the Code reflect a regulatory view about what good looks like across the board. The closer an organisation’s own practices are to those defaults, the less exposed it will be if (or when!) the model is extended.

Conclusion

PAW 2026 was a challenge as much as a celebration. There is more work to be done to promote privacy and win trust.

The regulator is doing its part. I am genuinely impressed at how much the OAIC has been able to pull off, given all the things on its plate and the (limited) resources it has to work with.

As Commissioner Kind noted at the outset, the Australian community is already privacy-aware. The question now is whether regulated entities are paying attention – and what they intend to do about it.

If you have any questions about how these developments might affect your organisation, or would like assistance with privacy program uplift, PIAs or any of the practical implications above, please contact us.

Children's Online Privacy Code: What You Need to Know and What's Next

By Gabriella Assis

Introduction

Australia is entering a new era of child-centred privacy regulation, with the draft Children’s Online Privacy Code (the Code) marking a major shift in how children’s data must be handled.

The Office of the Australian Information Commissioner (OAIC) notes that by age 13, an estimated 72 million data points may have been collected about a child. The Code responds to the growing risks associated with large scale data collection, including discrimination, algorithmic bias, identity theft, targeted advertising and other forms of misuse.

This volume of data leaves children and young people exposed to a range of data practices, including profiling, direct marketing and targeted advertising, as well as the ingestion of personal information into AI models. Data breaches, unlawful disclosure and broader security failures, identity theft, discrimination and algorithmic bias can all lead to serious financial, reputational and developmental harms. These risks highlight the need for stronger, enforceable safeguards.

The Children’s Online Privacy Code is a legislative instrument made under the Privacy Act 1988 and was introduced by the Privacy and Other Legislation Amendment Act 2024 (POLA Act). The Code places clear responsibility on organisations to embed safety, transparency and privacy protective design into their digital services.

This Insights post outlines what the Code is, why it matters, how it was developed, how stakeholders can influence its final form, how IIS can support organisations preparing submissions, and what happens next.

1. Understanding the Children’s Online Privacy Code

Why this matters

The Code is a major uplift to Australia’s privacy framework, designed to protect children in a digital ecosystem where data collection is pervasive and often invisible. The Code will become a legally enforceable instrument once it is registered, which the POLA Act requires to occur by 10 December 2026.

Why the Code is needed: Evidence from the EdTech ecosystem

Recent independent research into school‑endorsed educational apps in Australia shows a clear gap between what privacy policies promise and what apps actually do – the very risks the Children’s Online Privacy Code is designed to address. Analysis of almost 200 apps approved for use in schools found that many shared children’s personal information with third parties as soon as the app was opened, often before any user interaction, contradicting their own privacy policies and exposing gaps in oversight by education systems, app developers, and regulators.

The research also found that most apps included advertising or tracking tools that were not necessary for their educational purpose, while only a small number of privacy policies accurately reflected these practices. Most policies were written in language too complex for parents and children to reasonably understand, and child-focused branding often created an illusion of safety not supported by how the apps operated.

Together, these findings highlight the limits of current consent and disclosure mechanisms and reinforce the need for enforceable, design-focused obligations that place responsibility on organisations – rather than children, parents or schools – to act in the best interests of the child.

Scope and application

The Code applies to businesses or organisations covered by the Privacy Act 1988 if:

  • They provide a social media service, a relevant electronic service or a designated internet service,

  • The service is likely to be accessed by children or primarily concerns the activities of children, and

  • They are not providing a health service.

For the purposes of the Code, a social media service, a relevant electronic service, and a designated internet service are understood by the OAIC as follows:

  • Social media services: platforms where people can connect, share content and interact with others (e.g. social networks, public media-sharing sites, discussion forums and review platforms).

  • Relevant electronic services: online services that let people communicate with each other (e.g.  messaging apps, email services, video calling platforms and online games where players can chat).

  • Designated internet services: online services that allow users to access or receive material over the internet (e.g. cloud storage, websites that let users receive/access content, streaming platforms, consumer IoT devices).

Importantly, the Code applies at the service level, not the organisational level: only the child-facing or child-relevant components of a business fall within scope. If an organisation operates one part of its website that is likely to be accessed by children, that specific service will be covered by the Code, while other services that are not accessed by children – or that do not involve children at all – remain outside the Code’s scope. In practice, the organisation would need to, for example, publish a dedicated privacy policy on its website that clearly identifies the in-scope services and explains its privacy practices in language that is easy for children to understand.
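
A crude way to picture the scoping test is as a conjunction of the three criteria above. The sketch below is illustrative only – the inputs are simplified stand-ins for the statutory definitions, not an official test:

```python
def code_applies(service_type: str,
                 likely_accessed_by_children: bool,
                 provides_health_service: bool) -> bool:
    """Illustrative sketch of the Code's three scoping criteria.

    The flags are simplified stand-ins for the statutory definitions;
    real scoping decisions turn on legal analysis, not booleans.
    """
    in_scope_types = {
        "social media service",
        "relevant electronic service",
        "designated internet service",
    }
    return (service_type in in_scope_types
            and likely_accessed_by_children
            and not provides_health_service)

# Example: a streaming platform aimed at children, not a health service.
print(code_applies("designated internet service", True, False))  # True
```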

How the Code will work in practice

The Children’s Online Privacy Code introduces obligations that materially change how organisations must handle children’s personal information. These include:

  • ‘Best interests of the child’ as the governing principle for collection, use, and disclosure of personal information.

  • Stronger consent mechanisms, including notifying a child when a parent consents.

  • A requirement to destroy personal information about a child upon request, unless an applicable exception applies.

  • Limits on direct marketing, only permissible with consent and when in the child’s best interests.

  • Age-appropriate transparency, requiring clear, accessible, developmentally appropriate notices.

These obligations shift responsibility from children and parents to the organisations designing and operating digital services.
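
To make the shift concrete, here is a minimal sketch of how a service might operationalise two of these obligations: default-off collection settings and destruction on request. All names (ChildRecord, legal_hold, the settings keys) are hypothetical illustrations, not terms from the draft Code:

```python
from dataclasses import dataclass, field

@dataclass
class ChildRecord:
    """Hypothetical record for a child user; field names are illustrative."""
    user_id: str
    personal_information: dict = field(default_factory=dict)
    # Data minimisation as the default: optional collection starts switched off.
    settings: dict = field(default_factory=lambda: {
        "geolocation": False,
        "profiling": False,
        "personalised_ads": False,
    })
    legal_hold: bool = False  # stand-in for 'an applicable exception applies'

def handle_erasure_request(record: ChildRecord) -> str:
    """Destroy (not merely de-identify) personal information on request,
    unless an applicable exception applies."""
    if record.legal_hold:
        return "exception applies: information retained"
    record.personal_information.clear()
    return "personal information destroyed"
```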

The Code’s primary requirement

The Code’s primary requirement is for organisations to only collect, use or disclose personal information in ways that are consistent with the ‘best interests of the child’.

To understand what actions are in the ‘best interests of the child’, the Code indicates that organisations should consider factors such as:

  • The nature and extent of child exploitation risks, noting that child exploitation includes any situation where a child is abused, harmed or used by another person for economic, sexual or personal gain.

  • The likely mental or physical impacts on the child.

  • The likely impact on the physical, psychological, emotional, social and cognitive development of the child.

  • The extent to which the child’s ability to develop and express their views and identities may be affected.

  • The extent to which the child’s freedom of association, play, leisure or participation in social, cultural or educational activities may be affected.

  • Whether particular groups of children may experience disproportionate or adverse impacts, including children with disabilities, Aboriginal and Torres Strait Islander children, and children from culturally and linguistically diverse backgrounds.

  • The evolving capacities of children, including differences in age, maturity and developmental stage across childhood.

2. How the OAIC developed the Code

A research-driven, consultative approach

The OAIC’s development of the Code has involved research, evidence, and consultation. The OAIC has reported that it conducted more than 65 engagements with stakeholders across government, industry, academia, civil society, and international regulators.

Three phase consultation process

Phase 1 (Jan-Aug 2025) – The OAIC held the initial consultation with children, parents, and organisations focused on children’s welfare.

Phase 2 (Apr-Aug 2025) – The OAIC engaged with civil society, academia, and industry to test early concepts and gather insights and perspectives.

Phase 3 (current) – Mandatory 60-day public consultation (31 March – 5 June 2026): The OAIC is inviting industry, civil society, academia and any other interested parties to submit written responses to the exposure draft of the Children’s Online Privacy Code.

International alignment

The OAIC has aligned the Code with global frameworks such as the Age Appropriate Design Code developed by the UK Information Commissioner’s Office, while integrating novel protections to ensure Australian children benefit from leading privacy approaches.

3. A call to action for stakeholders: How to participate in the public consultation

Why your input matters

The OAIC has emphasised that it is approaching this consultation with an open mind and is actively seeking feedback to refine the Code and ensure it is implementable.

How to get involved

Stakeholders can make a written submission to the OAIC during the public consultation period (31 March – 5 June 2026).

Where feedback is most valuable

This is where organisations can meaningfully influence the final Code.

1. Scope clarity

As the Code applies at the service level, organisations with mixed service lines (e.g., banks, telcos, EdTech providers) should provide feedback if the application of the Code to some but not all of their services is unclear.

2. Operationalising the Code

Stakeholders can provide input on (or pose questions about):

  • Approaches to interpreting and operationalising the ‘best interests of the child’ principle, recognising that its application may involve balancing competing interests or rights.

  • How to balance commercial and child-centred interests.

  • What evidence organisations must provide to demonstrate compliance with the Code.

  • How to implement any other requirements of the Code.

How IIS can support your submission

If your organisation wishes to have its say, now is the time to engage. IIS can support you in preparing a clear, well-structured submission that reflects your operational context and highlights any practical considerations the OAIC should take into account. Our team can help you interpret the Exposure Draft of the Children’s Online Privacy Code, assess the implications for your services, and articulate your feedback in a way that constructively contributes to the consultation process.

4. What happens after the consultation

Regulatory pathway

After the consultation closes, the OAIC will:

  • Review all submissions.

  • Undertake a Regulatory Impact Analysis (RIA), a cost-benefit analysis of implementing the Code. For the Children’s Online Privacy Code, the RIA focuses on balancing stronger privacy protections for children against the regulatory and economic impacts on online services.

  • Continue, where appropriate and required, to consult with relevant stakeholders to ensure different voices are heard and represented throughout the development of the final Code.

  • Register the final Code by 10 December 2026 as required by the POLA Act. Once registered, the Code becomes legally enforceable.

Conclusion

The Children’s Online Privacy Code represents a significant development in the national privacy landscape. It elevates children’s rights, places responsibilities on organisations to design safer digital environments, and aligns Australia with global best practice.

The current consultation period is a critical opportunity for interested stakeholders to help shape the final Code, ensuring it is practical and capable of meaningfully protecting children in an increasingly complex digital ecosystem.

Privacy Act review: A closer look at children's privacy

By Natasha Roberts

In this post, we take a closer look at proposals related to children’s privacy contained in the recent Privacy Act Review Report (the Review) – proposals to which the Government has agreed or agreed in principle.

What was the problem the Review was trying to address?

There is growing recognition that children and young people may be vulnerable in relation to privacy, particularly online. The Review noted that in the digital age children are increasingly ‘datafied’ and that personal information about children can be used to build profiles and identify moments when children may be particularly vulnerable or receptive to online targeting and marketing (including in relation to harmful products and messaging). As the Report observed, this may affect children and young people’s autonomy and capacity to freely develop their identity.

How did the review propose to address this problem?

The Review took a multifaceted approach to addressing children’s privacy, including the measures outlined below.

Define ‘child’ and restrict marketing, targeting and trading in personal information

Currently the Privacy Act does not define ‘child’ and there are no specific provisions applying to children’s privacy (though organisations are expected to consider an individual’s capacity to consent, which may include considerations of age or maturity). The Report proposed reforming the Privacy Act to define a child as an individual under 18 years of age.

In formally defining the meaning of child, the Privacy Act would then provide for certain specific provisions that apply only to children. These include proposals to prohibit the ‘trading’ of personal information of children and restrictions on ‘direct marketing’ and ‘targeting’ of children, other than marketing or targeting that is in the best interests of the child (for example, targeted marketing for essential child support, counselling and community services).

Codify ‘capacity’ in relation to consent

The Privacy Act contains several exceptions that allow certain information handling with the consent of the individual. However, deciding when children have ‘capacity’ to consent can be difficult, given that maturity varies at different ages. To date, the Privacy Act has not specified a particular age at which children may consent on their own behalf, and guidelines issued by the Information Commissioner state that an organisation must decide on a case-by-case basis whether an individual under the age of 18 has the capacity to consent. Where that is not practical, the Information Commissioner advises that an organisation may assume an individual over the age of 15 has capacity, unless there is something to suggest otherwise.
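
Expressed as a decision rule, the guidance looks roughly like the sketch below – a minimal illustration of this ‘middle path’, assuming a boolean stand-in for the case-by-case assessment. It is not an OAIC-specified algorithm:

```python
from typing import Optional

def has_capacity_to_consent(age: int,
                            individual_assessment: Optional[bool] = None,
                            contrary_indication: bool = False) -> bool:
    """Illustrative sketch of the OAIC guidance described above.

    Prefer a case-by-case assessment; where that is not practical, fall
    back on the presumption that an individual over 15 has capacity,
    absent anything to suggest otherwise.
    """
    if individual_assessment is not None:  # case-by-case assessment was practical
        return individual_assessment
    return age > 15 and not contrary_indication
```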

The Review recommended retaining this ‘middle path’ between individualisation and practicality, noting that over-reliance on parental consent was impractical and undesirable. The Review did, however, propose that the Privacy Act codify the principle that valid consent must be given with capacity. While this would change the text of the Act, it should not require a major change of approach for organisations, given that it formalises what is already contained in the Information Commissioner’s guidelines and what should already be occurring in practice.

Build consideration of ‘best interests of the child’ into fair and reasonable test

Elsewhere we have discussed the proposal for the introduction of a fair and reasonable test to the Privacy Act. The Review further proposes that any such test require organisations to have regard to the best interests of the child as part of considering whether a collection, use or disclosure is fair and reasonable in the circumstances. In our view, this is the most far-reaching of the children’s privacy reforms as it puts the best interests of the child at the heart of decisions about information handling.

Introduce a Children’s Online Privacy Code

Other jurisdictions (notably the UK) have promulgated codes to regulate the privacy of young people online. The Review considered models adopted in those other jurisdictions and came to the view that Australia should introduce a Children’s Online Privacy Code that applies to online services that are ‘likely to be accessed by children’ and which aligns with the UK Age Appropriate Design Code, to the extent possible. According to the Review, a code could address:

  • Whether specific requirements are needed for assessing capacity

  • Whether certain collections, uses and disclosures of children’s personal information should be limited

  • Which default privacy settings should be in place

  • Whether entities should be required to ‘establish age with a level of certainty that is appropriate to the risks’ or apply the standards in the Children’s Code to all users instead

  • How privacy information (including collection notices and privacy policies) and tools that enable children to exercise privacy rights (including erasure requests) should be designed to improve accessibility for children, and

  • If parental controls are provided, how to balance the protection of the child with a child’s right to autonomy and privacy from their parents in certain circumstances.

The Review also proposed amending the Privacy Act to require that collection notices and privacy policies be clear and understandable, in particular for any information addressed specifically to a child. In the context of online services, these requirements are to be specified in the Children’s Online Privacy Code. Specifically, the Code could provide guidance on the format, timing and readability of collection notices and privacy policies.

What are the key takeaways for my organisation?

Privacy law reform is still ongoing, so this is an area on which to maintain a watching brief. That said, there is nothing to stop you from reviewing the bullets listed above and assessing your personal information handling activities against those standards. We suggest:

  • Identifying whether you handle children’s personal information and in what circumstances (for example, in person, online etc) to determine how you may be affected by reforms

  • Maintaining a watching brief on privacy law reform to see how proposals related to children’s privacy are implemented in practice

  • Engaging in consultation – the Government has committed to further consultation on children’s privacy and there are likely to be opportunities to comment on bill exposure drafts and the draft code as it’s developed

  • Reviewing the UK’s Age Appropriate Design Code to gain insight on the possible scope and approach of the proposed Children’s Online Privacy Code, noting that the Review specifically called for the proposed code to align with the UK’s Age Appropriate Design Code to the extent possible, and

  • Considering whether your organisation’s handling of children’s personal information meets the ‘best interests of the child’ test, which is likely to form part of the proposed ‘fair and reasonable test.’ This may require consideration of whether, throughout the handling of a child’s personal information, a child’s physical, psychological and emotional wellbeing is protected.