PUBLICATIONS | 25 Mar 2026

Digital Governance, Cyber and Privacy | Quarterly Roundup | March 2026

By Katherine Jones, Shannon Blain, Jessica Yazbek, Amelia Sakaris and Grace Ellis

In this edition, you will find our regular roundup of recent digital governance news* and developments in Australia and across the globe.


Welcome to the fourteenth edition of our quarterly Digital Governance, Cyber and Privacy newsletter.

This quarter highlights a marked shift toward stronger enforcement across cyber, privacy and AI. Australian regulators are driving accountability through significant penalties, expanded privacy powers and closer scrutiny of data practices, while new developments underscore rising legal risks linked to AI use.

Globally, authorities continue to test the boundaries of consumer protection, financial liability and data governance.

Here is your roundup of key developments from Australia and around the world:

Australia news

FIIG Securities ordered to pay $2.5 million

The Australian Securities and Investments Commission (ASIC) successfully obtained orders in the Federal Court of Australia requiring FIIG Securities to pay a $2.5 million penalty for prolonged cyber security failures that breached its AFS licence obligations. The court found that FIIG had failed, for several years, to maintain adequate cyber controls, which exacerbated a 2023 cyberattack that exposed sensitive personal data of around 18,000 clients. The decision is significant as it is the first time civil penalties have been imposed for cyber security failures under general AFS licence obligations, setting a clear licence-to-operate expectation for robust cyber resilience.

Lululemon fined $702,900 for spam (ACMA)

Lululemon Athletica Australia paid a $702,900 penalty after the Australian Communications and Media Authority (ACMA) found that it sent more than 370,000 emails containing marketing content without a compliant unsubscribe function. The ACMA determined that Lululemon mischaracterised promotional emails as service messages such as shipping updates, breaching Australia’s spam laws.

Should banks pay victims of phishing? (EU court adviser)

An Advocate General of the Court of Justice of the European Union (CJEU) has issued an opinion stating that banks must immediately refund customers for unauthorised transactions caused by phishing scams. Under EU payment services law, banks may only delay refunds where there are reasonable grounds to suspect fraud, although they can later seek recovery if they can establish that the customer acted with gross negligence. While not binding, the opinion signals a likely shift towards stronger consumer protections.

Spies in Australia

There have been two recent prosecutions in Australia alleging reckless foreign interference. In the first, Australian businessman Alexander Csergo was found guilty of reckless foreign interference after he prepared reports for individuals he should have suspected were Chinese intelligence operatives. The second related to information gathering on the Buddhist organisation Guan Yin Citta, which is banned in China and considered a cult. Documents filed in the Canberra Magistrates Court revealed that the Chinese spies deployed a range of tactics, including the adoption of false identities, open‑source intelligence collection and handler‑directed infiltration over several years.

Artificial intelligence

Harnessing data and digital technology

The Productivity Commission, an independent Australian Government research and advisory body, has conducted an inquiry into harnessing data and digital technology. Its Interim Report contains seven draft recommendations across the areas of artificial intelligence, data access, privacy regulation and digital financial reporting.

ACCC report on AI developments

The Australian Competition & Consumer Commission (ACCC) warns that companies remain responsible for their AI and chatbots, and must ensure that AI does not engage in anti-competitive behaviours such as preferring one LLM over another or restricting data access to rival LLMs. The ACCC highlights rising consumer risks, including data misuse, misleading AI‑generated content, manipulative design, fake reviews and AI‑enabled scams, and stresses the need for ongoing monitoring as AI capabilities evolve.

Work Health and Safety Amendment (Digital Work Systems) Bill 2026

NSW has passed the Work Health and Safety Amendment (Digital Work Systems) Bill 2026, amending the Work Health and Safety Act 2011 (NSW) to require employers to manage worker safety risks from AI and other digital work systems. Employers must now consider risks such as excessive workloads, unreasonable monitoring, discriminatory decision‑making, and performance‑tracking practices created by AI systems. The amendments also give union officials new powers to inspect AI systems, signalling increased regulatory scrutiny once the provisions commence.

AI plan for the Australian Public Service

The Australian Public Service (APS) AI Plan 2025 sets out how the APS will significantly expand the safe, responsible use of AI to improve service delivery, policy outcomes and productivity. It focuses on three pillars: trust, people and tools, covering stronger governance, uplifted AI capability for all staff and wider access to secure, fit‑for‑purpose AI technologies. The plan is designed to be adaptive, with ongoing engagement across government, unions, industry and academia as AI capabilities evolve.

European Parliament blocks AI features

The European Parliament has disabled built‑in AI features on lawmakers’ devices after determining it could not guarantee the security of data sent to cloud services. Officials warned that the extent of data shared by these AI tools is still unclear, prompting a precautionary shutdown and advice for members to avoid exposing work information to AI systems on personal devices.
The move follows broader EU concerns about foreign tech vendors and data security, including previous restrictions on Microsoft tools and a ban on TikTok for staff.

Artificial intelligence litigation and judgments

AI litigation tracker

If you are interested in case filings related to AI and peripheral matters, George Washington University maintains a running list of the lawsuits brought (primarily) in the USA.

Witness uses AI glasses to give evidence

The England and Wales High Court became concerned about a witness's credibility after discovering that he was wearing smart glasses when his mobile phone audibly broadcast another person’s voice during his cross‑examination, suggesting possible outside assistance. Despite the witness's denial that the devices were being used to receive answers, his call log showed a series of unexplained calls to a contact labelled “abra kadabra” immediately before he entered the witness box. The court concluded that the witness had used smart glasses connected to his phone to receive coaching during cross‑examination and then lied about it, offering explanations the court described as lacking any credibility.

Should ChatGPT have called the police?

The family of a girl who was critically injured in a school shooting in Canada is suing OpenAI, alleging the company knew the shooter had used ChatGPT to plan a mass casualty event but failed to warn authorities. The family alleges that OpenAI could have prevented the tragedy by notifying police, and that ChatGPT instead assisted the shooter in planning the attack.

Nippon Life sues OpenAI for providing legal advice without a licence

Nippon Life has sued OpenAI for practising law without a licence, encouraging breach of a settlement contract and facilitating abuse of the judicial process after ChatGPT allegedly provided legal assistance to an individual in dispute with Nippon Life. The case is seen as a potential turning point as a ruling against OpenAI could force AI companies to implement stricter safeguards, monitoring, and liability frameworks for generated content resembling legal advice.

AI generated documents are not privileged

A U.S. District Court has held that a criminal defendant's communications with the public AI tool Claude were not protected by attorney‑client privilege, as Claude's privacy policy, which allows for data collection and sharing, eliminates any reasonable expectation of confidentiality. The court also determined that the defendant's communications with Claude were not protected by the work product doctrine, noting that the doctrine does not apply to materials prepared by a client of their own volition without the involvement of counsel. The ruling warns that using public generative AI tools for legal strategy can waive privilege and serves as a reminder that counsel, not clients, should direct litigation preparation.

Privacy

High Court confirms NCAT can order damages for breach of privacy

The High Court of Australia has found that the New South Wales Civil and Administrative Tribunal (NCAT) can order damages for breach of privacy under section 55(2)(a) of the Privacy and Personal Information Protection Act 1998 (NSW). The court determined that such an order does not involve the exercise of judicial power and is instead an exercise of administrative power. NCAT therefore has jurisdiction to make such orders, overturning the Court of Appeal's decision that it lacked jurisdiction.

India's legal challenge to WhatsApp's privacy policy

In India, privacy is a constitutional right. In a landmark legal battle over privacy and data control, the Supreme Court of India has warned WhatsApp about the sharing of information with Meta. WhatsApp has been in an antitrust dispute with the regulator since November 2024, when the regulator fined the company $25.4 million and barred WhatsApp from sharing user data with other Meta entities for advertising purposes for five years.

Privacy enforcement sweep

The Office of the Australian Information Commissioner (OAIC) has announced its first-ever privacy “compliance sweep”, starting January 2026, focusing on whether selected businesses’ privacy policies meet the requirements of Australian Privacy Principle 1.4. The sweep will review around 60 entities across six sectors that commonly collect personal information in person (including rental/property, pharmacies, licensed venues, car rental, car dealerships, and pawnbrokers/second-hand dealers). Following 2024 Privacy Act changes that expanded enforcement options, the OAIC has flagged that non-compliant entities may face notices and penalties (up to $66,000) and is encouraging clearer, more transparent privacy policies about how information is collected, used, disclosed and destroyed.

Australia privacy breach dashboard

The OAIC has recently published an interactive Notifiable Data Breach (NDB) statistics dashboard, which presents key statistics on data breach notifications received by the OAIC under the NDB scheme since the scheme’s commencement in February 2018. The dashboard provides an interactive presentation of data in six month snapshots, including sources of breaches, top five sectors by source of breaches, time taken to identify breaches, number of individuals affected by breaches and the kinds of personal information involved in breaches.

GDPR Enforcement Tracker

CMS has launched a GDPR Enforcement Tracker providing a database and overview of fines and penalties which data protection authorities within the EU have imposed under the EU General Data Protection Regulation (GDPR, DSGVO).

Is storing photo ID for anti-money laundering a privacy risk?

On 13 March 2026, the Netherlands Supreme Court handed down a judgment finding that merely storing a passport photo or “selfie” (without specific technical processing such as facial-recognition template generation) is not “biometric data” processing for unique identification under Article 9 of the GDPR. However, it considered that there is genuine uncertainty about whether EU anti‑money laundering record‑keeping rules require institutions to retain a copy of an identity document and, if so, whether that obligation, potentially including the photo, is compatible with the GDPR’s lawfulness and data‑minimisation requirements. The court indicated it will refer preliminary questions to the Court of Justice of the European Union (CJEU).

*Note: for some publications, you may require a current subscription to read the full article.

This is commentary published by Colin Biggers & Paisley for general information purposes only. This should not be relied on as specific advice. You should seek your own legal and other advice for any question, or for any specific situation or proposal, before making any final decision. The content also is subject to change. A person listed may not be admitted as a lawyer in all States and Territories. Colin Biggers & Paisley, Australia 2026

Stay connected

Connect with us to receive our latest insights.