
ACMA: Misinformation and Disinformation Bill

Date: 18 August 2023
Australia Corporate Alert

What Is the Background to the Misinformation and Disinformation Bill? 

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (Cth) (the Bill) is currently an exposure draft released for public comment. If legislated, the Bill would enhance the powers of the Australian Communications and Media Authority (ACMA) to combat misinformation and disinformation in relation to data held by digital platform providers.

The Bill’s genesis lies in the work of the Australian Competition and Consumer Commission, which released its final report into the Digital Platforms Inquiry (DPI) in July 2019. In response to the DPI, the Australian Government (the Commonwealth) asked digital platform providers in Australia to develop a voluntary code of practice (Voluntary Code) to address online misinformation. The ACMA was to have regulatory oversight of the Voluntary Code and was also to report to the Commonwealth on the adequacy of the measures undertaken by the digital platforms and the broader effects of disinformation.

In June 2021, the ACMA reported to the Commonwealth on the adequacy of digital platforms’ disinformation and news quality measures (Adequacy Report). Despite the ACMA concluding in the Adequacy Report that the Voluntary Code was “an improvement on the EU Code of Practice on Disinformation,” that the Voluntary Code “provide[d] signatories with the flexibility to implement measures to counter disinformation and misinformation in proportion to the risk of potential harm,” and that the Voluntary Code “stresses the need to balance interventions with the need to protect users’ freedom of expression, privacy and other rights,” the ACMA assessed that the Voluntary Code required improvement and strengthening. 

The ACMA specifically recommended that it be granted information-gathering powers (including the power to make record-keeping rules) to oversee digital platforms, as well as the power to seek from digital platforms Australia-specific data about the effectiveness of their measures to address misinformation and disinformation. The ACMA further recommended in the Adequacy Report that the Commonwealth grant the ACMA new powers to register industry codes, enforce industry code compliance, and make standards relating to the activities of digital platform corporations.

Following this, the Commonwealth Department of Infrastructure, Transport, Regional Development, Communications and the Arts began the process of seeking public feedback on the exposure draft of the Bill.1

What Would the Bill Do? 

If enacted, the Bill would introduce a new Schedule 9 into the Broadcasting Services Act 1992 (Cth) (BSA) titled “Digital platform services.” Schedule 9 would give the ACMA new powers in respect of misinformation and disinformation on digital platforms, empowering the ACMA to require digital platform providers to make and retain records and to provide information to the ACMA. Schedule 9 would also give the ACMA the power to register codes, request the development of codes, issue formal warnings to require compliance with codes, and determine standards by legislative instrument.

There are significant penalties for noncompliance with the proposed new laws. A digital platform provider that makes records that are false or misleading (including by omission) faces a penalty of 100 penalty units.2 There are serious consequences for a person who provides information or evidence to the ACMA knowing that the information, evidence, document, or copy is false or misleading, or who knowingly omits any matter or thing without which it is misleading.3 For a digital platform provider that is a corporation, the maximum penalty for contravening a standard, or an ACMA direction in relation to a standard, is the greater of AU$6.875 million (25,000 penalty units) and 5% of the corporation’s annual turnover during the 12 months before the contravention.4
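
For context (and assuming the penalty unit value of AU$275 reflected in note 2), the dollar figures follow directly from the unit counts: 100 penalty units × AU$275 = AU$27,500, and 25,000 penalty units × AU$275 = AU$6,875,000, the AU$6.875 million figure above.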

The ACMA’s power to require the production of information or evidence, or to require a person to appear before the ACMA, applies to persons other than digital platform providers if the information, document, or evidence is relevant to the exercise of the ACMA’s powers in relation to digital platform services.5 The Bill retrospectively applies the definitions of “misinformation” and “disinformation” to content for the purpose of the ACMA’s information-gathering powers.6

The Bill’s intention is that industry will develop one or more misinformation codes,7 which the ACMA would be able to register. Significantly, however, the ACMA can determine a standard if there are exceptional and urgent circumstances and it is unlikely that a code dealing with the relevant matter or matters could be developed within a reasonable period.8

Two Areas for Service Providers and Customers to Be Aware Of

Despite the Bill including a mesmerising number of distinguishing technical definitions, there are several areas the Bill may not adequately cover. We mention two: (a) traditional media, and (b) the application of freedom of political expression under codes and standards.

The first relates to how companies would comply with the record-keeping and information-provision obligations when media content appears on digital platforms. The second is how the ACMA would decide what kind of “political” content is subject to the terms of any codes and standards, including under a standard (a legislative instrument) potentially determined by the ACMA itself.

Digital Platforms and Traditional Media

First, as noted, the ACMA could require digital platform providers to make and retain records and provide information. However, the Bill provides that content is not subject to the record-keeping rules to the extent the content is “excluded content for misinformation purposes.”9 The definition of “excluded content for misinformation purposes” includes material that is produced in good faith for the purposes of entertainment, parody, or satire, and professional news content, as well as other defined content.

In addition, under the Bill, a “digital service” does not include a “broadcasting service,”10 which, in effect, excludes broadcasting services from digital platform services.11 The Bill does not define “broadcasting service,” and, even drawing on the existing definition in the BSA, it is not entirely clear whether broadcast content remains “broadcast content” when streamed on a digital platform.

These complex definitions may give rise to practical difficulties for organisations managing the obligations. If a digital platform provider hosts media content (say, footage from a news broadcast), it would seem that the digital platform provider (rather than the provider of the media content) would determine whether the content on the digital platform is “excluded content for misinformation purposes.” It is not obvious why, in this example, the digital platform provider should be the one making this assessment.

To continue with this example, if the digital platform provider assessed that the media content was not excluded content, the provider would then have to determine, for record-keeping purposes, whether the relevant content is “misinformation” or “disinformation.” This would not be a straightforward exercise. Each definition requires that the content on the digital service be reasonably likely to cause or contribute to serious harm. The distinction between the two is that disinformation requires an intended deception.12 From a compliance point of view, it may be difficult for a digital platform provider (or any organisation) to discern how to distinguish misinformation from disinformation based on a record held on a digital platform.

Freedom of Political Expression

Second, organisations may struggle to understand how codes and standards apply, or do not apply, to “political” content. This is an important practical issue for service providers to manage because, except for prescribed authorised electoral and referendum content (which remains outside the scope of a code or standard),13 the Bill requires the ACMA to be satisfied that a code or standard does not unreasonably burden freedom of political expression.14 Further, in assessing whether there is an unreasonable burden on political expression, the ACMA can take into account “any circumstances the ACMA considers relevant,”15 which raises the obvious question of what kinds of matters the ACMA may consider.

The terms of the schedules, rules, codes, and standards have no effect if their operation would infringe any constitutional doctrine of implied freedom of political communication.16 For companies that have obligations under the Bill, however, clause 60 does not clarify how the ACMA would exercise its powers in respect of codes and standards. The ACMA may try to resolve some of this uncertainty by consulting with industry during its development of relevant codes and standards.

Conclusion

The Bill raises complex compliance matters for businesses, and not only for digital platform providers. Businesses that may be affected by the Bill should therefore assess whether their compliance processes are sufficiently flexible and robust to enable future compliance.

1 https://www.infrastructure.gov.au/have-your-say/new-acma-powers-combat-misinformation-and-disinformation; https://minister.infrastructure.gov.au/rowland/media-release/consultation-opens-new-laws-tackle-online-misinformation-and-disinformation

2 Clause 17(1) (100 penalty units = AU$27,500).

3 The maximum penalty for this offence is imprisonment for 12 months: clause 22(1).

4 Section 205F(5H) of the Broadcasting Services Act 1992 (Cth), inserted by Schedule 2 of the Bill.

5 Clause 19.

6 Section 32(2) of Schedule 2 of the Bill (Transitional provisions).

7 Clause 32.

8 Clause 50(1)(b), (c).

9 Clause 2: Definitions.

10 Clause 3(e).

11 A digital platform service is a digital service: clause 4(1).

12 Clause 7(2)(e). 

13 Clause 35.

14 See, e.g., clause 45.

15 Clause 45(b).

16 Clause 60.

This publication/newsletter is for informational purposes and does not contain or convey legal advice. The information herein should not be used or relied upon in regard to any particular facts or circumstances without first consulting a lawyer. Any views expressed herein are those of the author(s) and not necessarily those of the law firm's clients.

