Summary
Misinformation—false or misleading information—poses a significant threat to the health of our communities. It can lead to poor health choices, undermine trust in preventive measures like vaccines and fluoridation, harm social cohesion, and disrupt collective action needed to address common challenges, such as disaster responses.
In Aotearoa New Zealand (NZ), the scope of this threat is not well understood. Currently, we have limited insight into the types of health misinformation circulating and its potential impacts. Both internationally and in NZ, there is growing awareness of misinformation as a significant issue, particularly highlighted during the Covid-19 pandemic. However, recent progress appears to have stalled, and several NZ organisations that tracked and monitored misinformation environments have recently closed.
To effectively address this issue, both the government and civil society need to invest resources in monitoring public channels and platforms in NZ and in formulating a proportionate response to misinformation. It is important to avoid overstating the risks, but the lack of active monitoring and research in NZ makes it difficult to accurately assess those risks, let alone make informed decisions on how to address them.
Misleading information can lead communities and individuals to make harmful decisions. For instance, during the Covid-19 pandemic, some people ingested toxic substances mistakenly believed to protect against or cure the virus.1 More broadly, there are well-documented examples of how misinformation harmed the NZ Covid-19 response.2 Similarly, misinformation about vaccines can lead people not to be vaccinated, affecting not only their own health but also the safety of others through reduced herd immunity.3, 4
This latter point is especially relevant as NZ grapples with falling childhood vaccination rates and the consequent risks, such as measles epidemics and infant deaths from whooping cough.5, 6 The recent declines in vaccination have been attributed, in part, to misinformation fuelling vaccine hesitancy.
Misinformation can also have much wider effects. At the societal level, misleading information can weaken trust in science and health organisations, government, and institutions, undermining social cohesion.7, 8
What is misinformation?
There is a range of definitions applied to terms like misinformation and disinformation, and even experts don’t agree on the exact meanings.9, 10 We use the term misinformation in a very broad sense in this Briefing, pragmatically defining it as ‘inaccurate or misleadingly presented information that can cause harm’.11 This definition focuses on two key aspects of concern: 1) that the information is false or misleading, including through omission of relevant facts or caveats, and 2) that there is the potential for harm. This definition also captures disinformation—false information deliberately spread for financial, ideological or geopolitical gain.
Both in NZ and abroad, authorities and the public have expressed concern about the impact of misinformation on society. The WHO has identified misinformation as a key threat, as have the UN and OECD (see Appendix 1 for further details). In NZ, the National Security Strategy flags disinformation as a core issue.
This concern is shared by the NZ public. Numerous surveys indicate that people in NZ are increasingly concerned about misinformation,12-17 with one survey reporting that 90% of New Zealanders believe misinformation is affecting public health.17
Misinformation monitoring and research stalled
Last year, the Department of Internal Affairs signalled stronger social media regulation through a new independent regulator, as part of the 'Safer Online Services and Media Platforms' consultation. While misinformation wasn’t a direct focus of the proposal, many submissions urged action in this space. The current government has since abandoned the project with no apparent plans to examine alternatives.
Other initiatives to track and analyse misinformation have also wound down. The Department of the Prime Minister and Cabinet (DPMC) funded Hate and Extremism Insights Aotearoa and UK-based Logically to monitor misinformation in 2023 and 2024, but both initiatives appear to have ended. Similarly, the Disinformation Project, which tracked Covid-19 misinformation and harmful rhetoric, and Tohatoha, which addressed misinformation and the societal impacts of technology, have also shut down.
Without these organisations, NZ faces a gap in understanding and tracking harmful health misinformation online.
By understanding what misleading information is circulating, the front-line health sector is better equipped to respond to concerns from the public and to ensure that accurate, accessible information is available and tailored to relevant audiences.
In 2023, the DPMC convened a Multi-Stakeholder Group to recommend ways to strengthen resilience to disinformation. Their report, released earlier this year, emphasised the need for research specific to the NZ context. The report states:
Additional empirical work and evidence-gathering is critical to building an evidentiary base to guide and explore the benefits and drawbacks of potential interventions. Support for these functions will almost certainly require ongoing resourcing, including from government.
While the DPMC Chief Executive agreed in principle with the report's recommendations, DPMC will not continue supporting research and monitoring efforts—a decision that could be shortsighted.
A continued lack of insight could leave NZ vulnerable to a sudden rise in misinformation and its associated risks to the health of our communities and to social cohesion. A basic investment in mapping and monitoring the ‘information landscape’ of health-related narratives is warranted. Recent surveys highlighting public concern suggest that most New Zealanders would support such efforts.
A 2022 report from the Brainbox Institute, commissioned by DPMC, offers a model of how research and monitoring could be carried out. The report makes a strong case for the establishment and support of “a diverse, multidisciplinary civil society-led institution to conduct ongoing independent analysis of social media-based communications for the purpose of monitoring and analysing potential disinformation and misinformation.” Key design elements of such an institute are listed in Appendix 2.
The importance of having this research and surveillance capacity is underscored by the International Health Regulations, which require signatories, including NZ, to build research capacity for “risk communication including addressing misinformation and disinformation”.18 We need to be ready for the current and next ‘infodemics’.
Beyond supporting research to understand the scale and impact of harmful, inaccurate information in Aotearoa, the Government has other policy levers at its disposal for building resilience. These include initiatives to build information literacy across the population, support for local journalism, and stronger policies around data transparency and accountability for online platforms.19-21
However, the first step is to build a clear understanding of the NZ misinformation landscape and to monitor it in a sustainable way.
What this Briefing adds
- Misleading information can harm the health of individuals and our communities both directly, and indirectly by undermining social cohesion that is needed to respond to common threats, including pandemics and disasters.
- International bodies and the NZ public are concerned about the impact of misinformation.
- The current prevalence and impact of misinformation in NZ is not well defined (although it may be contributing to declining immunisation rates).
- NZ’s current capacity to monitor misinformation is limited, especially following the closure of key organisations carrying out this work.
Implications for policy and practice
- There is a strong case for the NZ Government to invest in a basic level of independent monitoring to identify and highlight emerging misinformation and narratives that could be harmful to public health.
- A NZ capacity to address misinformation will soon be an expectation under international law.
- NZ agencies should also aim to build health information literacy at a population level, bolster local journalism and support international efforts to increase transparency and accountability for large online platforms through which harmful content may be spread.
A one-day workshop on misinformation and public health in the NZ context is being held in February 2025 as part of the University of Otago, Wellington Public Health Summer School.
Author details
Dr John Kerr, Science Lead, Public Health Communication Centre, and Department of Public Health, Ōtākou Whakaihu Waka, Pōneke - University of Otago, Wellington
Dr Michael Daubs, Co-Director, Kōtaha: the Internet, Social Media and Politics Research Lab at Te Herenga Waka - Victoria University of Wellington
Prof Nick Wilson, Co-Director, Public Health Communication Centre, and Department of Public Health, Ōtākou Whakaihu Waka, Pōneke - University of Otago, Wellington
Dr Claudia Schneider, Lecturer, School of Psychology, Speech and Hearing, Te Whare Wānanga o Waitaha - University of Canterbury
Prof Nikki Turner, Waipapa Taumata Rau - University of Auckland and Principal Medical Advisor to the Immunisation Advisory Centre (IMAC)
Prof Michael Baker, Director, Public Health Communication Centre, and Department of Public Health, Ōtākou Whakaihu Waka, Pōneke - University of Otago, Wellington
Appendix 1: Recommendations and excerpts from international reports
| Report | Excerpt |
| --- | --- |
| World Health Organization: Understanding disinformation in the context of public health emergencies: the case of COVID-19 | “WHO has developed various resources to support public health agencies in Member States in better understanding the impact of misinformation… An effective disinformation containment strategy must build on this by developing monitoring and forecasting tools and strategic frameworks to help public health agencies to identify, understand, investigate and plan for emerging trends in disinformation as part of their public health preparedness.” (p 46) |
| World Economic Forum: Global Risks Report 2024, which ranks false information as the number one risk for 2024-2026 | “However, even as the insidious spread of misinformation and disinformation threatens the cohesion of societies, there is a risk that some governments will act too slowly, facing a trade-off between preventing misinformation and protecting free speech…” (p 18) |
| United Nations: Global Principles for Information Integrity 2024 | “To ensure that all States can contribute to and benefit from the information ecosystem, urgent and sustained initiatives are needed to … strengthen their capacities to adequately address risks in information spaces, while respecting human rights.” (p 34) |
| OECD: Building Trust and Reinforcing Democracy 2022 | “Monitoring public and open channels and platforms to identify problematic content and emerging narratives is a key feature of government efforts to understand emerging mis- and disinformation narratives and to develop effective communication responses.” (p 18) |
Appendix 2: Key design principles for a civil society group tasked with social media analysis, from the Brainbox Institute (p. 10)
Full integration into civil society
The priorities and direction of the proposed organisation should be informed by those of many groups and communities across civil society; it should work in ways that are transparent and understandable to civil society, and it should produce outputs that facilitate a civil societal response where possible. It must be able to work effectively with, and within, other civil society organisations and networks.
Data access as a priority
It must leverage the full opportunities available for civil society researchers to acquire data from all the platforms and online spaces relevant to illicit online manipulation, not just the largest and most easily accessible.
Cross-platform focus
It should conduct monitoring and research across a range of platforms. It is understood that disinformation efforts frequently occur across a number of platforms, often functionally separated for the purposes of planning, coordination and execution. Detections made on one platform may present either data collection opportunities on another platform, or input into the detection methodology on another platform.
Continual self-assessment and development
It must improve its own capabilities as it learns, identifying new techniques and developing the necessary tools to conduct effective investigations. Access to technological development capabilities will be essential, as will a cyclical structure that allows for past research to inform future planning, new conceptual frameworks, and innovative ways to respond to the challenges outlined in this report.
Explicability at all stages
This group should build confidence not only in its final outputs, but in the processes and systems used to generate them. As such, in addition to making sure its public-facing output is as clear and easy to understand as possible, it should have a visualisation and analysis function wherever the machine-driven parts of its detection system produce human-readable output or require manual intervention.
Insight from all sources
Rather than focusing entirely on detecting and analysing online information, it should draw insights from any area that may be useful; particularly traditional psychology, sociology, and representatives from communities that may be particularly vulnerable to disinformation campaigns.
Te Ao Māori centrality
In order to make sure Te Tiriti obligations are met, mātauranga Māori is respected, and the group maintains credibility with indigenous communities which tend to have lower confidence in government, Māori scholars and community representatives must be a core pillar of its structure from the beginning.