The recently proposed “no jab, no pay” policy focused on influencing the approximately 2% of Australian parents registered as refusing vaccinations for a variety of non-medical reasons. While this proportion has grown in the last decade, other numbers are of far greater concern.

In Australia, half of parents have concerns about the safety of vaccines, and surveys from Australia, the United States and Canada indicate that between 20% and 33% of adults believe that vaccines can cause autism. Even though the link has been thoroughly and repeatedly debunked in the 17 years since it first appeared, this kind of persistent misinformation does influence vaccine hesitancy and refusal. With online social networks becoming the norm for information sharing within communities, misinformation can be amplified in hours, not months or years.

Step 1: Measure the size of the problem

In our research, we combine data-mining and social science to better understand misinformation. We use Twitter as a kind of social laboratory to examine how people share opinions about the human papillomavirus (HPV) vaccine. Since October 2013, we have collected over 200,000 tweets and mapped out the network of social connections among all the users who tweeted about the vaccines.

Yesterday, we published a new study in the Journal of Medical Internet Research, looking at 83,551 tweets about HPV vaccines posted within a six-month period. Just over 24% of those were negative, rejecting the safety or value of the vaccines. Examples include stories about young girls who suffered from serious medical conditions after receiving the vaccine, claims that a lead developer believes that the vaccine will not prevent cancer, and a whole range of causation-correlation confusion and data manipulation.

The main result of our study was to confirm the presence of a strong echo chamber effect for HPV vaccines. Users who were more often exposed to negative opinions were then much more likely to post negative opinions. These users also tended to inhabit sections of the Twitter ecosystem that were largely isolated from scientists, clinical evidence and public health organisations. The implications can be described more simply: for the vast majority of us, the social connections we choose largely dictate the information we see as well as the opinions we express.
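The exposure measure behind this result can be sketched in simple terms: for each user, count the tweets posted by accounts they follow and compute the fraction that were negative. The code below is an illustrative sketch only, not the study's actual pipeline; the function name, data shapes, and the toy accounts (`skeptic1`, `health_org`, etc.) are all assumptions for demonstration.

```python
# Hypothetical sketch of an "exposure to negative opinion" measure.
# A user's exposure is the share of negative tweets among all tweets
# posted by the accounts that user follows. Data is illustrative.

from collections import defaultdict

def exposure_to_negative(follows, tweets):
    """follows: {user: set of accounts they follow}
    tweets: list of (author, is_negative) pairs
    Returns {user: fraction of followed tweets that were negative}."""
    seen = defaultdict(int)      # tweets each user was exposed to
    negative = defaultdict(int)  # of those, how many were negative
    for user, followed in follows.items():
        for author, is_negative in tweets:
            if author in followed:
                seen[user] += 1
                if is_negative:
                    negative[user] += 1
    return {u: negative[u] / seen[u] for u in seen}

# Toy example: alice follows two mostly negative accounts,
# bob follows a public health account posting positive tweets.
follows = {"alice": {"skeptic1", "skeptic2"}, "bob": {"health_org"}}
tweets = [("skeptic1", True), ("skeptic2", True),
          ("skeptic2", False), ("health_org", False)]
print(exposure_to_negative(follows, tweets))  # alice ≈ 0.67, bob = 0.0
```

A measure like this, combined with each user's own posted opinions, is what lets a study test whether higher exposure to negative content predicts posting negative content, which is the echo chamber effect described above.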

Step 2: Work with communities at risk of misinformed opinions

The results of our study are consistent with the results of a recent study examining what Facebook users see in their timelines, showing that the way we structure our own social networks is the most important factor in limiting exposure to attitude-challenging information. Conflicts that could increase the polarisation in our social networks might further drive a wedge between clinical research and large sections of the broader community. This could increase the level of distrust in public health interventions and make it much harder to recover from any future safety scares that could gain traction in the media.

Instead, we should engage with the fence sitters — those at risk of becoming the refusers of all or some vaccines, or those who plan to vaccinate but simply want their concerns and questions addressed. And because vaccine rejection is a community phenomenon wound up in social norms, a useful way to do this is to support local advocates so that they can be effective voices for vaccination in a way that speaks the language of those communities.

Step 3: Help people evaluate what they read

By measuring misinformation online, we know that there is a range of different concerns about HPV vaccines. Stories of teenage girls facing health issues after vaccination are posted on blogs mixed with advertising for gun rights. Homeopathy businesses are connected to conspiracy theorists concerned with the profiteering of pharmaceutical companies. These stories appear to affect people across the political and educational spectrum, so we should be careful not to assume there is a one-size-fits-all solution to addressing vaccine hesitancy.

We need to provide the tools that many of us in science take for granted in our professional lives — tools to recognise false balance in the media, to evaluate the credibility of health information online, to reflect on our own personal biases, to separate temporal from causal association, and to support parents’ decision-making in a way that communicates science honestly.

Scientists, governments, science communicators and journalists should work together to improve the quality of information available online. This includes translating research without distortion or exaggeration and providing access to the full texts of studies being reported. Given the centrality of news organisations in the networks we measure, it is clear that the media need to wield their influence with finesse and care.

Four-word policies that punish people for misinformed opinions create further conflict with a very small proportion of people who are unlikely to change their decisions. The risk associated with this kind of policy is that it could drive a larger wedge between public health practices based on clinical evidence and the kind of science denialism that is prevalent well beyond the 2% who refuse to vaccinate. To help guide parents away from these oubliettes of misinformation, we need to do much more to bridge the divide that separates evidence and public opinion.

*Adam G. Dunn is a Senior Research Fellow, Centre for Health Informatics, Australian Institute of Health Innovation, Macquarie University. Julie Leask is Associate Professor, School of Public Health, The University of Sydney.*