Friday, September 30, 2016

Vaccines and Alternative Medicine

The State of Vaccine Confidence 2016: Global Insights Through a 67-Country Survey is the title of a study just published in EBioMedicine. A team from the London School of Hygiene & Tropical Medicine interviewed 65,819 respondents across 67 countries about their attitudes towards vaccines. The study is "the largest survey on confidence in immunization to date".
The researchers presented four statements to their sample, asking respondents to indicate their degree of agreement or disagreement on a Likert scale. The four statements were:
1) "Vaccines are important for children to have."
2) "Overall I think vaccines are safe."
3) "Overall I think vaccines are effective."
4) "Vaccines are compatible with my religious beliefs."

The majority of those interviewed thought that vaccines are important for children but, rather contradictorily, showed lower confidence in vaccine effectiveness and, above all, in safety. Everywhere, education increased confidence in vaccine importance and effectiveness but not in safety. Religion did not play a major role in either vaccine acceptance or hesitancy, with the exception of Mongolia, where 50.5% of respondents said vaccines were not compatible with their religion (Buddhism). This is rather odd considering that other Buddhist countries were aligned with the average results (around 8-10% of respondents considering vaccines hardly compatible with their religion).
Interestingly enough, European countries showed the lowest confidence in vaccine safety, with France the least confident: 41% of respondents in France disagreed with the assertion that vaccines are safe (on average, 12% of respondents in other nations disagreed with this statement). The authors noted that "France recently has experienced 'anxiety' about suspected but unproven links between the hepatitis B vaccine and multiple sclerosis and, separately, the human papillomavirus vaccine and side effects like fatigue in girls". This element is certainly important, yet it hardly explains the findings.
To make the survey more meaningful, I would suggest comparing it with figures on the prevalence of Complementary and Alternative Medicine (CAM). Although in economic terms the largest CAM market is still the US, the fastest-growing market is the European one. Moreover, there is a significant difference between the US and EU markets: while in the US the lion's share is taken by chiropractic, in the EU homeopathic and herbal remedies account for the largest part of the market. Homeopathy is particularly popular in France, where it is the leading alternative therapy. The cost of homeopathic products is partially covered by the French National Health System, and the percentage of the French population habitually or occasionally using homeopathy grew from 16% (1982) to 62% (2004). This is mirrored by the attitude of health care professionals. Homeopathy is taught in all major French medical schools, as well as in schools of pharmacy, dental schools, veterinary medical schools, and schools of midwifery. According to a 2004 survey, 95% of GPs, dermatologists and pediatricians consider homeopathy effective and are willing to prescribe it, or to co-prescribe it with conventional medicine. Another survey showed that 94.5% of French pharmacists advise pregnant women to prefer homeopathic products because they are "safer".


Concerns about the safety of medical products are thus wider than vaccine hesitancy, and vaccine hesitancy is probably only the tip of an iceberg. Further research is certainly required to better understand the social, psychological, and economic dynamics that underlie this phenomenon. Yet one element is already self-evident: appealing to scientific arguments to convince people to vaccinate is a pure waste of time if, at the same time, the whole social fabric welcomes pseudo-scientific practices among recognized medical treatments.

Monday, September 12, 2016

Selective Gullibility

PLOS Currents Outbreaks has just published a study, "Lessons from Ebola: Sources of Outbreak Information and the Associated Impact on UC Irvine and Ohio University College Students", authored by a team of researchers from the University of California, Irvine, led by Miryha G. Runnerstrom. The authors carried out an online survey of 797 undergraduates at the University of California, Irvine (UCI) and Ohio University (OU) during the peak of the 2014 Ebola virus disease (EVD) outbreak. The researchers aimed to identify the students' main sources of information about the outbreak and the associated impact in four areas: knowledge, attitudes, beliefs, and stigma.

Results are rather interesting. Students' main sources of information were news media (34%) and social media (19%). As one could expect, only a few students sought information on official government and public health institution websites. However, this small minority (11%) was better informed and had more positive attitudes towards those infected. The authors conclude that "information sources are likely to influence students’ knowledge, attitudes, beliefs, and stigma relating to EVD", which is undoubtedly true, but it is more a truism than an actual conclusion. Actually, the study tells us something more.

There are at least three thought-provoking elements that emerge from this survey. The first concerns risk perception. The large majority of participants (81%) perceived a low personal risk of contracting the infection. They were definitely right, given that the total number of Ebola cases in the US in 2014 was 11; that is to say, the theoretical risk for an individual in the US of being infected by Ebola was about 11 in 319,000,000, even lower than the risk of being killed by a meteorite, which was calculated in 2014 by earth sciences professor Stephen A. Nelson to be 1 in 1,600,000. Yet, during the 2014 Ebola outbreak, alarm spread all over the US, and public health communicators openly spoke of hysterical media coverage. Eric Boehlert, an influential blogger and writer for Media Matters for America, wrote: "It's almost like they're crossing their fingers for an outbreak (…) CNN actually invited onto the network a fiction writer who wrote an Ebola thriller in the 1980s to hype unsubstantiated fears about the transmission of the virus. CNN's Ashleigh Banfield speculated that 'All ISIS would need to do is send a few of its suicide killers into an Ebola-affected zones and then get them on some mass transit, somewhere where they would need to be to affect the most damage.' And colleague Don Lemon lamented that government officials seemed 'too confident' they can contain the Ebola scare." This is the first interesting element: notwithstanding the media hype, most students did not perceive a real health risk to themselves.
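As a rough back-of-the-envelope check of the figures quoted above (taking the 11 reported US cases and a US population of roughly 319 million at face value, and comparing them directly with Nelson's estimate):

\[
\frac{11}{319{,}000{,}000} \approx 3.4 \times 10^{-8} \approx \frac{1}{29{,}000{,}000}
\quad \text{vs.} \quad
\frac{1}{1{,}600{,}000} \approx 6.3 \times 10^{-7}
\]

On these numbers, the quoted Ebola risk is roughly eighteen times smaller than the meteorite estimate, which is consistent with the comparison made above.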

The second interesting element concerns knowledge. The researchers complain that half of the sample (51%) showed poor or inaccurate knowledge of the disease, its means of transmission, its contagiousness, and its symptoms. I would argue differently. It seems to me that, considering the inaccurate and overblown media coverage, the fact that 49% of participants had a sufficiently correct knowledge of the infection and its dynamics is a positive surprise. It confirms that sensationalist information reaches its target but does not necessarily penetrate it. This immediately raises a question: why does this happen? What are the main variables at stake in determining the outcome of a sensationalist information campaign?



The third element that emerges from this study provides a clue to answering this question; it concerns misinformation. Asked whether they agreed with the statement "Ebola is a government conspiracy created to get rid of a particular race", 89% of students answered that they disagreed. Yet approximately one third of all participants thought that "There is a cure for Ebola but the government is keeping it from the public". This answer is particularly astonishing considering that the respondents were university students, that is to say, a population that should in principle possess the intellectual means to debunk trivial conspiracy theories. What does this mean? It probably means that distrust of politicians is at such a level that people look for any excuse to accuse them. In other words, this is a phenomenon that has little to do with public health information. As a general rule, a piece of information becomes credible, ceteris paribus, to the extent that it meets people's expectations and imagination. If people wish to see evidence of politicians' dishonesty, any piece of information cooked with this ingredient immediately becomes palatable. But this does not imply that other pieces of information are passively accepted.

This study confirms the universal tendency towards selective gullibility. We are always ready to believe what confirms our beliefs and narratives, especially our biases, even when it seems incredible. Ironically enough, cynics are the most prone to this peculiar form of self-deception.

Thursday, September 1, 2016

Science Communication and The Impostor Syndrome

"Why scientists are losing the fight to communicate science to the public" is the title chosen by Richard Grant for his editorial on science communication, published on The Guardian of August 23.  At the same time, Pediatrics, the official journal of the American Academy of Pediatrics, published the results of a US wide survey among pediatricians focusing on parents' refusal to vaccinate their children. The surveyed population was a population already surveyed in 2006, so allowing a temporal comparison.

Results are appalling: in a decade, the proportion of pediatricians reporting parental vaccine refusals increased from 74.5% to 87.0%. Pediatricians reported that parents are increasingly refusing vaccinations because they believe vaccines are unnecessary; moreover, 75.0% of pediatricians reported that parents delay vaccines because of concern about discomfort, and 72.5% indicated that parents delay because of concern about immune system burden. In short, not only has a decade of public health communication aimed at increasing vaccine acceptance failed to achieve this goal, at least in the US, but there is even the suspicion that it has been counterproductive.

Richard Grant mentions two main reasons why scientists are losing the fight to communicate science: first, because scientists tell people how to live their lives, and second, because they don't listen to people. Commenting on Grant's article, Ian Mackay, in his blog Virology Down Under, adds a third, maybe deeper, reason: scientists are usually "supercilious". Of course, among scientists there are people of all kinds, disdainful and humble, arrogant and unpretentious. That's right, but what Mackay means is a bit subtler than a naïve psychological profile of the average scientist. Mackay argues that "imposter syndrome for an academic can drive the need to sound just as supercilious as our peers". The "impostor syndrome" is a term that describes the peculiar condition of successful individuals who, notwithstanding their successes and despite their true competence, fear being exposed as impostors. Although they are truly high-achieving, they have the inner sensation of being deceivers. In other words, for Mackay, scientists are arrogant insofar as they need to reassure themselves. Arrogance is their way of saying "I'm part of the (scientific) tribe, I'm not a fraud".

Mackay's argument is sound and, as a psychiatrist, I would tend to agree with him. Yet I would like to go a bit further. During my clinical practice, I happened to meet patients suffering from delusional beliefs. In a dialogue with someone who is suffering from delusions, one can try to keep the conversation away from delusional contents, at least during the initial interview and the early phase of treatment. Yet this strategy cannot be followed forever; sooner or later the moment arrives when one has to discuss the patient's delusions. When this occurs, it is not rare for the patient to pose a simple question: "Do you believe me?" We all know that questions are never simple questions. Yet questions are also simple questions, and psychiatrists should avoid playing the old, ridiculous game of interpreting questions or answering with another question. Answering questions is a matter of professional honesty and respect for the patient.

When a delusional patient asked me whether I believed his narrative, I used to reply: "No, I think you are wrong. That said, I am aware that there are more things in heaven and earth than are dreamt of in my science. So, at least inside this room, I am suspending any definitive judgment and I will listen to you without too many preconceptions. I will accept the theoretical possibility that you are right." Was it only a rhetorical expedient? No, it wasn't. Of course, there was a lot of rhetoric in my statement, yet it would not have worked if it were only rhetoric. At the end of the day, either the psychiatrist truly accepts the risk of going insane with his patient, or he will forever be precluded from a real understanding of the patient's inner world.

Similarly, scientists involved in science communication should sincerely accept the possibility that they are wrong; this is the only hope they have of changing people's minds. Is there any prerequisite for adopting this strategy? Yes, there is. You must feel solid enough in your knowledge and your worldview to accept a true, actual challenge. In the end, this is the implicit, reassuring message that you give to someone else when you accept his extreme challenge.

So, arrogance is not only the reason why scientists often dictate rules and don't listen to people; it is also a sign that they are not as confident in "their" science as they pretend to be. Is this only due to the "impostor syndrome"?