Monday, October 31, 2016

SCIENTIFIC ILLITERACY

Sci Ed is a PLOS Blog devoted to "Diverse perspectives on science and medicine". Mike Klymkowsky, professor of Molecular, Cellular, and Developmental Biology at the University of Colorado Boulder, has recently posted on Sci Ed a very interesting article on science literacy, "Recognizing scientific literacy & illiteracy". Inspired by a National Academies report, "Science Literacy: Concepts, Contexts, and Consequences", Professor Klymkowsky poses a provocative question: "can we recognize a scientifically illiterate person from what they write or say?". This is a slightly mischievous question, because it implicitly suggests that distinguishing between scientifically literate and illiterate persons is not that easy.

What makes for scientific literacy is not what one knows, but how one knows it. Science, argues Mike Klymkowsky, is a perspective more than a body of specific knowledge. The article lists two main criteria for assessing scientific literacy. First, scientific literacy implies the capacity to understand scientific questions and to recognize what an adequate answer should contain (which is, note well, not the "right answer" but the right format of the answer).

Second, scientific literacy means the capacity "to recognize the limits of scientific knowledge; this includes an explicit recognition of the tentative nature of science, combined with the fact that some things are, theoretically, unknowable scientifically". Science is made of local perspectives; any perspective that aims to be universal and total cannot be scientific (which does not imply that it is wrong or false, but simply that it belongs to a different register).

Finally, Mike Klymkowsky addresses an important issue: scientific illiteracy within the scientific community. Paradoxically enough, it is not rare for the scientific community itself to show forms of scientific illiteracy. How is this possible? Mike Klymkowsky thinks it is chiefly due to the "highly competitive, litigious, and high stakes environment" in which most scientists operate. This situation often leads them to make claims that are overarching and self-serving. In other words, driven by an overly competitive environment, scientists tend to draw unjustified conclusions from their empirical findings in order to better market their results.

The article ends by posing the question of "how to best help our students avoid scientific illiteracy". The conclusion is that there is no clear answer to this question, other than trying to establish "a culture of Socratic discourse (as opposed to posturing)". Such a culture could be summarized, according to the author, as an ongoing attempt to understand "what a person is saying, what empirical data and assumptions it is based on, and what does it imply and or predict".

Curiosity and ongoing inquiry can help to prevent scientific illiteracy, yet there are two other aspects of the Socratic approach that are even more essential to scientific discourse: self-irony and a sense of transcendence. These two elements are closely interlaced, because both are rooted in the deep conviction that truth always lies slightly beyond human reach. Socrates is not a relativist, as some commentators have erroneously argued; rather, he is aware that humans can approach truth only asymptotically. This awareness prevents any form of scientific arrogance, the real origin of scientific illiteracy.

Thursday, October 13, 2016

Mandatory Vaccinations

In the Oct 6 issue of The New York Times, Christopher Mele devotes an interesting article to risk communication in emergencies. In a nutshell, his argument is that if one aims to communicate risks, one also needs to evoke fear. Mele’s point of departure is the recent evacuation of Florida, Georgia, North Carolina, and South Carolina residents due to Hurricane Matthew. He reports that “even after all of the best practices in emergency communications are exhausted, 5 percent of the population will most likely remain in harm’s way, experts and researchers said”. Actually, this figure is likely to be over-optimistic: during Hurricane Sandy in 2012, for instance, 49% of coastal residents who were under mandatory evacuation did not evacuate.

In 2014, a team from the State University of New Jersey (Rutgers) led by Dr. Cara Cuite carried out a study on “Best Practices in Coastal Storm Risk Communication”, concluding that effective communication should “stress the possibility that people who do not evacuate could be killed”. This is better done by using implicit messages rather than direct, explicit ones. For instance, if authorities ask people who do not evacuate to fill out a form on how to notify their next of kin, they communicate very effectively the actual possibility “that people who do not evacuate could be killed” without the need to warn them explicitly. Another important lesson concerns semantics, that is, the specific words chosen to communicate. In most cases, mandatory evacuation is excluded, since there is no way to enforce it. Yet experts know very well that “a voluntary evacuation will have a lower rate of compliance than one labeled mandatory”. It is therefore critical to avoid the expression “voluntary evacuation” and to “make it clear that residents are being ordered to leave, even if no one is going to remove them forcibly from their homes”.

It is possible to draw from Mele’s arguments two general rules concerning risk communication, which could also be applied in other situations, notably outbreaks.


First, in contrast to the standard risk communication account, one should focus on emotional responses rather than on mere rationality. Risk communicators often aim to raise awareness and to provide the public with information, which is in principle laudable and would be effective in an ideal world ruled only by rational choices. Unfortunately, people very rarely make choices on a rational basis, even when they claim to do so. As a matter of fact, in real life “pure rationality” does not exist; it is a fictional concept. Mental processes are an inextricable mix of logical arguments, emotional reactions, implicit and explicit memories, automatisms, and conscious and unconscious processes. Very rarely, if ever, does an action follow a rational decision; more often the so-called “rational decision” is a post-hoc rationalization, used to justify decisions made in more or less irrational ways. There is little one can do to prevent this mechanism, notably in emergencies, when people are asked to make quick and momentous decisions. Among emotions, fear plays a pivotal role as one of the basic emotions that drive human behavior. There are two opposite mistakes one can make in risk communication: overstimulating fear and over-reassuring people. Fear must be fine-tuned.

Second, if being able to deal with emotions is critical in risk communication, two variables become paramount: timing and words. Timing is essential because human emotions fluctuate continuously in each individual and change over time. The same message can evoke completely different reactions according to the emotional context of the receiver, and consequently a message can have very different effects according to the moment in which it is delivered. There is no such thing as “the right message”; rather, there is “the right message at the right moment”.

Words are also very important. I say “words” and not “contents”, because I am referring to the very terms used rather than to the concepts underlying them. Words unavoidably evoke specific emotional reactions, which are, be careful, culture-bound and context-dependent (that is, one should avoid the mistake of thinking that the same words evoke the same reactions always, everywhere, and in everybody). The word “mandatory” is a good example. At least in our society, if something is “mandatory”, for most people it is also important, while if it is “voluntary”, it is not (or less so). So labeling something as “mandatory” does not necessarily imply that one is going to enforce it compulsorily. The term “mandatory” can also be used to convey the importance of an action. This is well illustrated by the wrong (in communication terms) policy of making most vaccinations “voluntary”. To be sure, in democratic societies it is largely unthinkable to vaccinate people, notably children, compulsorily; yet the issue at stake is only in part the balance between voluntariness, persuasion, soft coercion, and compulsion. The words chosen by regulators also communicate the importance of a public measure. Health authorities and policy makers should pay more attention to the communicational implications of wording, even when choices seem to concern purely technical and normative aspects.

Friday, September 30, 2016

Vaccines and Alternative Medicine

The State of Vaccine Confidence 2016: Global Insights Through a 67-Country Survey is the title of a study just published in EBioMedicine. A team from the London School of Hygiene & Tropical Medicine interviewed 65,819 respondents across 67 countries about their attitudes towards vaccines. The study is “the largest survey on confidence in immunization to date”.
The researchers submitted four statements to their sample, asking people to indicate on a Likert scale their degree of agreement or disagreement. The four statements were:
1) "Vaccines are important for children to have."
2) "Overall I think vaccines are safe."
3) "Overall I think vaccines are effective."
4) "Vaccines are compatible with my religious beliefs."

The majority of interviewed people thought that vaccines are important for children but, rather contradictorily, showed lower confidence in vaccine effectiveness and, above all, in vaccine safety. Everywhere, education increased confidence in vaccine importance and effectiveness, but not in safety. Religion did not play a major role in either vaccine acceptance or hesitancy, with the exception of Mongolia, where 50.5% of respondents said vaccines were not compatible with their religion (Buddhism); this is rather odd considering that other Buddhist countries were aligned with the average results (around 8-10% of people thinking that vaccines are hardly compatible with their religion).
Interestingly enough, European countries showed the lowest confidence in vaccine safety, with France the least confident: 41% of respondents in France disagreed with the assertion that vaccines are safe (on average, 12% of respondents in other nations disagreed with this statement). The authors noted that "France recently has experienced 'anxiety' about suspected but unproven links between the hepatitis B vaccine and multiple sclerosis and, separately, the human papillomavirus vaccine and side effects like fatigue in girls". This element is certainly important, yet it hardly explains the findings.
To make the survey more meaningful, I would suggest comparing it with figures on the prevalence of Complementary and Alternative Medicine (CAM). Although in economic terms the largest CAM market is still the US, the fastest-growing market is the European one. Moreover, there is a significant difference between the US and EU markets: while in the US the lion's share is largely taken by chiropractic, in the EU homeopathic and herbal remedies account for the largest part of the market. Homeopathy is particularly popular in France, where it is the leading alternative therapy. The costs of homeopathic products are partially covered by the French National Health System, and the percentage of the French population habitually or sporadically using homeopathy grew from 16% (1982) to 62% (2004). This is mirrored by the attitude of health care professionals. Homeopathy is taught in all major French medical schools as well as in schools of pharmacy, dental schools, veterinary medical schools, and schools of midwifery. According to a 2004 survey, 95% of GPs, dermatologists, and pediatricians consider homeopathy effective and are willing to prescribe it, or to co-prescribe it with conventional medicine. Another survey showed that 94.5% of French pharmacists advise pregnant women to prefer homeopathic products because they are "safer".


Concerns about the safety of medical products are thus wider than vaccine hesitancy, and vaccine hesitancy is probably only the tip of the iceberg. Further research is certainly required to better understand the social, psychological, and economic dynamics that underlie this phenomenon. Yet one element is already self-evident: appealing to scientific arguments to convince people to vaccinate is a pure waste of time if, at the same time, the whole social fabric welcomes pseudo-scientific practices among recognized medical treatments.

Monday, September 12, 2016

Selective Gullibility

PLOS Currents Outbreaks has just published a study, "Lessons from Ebola: Sources of Outbreak Information and the Associated Impact on UC Irvine and Ohio University College Students", authored by a team of researchers from the University of California, Irvine, led by Miryha G. Runnerstrom. The authors carried out an online survey of 797 undergraduates at the University of California, Irvine (UCI) and Ohio University (OU) during the peak of the 2014 Ebola virus disease (EVD) outbreak. The researchers aimed to identify the main sources of information about the outbreak and their impact in four areas: knowledge, attitudes, beliefs, and stigma.

The results are rather interesting. Students' main sources of information were news media (34%) and social media (19%). As one might expect, only a few students sought information on official government and public health institution websites. However, this small minority (11%) was better informed and had more positive attitudes towards those infected. The authors conclude that "information sources are likely to influence students’ knowledge, attitudes, beliefs, and stigma relating to EVD", which is undoubtedly true, but it is more a truism than an actual conclusion. Actually, the study tells us something more.

There are at least three thought-provoking elements that emerge from this survey. The first concerns risk perception. The large majority of participants (81%) perceived a low personal risk of contracting the infection. They were definitely right: given that the total number of Ebola cases in the US in 2014 was 11, the theoretical risk for an individual in the US of being infected with Ebola was about 11 in 319,000,000, even lower than the risk of being killed by a meteorite, which was calculated in 2014 by earth sciences professor Stephen A. Nelson to be 1 in 1,600,000. Yet, during the 2014 Ebola outbreak, alarm spread all over the US, and public health communicators today openly speak of hysterical media coverage. Eric Boehlert, an influential blogger and writer for Media Matters for America, wrote: "It's almost like they're crossing their fingers for an outbreak (…) CNN actually invited onto the network a fiction writer who wrote an Ebola thriller in the 1980s to hype unsubstantiated fears about the transmission of the virus. CNN's Ashleigh Banfield speculated that 'All ISIS would need to do is send a few of its suicide killers into an Ebola-affected zones and then get them on some mass transit, somewhere where they would need to be to affect the most damage.' And colleague Don Lemon lamented that government officials seemed 'too confident' they can contain the Ebola scare". This is the first interesting element: notwithstanding the media hype, most students did not perceive a true health risk to themselves.
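
To make the comparison concrete, here is the back-of-the-envelope arithmetic (assuming, as the article's figure implies, a 2014 US population of roughly 319,000,000):

\[
P_{\text{Ebola}} \approx \frac{11}{319{,}000{,}000} \approx 3.4 \times 10^{-8} \approx \frac{1}{29{,}000{,}000},
\qquad
P_{\text{meteorite}} \approx \frac{1}{1{,}600{,}000} \approx 6.3 \times 10^{-7}
\]

On these numbers, the individual Ebola risk was roughly eighteen times smaller than Nelson's meteorite odds; if anything the gap is understated, since the meteorite figure is a lifetime estimate while the Ebola figure covers a single year.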

The second interesting element concerns knowledge. The researchers complain that half the sample (51%) showed poor or inexact knowledge of the disease, its means of transmission, its contagiousness, and its symptoms. I would argue differently. It seems to me that, considering the inaccurate and emphatic media coverage, the fact that 49% of participants had sufficiently correct knowledge of the infection and its dynamics is a positive surprise. It confirms that sensationalist information reaches its target but does not necessarily penetrate it. This immediately raises a question: why does this happen? What are the main variables at stake in determining the outcome of a sensationalist information campaign?



The third element, which concerns misinformation, provides a clue to answering this question. Asked whether they agreed with the statement “Ebola is a government conspiracy created to get rid of a particular race”, 89% of students answered that they disagreed. Yet approximately one third of all participants thought that “There is a cure for Ebola but the government is keeping it from the public”. This answer is particularly astonishing considering that the respondents were university students, that is to say, a population that should in principle possess the intellectual means to debunk trivial conspiracy theories. What does this mean? It probably means that distrust towards politicians is at such a level that people look for any excuse to accuse them. In other words, this is a phenomenon that has little to do with public health information. As a general rule, a piece of information becomes credible, ceteris paribus, to the extent that it meets people's expectations and imaginary. If people wish to see evidence of politicians' dishonesty, any piece of information cooked with this ingredient becomes immediately palatable. But this does not imply that other pieces of information are passively accepted.

This study confirms the universal tendency towards selective gullibility. We are always ready to believe whatever confirms our beliefs and narratives, especially our biases, even when it is apparently incredible. Ironically enough, cynics are the most prone to this peculiar form of self-deception.

Thursday, September 1, 2016

Science Communication and The Impostor Syndrome

"Why scientists are losing the fight to communicate science to the public" is the title chosen by Richard Grant for his editorial on science communication, published on The Guardian of August 23.  At the same time, Pediatrics, the official journal of the American Academy of Pediatrics, published the results of a US wide survey among pediatricians focusing on parents' refusal to vaccinate their children. The surveyed population was a population already surveyed in 2006, so allowing a temporal comparison.

The results are appalling: in a decade, the proportion of pediatricians reporting parental vaccine refusals increased from 74.5% to 87.0%. Pediatricians reported that parents increasingly refuse vaccinations because they believe them unnecessary; moreover, 75.0% of pediatricians reported that parents delay vaccines out of concern about discomfort, and 72.5% indicated that parents delay out of concern for immune system burden. In brief, not only have ten years of public health communication aimed at increasing vaccine acceptance failed to achieve this goal, at least in the US, but there is even the suspicion that they have been counterproductive.

Richard Grant mentions two main reasons why scientists are losing the fight to communicate science: first, because scientists tell people how to live their lives, and second, because they don't listen to people. Commenting on Grant's article, Ian Mackay, in his blog Virology Down Under, adds a third, maybe deeper, reason: scientists are usually "supercilious". Of course, among scientists there are people of all kinds: disdainful and humble, arrogant and unpretentious. That's right, but what Mackay means is a bit subtler than a naïve psychological profile of the average scientist. Mackay argues that "imposter syndrome for an academic can drive the need to sound just as supercilious as our peers". "Impostor syndrome" is a term that describes the peculiar condition of successful individuals who, notwithstanding their successes and despite their true competence, fear being considered impostors. Although they are truly high-achieving, they have the inner sensation of being deceivers. In other words, according to Mackay, scientists are arrogant insofar as they need to reassure themselves. Arrogance is their way of saying "I'm part of the (scientific) tribe, I'm not a fraud".

Mackay's argument is sound and, as a psychiatrist, I tend to agree with him. Yet I would like to go a bit further. During my clinical practice, I happened to meet patients suffering from delusional beliefs. In a dialogue with someone suffering from delusions, one can try to keep the conversation away from delusional contents, at least during the initial interview and the early treatment. Yet this strategy cannot be followed forever; sooner or later the moment arrives in which one has to discuss the patient's delusions. When this occurs, it is not rare for the patient to pose a simple question: "do you believe me?" We all know that questions are never simple questions. Yet questions are also simple questions, and psychiatrists should avoid playing the old, ridiculous game of interpreting questions or answering with another question. Answering questions is a matter of professional honesty and respect for the patient.

When a delusional patient asked me whether I believed his narrative, I used to reply: "No, I think you are wrong. That said, I am aware that there are more things in heaven and earth than are dreamt of in my science. So, at least inside this room, I am suspending any definitive judgment and I will listen to you without too many preconceptions. I will accept the, theoretical, possibility that you are right". Was it only a rhetorical expedient? No, it wasn't. Of course, there was a lot of rhetoric in my statement, yet it would not have worked had it been only rhetoric. At the end of the day, either the psychiatrist truly accepts the risk of becoming insane with his patient, or he will forever be precluded from a real understanding of the patient's inner world.

Similarly, scientists involved in science communication should sincerely accept the possibility that they are wrong; this is the only hope they have of changing people's minds. Is there any prerequisite for adopting this strategy? Yes, there is. You must feel solid enough in your knowledge and your world-view to accept a true, actual challenge. Ultimately, this is the implicit, reassuring message that you give someone when you accept their extreme challenge.

So, arrogance is not only the reason why scientists often dictate rules and don't listen to people; it is also a sign that they are not as confident in "their" science as they pretend to be. Is this only due to the "impostor syndrome"?