The essay below is in response to the following question (fair warning: this is a long post):
It has been argued that contemporary science is socially (re)contextualised, with porous boundaries between institutions of science and those of wider society, and open to public dialogue. Describe how these claims apply to two of the five issues below and assess how the prevailing social conditions of science affect scientists' performance on those selected issues:
- call for improved integrity and accountability systems to reduce scientific misconduct
- claims that science is losing public trust and needs to work harder to regain and maintain it
Within the last century science has moved out of the world of elite institutions and into everyday life. This is largely due to the success of scientific endeavour: we regularly use items that, as Brian Cox has said, would in 1912 have been considered the cusp of magic (Cox & Ince, 2012).
While it may be said that science has been re-contextualised to reflect contemporary society, it is unclear whether the scientific community has followed. In this essay I will argue that for science, and the communication of it, to evolve, it must embrace the technology it invented.
I will examine the issues surrounding peer review, scientific misconduct and the current systems of accountability, along with the pressure on scientists to publish and how this is being challenged from within the scientific community. I will then turn to falling trust in scientific institutions, how science communication affects this, and what measures are being taken to counter it.
Improved integrity and accountability systems to reduce scientific misconduct:
It can be argued that modern “peer” review began with the formation of the Royal Society in the 1600s. Robert Boyle’s insistence on repeated experimentation and the recording of all results, both successful and unsuccessful, created the blueprint for scientific research used to this day.
Indeed, science continues to rely upon peer review to bestow legitimacy. When Emmanuel Priori and Robert Conrad were asked, in 1946, to decide how the US federal government could best support university research without impinging on academic freedom, they advocated peer review. According to D. Allan Bromley, President Truman found this difficult to accept, believing it could create a situation “where the pigs decided who gets into the trough” (Bromley, 2002).
While peer review remains the best available system for evaluating science, Truman had a point: not all peers are equal.
For example, a recent study on the effect of GM maize on rats, published in Food and Chemical Toxicology, found that female mortality was two to three times higher than in controls, mostly due to large mammary tumours and disabled pituitaries. It also found that males had liver congestion, necrosis, severe kidney nephropathies and large palpable tumours (Séralini, 2012).
As the results were released to the press under embargo, journalists were unable to verify the data with other scientists before the news conference. Within hours of publication, however, scientists and science enthusiasts from around the world had dissected the paper and uncovered serious problems with it, most notably that some of the GM test groups were healthier than the controls.
The Séralini study passed peer review and remains un-retracted at Food and Chemical Toxicology, despite journals such as New Scientist detailing the problems with the paper and linking to other, more comprehensive studies (MacKenzie, 2012). It has been referenced by numerous newspapers and anti-GM groups to back up their assertions that GM causes cancer.
This is not the first time that those outside the traditional system have found problems with peer-reviewed papers. In December 2010 the journal Science published a NASA research article online: “A Bacterium That Can Grow by Using Arsenic Instead of Phosphorus” (Wolfe-Simon, 2011).
Within days the study had been ripped apart. Rosie Redfield, in her analysis of the paper on her blog RRResearch, concluded, “I don't know whether the authors are just bad scientists or whether they're unscrupulously pushing NASA's 'There's life in outer space!' agenda. I hesitate to blame the reviewers, as their objections are likely to have been overruled by Science's editors in their eagerness to score such a high-impact publication.” (Redfield, 2010)
Redfield highlights one of the main issues with peer review: in the rush to publish a headline-grabbing paper, scientific accuracy was sacrificed.
In both of the cited cases, science bloggers analysed the journal-reviewed papers and detailed the problems with them. In the case of Wolfe-Simon et al.'s paper, however, Wolfe-Simon refused to engage in dialogue outside the peer-review system, saying, "The items you are presenting do not represent the proper way to engage in a scientific discourse and we will not respond in this manner." (Zimmer, 2010)
This insistence that journal peer review is superior to all other forms of review is under pressure. Websites such as Retraction Watch now monitor papers post-publication for retractions. Its co-founder, Ivan Oransky, believes that with today's technology, any time you find the original piece of content you should be able to find the correction or the retraction (Hoppenhaus, 2012).
Currently, if a paper is retracted, there is negligible impact. In a preliminary analysis of 1,112 retracted papers from 1997–2009, John Budd (School of Education, University of Missouri) found that the papers were still being cited, with only 4% of the citations mentioning the retractions (Noorden, 2011).
Retractions and corrections lie at the heart of science. Scientists are human and, consequently, make mistakes. There should be no stigma attached to notifying journals that further investigation has yielded different results and the original paper should be amended to reflect this new information.
However, there is heavy competition between academics to secure tenure and funding, with the battle fought on the field of publish or perish. This pressure can lead to secrecy and misconduct within the scientific profession. Daniele Fanelli (University of Edinburgh), who studies research misconduct, believes forms of indecency and sabotage are likely to be common, from vindictive peer review and dishonest reference letters to withholding key aspects of protocols from colleagues or competitors (Maher, 2010).
If Fanelli is correct and these forms of misconduct are common, then science ultimately suffers as a result. An atmosphere where you cannot trust your peers to assist or support you is not conducive to acknowledging when mistakes have been made.
Academic success relies on high-impact publications, rather than on a continuous stream of high quality research (Harley & Acord, 2011). This may change with the increasing influence of the internet. If journals lose their grip on disseminating data, the need to secure high-profile publications may diminish.
There is a growing movement calling for open access to data, and there have already been incidents of academic “piracy”. The late Aaron Swartz was arrested in July 2011 for downloading over four million articles from JSTOR. While he denied any wrongdoing, the incident precipitated the release, in September 2011, of JSTOR’s public-domain content.
Dr Ben Goldacre, a proponent of open access, launched a blistering critique of the state of medical science in his recent book Bad Pharma. He argues that publication bias is endemic and dangerous: because negative results are rarely published, “the entire medical and academic community, around the world, when we pool the evidence to get the best possible view of what works, we are completely misled” (Goldacre, 2012). He is currently campaigning for pharmaceutical trials to be registered, so that negative data can be captured along with positive results.
There are, however, significant obstacles to open access. Goldacre himself noted in a 2011 blog post on the JSTOR case, “One major problem with the current publishing model is that it’s hard to give access for free to the motivated public, while still gathering income from institutions.” (Goldacre, 2011)
While there are problems in implementing open access, leaving scientific knowledge hidden behind pay-walls is no longer acceptable to many in the industry. The Wellcome Trust, for example, will no longer give grants to researchers who fail to make their results freely available to the public; in addition, it will discount non-open-access publications when assessing potential grant awards (Wellcome Trust, 2012).
As more scientific research moves into the public domain, and away from the need to appear in designated “high-impact” journals, the pressure to massage data or falsify findings should decrease.
Ivan Oransky believes that this move towards openness will also help raise the public’s trust in science. He is quoted in Nature as saying “What scientists should be doing is saying ‘In the course of what we do are errors, and among us are also people that commit misconduct or fraud. Look at how small that number is! And here is what we are doing to root it out’.” (Noorden, 2011)
Science is losing public trust and needs to work harder to regain and maintain it:
Science is a continuously evolving discipline. Each breakthrough builds upon the work of previous generations, and occasionally new knowledge forces existing theories to be reassessed. When Einstein developed the theory of relativity it shifted the prevailing view of Newton’s theory of gravitation, not quite supplanting it but allowing scientists to view the universe in a new way.
While this may be common knowledge within the sciences, the public is often unaware of it. Jane Maienschein, in her paper Innocent Reflections on Science and Technology Policy, considered the plight of politicians: “They are neither stupid nor ignorant, but they do not understand the statistical, evolutionary, or community nature of much of the scientific process. They typically believe that when we know something, it should stay known and not give way to apparently contradictory results.” (Maienschein, 2002)
The public trusts science. It is scientists they are unsure about. In a Eurobarometer report on Science and Technology, 58% of respondents felt that “scientists cannot be trusted to tell the truth about controversial scientific and technological issues because they depend more and more on money from industry.” (Directorate-General, 2010)
This mistrust has its roots in the fallout from scientific and medical scandals. Over the last century, many scientific breakthroughs have brought less savoury side effects with them. From nuclear energy to GM foods, the shadow of Frankenstein’s monster looms large in the public’s imagination.
It is the responsibility of people working in science to communicate the risks and rewards of new scientific knowledge to the public. Unfortunately, in the same Eurobarometer report, a majority of European citizens felt that scientists did not put enough effort into informing the public about new developments in science and technology (Directorate-General, 2010).
This lack of ongoing and open communication is systemic. As Roland Jackson noted in his rebuttal to Durodié’s paper on the limitations of public dialogue in science and the rise of the new ‘experts’, while there is increasing acknowledgement that two-way communication is needed, there is little endorsement of it in reality (Jackson, et al., 2005).
Without dialogue, fear of change, or of the unknown, flourishes. Mark Lynas, a well-known environmental activist, spoke of his own early fears regarding GM in a recent lecture to the Oxford Farming Conference: “Mixing genes between species seemed to be about as unnatural as you can get – here was humankind acquiring too much technological power; something was bound to go horribly wrong. These genes would spread like some kind of living pollution. It was the stuff of nightmares.” (Lynas, 2013)
The controversy surrounding GM foods stems from a fear that we, humanity, are overreaching ourselves. The term “playing God” is often used by anti-GM campaigners. These people care about the subject and are not stupid, but it can take years of research into the science behind GM food production to understand it fully.
The public is not served by scientists engaging in heated debates behind closed doors while presenting a consensus to the public. Inevitably the truth will come out. As Daniel Sarewitz put it, “a claim of scientific consensus creates a public expectation of infallibility that, if undermined, can erode public confidence; and when expert consensus changes, as it has on health issues from the safety of hormone replacement therapy to nutritional standards, public trust in expert advice is also undermined.” (Sarewitz, 2011)
This can be seen in the BSE crisis in the UK and, to a lesser extent, in the L'Aquila earthquake trial last year.
The scientists investigating the BSE crisis in the UK made markedly different statements in private from those made to the British public. A scientific advisor was quoted in private as saying “It would not be justified to state categorically that there was no risk to humans”, while at the same time the minister for MAFF (the Ministry of Agriculture, Fisheries and Food) was telling the public there was “... clear scientific evidence that British beef is perfectly safe” (Millstone, et al., 2006).
When the scandal eventually broke, it brought down MAFF as a department and, in the process, degraded the public’s faith in government and in the scientists associated with it. People felt they had been patronised by institutions that should have been open and frank about the risks (Millstone & Zwanenberg, 2000).
This failure to be honest about risk is not limited to the UK. In October 2012 six Italian scientists were jailed for failing to adequately warn of an earthquake in L'Aquila, Italy. The judge recently made his reasoning public: the six were not jailed for failing to predict an earthquake, as had been popularly reported, but for their failure to properly analyse, and explain, the threat posed by the swarm of tremors that preceded the main earthquake. “The deficient risk analysis was not limited to the omission of a single factor, but to the underestimation of many risk indicators and the correlations between those indicators.” (Billi, 2013)
It may be that Judge Marco Billi is incorrect in his finding; the scientists are appealing the verdict. What lies at the heart of the case, however, as with the BSE scandal, is the perceived disregard for proper dialogue with the affected public. People do not necessarily need their fears soothed by platitudes. Science and its communicators should respect the public enough to be open and honest about the upsides and downsides of modern science.
As Sarewitz argues, “Unlike a pallid consensus, a vigorous disagreement between experts would provide decision-makers with well-reasoned alternatives that inform and enrich discussions as a controversy evolves, keeping ideas in play and options open.” (Sarewitz, 2011)
In the 21st century it is no longer practicable, or indeed possible, to hide knowledge behind expensive pay-walls, as the old media industries of film and music have found to their cost. Information wants to be free.
With pressure from funders such as the Wellcome Trust, and from high-profile writers like Ben Goldacre, open access to data will happen. It is up to people working in the field to make it work for science. The current peer-review system is proving ineffective in the information age, and its flaws (bias, delays and an inability to uncover misconduct) are being highlighted with increasing speed (Benos, et al., 2006).
Misconduct will always exist in science. However, by opening up access, and reducing the power of the “high-impact” journals to decide careers, the pressure to fake data in order to score points on the tenure ladder may decrease.
It is by opening up science to the public for review that we will get a chance to regain the trust lost in previous decades. In reviewing the case studies cited above, I noticed that the real issue was not that there was risk inherent in science, but that scientists did not take the time to adequately explain those risks; instead they offered simple platitudes to dampen panic.
Trust is lost when it is not reciprocal. It is no longer feasible for scientists to communicate using solely the deficit model. Communication also involves listening. It is only by both sides being given a chance to explain their points of view that a genuine consensus can be reached.
Massimiano Bucchi has said “communication is not simply a technical tool functioning within a certain ideology of science and its role in economic development and social progress, but has to be recognised as one of the key dynamics at the core of those co-evolutionary processes, redeﬁning the meanings of science and the public, knowledge and citizenship, expertise and democracy.” (Bucchi, 2008)
The issues surrounding integrity in science and public trust are the result of the same underlying problem. Scientists should be able to properly discuss issues surrounding academic research without fear of losing their positions or funding. Likewise, scientists speaking on matters of public importance should not be afraid to be honest about the risks and benefits of any new or existing technology.
It is only by encouraging open dialogue between scientists, journals, and the public that we can begin to solve these problems.
Benos, D. J., Bashari, E., Chaves, J. M., Gaggar, A., Kapoor, N., LaFrance, M., et al. (2006). The ups and downs of peer review. Advances in Physiology Education, 145-152.
Billi, J. M. (2013). L'Aquila Earthquake Trial.
Bromley, D. A. (2002). Science, Technology, and Politics. Technology in Society, 9-26.
Bucchi, M. (2008). Of deficits, deviations and dialogues. In Handbook of Public Communication of Science and Technology (pp. 57-76). Abingdon: Routledge.
Cox, B., & Ince, R. (2012, 12 18). Politicians must not elevate mere opinion over science. Retrieved 12 28, 2012, from The New Statesman: http://www.newstatesman.com/sci-tech/sci-tech/2012/12/brian-cox-and-robin-ince-politicians-must-not-elevate-mere-opinion-over-sc
Directorate-General, R. (2010). "Science and Technology" - Special EUROBAROMETER 340. European Commission.
Goldacre, D. B. (2011, 09 16). Academic papers are hidden from the public. Here’s some direct action. Retrieved 01 12, 2013, from http://www.badscience.net/: http://www.badscience.net/2011/09/academic-papers-are-hidden-from-the-public-heres-some-direct-action/
Goldacre, D. B. (2012). Bad Pharma. London: Fourth Estate.
Harley, D., & Acord, S. K. (2011). Peer Review in Academic Promotion and Publishing: Its Meaning, Locus, and Future. Berkeley: Research and Occasional Papers Series, Center for Studies in Higher Education, UC Berkeley.
Hoppenhaus, K. (2012, 3 6). My Interview with Ivan Oransky at #scio12 - The Transcript. Retrieved 01 02, 2013, from Digitalgrip.fieldnotes: http://field-notes.digitalgrip.de/2012/03/06/my-interview-with-ivan-oransky-at-scio12-the-transcript/
Jackson, R., Barbagallo, F., & Haste, H. (2005). Strengths of public dialogue on science-related issues. Critical Review of International Social and Political Philosophy, 8(3), 349-358.
Lynas, M. (2013). Lecture to Oxford Farming Conference. Oxford.
MacKenzie, D. (2012, September 19). Study linking GM crops and cancer questioned. Retrieved December 29, 2012, from New Scientist: http://www.newscientist.com/article/dn22287-study-linking-gm-crops-and-cancer-questioned.html
Maher, B. (2010). Sabotage. Nature, 516-518.
Maienschein, J. (2002). Innocent Reflections on Science and Technology Policy. Technology in Society, 133-143.
Millstone, E., & Zwanenberg, P. v. (2000). A crisis of trust: for science, scientists or for institutions? Nature Medicine, 6, 1307-1308.
Millstone, E., Zwanenberg, P. v., Alvensleben, R. v., Dressel, K., Giglioli, P. P., Koivusalo, M., et al. (2006). Evolution and implications of public risk communication strategies on BSE. World Health Organisation.
Noorden, R. V. (2011). The Trouble with Retractions. Nature, 26-28.
Redfield, R. (2010, 12 04). Arsenic-associated bacteria (NASA's claims). Retrieved 12 30, 2012, from RRResearch: http://rrresearch.fieldofscience.com/2010/12/arsenic-associated-bacteria-nasas.html
Sarewitz, D. (2011). The voice of science: let's agree to disagree. Nature, 478, 7.
Séralini, G.-E., et al. (2012). Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize. Food and Chemical Toxicology, 4221-4231.
Wellcome Trust. (2012). Wellcome Trust. Retrieved 1 2013, from http://www.wellcome.ac.uk: http://www.wellcome.ac.uk/About-us/Policy/Spotlight-issues/Open-access/Policy/index.htm
Wolfe-Simon, F. (2011). A Bacterium That Can Grow by Using Arsenic Instead of Phosphorus. Science, 1163-1166.
Zimmer, C. (2010, 12 7). "This Paper Should Not Have Been Published". Retrieved 12 31, 2012, from Slate: http://www.slate.com/articles/health_and_science/science/2010/12/this_paper_should_not_have_been_published.2.html