Copy of Copy of Research Integrity


David Koepsell

on 20 May 2016


How Good Researchers Go Bad, and how to avoid it
David Koepsell
V&T, Phil., TBM Faculty
Delft University of Technology
http://davidkoepsell.com

Ethics as a field has developed numerous approaches to the question of what constitutes “the good.”

In ethical theory, three major approaches have evolved: “virtue ethics,” “deontology,” and “consequentialism.”

These three theories have informed modern “applied” ethics through codes of conduct and even laws.

Half lecture (with Socratic interludes):
Brief discussion of ethical failures in research
Introduction to the “ethos” of science
Application of the ethos through specific duties of researchers

Half workshop regarding case studies:
Review case studies in groups
Discussion of each in the context of the lecture

According to Robert Merton, science is an inherently democratic institution, and embodies the following values: Universalism, Communalism, Disinterestedness, and Organized Skepticism.

Universalism respects that the objects of science are the same everywhere. There is no special science of any one place versus other nations, nor of one race over another. Science seeks to uncover universal truths.

Communalism: science is a necessarily communal activity. Even while research programs sometimes pursue their research behind closed doors, to enter into the scientific realm, results and methods must eventually be shared, and thus become subject to testing.

Disinterestedness: scientists must remain disinterested in the results of their study, lest they become emotionally vested in such a way that their reporting is skewed, either intentionally or unintentionally.

Organized Skepticism: scientists must constantly challenge and doubt the finality of their searches for the truths underlying nature, and be prepared to overturn cherished and accepted beliefs about past studies.

But the main principles underlying good, ethical behavior in all of the sciences are rooted in what Robert Merton calls the ethos of science.
These principles may also be described in various ethical theories…

Science is a socially-constructed institution that has no governing body, no central oversight, and no real codified or legal rules that apply to all scientists (although some have recently attempted to promote such a universal code, it has been adopted only sporadically). While other fields have developed their own professional organizations, accrediting bodies, and other means of providing some direction to their “members,” science as a whole remains more or less self-governing, and scientists, who conduct their research generally due to the largess of states and their citizens, may yet be guided by the ethos of science itself.

A few broad areas in which scientific integrity may be derived from the ethos of science:
Research (mis)conduct
Authorship
Conflicts of Interest
IP and Data
Duties to Society

Research (mis)conduct: researchers should be forthright about the bases for their “own” work, where it depends (as most research does, as even Newton acknowledged long ago) upon the work of others. Don’t: misrepresent data, fail to properly acknowledge sources or methods, etc. Indeed, any actions that would tend to undermine the confidence an objective peer would have in the research or findings of another, were those actions known, ought to be avoided merely as a practical matter.

Why? What implications for science?

If, as seems likely, Ptolemy merely borrowed some likely reliable observations from another, attribution would have been the appropriate scientific and ethical course. Not only is it ethically wrong to claim the work of another as one’s own, but scientists must also leave a clear research trail for other researchers to check their work. Without such a trail, countless hours of work may be duplicated or wasted.

The pressures of academic tenure and promotion, the need to publish, to achieve quickly, and to achieve security may tend to impel researchers even now to cut corners and, in the worst cases, to commit fraud. But prominent cases of getting caught, as well as the impact such fraud has upon the ethos and perception of the sciences, ought to steer modern scientists away from such risks.

Good research conduct means being open about one’s sources and attributing correctly, leaving a clear trail, and being honest, cooperative, and careful. Being good is good for science.

Authorship: a frequent issue that arises in academia with ethical implications and impact regards authorship. Who counts as an author? All too frequently, authors are named on papers where no overt acts of authorship occurred. This may be because the named person was responsible for securing funding, or because of a sense of duty within a hierarchy to name the person who provides some oversight in a department, or for other understandable but insufficient reasons to grant authorship. A good measure, however, is asking whether the named person actually contributed to writing the paper, either through developing the ideas, doing critical research and analysis, or the nuts and bolts of stringing the words together in a meaningful way. Mere copyediting does not amount to authorship, nor does a discussion around a water cooler regarding the progress or scope of the research. Rather, ask whether, without the person to be named, the paper as conceived and written would be possible.
Some hand in both the conception and realization of a paper warrants a claim of authorship. All authors ought to be involved in the drafting process to some degree: combing through it, providing corrections, actually writing parts, but at the very least reviewing it.

Who gets the benefit? Who gets blamed? Responsibility: taking the benefit means accepting potential blame.

Conflicts of interest: where a researcher may be beholden to more than the virtues and aims of the ethos of science, in other words when he or she is not sufficiently “disinterested” in the outcome of the research, a conflict of interest may result. The nature of modern science inevitably raises more frequent potential conflicts as scientists and academics, forced to seek funding and support beyond their institutions, are often in the position of serving multiple masters. Conflicts of interest may not necessarily impede research, nor cause it to go astray or wrong, but they run the risk of undermining the confidence of the public or other members of the profession.

Consider the Gelsinger tragedy: Dr. James Wilson, a faculty member at the University of Pennsylvania, headed the Institute for Human Gene Therapy (a “private, public partnership” hosted by Penn) and founded Genovo, Inc.; both Wilson and Penn owned stock in Genovo.

Penn waived some of its own conflict-of-interest guidelines in 1995 when it gave exclusive rights to license patents from Wilson’s lab at Penn to Genovo and Genovo’s corporate sponsors, led by Biogen.

Penn also allowed Wilson to control up to 30 percent of Genovo’s stock, far more than the 5 percent generally allowed for professors under the university’s conflict-of-interest guidelines.

In the case of Jesse Gelsinger, the gene therapy trial was an ongoing two-year trial. He began the trial on his eighteenth birthday, the day he became eligible to participate. His liver was infused with trillions of genetically disabled cold viruses containing the corrected OTC genes. The viruses, which were to be the delivery vehicle for the needed gene, caused “an immune system revolt,” leading Gelsinger’s immune system to go into overdrive and begin attacking his lungs and vital organs. He died four days after the trial began.

An FDA investigation found these violations:
Failure to stop the study after some patients developed elevated liver enzymes, fever, and anemia.
Failure to follow the experiment plan in conducting monitoring medical tests on the patients.
Submitting “misleading and inaccurate statements” in an annual report on the experiments, including claims that there were no “significant” toxicities experienced by the patients.
Failure to warn patients enrolled for the tests that two monkeys used in the experiment had to be put to death after the animals experienced serious side effects.
Enrolling patients, including Gelsinger, who were not eligible for the study.

For whom was Wilson responsible? Too many masters? Transparency?

Remaining mindful of the potential for conflicts, however, and keeping the sources of money and the trail of overlapping or conflicting duties transparent, can help maintain confidence in results and professional integrity. Ultimately, researchers wishing to avoid the perils of conflicting interests need first to be aware that such conflicts can occur. Be mindful of how owing duties to various parties may affect the production, translation, or dissemination of research.

IP and data: ordinarily, intellectual property rights do not attach to data sets, since copyright and patent are meant to protect original expressions of ideas. Data accumulated through scientific study is observational. In its best instances, it reflects the world that exists, even while researchers might then go on to represent that data in new and original ways (as expressions). Researchers must be mindful of existing protections on original expressions, either in publications or in the form of patented methods or machines, and ensure that they do not violate any legal rights that might exist.

Openness vs. Secrecy: the use and misuse of data itself poses another sort of challenge. Aside from legal issues such as non-disclosure agreements, privacy concerns, and contractual restrictions that may arise in complex research scenarios, data itself must at some point be revealed in accord with the proper methods and ethos of science. Attempts to massage, obfuscate, or otherwise manipulate data are treated harshly if uncovered by the scientific community at large. Claims of rights and restrictions over the dissemination of data, or other attempts to prevent full access to and investigation of data sets, ultimately reflect poorly on those who choose less open paths in their research.
But being a good researcher also means respecting the need for some secrecy: not disclosing during the course of a research project, motivated by the perfectly reasonable desire to be the first to publish on a topic. Secrecy first, openness then.

Researchers who steal or destroy data in order to get ahead, to promote their own careers or destroy those of others, not only violate the most basic tenets of the ethos of science, but also behave unethically. Science is a meritocracy, and sometimes luck helps. But good research and discovery is rewarded by status in the profession. Theft, misappropriation, excessive secrecy, mistrust, or guile may result in temporary personal advances, but ultimately hinder our progress both materially and ethically. Data can be reasonably concealed or otherwise protected during the course of a research project but must be reported honestly at its conclusion. Without this sort of transparency, the scientific method itself is undermined.

Duties to society: increasingly, we are becoming aware that no area of science is an island unto itself. Even the most theoretical fields of research have some broader impact, if only because most basic research remains publicly funded, and the pot of public money available for such research is increasingly threatened. Our duties to others in general, as expressed by centuries of ethical theory, are typically recognized as including avoiding conscious harms, or at least minimizing them where harm is necessary for some other, more important reason.
Recent stories regarding the discussion and possible manipulation of climate data, by scientists with clearly expressed motivations to influence the public debate, have undermined public confidence in climate scientists. In so doing, these researchers, by failing to remain disinterested, have hurt their cause. If they are right about the nature of climate change, they have harmed much more than a cause. Scientists observing nature must seek to hone their models of the universe, its laws and phenomena, without regard for political or religious ends. In their roles as researchers, at least, the values expressed by Merton in delineating the ethos of science may serve as a useful guide for ethical conduct.

Ultimately, science is a social phenomenon, and scientists are part of a broader society. The proper conduct of scientific research in every one of its branches reflects upon its role in society, and affects its perception by the broader public. Ethical conduct can be complicated as duties and obligations multiply, and as motivations become cloudy or come into conflict. Remembering the ultimate role of science in society, and the manner by which it best proceeds, can help to sort out some if not all of the ethical issues that may arise, and will help to ensure the smooth progress not only of individual subfields of science, but of science in general as a means to achieving human progress.

Research ethics as an applied field has grown in part due to a number of contentious and public lapses in ethical judgment over the past hundred years.

Part 1:
Introduction to Applied Ethics: a brief history

Science didn't always proceed "ethically": values we now take for granted were, until very recently, almost absent from scientific pursuits.

Example: Edward Jenner

Noting the common observation that milkmaids were generally immune to smallpox, Jenner postulated that the pus in the blisters that milkmaids received from cowpox (a disease similar to smallpox, but much less virulent) protected them from smallpox. On 14 May 1796, Jenner tested his hypothesis by inoculating James Phipps, an eight-year-old boy (the son of Jenner's gardener), with pus scraped from the cowpox blisters on the hands of Sarah Nelmes, a milkmaid who had caught cowpox from a cow called Blossom, whose hide now hangs on the wall of the St George's medical school library (now in Tooting). Phipps was the 17th case described in Jenner's first paper on vaccination.

Jenner inoculated Phipps in both arms that day, subsequently producing in Phipps a fever and some uneasiness but no full-blown infection. Later, he injected Phipps with variolous material, the routine method of immunization at that time. No disease followed. The boy was later challenged with variolous material and again showed no sign of infection. (Hence "vaccination": vacca = cow in Latin.)

In 1979, the World Health Organization declared smallpox officially eradicated from nature.

Jenner's study would be considered deeply unethical today.

Example: Karl Brandt and the "Doctors' Trial" at Nuremberg

Twenty of the 23 defendants were medical doctors (Viktor Brack, Rudolf Brandt, and Wolfram Sievers being Nazi officials instead) and were accused of having been involved in Nazi human experimentation and mass murder under the guise of euthanasia. Josef Mengele, one of the leading Nazi doctors, had evaded capture. On August 19, 1947, the judges delivered their verdict in the "Doctors' Trial" against Karl Brandt and several others. They also delivered their opinion on medical experimentation on human beings. Several of the accused had argued that their experiments differed little from pre-war ones and that there was no law that differentiated between legal and illegal experiments.

1. Voluntary consent
2. Experiment must be well-designed, not random, and likely to produce beneficial results
3. Based upon animal experiments
4. Avoid all unnecessary physical and mental injury
5. Don't experiment if reason to believe serious injury or death will result
6. Risk should not exceed humanitarian importance
7. Adequate measures of safety and risk reduction
8. Qualified personnel and the highest degree of care
9. Subjects must always be able to withdraw
10. The experimenter must be willing to terminate the experiment if he or she perceives likely injury, disability, or death to the experimental subject

Its progeny...

The Declaration of Helsinki is a set of ethical principles regarding human experimentation developed for the medical community by the World Medical Association (WMA). It is widely regarded as the cornerstone document of human research ethics (WMA 2000, Bošnjak 2001, Tyebkhan 2003).

The Common Rule is a rule of ethics governing biomedical and behavioral research involving human subjects in the United States. These regulations governing Institutional Review Boards for oversight of human research came into effect in 1981, following the 1975 revision of the Declaration of Helsinki, and are encapsulated in the 1991 revision to the U.S. Department of Health and Human Services Title 45 CFR 46 (Public Welfare), Subparts A, B, C, and D. Subpart A ("The Common Rule") is the baseline standard of ethics to which any government-funded research in the US is held, and nearly all academic institutions hold their researchers to these statements of rights regardless of funding.
1) Respect for persons: protecting the autonomy of all people, treating them with courtesy and respect, and allowing for informed consent. Researchers must be truthful and conduct no deception.

2) Beneficence: the philosophy of "do no harm," while maximizing benefits for the research project and minimizing risks to the research subjects.

3) Justice: ensuring that reasonable, non-exploitative, and well-considered procedures are administered fairly and equally, with a fair distribution of costs and benefits to potential research participants.

Radiation Experiments
Chemical Warfare/ Prison dioxin study
Stanley Milgram's experiments
Tuskegee Study - sparks the "common rule," medical ethics, and other applied ethics

Part 2: The Ethos of Science

Part 3: Some Important TU Delft Ethics Resources:

Scientific Integrity Committee
Human Research Ethics Committee
Ethics Code