Category Archives: Ethics

Genetically Modified Food Panel – Science, Policy, and Passion….

While Washington state already has results on I-522, the initiative to label GM foods, we still wanted to review the GM food panel co-hosted by GPSS and FOSEP a few weeks ago. (See the video link if you missed it.)

We had a full house with diverse representation from various university departments, as well as people from the general public, reflecting the passion surrounding this issue.  Paraphrasing one of our panelists, Flavia: GM food and the issues surrounding it are entangled with some of our deepest values.  The panel's goal was to present a scientific discussion of genetic modification.



The DoD and the Genome

As a new member of FOSEP, I’ve been thinking about writing on this topic for a while, but felt nervous about getting my feet wet with the blog.  I’m sorry if this rambles a little.

I’m a graduate student in Public Health Genetics, and I am interested in the ways information about our genome can be used to improve health, and in how that information is understood (or not) by our society.  I am particularly interested in working with those who have decided to serve in the military (in some ways, we could talk about them as the other 1%, those who consistently serve; see http://www.pewsocialtrends.org/2011/11/23/the-military-civilian-gap-fewer-family-connections/).  So I became especially intrigued when colleagues pointed out a 2010 report on how, and whether, genetic databases could be used by the Department of Defense (DoD).  The report, The $100 Genome: Implications for the DoD (www.fas.org/irp/agency/dod/jason/hundred.pdf), was produced by the JASONs, a group of scientists that advises the DoD (see http://www.fas.org/irp/agency/dod/jason/ for unclassified studies), and it prompted a response in Nature Reviews Genetics (http://www.nature.com/nrg/journal/v12/n9/full/nrg3063.html).  The report focuses on how genome sequencing, once cost prohibitive, is reaching the point where it is more expensive to store and analyze the information than to obtain it.


Noninvasive prenatal genetic testing raises hopes for the future but also ethical concerns

A new study published in Science Translational Medicine describes a noninvasive technique for sequencing the entire genome of a human fetus. By analyzing cell-free DNA isolated from maternal blood plasma, a portion of which is of fetal origin, the authors were able to predict inherited variation in the fetal genome with 98% accuracy. While new noninvasive prenatal techniques have big implications for the future of prenatal testing and for morbidity and mortality rates, they also raise important ethical issues and concerns that will need to be addressed.

 

Mendelian diseases, genetic disorders caused by mutations within a single gene, are numerous (about 3,500 currently known) and individually uncommon in the general population, but collectively they account for ~20% of infant mortality and ~10% of pediatric hospitalizations each year. Current prenatal testing involves either amniocentesis or chorionic villus sampling, invasive procedures that carry a small but potentially serious risk to the pregnancy.  Thus, research has been underway to develop noninvasive techniques for prenatal genetic testing.

 

About 13% of the cell-free DNA in maternal blood plasma is of fetal origin; this fetal DNA increases during pregnancy and is rapidly cleared following birth. Building on techniques recently developed by other research groups, and using a blood sample from the mother and a saliva sample from the father, Kitzman et al. (2012) reported the full genome sequencing of a human fetus at 18.5 weeks of gestation. The authors combined the haplotype-resolved genome sequence of the mother, the shotgun genome sequence of the father, and deep sequencing of maternal cell-free DNA. They went on to replicate the results by sequencing the full genome of a human fetus at 8.2 weeks of gestation, with 95% accuracy. In both cases, cells collected from cord blood following delivery of the child were used to assess accuracy.
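To give a rough sense of how allele counts in maternal plasma can reveal which maternal haplotype the fetus inherited, here is a deliberately simplified, single-site sketch. This is not the authors' method (Kitzman et al. phase maternal haplotype blocks and aggregate evidence across many sites with statistical models); the function name and numbers below are illustrative only, and the 13% fetal fraction is the figure mentioned above.

```python
# Toy illustration of the underlying idea only, not Kitzman et al.'s pipeline.
# Consider one site where the mother is heterozygous, with allele "A" on her
# haplotype 1 and allele "B" on her haplotype 2, and the father is homozygous
# for "B". With a fetal fraction f of the cell-free DNA, the expected fraction
# of "A" reads in maternal plasma is:
#   0.5       if the fetus inherited maternal haplotype 1 (fetus is A/B)
#   0.5 - f/2 if the fetus inherited maternal haplotype 2 (fetus is B/B)

def predict_transmitted_haplotype(reads_a, reads_b, fetal_fraction=0.13):
    """Return which maternal haplotype (1 or 2) best explains the read counts."""
    total = reads_a + reads_b
    if total == 0:
        return None  # no informative reads at this site
    observed_a = reads_a / total
    expected_if_hap1 = 0.5                        # fetus also carries the "A" allele
    expected_if_hap2 = 0.5 - fetal_fraction / 2   # fetus carries only "B"
    # Choose the hypothesis whose expected allele balance is closer to the data.
    if abs(observed_a - expected_if_hap1) < abs(observed_a - expected_if_hap2):
        return 1
    return 2

# Example: 470 reads support "A" and 530 support "B" at this site.
print(predict_transmitted_haplotype(470, 530))  # observed 0.47 -> haplotype 1
```

In practice a single site is far too noisy for this to work on its own, which is why the study aggregates evidence over long maternal haplotype blocks rather than deciding site by site.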

 

Although the study is a major breakthrough in the field of prenatal genetic testing, the authors suggest several areas for improvement. First, there were a large number of sites for which they did not attempt prediction, and improvements in genome-wide haplotyping protocols are required. The authors state: “there remains a critical need for genome-wide haplotyping protocols that are at once robust, scalable, and comprehensive. Significant reductions in cost, along with standardization and automation, will be necessary for compatibility with large-scale clinical application.” Second, the authors suggest that improvements in deep sequencing techniques are required to better predict de novo single-nucleotide mutations (new mutations seen only in the fetus and thus representing potential new sources of disease). Finally, while the authors were able to predict single-nucleotide variants (the basis of many Mendelian diseases), “more robust methods for detecting other forms of variation, for example, insertion-deletions, copy number changes, repeat expansions, and structural rearrangements” will be required.

 

The noninvasive techniques presented by Kitzman et al. (2012) have huge implications for prenatal testing and could reach the clinic in the near future. While these new techniques could reduce the incidence of life-threatening genetic disorders and improve prenatal and neonatal treatment for many conditions, the authors warn that more research is required before introduction into clinical practice, in part because “although the noninvasive prediction of a fetal genome may be technically feasible, its interpretation—even for known Mendelian disorders—will remain a major challenge.” To get around some of the ethical issues raised by these new techniques, some researchers recommend sequencing only the parts of the genome known to be involved in genetic disorders; regardless, advances in prenatal testing will require “extensive discussions among all stakeholders (physicians, genetic counselors, patient advocacy groups, and the general public), much debate, and great care in implementation.”

AAAS Annual Conference: Saturday

Exploding Myths on Reactor Security, Harm Reduction, and Genetically Modified Organisms

Day 2 started off with a very interesting, though potentially one-sided, panel aiming to dispel common misconceptions about three major global science policy issues. The first talk was given by Dr. Roland Schenkel, a nuclear energy consultant from Germany. The main theme of his talk was that retaining nuclear energy is imperative to a successful global future: nuclear energy is sustainable, it addresses climate change, and it will secure the long-term energy supply and thus reduce global tensions over gas availability. Additionally, he claimed that a new generation of safer and more efficient reactors is ready for deployment and that solutions for waste disposal management are already available. Dr. Schenkel explained that although there are 30 countries with nuclear reactors, the regulations and policies involved vary substantially. He stressed that there needs to be pressure to establish a global regulatory framework for safety with internal benchmarks, and that all work pertaining to nuclear energy needs to be open and transparent in order to increase public acceptance and dispel common misconceptions.

The second talk was given by David O'Reilly, the Group Scientific Director for British American Tobacco. The main point of his presentation was that harm reduction in smoking can be achieved by producing and marketing safer sources of nicotine. He presented data showing that people who use only nicotine products such as snus or smokeless tobacco have a dramatically lower incidence of lung and other cancers. While I admit the data demonstrating that nicotine-only products are safer were convincing, the issue remains that people will still be ingesting nicotine, which still hijacks the brain's reward pathways and produces adverse effects such as cross-sensitization to other drugs of abuse.

Dr. Guy Van den Eede from the Joint Research Centre gave the final talk of the session, which dealt with genetically modified organisms (GMOs). His main thesis was that GMOs are a revolution in evolution: they are subject to evolution themselves and therefore should not be viewed as so foreign. He maintained that GMOs are an integral part of the future of the global food supply and that work needs to be done to increase public knowledge and dispel myths regarding GMOs. Overall, I do agree with the myths presented and dispelled, but I would also have liked to hear a similar panel presenting the opposing view on these controversial issues.

 

Carl Wieman, A Scientific Approach to Science Education

Dr. Carl Wieman is the Associate Director for Science in the Office of Science and Technology Policy within the Executive Office of the U.S. President, and he gave one of Saturday’s Topical Lectures. He described an alternative to the typical science teaching method of standing in front of a class and lecturing from a textbook. Many teachers, including himself as a young professor, believe that when a student does not understand a seemingly basic concept it is because the student is not trying hard enough, and thus the answer is to teach the same material again with the belief that eventually the student will get it. The new approach starts from how scientists think and applies that to the classroom setting: scientists are experts who have a mental organizational framework that they can access and apply, who can see complex relationships and patterns, and who have the ability to monitor their own thinking and learning process. The approach is, instead of lecturing to a classroom about scientific concepts already explained in the assigned textbook, to give the students challenging but doable tasks and questions with an explicit focus on expert-like thinking, feedback, and reflection.

An example of this new method is to first pre-assign the textbook reading of the material and give a subsequent online quiz. Next, because the reading has already taken place, the class becomes about problem solving and critical thinking. Each student has a hand-held answer remote, and the class begins with a multiple-choice question pertaining to the previously read material (data show that when students are required to commit to an answer they are accountable for, they are more invested in that answer and think more deeply). Before the answer to the question is shown, small groups of students debate the various multiple-choice answers, then the class as a whole discusses the different answers, and finally the answer is revealed and the problem is explained by the teacher.

Dr. Wieman gave numerous examples of improved class performance using this new teaching strategy. An example taken from an introductory UW biology class showed that not only did some of the students improve with this teaching method, but the whole bell curve shifted to the right (including low-performing and underrepresented or minority populations). Based on this convincing real-world data, one would think that teachers around the world would be adopting this new strategy, but in reality change has been hard to bring about. Dr. Wieman concluded his talk with speculation about why it has been so hard for teachers to give up the traditional methods, but also with hope that, with new efforts by organizations such as the AAU, APLU, professional societies, and the NAS, increasing numbers will begin to adopt these new strategies.

 

Beyond Evolution: Religious Questions in Science Classrooms

I was very excited to attend the session on religion and science, as I have on many occasions had to address these issues with friends and family. The talks focused on teaching evolution and climate change to religious students. Dr. Ken Miller from Brown University showed data demonstrating that as the level of education increases, even over the four years of college, acceptance of evolution increases, regardless of religious or political beliefs. Additionally, the data show that as education level increases, students stop believing that science and religion have to be in conflict and that either one or the other must be correct. Surprisingly, while acceptance of evolution is increasing, students in the sciences are also becoming more, not less, religious! Unfortunately, many students and much of the general public are shown anti-evolution propaganda and taught that science is evil and will destroy morality; examples included Rick Santorum’s stance on science and evolution and documentaries like Expelled, narrated by Ben Stein. Dr. Miller closed with a three-part strategy: one, teach more science; two, teach science as a process and not as a doctrine; and three, teach the interconnectedness of science and religion. His bottom line was that instead of adhering to the conflict model and pitting science against religion, we should be teaching more and better science.

Dr. Mark McCaffrey next discussed the issues of teaching climate change to religious students. His talk focused on the history of climate change research, which started in the early 1800s with contemporaries of Darwin. Studies done in the 1950s postulated that by the year 2000 temperatures would be increasing and melting of the polar ice cap would be observed. Dr. McCaffrey suggested that so many people deny that climate change is occurring because certain groups have made the environment and climate change into a new religion, again pitting science against religion and teaching an either/or belief. As far as solutions go, he echoed Dr. Miller: we need to teach more and better science in an attempt to dispel the now common notion that science is anti-religion and thus threatening to the sanctity of life.

 

Plenary Panel: Science Is Not Enough

I ended the day by attending the plenary panel, Science Is Not Enough. The panel was moderated by Dr. Frank Sesno and included Dr. James Hansen, Dr. Olivia Judson, and Dr. Hans Rosling. It addressed why, even in the face of overwhelming data and support for climate change within the scientific community, so much of the American public remains in disbelief, and it focused on the increasing need for outreach and communication by scientists. The panel was videotaped, so instead of describing it, I will provide a link to the video when it is made available to the public.

H5N1 – when do governments step in to limit scientific research?

Back in December of last year, news broke that two groups of scientists, one at the Erasmus Medical Center in Rotterdam, the Netherlands and one at the University of Wisconsin, Madison, had created a mutant form of the H5N1 avian flu virus that could potentially be contagious among humans (click here for one of the initial reports). Based on the reported cases of people infected by the avian flu virus, the mortality rate of the H5N1 virus is estimated to be 60%. This far exceeds the mortality rate of the ‘Spanish flu’ of 1918, which killed between 50 and 100 million people worldwide. The accuracy of the 60% figure has been debated by scientists, but even if the true mortality rate turns out to be only 10% of the current estimate, H5N1 would still be roughly three times as fatal as the Spanish flu.
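To make that comparison explicit (the post does not state a case-fatality figure for the 1918 pandemic; the commonly cited estimate of roughly 2% is assumed here):

$$0.60 \times 0.10 = 0.06, \qquad \frac{0.06}{0.02} = 3.$$

In other words, even a tenfold-lower estimate of H5N1’s case-fatality rate would still be about three times the 1918 figure.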

 

News coverage of the incident centered not only on the creation of this potentially dangerous strain of the avian flu virus but also on the intervention by the National Science Advisory Board for Biosecurity (NSABB), which asked the journals to put a temporary hold on the publication of the article, along with a moratorium on related research, until scientists and government officials could assess the risk of publishing the Erasmus group’s findings. This was a rare intervention by the NSABB; when a group from the Mount Sinai School of Medicine in New York published studies on the reconstruction of the Spanish flu virus back in 2005, there was no such response from the NSABB.

 

So the question still stands: how much of the study will be made public, if any at all?

 

A Nature news blog post, as well as many others, covered a recent debate held on Feb. 2 by the New York Academy of Sciences on the issue of how much of the paper should be made publicly available.  There were proponents on either side, some calling for complete retraction of the paper, others calling for complete publication of the details. The debate, in my opinion, boiled down to whether the details of the study would inherently harm or benefit society. On one side, it was argued that the details from the paper could be used by the evils of the world to create a deadly virus that could be released into the population, killing many, many people. On the other side, it was argued that mutations like those the scientists induced happen in nature, and that a contagious form of avian flu such as H5N1 could appear without human involvement; if we want to be prepared for such a time, we need as many researchers as possible studying H5N1 to better understand the virus. There is a lot of uncertainty and speculation surrounding the issue, and as this article states, there appears to be very little in the way of international guidelines for when such a situation crops up.

 

As a scientist, I have had it drilled into my head that we need to be transparent about all our methods and conclusions, for the sake of the integrity of science. If we’re asked for the data we used in a study, we make it available so that others can test whether they come to the same conclusions.

 

But what happens when the knowledge could be harmful to society? What does one do? When there is uncertainty about what the study results mean for our protection or our harm, who should be involved in the decision-making process?

 

I can imagine scientists will be necessary to interpret the results in increasingly complex situations, but who else?  There will be uncertainties about how the results will be used, so the decision makers (e.g., government advisers and journal publishers) will eventually have to decide on a course of action.  Are there others who need to be involved?

 

The NSABB’s statement on its decisions can be found here. And further discussions and opinions can be found on Nature’s news special online.

UK scientists and government collaborating to eliminate maternal genetic mutations through new reproductive techniques

Recently in Britain, talk has resumed about legislation covering currently banned advances in an in vitro fertilization (IVF) technique. The controversial procedures would save children from inheriting certain genetic diseases but would also result in a child with three genetic parents and the destruction of a fertilized egg. The new IVF techniques obviously raise ethical and legal concerns, but should Britain pass this legislation, it would be the first country in the world to test these procedures in humans.

Mitochondrial DNA (mtDNA) is inherited from the mother and is the source of numerous devastating neuromuscular and neurodegenerative diseases. Mutations in mtDNA passed on from mother to child are responsible for diseases such as muscular dystrophy, diabetes mellitus, deafness, and myoclonic epilepsy, and they affect around 1 in 5,000 people. Now Britain has initiated steps toward clinical trials investigating a breakthrough IVF technique that combines the nuclear DNA of the mother with mutation-free mtDNA from a donor egg.

The UK’s Human Fertilisation and Embryology Authority (HFEA) announced on January 19th a public dialogue regarding this emerging IVF technique, in order to gauge public opinion on its possible use in a clinical setting. The Secretary of State for Health, together with the Secretary of State for Business, Innovation and Skills, jointly asked the HFEA to form this task force, a necessary first step toward bringing this potentially life-saving technique to clinical trials. The public dialogue will begin later this year and will be guided and overseen by a panel of experts. Additionally, the biomedical charity the Wellcome Trust has promised funds for preclinical safety experiments, and the Nuffield Council on Bioethics has started an independent review. This seems to be a perfect case of scientists, government, and policy experts working together instead of just talking past or at each other.

There are two procedures currently under development: pronuclear transfer and maternal spindle transfer. In the first, an egg with mutated mtDNA is fertilized in vitro; the resulting pronucleus is then removed and transferred to a donor egg that has had its own pronucleus removed. In the second technique, the chromosomes (DNA) are taken from an unfertilized egg with mutated mtDNA and added to an unfertilized donor egg lacking a nucleus, and fertilization then occurs in vitro. Pronuclear transfer has been successfully performed on defective human eggs, and maternal spindle transfer has been used to produce two healthy rhesus monkeys. Additionally, the HFEA released a review in early 2011 finding the techniques not unsafe, although it did determine that numerous additional studies would be required prior to beginning clinical trials.

These proposed IVF techniques, which would produce children with three genetic parents, raise many important legal and ethical questions. In the US, federal funds cannot be used for research involving human embryos. Additionally, these procedures were banned by the British government in 2008 for safety, ethical, and research-related reasons. Importantly, though, legislation was also put in place providing a streamlined mechanism to legalize the techniques should scientific advances be made.  Now, based on recent technical advances, the British government has decided to take another look at the legislative ban. The chair of the HFEA, Lisa Jardine, said in a recent press release, “This is an issue of great importance to families affected by mitochondria disease and it is also one of enormous public interest. The decision about whether this research technique should be made available to treat patients is one for the Secretary of State and, ultimately, Parliament. We will work hard to stimulate a rich and varied public debate, to help him make an informed decision.”