Scientific Potemkin Villages

Naomi Oreskes and Erik Conway come up with a cool name for a familiar strategy:

This strategy of creating a ‘scientific Potemkin village’ was applied to global warming too. During the period that we scrutinize in our book, the Marshall Institute didn’t create its own journal, but it did produce reports with the trappings of scientific argument — such as graphs, charts and references — that were not published in the independent peer-reviewed literature. At least one of these reports — read and taken seriously by the administration of former US president George H. W. Bush — misrepresented the science by presenting only part of the story. NASA climate modeller James Hansen and his team had demonstrated in the peer-reviewed literature that historic temperature records could be best explained by a combination of solar irradiance, volcanic dust, and anthropogenic greenhouse gases. The Marshall Institute report included only a single piece of Hansen’s graph, using the fragment to make it seem as if there was a poor link between carbon dioxide and climate warming, and to argue — against Hansen’s analysis — that the real culprit was the Sun.

And we can’t forget the pioneering efforts of creation scientists, who are masters of the scientific Potemkin village.

Ending the World for 60 Years: 1952 yet again

Post-Holocaust Noble Savages

I’ve read three 1952 post-apocalyptic novels for this series. One was The Long Loud Silence; the other two are so similar that they can be dealt with in a single post: Star Man’s Son, by Andre Norton, and Vault of the Ages, by Poul Anderson. Both of these books are basically fantasy/neo-barbarian novels set hundreds of years after the North American continent has been ravaged by nuclear war. Both feature late teenage boys defying their elders and seeking out the lost knowledge of the god-like-but-fallen pre-apocalyptic ancestors, ancestors who held so much knowledge but squandered it in a catastrophic nuclear war. Both feature climactic battles among various tribes, and finish with grand peace settlements (catalyzed by the boy heroes and accompanied by lengthy speeches) as humanity tries to recover the lost secrets of technology.
Continue reading “Ending the World for 60 Years: 1952 yet again”

The $60,000 Man

This is what your next doctor’s visit will sound like after you get your genome sequenced:

“Analysis of 2·6 million single nucleotide polymorphisms and 752 copy number variations showed increased genetic risk for myocardial infarction, type 2 diabetes, and some cancers. We discovered rare variants in three genes that are clinically associated with sudden cardiac death—TMEM43, DSP, and MYBPC3. A variant in LPA was consistent with a family history of coronary artery disease. The patient had a heterozygous null mutation in CYP2C19 suggesting probable clopidogrel resistance, several variants associated with a positive response to lipid-lowering therapy, and variants in CYP4F2 and VKORC1 that suggest he might have a low initial dosing requirement for warfarin. Many variants of uncertain importance were reported….”

It’s been obvious for some time that cost will soon be no obstacle to getting your genome sequenced as part of a routine clinical workup. What’s been less clear is just how useful that is going to be, and how physicians should go about incorporating a patient’s genome sequence into routine clinical decisions. (Check out a discussion of where costs are now here.)

We can argue about how to bring sequence data into the clinic, but perhaps the best way to get started is to just give it a try – which is exactly what a group of researchers at Harvard and Stanford have done. In The Lancet, they report their trial run of a first whole-genome clinical workup:

We assessed a patient with a family history of vascular disease and early sudden death. Clinical assessment included analysis of this patient’s full genome sequence, risk prediction for coronary artery disease, screening for causes of sudden cardiac death, and genetic counselling. Genetic analysis included the development of novel methods for the integration of whole genome and clinical risk. Disease and risk analysis focused on prediction of genetic risk of variants associated with mendelian disease, recognised drug responses, and pathogenicity for novel variants. We queried disease-specific mutation databases and pharmacogenomics databases to identify genes and mutations with known associations with disease and drug response. We estimated post-test probabilities of disease by applying likelihood ratios derived from integration of multiple common variants to age-appropriate and sex-appropriate pre-test probabilities. We also accounted for gene-environment interactions and conditionally dependent risks.
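That post-test probability calculation is just Bayes’ rule in odds form: convert the age- and sex-appropriate pre-test probability to odds, multiply by the likelihood ratio each variant contributes (assuming, as a simplification, that the variants are conditionally independent), and convert back. Here’s a minimal Python sketch of the idea – not the authors’ actual pipeline, and all the numbers are invented for illustration:

```python
def post_test_probability(pre_test_prob, likelihood_ratios):
    """Posterior odds = prior odds * product of per-variant likelihood ratios."""
    odds = pre_test_prob / (1.0 - pre_test_prob)  # probability -> odds
    for lr in likelihood_ratios:
        odds *= lr                                 # Bayes update, one variant at a time
    return odds / (1.0 + odds)                     # odds -> probability

# Hypothetical example: a 10% pre-test probability of coronary artery
# disease, updated by three common variants with likelihood ratios of
# 1.3, 0.9, and 1.5 (made-up values, not from the paper).
print(round(post_test_probability(0.10, [1.3, 0.9, 1.5]), 3))  # 0.163
```

Note that a variant with a likelihood ratio below 1 (like the 0.9 above) nudges the risk estimate down, which is part of why integrating many common variants can move a patient’s estimate in either direction from the population baseline.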

Continue reading “The $60,000 Man”

What a real scientific discussion looks like

John Timmer on some climate change back-and-forth.

After reviewing debates over two papers published in American Geophysical Union journals, he concludes:

These situations tell us a couple of valuable things about the current state of climate science. First of all, they make it obvious that papers that go against the consensus can still get published, even when they come from people who very notably fall outside the scientific community’s mainstream. And, in fact, the scientific community takes these things seriously—seriously enough to check the math and examine the data sources.
Continue reading “What a real scientific discussion looks like”

Texas finally does something right regarding education

Texas may be screwing over its public school children’s education (and, via the textbook market, our kids’ too), but at least it’s doing something right: it won’t permit a creationist institute to hand out graduate degrees in science education.

Continue reading “Texas finally does something right regarding education”