According to NASA, the New Horizons spacecraft made its closest approach to Pluto (about 7,800 miles) at 7:49 AM ET this morning, 14 July 2015, after traveling three billion miles. If you want a travel post, that certainly fits the bill.
I’ve always thought the Reproducibility Project represented an incredibly naive approach to the scientific method. This excellent news piece in Science sums up many of the reasons why. As Richard Young says in the piece, “I am a huge fan of reproducibility. But this mechanism is not the way to test it.” Here’s why:
1) Reproducibility in science is not achieved by having a generic contract research organization replicate a canned protocol, for good reason: cutting-edge experiments are often very difficult and require specialized skills to get running. Replication is instead achieved by other labs in the field that want to build on the results. Sometimes this is done using the same protocol as the original experiment, and sometimes by obtaining similar results in a different system using a different method.
2) For this reason, I don’t have much confidence that the results obtained by the Reproducibility Project will accurately reflect the state of reproducibility in science. A negative result could mean many things — and most likely it will reflect a failure of the contract lab and not an inherent problem with the result. Contrary to the claims of the project’s leaders, the data produced by the Project will probably not be useful to people who are serious about estimating the scope of irreproducibility in science. At its worst, it could be extremely misleading by painting an overly negative picture of the state of science. It’s already been damaging by promoting a too-naive view of how the process of successful science actually works.
3) As the Science piece points out, there is a much better, cheaper, and scientifically sensible way to achieve better reproducibility. If many papers out there are suspect because they lack proper controls, don’t use validated reagents, fail to describe methods adequately, or rely on flawed statistics, then we don’t need to spend millions of dollars and thousands of hours of effort trying to repeat experiments. We need to make sure editors and reviewers require proper controls, reagents, statistics, and full methods descriptions.
It’s worth reading the full article, but below the fold are some salient quotes:
This week Science for the People is talking about do-it-yourself biology, and the community labs that are changing the biotech landscape from the grassroots up. We’ll discuss open-source genetics and biohacking spaces with Will Canine of Brooklyn lab Genspace, and Tito Jankowski, co-founder of Silicon Valley’s BioCurious. We’ll also talk to transdisciplinary artist and educator Heather Dewey-Hagborg about her art projects exploring our relationship with genetics and privacy.
*Josh provides research & social media help to Science for the People and is, therefore, completely biased.
“Translating the genetic code is the nexus connecting pre-biotic chemistry to biology.” — Dr. Charles Carter
Last week we discussed the general question of how the genetic code evolved, and noted that the idea of the code as merely a frozen accident — an almost completely arbitrary key/value pairing of codons and amino acids — is not consistent with the evidence that has been amassed over the past three decades. Instead, there are deeper patterns in the code that go beyond the obvious redundancy of synonymous codons. These patterns give us important clues about the evolutionary steps that led to the genetic code that was present in the last universal common ancestor of all present-day life.
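To make the distinction concrete, here is a minimal Python sketch (my own illustration, not from the article). The codon assignments are a subset of the standard genetic code; the hydrophobicity pattern shown — every codon with U in the second position encodes a hydrophobic amino acid — is one well-documented example of the kind of non-arbitrary structure discussed above.

```python
# A subset of the standard RNA codon table (codon -> amino acid).
CODON_TABLE = {
    "UUU": "Phe", "UUC": "Phe",
    "UUA": "Leu", "UUG": "Leu",
    "CUU": "Leu", "CUC": "Leu", "CUA": "Leu", "CUG": "Leu",
    "AUU": "Ile", "AUC": "Ile", "AUA": "Ile",
    "AUG": "Met",
    "GUU": "Val", "GUC": "Val", "GUA": "Val", "GUG": "Val",
    "UCU": "Ser", "UCC": "Ser", "UCA": "Ser", "UCG": "Ser",
}

HYDROPHOBIC = {"Phe", "Leu", "Ile", "Met", "Val"}

# Redundancy: six synonymous codons all specify leucine.
leu_codons = sorted(c for c, aa in CODON_TABLE.items() if aa == "Leu")
print(len(leu_codons))  # 6

# A deeper pattern: in this subset, every codon with U at the second
# position encodes a hydrophobic amino acid -- the mapping is not a
# purely arbitrary key/value pairing.
second_u = {c: aa for c, aa in CODON_TABLE.items() if c[1] == "U"}
print(all(aa in HYDROPHOBIC for aa in second_u.values()))  # True
```

If the code were a pure frozen accident, shuffling the values of this dictionary should produce an equally plausible code; the second-position regularity is exactly the sort of evidence that it is not.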
Charles Carter and his colleague Richard Wolfenden at the University of North Carolina Chapel Hill recently authored two papers that suggest the genetic code evolved in two key stages, and that those two stages are reflected in two codes present in the acceptor stem and anti-codon of tRNAs.
In the first part of my interview with Dr. Carter, he reviewed some of the previous work in this field. In the present installment, he comments on the important results that came out of his two recent studies with Dr. Wolfenden. But before we continue with the interview, let’s review the main findings of the papers.
The key result is that there is a strong relationship between the nucleotide sequence of tRNAs, specifically in the acceptor stem and the anti-codon, and the physical properties of the amino acids with which those tRNAs are charged. In other words, tRNAs do more than merely code for the identity of amino acids. There is also a relationship between tRNA sequence and the physical role performed by the associated amino acids in folded protein structures. This suggests that, as Dr. Carter summarized it, “Our work shows that the close linkage between the physical properties of amino acids, the genetic code, and protein folding was likely essential from the beginning, long before large, sophisticated molecules arrived on the scene.” Perhaps it also suggests – this is my possibly unfounded speculation – that today’s genetic code was preceded by a more coarse-grained code that specified sets of amino acids according to their physical functions, rather than their specific identity.