Author Archives: Mike White

Sunday Science Poem: The Geometry of Love

Andrew Marvell’s ‘The Definition of Love’ (1681)

Why are 17th century poets like John Donne, George Herbert and Andrew Marvell called ‘metaphysical’ poets? You can trace the name back to John Dryden, who in an unabashedly sexist comment accused John Donne of “affect[ing] the metaphysics, not only in his satires, but in his amorous verses… perplex[ing] the minds of the fair sex with nice speculations of philosophy when he should engage their hearts, and entertain them with the softnesses of love.”

Well, the Metaphysical poets proved that you can in fact engage the heart with science.

Apocalypse 1901: Adam and Eve in the Empty World Asylum

M. P. Shiel’s The Purple Cloud (1901)

The Book of Revelation isn’t the only part of the Bible that inspires post-apocalyptic fiction — Genesis plays a big part too. The Bible’s story about the beginning of the world has become a popular way to think about the world’s end. Adam and Eve, a paradisiacal Eden, and humanity’s fall get transformed into a last couple, a post-apocalyptic haven, and the forbidden fruit of some unexplored territory or lethal knowledge. What could be called the very first post-apocalyptic novel, Jean-Baptiste Cousin de Grainville’s Le Dernier Homme, was explicitly written as a bookend to Genesis. Nathaniel Hawthorne later wrote a replay of Genesis that takes place within the empty remnants of civilization. M. P. Shiel’s The Purple Cloud, an overwritten but under-read classic, is also a post-apocalyptic Adam and Eve story: the fall of civilization is brought about by a reach for the unexplored North Pole, and a last couple must consider the moral dilemma of repopulating an empty world.

The Purple Cloud is the first post-apocalyptic novel of the 20th century, but it starts with a throwback, by putting the whole thing within a mystic frame story of the sort employed much earlier by de Grainville and Mary Shelley. Most of the novel consists of the first-person record of the last man as he wrote it down in his notebooks; to get those notebooks in the hands of 20th century readers, Shiel has them dictated by a medium to her physician, who then passes the manuscript on to M. P. Shiel. Finding a plausible explanation for how a future story comes into the hands of present-day readers was a particular concern of 19th century SF writers, but would soon be largely abandoned.

Would you outsource your gel to a gel-informatician?

Sean Eddy explains why sequencing is replacing many older assays, and why biologists need to learn to analyze their own data.

“High throughput sequencing for neuroscience”:

If we were talking about a well-defined resource like a genome sequence, where the problem is an engineering problem, I’m fine with outsourcing or building skilled teams of bioinformaticians. But if you’re a biologist pursuing a hypothesis-driven biological problem, and you’re using a sequencing-based assay to ask part of your question, generically expecting a bioinformatician in your sequencing core to analyze your data is like handing all your gels over to some guy in the basement who uses a ruler and a lightbox really well.

Data analysis is not generic. To analyze data from a biological assay, you have to understand the question you’re asking, you have to understand the assay itself, and you have to have enough intuition to anticipate problems, recognize interesting anomalies, and design appropriate controls. If we were talking about gels, this would be obvious. You don’t analyze Northerns the same way you analyze Westerns, and you wouldn’t hand both your Westerns and your Northerns over to the generic gel-analyzing person with her ruler in the basement. But somehow this is what many people seem to want to do with bioinformaticians and sequence data.

It is true that sequencing generates a lot of data, and it is currently true that the skills needed to do sequencing data analysis are specialized and in short supply. What I want to tell you, though, is that those data analysis skills are easily acquired by biologists, that they must be acquired by biologists, and that they will be. We need to rethink how we’re doing bioinformatics.

I would add this: it takes some time to learn, but in the end it’s not that hard, people. Students in chemistry and physics routinely learn the requisite skills. We need to educate biologists who expect to do programming, math, and statistics.
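To make that concrete, here’s a minimal sketch (in Python, with pandas and numpy) of the kind of first-pass sanity check a biologist can run on their own sequencing count data. The file name and column layout are hypothetical, and none of this comes from Eddy’s post; it’s just one illustration of what “analyzing your own gel” looks like when the gel is a counts table.

```python
import numpy as np
import pandas as pd

# Hypothetical input: a tab-separated table with one row per gene and
# one column per sample (e.g. two conditions, a few replicates each).
counts = pd.read_csv("counts.tsv", sep="\t", index_col=0)

# Library sizes: wildly unequal totals often flag a failed sample or lane.
lib_sizes = counts.sum(axis=0)
print(lib_sizes)

# Normalize to counts per million and log-transform, so a handful of very
# highly expressed genes doesn't dominate every downstream comparison.
cpm = counts / lib_sizes * 1e6
log_cpm = np.log2(cpm + 1)

# Replicates should correlate strongly with each other; a replicate that
# looks more like the other condition than like its siblings is exactly
# the kind of anomaly Eddy says you need intuition to catch.
print(log_cpm.corr(method="spearman").round(2))
```

The point isn’t the particular commands; it’s that every step encodes knowledge of the assay and the experimental design, which is why you can’t hand it off generically.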

How bad is the NIH budget really?

In the blowback to Francis Collins’ comments about budget cuts delaying an Ebola vaccine, there is a lot of confusion going around about just how much the NIH budget declined.

The worst offender is the usually very good Sarah Kliff at Vox.com, who writes:

The NIH’s budget rose rapidly during the early 2000s, growing from $17 billion in 2000 to a peak of $31 billion in 2010. This meant more money for everything…

Funding then began to decline in 2010 and has continued to fall slightly over the past four years (this was during a period when Obama was in the White House, Democrats controlled the Senate, and Republicans controlled the House). By 2013, funding was down to $29.3 billion. These figures do not account for inflation.

Inflation – there’s the rub. Because when you do account for inflation, you see that the NIH budget was in decline long before 2010 – in fact things started to go south after 2004, as the AAAS budget analysis shows.

And depending on how you make the inflation adjustment, things can look even worse – you hear claims of a 20% decline tossed around. To understand how this works, let’s look at the numbers themselves.
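Here’s a minimal sketch of the constant-dollar adjustment at issue, not the AAAS analysis itself. The nominal budgets are the figures quoted above (in billions of dollars); the price-index values are hypothetical placeholders, where a real analysis would plug in the Biomedical R&D Price Index (BRDPI) or the CPI for those years.

```python
nominal = {2000: 17.0, 2010: 31.0, 2013: 29.3}   # NIH budget, $ billions
index = {2000: 100.0, 2010: 128.0, 2013: 137.0}  # hypothetical deflator

BASE = 2013  # express everything in 2013 dollars

def constant_dollars(year):
    """Convert a nominal budget to BASE-year dollars."""
    return nominal[year] * index[BASE] / index[year]

for year in sorted(nominal):
    print(f"{year}: ${nominal[year]:.1f}B nominal = "
          f"${constant_dollars(year):.1f}B in {BASE} dollars")
```

The size of the “decline” depends entirely on which index you divide by: the BRDPI rises faster than the CPI because research costs inflate faster than consumer prices, and that choice is largely where claims like the 20% figure come from.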

Will the future run out of technology?

If you haven’t seen it, this opinionated, provocative, and forceful essay by Bruce Gibney at Founders Fund is a great read. Starting with the question of why venture capital returns have generally sucked over the past two decades, he delves into the issue of real vs. fake technology, why we’ve been too quick to be satisfied with incremental progress, and whether there is that much revolutionary technology left to invent.

“What happened to the future?”:

Have we reached the end of the line, a sort of technological end of history? Once every last retailer migrates onto the Internet, will that be it? Is the developed world really developed, full stop? Again, it may be helpful to revisit previous conceptions of the future to see if there are any areas where VC might yet profitably invest.

In 1958, Ford introduced the Nucleon, an atom-powered, El Camino-shaped concept car. From the perspective of the present, the Nucleon seems audacious to the point of idiocy, but consider that at the time the Nautilus, the first atomic submarine, had just been launched in 1954 (and that less than ten years after the first atomic bomb). The Nucleon was ambitious – and a marketing gimmick, to be sure – but it was not entirely out of the realm of reason. Ten years later, in 1968, Arthur C. Clarke predicted imminent commercial space travel and genuine (if erratic) artificial intelligences. “2001: A Space Odyssey” was fiction, of course, but again, its future didn’t seem implausible at the time; the Apollo program was ready to put Armstrong on the moon less than a decade after Gagarin, and computers were becoming commonplace just a few years after Kilby and Noyce dreamed up the integrated circuit. The future envisioned from the perspective of the 1960s was hard to get to, but not impossible, and people were willing to entertain the idea. We now laugh at the Nucleon and Pan Am to the moon while applauding underpowered hybrid cars and Easyjet, and that’s sad. The future that people in the 1960s hoped to see is still the future we’re waiting for today, half a century later. Instead of Captain Kirk and the USS Enterprise, we got the Priceline Negotiator and a cheap flight to Cabo.

There are major exceptions: as we’ve seen, computers and communication technologies advanced enormously (even if Windows 2000 is a far cry from Hal 9000) and the Internet has evolved into something far more powerful and pervasive than its architects had ever hoped for. But a lot of what seemed futuristic then remains futuristic now, in part because these technologies never received the sustained funding lavished on the electronics industries. Commercializing the technologies that have languished seems as good a place as any to start looking for ideas.