Gee – I sure wish we had one of those doomsday machines – General Buck Turgidson
Previously, on 24, I offered up the back story of our new paper detailing a cryoconite metagenome. Writing it brought out the inarticulate cod philosopher in me, with a tenuous analogy about “next-gen” sequencing and whether or not it is a disruptive technology. Rather than cut a short story long, and sour the rare note of optimism about seizing opportunities, I thought a separate post was in order.
The emergence of “Next Gen” DNA sequencing has, like the development of a new weapon, say The Bomb, proven truly explosive.
It has also proliferated widely, perhaps into hands which shouldn’t be trusted with it (like mine), and has sparked an arms race, if not a revolution.
The notion that it has forced regime change in genomics is frequently encountered. A graph of cost per nucleotide versus time is a ubiquitous PowerPoint offering, to the point of cliché.
This regime change has often been referred to as a democratization of genomics. It is a powerful and attractive notion: taking the power to sequence out of the industrial sequencing centre and putting it into the hands of the “ordinary” researcher. Big science for the little guy; something like a physicist finding a Large Hadron Collider in her basement ten years from now.
Has that really happened though?
At the level of many individual researchers, buying into a “next-gen” experiment is not cheap. Monetarily, it was certainly beyond my means as a brand-new independent researcher at the time of sequencing the cryoconite metagenome discussed earlier. I am grateful for colleagues willing to offer that opportunity rather than re-sequence standards. Otherwise, early steps with “Next Gen” can be tentative and fraught with failed runs. Within the scope of the seedcorn and pump-priming funds one might wish to use for such high-risk exploration of a new methodology, it is very easy to burn your budget on a single failed experiment. If, however, one is analysing samples from deep polar fieldwork, the “Next Gen” bit of an experimental framework can be the cheapest part, especially if you have collected those samples using helicopters to access your field sites (£30 a minute and up). It becomes a matter of scale and perspective.
At the level of an individual institute, it can be a risky process too. As the fates and fortunes of various platforms play out, buying into a particular platform is a brave step. After an initial hurrah, and a much longer period spent optimizing it to yield useful data, our first “next-gen” sequencer now lies neglected, all but obsolete within two years of its purchase (EDIT: actually, it is now obsolete). I believe the number of its paper outputs to date can be counted on the fingers of one hand. Despite my better judgement, I remain partial to the Ion Torrent PGM, even when MiSeq, Proton, and HiSeq are also in house. I suspect the cost per base of sequencing matters less than the cost per experiment for a microbial community study: everything is extrapolative (sequencing depth – rant for another time?). A dinky #314v2 chip will do me (and my budget) fine, thanks. Yet I have to admit the Downfall of Ion Torrent video had me laughing and wincing in equal measure.
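The cost-per-base versus cost-per-experiment distinction can be made concrete with a back-of-the-envelope calculation. All figures below are hypothetical round numbers chosen purely for illustration, not real run prices or yields:

```python
# Hypothetical per-run economics (illustrative figures only, not real prices).
# The point: a high-output run wins on cost per base, but a budget-limited
# community study pays per experiment, where a small chip wins.

def cost_per_base(run_cost_gbp, yield_bp):
    """Cost in GBP per sequenced base for a single run."""
    return run_cost_gbp / yield_bp

# A small 314-class chip run vs a high-output run (both sets of numbers invented).
small_chip = cost_per_base(run_cost_gbp=500, yield_bp=50e6)      # 1e-5 GBP/bp
big_run = cost_per_base(run_cost_gbp=10_000, yield_bp=15e9)      # ~6.7e-7 GBP/bp

# The big run is over an order of magnitude cheaper per base...
assert big_run < small_chip / 10
# ...but twenty times dearer per experiment, which is what the budget actually feels.
```

If the question a community profile asks is answerable at modest depth, the dinky chip's worse cost per base is simply irrelevant.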
Of course, mileage may vary. A few minutes on Web of Science tracking the rate of publication and citation of some prominent (predominantly Stateside) microbial ecologists reveals a sharp inflexion upwards round about the time their pyrosequencers lost the new-car smell. Those able to surf the wave of “Next Gen” rather than drown under it have done well.
I would argue that the regime change offered by “next-gen” sequencing is not so much democratizing as destabilizing. I suspect Professor Mark Pallen briefly offers a related argument within this excellent talk on the application of metagenomics to a field (clinical microbiology) where 19th-century technology is still the norm. Is there a risk of a gulf between those who can and the rest? How accessible is the technology to the real next gen: students, early career researchers and the like? How can standards be agreed and verified while progress evolves so rapidly? And what of the hysteresis between the months taken to publish and the days taken to conduct expensive experiments, which may consequently be overtaken by competitors?
Even if the gold rush has passed in the eyes of some, the yield and scale of the data churned out have transformed the way we could aspire to do microbial ecology in a few years. Can this focus on cracking out loads of data come at the expense of insight and impact? A triumph of technology over science? Within a decade or so, a microbial genome has been devalued from meriting fanfare in a Nature paper to a brief, non-peer-reviewed obituary in Genome Announcements.
To build on an aside from Pallen’s presentation again: if Hall’s predictions of rising cost-per-base are borne out, is there a Fahrenheit-454, measured in units of bp/$, at which it becomes more economic to transform the reams of sequence data into knowledge (and hence biology) by means of further experimentation and analysis, rather than simply Short Read Archive it and move on to the next sequencing project?
It might even be arguable that it is at that Fahrenheit-454 that “Next-Gen” would cease to be a disruptive technology, and perhaps enable us to look for truths in biology again.
Maybe the fact that the regime change has yet to reach Fahrenheit-454 is reflected in the continued currency of the terminology “next gen” sequencing. Nearly a decade has passed since the first description of 454 sequencing. I have purposefully used quotation marks in this post to highlight this uneasy, Luddite-tinged limbo. “Now gen” sequencing still doesn’t seem right, even if 454 itself is about to become ex-gen, but it could reflect the reality better – for some. Perhaps “high throughput” dodges the bullet, and that is what I prefer to use, without quotation marks.
Argh. Call it what you like. The bottom line is that, like proliferated nukes, the genie is not going back into the bottle. “Next-gen” in its many forms is here to stay. Yeehah.