Late Night, Unedited Musings on Synthesizing Secret Genomes

By now you have probably heard that a meeting took place this past week at Harvard to discuss large scale genome synthesis. The headline large genome to synthesize is, of course, that of humans. All 6 billion (duplex) bases, wrapped up in 23 pairs of chromosomes that display incredible architectural and functional complexity that we really don't understand very well just yet. So no one is going to be running off to the lab to crank out synthetic humans. That 6 billion bases, by the way, just for one genome, exceeds the total present global demand for synthetic DNA. This isn't happening tomorrow. In fact, synthesizing a human genome isn't going to happen for a long time.

But, if you believe the press coverage, nefarious scientists are planning to pull a Frankenstein and "fabricate" a human genome in secret. Oh, shit! Burn some late night oil! Burn some books! Wait, better — burn some scientists! Not so much, actually. There are several important points here. I'll take them in no particular order.

First, it's true, the meeting was held behind closed doors. It wasn't intended to be so, originally. The rationale given by the organizers for the change is that a manuscript on the topic is presently under review, and the editor of the journal considering the manuscript made it clear that the journal considers the entire topic to be under embargo until the paper is published. This put the organizers in a bit of a pickle. They decided the easiest way to comply with the editor's wishes (which were communicated to the authors well after the attendees had made travel plans) was to hold the meeting under rules even stricter than Chatham House until the paper is published. At that point, they plan to make a full record of the meeting available. It just isn't a big deal. If it sounds boring and stupid so far, it is. The word "secret" was only introduced into the conversation by a notable critic who, as best I can tell, perhaps misconstrued the language around the editor's requirement to respect the embargo. A requirement that is also boring and stupid. But, still, we are now stuck with "secret", and all the press and bloggers who weren't there are seeing Watergate headlines and fame. Still boring and stupid.

Next, it has been reported that there were no press at the meeting. However, I understand that there were several reporters present. It has also been suggested that the press present were muzzled. This is a ridiculous claim if you know anything about reporters. They've simply been asked to respect the embargo, which so far they are doing, just like they do with every other embargo. (Note to self, and to readers: do not piss off reporters. Do not accuse them of being simpletons or shills. Avoid this at all costs. All reporters are brilliant and write like Hemingway and/or Shakespeare and/or Oliver Morton / Helen Branswell / Philip Ball / Carl Zimmer / Erika Check Hayden. Especially that one over there. You know who I mean. Just sayin'.)

How do I know all this? You can take a guess, but my response is also covered by the embargo.

Moving on: I was invited to the meeting in question, but could not attend. I've checked the various associated correspondence, and there's nothing about keeping it "secret". In fact, the whole frickin' point of coupling the meeting to a serious, peer-reviewed paper on the topic was to open up the conversation with the public as broadly as possible. (How do you miss that unsubtle point, except by trying?) The paper was supposed to come out before, or, at the latest, at the same time as the meeting. Or, um, maybe just a little bit after? But, whoops. Surprise! Academic publishing can be slow and/or manipulated/politicized. Not that this happened here. Anyway, get over it. (Also: Editors! And, reviewers! And, how many times will I say "this is the last time!")

(Psst: an aside. Science should be open. Biology, in particular, should be done in the public view and should be discussed in the open. I've said and written this in public on many occasions. I won't bore you with the references. [Hint: right here.] But that doesn't mean that every conversation you have should be subject to review by the peanut gallery right now. Think of it like a marriage/domestic partnership. You are part of society; you have a role and a responsibility, especially if you have children. But that doesn't mean you publicize your pillow talk. That would be deeply foolish and would inevitably prevent you from having honest conversations with your spouse. You need privacy to work on your thinking and relationships. Science: same thing. Critics: fuck off back to that sewery rag in — wait, what was I saying about not pissing off reporters?)

Is this really a controversy? Or is it merely a controversy because somebody said it is? Plenty of people are weighing in who weren't there or, undoubtedly worse from their perspective, weren't invited and didn't know it was happening. So I wonder if this is more about drawing attention to those doing the shouting. That is probably unfair, this being an academic discussion, full of academics.

Secondly (am I just on secondly?), the supposed ethical issues. Despite what you may read, there is no rush. No human genome, nor any human chromosome, will be synthesized for some time to come. Make no mistake about how hard a technical challenge this is. While we have some success in hand at synthesizing yeast chromosomes, and while that project certainly serves as some sort of model for other genomes, the chromatin in multicellular organisms has proven more challenging to understand or build. Consequently, any near-term progress made in synthesizing human chromosomes is going to teach us a great deal about biology, about disease, and about what makes humans different from other animals. It is still going to take a long time. There isn't any real pressing ethical issue to be had here, yet. Building the ubermensch comes later. You can be sure, however, that any federally funded project to build the ubermensch will come with a ~2% set aside to pay for plenty of bioethics studies. And that's a good thing. It will happen.

There is, however, an ethical concern here that needs discussing. I care very deeply about getting this right, and about not screwing up the future of biology. As someone who has done multiple tours on bioethics projects in the U.S. and Europe, served as a scientific advisor to various other bioethics projects, and testified before the Presidential Commission for the Study of Bioethical Issues (whew!), I find that many of these conversations are more about the ethicists than the bio. Sure, we need to have public conversations about how we use biology as a technology. It is a very powerful technology. I wrote a book about that. If only we had such involved and thorough ethical conversations about other powerful technologies. Then we would have more conversations about stuff. We would converse and say things, all democratic-like, and it would feel good. And there would be stuff, always more stuff to discuss. We would say the same things about that new stuff. That would be awesome, that stuff, those words. <dreamy sigh> You can quote me on that. <another dreamy sigh>

But on to the technical issues. As I wrote last month, I estimate the global demand for synthetic DNA (sDNA) to be 4.8 billion bases worth of short oligos and ~1 billion bases worth of longer double-stranded DNA (dsDNA), for not quite 6 gigabases total. That, obviously, is the equivalent of a single human duplex genome. Most of that demand is from commercial projects that must return value within a few quarters, which biotech is now doing at eye-popping rates. Any synthetic human genome project is going to take many years, if not decades, and any commercial return is way, way off in the future. Even if the annual growth in commercial use of sDNA were 20% — which it isn't — this tells you, dear reader, that the commercial biotech use of synthetic DNA is never, ever, going to provide sufficient demand to scale up production to build many synthetic human genomes. Or possibly even a single human genome. The government might step in to provide a market to drive technology, just as it did for the human genome sequencing project, but my judgement is that the scale mismatch is so large as to be insurmountable. Even while sDNA is already a commodity, it has far more value in reprogramming crops and microbes with relatively small tweaks than it has in building synthetic human genomes. So if this story were only about existing use of biology as technology, you could go back to sleep.
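For the arithmetic-minded, here is that comparison as a quick sketch in Python. The demand figures are my rough estimates from above, not industry statistics.

```python
# Back-of-the-envelope comparison: estimated global synthetic-DNA demand
# vs. one duplex human genome. The demand figures are the rough estimates
# quoted above, not industry statistics.

OLIGO_DEMAND_BASES = 4.8e9   # short single-stranded oligos, bases (estimate)
DSDNA_DEMAND_BASES = 1.0e9   # longer double-stranded DNA, bases (estimate)
HUMAN_GENOME_BASES = 6.0e9   # one duplex human genome, ~6 Gb

total_demand = OLIGO_DEMAND_BASES + DSDNA_DEMAND_BASES
print(f"Estimated sDNA demand: {total_demand / 1e9:.1f} Gb")
print(f"Fraction of one duplex genome: {total_demand / HUMAN_GENOME_BASES:.0%}")
# -> not quite 6 Gb, i.e. a bit less than a single duplex human genome
```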

But there is a use of DNA that might change this story, which is why we should be paying attention, even at this late hour on a Friday night.

DNA is, by far, the most sophisticated and densest information storage medium humans have ever come across. DNA can be used to store orders of magnitude more bits per gram than anything else humans have come up with. Moreover, the internet is expanding so rapidly that our need to archive data will soon outstrip existing technologies. If we continue down our current path, in coming decades we would need not only exponentially more magnetic tape, disk drives, or flash memory, but exponentially more factories to produce these storage media, and exponentially more warehouses to store them. Even if this is technically feasible it is economically implausible. But biology can provide a solution. DNA exceeds by many times even the theoretical capacity of magnetic tape or solid state storage.
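To see why DNA wins on density, here is a first-principles sketch. The assumptions are mine, for illustration: roughly 2 bits per base, an average nucleotide mass of about 330 daltons, and zero allowance for error correction, redundancy, or packaging, so treat the result as a theoretical ceiling rather than an engineering number.

```python
# First-principles ceiling on DNA information density.
# Assumptions (mine, for illustration): ~2 bits per base, average nucleotide
# mass of ~330 g/mol, single-stranded DNA, and zero overhead for error
# correction, redundancy, or physical packaging.

AVOGADRO = 6.022e23      # nucleotides per mole
NT_MASS_G_PER_MOL = 330.0
BITS_PER_BASE = 2.0

bases_per_gram = AVOGADRO / NT_MASS_G_PER_MOL
bytes_per_gram = bases_per_gram * BITS_PER_BASE / 8

print(f"{bases_per_gram:.1e} bases per gram")
print(f"~{bytes_per_gram / 1e18:.0f} exabytes per gram (theoretical ceiling)")
# -> roughly 2e21 bases and a few hundred exabytes per gram, many orders of
#    magnitude beyond tape or flash.
```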

A massive warehouse full of magnetic tapes might be replaced by an amount of DNA the size of a sugar cube. Moreover, while tape might last decades, and paper might last millennia, we have found intact DNA in animal carcasses that have spent three-quarters of a million years frozen in the Canadian tundra. Consequently, there is a push to combine our ability to read and write DNA with our accelerating need for more long-term information storage. Encoding and retrieval of text, photos, and video in DNA has already been demonstrated. (Yes, I am working on one of these projects, but I can't talk about it just yet. We're not even to the embargo stage.) 

Governments and corporations alike have recognized the opportunity. Both are funding research to support the scaling up of infrastructure to synthesize and sequence DNA at sufficient rates.

For a “DNA drive” to compete with an archival tape drive today, it needs to be able to write ~2 Gbits/sec. At roughly 2 bits per base, and before any error-correction or addressing overhead, that is on the order of 1 Gbase/sec, which is the equivalent of ~10 synthetic human genomes/min, or ~10K sHumans/day, if I must coin a unit of DNA synthesis to capture the magnitude of the change. Obviously this is likely to be in the form of either short ssDNA, or possibly medium-length ss- or dsDNA if enzymatic synthesis becomes a factor. If this sDNA were to be used to assemble genomes, it would first have to be assembled into genes, and then into synthetic chromosomes, a nontrivial task. While this would be hard, and would take a great deal of effort and PhD theses, it certainly isn't science fiction.
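Here is that arithmetic as a sketch. The tape-drive write rate and the 2-bits-per-base conversion are the assumptions; real encoding schemes carry overhead, so the outputs are order-of-magnitude figures only.

```python
# Order-of-magnitude arithmetic for a hypothetical "DNA drive" matching an
# archival tape drive's write rate. Assumes ~2 bits per base with no coding
# overhead; real encoding schemes need somewhat more DNA per bit.

TAPE_WRITE_BITS_PER_SEC = 2e9   # ~2 Gbit/s benchmark (assumed)
BITS_PER_BASE = 2.0
HUMAN_GENOME_BASES = 6e9        # one duplex human genome

bases_per_sec = TAPE_WRITE_BITS_PER_SEC / BITS_PER_BASE
genomes_per_min = bases_per_sec * 60 / HUMAN_GENOME_BASES
genomes_per_day = genomes_per_min * 60 * 24

print(f"~{bases_per_sec / 1e9:.0f} Gbase/sec")
print(f"~{genomes_per_min:.0f} synthetic human genomes per minute")
print(f"~{genomes_per_day:,.0f} sHumans per day")
# -> ~1 Gbase/sec, ~10 genomes/min, ~14,000/day: order 10^4 sHumans/day
```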

But here, finally, is the interesting bit: the volume of sDNA necessary to make DNA information storage work, and the necessary price point, would make possible any number of synthetic genome projects. That, dear reader, is definitely something that needs careful consideration by publics. And here I do not mean "the public", the 'them' opposed to scientists and engineers in the know and in the do (and in the doo-doo, just now), but rather the Latiny, rootier sense of "the people". There is no them, here, just us, all together. This is important.

The scale of the demand for DNA storage, and the price at which it must operate, will completely alter the economics of reading and writing genetic information, in the process marginalizing the demand from existing multibillion-dollar biotech markets while at the same time massively expanding capabilities to reprogram life. This sort of pull on biotechnology from non-traditional applications will only increase with time. That means whatever conversation we think we are having about the calm and ethical development of biological technologies is about to be completely inundated and overwhelmed by the relentless pull of global capitalism, beyond borders, probably beyond any control. Note that all the hullabaloo so far about synthetic human genomes, and even about CRISPR editing of embryos, etc., has been written by Western commentators, in Western press. But not everybody lives in the West, and vast resources are pushing development of biotechnology outside of the West. And that is worth an extended public conversation.

So, to sum up, have fun with all the talk of secret genome synthesis. That's boring. I am going off the grid for the rest of the weekend to pester littoral invertebrates with my daughter. You are on your own for a couple of days. Reporters, you are all awesome, make of the above what you will. Also: you are all awesome. When I get back to the lab on Monday I will get right on with fabricating the ubermensch for fun and profit. But — shhh — that's a secret.

70 Years After Hiroshima: "No government is well aware of the economic importance of biotechnology"

I was recently interviewed by Le Monde for a series on the impact of Hiroshima on science and science policy, with a particular focus on biotechnology, synthetic biology, and biosecurity. Here is the story in French. Since the translation via Google is a bit cumbersome to read, below is the English original.

Question 1

On 16 July 1945, after the first full-scale nuclear test in New Mexico (code-named Trinity), the American physicist Kenneth Bainbridge, who directed the test, told Robert Oppenheimer, head of the Manhattan Project, "Now we are all sons of bitches."

In your discipline, do you feel that the moment when researchers might have the same revelation has arrived? Will it come soon?

I think this analogy does not apply to biotechnology. It is crucially important to distinguish between weapons developed in a time of war and the pursuit of science and technology in a time of peace. Over the last thirty years, biotechnology has emerged as a globally important technology because it is useful and beneficial. 

The development and maintenance of biological weapons is internationally outlawed, and has been for decades. The Trinity test, and more broadly the Manhattan Project, was a response to what the military and political leaders of the time considered an existential threat. These were actions taken in a time of world war. The scientists and engineers who developed the U.S. bombs were almost to a person ambivalent about their roles – most saw the downsides, yet were also convinced of their responsibility to fight against the Axis Powers. Developing nuclear weapons was seen as imperative for survival.

The scale of the Manhattan Project (both in personnel and as a fraction of GDP) was unprecedented, and remains so. In contrast to the exclusive governmental domain of nuclear weapons, biotechnology has been commercially developed largely with private funds. The resulting products – whether new drugs, new crop traits, or new materials – have clear beneficial value to our society.

Question 2

Do you have this feeling in other disciplines? Which ones? Why?

No. There is nothing in our experience like the Manhattan Project and nuclear weapons. It is easy to point to the participants’ regrets, and to the long aftereffects of dropping the bomb, as a way to generate debate about, and fear of, new technologies. The latest bugaboos are artificial intelligence and genetic engineering. But neither of these technologies – even if they can be said to qualify as mature technologies – is even remotely as impactful as nuclear weapons.

Question 3

What could be the impact of a "Hiroshima" in your discipline?

In biosecurity circles, you often hear discussion of what would happen if there were “an event”. It is often not clear what that event might be, but it is presumed to be bad. The putative event could be natural or it could be artificial. Perhaps the event might kill as many people as Hiroshima did. (Though that would be hard, as even the most deadly organisms around today cannot wipe out populated cities in an instant.) Perhaps the event would be the intentional use of a biological weapon, and perhaps that weapon would be genetically modified in some way to enhance its capabilities. This would obviously be horrible. The impact would depend on where the weapon came from, and who used it. Was it the result of an ongoing state program? Was it a sample deployed, or stolen, from a discontinued program? Or was it built and used by a terrorist group? A state can be held accountable by many means, but we are finding it challenging to hold non-state groups to account. If the organism is genetically modified, it is possible that there will be pushback against the technology. But biotechnology is producing huge benefits today, and restrictions motivated by the response to an event would reduce those benefits. It is also very possible that biotechnology will be the primary means to provide remedies to bioweapons (probably vaccines or drugs), in which case an event might wind up pushing the technology even faster.

Question 4

After 1945, physicists, including Einstein, engaged in ethical reflection on their own work. Has your discipline done the same? Is it doing the same today?

Ethical reflection has been built into biotechnology from its origins. The early participants met at Asilomar to discuss the implications of their work. Today, students involved in the International Genetically Engineered Machines (iGEM) competition are required to complete a “policy and practices” (also referred to as “ethical, legal, and social implications” (ELSI)) examination of their project. This isn’t window dressing, by any means. Everyone takes it seriously. 

Question 5

Do you think it would be necessary to raise public awareness about the issues related to your work?

Well, I’ve been writing and speaking about this issue for 15 years, trying to raise awareness of biotechnology and where it is headed. My book, “Biology is Technology”, was specifically aimed at encouraging public discussion. But we definitely need to work harder to understand the scope and impact of biotechnology on our lives. No government measures very well the size of the biotechnology industry – either in terms of revenues or in terms of benefits – so very few people understand how economically pervasive it is already. 

Question 6

In your view, what degree of freedom do scientists have in the face of the political and industrial powers that will exploit the results of their work?

Scientists face the same expectation of personal responsibility as every other member of the societies to which they belong. That’s pretty simple. And most scientists are motivated by ideals of truth, the pursuit of knowledge, and improving the human condition. That is one reason why most scientists publish their results for others to learn from. But it is less clear how to control scientific results after they are published. I would turn your question in another direction, and say politicians and industrialists should be responsible for how they use science, rather than putting this all on scientists. If you want to take this back to the bomb, the Manhattan Project was a massive military operation in a time of war, implemented by both government and the private sector. It relied on science, to be sure, but it was very much a political and industrial activity – you cannot divorce these two sides of the Project.

Question 7

Do you think about accurate measures [?] to prevent another Hiroshima?

I constantly think about how to prevent bad things from happening. We have to pay attention to how new technologies are developed and used. That is true of all technologies. For my part, I work domestically and internationally to make sure policy makers understand where biotechnology is headed and what it can do, and also to make sure it is not misused. 

But I think the question is rather off target. Bombing Hiroshima was a conscious decision made by an elected leader in a time of war. It was a very specific sort of event in a very specific context. We are not facing any sort of similar situation. If the intent of the question is to make an analogy to intentional use of biological weapons, these are already illegal, and nobody should be developing or storing them under any circumstances. The current international arms control regime is the way to deal with it. If the intent is to allude to the prevention of “bad stuff”, then this is something that every responsible citizen should be doing anyway. All we can do is pay attention and keep working to ensure that technologies are not used maliciously.

The most important paragraph of The Gene Factory

The most important paragraph of Michael Specter's story about BGI:

"In the United States and in the West, you have a certain way," [BGI President Jian Wang] continued, smiling and waving his arms merrily. "You feel you are advanced and you are the best. Blah, blah, blah. You follow all these rules and have all these protocols and laws and regulations. You need somebody to change it. To blow it up. For the last five hundred years, you have been leading the way with innovation. We are no longer interested in following."

The Arrival of Nanopore Sequencing

(Update 1 March: Thanks to the anonymous commenter who pointed out the throughput estimates for existing instruments were too low.)

You may have heard a little bit of noise about nanopore sequencing in recent weeks.  After many years of development, Oxford Nanopore promises that by the end of the year we will be able to read DNA sequences by threading them through the eye of a very small needle.

How It Works: Directly Reading DNA

The basic idea is not new: as a long string of DNA passes through a small hole, its components -- the bases A, T, G, and C -- plug that hole to varying degrees.  As they pass through the hole, in this case an engineered pore protein derived from one found in nature, each base has slightly different interactions with the walls of the pore.  As a result, while passing through the pore each base lets different numbers of salt ions through, which allows one to distinguish between the bases by measuring changes in electrical current.  Because this method is a direct physical interrogation of the chemical structure of each base, it is in principle much, much faster than any of the indirect sequencing technologies that have come before.

There have been a variety of hurdles to clear to get nanopore sequencing working.  First you have to use a pore that is small enough to produce measurable changes in current.  Next the speed of the DNA must be carefully controlled so that the signal to noise ratio is high enough.  The pore must also sit in an insulating membrane of some sort, surrounded by the necessary electrical circuitry, and to become a useful product the whole thing must be easily assembled in an industrial manner and be mechanically stable through shipping and use.

Oxford Nanopore claims to have solved all those problems.  They recently showed off a disposable version of their technology -- called the MinION -- with 512 pores built into a USB stick.  This puts to shame the Lava Amp, my own experiment with building a USB peripheral for molecular biology.  Here is one part I find extremely impressive -- so impressive it is almost hard to believe: Oxford claims they have reduced the sample handling to a single (?) pipetting step.  Clive Brown, Oxford CTO, says "Your fluidics is a Gilson."  (A "Gilson" would be a brand of pipetter.)  That would be quite something.

I've spent a good deal of my career trying to develop simple ways of putting biological samples into microfluidic doo-dads of one kind or another.  It's never trivial, it's usually a pain in the ass, and sometimes it's a showstopper.  Blood, in particular, is very hard to work with.  If Oxford has made this part of the operation simple, then they have a winning technology just based on everyday ease of use -- what sometimes goes by the labels of "user experience" or "human factors".  Compared to the complexity of many other laboratory protocols, it would be like suddenly switching from MS DOS to OS X in one step.

How Well Does it Work?

The challenge for fast sequencing is to combine throughput (bases per hour) with read length (the number of contiguous bases read in one go).  Existing instruments have throughputs in the range of 10-55,000 megabases/day and read lengths from tens of bases to about 800 bases.  (See chart below.)  Nick Loman reports that using the MinION Oxford has already run DNA of 5000 to 100,000 bases (5 kb to 100 kb) at speeds of 120-1000 bases per minute per pore, though accuracy suffers above 500 bases per minute.  So a single USB stick can easily run at 150 megabases (Mb) per hour, which basically means you can sequence full-length eukaryotic chromosomes in about an hour.  Over the next year or so, Oxford will release the GridION instrument that will have 4 and then 16 times as many pores.  Presumably that means it will be 16 times as fast.  The long read lengths mean that processing the resulting sequence data, which usually takes longer than the actual sequencing itself, will be much, much faster.

This is so far beyond existing commercial instruments that it sounds like magic.  Writing in Forbes, Matthew Herper quotes Jonathan Rothberg, of sequencing competitor Ion Torrent, as saying "With no data release how do you know this is not cold fusion? ... I don't believe it."  Oxford CTO Clive Brown responded to Rothberg in the comments to Herper's post in a very reasonable fashion -- have a look.

Of course I want to see data as much as the next fellow, and I will have to hold one of those USB sequencers in my own hands before I truly believe it.  Rothberg would probably complain that I have already put Oxford on the "performance tradeoffs" chart before they've shipped any instruments.  But given what I know about building instruments, I think immediately putting Oxford in the same bin as cold fusion is unnecessary.

Below is a performance comparison of sequencing instruments originally published by Bio-era in Genome Synthesis and Design Futures in 2007.  (Click on it for a bigger version.)  I've hacked it up to include the approximate performance range of 2nd generation sequencers from Life, Illumina, etc., as well as for a single MinION.  That's one USB stick, with what we're told is a few minutes worth of sample prep.  How many can you run at once?  Notice the scale on the x-axis, and the units on the y-axis.  If it works as promised, the MinION is so vastly better than existing machines that the comparison is hard to make.  If I replotted that data with a log axis along the bottom then all the other technologies would be cramped up together way off to the left. (The data comes from my 2003 paper, The Pace and Proliferation of Biological Technologies (PDF), and from Service, 2006, The Race for the $1000 Genome).
 
[Figure: Carlson_sequencer_performanc_2012.png]

The Broader Impact

Later this week I will try to add the new technologies to the productivity curve published in the 2003 paper.  Here's what it will show: biological technologies are improving at exceptional paces, leaving Moore's Law behind.  This is no surprise, because while biology is getting cheaper and faster, the density of transistors on chips is set by very long term trends in finance and by SEMATECH; designing and fabricating new semiconductors is crazy expensive and requires coordination across an entire industry. (See The Origin of Moore's Law and What it May (Not) Teach Us About Biological Technologies.)  In fact, we should expect biology to move much faster than semiconductors. 

Here are a few graphs from the 2003 paper:

...The long term distribution and development of biological technology is likely to be largely unconstrained by economic considerations. While Moore's Law is a forecast based on understandable large capital costs and projected improvements in existing technologies, which to a great extent determined its remarkably constant behavior, current progress in biology is exemplified by successive shifts to new technologies. These technologies share the common scientific inheritance of molecular biology, but in general their implementations as tools emerge independently and have independent scientific and economic impacts. For example, the advent of gene expression chips spawned a new industrial segment with significant market value. Recombinant DNA, gel and capillary sequencing, and monoclonal antibodies have produced similar results. And while the cost of chip fabs has reached upwards of one billion dollars per facility and is expected to increase [2012 update: it's now north of $6 billion], there is good reason to expect that the cost of biological manufacturing and sequencing will only decrease. [Update 2012: See "New Cost Curves" for DNA synthesis and sequencing.]

These trends--successive shifts to new technologies and increased capability at decreased cost--are likely to continue. In the fifteen years that commercial sequencers have been available, the technology has progressed ... from labor intensive gel slab based instruments, through highly automated capillary electrophoresis based machines, to the partially enzymatic Pyrosequencing process. These techniques are based on chemical analysis of many copies of a given sequence. New technologies under development are aimed at directly reading one copy at a time by directly measuring physical properties of molecules, with a goal of rapidly reading genomes of individual cells.  While physically-based sequencing techniques have historically faced technical difficulties inherent in working with individual molecules, an expanding variety of measurement techniques applied to biological systems will likely yield methods capable of rapid direct sequencing.

Cue nanopore sequencing. 

A few months ago I tweeted that I had seen single strand DNA sequence data generated using a nanopore -- it wasn't from Oxford. (Drat, can't find the tweet now.)  I am certain there are other labs out there making similar progress.  On the commercial front, Illumina is an investor in Oxford, and Life has invested in Genia.  As best I can tell, once you get past the original pore sequencing IP, which it appears is being licensed broadly, there appear to be many measurement approaches, many pores, and many membranes that could be integrated into a device.  In other words, money and time will be the primary barriers to entry.

(For the instrumentation geeks out there, because the pore is larger than a single base, the instrument actually measures the current as three bases pass through the pore.  Thus you need to be able to distinguish 4^3=64 levels of current, which Oxford claims they can do.  The pore set-up I saw in person worked the same way, so I certainly believe this is feasible.  Better pores and better electronics might reduce the physical sampling to 1 or 2 bases eventually, which should result in faster instruments.)
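For those same geeks, here is a toy illustration of what 3-mer readout implies for base calling. Everything in it is invented for the sake of the example: the current values are arbitrary, the signal is noise-free, and real base callers use statistical models rather than an exact lookup table.

```python
# Toy illustration of 3-mer nanopore readout: each window of three bases maps
# to one of 4^3 = 64 current levels, and decoding stitches overlapping 3-mers
# back into a sequence. Current values are arbitrary and the signal is
# noise-free; real base callers use statistical models, not exact lookup.
from itertools import product

BASES = "ACGT"
KMERS = ["".join(k) for k in product(BASES, repeat=3)]              # 64 3-mers
CURRENT_OF_KMER = {k: 50.0 + i * 0.5 for i, k in enumerate(KMERS)}  # invented pA levels
KMER_OF_CURRENT = {v: k for k, v in CURRENT_OF_KMER.items()}

def signal_from_sequence(seq):
    """Idealized signal: one current level per single-base translocation step."""
    return [CURRENT_OF_KMER[seq[i:i + 3]] for i in range(len(seq) - 2)]

def sequence_from_signal(levels):
    """Decode by looking up each level and appending the last base of each 3-mer."""
    kmers = [KMER_OF_CURRENT[level] for level in levels]
    return kmers[0] + "".join(k[-1] for k in kmers[1:])

seq = "GATTACAGATTACA"
assert sequence_from_signal(signal_from_sequence(seq)) == seq
print("decoded:", sequence_from_signal(signal_from_sequence(seq)))
```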

It may be that Oxford will have a first mover advantage for nanopore instruments, and it may be that they have amassed sufficient additional IP to make it rough for competitors.  But, given the power of the technology, the size of the market, and the number of academic competitors, I can't see that over the long term this remains a one-company game.

Not every sequencing task has the same technical requirements, so instruments like the Ion Torrent won't be put to the curbside.  And other technologies will undoubtedly come along that perform better in some crucial way than Oxford's nanopores.  We really are just at the beginning of the revolution in biological technologies.  Recombinant DNA isn't even 40 years old, and the electronics necessary for nanopore measurements only became inexpensive and commonplace in the last few years.  However impressive nanopore sequencing seems today, the greatest change is yet to come.

A Few Thoughts on Water

Years ago, I frequently commuted between Los Angeles and Seattle by air.  The contrast between the two cities was always a bit jarring, particularly in July and August -- high summer on the west coast of North America -- when the lawns in Seattle are brown while all the residential yards in Los Angeles are a beautiful emerald green.  Summer rainfall in Seattle is usually about 1.8 inches spread over those two months, while Los Angeles is essentially dry.

A couple of weeks ago I flew into LAX from the east coast and got another perspective on water use there.  My first glimpse of the basin was the smog lapping up against the rim of the San Gabriel Mountains. I managed to snap a quick photo after we had flown over the ridge (the smog is on the lower left, though the contrast was more impressive when we were looking from the east side).

[Photo: IMG_0477.JPG] Even in May it looks a little dry 'round those parts.

A few minutes later, I noticed large green patches covering the sides (usually the west side) of hills.  This continued all the way to downtown LA, and we were high enough for most of that time that I couldn't figure out why the locals were spending so much of their precious water keeping the sunset sides of hills green.  Then, finally, we passed over one low enough that the purpose jumped out at me.

Cemeteries.

Even in death, Los Angelinos maintain their homage to William Mulholland by keeping him eternally damp.  And in death, Los Angelinos continue to contribute to the smog shown above -- the grass covering the land of the dead is trimmed quite short.  Many, many square miles of it.  A cushy life, have those dead people.  And to be fair to Los Angeles (which, admittedly, is hard for me), Seattle, too, uses a great deal of water and hydrocarbons to keep our decaying ancestors covered with a trim layer of green.  It happens everywhere here.  Welcome to America.

Even the way the US irrigates land to feed the living represents a profligate use of water.  According to the USDA, 80% of the water consumed in this country goes to agriculture. (Note that "use" and "consumption" are often confused.  Agriculture and thermoelectric power generation both "use" about 40% of the nation's freshwater, but while almost 100% of the water used for power generation is returned to where it was taken from -- albeit somewhat warmer than when it was taken -- much of the water put on crops does not reach the roots or is evaporated and lost to the atmosphere.)  Notice that I did not use the word "waste", because some of the leakage winds up back in groundwater, or otherwise finds its way into the environment in a way that might be classified as "beneficial".

And pondering water use here in the US, and the impact on our economy, my thoughts turn to water use in Asia.  Much ado was made in the last couple of years about the IPCC report of anomalous melting of Asian glaciers, followed by the discovery that there was no actual data behind the assertion.

A recent paper in Science adds some much needed analysis to the story.  Walter Immerzeel and colleagues set out to understand the relative importance of meltwater and rainwater to river flows in Asia.  It is interesting to me that this sort of analysis wasn't done before now: "Earlier studies have addressed the importance of glacial and snow melt and the potential effects of climate change on downstream hydrology, but these are mostly qualitative or local in nature."

For five large river basins the authors used a combination of precipitation data, snow melt models, and evaporation rates to calculate the Normalized Melt Index (NMI).  The NMI is the ratio of snow and glacier discharge to downstream discharge.  If all the water in a river downstream is from melting, then this ratio is obviously one; if the ratio is less than one, rainfall contributes more than meltwater; and if it is larger than one, more water is lost downstream through evaporation or other processes (like agriculture) and meltwater is more important for total flow.
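In code, the index is just a ratio. The function and variable names here are mine, for illustration; the paper's actual calculation models melt, precipitation, and evaporation across each basin.

```python
# Normalized Melt Index (NMI), as described above: upstream snow and glacier
# discharge divided by downstream discharge. Names and example numbers are
# illustrative only.

def normalized_melt_index(melt_discharge: float, downstream_discharge: float) -> float:
    """NMI < 1: rainfall dominates downstream flow; NMI > 1: meltwater dominates
    and much of it is lost downstream to evaporation or withdrawals."""
    return melt_discharge / downstream_discharge

# Hypothetical basin where melt supplies a quarter of the downstream flow:
print(normalized_melt_index(melt_discharge=25.0, downstream_discharge=100.0))  # 0.25
```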

Here are the results.  For each of the rivers, the authors calculated the percentage of the total discharge generated by snow and glacial melt:
 

Indus: 151%
Brahmaputra: 27%
Ganges: 10%
Yangtze: 8%
Yellow: 8%

In other words, water supplies in the Indus river valley are largely dependent on meltwater, whereas the large river systems in China appear to be less dependent on meltwater.  That is a very interesting result, because the story told by lots of people (including myself) about the future of water in China is that they are in big trouble due to glacial melting in the Himalayas.  Assuming this result holds up, China may be better off in a warmer world than I had anticipated.

The authors also used various projections of snow and rainfall to estimate what water supplies would look like in these rivers in 2050.  As you might expect, a warmer world leads to less snowfall, more melting, and lower river flows.  But as the warmer world brings increased rainfall, the impact is smaller than has been widely assumed.  I am not going to bother putting any of the numbers in here, because, as the authors note, "Results should be treated with caution, because most climate models have difficulty simulating mean monsoon and the interannual precipitation variation, despite recent progress in improving the resolution of anticipated spatial and temporal changes in precipitation."

But they went one step further and tried to estimate the effects of potential decreased water supply on local food supplies.  Couched in terms of crop yields, etc., Immerzeel et al estimate that the Brahmaputra will support about 35 million fewer people, the Indus will support about 26 million fewer people -- that's food for 60 million fewer people in India and Pakistan, if you are counting -- and the Yellow about 3 million more people.  Finishing up, they write:

We conclude that Asia's water towers are threatened by climate change, but that the effects of climate change on water availability and food security in Asia differ substantially among basins and cannot be generalized. The effects in the Indus and Brahmaputra basins are likely to be severe owing to the large population and the high dependence on irrigated agriculture and meltwater. In the Yellow River, climate change may even yield a positive effect as the dependence on meltwater is low and a projected increased upstream precipitation, when retained in reservoirs, would enhance water availability for irrigated agriculture and food security.

I am perplexed by the take on these results over at Nature News by Richard Lovett.  His piece carries the title, "Global warming's impact on Asia's rivers overblown".  I'll give Lovett the out that he may not have written the actual headline (Editors!), but nonetheless he sets up the Immerzeel paper as a big blow to some unnamed group of doomsayers.  Perhaps he imagines that Immerzeel completely undermines the IPCC report?  This is hardly the case.  As I wrote last January, sorting out the mistake over Himalayan melting rates is an example of science working through a blunder.  Instead of overturning some sort of vague conspiracy, as best I can tell Immerzeel is simply the first real effort to make quantitative assessments of something to which much more attention should have been paid, much earlier than it was.

And even Lovett appears to acknowledge that reducing the human carrying capacity of the Brahmaputra and Indus river valleys by 60 million people is something to be concerned about.  From Lovett: 

The findings are important for policy-makers, says Jeffrey Kargel, a glaciologist at the University of Arizona in Tucson. "This paper adds to mounting evidence that the Indus Basin [between India and Pakistan] is particularly vulnerable to climate change," says Kargel. "This is a matter that obviously concerns India and Pakistan very much."

Indeed.  As they should concern us all.

"National Strategy for Countering Biological Threats"

I recently had cause to re-read the National Strategy for Countering Biological Threats (Full PDF), released last fall by the National Security Council and signed by the President. I think there is a lot to like, and it demonstrates a welcome change in the mindset I encounter in Washington DC.

When the document came out, there was just a little bit of coverage in the press. Notably, Wired's Threat Level, which usually does a commendable job on security issues, gave the document a haphazard swipe, asserting that "Obama's Biodefense Strategy is a Lot Like Bush's".  As described in that post, various commentators were unhappy with the language that Under Secretary of State Ellen Tauscher used when announcing the Strategy at a BWC meeting in Geneva. According to Threat Level, "Sources tell this reporter that the National Security Council had some Bush administration holdovers in charge of editing the National Strategy and preparing Ms. Tauscher's script, and these individuals basically bulldozed the final draft through Defense and State officials with very little interagency input and with a very short suspense." Threat Level also asserts that "Most are disappointed in the language, which doesn't appear to be significantly different than the previous administration." It is unclear who "Most" are.

In contrast to all of this, in my view the Strategy is a clear departure from the muddled thinking that dominated earlier discussions. By muddled, I mean security discussions and policy that, paraphrasing just a little, went like this: "Biology Bad! Hacking Bad! Must Contain!" 

The new National Strategy document takes a very different line. Sources tell this reporter, if you will, that the document resulted from a careful review that involved multiple agencies, over many months, with an aim to develop the future biosecurity strategy of the United States in a realistic context of rapidly spreading infectious diseases and international technological proliferation driven by economic and technical needs. To wit, here are the first two paragraphs from the first page (emphasis added, of course):

We are experiencing an unparalleled period of advancement and innovation in the life sciences globally that continues to transform our way of life. Whether augmenting our ability to provide health care and protect the environment, or expanding our capacity for energy and agricultural production towards global sustainability, continued research and development in the life sciences is essential to a brighter future for all people.

The beneficial nature of life science research is reflected in the widespread manner in which it occurs. From cutting-edge academic institutes, to industrial research centers, to private laboratories in basements and garages, progress is increasingly driven by innovation and open access to the insights and materials needed to advance individual initiatives.

Recall that this document carries the signature of the President of the United States.  I'll pause to let that sink in for a moment.

And now to drive home the point: the new Strategy for Countering Biological Threats explicitly points to garage biotech innovation and open access as crucial components of our physical and economic security. I will note that this is a definite change in perspective, and one that has not fully permeated all levels of the Federal bureaucracy and contractor-aucracy. Recently, during a conversation about locked doors, buddy systems, security cameras, and armed guards, I found myself reminding a room full of biosecurity professionals of the phrase emphasized above. I also found myself reminding them -- with sincere apologies to all who might take offense -- that not all the brains, not all the money, and not all the ideas in the United States are found within the Beltway. Fortunately, the assembled great minds took this as intended and some laughter ensued, because they realized this was the point of including garage labs in the National Strategy, even if not everyone is comfortable with it. And there are definitely very influential people who are not comfortable with it. But, hey, the President signed it (forgive me, did I mention that part already?), so everyone is on board, right?

Anyway, I think the new National Strategy is a big step forward in that it also acknowledges that improving public health infrastructure and countering infectious diseases are explicitly part of countering artificial threats. Additionally, the Strategy is clear on the need to establish networks that both promulgate behavioral norms and that help disseminate information. And the new document clearly recognizes that these are international challenges (p.3):

Our Strategy is targeted to reduce biological threats by: (1) improving global access to the life sciences to combat infectious disease regardless of its cause; (2) establishing and reinforcing norms against the misuse of the life sciences; and (3) instituting a suite of coordinated activities that collectively will help influence, identify, inhibit, and/or interdict those who seek to misuse the life sciences.

...This Strategy reflects the fact that the challenges presented by biological threats cannot be addressed by the Federal Government alone, and that planning and participation must include the full range of domestic and international partners.

Norms, open biology, better technology, better public health infrastructure, and better intelligence: all are themes I have been pushing for a decade now. So, 'nuff said on those points, I suppose.

Implementation is, of course, another matter entirely. The Strategy leaves much up to federal, state, and local agencies, not all of whom have the funding, expertise, or inclination to follow along. I don't have much to say about that part of the Strategy, for now. But I am definitely not disappointed with the rest of it. It is, you might say, the least bad thing I have read out of DC in a long time.

Good Climate Data, Bad Climate "Data" -- Science Always Wins.

This week brings news of 1) a dramatic improvement in the estimates of how soil carbon content is related to atmospheric carbon concentration and 2) the exposure of some really crappy work on the rate of melting of Himalayan glaciers by the Intergovernmental Panel on Climate Change (IPCC).  The soil carbon work is Good Data, but Bad News if you care about the effects of high atmospheric carbon concentrations, while the Himalayan glacier story is all about terrible peer review and Bad Data (non-existent data, actually), which doesn't help anybody figure out the real story on water supplies in Asia.

First up, a paper from this week's PNAS by Breecker et al. at UT Austin, "Atmospheric CO2 concentrations during ancient greenhouse climates were similar to those predicted for A.D. 2100".  Already from the title you can see where this is going.

The problem Breecker and colleagues address is the following: how do you correlate the carbon content of fossil soils with prevailing atmospheric carbon dioxide concentrations?  Well established methods exist for measuring the carbon content of compounds in fossil soil, but less certain were the conditions under which chemical reactions produce those particular compounds.  It turns out that the model used to infer atmospheric CO2 contained an error.  Breecker determined that the primary compound assayed when determining soil carbon content forms at much lower atmospheric CO2 concentrations than had been assumed.

Prior attempts to correlate soil carbon (and by proxy atmospheric CO2) with greenhouse periods in Earth's climate had concluded that warm periods experienced CO2 concentrations of much greater than ~1000 parts per million (ppm).  Therefore, one might conclude that only when average atmospheric CO2 spiked above this level would we be in danger of experiencing greenhouse gas warming that threatened glaciers.  The correction supplied by Breecker substantially lowers estimates of the average CO2 concentration that is correlated with continental glacial melting.  Eyeballing the main figure in the paper, it looks to me like we could be in real trouble above 450 ppm -- today we are just shy of 390 ppm and there is no sign we will be slowing down anytime soon, particularly if India and China keep up their pace of development and emissions.

Looking forward to 2100, things get a touch squiffy because Breecker relies on an estimate of CO2 concentrations that comes out of a model of global economic activity.  So the title of the paper might be a tad alarmist, simply because 2100 is a long way out for any model to be taken too seriously.  But the correction of the paleodata is a big story because at minimum it reduces the uncertainty of atmospheric CO2 levels, and it appears to clarify the connection between CO2 levels and continental glaciation.  More work is needed on the latter point, obviously, or this paper would have been on the cover of Science or Nature.

Now on to a serious screw-up at the IPCC.  Elisabeth Rosenthal at the NYT is reporting that "A much-publicized estimate from a United Nations panel about the rapid melting of Himalayan glaciers from climate change is coming under fire as a gross exaggeration."  Here is Andrew Revkin's take on DotEarth, and anyone interested in this story should read through his post.  The comments are worth perusing because some of the contributors actually seem to have additional useful knowledge, though, of course, nut jobs aplenty show up from both sides of the debate over climate change.

In a nutshell, the issue is that the most recent IPCC chapter on glaciers contained a conclusion, advertised as real analysis, that was in fact a speculation by one scientist promulgated through the popular press.  The authors of that section of the IPCC report may have been warned about the unsubstantiated claim.  Contradictory data and analysis seems to have been ignored.

So, to be frank, this is a giant, inexcusable fuck-up.  The IPCC is composed of so many factions and interest groups that this may be a case of simple blundering or of blatant politicization of science.  But here is the beautiful thing about science -- it is self-correcting.  It may take a while, but science always wins.  (See also my post of a couple of years ago, Dispelling a Climate Change Skeptic's "Deception".)  Every newspaper story I have seen about this particular IPCC screw-up notes that it was brought to light by...wait for it...a climate scientist.  It is an excellent public airing of dirty laundry by the community of science.  So while this episode demonstrates that the last official IPCC report on glacial melting in the Himalayas should not be used for any sort of scientific policy recommendation or economic forecast, you can bet that the next report will do a damn fine job on this topic. 

Finally, whether or not the IPCC gets its act together, there are plenty of good data out there on the state of the planet.  Eventually, Science -- with a capital S -- will get the right answer.  The same methodical process that has resulted in computers, airplanes, and non-stick fry pans will inevitably explain what is really going on with our climate.  And if you use computers, fly on airplanes, or eat scrambled eggs then you are implicitly acknowledging, whatever your political or religious persuasion, that you believe in science.  And you better, 'cause science always wins.

Video from The Economist's World in 2010 Festival

The Economist has posted video from the World in 2010 Festival, held in Washington DC in early December.  The Innovation panel is below, with me (Biodesic), Dean Kamen (DEKA Research), Dwayne Spradlin (Innocentive), and Kai Huang (Guitar Hero), moderated by Matthew Bishop (The Economist).  (Here is a link to video selections from the rest of the event.)  I was chatting with a reporter a few days ago who observed that everyone else on the panel is quite wealthy -- hopefully that bodes well for me in 2010.  But maybe I am destined always to be the odd man out.  C-Span is re-running the video periodically on cable if you want to watch it on a bigger screen, but I can't seem to find an actual schedule.  (Here is their web version: Innovation in 2010.)


I have a couple of general thoughts about the event, colored by another meeting full of economists, bankers, and traders that I attended in the last week of December.  I met a number of fantastically accomplished and interesting people in just a few hours, many of whom I hope will remain lifelong friends. 

First, I have to extend my thanks to The Economist -- they have been very good to me over the last 10 years, beginning in 2000 by co-sponsoring (with Shell) the inaugural World in 2050 writing competition.  (Here is my essay from the competition (PDF).  It seems to be holding up pretty well, these 10 years later, save the part about building a heart.  But at least I wasn't the only one who got that wrong.)

Here is a paraphrased conversation over drinks between myself and Daniel Franklin, the Executive Editor of the newspaper.

Me:  I wanted to thank you for including me.  The Economist has been very kind to me over the past decade.
Franklin: Well, keep doing interesting things.
Me:  Umm, right.  (And then to myself: Shit, I have a lot of work to do.)

On to the World in 2010 Festival.  The professional economists and journalists present all seem to agree that we have seen the worst of the downturn, that the stimulus package clipped the bottom off of whatever we were falling into, and that employment gains going forward could be a long time in coming.  Unsurprisingly, the Democratic politicians and operatives who turned up crowed about the effects of the stimulus, while the Republicans who spoke pooh-poohed any potential bright spots in, well, just about everything.

At the other meeting I attended, last week in Charleston, SC, one panel of 10 people, composed of Federal Reserve and private bankers, traders, and journalists, couldn't agree on anything.  The recovery would be V shaped.  No, no, W shaped.  No, no, no, reverse square root shaped (which was the consensus at The World in 2010 Festival).  No, no, no, no, L shaped.  But even those who agreed on the shape did not agree on anything else, such as the availability of credit, employment, etc.

Basically, as far as I can tell, nobody has the slightest idea what the future of the US economy looks like.  And I certainly don't have anything to add to that.  Except, of course, that the future is biology.

Here is John Oliver's opening monologue from the Festival.  He was absolutely hilarious.  Unfortunately you can't hear the audience cracking up continuously.  I nearly pissed myself.  Several times.  (Maybe the cocktails earlier in the evening contributed to both reactions.)

Back to Innovation in 2010.  Dean Kamen had this nice bit in response to a question about whether the imperative to invent and innovate has increased in recent years (see 36:20 in the C-Span video): "7 billion people can't be recipients, they have to be part of the solution.  And that is going to require advanced technologies to be properly developed and properly deployed more rapidly than ever before."

To this I can only add that we are now seeing more power to innovate put into the hands of individuals than has ever occurred in the history of humanity.  Let's hope we don't screw up.