Bumps for Biofuels and Growing Pains for the BioEconomy

I found this post, written in early 2008, for some reason sitting unpublished in my archives.  It is just as relevant today, now that we are through the worst of the economic meltdown, so I'll push the "publish" button in just a moment.  I updated the revenue numbers for the US, but otherwise it is unchanged.  I note that high farm prices are again putting pressure on the amount of land in the conservation reserve program.

---------------

Just as we reach the point where biological technologies can begin to economically replace the industrial chemistry we have relied on for the last two centuries, the price of raw materials is going through the roof.  As explored in my recent article, "Laying the Foundations for a Bio-Economy", the contribution of "genetically modified stuff" to the U.S. economy already amounts to the equivalent of more than 2% of GDP, or north of $300 billion.  [See the Biodesic 2011 Bioeconomy Update for the updated revenue numbers.]  About 80% of this total is from agriculture and industrial products, where revenues from the latter are growing 15-20% a year.  But as more products of industrial biotechnology hit the market, they will compete for more expensive feedstock resources.

The New York Times carried two stories on 9 April that illustrate some of the attendant issues.  In "Harnessing Biology, and Avoiding Oil, for Chemical Goods", Yudhijit Bhattacharjee gives a short summary of the shift from chemistry to biology for producing everything from plastic to fuel.  I've written here before about DuPont's success with Sorona, a plastic made using corn processed by engineered bacteria.  By itself, Sorona is already a billion-dollar product.  It seems DuPont has discovered additional uses for materials that are produced using biology:

The payoffs from developing biobased chemicals could be huge and unexpected, said Dr. John Pierce, DuPont's vice president for applied biosciences-technology. He pointed to DuPont's synthesis of propanediol, which was pushed along by the company's goal to use the chemical to make Sorona, a stain-resistant textile that does not lose color easily.

Soon DuPont scientists realized that bioderived propanediol could also be used as an ingredient in cosmetics and products for de-icing aircraft. The high-end grades that are now used in cosmetics are less irritating than traditional molecules, Dr. Pierce said, and the industrial grade used in de-icing products is biodegradable, which makes it better than other options.

DuPont is, of course, not the only one in this game.  Cathay Industrial Biotech, for example, ships many different polymers composed of long chain dicarboxylic acids, which are derived from corn and used in anticorrosion products for cars.  Both firms are buying more corn just as prices for commodities are headed through the roof.  Higher prices are now leading U.S. farmers to pull land out of conservation programs for use in producing more crops, as described by David Streitfeld in "As Prices Rise, Farmers Spurn Conservation Program".  Corn, wheat, and soy prices are all up, but so are the prices of oil and fertilizer.

Ostensibly, the Conservation Reserve Program pays farmers to keep environmentally sensitive land out of production.   In the context of a grain surplus, this has the effect of reducing the total amount of land in production, thereby keeping prices a bit higher.  But the surplus of recent decades is over, due in large part to increases in demand in developing countries (see, for example, my post "China and Future Resource Demands"). 

The utility of keeping lands in conservation programs is debated intensely by a range of interested parties, including farmers, policy makers, conservationists, hunters, and even bakers.  From Streitfeld's article:

"We're in a crisis here. Do we want to eat, or do we want to worry about the birds?" asked JR Paterakis, a Baltimore baker who said he was so distressed at a meeting last month with Edward T. Schafer, the agriculture secretary, that he stood up and started speaking "vehemently."

The Paterakis bakery, H&S, produces a million loaves of rye bread a week. The baker said he could not find the rye flour he needed at any price.   

..."The pipeline for wheat is empty," said Michael Kalupa, a bakery owner in Tampa, Fla., who is president of the Retail Bakers of America. Mr. Kalupa said the price he paid for flour had doubled since October. He cannot afford to absorb the cost and he cannot afford to pass it on. Sales have been falling 16 percent to 20 percent a month since October. He has laid off three employees.

Among farmers, the notion of early releases from conservation contracts is prompting sharp disagreement and even anger. The American Soybean Association is in favor. "We need more food," said John Hoffman, the association's president.

The National Association of Wheat Growers is against, saying it believes "in the sanctity of contracts." It does not want more crops to be grown, because commodity prices might go down.

That is something many of its members say they cannot afford, even with wheat at a robust $9 a bushel. Their own costs have increased, with diesel fuel and fertilizer up sharply. "It would decrease my profit margin, which is slim," said Jeff Krehbiel of Hydro, Okla. "Let's hurt the farmer in order to shut the bakers up, is that what we're saying?"

Mr. Krehbiel said his break-even last year was $4 a bushel. This summer it will be $6.20; the next crop, $7.75.

That a baker in the U.S. can't even find the flour he needs is remarkable, though it may not actually be a harbinger of food shortages.  One reason that baker is having trouble is no doubt an increase in demand; another, equally without doubt, is the shift in grain production priorities to accommodate increased use of biofuels.

Much in the news the last couple of months has been the assertion that production and use of biofuels is largely responsible for recent increases in food prices.  But how much of the price increase is due to shifting crops to fuel use?

Further Thoughts on iGEM 2011

Following up on my post of several weeks ago (iGEM 2011: First Thoughts), here is a bit more on last year's Jamboree.  I remain very, very impressed by what the teams did this year.  And I think that watching iGEM from here on out will provide a sneak peek of the future of biological technologies.

I think the biggest change from last year is the choice of applications, which I will describe below.  And related to the choice of applications is a change of approach toward a more complete design philosophy.  I'll get to the shift in design sensibility further on in the post.

The University of Washington: Make it or Break it

I described previously the nuts and bolts of the University of Washington's Grand Prize winning projects.  But, to understand the change in approach (or perhaps change in scope?) this project represents, you also have to understand a few details about problems in the real world.  And that is really the crux of the matter -- teams this year took on real world problems as never before, and may have produced real world solutions.

Recall that one of the UW projects was the design of an enzyme that digests gluten, with the goal of using that enzyme to treat gluten intolerance.  Candidate enzymes were identified by examining the literature, with the aim of finding something that works at low pH.  The team chose a particular starter molecule, and then used the "video game" Foldit to re-design the active site in silico so that it would chew up gluten (here is a very nice YouTube video on the Foldit story from Nature).  They then experimentally tested many of the potential improvements.  The team wound up with an enzyme that in a test tube is ~800 times better than one already in clinical trials.  While the new enzyme would of course itself face lengthy clinical trials, the team's achievement could have an enormous impact on people who suffer from celiac disease, among many other ailments.

From a story in last week's NYT Magazine ("Should We All Go Gluten-Free?"), here are some eye-opening stats on celiac disease, which can cause symptoms ranging from diarrhea to dramatic weight loss:

  • Prior to 2003, prevalence in the US was thought to be just 1 in 10,000: widespread testing revealed the actual rate was 1 in 133.
  • Current estimates are that 18 million Americans have some sort of gluten intolerance, which is about 5.8% of the population.
  • Analyses of old blood samples show that young people in the 1990s were 5x more likely to have the disease than young people in the 1950s.
  • Prevalence is increasing not just in the US, but also worldwide.

In other words, celiac disease is a serious autoimmune disorder that for some reason is affecting ever larger parts of the global population.  And as a summer project a team of undergraduates may have produced a (partial) treatment for the disease.  That eventual treatment would probably require tens of millions of dollars of further investment and testing before it reaches the market.  However, the market for gluten-free foods, as estimated in the Times, is north of $6 billion and growing rapidly.  So there is plenty of market potential to drive investment based on the iGEM project.

The other UW project is a demonstration of using E. coli to directly produce diesel fuel from sugar.  The undergraduates first reproduced work published last year from LS9 in which E. coli was modified to produce alkanes (components of diesel fuel -- here is the Science paper by Schirmer et al).  Briefly, the UW team produced biobricks -- the standard format used in iGEM -- of two genes that turn fatty acids into alkanes.  Those genes were assembled into a functional "Petrobrick".  The team then identified and added a novel gene to E. coli that builds fatty acids from 3 carbon seeds (rather than the native coli system that builds on 2 carbon seeds).  The resulting fatty acids then served as substrates for the Petrobrick, resulting in what appears to be the first report anywhere of even-chain alkane synthesis.  All three genes were packaged up into the "FabBrick", which contains all the components needed to let E. coli process sugar into a facsimile of diesel fuel.
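To make the carbon bookkeeping concrete, here is a minimal sketch of the chain-length arithmetic in Python (my own illustration of the logic, not the team's code):

```python
# Carbon bookkeeping behind the PetroBrick/FabBrick result, as I read it.
# My own illustrative sketch; the chain lengths are the standard ones.

def fatty_acid_lengths(seed_carbons, cycles):
    """Fatty acid synthesis extends a seed unit by 2 carbons per cycle."""
    return [seed_carbons + 2 * n for n in cycles]

def alkane_lengths(fatty_acids):
    """The two PetroBrick enzymes (acyl-ACP reductase and aldehyde
    decarbonylase) remove one carbon in converting fatty acid to alkane."""
    return [c - 1 for c in fatty_acids]

cycles = range(6, 9)  # enough elongation cycles to reach diesel-range chains

# Native E. coli: 2-carbon (acetyl) seed -> even-chain fatty acids,
# which yield odd-chain alkanes (the LS9 result).
print(alkane_lengths(fatty_acid_lengths(2, cycles)))  # [13, 15, 17]

# Add the 3-carbon-seed enzyme: odd-chain fatty acids, which yield
# even-chain alkanes (the new UW result).
print(alkane_lengths(fatty_acid_lengths(3, cycles)))  # [14, 16, 18]
```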

The undergraduates managed to substantially increase the alkane yield by massaging the culture conditions, but the final yield is a long way from being useful to produce fuel at volume.  But again, not bad for a summer project.  This is a nice step toward turning first sugar, then eventually cellulose, directly into liquid fuels with little or no purification or post-processing required.  It is, potentially, also a step toward "Microbrewing the Bioeconomy".  For the skeptics in the peanut gallery, I will be the first to acknowledge that we are probably a long way from seeing people economically brew up diesel in their garage from sugar.  But, really, we are just getting started.  Just a couple of years ago people thought I was all wet forecasting that iGEM teams would contribute to technology useful for distributed biological manufacturing of fuels.  Now they are doing it.  For their summer projects.  Just wait a few more years.

Finally -- yes, there's more -- the UW team worked out ways to improve the efficiency of so-called Gibson cloning.  They also packaged up as biobricks all the components necessary to produce magnetosomes in E. coli.  The last two projects didn't make it quite as far as the first two, but still made it further than many others I have seen in the last 5 years.

Before moving on, here is a thought about the mechanics of participating in iGEM.  I think the UW wiki is about the best I have seen.   I like very much the straightforward presentation of hypothesis, experiments, and results.  It was very easy to understand what they wanted to do, and how far they got.  Here is the "Advice to Future iGEM Teams" I posted a few years ago.  Aspiring iGEM teams should take note of the 2011 UW wiki -- clarity of communication is part of your job.

Lyon-INSA-ENS: Cobalt Buster

The team from Lyon took on a very small problem: cleaning up cooling water from nuclear reactors using genetically modified bacteria.  This was a nicely conceived project that involved identifying a problem, talking to stakeholders, and trying to provide a solution.  As I understand it, there are ongoing discussions with various sponsors about funding a start-up to build prototypes.  It isn't obvious that the approach is truly workable as a real world solution -- many questions remain -- but the progress already demonstrated indicates that dismissing this project would be premature.

Before continuing, I pause to reflect on the scope of Cobalt Buster.  One does wonder about the eventual pitch to regulators and the public: "Dear Europe, we are going to combine genetically modified organisms and radiation to solve a nuclear waste disposal problem!"  As the team writes on its Human Practices page: "In one project, we succeed to gather Nuclear Energy and GMOs. (emphasis in original)"  They then acknowledge the need to "focus on communication".  Indeed.

Here is the problem they were trying to solve: radioactive cobalt (Co) is a contaminant emitted during maintenance of nuclear reactors.  The Co is typically cleaned up with ion exchange resins, which are expensive and, once spent, must be appropriately disposed of as nuclear waste.  By inserting a Co importer pump into E. coli, the Lyon team hopes to use bacteria to concentrate the Co and thereby clean up reactor cooling water.  That sounds cool, but the bonus here is that modelling of the system suggests that using E. coli as a biofilter in this way would result in substantially less waste.  The team reports that they expect 8000 kg of ion exchange resins could be replaced with 4 kg of modified bacteria.  That factor of 2000 reduction in the mass of waste would have a serious impact on disposal costs.  And the modified bug appears to work in the lab (with nonradioactive cobalt), so this story is not just marketing.

The Lyon team also inserted a Co sensor into their E. coli strain.  The sensor then drove expression of a protein that forms amyloid fibers, causing the coli in turn to form a biofilm.  This biofilm would stabilize the biofilter in the presence of Co.  The filter would only be used for a few hours before being replaced, which would not give the strain enough time to lose this circuit via selection.

Imperial College London: Auxin

Last, but certainly not least, is the very well thought through Imperial College project to combat soil erosion by encouraging plant root growth.  I saved this one for last because, for me, the project beautifully reflects the team's intent to carefully consider the real-world implications of their work.  There are certainly skeptics out there who will frown on the extension of iGEM into plants, and who feel the project would never make it into the field due to the many regulatory barriers in Europe.  I think the skeptics are completely missing the point.

To begin, a summary of the project: the Imperial team's idea was to use bacteria as a soil treatment, applied in any number of ways, that would be a cost-effective means of boosting soil stability through root growth.  The team designed a system in which genetically modified bacteria would be attracted to plant roots, would then take up residence in those roots, and would subsequently produce a hormone that encourages root growth.

The Auxin system was conceived to combine existing components in very interesting ways.  Naturally-occurring bacteria have already been shown to infiltrate plant roots, and other soil-dwelling bacteria produce the same growth hormone that encourages root proliferation.

Finally, the team designed and built a novel (and very clever) system for preventing leakage of transgenes through horizontal gene transfer.  On the plasmid containing the root growth genes, the team also included genes that produce proteins toxic to bacteria.  But in the chromosome, they included an anti-toxin gene.  Thus if the plasmid were to leak out and be taken up by a bacterium without the anti-toxin gene, any gene expression from the plasmid would kill the recipient cell.
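The containment logic is simple enough to write out as a truth table.  Here is a toy sketch in Python (my own rendering of the scheme as I read it from the wiki, not the team's design files):

```python
# Toxin rides on the plasmid; anti-toxin sits in the chromosome.

def cell_survives(has_plasmid, has_chromosomal_antitoxin):
    toxin = has_plasmid  # plasmid carries the toxin + root-growth genes
    return (not toxin) or has_chromosomal_antitoxin

assert cell_survives(True, True)       # engineered strain: lives
assert not cell_survives(True, False)  # wild bug that picks up the plasmid:
                                       # dies, and the transgenes die with it
assert cell_survives(False, False)     # unmodified bystander: unaffected
```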

The team got many of these pieces working independently, but didn't quite get the whole system working together in time for the international finals.  I encourage those interested to have a look at the wiki, which is really very good.

The Shift to Thinking About Design

As impressive as Imperial's technical results were, I was also struck by the integration of "human practices" into the design process.  The team spoke to farmers, economists, Greenpeace -- the list goes on -- as part of both defining the problem and attempting to finesse a solution given the difficulty of fielding GMOs throughout the UK and Europe.  And these conversations very clearly impacted the rest of the team's activities.

One of the frustrations felt by iGEM teams and judges alike is that "human practices" has often felt like something tacked on to the science for the sake of placating potential critics.  There is something to that, as the Ethical, Legal, and Social Implications (ELSI) components of large federal projects such as The Human Genome Project and SynBERC appear to have been tacked on for just that reason.  Turning "human practices" into an appendix on the body of science is certainly not the wisest way to go forward, for reasons I'll get to in a moment, nor is it politically savvy in the long term.  But if the community is honest about it, tacking on ELSI to get funding has been a successful short-term political hack.

The Auxin project, along with a few other events during the finals, helped crystallize for me the disconnect between thinking about "human practices" as a mere appendix while spouting off about how synthetic biology will be the core of a new industrial revolution, as some of us tend to do.  Previous technological revolutions have taught us the importance of design, of thinking the whole project through at the outset in order to get as much right as possible, and to minimize the stuff we get wrong.  We should be bringing that focus on design to synthetic biology now.

I got started down this line of thought during a very thought-provoking conversation with Dr. Megan Palmer, the Deputy Director for Practices at SynBERC.  (Apologies to you, Megan, if I step on your toes in what follows -- I just wanted to get these thoughts on the page before heading out the door for the holidays.)  The gist of my chat with Megan was that the focus on safety and security as something else, as an activity separate from the engineering work of SB, is leading us astray.  The next morning, I happened to pass Pete Carr and Mac Cowell having a chat just as one of them was saying, "The name human practices sucks. We should really change the name."  And then my brain finally -- amidst the jet lag and 2.5 days of frenetic activity serving as a judge for iGEM -- put the pieces together.  The name does suck.  And the reason it sucks is that it doesn't really mean anything.

What the names "human practices" and "ELSI" are trying to get at is the notion that we shouldn't stumble into developing and using a powerful technology without considering the consequences.  In other fields, whether you are thinking about building a chair, a shoe, a building, an airplane, or a car, in addition to the shape you usually spend a great deal of time thinking about where the materials come from, how much the object costs to make, how it will be used, who will use it, and increasingly how it will be recycled at end of use.  That process is called design, and we should be practicing it as an integral part of manipulating biological systems.

When I first started as a judge for iGEM, I was confused by the kind of projects that wound up receiving the most recognition.  The prizes were going to nice projects, sure, but those projects were missing something from my perspective.  I seem to recall protesting at some point in that first year that "there is an E in iGEM, and it stands for Engineering."  I think part of that frustration was that the pool of judges was dominated for many years by professors funded by the NIH, NRC, or the Wellcome Trust, for example -- scientists who were looking for scientific results they liked to grace the pages of Science or Nature -- rather than engineers, hackers, or designers who were looking for examples of, you know, engineering.

My point is not that the process of science is deficient, nor that all lessons from engineering are good -- especially as for years my own work has fallen somewhere in between science and engineering.  Rather, I want to suggest that, given the potential impact of all the science and engineering effort going into manipulating biological systems, everyone involved should be engaging in design.  It isn't just about the data, nor just about shiny objects.  We are engaged in sorting out how to improve the human condition, which includes everything from uncovering nature's secrets to producing better fuels and drugs.  And it is imperative that as we improve the human condition we do not diminish the condition of the rest of the life on this planet, as we require that life to thrive in order that we may thrive.

Which brings me back to design.  It is clear that not every experiment in every lab that might move a gene from one organism to another must consider the fate of the planet as part of the experimental design.  Many such experiments have no chance of impacting anything outside the test tube in which they are performed.  But the practice of manipulating biological systems should be done in the context of thinking carefully about what we are doing -- much more carefully than we have been, generally speaking.  Many fields of human endeavor can contribute to this practice.  There is a good reason that ELSI has "ethical", "legal", and "social" in it.

There have been a few other steps toward the inclusion of design in iGEM over the years.  Perhaps the best example is the work designers James King and Daisy Ginsberg did with the 2009 Grand Prize Winning team from Cambridge (see iGEM 2009: Got Poo?).  That was lovely work, and was cleverly presented in the "Scatalog".  You might argue that the winners over the years have had increasingly polished presentations, and you might worry that style is edging out substance.  But I don't think that is happening.  The steps taken this year by Imperial, Lyon, and Washington toward solving real-world problems were quite substantive, even if those steps are just the beginning of a long path to get solutions into people's hands.  That is the way innovation works in the real world.

iGEM 2011: First Thoughts

Congratulations to the 2011 University of Washington iGEM team for being the first US team ever to win the Grand Prize.  The team also shared top honors for Best Poster (with Imperial College London) and for Best Food/Energy Project (with Yale).  The team also had (in my opinion) the clearest, and perhaps best overall, wiki describing the project that I have seen in 5 years as an iGEM judge.  I only have a few minutes in the airport to post this, but I will get back to it later in the week.

The UW team had an embarrassment of riches this year.  One of the team's projects demonstrated production of both odd and even chain alkanes in E. coli directly from sugar.  The odd-chain work reproduces the efforts of a Science paper published by LS9 last year, but the team also added an enzyme from B. subtilis to the pathway that builds alkanes starting from a 3-carbon seed rather than the normal 2-carbon seed in coli.  This latter step allowed them to make even-chain alkanes via a synthetic biological pathway, which has not been reported elsewhere.  So they wound up directly making diesel fuel from sugar.  The yields aren't yet high enough to roll this sort of thing out more widely, but it's not bad for a summer project.

And that's not all.

The other main project was an effort to produce an enzyme to digest gluten.  There is one such enzyme in clinical trials at the moment, intended for use as a therapeutic for gluten intolerance, which afflicts about 1% of the population.  However, that enzyme is not thermostable and has an optimum pH of 7.

The UW team found an enzyme in the literature that was not known to digest gluten, but which works at pH 4 (close to the human stomach) and is from a thermophilic organism.  They used Foldit to redesign the enzyme to process gluten, and then built a library of about 100 variants of that design.  One of those variants wound up working ~800 times better than the enzyme that is currently in clinical trials.  And the team thinks they can do even better by combining some of the mutants from the library.

Nice work.

I could go on and on about the competition this year.  The teams are all clearly working at a new level.  I recall that a couple of years ago at iGEM Drew Endy asked me, somewhat out of frustration, "Is this it?  Is this all there is?"  The answer: No.  There is a hell of a lot more.  And the students are just getting started.

Plenty of other teams deserve attention in this space, in particular Imperial College London, the runner-up.  They built a system (called Auxin) in E. coli to encourage plant root growth, with the aim of stopping desertification.  And their project was an extremely good example of design, from the technical side through to conversations with customers (industry) and other stakeholders (Greenpeace) about what deployment would really be like.

More here later in the week.  Gotta run for the plane.

It is the End of the World as We Know it, and I feel Strangely Ambivalent: Synthetic Biology 5.0

Synthetic Biology 5.0 has come and gone.  I expected, as in previous years, to be busy liveblogging amid the excitement.  I tweeted some during the proceedings (here is Eric Ma's summary of #synbio5 tweets), but this is my first post about the meeting, and probably the last one.  I mostly just listened, took a few notes, and was delighted to see the progress being made.  I was not nearly as amped up about the proceedings as in previous years, and I am still trying to figure out why. 

Here are a couple of reasons I have sorted out so far.  It was the end of the beginning of synthetic biology.  The meeting was full of science and engineering.  And that's about all.  There were a few VC's and other investors sniffing around, but not nearly so many as in previous years; those who did show up kept a lower profile.  There were also fewer obvious government officials, no obvious spooks, no obvious law enforcement officers, nor any self-identified Weapons of Mass Destruction Coordinators.  And I only encountered a couple of reporters, though there must have been more.  I skipped 3.0 in Zurich, but at 1.0 at MIT, 2.0 at Berkeley (parts 1, 2, 3, 4, 5), and 4.0 in Hong Kong (part 1), there was much more buzz.  Synthetic Biology 5.0 was much shorter on hype than prior gatherings. 

There was substantially more data this year than previously.  And there was substantially less modeling.  All in all, Synthetic Biology is substantially more ... substantial.  It was like a normal scientific meeting.  About science.  No stunts from "civil society" groups looking for their next fear bullet point for fundraising.  No government officials proclaiming SB as the economic future of their city/state/country.  Just science.

What a relief.

And that science was nothing to sneeze at.  There were great talks for 3 days.  Here are a couple of things that caught my eye.

Jef Boeke from Johns Hopkins presented his plans to build synthetic yeast chromosomes.  I first heard this idea more than ten years ago from Ron Davis at Stanford, so it isn't brand new.  I did notice, however, that Boeke is having all his synthetic chromosomes made in China.  Over the longer term this means China is getting a boost in building out future biomanufacturing platforms.  If the project works, that is.

As tweeted, Jack Newman from Amyris gave an update on commercialization of artemisinin; it should be on the market by the end of the year, which would be in time to help avert an expected shortfall in production from wormwood.  Fantastic.

Pam Silver and her various students and post-docs showed off a variety of interesting results.  First, Faisal Aldaye showed in vivo DNA scaffolds used to channel metabolic reactions, resulting in substantial increases in yield.  Second, Pam Silver showed the use of those scaffolds to generate twice as much sucrose from hacked cyanobacteria per unit of biomass as from sugar cane.  If that result holds up, and if the various issues related to the cost of bioreactors used to culture photosynthetic organisms are worked out, then Pam's lab has just made an enormous step forward in bringing about distributed biological manufacturing.

This is the sort of advance that makes me feel more sanguine about the future of Microbrewing the Bioeconomy.  It will take some years before the volume of Amyris' Biofene, or Gevo's bio-PET, or Blue Marble's bio-butyric acid begins to impact the oil industry.  But it is clear to me now as never before that the petroleum industry is vulnerable from the top of the barrel -- the high value, low volume compounds that are used to build the world around us in the form of petrochemicals.  Biology can now be used to make all those compounds, too, directly from sugar, cellulose, and sunlight, without the tens of billions of dollars in capital required to run an oil company (see The New Biofactories).

So SB 5.0 was the end of the world as we know it.  Synthetic biology is now just another field of human endeavor, thankfully producing results and also thankfully suffering reduced hype.  I can see how the pieces are starting to fit together to provide for sustainable manufacturing and energy production, though it will be some years before biological technologies are used this way at scale.  Perhaps this is less in-your-face exciting for the attendees, the press, and the public, and that may be part of the reason for my ambivalence.  I fell asleep several times during the proceedings, which has never happened to me at SB X.0, even when overseas and jetlagged.  I have never before thought of achieving boredom as constituting progress.

An Engineered Bug that Produces Isobutanol from Cellulose

This morning, Tom Murray at The Hastings Center pointed me to a new paper from James Liao's lab at UCLA demonstrating the first engineered bug that produces isobutanol from cellulose.  Wendy Higashide, et al, ported the artificial isobutanol synthesis pathway from the group's earlier work in E. coli (see this previous post) into Clostridium cellulolyticum.  Here is the article.

Recall that butanol is a much better biofuel than is ethanol.  Butanol is also not hygroscopic (doesn't suck up water), which means it can be blended at any point in the distribution chain, whereas ethanol must be trucked/barged/piped in dedicated infrastructure until just upstream of a gas station in order to avoid pulling contaminating water into the fuel stream.  Butanol has a long history of use as a transportation fuel, and has been demonstrated to be a drop-in replacement for gasoline in existing engines.  See, for example, the work of the 2007 iGEM team from Alberta, and my earlier post "A Step Toward Distributed Biofuel Production?"  One advantage of making butanol instead of ethanol is that butanol spontaneously phase separates from water (i.e., it floats to the top of the tank) at concentrations above about 7.5% by volume, which substantially reduces the energy required to separate the molecule for use as a fuel.

The press release accompanying the Higashide paper describes the work as a "proof of concept".  The team attempted to insert the isobutanol synthesis pathway into a Clostridium strain isolated from decaying grass -- a strain that naturally consumes cellulose.  Unfortunately, this Clostridium strain is not as well characterized as your average lab strain of coli, nor does it have anywhere near the same number of bells, knobs, and whistles for controlling the inserted metabolic genes.  The short summary of the paper is that the team managed to produce 660 mg of isobutanol per liter of culture.  This is only about 0.08% by volume, or ~100 times below the concentration at which butanol phase separates from water.  The team lays out a number of potential routes to improving this yield, including better characterization of the host organism, or simply moving to a better characterized organism.
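For those who want the arithmetic, here is the unit conversion as a quick back-of-envelope in Python (my own, using a textbook density for isobutanol):

```python
# Convert the reported titer to volume percent and compare it to the
# phase-separation threshold discussed above.

titer_g_per_L = 0.660        # 660 mg/L, as reported by Higashide et al.
density_g_per_mL = 0.802     # isobutanol at roughly room temperature

vol_percent = titer_g_per_L / (density_g_per_mL * 1000) * 100
print(f"{vol_percent:.3f}% by volume")   # ~0.082%

threshold_percent = 7.5      # approximate butanol phase-separation point
print(f"{threshold_percent / vol_percent:.0f}x below threshold")  # ~91x
```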

So, a nice proof of principle.  This is exactly the sort of technological transformation I discuss in my book.  But this proof of concept is not anywhere near being economically useful or viable.  Nonetheless, this progress demonstrates the opportunities ahead in relying on biology for more of our industrial production.

Yummy, Corrosive Biodiesel

Yummy for microbes, that is.  Who turn the methyl esters in biodiesel -- with some intermediate steps -- into hydrogen sulfide that corrodes carbon steel.

This is according to a paper published last month in Energy & Fuels, in which Aktas et al. explore "Anaerobic Metabolism of Biodiesel and Its Impact on Metal Corrosion".  The authors observe that "Despite the global acceptance of biodiesel, the impact of integrating this alternate fuel with the existing infrastructure has not been fully explored."

Here is a paragraph from the paper, full of interesting tidbits:

The chemical stability characteristics of biodiesel are well-documented,(3, 4) but the susceptibility of this fuel to biodegradation is not well-known. Biodiesel methyl esters are sparingly soluble in seawater, with a saturation concentration of 7 ppm at 17 °C.(5) Several studies showed that aerobic microorganisms readily degrade biodiesel.(6-8) The half-life for the biodegradation of the vegetable methyl esters in agitated San Francisco Bay water was less than 4 days at 17 °C.(9) However, anaerobic conditions prevail whenever heterotrophic microbial respiration consumes oxygen at a rate that exceeds diffusion. This is typically the case in subsurface environments, including oil reservoirs,(10-12) oil-contaminated habitats,(13) refineries, storage vessels, pipelines, oil−water separators, and ballast tanks.

It is interesting, first, that biodiesel spills might be metabolized by bugs in the environment at a much greater rate than petrodiesel spills.  It is also interesting that our steel infrastructure might be susceptible to more rapid degradation with the inclusion of bio-products.  Plastics, anyone?

The paper concludes:

Our studies suggest that biodiesel can be quite easily hydrolyzed and converted to a variety of fatty acid intermediates by anaerobic microorganisms, regardless of their previous hydrocarbon- or biodiesel-exposure history. The acidic nature of these intermediates accelerates the pitting corrosion process of the most common metal alloy used throughout the fuel infrastructure.(39) The corrosion of pipelines, tanks, storage units, and associated equipment increases the risk of the release of hazardous materials to the environment, with concomitant pollution issues. With the widespread use of biodiesel as an additive to fuel supplies, it is at least prudent to consider how best to avoid the negative consequences associated with the microbial metabolism of these labile fuel components.

Something to watch, obviously.

Micro-Brewing the Bioeconomy: Beer as an Example of Distributed Biological Manufacturing (Updated, and again)

(Updated yet again, 19 June, 2011: Here is a technical report from Biodesic based on the post below (PDF).  "Microbrewing the Bioeconomy: Innovation and Changing Scale in Industrial Production")

(I used this data as part of my report on the bioeconomy and biosecurity for the Biodefense Net Assessment: Causes and Consequences of Bioeconomic Proliferation.)

Ah, beer.  The necessary lubricant of science.  Always the unacknowledged collaborator in the Nobel Prize.  Whether critical to the formulation of quantum mechanics in the pubs of Copenhagen, smoothing the way to the discovery of the double-helix in Cambridge, or helping celebrate an iGEM victory in that other Cambridge (congratulations again, almost-Dr. Brown and team), beer is always there.

And now it is helping me think about the future of biological manufacturing.  Not just by drinking it, though I can't say it hurts.  Yet.

Anyway, the rise of craft brewing in the US is an interesting test case, and a proof of principle, of distributed biological manufacturing successfully emerging in a market dominated by large scale industrial production.  To wit, Figure 1:

[Image: US_Brewery_Count_Biodesic.png]
Figure 1.  The number of US large and small breweries over the last century.  The (official) count was forced to zero during Prohibition.

A Short, Oversimplified History of Craft Brewing

Before Prohibition, the vast majority of beer produced in the US was brewed by relatively small operations and distributed locally.  There was no refrigeration, nor were there highways and trucks, so beer had to be drunk rather than produced and stored in large quantities (modulo some small amount of storage in basements, caves, etc.).  Moreover, the official count of breweries went to zero during the years 1920-1933.  After Prohibition, brewing was regulated and small scale producers were basically shut out of the market.

With the aid of refrigeration and transportation, large scale breweries took off.  Consolidation took its toll -- beer is pretty close to a commodity, after all -- and the number of breweries in the US shrank until about 1980.  In 1979, Jimmy Carter signed legislation reopening the market to small brewers.  This is an interesting and crucial point, because as far as I can tell nothing else substantive changed about the market.  (OK, so it was more complicated than this -- see updates below.)  Deregulation reopened the market to craft brewers and the industry blossomed through organic growth and the preferences of consumers (more on this in the Update below).  (Conclusion: Emerging small scale, distributed production can compete against an installed large scale infrastructure base.)

(Update 18 Aug 2010) There seems to be some upset out in blogland about the idea that Carter deregulated craft brewing.  See the first comment to this post.  I don't think it changes my story about biological manufacturing at all, but for the sake of clarity, here is this: President Carter signed the Cranston Act, which took effect on February 1, 1979 and allowed a single adult household to brew up to 100 gallons of beer per year.  A household with two adults could brew up to 200 gallons per year.  For more, see here, or this nice 2009 article from Reason Magazine by Greg Beato, "Draft Dodgers: For DIY brewers, Prohibition lasted until 1978. But once unleashed, they revolutionized the industry."  From Beato's article: "After Prohibition ended, the Federal Alcohol Administration Act of 1935 laid out a new set of liquor laws. Home winemaking for family use was granted a tax exemption; home brewing was not. If you were making any amount of beer, you had to obtain a permit and comply with a long list of regulations."  Prior to the Cranston Act, brewing beer at home, or in small volumes anywhere, was hard to do because of federal regulations.  After the Cranston Act, people could concoct all kinds of interesting liquids at home.  So it sounds to me like Carter deregulated craft brewing.

(re-Update 19 August, 2010: Tom Hilton, at If I Ran the Zoo, makes some nice points here.  Namely, he observes that there were additional changes at the state level that legalized brewpubs.  Note that not all craft brewers are brewpubs, and this distinction appears to be glossed over in much of the criticism of this post.  Anyway, it is pretty clear that reality was more complicated than the summary I gave above.  No surprise there, though, as the heading of the section contains the word "oversimplified"...)

Better yet as a reference is a peer-reviewed article by Victor Tremblay and colleagues entitled "The Dynamics of Industry Concentration for U.S. Micro and Macro Brewers." (Link. Review of Industrial Organization (2005) 26:307-324)  Here is their description of what happened in 1979 (the original text contains an obvious typo that I have corrected in brackets):

Changes in government policy also benefited micro brewers. First, the legalization of home brewing in February of [1979] stimulated entry, since most early micro brewers began as home brewers. Second, states began lifting prohibitions against brewpubs in the early 1980s. Brewpubs were legal in only six states in 1984; Mississippi was the last state to legalize brewpubs in 1999. Third, the government granted a tax break to smaller brewers in February 1977. According to the new law, brewers with annual sales of less than 2 million barrels paid a federal excise tax rate of $7.00 per barrel on the first 60,000 barrels sold and $9.00 per barrel on additional sales. Brewers with more than 2 million barrels in sales paid an excise tax rate of $9.00 on every barrel sold. In 1991, the tax rate rose to $18 per barrel, but brewers with annual sales of less than 2 million barrels continued to pay only $7.00 per barrel on the first 60,000 barrels sold annually. This benefited the specialty sector, as all micro breweries and brewpubs have annual sales of less than 60,000 barrels and all of the larger specialty brewers have annual sales of less than 2 million barrels.

So a combination of changes to federal regulations and federal excise taxes enabled small players to enter a market they had previously been barred from.  That home brewing had been almost non-existent prior to 1979 points to another interesting feature of the market, namely that the skill base for brewing was quite limited.  Thus another effect of legalizing home brewing was that people could practice and build up their skills; they could try out new recipes and explore new business models.  And then, wham, in just a few years many thousands of people were participating in a market that had previously been dominated by large corporate players.
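The tiered excise schedule is easier to see written out as a function.  Here is a minimal sketch of the post-1991 rates described by Tremblay et al. (my own rendering; I am assuming small brewers pay the general $18 rate on barrels beyond the first 60,000):

```python
def federal_excise_tax(barrels_sold):
    """Annual federal beer excise tax in dollars, post-1991 rates."""
    if barrels_sold < 2_000_000:  # small-brewer schedule
        first = min(barrels_sold, 60_000) * 7.00
        rest = max(barrels_sold - 60_000, 0) * 18.00  # assumed general rate
        return first + rest
    return barrels_sold * 18.00   # large brewers pay $18 on every barrel

# A 10,000-barrel brewpub pays $7/barrel; a macro brewer pays $18/barrel.
print(federal_excise_tax(10_000) / 10_000)        # 7.0
print(federal_excise_tax(5_000_000) / 5_000_000)  # 18.0
```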

(end Update)

The definition of a "craft" brewer varies a bit across the various interested organizations.  According to the Brewers Association, "An American Craft Brewer is small, independent, and traditional."  Small means less than 2 million barrels a year (at 31 US gallons, or about 26 Imperial gallons, per barrel); independent means less than 25% owned by a non-craft brewer; traditional means either an all malt flagship beer or 50% of total volume in malt beer.  There is a profusion of other requirements to qualify as a craft brewer, some of which depend on jurisdiction, and which are important for such practical concerns as calculating excise tax.  Wikipedia puts the barrier for a craft brewer at less than 15,000 barrels a year.  According to the Brewers Association, as of the middle of 2009 there were about 1500 craft brewers in the US, about 20 large brewers, and about 20 "others", with brewpubs accounting for about 2/3 of the craft brewers.
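Those criteria are compact enough to write down as a predicate.  A sketch in Python (my own paraphrase of the Brewers Association definition, not an official implementation):

```python
def is_craft_brewer(annual_barrels, noncraft_ownership_fraction,
                    all_malt_flagship, malt_volume_fraction):
    small = annual_barrels < 2_000_000
    independent = noncraft_ownership_fraction < 0.25
    traditional = all_malt_flagship or malt_volume_fraction >= 0.5
    return small and independent and traditional

# A 20,000-barrel, founder-owned, all-malt brewery qualifies:
print(is_craft_brewer(20_000, 0.0, True, 1.0))  # True
```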

Show Me the Hops.  Or Wheat.  Or Honey (if you must).

Brewpubs and microbreweries are so common that the majority of Americans live within 10 miles of a craft brewer, and it is a good bet that there is one quite close to where you live.  The Beer Mapping Project can help you verify this fact.  Please conduct your field research on foot.

Beer generates retail revenues of about $100 billion in the US (brewery revenues are probably less than half that), and supports about 1.9 million direct and indirect jobs.  But craft brewers account for only a small fraction of the total volume of beer brewed in the US.  According to the Beer Institute's "Craft Brewers Conference Statistical Update - April 2007" (PPT), three brewers now supply 50% of the world's market and 80% of the US market.  See Figure 2, below. The Brewers Association clarifies that only 5% of the volume of beer brewed in the US is from craft brewers, who manage to pull down a disproportionate 9% of revenues. (Conclusion: Small scale producers can command a premium in a commodity marketplace.)

[Image: US_market_share_Biodesic.png]
Figure 2.  US beer market share.
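The size of that premium is worth making explicit.  From the volume and revenue shares above (my own arithmetic, nothing more):

```python
craft_volume_share = 0.05
craft_revenue_share = 0.09

# Revenue per barrel relative to the industry-wide average:
craft_vs_average = craft_revenue_share / craft_volume_share                 # 1.8
noncraft_vs_average = (1 - craft_revenue_share) / (1 - craft_volume_share)  # ~0.96

print(craft_vs_average / noncraft_vs_average)  # craft earns ~1.9x per barrel
```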

Here is an interesting question to which I do not have an answer: how much beer brewed by large producers is actually bottled and distributed locally?  "Lots of beer", where I don't have any real idea of what "lots" means, is produced via contract brewing.  It may be that "large scale production" is therefore not as centralized as it looks, but is rather the result of branding.  This makes some sense if you think about the cost of transportation.  As beer (regardless of its source) is mostly water, you are paying to ship something around that is usually plentiful at the destination.  It makes a lot of sense to manufacture locally.  But, as I say, I have yet to sort out the numbers.

Brewing as an Example of Distributed Biological Manufacturing

All of the above makes brewing an interesting test case for thinking about distributed biological production.  Craft brewers buy feedstocks like everybody else, pay for bottles and probably for bottling services, and ship their product just like everybody else.  They may be much smaller on average than Anheuser-Busch, but they survive and by definition make enough money to keep their owners and employees happy.  And they keep their customers happy.  And their thirsts quenched.

Above, I identified two important conclusions about the craft brewing market relevant to this story: 1) Craft brewing emerged in the US amidst an already established large scale, industrial infrastructure for producing and distributing beer.  2) Small scale, distributed production can command a premium at the cash register.

As we look forward to future growth in the bioeconomy, more industrial production will be replaced by biofactories, or perhaps "industrial biorefineries", whatever those are supposed to be.  Recall that the genetically modified domestic product (GMDP) now contributes about 2% of total US GDP, with the largest share for industrial products.

This story becomes particularly relevant for companies like Blue Marble, which is already producing high value, drop-in replacements for petrochemicals using biological systems.  (Full disclosure: Blue Marble and Biodesic are collaborating on several projects.)  As feedstocks, Blue Marble uses local waste agricultural products, macro- and micro-algae, sewage, and -- wait for it -- spent grains from the microbrewery next door.  (How's that for closing the loop?)  Products include various solvents, flavorings, and scents.

The craft brewing story tells us that consumers are quite willing to pay a premium for locally produced, high quality products, even before they learn -- in the case of Blue Marble -- that the product is organic and petroleum-free.  It also tells us that small scale production can emerge even amidst an existing large industry. 

Can Blue Marble and other companies compete against enormous, established chemical and petroleum companies?  In my experience, the guys (and they are nearly universally guys) at the top of the oil industry don't even get this question.  "It is all about steel in the ground", they say.  In other words, they are competing based on the massive scale of their capital investments and the massive scale of their operations and they don't think anybody can touch them.

But here is the thing -- Blue Marble and similar companies are going to be producing at whatever scale makes sense.  Buildings, neighborhoods, cities, whatever.  Any technology that is based on cow digestion doesn't have to be any bigger than a cow.  Need more production?  Add more cows.  This costs rather less than adding another supertanker or another refinery.  Blue Marble just doesn't require massive infrastructure, in large part because they don't require petroleum as a feedstock and are not dependent on high temperatures for processing.  Most of the time, Blue Marble can do their processing in plastic jugs sitting on the floor, and stainless steel only comes into the picture for food-grade production lines.  This means capital costs are much, much lower.  This is a point of departure for biomanufacturing when compared to brewing.

(Update: Perusing old posts, I discovered I did a decent job last year of putting this scale argument in the context of both computers and the oil industry, here.)

Beer is close to a commodity product, and it is the small scale producers who get a better price, even though their costs will be roughly the same as large scale producers.  Blue Marble generally has substantially higher margins than petrochemical producers -- and by focusing on the high margin portion of the petroleum barrel they are going to be stealing the cream away from much larger companies -- but Blue Marble's costs are much lower.  What is the financial situation of a large petrochemical company going to look like when they lose the market for esters, which can have margins of many hundreds of dollars per liter, and are left with margins on products closer to gas and diesel at dollars per liter?  This is a different sort of play than you would see in brewing.

Now, I am not guaranteeing that distributed biological production will win in all cases.  Large beer brewers clearly still dominate their market.   It may be that biological manufacturing will look like the current beer industry; a few large players producing large volumes, and a large number of small players producing much less but at higher margins.  But craft brewing is nonetheless an existence proof that small scale, distributed production can emerge and thrive even amidst established large scale competition.  And biological manufacturing is sufficiently different from anything else we have experience with that the present market size of craft brewing may not be that relevant to other products. 

Shell and Recent Biofuels Moves

According to the Financial Times, Shell recently entered a $12 billion deal with Cosan, the Brazilian sugar and ethanol producer.  Included in the deal are Shell's stakes in Iogen and Codexis, which together have a bunch of potent biological technologies useful for turning sugar and cellulose into biofuels.  This represents a shift in strategy towards the biological production of fuels and away from industrial chemistry.  Last fall Shell sold off its stake in Choren, which had an advanced biomass-to-liquids program based on gasification of just about anything.  I met a group of executives from Choren at a meeting in Alberta about 18 months ago, and they seemed on top of the world with the partnership from Shell supporting their feedstock-agnostic process.

It is interesting that Shell decided to change directions like this.  In the last couple of years I've heard many chemical engineers (including some from Shell) suggest that many of the problems plaguing process development in gasification and catalytic fuel synthesis were getting solved.  The story we told at Bio-era, and that I developed further in the book, is that industrial chemistry would be one of many routes to biofuels, but that it might compete poorly in the long run because it requires such careful tuning.  So Shell's exit might have been predicted at some point, but it came much sooner than I thought.  It appears biological technologies may be a better bet even at this early stage.

25% of US Grain Crop Used for Biofuel

The Guardian UK reported today that 2009 USDA figures show 25% of grains grown in the US were used to produce liquid biofuels.  The typical food vs fuel story follows.  And it is mostly on point, if tinged by The Guardian's usual populist tone.  Yes, all the grain could in principle be used to feed people.  No, it isn't clear that grain-based ethanol is in fact better than burning petroleum when it comes to total greenhouse gas emissions or energy content.

The story ends with a nod toward "continued innovation in ethanol product" that supposedly is increasing yields and reducing costs.  Huh.  No mention, though, of the fact that any starch crop used to make fuel starts at a major disadvantage with respect to sugar crops, nor that there is an ethanol glut in the US due to construction of too many ethanol production plants.  Neither does the story get into why ethanol isn't a very good fuel to begin with (wrong solvent properties, low energy content, water soluble).

I go into detail about this in my forthcoming book, but the upshot of the argument is that the US is investing quite a lot of money in ethanol production technology and infrastructure that will never be competitive with sugar derived fuels.  And then relatively soon we will get butanol, longer chain alcohols, and true drop-in petroleum replacements made using modified organisms.  In the meantime, I suppose we will just have to suffer through the impact of decisions made more for political reasons than for competitive or national security reasons.  But grain to ethanol isn't really good for anybody except US Senators from farm states.