US Market Value of GM Crops is Approximately $70 Billion

I have a short letter in the November 2009 issue of Nature Biotechnology (subscription req.) correcting the record on US revenues from genetically modified crops.  Based on USDA data for corn, soy, and cotton, revenues from the GM versions of those crops were about US$ 65 billion in 2008, rather than the widely misreported ~$4 billion.  The latter figure is in fact just GM seed revenue.  I would put the total from all GM crops and seeds at $75-85 billion, though it isn't yet clear where GM sugar beets are going to come in.  Assuming US revenues are representative of global averages, then total worldwide revenues are probably north of $150 billion for crops and seeds together.
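For anyone who wants to check the arithmetic, the estimate is simply acreage times GM adoption share times yield times farm-gate price, summed over the three crops.  Here is a minimal sketch of that calculation, using round placeholder numbers of roughly the right magnitude rather than the actual USDA figures behind the letter:

```python
# Back-of-envelope structure of the GM crop revenue estimate.
# All inputs are rough placeholders, NOT the USDA data used in the letter.

crops = {
    # crop: (harvested acres, GM share of acres, yield per acre, price per unit)
    "corn":   (79e6, 0.80, 154, 4.00),    # bushels/acre, $/bushel (placeholders)
    "soy":    (74e6, 0.92, 40,  10.00),   # bushels/acre, $/bushel (placeholders)
    "cotton": (7.8e6, 0.86, 810, 0.50),   # pounds/acre, $/pound (placeholders)
}

total = 0.0
for crop, (acres, gm_share, yield_per_acre, price) in crops.items():
    revenue = acres * gm_share * yield_per_acre * price
    total += revenue
    print(f"{crop:>6}: ~${revenue / 1e9:.1f} billion from GM acreage")

print(f" total: ~${total / 1e9:.0f} billion (order-of-magnitude check)")
```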

Below is a figure showing US yearly revenues from the three big crops, as well as the US annual total.  Note that although the GM fraction of each crop continues to grow (see the ISAAA report from 2008), prices fluctuate sufficiently from year to year that total revenues declined from 2007 to 2008.  Food and crop prices have come off their 2007 highs -- a decline that cannot last given increasing demand around the world.  I would expect revenues to resume their climb in 2010.

[Figure: US annual revenues from GM corn, soy, and cotton, plus the US total (Carlson_US_rev_GM_crops_Nov_09.png)]

WWF Endorses Industrial Biotech for Climate Solutions

A fortnight ago the World Wildlife Fund released a report pushing industrial biotech as a way to increase efficiency and reduce carbon emissions.  Interesting.  Of course, industrial biotech doesn't necessarily require direct genetic modification, but the WWF must know that is an inevitable consequence of heading down this road.  More on this after I get a chance to read the report.

The Economist Debate on the Fuel of the Future for Cars

Last week The Economist ran an online debate considering the motion "Biofuels, not electricity, will power the car of the future".  I was privileged to be invited as a guest contributor along with Tim Searchinger of Princeton University.  The two primary "speakers" were Alan Shaw of Codexis and Sidney Goodman of Automotive Alliances.  Here is my contribution to the debate, in which I basically rejected the false dichotomy of the motion (the first two 'graphs follow):

The future of transportation power sources will not be restricted to "either/or". Rather, over the coming decades, the nature of transportation fuel will be characterised by a growing diversity. The power sources for the cars of the future will be determined by the needs those cars address.

Those needs will be set for the market by a wide range of factors. Political and economic pressures are likely to require reducing greenhouse gas emissions and overall energy use per trip. Individuals behind the wheel will seek to minimise costs. But there is no single fuel that simultaneously satisfies the requirements of carbon neutrality, rapid refuelling, high-energy density for medium- to long-range driving and low cost.

I find it interesting that the voting came down so heavily in favor of electricity as the "fuel" of the future.  I suppose the feasibility of widespread electric cars depends on what you mean by "future".  Two substantial technology shifts will have to occur before electric cars displace those running on liquid fuels, both of which will require decades and trillions of dollars.

First, for the next several decades, no country, including the US, is likely to have sufficient electricity generating resources and power distribution infrastructure to convert large numbers of automobiles to electric power.  We need to install all kinds of new transmission lines around the country to pull this off.  And if we want the electricity to be carbon neutral, we need to install vast amounts of wind and solar generating capacity.  I know Stewart Brand is now arguing for nuclear power as "clean energy", but that still doesn't make sense to me for basic economic reasons. (Aside: at a party a few months ago, I got Lowell Wood to admit that nuclear power can't be economically viable unless the original funders go bankrupt and you can buy the physical plant on the cheap after all the initial investment has been wiped out.  Sweet business model.)

Second, the energy density of batteries is far below that of liquid hydrocarbons.  (See the Ragone chart included in my contribution to The Economist debate.)  Batteries are likely to close the gap over the coming years, but long distance driving will be the domain of liquid fuels for many years to come.  Yes, battery changing stations are an interesting option (as demonstrated by Better Place), but it will take vast investment to build a network of such stations sufficient to replace (or even compete with) liquid fuels.  Plugging in to the existing grid will require many hours to charge the batteries, if only because running sufficient current through most existing wires (and the cars themselves) to recharge car batteries rapidly would melt those wires.  Yes, yes -- nanothis and nanothat promise to enable rapid recharging of batteries.  Someday.  'Til then, don't bother me with science fiction.  And even if those batteries do show up in the proverbial "3 to 5 year" time frame, charging them rapidly would still melt most household power systems.
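To put a rough number on the wire-melting problem, here is a quick back-of-envelope sketch; the pack size, charge time, and service voltage below are my own illustrative assumptions, not figures from the debate:

```python
# Rough back-of-envelope: why rapid recharging strains household wiring.
# All inputs are illustrative assumptions.

battery_capacity_kwh = 50.0      # assumed EV pack size
target_charge_minutes = 10.0     # "gas-station-like" refueling time
service_voltage = 240.0          # typical North American residential service

power_kw = battery_capacity_kwh / (target_charge_minutes / 60.0)
current_amps = power_kw * 1000.0 / service_voltage

print(f"Required power:   {power_kw:.0f} kW")
print(f"Required current: {current_amps:.0f} A at {service_voltage:.0f} V")
# A typical home service panel carries only 100-200 A, so a 10-minute charge
# would demand roughly an order of magnitude more current than the wiring allows.

# Energy density comparison (approximate, order-of-magnitude only):
gasoline_kwh_per_kg = 12.9       # ~46 MJ/kg
liion_kwh_per_kg = 0.15          # rough circa-2009 lithium-ion pack figure
print(f"Gasoline vs Li-ion energy density: ~{gasoline_kwh_per_kg / liion_kwh_per_kg:.0f}x")
```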

In the long run, I expect that electric cars will replace those powered by liquid fuels.  But in the meantime, liquid fuels will continue to dominate our economy.

The Origin of Moore's Law and What it May (Not) Teach Us About Biological Technologies

While writing a proposal for a new project, I've had occasion to dig back into Moore's Law and its origins.  I wonder, now, whether I peeled back enough of the layers of the phenomenon in my book.  We so often hear about how more powerful computers are changing everything.  Usually the progress demonstrated by the semiconductor industry (and now, more generally, IT) is described as the result of some sort of technological determinism instead of as the result of a bunch of choices -- by people -- that produce the world we live in.  This is on my mind as I continue to ponder the recent failure of Codon Devices as a commercial enterprise.  In any event, here are a few notes and resources that I found compelling as I went back to reexamine Moore's Law.

What is Moore's Law?

First up is a 2003 article from Ars Technica that does a very nice job of explaining the whys and wherefores: "Understanding Moore's Law".  The crispest statement within the original 1965 paper is "The number of transistors per chip that yields the minimum cost per transistor has increased at a rate of roughly a factor of two per year."  At its very origins, Moore's Law emerged from a statement about cost, and economics, rather than strictly about technology.

I like this summary from the Ars Technica piece quite a lot:

Ultimately, the number of transistors per chip that makes up the low point of any year's curve is a combination of a few major factors (in order of decreasing impact):

  1. The maximum number of transistors per square inch (or, alternately put, the size of the smallest transistor that our equipment can etch);
  2. The size of the wafer;
  3. The average number of defects per square inch;
  4. The costs associated with producing multiple components (i.e., packaging costs, the costs of integrating multiple components onto a PCB, etc.).

In other words, it's complicated.  Notably, the article does not touch on any market-associated factors, such as demand and the financing of new fabs.
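To see how those factors interact, here is a toy cost-per-transistor model of my own (not from the Ars Technica article): wafer cost is fixed, yield falls with die area via a simple Poisson defect model, and packaging adds a per-die cost.  The minimum of the resulting curve is the "sweet spot" that Moore's 1965 observation tracks.  Every parameter value below is invented purely for illustration:

```python
import math

# Toy model of cost per transistor vs. level of integration.
# All parameters are invented for illustration only.

wafer_cost = 5000.0          # $ per processed wafer
wafer_area = 70000.0         # mm^2 of usable wafer area
defect_density = 0.001       # defects per mm^2
area_per_transistor = 1e-4   # mm^2 per transistor at a given process node
package_cost = 1.0           # $ per packaged die

def cost_per_transistor(transistors_per_die):
    die_area = transistors_per_die * area_per_transistor
    dies_per_wafer = wafer_area / die_area
    yield_fraction = math.exp(-defect_density * die_area)  # simple Poisson yield model
    good_dies = dies_per_wafer * yield_fraction
    total_cost_per_die = wafer_cost / good_dies + package_cost
    return total_cost_per_die / transistors_per_die

# Sweep integration levels: the curve falls as packaging is amortized over more
# transistors, then rises as yield losses dominate, giving a minimum in between.
for n in [1e4, 1e5, 1e6, 1e7, 1e8]:
    print(f"{n:>10.0e} transistors/die -> ${cost_per_transistor(n):.2e} per transistor")
```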

The Wikipedia entry on Moore's Law has some good information, but isn't very nuanced.

Next, here is an excerpt from an interview Moore did with Charlie Rose in 2005:

Charlie Rose:     ...It is said, and tell me if it's right, that this was part of the assumptions built into the way Intel made its projections. And therefore, because Intel did that, everybody else in the Silicon Valley, everybody else in the business did the same thing. So it achieved a power that was pervasive.

Gordon Moore:   That's true. It happened fairly gradually. It was generally recognized that these things were growing exponentially like that. Even the Semiconductor Industry Association put out a roadmap for the technology for the industry that took into account these exponential growths to see what research had to be done to make sure we could stay on that curve. So it's kind of become a self-fulfilling prophecy.

Semiconductor technology has the peculiar characteristic that the next generation always makes things higher performance and cheaper - both. So if you're a generation behind the leading edge technology, you have both a cost disadvantage and a performance disadvantage. So it's a very non-competitive situation. So the companies all recognize they have to stay on this curve or get a little ahead of it.

Keeping up with 'the Law' is as much about the business model of the semiconductor industry as about anything else.  Growth for the sake of growth is an axiom of western capitalism, but for chipmakers it is a fundamental requirement.  Because the cost per transistor is expected to fall exponentially over time, you have to produce exponentially more transistors to maintain your margins and satisfy your investors (a simple sketch of this arithmetic follows the excerpt below).  Therefore, Intel set growth as a primary goal early on.  Everyone else had to follow, or be left by the wayside.  The following is from the recent Briefing in The Economist on the semiconductor industry:

...Even the biggest chipmakers must keep expanding. Intel today accounts for 82% of global microprocessor revenue and has annual revenues of $37.6 billion because it understood this long ago. In the early 1980s, when Intel was a $700m company--pretty big for the time--Andy Grove, once Intel's boss, notorious for his paranoia, was not satisfied. "He would run around and tell everybody that we have to get to $1 billion," recalls Andy Bryant, the firm's chief administrative officer. "He knew that you had to have a certain size to stay in business."

Grow, grow, grow

Intel still appears to stick to this mantra, and is using the crisis to outgrow its competitors. In February Paul Otellini, its chief executive, said it would speed up plans to move many of its fabs to a new, 32-nanometre process at a cost of $7 billion over the next two years. This, he said, would preserve about 7,000 high-wage jobs in America. The investment (as well as Nehalem, Intel's new superfast chip for servers, which was released on March 30th) will also make life even harder for AMD, Intel's biggest remaining rival in the market for PC-type processors.

AMD got out of the atoms business earlier this year by selling its fab operations to a sovereign wealth fund run by Abu Dhabi.  We shall see how they fare as a bits-only design firm, having sacrificed the ability to push (and rely on) scale themselves.
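Returning to the growth treadmill mentioned above: the arithmetic is simple, and the sketch below just makes it explicit.  The starting price, volume, and halving cadence are arbitrary assumptions:

```python
# If price per transistor halves each generation, flat unit volume means
# halving revenue; to keep revenue flat you must double the transistors shipped.
# Starting values and the per-generation halving are arbitrary assumptions.

price_per_transistor = 1e-6     # $ (arbitrary starting point)
units_flat = 1e12               # transistors shipped per year, held constant
units_doubling = 1e12           # transistors shipped per year, doubling each generation

for generation in range(5):
    rev_flat = price_per_transistor * units_flat
    rev_doubling = price_per_transistor * units_doubling
    print(f"gen {generation}: flat volume -> ${rev_flat / 1e6:.2f}M, "
          f"doubling volume -> ${rev_doubling / 1e6:.2f}M")
    price_per_transistor /= 2   # cost (and price) per transistor halves
    units_doubling *= 2         # shipping exponentially more transistors
```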

Where is Moore's Law Taking Us?

Here are a few other tidbits I found interesting:

Re the oft-forecast end of Moore's Law, here is Michael Kanellos at CNET grinning through his prose: "In a bit of magazine performance art, Red Herring ran a cover story on the death of Moore's Law in February--and subsequently went out of business."

And here is somebody's term paper (no disrespect there -- it is actually quite good, and is archived at Microsoft Research) quoting an interview with Carver Mead:

Carver Mead (now Gordon and Betty Moore Professor of Engineering and Applied Science at Caltech) states that Moore's Law "is really about people's belief system, it's not a law of physics, it's about human belief, and when people believe in something, they'll put energy behind it to make it come to pass." Mead offers a retrospective, yet philosophical explanation of how Moore's Law has been reinforced within the semiconductor community through "living it":

After it's [Moore's Law] happened long enough, people begin to talk about it in retrospect, and in retrospect it's really a curve that goes through some points and so it looks like a physical law and people talk about it that way. But actually if you're living it, which I am, then it doesn't feel like a physical law. It's really a thing about human activity, it's about vision, it's about what you're allowed to believe. Because people are really limited by their beliefs, they limit themselves by what they allow themselves to believe what is possible. So here's an example where Gordon [Moore], when he made this observation early on, he really gave us permission to believe that it would keep going. And so some of us went off and did some calculations about it and said, 'Yes, it can keep going'. And that then gave other people permission to believe it could keep going. And [after believing it] for the last two or three generations, 'maybe I can believe it for a couple more, even though I can't see how to get there'. . . The wonderful thing about [Moore's Law] is that it is not a static law, it forces everyone to live in a dynamic, evolving world.

So the actual pace of Moore's Law is about expectations, human behavior, and, not least, economics, but has relatively little to do with the cutting edge of technology or with technological limits.  Moore's Law as encapsulated by The Economist is about the scale necessary to stay alive in the semiconductor manufacturing business.  To bring this back to biological technologies, what does Moore's Law teach us about playing with DNA and proteins?  Peeling back the veneer of technological determinism enables us (forces us?) to examine how we got where we are today. 

A Few Meandering Thoughts About Biology

Intel makes chips because customers buy chips.  According to The Economist, a new chip fab now costs north of $6 billion.  Similarly, companies make stuff out of, and using, biology because people buy that stuff.  But nothing in biology, and certainly not a manufacturing plant, costs $6 billion.

Even a blockbuster drug, which could bring revenues in the range of $50-100 billion during its commercial lifetime, costs less than $1 billion to develop.  Scale wins in drug manufacturing because drugs require lots of testing, and require verifiable quality control during manufacturing, which costs serious money.

Scale wins in farming because you need...a farm.  Okay, that one is pretty obvious.  Commodities have low margins, and unless you can hitch your wagon to "eat local" or "organic" labels, you need scale (volume) to compete and survive.

But otherwise, it isn't obvious that there are substantial barriers to participating in the bio-economy.  Recalling that this is a hypothesis rather than an assertion, I'll venture back into biofuels to make more progress here.

Scale wins in the oil business because petroleum costs serious money to extract from the ground, because the costs of transporting that oil are reduced by playing a surface-to-volume game, and because thermodynamics dictates that big refineries are more efficient refineries.  It's all about "steel in the ground", as the oil executives say -- and in the deserts of the Middle East, and in the Straits of Malacca, etc.  But here is something interesting to ponder: oil production may have maxed out at about 90 million barrels a day (see this 2007 article in the FT, "Total chief warns on oil output").  There may be lots of oil in the ground around the world, but our ability to move it to market may be limited.  Last year's report from Bio-era, "The Big Squeeze", observed that since about 2006, the petroleum market has in fact relied on biofuels to supply volumes above the ~90 million per day mark.  This leads to an important consequence for distributed biofuel production that only recently penetrated my thick skull.
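A quick aside before getting to that consequence: the "surface-to-volume game" is just cube-square scaling.  Steel, insulation, and heat loss scale roughly with surface area, which grows as V^(2/3), while throughput grows with V, so cost per barrel falls roughly as V^(-1/3).  A minimal illustration, with idealized geometry and no real cost data:

```python
# Cube-square scaling: why bigger tanks and refineries are cheaper per barrel.
# Idealized vessel geometry; no real cost data, illustration only.

def relative_cost_per_unit_volume(volume):
    surface = volume ** (2.0 / 3.0)   # steel, insulation, heat loss scale with area
    return surface / volume           # i.e. scales as volume ** (-1/3)

small, large = 1.0, 1000.0            # arbitrary volume units, a 1000x scale-up
ratio = relative_cost_per_unit_volume(small) / relative_cost_per_unit_volume(large)
print(f"A {large:.0f}x larger vessel has ~{ratio:.0f}x lower surface cost per unit volume")
```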

Below the 90 million barrel threshold, oil prices fall because supply will generally exceed demand (modulo games played by OPEC, Hugo Chavez, and speculators).  In that environment, biofuels have to compete against the scale of the petroleum markets, and margins on biofuels get squeezed as the price of oil falls.  However, above the 90 million per day threshold, prices start to rise rapidly (perhaps contributing to the recent spike, in addition to the actions of speculators).  In that environment, biofuels are competing not with petroleum, but with other biofuels.  What I mean is that large-scale biofuels operations may have an advantage when oil prices are low because large-scale producers -- particularly those making first-generation biofuels, like corn-based ethanol, that require lots of energy input -- can eke out a bit more margin through surface to volume issues and thermodynamics.  But as prices rise, both the energy to make those fuels and the energy to move those fuels to market get more expensive.  When the price of oil is high, smaller scale producers -- particularly those with lower capital requirements, as might come with direct production of fuels in microbes -- gain an advantage because they can be more flexible and have lower transportation costs (being closer to the consumer).  In this price-volume regime, petroleum production is maxed out and small scale biofuels producers are competing against other biofuels producers since they are the only source of additional supply (for materials, as well as fuels).
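Here is a toy version of the two regimes described above -- my own cartoon of the argument, with every number invented: below the petroleum capacity ceiling the market price is set by oil and biofuel margins get squeezed; above the ceiling the marginal barrel is a biofuel, so the price is set by the most expensive biofuel producer needed to clear demand.

```python
# Cartoon of the two price regimes described above.  All numbers invented.

OIL_CAPACITY = 90.0          # million barrels/day ceiling on petroleum supply
OIL_PRICE = 60.0             # $/barrel when petroleum supplies the marginal barrel
# Biofuel producers sorted by cost: (capacity in Mb/d equivalent, cost in $/barrel)
BIOFUEL_SUPPLY_CURVE = [(1.0, 70.0), (1.0, 90.0), (1.0, 120.0)]

def market_price(demand_mbd):
    if demand_mbd <= OIL_CAPACITY:
        return OIL_PRICE                     # oil is the marginal barrel
    needed = demand_mbd - OIL_CAPACITY
    for capacity, cost in BIOFUEL_SUPPLY_CURVE:
        needed -= capacity
        if needed <= 0:
            return cost                      # marginal biofuel producer sets the price
    return BIOFUEL_SUPPLY_CURVE[-1][1]       # demand beyond all listed supply

for demand in [88, 90, 91, 92, 93]:
    print(f"demand {demand} Mb/d -> price ${market_price(demand):.0f}/barrel")
```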

This is getting a bit far from Moore's Law -- the section heading does contain the phrase "meandering thoughts" -- so I'll try to bring it back.  Whatever the origin of the trends, biological technologies appear to be the same sort of exponential driver for the economy as are semiconductors.  Chips, software, DNA sequencing and synthesis: all are infrastructure that contribute to increases in productivity and capability further along the value chain in the economy.  The cost of production for chips (especially the capital required for a fab) is rising.  The cost of production for biology is falling (even if that progress is uneven, as I observed in the post about Codon Devices).  It is generally becoming harder to participate in the chip business, and it is generally becoming easier to participate in the biology business.  Paraphrasing Carver Mead, Moore's Law became an organizing principle of an industry, and a driver of our economy, through human behavior rather than through technological predestination.  Biology, too, will only become a truly powerful and influential technology through human choices to develop and deploy that technology.  But access to both design tools and working systems will be much more distributed in biology than in hardware.  It is another matter whether we can learn to use synthetic biological systems to improve the human condition to the extent we have by relying on Moore's Law.

"The New Biofactories"

(Update: McKinsey seems to have pulled the whole issue from the web, which is too bad because there was a lot of good stuff in it.  The text of my contribution can be found below.)

I have a short essay in a special edition of the McKinsey Quarterly, What Matters.  My piece is waaaay back at the end of the printed volume, and all the preceding articles are well worth a look.  Other essayists include Steven Chu, Hal Varian, Nicholas Stern, Kim Stanley Robinson, Yochai Benkler, Vinod Khosla, Arianna Huffington, Joseph Nye, and many more.  Good company.

Here is the essay: "The New Biofactories" (PDF), Robert Carlson, What Matters, McKinsey & Company, 2009.

Carl Zimmer on Synthetic Biology for Biofuels

Carl Zimmer has a nice piece in Yale Environment360 on continued efforts to build bugs that produce fuel, "The High-Tech Search For A Cleaner Biofuel Alternative".  The article extensively quotes Steve Aldrich, President of Bio-era, on the trade-offs of using sugar cane as a source material.

Craig Venter makes an appearance arguing that the best long-term bet is to build photosynthetic bugs that use atmospheric CO2 to directly produce fuel.  Maybe.  This would require containment facilities for culturing engineered bugs, facilities that must also capture sunlight and CO2 to feed those bugs.  The costs for this infrastructure are not insignificant, and this is exactly what presently stands in the way of large-scale algal biodiesel production.

Here is the question I keep asking in these circles: why not just grow naturally occurring algae, which can be cultivated at extremely high yield in a wide variety of conditions, as food for bugs hacked to eat cellulose?  If there is no algae to be had, just throw in another source of cellulose or other biomass.  There would be minimal concern over growing modified organisms that might escape into the wild.  The processing of biomass into fuel would also take place under conditions that are easier to optimize and control.

I'm not suggesting this is the only answer, but rather that it appears to balance 1) the costs of infrastructure and 2) concerns over environmental release of genetically modified organisms, while 3) providing an efficient processing infrastructure that could use a wide variety of feedstocks.

Update on Plans for GM Crop Research in Britain

Last year I pointed to the continuing debate in Europe and the U.K. over animal feed as an illustration of the complexities of arguments about GM food.  The diminishing availability of GM-free feed grain could lead to significant shortages, which in turn could drastically reduce the amount of meat in European markets.  (See "Re-Inventing The Food Chain (or 'On Food Prices, In Vitro Meat, and GM Livestock Feed')".)

Now the Independent reports that the U.K. is considering protecting GM crop research from domestic protest and attack.  The government may go so far as to bring that research onto defense installations in order to protect it better, as suggested by Andrew Grice in a story provocatively titled "Government to defy critics with secret GM crop trials".

Here is one 'graph from the article:

Professor Tim Benton, research dean at [the Leeds University] Faculty of Biological Science, said yesterday: "We need to find a way to do crop trials in a safe way and to minimise the environmental risk. We cannot carry on for the next 20 or 30 years saying it's too scary, the public is too frightened, it is politically too dangerous. There is absolutely no way we can move towards a world with food security without using GM technology. The amount of food we need could double because the population is growing, climate change will reduce yields and we will take land out of food production for biofuels."

Amyris Opens Biodiesel Pilot Plant

Amyris appears to be making good progress towards meeting their goal of getting biofuels to market by 2010.  They just opened their first pilot plant in California, with the aim of importing fuel into the US from Brazil within two years.  The output of the pilot plant will be used to gain EPA certification.  The announcement pretty well tracks with my previous posts about biofuels.

Here are a few 'graphs from the press release:

Amyris' diesel is characterized as a No Compromise™ fuel because it is designed to be a scalable, low‐cost renewable fuel with performance attributes that equal or exceed those of petroleum‐sourced fuels and currently available biofuels. Other attributes include:
  • Superior environmental performance: Preliminary analyses show that Amyris diesel fuel has virtually no sulfur and significantly reduced NOx, particulate, carbon monoxide and hydrocarbon exhaust emissions relative to petroleum‐sourced diesel fuel.
  • High blending rates: Because Amyris renewable diesel contains many of the properties of petroleum diesel, Amyris can blend the fuel at high levels -- up to 50 percent -- compared with 10-20 percent for conventional biodiesel and ethanol.
  • Compatibility with existing infrastructure: Unlike many commercially available biofuels, Amyris expects to distribute its renewable diesel through the existing fuel distribution and storage infrastructure, thus speeding time to market while minimizing costs.
  • Adaptive: Amyris can produce its fuels from a broad range of feedstock including sugarcane and cellulosic biomass. It is starting with Brazilian sugar cane because it provides the most environmentally sound, economical, and scalable source of energy available today.

"This new diesel fuel has all the characteristics to make an important contribution toward solving our global transportation energy and climate crisis," said John Melo, chief executive officer of Amyris. "The opening of our pilot plant is a significant business marker for us, taking us one step closer to bringing our diesel fuel to market."

Craig Rubens at earth2tech provides interesting coverage, and his story notes:

Melo described the company's business model as "a capital-light model to scale up fast." The company plans to partner with existing ethanol plants and convert a portion of those partners' production capacity to make diesel and other chemicals using Amyris IP. The startup will then buy the products back from the refiner and take them to market, Melo said. The startup has already formed a joint venture with Santelisa Vale, Brazil's second largest sugar grower, called Crystalsev, which aims to produce 200 million gallons of fuel a year by 2011 at several of its existing ethanol plants at a price of less than $2 a gallon.

The Brazilian partnership, Melo explained, gives Amyris access to ports and ships to export the fuel. Amyris plans to import it to the U.S. and sell it to large customers, like Wal-Mart and the U.S. government. Foreign ethanol is hit with a 54-cent-per-gallon tariff as it comes into the U.S., but Amyris would be importing hydrocarbons, not ethanol, and therefore avoid the tariff. Amyris is already marketing other companies' biofuels in the Southeast to make sure its distribution channels will work.

To date, Amyris' strategy hasn't seemed particularly "capital light." The company has raised more than $120 million in capital (see previous coverage here and here) from heavy-hitting cleantech and biotech investors, including Kleiner Perkins, Khosla Ventures, TPG Biotech and DAG Ventures.

I understand the present need for scale, both physical and financial, so earth2tech's skepticism seems a bit naive.  Amyris is facing enormous competition, both from established petroleum companies and from other start-ups.  As I would not expect any of these companies to have a firm lock on IP surrounding biological production of fuels, Amyris must establish itself and its brand quickly and rely on first-mover advantage. (I wonder how thoroughly they are scrubbing the waste stream?  Dumpster diving for competitive intelligence takes on a new meaning here.)  Shell is dropping seven billion on upgrading a single refinery in Texas.  Amyris seems pretty light in comparison.

Writing at Cleantech, Emma Ritch provides an excellent tidbit: "The company has shelved its plans for a bio-gasoline".  "We're focused on the products with the highest value," Melo said. "We're not investing our resources in developing a bio-gasoline because we see the U.S. as the last gasoline-based economy."  That is particularly fascinating, as Melo is the former President of BP Fuels.  It is also a change since I heard Zach Serber speak at SB 4.0 last month in Hong Kong.  The fluctuating price of oil may be important here.

Unfortunately, Ritch mischaracterises the competitive landscape a bit: "Amyris plans to use the cheapest nonfood feedstock available, which for now means sugarcane... The company could also use algae for its biodiesel--much like Solazyme, LiveFuels, GreenFuel Technologies and many others."  In contrast to Amyris, the latter three companies are directly producing fuel in algae, with Solazyme feeding sugar to bugs in the dark and completely skipping photosynthesis. (Hmm...I wonder what sort of selection pressure that is putting on their algae strains?)  If Amyris does use algae -- sorry, when Amyris starts using algae -- the company will almost certainly be using it as a feedstock fed to microbes that then produce fuels.  This would require building a front-end process onto their yeast production system, but I don't see that as taking very long to happen.  See my earlier post on Blue Marble Energy.

Things are moving forward.  I would note that I see a lot of stainless steel in the photos of Amyris' pilot plant.  I am no fermentation jock, but it seems that they could probably use solvent-resistant plastic as their culture vessels.  Here is one home-brew kit that basically just consists of plastic buckets.  Maybe that is a step for later.

Congratulations to everyone at Amyris.  Keep up the good work.

Synthetic Biology 4.0 – Not so live blog, part 1

What a difference a few years makes.  SB 1.0 was mostly a bunch of professors and grad students in a relatively small, stuffy lecture hall at MIT.  SB 2.0 in Berkeley expanded a bit to include a few lawyers, sociologists, and venture capitalists.  (I skipped 3.0 in Zurich.)

At just over 600 attendees, SB 4.0 is more than twice as big as even 3.0, with just under half the roster from Asia.  The venue, at the Hong Kong University of Science and Technology, is absurdly nice, with a view over the ocean that beats even UCSB and UCSD.  Kudos also to the organizers here.  They worked very hard to make sure the meeting came off well, and it is clear they are interested in synthetic biology, and biotech in general, as a long term proposition.  The Finance Minister of Hong Kong, John Tsang, spoke one evening, and he was very clear that HK is planning to put quite a lot of money and effort into biology.

Which brings me to a general observation: Hong Kong really cares about the future, and is investing to bring it along that much sooner.  I arrived a day early in order to acclimate a bit and wander around the city, as my previous visit was somewhat hectic.  Even amid the financial crisis, the city feels more optimistic and energetic than most American cities I visit.

I will have to write up the rest of the meeting when I get back to the States later this week.  But here are a few thoughts:

As of the last few days, I have now seen all the pieces necessary to build a desktop gene printer.  I don’t have a prediction for when such a thing will arrive on the market, but there is no doubt in my mind that it is technically feasible.  With appropriate resources, I think it would take about 8 weeks to build a prototype.  It is that close.

Ralph Baric continues to do work on SARS that completely scares the shit out of me.  And I am really glad it is getting done, and also that he is the one doing it.  His work clearly demonstrates how real the threat from natural pathogens is, and how poorly prepared we are to deal with it.

Jian Xu, who is better known for his efforts to understand the human gut microbiome, spoke on the soup-to-nuts plant engineering and biofuels effort at the Qingdao Institute of Bioenergy and Bioprocess Technology (QIBEBT), run by the Chinese Academy of Sciences.  The Chinese are serious about putting GM plants into the field and deriving massive amounts of energy from biomass.

Daphne Preuss from Chromatin gave a great talk about artificial chromosomes in plants and how they speed up genetic modification.  I’ll have to understand this a bit better before I write about it.

Zach Serber from Amyris spoke about their biofuels efforts, and Amyris is on schedule to get aviation fuel, diesel, and biogasoline into the market within the next couple of years.  All three fuels have characteristics equivalent to or better than petro-fuels when it comes to vapor pressure, cloud point, cetane number, octane, energy density, etc.

More soon.