The History and Future of High Fertilizer Prices

Alarming record-high fertilizer prices in December 2021 don’t seem quite so bad when compared apples-to-apples to 2012 highs, or to the really grim times of the 1850s Guano Age.

“Peruvian guano has become so desirable an article to the agricultural interest of the United States that it is the duty of the Government to employ all the means properly in its power for the purpose of causing that article to be imported into the country at a reasonable price. Nothing will be omitted on my part toward accomplishing this desirable end.”
— President Millard Fillmore in his Dec. 2, 1850, State of the Union Address, promising to bring down the market price of bird poop.


In fact, President Fillmore devoted 100 words of his 1850 State of the Union address to the topic of bird droppings, over 1% of the entire speech, alongside talk of international canals and railroads that would advance global commerce. The topic of manure — any type of manure — was a very big deal in those days, and these days, too. Farmers today can sympathize with that high-stakes attention to fertilizer, and they’re devoting at least as much scrutiny to record-high prices now.

Back in 1850, there was no such thing as synthetic nitrogen fertilizer, and any farmer who wanted better than 20 bushels per acre of corn had to use whatever natural fertilizer he could find — farm animal manure, sure, but also kitchen waste, sawdust, dead animals, sludge from wetlands; you name it. In the “Guano Age,” a powerful global industry had sprung up to mine giant mountains of desirable bird poop off the coasts of Peruvian islands (and elsewhere), yet there still wasn’t enough supply to meet demand. Seabird guano reportedly hit a high of $76 per pound^ in 1850, or a quarter of the price of gold. (https://www.nationalgeographic.com/…)

That’s so expensive, it’s beyond comparing to today’s common agricultural fertilizers. Assume you can get 70 pounds of nitrogen per ton of seabird guano (3.5% nitrogen), similar to some poultry litter available today, and the math shows that anyone actually buying guano at $76 per pound was paying over $2,000 per pound of nitrogen. There’s no way this could have been a widespread market price paid by the average American farmer, but nevertheless, today’s anhydrous ammonia at $1,314 per ton (or $0.80 per pound of nitrogen) doesn’t seem so bad!

To be sure, the prices for nitrogen-based fertilizers are as high as they’ve ever been. The DTN Fertilizer Index, a weekly survey of over 300 retailer prices, showed dry urea averaging $873 per ton in the first week of December 2021. With 920 pounds of nitrogen provided to a crop from each ton of urea, that’s $0.95 per pound of nitrogen. With liquid UAN-32 fertilizer priced at $661 per ton and 640 pounds of nitrogen available from each ton, that’s $1.03 per pound of nitrogen — now the highest price for nitrogen since the Guano Age.
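
Since the same dollars-per-pound-of-nitrogen arithmetic recurs throughout these comparisons, here is a minimal sketch of it, using the prices and nitrogen percentages quoted above (anhydrous ammonia is 82% nitrogen by weight):

```python
# Convert a fertilizer's price per 2,000-lb ton into dollars per pound of
# nitrogen, given the product's nitrogen content.
def dollars_per_lb_n(price_per_ton, pct_nitrogen):
    lbs_n_per_ton = 2000 * pct_nitrogen / 100  # e.g. urea at 46% N -> 920 lb
    return price_per_ton / lbs_n_per_ton

print(round(dollars_per_lb_n(873, 46), 2))   # dry urea -> 0.95
print(round(dollars_per_lb_n(661, 32), 2))   # UAN-32 -> 1.03
print(round(dollars_per_lb_n(1314, 82), 2))  # anhydrous ammonia -> 0.8
```

The same function covers any nitrogen source, which is what makes the guano and manure comparisons elsewhere in this column possible at all.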

In year-over-year terms, dry urea prices are 143% higher than in early December 2020, but looking at a longer timescale, today’s urea prices are only 13% above their peak from the 2012 planting season. You can watch fertilizer prices rise week by week, like anhydrous, which has been rising by an average of 6% per week since October, in the DTN Retail Fertilizer Trends series by DTN Staff Reporter Russ Quinn: https://www.dtnpf.com/…

Nevertheless, it’s no wonder that many grain producers are looking for alternatives to high-priced synthetic fertilizers. Although I don’t have access to a nice, long-term data series of manure prices from cattle feed yards, for instance, let’s assume the value of cattle manure today is about $7 per ton. If it makes 40 pounds of nitrogen available per ton of manure, that’s equivalent to paying $0.18 per pound of nitrogen (and getting all the other minerals and benefits of organic matter for free). This would be by far the cheapest source of nitrogen, if it’s available to you and logistically feasible to apply. But just like the guano of the 1850s, there’s only so much to go around.

Interestingly, phosphate-based fertilizers remain cheaper now than they were during the commodity boom of 2008 (an N-P-K mix of 10-34-0 in November of 2008 was priced at $1,250 per ton but is now only $756). Altogether then, fertilizer prices shouldn’t seem quite as panic-inducing as a one-year anhydrous chart (up 208% year over year) might make them appear at first glance. It’s not some new, overwhelming, unmeetable demand for the stuff that’s causing the rally — rather, it’s the global shortage of natural gas, the feedstock for nitrogen production, that is the driving force behind rising fertilizer prices.

Ever since German chemist Fritz Haber and metallurgist Carl Bosch figured out the famed Haber-Bosch process (1909) to artificially fix atmospheric nitrogen into the usable nitrogen of ammonia under high temperatures and pressures, the world’s supply of nitrogen has more or less depended on the world’s supply of natural gas. In commercial fertilizer production, natural gas is used not only as the source for raw hydrogen in the chemical reaction, but also to provide the necessary heat. Today, U.S. natural gas prices have come down from their September peak, but in Europe there is still massive uncertainty about receiving natural gas supplies from Russia amid current geopolitical tensions, and some European fertilizer plants remain shut down until natural gas prices moderate. For at least the next several months, the global outlook for fertilizer availability will remain as tense as the Russia-Ukraine border.
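
Incidentally, anhydrous ammonia’s roughly 82% nitrogen content (implicit in the $0.80-per-pound figure earlier) falls straight out of the Haber-Bosch chemistry: ammonia is NH3, and the nitrogen atom accounts for most of the molecule’s mass. A quick check:

```python
# Mass fraction of nitrogen in ammonia (NH3), using standard atomic weights.
N_WEIGHT, H_WEIGHT = 14.007, 1.008
nh3_weight = N_WEIGHT + 3 * H_WEIGHT          # 17.031 g/mol
print(round(100 * N_WEIGHT / nh3_weight, 1))  # -> 82.2 (percent nitrogen)
```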

Just as a new technology (synthetic fertilizer production through the Haber-Bosch process) ultimately relieved the shortage of guano in the 20th century and made a golden age of agriculture and human prosperity possible, is there perhaps a new technology on the horizon that will alter our dependence on natural gas to feed the world? Researchers at Australia’s Monash University have recently announced a breakthrough method to produce “green” ammonia* with a different catalyst than the Haber-Bosch process, at room temperature, at efficiency rates that may someday be commercially viable. (https://lens.monash.edu/…)

In the meantime, until there’s abundant commercially produced green ammonia, if you’re unable to source the usual synthetic fertilizers necessary for spring planting at your local retailers, perhaps you’d like to consider something more old school, like this Mexican bat guano you can buy on the internet at $7.50 for a 1.25-lb bucket from a natural gardening outlet. (https://www.planetnatural.com/…)

That’s probably equivalent to about $60 per pound of nitrogen (assuming roughly 10% nitrogen content), so maybe it’s better to stick with anhydrous. Anyway, even the bat guano is out of stock these days!


(c) Copyright 2021 DTN, LLC. All rights reserved. Originally published at: High Fertilizer Prices: The History and Future (dtnpf.com)

^Burnett, Christina Duffy. “The Edges of Empire and the Limits of Sovereignty: American Guano Islands.” American Quarterly 57, no. 3 (2005): 779–803. http://www.jstor.org/stable/40068316.

*Suryanto, B., Matuszek, K., Choi, J., Hodgetts, R., et al. “Nitrogen reduction to ammonia at high efficiency and rates based on a phosphonium proton shuttle.” Science 372, no. 6547 (11 June 2021): 1187–1191. https://doi.org/10.1126/science.abg2371

Plague Year Price Patterns

Human behavior appears to be as predictably unpredictable in the 21st century as it was in the 17th century. There are astonishing parallels between our own concerns, obsessions and actions during the time of COVID-19 and the concerns, obsessions and actions of Londoners during the last outbreak of bubonic plague in 1665. Presumably, if there were any first-hand written accounts of the Plague of Justinian in 541 AD or even Pharaoh’s plagues in 1800 BC, they would demonstrate the same human concerns — how many are dying, how to treat the sick and dying, how to keep the living fed and how to keep the sickness from surging back more destructively than ever.

For evidence of what behavior patterns the commodity markets might expect during a pandemic, I’ve turned to “A Journal of the Plague Year,” written by Daniel Defoe (better known for writing “Robinson Crusoe” and “Moll Flanders”) about London’s Great Plague of 1665, when 20% of the population died. My jaw dropped as I turned each page and found descriptions — in Defoe’s wry chatty voice — of market behavior nearly exactly matching our lives and decisions today.

First of all, let me note that Defoe was the son of a butcher before becoming a trader and an economic journalist, so it’s no wonder I appreciated his “journal” with its tables of numbers and attention to the logistics of getting commodities into the hands of consumers. Here’s a small list of the ways his 1665 narrative seems to match our own:

— When the plague started spreading in London, the first thing the rich did was to pack up and leave for their second homes in more remote locations.

— Those who didn’t leave amassed great hoards of food and supplies, if they were economically able, and shut themselves up in their homes.

— “At the first breaking out of the infection there was, as it is easy to suppose, a very great fright among the people, and consequently a general stop of trade, except in provisions and necessaries of life, and even in those things, as there was a vast number of people fled and a very great number always sick, besides the number which died, so there could not be above two-thirds, if above one-half, of the consumption of provisions in the city as it used to be.”

— When people did venture out to buy goods, the shopkeepers made sure not to touch any coins unless they were first dropped into a pot of vinegar.

— All public entertainment was banned.

— Sick people were quarantined in their homes, along with every other resident of the household, for 40 days at a time.

— “The common people, who, ignorant and stupid in their reflections as they were brutishly wicked and thoughtless before, were now led by their fright to extremes of folly: running after quacks and every practicing old woman for medicines and remedies, storing themselves with such multitudes of pills, potions, and preservatives, as they were called, that they not only spent their money, but even poisoned themselves beforehand for fear of the poison of the infection and prepared their bodies for the plague instead of preserving them against it.”

— There was knowledge of asymptomatic carriers: “One man, who may have really received the infection, and knows it not, but goes abroad and about as a sound person, may give the plague to a thousand people, and neither the person giving the infection nor the persons receiving it know anything of it.”

— The poor and the working-class population were acknowledged by Defoe to be the most likely to suffer, because they had fewer resources to begin with, and most of all because they still had to go to work in dangerous occupations (like hauling out dead bodies). As Defoe said, they “went about their employment with a sort of brutal courage.”

— Trading for agricultural goods (grain, butter, and cheese) “carried on all the while of the infection, and that with little or no interruption.”

The temptation, of course, is to skip ahead to the end of the book and ask, “How did it all end?” If market behavior matches up between the beginnings and middles of the two pandemics, shouldn’t we prepare for our own pandemic story to match the end of Defoe’s? Here again I admire the old trader’s attention to prices. He reports that there were no shortages of food: “Provisions were always to be had in full plenty, and the price not much raised neither, hardly worth speaking.”

Specifically, the value of a “wheaten loaf” of bread went from the equivalent of $0.82 (converted into today’s U.S. dollars) before the plague, to $0.91 “in the height of the contagion … and never dearer, no, not all that season,” then back down again within eight months as the plague waned. That may sound cheap to us, when a 16-ounce loaf of bread today costs more than $3.00, but bear in mind that the annual wages for a skilled tradesman in the 1660s were only $3,270 (converted into today’s U.S. dollars).
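
One way to make that then-versus-now comparison concrete is to ask what share of a year’s wage a daily loaf of bread would consume. Here is a rough sketch using the 1660s figures above; the $45,000 modern annual wage is my own illustrative assumption, not a figure from the column:

```python
# Fraction of a year's wage needed to buy one loaf of bread per day.
def loaf_share_of_wage(loaf_price, annual_wage):
    return 365 * loaf_price / annual_wage

then_share = loaf_share_of_wage(0.91, 3270)   # plague-peak loaf, 1660s wage
now_share = loaf_share_of_wage(3.00, 45000)   # assumed modern wage (illustrative)
print(round(then_share, 3), round(now_share, 3))  # -> 0.102 0.024
```

By this rough measure, bread claimed several times more of a 1660s tradesman’s income than it claims of a modern worker’s, which is why the plague-era price stability Defoe reports was so remarkable.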

The great value of Defoe’s “Journal of the Plague Year” is its reminder to us that there is nothing new under the sun. It also reminds us that sometimes, patterns do occur in markets. Robert Shiller, winner of the 2013 Nobel Prize in economics for his work describing the “speculative bubble” pattern that occurs in asset prices fueled by herd instinct, has now turned his attention to the opposite pattern perhaps occurring in markets today. Rather than irrationally exuberant buying that creates overheated prices, perhaps we’re seeing irrationally fearful selling driving unfairly depressed prices. He writes that “a contagion of financial anxiety works differently than a contagion of disease. It is fueled in part by people noticing others’ lack of confidence, reflected in price declines and others’ emotional reaction to the declines. A negative bubble in the stock market occurs when people see prices falling, and, trying to discover why, start amplifying stories that explain the decline. Then, prices fall on subsequent days, and again and again.”

Time will tell whether the current bearishness in the fuel markets, for instance (which include ethanol and corn), or the bearishness in feed markets (including soybean meal), is an irrational overreaction that will correct itself back upward as fear subsides or whether it’s truly an efficient response to observations of suddenly lower demand.

Unfortunately, my own understanding of germs (which, admittedly, comes from nothing more than high school science classes and years of vaccinating calves) is enough to know that our pandemic likely won’t peter out as easily as the Great Plague of 1665 did. Their plague was caused by a flea-hosted bacterium that presumably died over the cold winter when the fleas on the rats died. Our pandemic today is caused by a virus — a trickier, undead non-lifeform with no known cure.

Yet again Defoe has been there before us, noting a resurgence of death counts in the two weeks after a premature rumor circulated that the disease wasn’t really so bad. He lamented “the people’s running so rashly into danger, giving up all their former cautions and care, and all the shyness, which they used to practice, depending that the sickness would not reach them, or that if it did, they should not die.”

Elaine Kub is the author of “Mastering the Grain Markets: How Profits Are Really Made” and can be reached at masteringthegrainmarkets@gmail.com or on Twitter @elainekub.

© Copyright 2020 DTN, LLC. All rights reserved.

The Multiple Problems With Multi-Year Cycles

Perhaps, sitting in a bar one evening, a friend told you that corn yields tend to be great during years that end in “6.” Or perhaps you’ve heard of the 18-year cycle in the stock markets? Or the 60-year cycle in wheat prices? Or the 14 3/4-year cycle in soybean prices, which only holds true if the previous year’s price ended with an even number?

Okay, I made that last one up. But that’s alright — other people baselessly fabricated all those other examples, too, and they all have the same statistical significance (zero).

I hadn’t heard of the 60-year cycle in wheat prices until a gentleman told me about it after a recent market presentation. He has many more years of experience in the wheat market than I do, and I’m always willing to learn new things, so I promised I would look into it. More on that later.

All these multi-year cycles are interesting bits of folklore, and they’re kind of neat to think about. If thinking about them and analyzing the underlying economic reasoning behind them helps market participants better understand the world around us, then that’s great. But if blindly believing them motivates farmers to make or postpone marketing decisions based on unsound science, then that’s bad. That’s why I’m going to try to bust the myth of the multi-year cycle as clearly as I can.

In this universe, many phenomena tend to occur frequently near their averages and less frequently at unusual values, measurements or strengths. This is often shown with the bell-curve chart of the normal distribution. But even if a phenomenon isn’t “normally” distributed, the average of a large enough number of independent samples will still settle into a predictable bell shape around the true mean. That’s the Central Limit Theorem, roughly speaking. It is powerful because it allows us to calculate whether a particular observation is truly unusual. Is someone who’s 6 feet 7 inches tall a genuine outlier, or just part of the randomness of the universe? All heights vary somewhat from person to person.

Therefore, among an entire universe of values, taking just one sample — or just a few samples — is extremely unhelpful when it comes to predicting future values. The Chicago Board of Trade was established in 1848 to exchange cash grain, but a standardized record of corn, wheat and soybean futures prices only exists since 1959. That means there are only six samples of an annual corn price from “a year ending in six,” and six samples is way too few to be confident that whatever trend our human brain might think it sees is anything more than just random statistical noise.
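
To see how easily six samples can fake a pattern, one can simulate pure noise and check how often the “years ending in 6” beat the overall average anyway. Here is a sketch using synthetic yields, not real data:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible
trials, lucky = 10_000, 0
for _ in range(trials):
    # 60 years of pure-noise "yields": mean 100, standard deviation 15
    yields = [random.gauss(100, 15) for _ in range(60)]
    sixes = yields[6::10]  # the six "years ending in 6"
    if sum(sixes) / len(sixes) > sum(yields) / len(yields):
        lucky += 1

# Roughly half the time, the six-sample subset "outperforms" by luck alone.
print(lucky / trials)
```

In other words, a coin flip would do just as well at predicting whether any six chosen years look like winners.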

I was slightly more willing to believe in a statistically provable multi-year pattern in wheat, however, because I remember seeing an amazing chart of wheat prices from 1750 through 1960, collected by Hugh Ulrich. I updated that data through 2018, and that made 268 years of information — which is a lot! But it’s still only four samples of any 60-year period, so once again, it’s difficult to prove there is anything significant beyond randomness in any purported 60-year cycle in wheat prices.

Furthermore, even the 268 years of data was problematic. Some of it was from England in the 18th century, quoted in English pence per bushel. Some of it was CBOT futures quoted in U.S. cents per bushel. More importantly, the structural economic reality of wheat itself has drastically changed between 1750 and today. The number of man-hours that go into a bushel of wheat, the proportion of a farm family’s income that comes from a single bushel, the proportion of an urban consumer’s budget that goes into a single bushel — none of this is apples-to-apples from one economic timeframe to the next. This is called time-period bias in statistical sampling. Even comparing U.S. stock prices from the inflation-plagued 1970s to stock prices from the easy-money 2010s is problematic.

Let’s actually try to test a multi-year cycle. Say we look at the so-called “decennial pattern” in the stock market, which colloquially claims that years which end in “0” tend to have poor performance, and years which end in “5” have “by far the best” performance. We can gather 90 years of stock market returns since 1928. Maybe 90 years sounds like a lot, but it’s only nine sets of 10, or nine samples from years that end in “5.”

Let’s say the average of all 90 annual returns is 11.4%, but we calculate the average of annual returns from years ending in “5” at 14.6%. Woohoo! Sounds like those years ending in “5” really are winners — notably including 1995’s 37.6% return. However, if we conduct a two-tailed test for statistical significance using Student’s t-distribution, which mathematically weighs the standard deviation of all those returns and the small number of samples against what could occur by mere happenstance, the difference between the all-year average and the years-ending-in-5 average turns out to be indistinguishable from statistical noise.
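
For readers who want to try the test themselves, here is a sketch using synthetic returns (random numbers with roughly stock-like mean and volatility, since the actual return series isn’t reproduced here):

```python
import random
import statistics

random.seed(42)  # synthetic, illustrative returns -- not the real S&P series
years = list(range(1928, 2018))
returns = [random.gauss(11.4, 19.0) for _ in years]  # mean 11.4%, sd 19%

fives = [r for y, r in zip(years, returns) if y % 10 == 5]
others = [r for y, r in zip(years, returns) if y % 10 != 5]

# Welch's two-sample t-statistic for the difference in means
m1, m2 = statistics.mean(fives), statistics.mean(others)
v1, v2 = statistics.variance(fives), statistics.variance(others)
t = (m1 - m2) / (v1 / len(fives) + v2 / len(others)) ** 0.5

# With only 9 samples in `fives`, the two-tailed 5% critical value is
# roughly 2.3, so any |t| below that is indistinguishable from noise.
print(len(fives), round(t, 2))
```

Swapping in actual annual returns for the `returns` list makes this a real test of the decennial pattern; with only nine years ending in “5,” the bar for significance stays very high either way.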

However, if we were magically able to use 300 years of stock market data, and therefore had 30 samples to draw from (30 is widely considered to be the minimum statistically useful number of samples), the critical value for this statistical test would shrink, and we might then say there’s maybe 80% confidence that the difference between the two sets of returns is actually significant (and a 20% chance it’s not). There still wouldn’t be any fundamental explanation for why the final digit of a calendar year should affect equity performance.

Anyway, look and see that stock returns in 2015 were only 1.38% — the worst annual performance since 2008. Anyone who actually invested money based on this hokey idea of a decade-long market pattern would have been sorely disappointed.

To all the believers in multi-year patterns or cycles: please continue to tell me about them! I love hearing about these fables, and I collect them like other people collect pretty seashells. But please don’t sell your grain (or not sell your grain) based on someone else’s flimsy idea that has only ever been sampled four times in history.

Elaine Kub is the author of “Mastering the Grain Markets: How Profits Are Really Made” and can be reached at elaine@masteringthegrainmarkets.com or on Twitter @elainekub.

© Copyright 2018 DTN/The Progressive Farmer. All rights reserved.

Grain and the Birth of Civilization

Originally published at https://www.dtnpf.com/agriculture/web/ag/news/article/2017/07/19/grain-birth-civilization

Grain markets played a huge role in human history. The very formation of early nation states and all the “great” early civilizations — the Sumerians, the Egyptians, the Han Dynasty — were only possible once humans started growing grain, and the history lessons all take it for granted that this is a good thing. Farming is a noble and desirable profession, and everyone should want to do it.

But history lessons would say that, wouldn’t they? By definition, if we are still reading about a civilization today, it’s only because that civilization happened to have some class of elites sitting around with enough time and education to write things down. There are no surviving epics from the “barbarians” who passed their time hunting and gathering and moving from place to place in small bands, as their desires moved them, although this might have been a much more pleasant, healthy way to live compared to life in the shadow of the Ziggurat of Ur.

The writing elites were supported by the excess grain taken away (taxed) from the broader population of working, drudging peasant farmers, so of course the elites thought that grain was pretty great. Of course that’s how written history remembers this development in human civilization. One Sumerian text reads, “Whoever has silver, whoever has jewels, whoever has cattle, whoever has sheep shall take a seat at the gate of whomever has grain, and pass his time there.”

But a forthcoming book called “Against the Grain: A Deep History of the Earliest States” from James C. Scott, political scientist, anthropologist, and director of the Agrarian Studies Program at Yale University, turns all these assumptions upside down.

It’s true that there was no such thing as an organized nation state without farming, and this was a newly powerful system of human organization, one that really only developed within the past 400 years across most of the globe. However, Scott points out the ways that humans themselves were “domesticated” (basically enslaved) to make this at all possible for the relatively few elites among a population. He leaves a reader wistful for a time when roving bands of humans had a better leisure-to-drudgery ratio (about 50/50) compared to the constant hard work that our farming ancestors have convinced us is so virtuous.

Think about it: even today with the significant aid of technology, how many hours have farmers invested in this season of growing crops, and how much of that effort is ultimately going to be sent to the government as taxes? Don’t get me wrong — I personally enjoy living in a society with well-maintained roads and law and order, but there’s always a certain temptation to just live off the fish you can catch in the river and the wild plums you can harvest from the roadsides. How much stronger must that temptation have been when the choice was even starker: sitting around a campfire with your hunter friends, or collapsing at the end of a day spent hoeing weeds in Pharaoh’s wheat fields?

As Scott puts it, “Why anyone not impelled by hunger, danger, or coercion would willingly give up hunting and foraging or pastoralism for full-time [fixed-field] agriculture is hard to fathom.”

Grain states were fragile things, usually disintegrating within two or three reigns, due to sudden new epidemic diseases (human diseases or grain diseases, all only possible now that large populations were concentrated in one urban area), or drought or flood or pestilence, or climate variation, or the exhaustion of the soil and other nearby resources, or overly rapacious taxation and the subsequent fleeing of the peasant population. All the “Great Walls” of history, including the first one built in 2000 BC by the Sumerian King Sulgi between the Tigris and Euphrates rivers, were built as much to keep a tax-paying population IN, as to keep barbarians OUT.

But during the time when any nation state thrived, it was only possible because of grain. Sure, there could be a sedentary grain farming population without any nation state, but there was never any such thing as a nation state without grain farming.

Population centers might develop, and wealth might be accumulated within a town, but in order to politically enclose a real empire, supported by other people’s labor, a nation state needed to be based on an easily taxable product. Wheat (or barley or teff or any other cereal grain) can be dried and stored; it can be transported; it is visible, divisible, rationable, and assessable. In other words, its annual production can be easily seen and accurately counted by the tax collectors, unlike almost any other agricultural crop. Even wool was difficult to tax because the shepherds kept moving around and shearing their sheep in different locations. Once a grain crop is planted, it stays in one place and the peasantry must stay there, too.

That is how the farming tradition of growing ever-more grain got started. More yield per field, more fields per farmer, more of anything there can be more of. As Scott writes, “In the absence of either compulsion or the chance of capitalist accumulation, there was no incentive to produce beyond the locally prevailing standards of subsistence and comfort … Beyond sufficiency, there was no reason to increase the drudgery of agricultural production.”

Only 240 human generations have passed since the first adoption of agriculture (can you believe that?), and Scott calculates that perhaps no more than 160 generations have elapsed since grain farming became a widespread practice. It only took that many people to pass down and reinforce every deeply held attitude and belief we currently hold about farming.

It is fascinating to read the details of that progression, to think about how these beliefs originated, and to consider how closely we should still treasure them today.

*Against the Grain: A Deep History of the Earliest States by James C. Scott. Yale University Press, 336 pp., $26.00, August 22, 2017, ISBN 978-0300182910

© Copyright 2017 DTN/The Progressive Farmer. All rights reserved.

Jethro Tull Spinning in His Grave

Originally published at https://www.dtnpf.com/agriculture/web/ag/news/article/2017/03/29/jethro-tull-spinning-grave

In 1701, Jethro Tull invented the horse-drawn seed drill, used for evenly planting straight furrows of small grains in England, and later in Europe, America and the rest of the world. And here I was, thinking all along that “Jethro Tull” was just a progressive rock flautist!

There have long been some basic, human-operated devices for planting seed in furrows, used in India for many hundreds of years and in Europe as early as 1647. But it wasn’t until Jethro Tull put some horsepower and some wheels into play that the Agricultural Revolution was really sparked. His controversial invention was more eagerly adopted in the New World than it was among his own countrymen, who were stubbornly accustomed to sowing their wheat seed by hand. But, eventually, the mechanized drill allowed the world’s farmers to support a population explosion throughout the industrialized world.

Anyway, I think if Jethro Tull (not the band, but the real Jethro Tull, the 18th-century educated gentleman farmer from Berkshire) could see how unloved his world-altering invention is in 2017 on the Plains of America, he would spin in his grave. It’s expected that Friday’s Prospective Plantings report from USDA will show 2017’s total wheat acreage down 3% from last year. Several industry participants weighed in on this shift in an article from DTN’s Basis Analyst Mary Kennedy, “Spring Wheat, Durum Losing Ground”: http://bit.ly/…

A 3% dip in acreage from one year to the next, on its own, doesn’t sound like much. In total acreage terms, wheat is expected to lose 1.6 million acres compared to last year’s crop. Variations in weather and yield trends could easily counteract that acreage loss by the time total production is counted. Even if we look strictly at the winter wheat crop, where planted acreage for the 2017 crop has already dropped 10% (to 32.4 ma) and harvested acreage could fall significantly more if dry fields in the Southern Plains are switched to row crops this spring, a 10% drop still isn’t huge in the big scheme of things.

I looked at all the year-to-year acreage changes of the 10 biggest U.S. crops since the beginning of time (or at least since 1909, when nationwide acreage records started to be kept). There have been some whoppers. The most significant increase was soybean acreage gaining 63% from 1933 to 1934, when 5 ma were planted to soybeans. The biggest percentage drop came from sugarbeets in 1943, falling 41% to 619,000 acres, but perhaps oats dropping 39% to 12.4 ma in 1984 was a more significant change.

There are other examples from recent times that show how farmers have frequently reacted to market conditions more drastically than what we’re seeing in 2017 wheat acreage. Grain sorghum (milo) acreage surged 1.8 ma to 8 ma in 2013, a 29% year-over-year acreage gain in response to corn’s dismal performance in the previous drought year. Rice acres were up 20% in 2016, barley acres were up 43% in 2012, and cotton acres were up 34% in 2011 after gaining 20% the year before.

To see the real context of wheat’s acreage losses, however, we need to consider a longer timespan than just looking one year to the next. This will be the fourth straight year of losses in winter wheat acreage, which is down 31% since 2008 (when it was 46.7 million) and down 43% since 1990 (56.7 million).
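
The long-run declines cited here are simple percentage changes against the earlier acreage bases; a quick check using the figures above (in million acres):

```python
# Percent change from an old value to a new value.
def pct_change(new, old):
    return 100.0 * (new - old) / old

print(round(pct_change(32.4, 46.7), 1))  # winter wheat vs. 2008 -> -30.6
print(round(pct_change(32.4, 56.7), 1))  # winter wheat vs. 1990 -> -42.9
```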

Futures traders will see USDA’s surveyed responses from farmers about their spring wheat planting intentions in Friday’s annual Prospective Plantings report — and that crop, too, is expected to see an acreage drop that may not look too huge from one year to the next, but when seen in its larger context, is actually astounding. The average pre-report guess pegs spring wheat acreage at 11.3 ma, down 3% from last year. But there was a 22% acreage loss in the decade between 2006 and 2016 following about a 30% acreage loss in the decade before that. Since the crop’s peak popularity in 1996, when 20 ma were planted, it’s looking like 2017’s spring wheat acreage will be 43% lower than that.

To understand why an increasing number of wheat drills are sitting unused in the Northern Plains of the United States and Canada, compare Jethro Tull’s invention (pulled by tractors today, sure, but still mechanically similar) against the successfully widespread adoption of the corn planter. Together with herbicide-resistant seeds, over the past 20 years, the corn planter has allowed farmers to use no-till techniques and grow corn and soybeans in regions previously thought too dry for row-crop production. Today you’ll find a lot of prairie farmers who prefer to plant corn and soybeans instead of wheat, just for the logistics of the situation (to say nothing of the price).

The corn planter was a late arrival on the scene, compared to the wheat drill, and for a long time in many areas of the U.S., corn seed was drilled. As early as 1834, there was a patent for a wheelbarrow-like corn-planting device, then other devices in the 1850s, ultimately evolving into the “corn jabbers” that were carried by humans and used to plant individual hills. Those were also known as “bill picks,” a name that’s easy to understand because the gadget works somewhat like a bird’s bill. But I’ve always wondered why a seed drill was called a “drill.” It turns out to be just one of those quirks of the English language. Jethro Tull himself wrote that his invention “was named a drill because when farmers used to sow their beans and peas into channels or furrows by hand, they called that action drilling.”

It took him decades of losing money and tinkering with his design before the small grain seed drill really caught on. And then apparently, it took 300 years for it to fall out of favor. If Friday’s all-wheat acreage number is anywhere below 48.7 ma (which it is expected to be), then that will be 45% below the wheat acreage peak of 88.2 ma in 1981 … and this summer, America will have the fewest planted wheat acres since USDA’s acreage records began.

© Copyright 2017 DTN/The Progressive Farmer. All rights reserved.