Regenerative Agriculture and the Denial of Comparative Advantage: Part 2
Writing in 1770, the French economist and statesman Anne-Robert-Jacques Turgot observed that there were two ways to deal with a price spike that followed a bad harvest. The first was to transport “grain from provinces where the harvest is good to those where it is bad.” The other was to “store it up in abundant years for use in famine years.” Turgot wrote that the “two methods entail costs, and free trade will always choose that which, all told, entails the least cost.” He added that “barring special circumstances,” transportation was usually preferable since “the return of the funds is speedier” and the “waste product is less considerable, the grain being consumed the sooner.” Very often, though, governments placed “obstacles in the way of transportation” and tipped the balance in favor of storage.
In the two and a half centuries since Turgot wrote these lines, debates about food security have followed essentially the same pattern. On the one hand are supporters of the so-called trade-based approach, who promise a more abundant, affordable and reliable food supply through reliance on multiple foreign suppliers with comparative advantages in agricultural production. On the other are supporters of greater autarky and regional self-sufficiency. While the former had the upper hand at the end of the twentieth century, the latter have found new audiences in the wake of the 2007-2008 food crisis and the COVID-19 pandemic.
One key problem for supporters of increased local self-reliance, however, is that because of natural calamities ranging from droughts and floods to pests and diseases, no traditional agricultural system could consistently produce enough food to eliminate malnutrition and recurring famine and starvation.
Even in good years, the key challenge of traditional agricultural systems was to make it through the “lean season” between crops, meaning the period of greatest scarcity before the first availability of new crops. For instance, in England the late spring, and especially the month of May, was historically referred to as the “starving time” or the “hungry gap.” This is why granaries were invented over ten thousand years ago. Going back at least to Pharaonic Egypt and Han China, some of these were built and controlled by the state, either for provisioning bureaucrats and soldiers or else for the stated rationale of being filled in good harvest seasons and emptied in lean ones, thus softening hunger cycles and price spikes.
Some prominent present-day proponents of greater local food production have picked up on the absolute necessity of greater storage capacity to fulfill their vision. Journalist Michael Pollan thus argued in an influential essay that the “food security of billions of people around the world” would benefit from a government-run strategic grain reserve, which would “prevent huge swings in commodity prices” and “provide some cushion for world food stocks.” By buying and storing grain “when it is cheap and sell[ing] it when it is dear,” he points out, public-minded bureaucrats would “moderat[e] price swings in both directions and discourag[e] speculation.” Needless to say, in recent decades public granaries and “strategic” reserves were also often a key component of foreign aid and protectionist agricultural strategies. Their supporters have included the prestigious International Food Policy Research Institute (IFPRI); NGOs such as the Institute for Agriculture and Trade Policy (IATP), Oxfam and Share the World’s Resources; several American consumer, environmental, religious and development groups and producers’ cartels; and the people in charge of the large-scale food reserves of India and China. Although recent proposals often take the form of special emergency reserves, international reserves, and “virtual reserves” controlled via commodity futures and options trading, their basic rationale remains the same.
Government-run food reserves, however, have a long history of failure. As several analysts have documented in more recent times, they typically proved “expensive, ineffective, and generally short-lived.” They also proved unable to outperform futures markets that, through the buying and selling of commodities and their future delivery contracts, already smooth out long-term price volatility. Recent failures include Sahelian community-managed cereal banks, small subsidized warehouses located in subsistence farming communities. As could be expected, their managers are tasked with buying grain when it is cheap and selling it later at a discounted (but nonetheless profitable) price when it is dear. In practice, though, such “community-run banks often run out of money. Borrowers default; bank managers price-gouge or simply steal money, leaving villages as hungry as before.” As one former NGO employee observed, people “stole, managers disappeared, or the bank was located too far for some villagers to get their food.” Supporters who acknowledged these problems could only defend this strategy by arguing that a “flawed solution to fight hunger is better than no solution.”
Similar outcomes were observed in the strategic grain reserves set up throughout Africa under the aegis of the Food and Agriculture Organization of the United Nations after the first oil shock of the 1970s. As described by geographer Evan Fraser and journalist Andrew Rimas, two analysts not particularly enamored of market forces, the “seemingly limitless hoard” in silos proved “too tempting for local officials to ignore, and the program was plagued by politicking, mismanagement, and corruption.” A decade ago, an internal note suggested that about a third of the grain stock reserve under the supervision of the Food Corporation of India (FCI) was rotting in the open because of a lack of adequate storage space. Despite the “full knowledge of the precarious condition of food grains, governments, both at the centre and in states, were unable to protect the country’s precious food reserves.” The FCI was also accused of being unable to move stocks after acquiring them and of having difficulty carrying out fumigation, “thus making preservation difficult.” According to the news report, the “apathy of the people and officials responsible for feeding millions may result in more losses in years to come. The big question which needs to be answered is whether anyone would be held responsible for this seemingly criminal negligence.”
Far from being aberrations, however, such outcomes are typical of the history of government-run food reserves. Apart from the perennial temptation of public officials to dip into them for their personal benefit, they also proved extremely costly and technically challenging, especially before the development of modern technologies. Among other challenges, their operators had to aerate and turn the grain, control moisture levels, sell and replace the grain frequently if it was to be used as seeds, and repair and maintain large structures. Not surprisingly, Turgot observed that the large granaries built by the French state always increased “the share of the rats and weevils to no purpose.” At about the same time, the Englishman Walter Harte considered “public granaries quite detrimental, rather than useful in a free state” for “[n]ational and even provincial magazines of corn” quite naturally produced monopoly, an “undue fear of famine” and “much anxiety about hoarding up grain” that would then inevitably create pressures to stop exports. These factors, he added, were “one of the surest methods I know of bringing on a dearth.”
The Belgian historian Louis Torfs further added in 1839 that public granary managers who could rely on the public purse were never as careful in their purchases as private individuals who spent their own money. Other problems were that massive state-sponsored purchases drove up prices for everyone and that safeguarding large warehouses during turbulent times always proved nearly impossible. Besides, while building and maintaining massive structures entailed enormous sums of money, these paled in comparison to the amounts required to provision a decent-sized city for even a short period of time. As such, Torfs stated, the very notion of effective public granaries had always been impractical (“sans aucune valeur pratique”). Efficient provisioning, he concluded, should be left in the hands of farmers and merchants, with government intervention limited to guaranteeing freedom to trade and private property rights. In the end, as Walter Harte argued, the best public granaries were “vast tracts of country covered with corn,” wherever they may be.
With the benefit of hindsight, we now know that trade liberalization and technical advances have delivered an ever more abundant, cheaper and more secure food supply. Promoting “solutions” that have always been plagued with unavoidable problems can only deliver the more expensive, scarcer and less food-secure world of yesterday.
Pierre Desrochers is Associate Professor of Geography at the University of Toronto Mississauga.