Regular EconLog reader KevinDC wrote me last week with some interesting content from Slate and his own insightful comments. He has given me permission to quote.

Kevin writes:

I came across a story in Slate I thought you might find interesting. The author describes how food delivery apps similar to UberEats or GrubHub work in Indonesia. From a certain naive point of view, these sorts of services could be seen as a successful instance of an algorithmic, central-planning model of food delivery. But the author points out that these systems can work only because the drivers bring their deep knowledge of specific local conditions to bear, the quintessential Hayekian point about “particular circumstances of time and place.”

From Rida Qadri, “Delivery Platform Algorithms Don’t Work Without Drivers’ Deep Local Knowledge,” Slate, December 28, 2020:

To do their jobs, they must think every day about which routes have the most potholes and which traffic signals stay red the longest. Their mental maps of the city note what places have unfriendly security, where they might encounter violent traditional motorbike drivers, specific agreements they have to comply by [sic], friendly roadside restaurants that would let them rest. They must compensate for inaccurate geolocations caused by GPS signals blocked by nearby infrastructure.

Much has been written about the frictionless technology of ride-hail platforms celebrated by customers and technologists alike…Yet their elegance is powered by and relies on the human mediations of the drivers on the street. It is the local markets they claim to replace that have often furnished drivers with the knowledge of local physical and social constraints.

Kevin then points out that this caused him to recall “a similar observation by James C. Scott about how apparently successful planning depends on the ability of people to ignore the plans and regulations, and follow their own evolved rules instead.”

Scott wrote:

Workers have seized on the inadequacy of the rules to explain how things are actually run and have exploited it to their advantage. Thus, the taxi drivers of Paris have, when they were frustrated with the municipal authorities over fees or new regulations, resorted to what is known as a grève du zèle. They would all, by agreement and on cue, suddenly begin to follow all the regulations in the code routier, and, as intended, this would bring traffic in Paris to a grinding halt. Knowing that traffic circulated in Paris only by a practiced and judicious disregard of many regulations, they could, merely by following the rules meticulously, bring it to a standstill. The English language version of this procedure is often known as a ‘work-to-rule’ strike. In an extended work-to-rule strike against the Caterpillar Corporation, workers reverted to following the inefficient procedures specified by engineers, knowing that it would cost the company valuable time and quality, rather than continuing the more expeditious practices they had long ago devised on the job. The actual work process in any office, on any construction site, or on any factory floor cannot be adequately explained by the rules, however elaborate, governing it; the work gets done only because of the effective informal understandings and improvisations outside those rules.

I (this is David R. Henderson speaking) remember when I first heard, in my teens, about a work-to-rule strike and thought “What’s the problem? Isn’t everybody supposed to be working by the rules?” It might not surprise you to learn that I grew up in a family run by a man (my father) who was a high-school principal. Someone, probably my mother, who, OMG, was so much looser with rules, explained to me why such a strike would be effective.

Back to Kevin. He writes:

These (and other examples) are bringing into focus something that I’ve noticed for a long time but never articulated. “Planning” is most able to appear successful in places where people are most free to ignore or work outside the plan. Delivery drivers aren’t successfully allocated by algorithms crunching all the “relevant data” – the drivers use their own local knowledge, unaccounted for by planners, to determine what the most efficient allocation of driving resources will be. A de jure “well regulated” taxi industry can appear to work efficiently, but only to the extent that the taxi drivers are de facto free to ignore regulations and act instead by their own evolved order. In countries that were dedicated to the idea of “planned economies,” life was most tolerable in the places where the local authorities tacitly approved (or at least tolerated) the existence of black markets operating in parallel to allocate resources outside the dictates of the planners. The less effectual planning is, the more successful planning appears.

This is even true (in my experience) of what appears to be one of the most “command and control” organizations you can think of – the US Military. (I was in the Marine Corps for nine years.) On the one hand, there are the official rules, regulations, general orders, and standard operating procedures written up by people sitting behind desks and printed up in the official manuals. And on the ground, there is how stuff “really gets done,” which varies from unit to unit. (This phenomenon was made fun of in the Terminal Lance webcomic, where Marines fresh out of training are quickly advised to forget everything they were taught: https://terminallance.com/terminal-lance-character/itb/) There was also an informal understanding that “regulation thumpers” who insisted that everything be done “by the numbers” according to the official rules should never be allowed to be in charge of anything – because they were prone to substitute official rules for evolved unit-level practices, and nothing would ever get done properly.

Back to David R. Henderson:

One of the funnest and most illuminating projects I gave my students during the last 15 years I taught was to discuss a situation they had confronted in the military and say why acting according to local knowledge worked (or didn’t work) or why centralized decisions didn’t work (or worked). We all agreed that a lieutenant should not start a nuclear war, but there were some really good examples far below that level of importance where local knowledge worked well (and some where it worked badly).

I’ve sometimes thought of collecting these in a book.