It was the spring of 1960, and a group of military officers had just seized control of the government and the national media, imposing an information blackout to suppress the coordination of any threats to their coup. But inconveniently for the conspirators, a highly anticipated soccer game between Turkey and Scotland was scheduled to take place in the capital two weeks after their takeover. Matches like this were broadcast live on national radio, with an announcer calling the game, play by play. People all across Turkey would huddle around their sets, cheering on the national team.
Canceling the match was too risky for the junta; doing so might incite a protest. But what if the announcer said something political on live radio? A single remark could tip the country into chaos. So the officers came up with the obvious solution: They kept several guns trained on the announcer for the entire 2 hours and 45 minutes of the live broadcast.
It was still a risk, but a managed one. After all, there was only one announcer to threaten: a single bottleneck to control of the airwaves. (italics added)
This is from Zeynep Tufekci, “It’s the (Democracy-Poisoning) Golden Age of Free Speech,” Wired, January 16, 2018.
This quote from early in the piece is about how relatively easy it was, before the Internet, for an oppressive government to engage in censorship. But now, she explains, it’s much harder.
That’s good, right? Maybe to you, and certainly to me, but not to Professor Tufekci.
Why? She explains in the rest of the article, so if you’re really interested, read it.
But what follows are some excerpts from her case for censorship.
Here’s the first part:
But today that playbook is all but obsolete. Whose throat do you squeeze when anyone can set up a Twitter account in seconds, and when almost any event is recorded by smartphone-wielding members of the public? When protests broke out in Ferguson, Missouri, in August 2014, a single livestreamer named Mustafa Hussein reportedly garnered an audience comparable in size to CNN’s for a short while. If a Bosnian Croat war criminal drinks poison in a courtroom, all of Twitter knows about it in minutes.
Again, that seems good to me, although there are potential downsides, but remember that this is part of her case for censorship. Myself, I’m not big on squeezing throats.
She goes on to argue, correctly, that there’s much fake news out there and that big outlets like Facebook and Google are gatekeepers.
Moreover, she notes, again correctly, that it’s hard to get people to hear your viewpoint if you have one that is contrary to the one they’re seeing/reading and you have no very good way of reaching them:
And the famous American saying that “the best cure for bad speech is more speech”–a paraphrase of Supreme Court justice Louis Brandeis–loses all its meaning when speech is at once mass but also nonpublic. How do you respond to what you cannot see? How can you cure the effects of “bad” speech with more speech when you have no means to target the same audience that received the original message?
But is that really much different from the fact that before the Internet, you didn’t have ways to enter people’s living rooms where conversations that contradicted what you knew were taking place? The Internet has definitely expanded the living room that you’re not invited to. But it has also made it much easier to know that there is such a living room. So maybe what you need to do is figure out how to get invited into that living room. Could your success at that have anything to do with how you approach people in the living room? My gut feel is that the answer is yes.
Also, I think this problem shouldn’t be overstated. I have acquired friends on Facebook whom I know of, or who know of me, for reasons unrelated to my political views. Here’s what I notice when I post on FB one of my EconLog posts or something else that has political or economic content: crickets. No loves, no likes, no sads, no angries, nada. But it’s not that they’re blocking me. When I post nice pieces about cats, then boom, lots of loves and likes. In short, they’re choosing not to respond and, probably, not even to read. Would I like that not to be so? Of course. But my point is that these people who don’t share my views have clear-cut ways of seeing my views.
Here’s her ending paragraph:
But we don’t have to be resigned to the status quo. Facebook is only 13 years old, Twitter 11, and even Google is but 19. At this moment in the evolution of the auto industry, there were still no seat belts, airbags, emission controls, or mandatory crumple zones. The rules and incentive structures underlying how attention and surveillance work on the internet need to change. But in fairness to Facebook and Google and Twitter, while there’s a lot they could do better, the public outcry demanding that they fix all these problems is fundamentally mistaken. There are few solutions to the problems of digital discourse that don’t involve huge trade-offs–and those are not choices for Mark Zuckerberg alone to make. These are deeply political decisions. In the 20th century, the US passed laws that outlawed lead in paint and gasoline, that defined how much privacy a landlord needs to give his tenants, and that determined how much a phone company can surveil its customers. We can decide how we want to handle digital surveillance, attention-channeling, harassment, data collection, and algorithmic decision-making. We just need to start the discussion. Now.
So it’s not Facebook and Google, responding to public outcry, that should fix this situation. Whom does that leave? It’s “deeply political.” In other words, Tufekci wants some government intervention to fix it. Which means, almost certainly, censorship.
Notice a little irony with her phone company example: political decisions “determined how much a phone company can surveil its customers.” Political decisions also determined how much the federal government can surveil its residents. How is that working out?
Roman emperor Caligula once said, “Would that the Roman people had but one neck!” The idea was presumably that then he could easily control it, break it, threaten it, or strangle it. Tufekci leads off with a discussion of how easy it was to censor when there was one bottleneck. I wish I were 100% confident that she doesn’t share Caligula’s wish.
HT2 to Brian Doherty, with whose analysis I slightly overlap. He makes a lot of other good points.
READER COMMENTS
Mark
Feb 13 2018 at 5:37pm
I assume the article was written in 2018 rather than 2108, though will not rule out the possibility of econlog owning a time machine.
Whenever I see articles demanding greater censorship or speech regulation, I’m always disappointed to find that the evidence for how bad things have gotten is entirely anecdotal or speculative. What is the quantitative evidence that people are actually more misled today than in previous generations? It just seems like a vague golden age fallacy to me. Is there a great history of fair, constructive state regulation of speech that I’m not aware of?
David R Henderson
Feb 13 2018 at 5:52pm
@Mark,
2018, yes. I seem to make the mistake a lot. Fixed.
Whenever I see articles demanding greater censorship or speech regulation, I’m always disappointed to find that the evidence for how bad things have gotten is entirely anecdotal or speculative.
Well said. Me too.
What is the quantitative evidence that people are actually more misled today than in previous generations? It just seems like a vague golden age fallacy to me.
My impression too.
Is there a great history of fair, constructive state regulation of speech that I’m not aware of?
I tend to doubt it. I sometimes wonder, when I see left “liberals” decry how misled people are, if what they really don’t like is that they’re misled by things they don’t agree with rather than being misled by things they do agree with.
Jon Murphy
Feb 13 2018 at 9:11pm
The problem with trying to come up with some kind of “rule” about content, regardless of who enforces it, is that said rule ultimately means the truth of content will be determined not by its nature but rather by some arbitrary judge. Even a rule like “any statement that is not supported by fact should be banned” or “any statement that is contrary to fact should be banned” will eventually become arbitrary. What’s factual at one time is not at another. There was once a time when the “facts” told us the Sun revolved around the Earth.
In other words, you’d need non-experts to make expert judgments. That’s a scary thought.
I am reminded of the Devil speech from A Man For All Seasons. I fear these folks would be willing to cut down all the laws for the sake of getting to the Devil…
Weir
Feb 13 2018 at 9:29pm
“Sadly you can say what you like around the kitchen table at home.” True quote from Gillian Triggs, formerly Dean of Sydney Law School, recipient of last year’s Voltaire Award, and the Australian government’s choice for Human Rights Commissioner.
foobarista
Feb 13 2018 at 11:07pm
I can’t help but think that she subscribes to the ancient definition of democracy: that when my guy or gal wins, it’s democracy.
Anything else is a horrid dictatorship of Those People.
And if only a committee of the right-thinking were in charge, my buds would have a “permanent governing majority” and we’d have a nice safe democracy…
David R Henderson
Feb 13 2018 at 11:35pm
@Weir,
Thanks for that. Chilling.
BC
Feb 14 2018 at 1:56am
“Sadly you can say what you like around the kitchen table at home.”
Actually, combine the internet censorship that Tufekci calls for with Alexa and Google Home and maybe this won’t be true anymore.
Hazel Meade
Feb 14 2018 at 1:47pm
A couple of months ago, I was having a discussion about Wikipedia and how that website has somehow evolved community standards that rely on factual evidence and citations, while the rest of the internet has spun off into tribal camps with their own alternate narratives and sets of facts.
The explanation was offered that Wikipedia only allows one article for each topic, so that participants are forced to confront the other side’s facts and citations and consequently the article tends to converge towards something truthful and comprehensive.
Maybe in the future, the news media will converge on something similar, eliminating the demand for government to do something about fake news. The collaborative aspect of Wikipedia also helps to build trust – everyone can participate as long as they are committed to honesty, so fewer people will feel the need to build an alternative media to present their own point of view.
NS
Feb 14 2018 at 3:04pm
It’s been obvious for some time that any private company that collects “Big Data” should be considered a de facto arm of the state. Is anyone else suspicious that although knowledge and power have been synonyms for hundreds of years, the government is now curiously unable and unwilling to regulate private organisations that collect data on its citizens, just at the moment when its own intelligence agencies are having their hands increasingly tied or at least disparaged in the public eye?
Doesn’t Occam’s Razor suggest that private companies are giving central governments a deniable way to “collect it all”? In fact, didn’t Edward Snowden reveal exactly this behind-the-scenes cooperation? So much cooperation, in fact, that according to his stolen documents it wasn’t clear where Google, Yahoo, Facebook and Microsoft ended and where Washington began.
In a Western democratic system, “the government” is always much more than just the official departments and ministries. It has to be, because it pivots on the manufacturing of consent (Walter Lippmann, not Noam Chomsky). This function has been delivered through official organs in the past, but now advertising, marketing, education, and the internet magnify this role to extraordinary levels.
It’d be easy here to slip into the “corporate feudalism” argument and claim the companies are taking power from the central governments, but the Western democratic system has always been a whole-of-society affair, with every piece playing its part in holding the organism together. Why should it be different now? That the knowledge collectors of today are the private companies doesn’t change the fact that they know more about you than you know about you, and will use this to nudge/guide you in the required direction. The question, as always, is not where the lines are drawn, but who is drawing the lines?
As an aside, I suspect this explains why there is a growing divide between the establishment and the insurgents regarding blockchain technology. If – and it’s a big if – blockchain can do what its proponents claim, then the free collection of data is made much more difficult. The establishment supports AI and Big Data, while the insurgents (for want of a better word) are betting heavily on the blockchain. Look at the article trends on blockchain: nearly all of them are negative on the usefulness of blockchain and guilt-by-association about criminal underworlds. I don’t care either way, but as a rule of thumb, this is probably a good way to look at what’s going on.
Thoughts?