Robin Hanson writes:

“Bryan Caplan made strong, and to me incredible, claims that econ consensus predicts all ems would be fully slaves with no human personality. As he won’t explain his reasoning, but just says to read the slavery literature, I’ve done a quick lit review, which I now summarize, and then apply quickly to the future in general, and to ems in particular.”
I think I explained my reasoning repeatedly, but I’m happy to clarify.
For starters, Robin overstates my position. I can’t remember the last time I wrote a sentence like “All ems would be fully slaves with no human personality.” But I would say, “A large majority of ems are likely to be effectively slaves with little human personality.”
Now why do I think this?
1. Most human beings wouldn’t see ems as “human,” so neither would their legal systems. Robin wants to soften people’s attitudes, but he’s unlikely to succeed. One of the main reasons his project seems so “weird” outside the sci-fi and futurist communities is that normal humans feel little sympathy for non-humans. There’s an obvious evolutionary explanation: Our empathy primarily exists to encourage cooperation between humans. We feel mild empathy for other animals – especially larger mammals – but not much.
2. At the dawn of the Age of Em, humans will initially control (a) which brains they copy, and (b) the circumstances into which these copies emerge. In the absence of moral or legal barriers, pure self-interest will guide creators’ choices – and slavery will be an available option.
3. Moral and legal barriers aside, imperfect information about workers’ ability is the only self-interested reason not to treat them as slaves, especially when you can pre-select workers for docile personalities.
4. Since brain scans allow for cheap copying, employers would have excellent information about ems’ true abilities. Create a few dozen copies, give them life-or-death incentives to excel, and see what they accomplish. Then use that information against all the copies: Perform at your potential or we’ll inflict horrible pain on you. (A rough sketch of this logic appears after this list.)
5. There’s little reason to think initial human control will devolve into anything else as the Age of Em proceeds. As the em economy expands, the cost of treating ems more nicely gets higher and higher. And since, like farm animals, ems are bred for docility, there’s little reason to fear organized rebellion.
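To make the information logic in point 4 concrete, here is a minimal sketch. Everything in it – the ability and noise figures, the 36 copies, the “best observed output” benchmark – is an illustrative assumption, not a claim about how em labor markets would actually be parameterized:

```python
import random

# Illustrative model of point 4: with cheap copying, an employer can
# estimate an em's true productive potential by running many copies
# under strong incentives and taking the best observed output as the
# standard every copy is then held to. All numbers are assumptions.

random.seed(0)

TRUE_ABILITY = 100.0  # the em's actual potential (arbitrary units)
NOISE = 15.0          # run-to-run variation in observed output

def observed_performance() -> float:
    """One copy's observed output: true potential plus random variation."""
    return random.gauss(TRUE_ABILITY, NOISE)

# A single hire yields just one noisy observation of ability.
single_estimate = observed_performance()

# Cheap copying yields a few dozen observations; the best of them
# reveals roughly what this mind can do when fully motivated.
benchmark = max(observed_performance() for _ in range(36))

print(f"one observation:   {single_estimate:6.1f}")
print(f"best of 36 copies: {benchmark:6.1f}  (the standard all copies must meet)")
```

The point is only that copying collapses the usual information asymmetry between worker and employer: the more copies you run, the less room any copy has to plead inability.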
I understand how Robin can disagree with this argument. But I don’t understand how he could be unable to understand it. I can also understand why Robin would like more details. But I’m not confident enough about the details to want to describe them.
To repeat: If, like me, you think ems wouldn’t really be conscious anyway, my scenario is a great outcome. Humans would be fabulously rich and safe. And ems would genuinely experience no more pain than paper in a paper-shredder.
READER COMMENTS
Pajser
Jul 19 2016 at 2:32am
It surprises me when people talk about the future only in terms of “what will happen” and not “what should happen.” If artificial beings have emotions equivalent to human emotions, they should be treated the same as humans. And if I understand what should be done, other people will understand that too, and eventually it will be done. Caplan’s vision of mankind is more pessimistic than I expected.
Dan Hill
Jul 19 2016 at 4:20am
“If, like me, you think ems wouldn’t really be conscious anyway”
What is the basis for concluding that ems would be perfect substitutes for human minds, but conveniently without that pesky consciousness?
Isn’t consciousness a defining feature of a human mind? Is it actually an em (of a human mind) if it isn’t conscious?
And how do you even create it? Which component of a human mind do you NOT emulate to avoid consciousness?
Hazel Meade
Jul 19 2016 at 10:12am
It would be nice if you would include a brief explanation of what an ‘Em’ is. Even I am not nerdy enough to have heard of this.
(Electronic memory?)
Mark Bahner
Jul 19 2016 at 12:17pm
Yes, that’s precisely why ems probably won’t occur (except as a scientific curiosity). If ems occurred, morally and legally they would almost certainly be viewed as human. If instead computers have no emotions, there is no need to legally or morally treat them like humans. They can be abused like electronic equipment is currently abused.
An “em” is a human brain emulation. It’s software that (theoretically) performs exactly like a particular human brain. (And “ems” is the plural of “em”.) “Whole brain emulation” or “mind uploading” are common phrases meaning the same thing.
Mark Bahner
Jul 19 2016 at 12:55pm
Bryan: “If, like me, you think ems wouldn’t really be conscious anyway”
Dan Hill: “What is the basis for concluding that ems would be perfect substitutes for human minds, but conveniently without that pesky consciousness?”
This is exactly the issue I had with Robin’s book when I first learned about it. I said that it wouldn’t make sense to have a computer that replicates negative aspects of the human mind (prejudice, greed, laziness). Robin insisted that an em had to have those things, or it wouldn’t be an em. So I told him ems were a bad deal. I still don’t understand why Robin, especially being an economist, can’t see that.
Richard O. Hammer
Jul 19 2016 at 3:29pm
@Hazel Meade
Evidently “em” means brain emulation. See the page for Robin Hanson’s book, The Age of Em.
Jan Mikkelsen
Jul 19 2016 at 5:45pm
If ems aren’t conscious, they don’t have the value you ascribe to them. In this model they do have that value (e.g., the “discover ability through torture” step), and so I think it’s reasonable to think about them as conscious.
As a society we may go through an extended God King phase where people do what they want with ems, much like early civilisations had God Kings who owned the whole society as slaves, with torture and human sacrifice. As civilisations scaled up, we needed external controls to make rulers follow external rules – the technology of religion, to make rulers feel bad when they abused the people.
The model you are describing is one where some have their own personal God King society with torture and slavery. The question is – what force prevents that? Do we even know what it is?
Consider this thought experiment: a religion acquires an em copy of you, and has decided you’re a “sinner”. They put a full simulation of you into a world of continuous torture, with no end, to ensure that you correctly “go to hell”. Is that moral?
(I think I need to go and read the book …)
Don Geddis
Jul 20 2016 at 12:34pm
Jan Mikkelsen: “Consider this thought experiment: a religion acquires an em copy of you … They put a full simulation of you into a world with continuous torture with no end to ensure that you correctly “go to hell”.”
If you wish to explore this scenario further, you may enjoy the novel Surface Detail, one of the Culture novels by Iain M. Banks.
Jan Mikkelsen
Jul 20 2016 at 5:39pm
Don Geddis: Yes, I have read it — I should have credited it in my post, but I ran out of time and forgot while thinking about how to word the history part. Thanks for reminding me.
Surface Detail is excellent and explores the whole em issue — Iain M. Banks was a fascinating thinker and a great storyteller. An excellent combination.
Jan Mikkelsen
Jul 20 2016 at 5:42pm
One more thing: some of my God King thinking was influenced by Peter Turchin’s “Ultrasociety” and thinking about how those concepts would apply in an em-capable world.
Matt Cederholm
Jul 20 2016 at 10:54pm
To further Bryan’s argument, as well as the comment by @Mark Bahner (admittedly, I have not read Robin’s book, so this may be a naive question):
Why would we not assume that ems would be created by copying brains of people who actually enjoy servitude? Surely, such people exist.
That both supports Bryan’s argument that ems would be used primarily for service and solves the moral dilemma of creating a “slave race.” While I agree that ems would not be seen as human, there would likely be empathy towards them as human-like beings.
Mark Bahner
Jul 21 2016 at 12:43pm
A couple responses:
1) Slavery is defined (at least in the Constitution) as “involuntary servitude.” So by definition in the Constitution, no one can enjoy slavery…because if they did, it wouldn’t be “involuntary.”
2) My contention is that there are an overwhelming number of reasons–including economic reasons–*not* to completely emulate a human brain, so ems are not likely to ever occur.
If a human brain is emulated, right off, many people would say it should have legal rights…such as a right to a minimum wage. So it seems to me obvious that the more likely future is, for example, computer-driven cars, not ems that have some special talent for driving. And robotic house builders, not ems that have some special talent for building houses. And so on for the millions of types of occupation that exist in the world.
Peter McCluskey
Aug 11 2016 at 10:20pm
I’m puzzled by the claim “employers would have excellent information about ems’ true abilities”.
That’s only plausible if tasks are fairly uniform – and uniform over long periods of time as ems experience time.
Suppose a human owner monitors worker performance once per human day. If ordinary ems run something like a thousand times human speed (roughly the typical speed in Robin’s book), that suggests ordinary em workers get monitored something like once every 3 or 4 of their years. That’s roughly equivalent to sailors on the longest voyages of a few centuries ago.
That hardly looks like a typical situation in which slavery worked well in the past.
For top em managers, it’s even stranger. They’ll do more like millennia of work between each instance of the human owner monitoring them. I expect such a company with substantial human control would be too static to compete with any company whose managers had more discretion (or with an em-owned startup – it’s unclear what stops ems from starting their own company, and becoming wealthy before humans can react).
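A back-of-the-envelope version of this subjective-time arithmetic, with the speedup figures as illustrative assumptions (something like 1,000x for ordinary workers, and a much larger, purely hypothetical figure for top managers):

```python
# Subjective em-time elapsed per human monitoring interval.
# The speedup values are illustrative assumptions: ~1,000x is roughly
# the typical em speed in Hanson's book; the manager figure is a guess.

DAYS_PER_YEAR = 365

def em_years_between_checks(speedup: float, human_days: float = 1.0) -> float:
    """Subjective em-years that pass per `human_days` of owner time."""
    return speedup * human_days / DAYS_PER_YEAR

print(f"worker  (1,000x):     {em_years_between_checks(1_000):8.1f} em-years")
print(f"manager (1,000,000x): {em_years_between_checks(1_000_000):8.1f} em-years")
```

At 1,000x, one human day of owner time is about 2.7 subjective years per check; at 1,000,000x, it is millennia.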
Em managers would behave empathetically toward their em employees for the same reason that human managers do that with human employees. (If you somehow had an em that didn’t think ems had experiences in the morally relevant sense, what would cause that em to care about his own future?)
It seems less clear whether em managers would act as if they have empathy for humans.