on Apr 20, 2016
The most interesting thing about these philosophical debates is this: Many people, when presented with these questions, seem to already have some kind of mental model for how they think it all works. They have their own definition for what a person is, what consciousness is, and what it means to “die” in a world where people can be copied. And to them it’s all sensible, reasonable, and consistent. Perhaps even obvious. Everything is fine until they talk to someone else, who has a radically different mental model – one that person finds equally inescapable and obvious.
For example? Everyone keeps linking the Transporter Problem video by the awesome CGP Grey. In that video, the mental model is that since your cells are all destroyed, you die, and then a new thing – a copy of you – is created in a new location. This doesn’t match my mental model at all and so just comes off like a bunch of wanking to me. When talking about someone “dying” I’m much more concerned with the continuity and fidelity of their thought processes than with which particular pile of cells those processes are running on.
This is one of the reasons I like this game. It seems to be pretty good at finding those narrow gaps between people’s mental models and wedging them open.
For the record: I think the bit with Brandon is actually pretty tricky, ethics-wise. I shrugged it off during the game, but if we were causing him physical(?) pain then I might have reacted differently. As it stood, we were slightly upsetting someone for twenty seconds for the sake of our own survival, and that seemed like a pretty clear-cut case. The fact that he won’t even remember being upset makes it easier still. Also – and maybe I’m being unfair to Brandon – I felt like he should have handled this better. He’s exhibiting Simon-levels of panic and confusion, when he ostensibly grew up around this technology and has been given ample time to wrap his head around the idea.