SOMA EP8: Handwavium

By Shamus Posted Thursday Apr 14, 2016

Filed under: Spoiler Warning (58 comments)


Link (YouTube)

This entire show is supposedly some form of game criticism where we talk about what worked and what didn’t. But let’s put that idea aside for the next couple of episodes, because that’s not really what our conversation is all about. When I say something didn’t work for me, I’m using that to segue to another philosophical question. I’m not actually saying the game is bad, or that it should have been done differently. I’m a big believer in the idea that when it comes to philosophical wanking like this, there are no wrong answers.[1]

To put it more specifically: It’s pretty clear that Simon (and perhaps the developers?) disagrees with me on a pretty fundamental level. And that’s okay. I bring this up because I disagree with the game often, and I don’t want people to think I’m counting these disagreements as faults in a game-design sense. It’s all good.


Footnotes:

[1] Obviously the stakes go up when we start talking about how this stuff could be applied to real-world problems, but that’s why I love sci-fi. It gives us a safe space to play around with these ideas, where nobody dies if we’re “wrong”.




58 thoughts on “SOMA EP8: Handwavium”

  1. Mintskittle says:

    You linked episode 7, Shamus.

    1. ehlijen says:

      Indeed. Here is the YouTube link to the correct episode.

      https://www.youtube.com/watch?v=b49oaLAKLKk

      1. ehlijen says:

        Since it’s fixed, feel free to delete the above comment to reduce webpage size if that helps.

  2. SlothfulCobra says:

    These are all a bunch of classic sci-fi philosophical questions that various people have been taking a whack at for years. A lot of writers just assume that self-continuity of a single “self” is vital, so naturally any duplication should be immediately followed by the destruction of the original, or at least that a similar process would be entailed with a lot of wacky sci-fi junk.

    The satellite thing is like what a lot of powerful people try with their tombs, although there’s also a religious component there sometimes. As I understand it, these people were already working on that project, and decided that now that everything they knew was dead and gone and they’re stuck in these videogame corridors for the rest of their lives, they might as well finish it as a kind of legacy for the human race, or alternatively some kind of spore? There was a vignette in the Mass Effect codex like this too. God knows why they were going to launch a rocket from the sea floor into the outer reaches of space instead of literally anything else, though.

    1. ehlijen says:

      Am I getting this right? The launch site for the rocket is DEEPER under water than the rest of the base? Why? Wouldn’t you prefer having less water in the way?

      1. MechaCrash says:

        It’s launched into orbit via what is basically a gigantic railgun. For that, you want a nice long barrel so you have plenty of time and length to get the thing up to speed.
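
        (Back-of-the-envelope, with numbers that are pure assumptions since the game never gives any: for a fixed exit speed v and barrel length L, the required constant acceleration follows from v^2 = 2aL. A quick Python sketch:)

        # Hypothetical numbers only; SOMA never specifies them.
        G = 9.81  # m/s^2, standard gravity

        def railgun_acceleration(muzzle_velocity_ms, barrel_length_m):
            # Constant acceleration needed to reach the muzzle velocity
            # over the barrel length, from v^2 = 2 * a * L.
            return muzzle_velocity_ms ** 2 / (2 * barrel_length_m)

        v = 8000.0  # m/s, ballpark orbital speed, ignoring drag losses
        for length_m in (100.0, 1000.0, 10000.0):
            a = railgun_acceleration(v, length_m)
            print("barrel %6.0f m: %9.0f m/s^2 (about %6.0f g)" % (length_m, a, a / G))

        Even a ten-kilometre barrel still implies a few hundred g, which is survivable for hardware like the ARK but not for passengers; every extra metre of barrel makes the ride gentler.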

        1. Echo Tango says:

          Having it in water could also potentially make it easier to build, since you could use balloon-/submarine-things to help lift a lot of the weight. Air is a lot harder to float in than water. :)

          1. ehlijen says:

            Ah, so the barrel of the thing would presumably be air (if it’s open at the top) or possibly vacuum (if it has some sort of soft one-way seal)?

            Otherwise I have trouble believing that the extra length gained would be an advantage if you just end up shoving your spaceship through more water first. You’d also need to build the spaceship a lot sturdier if it’s supposed to survive that deep in water until countdown, again increasing the energy requirements.

            1. Echo Tango says:

              Yeah, it’d be air / partial vacuum. I guess that would make it its own boat / balloon, to help with floating. The whole contraption would need a lot of complicated junk, but I think it might be possible.

              1. ehlijen says:

                It’d need to be very sturdy to withstand the depth pressure (but should be feasible?) and the ship would need a solid nose to not break apart when it hits the air after accelerating in near vacuum first.

            2. Alex says:

              You’re also building it to survive being shoved down the barrel and then through the atmosphere at ridiculous speeds, so something that can survive under high atmospheric pressure is already pretty much required.

              1. ehlijen says:

                That’s pressure in one direction. To guard against crushing from all sides, you’d need the cube of that resistance evenly distributed, no?
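
                (For what it’s worth, hydrostatic pressure grows linearly with depth, p = ρgh, not as a cube. A rough Python comparison, with every number an assumption for illustration since the game gives none:)

                # Assumed illustrative numbers; SOMA never specifies depth or launch speed.
                RHO_SEAWATER = 1025.0  # kg/m^3
                RHO_AIR = 1.225        # kg/m^3 at sea level
                G = 9.81               # m/s^2
                P_ATM = 101325.0       # Pa, one atmosphere

                def hydrostatic_pressure(depth_m):
                    # Gauge pressure at depth: p = rho * g * h (linear in depth).
                    return RHO_SEAWATER * G * depth_m

                def ram_pressure(speed_ms):
                    # Dynamic pressure when the ship meets air: q = 0.5 * rho * v^2.
                    return 0.5 * RHO_AIR * speed_ms ** 2

                print("water at 4000 m: %4.0f atm" % (hydrostatic_pressure(4000.0) / P_ATM))
                print("ram pressure at 8 km/s in sea-level air: %4.0f atm" % (ram_pressure(8000.0) / P_ATM))

                Both come out around 400 atmospheres, which is Alex’s point: a hull built to punch through sea-level air at launch speed already faces pressures on the order of the abyssal water around it, though an all-around crush load and a nose-on ram load do stress a structure differently.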

        2. Peter H. Coffin says:

          And rail guns operate by electricity. A LOT of electricity. Which means they need to dump a lot of waste heat. Water does a really good job of pulling off that heat compared to air.

    2. McNutcase says:

      A couple of good places to explore replication-without-destruction of personality are the Takeshi Kovacs trilogy by Richard K. Morgan (in which such replication is possible, but highly illegal; naturally, it happens anyway) and the Eclipse Phase roleplaying game (which is… astoundingly obtuse in character creation in part because forking one’s self is easy and not illegal in most jurisdictions), which notes Takeshi Kovacs as a major inspiration.

      1. sheer_falacy says:

        Takeshi Kovacs is an interesting take on it but it does pretty much completely ignore the continuity of self question. Which is fine, the story didn’t need to get bogged down in that.

        Another story with an interesting take, which this game reminds me of in several ways, is After Life (http://sifter.org/~simon/AfterLife/). It even opens with a brain scan!

      2. I’d also recommend the Old Man’s War series by John Scalzi. His series (as well as another, more space-opera series I’m reading, “The Reality Dysfunction”) both have people uploading or transferring their minds. In both cases, the ideal transfer is one where consciousness is shared between both vessels, and the original is terminated while the two are still a single person.

        This way, the story can dance around the whole “soul” aspect and concentrate on a mind not experiencing any discontinuity.

    3. King Marth says:

      Can’t blame them too much for assuming there can only be one self; a lot of people are under the absurd impression that the branching universes described by quantum equations must somehow be annihilated once interaction with external sensors decoheres them. (Often referred to as ‘collapse’.) Once a few atoms are out of alignment they’re basically unreachable from each other, so this is a pretty harmless assumption, but it must be lonely assuming that exactly one universe is all that exists.

      1. Echo Tango says:

        I was willing to just chalk up the suicides to people being desperate and a bit self-deluded, hoping for an escape from their hell.

        1. Sarachim says:

          I understood the argument for suicide like this: when you’re scanned, meat-you and scan-you are identical. The more time passes, the more they diverge, until eventually meat-you doesn’t feel like your scan is the same person as you any more. Since meat-you wants your authentic self preserved, logically you’d need to stop having new experiences immediately after getting scanned. I was sure that’s what Robin meant to say, but now that I go back and look at her dialogue closely, it seems ambiguous.

          But that’s not mutually exclusive with the idea that all the stated reasons are pretexts, and really people just want to escape.

          1. Mintskittle says:

            I haven’t played the game, so everything I know about this game comes from TVTropes, but it seems to me that the people who committed suicide after being scanned were led to believe that the transferal of consciousness wouldn’t be complete until after the death of their physical bodies. That continuing to live would somehow prevent the digital copy from truly being you. At least that’s what I got out of it.

            1. Alex says:

              Right. It’s Insane Troll Logic, to borrow the TVTropes term. It hides the evidence that you’re being copied and not transferred, but it doesn’t make it untrue.

      2. Daemian Lucifer says:

        It’s not that absurd at all. Also, having a quantum state collapse once it’s being detected does not automatically mean there is just one universe. It could also mean that there is one universe for every particle, where every other particle is a “virtual” one used only to “fill out the void” for that one “real” particle.

    4. Echo Tango says:

      Yeah, I don’t get why they didn’t at least try something else besides the humans-in-a-VR spaceship launch. Like, Catherine briefly mentions that she’ll have to align solar panels on the ship once it’s up in space, so she must have some kind of superadmin access to everything. Given that she’d be able to interact with the real world, could she not have strapped some robot arms to the ship, so it could potentially start building a new robo-city for humans, that eventually gets populated with flesh-and-blood people, after they’ve built a cloning lab? Hell, even just building more robots, so they can keep the VR going and do maintenance would be good. :)

      1. Galad says:

        A fine idea I hadn’t thought about, but not the focus of the core game. That being said, if you feel like you’d like to mod it in, I’d love to play it!

    5. Groboclown says:

      A lot of writers just assume that self-continuity of a single “self” is vital, so naturally any duplication should be immediately followed by the destruction of the original, or at least that a similar process would be entailed with a lot of wacky sci-fi junk.

      This was a big part of what Noam Chomsky researched in his linguistic studies: the idea of how humans comprehend the continuity of a thing as it changes form. Take, for example, a tree: you can easily create a genetic clone of the tree with a cutting. Is the new tree, grown from that cutting, the same tree?

      The question raised here takes that continuity idea and applies it to yourself.

      1. Peter H. Coffin says:

        The fun thing about that one is that the usual reason for taking a cutting is to graft it onto another, less useful branch or even whole tree. So which tree is the new one?

  3. Redingold says:

    On the subject of what exactly is simulated, I figure that since it’s just a brain scan, it wouldn’t include things that aren’t generated in the brain, like itching, or hunger, but it would include everything else, such as anxiety. However, Simon isn’t just a simulation – he’s plugged into a real, functional body, and the body is clearly capable of receiving signals from the cortex chip (since Simon can move its muscles) and is presumably capable of sending signals to the cortex chip, as Simon can feel pain. Presumably, then, Simon can feel sensations from that body, and that could include things like itching. Other people, like the ones stuck in robots, or Catherine, wouldn’t have access to these bodily sensations. Pure simulations, like those on the ARK, would presumably simulate the entire body, though they might do away with random itching or pins and needles for the sake of convenience.

    As for hormones, I figure that, since it was developed to simulate medical treatments, the simulated brain scan reacts realistically to simulated hormones (if a treatment could release colossal amounts of, say, cortisol, which can damage neurons, you need to be able to simulate that). The cortex chip might even be able to activate the adrenal glands via the body’s sympathetic nervous system. However, I can think of no reason why the cortex chip would include hormone detectors, since they aren’t supposed to be attached to biology. It’s possible that the WAU built hormone detectors when combining Simon’s chip with his body, but I don’t think there’s anything to support that and it kinda feels like an asspull.

    1. Echo Tango says:

      The pure-simulations could even just simulate the memories of being itchy to save on processor power, and still keep its people feeling normal(-ish). :)

  4. Echo Tango says:

    OMG, I did not know you could just rip off the door/access-panel at 9:10! I spent, like, a solid minute slowly opening that thing far enough that I could quickly jam my hitbox between it and the buttons so I could operate them. :S

  5. ehlijen says:

    If humanity is truly over with no chance of recovery, and my only future is being stuck in a small underwater base with the same people for the rest of my life, I think I could get behind some sort of memorial/“at least they sort of live on” lifeboat.

    It’s the kind of idea that probably gains a lot of support in a ‘what else is there to do?’ scenario.

  6. Nate A.M. says:

    It really bugs me when people who don’t study philosophy academically say that philosophical wanking has no wrong answers. It shouldn’t, though, because there are a few important ways in which the most charitable interpretation of that idea is true. In any case, even if no well-justified position on a question is globally preferable, there are almost always still important merits and defects which can be enumerated and elaborated. Which, of course, is exactly what you’re already doing.

    Ahh… but I do feel really bad for being pedantic about this sort of thing, the same way I feel bad for identifying with Catherine’s impatience for naïve existential crises. Maybe my psychology isn’t so different from hers. As a scientist who deals directly with mind theory, she’s probably spent a great deal of time studying and considering the culmination of millennia of literature on the subject, and there’s no telling what the children of the late 2070s internalized. This brain scan stuff has apparently been happening since 2015, after all. If this characterization of Catherine is intentional, it’d be some damn fine worldbuilding too.

    There’s a lot I want to say about the Ark and the nature of brain scanning in this game (spoiler: I both agree and disagree with Shamus on different points and to varying degrees~!), but I’ll write it for the last episode and space out the amount of time you’re obliged to spend reading my comments =P

    1. nerdpride says:

      I’m just as disappointed as anyone that Simon plays it dumb all the time. But so far I like Catherine as a character in this setting. She does come off as knowing very much more than she’s saying, and although it doesn’t add to scariness directly, there’s a neat uncomfortable feeling that’s not quite the same as the generic, “she’s so much smarter than you”. Maybe contrast her to TIM in ME3?

      Also, it’s interesting that while playing a videogame, your character never gets an itchy nose. I hope not to see anything 4th-wall-breaking, but those small bodily functions need to be ignored for immersion to work anyway, so the game shouldn’t mention them. Maybe it could do something with chemicals, though.

  7. Daemian Lucifer says:

    @4 minutes
    Was I the only one expecting Rutskarn to say “Is this the real life or is it just fantasy”?

  8. Daemian Lucifer says:

    About killing yourself and the continuation of being:

    It’s not really that stupid. In fact, the hypothesis of quantum suicide supports this. Basically, it states that the spacetime multiverse is a static thing, a series of moments which our consciousness is traveling through, from one dot to the next. And if there is a spacetime dot where our body does not exist, our consciousness cannot enter that place, so it will enter the first available dot, always picking the route where “self” is a thing. Therefore if your body is dead, but there exists a copy of your body, your consciousness will pick that one to inhabit. And the less time passes between your copy being created and the original dying, the greater the chances that your consciousness will “pick” the living copy to inhabit.

    Of course, the problem with this hypothesis is that you can only test it for yourself, by trying to kill yourself a bunch of times. And even if you survive them all, it could mean that you were just incredibly lucky. Not to mention that the hypothesis only states that you will survive it all, not that you will remain unharmed. So not a great hypothesis, but an interesting one to ponder. And if you have nothing to lose (say the apocalypse has happened), you might give it a try.
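
    (Purely as a toy model of that branch-counting intuition, and emphatically not established physics: if you grant the premise that “you” wake up as a uniformly random pick among the instances of yourself that still exist after the scan, the effect of the suicide is easy to simulate. Everything below, including the 50/50 starting odds, is just that premise translated into Python.)

    import random

    def wake_as_copy(original_killed):
        # Premise (not physics): consciousness picks uniformly among
        # the instances of "you" that exist after the scan.
        instances = ["copy"] if original_killed else ["copy", "original"]
        return random.choice(instances) == "copy"

    TRIALS = 100000
    for killed in (False, True):
        hits = sum(wake_as_copy(killed) for _ in range(TRIALS))
        print("original killed right after scan = %s: P(experience the copy) ~ %.2f" % (killed, hits / float(TRIALS)))

    Under that premise the probability goes from about 0.5 to exactly 1.0, which is the whole argument; whether the premise itself means anything is what the rest of this thread argues about.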

    1. sheer_falacy says:

      What, exactly, is your “consciousness” in this scenario? How would someone be different if they didn’t have it (a “p-zombie”)?

      And why would this “consciousness” move to the ARK after you killed yourself, when the ARK has basically nothing in common with your body? What happens to whoever was there before your “consciousness” took over (or is the simulation a p-zombie?)

      On the other hand, I suppose that the people arguing for it in the comments suggests that yes, people would actually do it, despite the absurdity.

      1. Daemian Lucifer says:

        What, exactly, is your “consciousness” in this scenario?

        It’s you.

        How would someone be different if they didn’t have it (a “p-zombie”)?

        They wouldn’t be conscious.

        And why would this “consciousness” move to the ARK after you killed yourself, when the ARK has basically nothing in common with your body?

        It has one thing in common: the brain, which is the storage for the consciousness. And it wouldn’t move there after you killed yourself, but after you got copied. You killing yourself merely increases the chances of it “picking” the copy over the original.

        What happens to whoever was there before your “consciousness” took over (or is the simulation a p-zombie?)

        No one was there. Ok, here’s the deal: the whole hypothesis hinges on the many-universe proposition, in which every particle exists in its own universe, where every other particle is not really there, but is rather a reflection of that particle from its own universe. So for example, if two people talk to each other, consciousness A lives in universe A in body A and talks to a virtual body B, which is a reflection of consciousness B, which lives in universe B in body B (and talks to a virtual body A).

        The idea is not any more absurd than the idea that there is just a single universe, existing in nothingness, beginning for no reason, going to an unknown ending, and having one state preferable to another just cuz.

        1. sheer_falacy says:

          “It’s me” doesn’t mean anything.

          Nor does “they wouldn’t be conscious”. What effect does that have? Is there a way you can identify if they’re conscious or not? If there isn’t a way, why assume that they are?

          The ARK does not contain a brain. The ARK is circuitry (and, apparently, structure goo). Regardless of any emulation going on, the ARK does not contain neurons, hormones, DNA, or anything biological at all. From the perspective of your consciousness, which exists… somewhere, why does the ARK have any resemblance to your brain at all?

          What happens if it picks the copy while the original is still alive? Does the copy suddenly gain the memories the original acquired in between when the scan was taken and when the copy was initialized? Does the original lose them? Is the original now a p-zombie, and once again does that have any detectable effect?

          If no one was there, then… where do consciousnesses come from in general? Are they just floating around til a baby is born and then they latch on and say “this brain is now my target”?

          And that’s, uh, a very odd many universe theory. How is a virtual particle different from a real particle? Is your consciousness the real particle? What about all those particles that aren’t consciousnesses, like, say, all the ones that make up a brain? Are those virtual everywhere? Why does the virtual particle have to be a reflection of a real particle elsewhere? And how does this many universe theory correspond with the consciousness theory?

          And there are plenty of theories that have multiple universes that don’t need to include consciousness as a separate immaterial thing. And the one real particle theory doesn’t explain where the universe came from, where it’s going, why it exists in nothingness, or what makes a state preferable to another.

          Also how do you quote people? That would make this easier.

          http://lesswrong.com/lw/p7/zombies_zombies/ has rather a lot more discussion of this as well. Though http://lesswrong.com/lw/pn/zombies_the_movie/ is shorter and more amusing.

          1. Daemian Lucifer says:

            Also how do you quote people? That would make this easier.

            Use the blockquote tag. So [blockquote][/blockquote], but with angle brackets.

            “It’s me” doesn’t mean anything.

            Nor does “they wouldn’t be conscious”. What effect does that have? Is there a way you can identify if they’re conscious or not? If there isn’t a way, why assume that they are?

            That’s a whole separate topic that philosophy has been pondering for millennia. From souls, to consciousness, to life forces (it’s scientific!). Whether it exists, whether it’s separate from your body, whether it can jump into another storage, whether it can exist without a storage, whether it has any physical impact,… If there is an answer, we haven’t figured it out yet.

            The ARK does not contain a brain.

            Yes it does. Brain =/= organic brain. As we have discussed in previous SOMA threads, simulating all the hormonal responses is a possibility for such an artificial storage.

            What happens if it picks the copy while the original is still alive?

            That’s the desired outcome for these people. Them killing themselves isn’t so that they would suddenly jump into the copy, but so that they would increase the chances (which would initially be 50/50) of them jumping into the copy at the time of its creation.

            Does the copy suddenly gain the memories the original acquired in between when the scan was taken and when the copy was initialized? Does the original lose them?

            No. Why would it? The transfer happens once the copy is made (just like in SOMA), not when one of the two is terminated.

            Is the original now a p-zombie, and once again does that have any detectable effect?

            Only for you.

            And that’s, uh, a very odd many universe theory. How is a virtual particle different from a real particle?

            For all physical intents and purposes, in no way. The only difference is that the universe the real particle inhabits exists only while the real particle exists, and is independent of every other particle.

            Is your consciousness the real particle?

            If it is, then what these guys are doing is producing results. If it’s not, then whatever they do is futile.

            What about all those particles that aren’t consciousnesses, like, say, all the ones that make up a brain? Are those virtual everywhere?

            Probably.

            Why does the virtual particle have to be a reflection of a real particle elsewhere?

            Because otherwise the universe wouldn’t be solely dependent on just one particle.

            And how does this many universe theory correspond with the consciousness theory?

            It proposes that consciousness exists as a tangible thing and thus exists in a universe of its own.

            And there are plenty of theories that have multiple universes that don’t need to include consciousness as a separate immaterial thing.

            Well yes, but in those cases you wouldn’t be able to influence where you wake up once the copy is being made. So these guys are betting their lives on the chance that they can do such a thing. Which isn’t a good bet in normal circumstances, but if the whole world is doomed, it’s not that bad a bet to make.

            1. sheer_falacy says:

              It just seems so silly to posit the existence of something that can’t be observed and has no effect on anything. That’s so much more complicated than just… not having it.

              And there are reasons to create the ARK besides wanting to personally live on (since you won’t). It prevents the extinction of the human species, sort of. It allows someone who is very similar to you (and who you would, therefore, probably like) to live for thousands of years in a paradise. I’d want that for a copy of me.

              1. Daemian Lucifer says:

                It just seems so silly to posit the existence of something that can’t be observed and has no effect on anything. That’s so much more complicated than just… not having it.

                I agree. But in this case it’s not something that can’t be observed and has no effect on anything. We know that consciousness (or self) can be observed and that it has an effect. What we don’t know is if that consciousness is separate from the body or not. It’s a useless question now, when all we have is just one body that’s tied to a consciousness for as long as the person is alive. But if we ever develop a brain scan thing like in SOMA, or an effective transporter, the question of “which is the original and which is the copy” will have tangible ramifications in the physical world.

                And there are reasons to create the ARK besides wanting to personally live on

                Of course. But in the case of this game we are talking about people who have selfish reasons to want themselves to be the ones who live on for thousands of years.

                1. sheer_falacy says:

                  I’ve asked how you can observe epiphenomenal consciousness and what effect it has and you haven’t really had an answer besides “it’s you”. There’s no test you can devise that can tell whether someone has epiphenomenal consciousness or not. What tangible ramifications would it have if we did develop the brain scan?

                  Also I wouldn’t call wanting to live for thousands of years selfish. That’s completely reasonable. But committing suicide to live for thousands of years is not the path to success.

                  1. Daemian Lucifer says:

                    I’ve asked how you can observe epiphenomenal consciousness and what effect it has and you haven’t really had an answer besides “it’s you”.

                    No, you’ve asked what consciousness is. Which isn’t really important, which is why my answer to that was so brief. The important thing (for this game, and this conversation) is whether it is separate from the body. If it is, then you can observe it and test it in ways similar to what the game presents (killing yourself being the extreme solution for an extreme situation). Of course, those tests don’t exist now, but if we ever find a way to duplicate a person, they will.

                    What tangible ramifications would it have if we did develop the brain scan?

                    Whether they are an actual person. How they should be treated. Whether the consciousness is separate from the body or not. Etc.

                    Also I wouldn’t call wanting to live for thousands of years selfish. That’s completely reasonable.

                    Not if that means shirking your duties and leaving all your colleagues to clean up behind you. Especially if said duties involve something like trying to save humanity from complete extinction.

                    1. sheer_falacy says:

                      Killing yourself isn’t a good test. What other tests exist?

                      If you have a test for “consciousness” and you don’t treat people as actual people if they fail it, then holy crap is that ever monstrous. Because they’re still a copy of the person and behave the same, which means they still feel. If you don’t think they do… too bad. Not thinking of other people as people is how some of the worst atrocities in history have been committed.

                      And wanting to live for thousands of years isn’t selfish. Killing yourself for immortality isn’t selfish either; it’s just incredibly stupid. Well, I guess it’s a little selfish: if you had consideration for others you’d kill yourself in a cleaner way. Actually, if you had consideration for others you wouldn’t kill yourself at all, because they were depending on you and may have cared about you.

                    2. Daemian Lucifer says:

                      Killing yourself isn’t a good test. What other tests exist?

                      If you have a test for “consciousness” and you don’t treat people as actual people if they fail it, then holy crap is that ever monstrous.

                      Two things. First:
                      Thing is, you keep talking about external tests, while I keep talking about internal tests. In the situation I’ve initially described, and the situation of SOMA, there is only one consciousness from your point of view: your own. It’s where this consciousness resides that’s the question, not where anyone else’s consciousness resides. To you, if anyone else duplicates themselves a hundred times, they are all the same person, duplicated a hundred times. But if you duplicate yourself even once, you will notice the difference, because you will experience just one of the two points of view. And there will be two universes, one where you will experience yourself from the original, one where you will experience yourself from the copy. All of the entities in question are undeniably people, and should be treated as such, but only one entity is you, and can be treated as you (because you cannot treat yourself in any different way).

                      The argument for suicide comes from the idea that once you kill yourself, the universe where you would experience yourself from the original would cease to be, so you maximize the chances of you experiencing yourself from the point of view of the copy.

                      Of course, this is all simplified, because at any given moment there aren’t just two universes being created, but billions, quadrillions, of them. And we are going into the realm of “if a tree falls in the woods, but no one is there to hear it”. Because if there is no consciousness to experience the universe, does it exist?

                      Second:
                      The treatment of brain copies isn’t all about how consciousness works. The game does hint at this a bit. You “summon” a simulation into existence, and just as it starts experiencing the world around it, you terminate it. Is that a monstrous thing to do? Once you summon it, should you leave it on? Making it aware that it’s a simulation (a pretty limited one) would cause it great torment, far greater than returning it to oblivion. So which of the two actions is the bad one?

                    3. sheer_falacy says:

                      If it’s a purely internal test, then why are you making assertions about whether anyone else has a consciousness? It’s purely internal; the only one who can tell if they have one is them. And the simulation will believe it has a consciousness because it’s a perfect copy (somehow).

                      If you duplicate yourself, each of them will experience their own point of view.

                      And yes, things exist even when you don’t see them. That’s object permanence. Everyone tends to learn it at a very young age.

                      And I think murdering someone is considerably more cruel than telling them the truth that they’re a simulation. I know I’d rather learn I was a simulation than die.

                    4. Daemian Lucifer says:

                      If it’s a purely internal test, then why are you making assertions about whether anyone else has a consciousness?

                      I’m not. Everyone does have a consciousness. That’s not the question I raised originally. The question is whether those consciousnesses directly interact. If consciousness is separate from your body, then it will always live in its own universe, and everyone you interact with will be just reflections of everyone else’s “true self” from their own universes. If this is true, committing suicide and leaving all your friends to clean the mess is not a big deal. If not true, then it is a very selfish thing to do.

                      If you duplicate yourself, each of them will experience their own point of view.

                      Yes. But the question is if that can be influenced. If your consciousness can be independent from your body, then you can influence which of the duplicated bodies you will perceive things from. If it’s not independent, you have no influence.

                      And yes, things exist even when you don’t see them. That’s object permanence. Everyone tends to learn it at a very young age.

                      Saying that is the only reasonable explanation is false. Sure, if you don’t see a tree, and it falls down, later when you see it it will be down. But here, you are still observing the phenomenon. The question deals with phenomena that are NEVER observed. Not when they happen, not an hour later, not a trillion years later. And not just the phenomenon itself, but all the other phenomena that it interacted with.

                      And I think murdering someone is considerably more cruel than telling them the truth that they’re a simulation. I know I’d rather learn I was a simulation than die.

                      Even if said simulation meant that you’ll be confined to a single blank room for a few years, with nothing happening because there isn’t enough memory to simulate anything other than a single person? To me, that’s a fate worse than death.

                    5. sheer_falacy says:

                      “If consciousness is separate from your own body” is the thing I have an issue with. There’s no evidence of that. We can observe the human brain and see thoughts in it. If it’s damaged, the thoughts change. If it’s affected by drugs or hormones, the thoughts change. The entire system is self consistent if there’s nothing but a brain there. So why do you posit a separate consciousness? Why imagine this thing that can’t be seen, that doesn’t interact with the world, that can’t be defined beyond “it’s you”? It doesn’t work with physics, chemistry, or biology. It’s purely a construct of philosophy or religion, and an unnecessary one.

                      If everyone has a consciousness, why don’t the clones? If you’ve defined “everyone” as “everyone who has a consciousness” then, well, No True Scotsman.

                      Really it actually sounds like you already believe you’re in a simulation. And that you’re the only one there, and everyone else is just a reflection of their real selves in a different simulation. That’s terrifying. It’s actually genuinely terrifying – you have dehumanized every person you interact with, preemptively. They aren’t real. They’re just virtual projections. And if they aren’t real, then it doesn’t matter what you do to them.

                      Maybe it’s all just a philosophical argument and you don’t apply it to your real life. I hope that if someone took a brain scan of you that you wouldn’t kill yourself. I hope that you never try to kill yourself to prove quantum immortality. I really, really hope that this is just an abstract argument and not something that affects your actions in everyday life.

                      As for leaving someone in a blank room for eternity: you have created a person. You have a responsibility to them. If you don’t have a better answer for them than death or solitary confinement, then you shouldn’t have created them in the first place. But at least solitary confinement leaves the possibility of release. It’s why many places in the world use life sentences over death sentences.

                    6. Daemian Lucifer says:

                      There’s no evidence of that.

                      You mean how there was no evidence for special relativity decades before the experiments were made? How there is no evidence for black holes? How atoms were a speculation for hundreds of years before we could even conceive of an experiment to test for them?

                      Just because we don’t have evidence for a thing NOW does not mean that we won’t have evidence for it LATER. The question of whether consciousness is separate from the body is meaningless now, but like I’ve said, if we ever find a way to duplicate someone, whether it be a brain scan or teleportation, it will be relevant then.

                      Besides, such and similar questions are fun to ponder. That’s why philosophy was developed in the first place.

                      If everyone has a consciousness, why don’t the clones?

                      Who said they don’t?

                      Really it actually sounds like you already believe you’re in a simulation.

                      Stop, stop, stop, please stop. We had this thing here recently, and it ended poorly, so don’t do that. Don’t go assuming stuff about people just because you are misinterpreting their words. Read again what I was saying. From the very start I was talking about THIS GAME and what characters in THIS GAME think. Never once have I said that I agree or disagree with them, just that I understand their reasoning. Whether false or true, it’s still the reasoning of characters in THIS GAME, not mine. Please, don’t project that on me; not a good idea.

                      As for leaving someone in a blank room for eternity: you have created a person. You have a responsibility to them. If you don’t have a better answer for them than death or solitary confinement, then you shouldn’t have created them in the first place. But at least solitary confinement leaves the possibility of release.

                      We aren’t talking about a random case of creating a person for the lulz. We are talking about a specific case that happens here in this game, where there are NO means of having it better for the simulation, and where creating that simulation was the only way to save others. It’s a crappy situation, yes, but that’s how it is. And in this situation, erasing that person IS preferable to leaving it on for the next few years until the energy runs out (depending on how long the WAU remains on, that is). So yes, life is not preferable to death ALL the time.

            2. Kaspar says:

              That’s the desired outcome for these people. Them killing themselves isn’t so that they would suddenly jump into the copy, but so that they would increase the chances (which would initially be 50/50) of them jumping into the copy at the time of its creation.

              That there is pure nonsense. They should just use destructive mind uploading instead. (Or even just kill the meatbag human right afterwards, before they wake up from the procedure.)

              1. Daemian Lucifer says:

                True. But for that to work, the people doing the uploading would have to be the ones who share that idea. What we have here is that the people who did the upload did not want to kill those who were being uploaded.

  9. Shamus just condemned the future to having computer towers.

    The last time he mocked a game for showing something that was oversized was in Assassin’s Creed 2. One of the technicians had headphones that weren’t earbuds, Shamus said that was silly, and soon afterwards Beats By Dre became a thing.

    1. Galad says:

      Earbuds can be awfully uncomfortable, though, and are not likely to isolate you from outside sounds. I wish over-marketed crap like Beats were not successful. Towers would probably only be the choice for people who want to feel like they have something fancy, not a practical choice.

  10. Daemian Lucifer says:

    Getting information from that guy, and the Groundhog Day scenario, can definitely be evil. What if every day Bill Murray creates a new universe that continues existing even after he is sent back to the beginning of the day? He has created a huge mess of problems for a bunch of people in plenty of those universes. It’s much worse than creating and then killing this guy over and over. At least we know that he has stopped existing once we stop the interrogation. Yes, it is kind of heartless that we create and kill this guy over and over again, but he does stop suffering every time the simulation is stopped. Still kind of evil, but not as much as Groundhog Day can be.

    1. Fun fact: The original script for Groundhog Day had the time loop being caused by some ancient goddess that Murray had somehow ticked off. Wisely, that was cut and the script leaves the origin of the looping up to the viewer’s imagination.

      1. Shoeboxjeddy says:

        They could have made that work (less well than the version we got, but still) if Murray treated the God idea with as much reverence as he did in Ghostbusters. Which is to say, none whatsoever.

  11. Jimmy McAwesome says:

    On the subject of not missing your random sensations like itching and stuff: it would probably be like Unbreakable, where you just wouldn’t notice there’s anything wrong. From a psychological standpoint, people typically only notice the addition of things, not the lack of things. There would be nothing to trigger you to think about how your nose hasn’t itched in a while, so you probably would never think about it. Probably the only reason you think about itchy noses now is that you had one recently, or watched someone else scratch theirs.

    Interestingly enough, games, movies, and TV shows use this when they make changes. It’s easy to take something away in sequels, and sure, there’s typically a bit of outrage at first when they announce it, but it shortly goes away, because you typically don’t think about how Shoot Mans 7 is missing kill-cams when you don’t see any.

  12. Malky says:

    Ah, yet another part of the game I enjoyed just sitting back and thinking about! The question of the continuity of self, and how the go-to solution for some people here was suicide immediately after the scan.

    I saw people talking about this earlier, with quantum stuff and universes, but when I was watching Soma, I kinda just thought about the sorta simpler stuff, like what constitutes a continuous identity and what doesn’t. Like, is it just memory? Is it something to do with our body? Or something else? Considering the particles and atoms that make us up are constantly being replaced, it can’t be that having the same bodies our whole life makes up our consciousness. And people don’t really like saying memory alone forms our identity, because it’s possible for people to lose their memories. Not even straight-out amnesia; we constantly forget things, all the time. Do we really want to say that we’re a different person every time we forget stuff we did as kids? And speaking of being kids, it can’t even be that our personality is our continuity, because the way we act and think as kids is not at all the same as the way we act and think as adults.

    Still, we all seem to kinda know what personal identity is. Like, if Person A and Person B were told that they were gonna swap bodies, and then Body A would be tortured, then who would be worried? I think generally people say that Person B would, because even if the body initially belonged to A, it’s still B who will go through the experience. And similarly, if A was told that they would have their memory wiped and then they would be tortured, wouldn’t A be worried? They might not remember anything about their self, but in the end, it’s still them experiencing the torture.

    So, onto the topic of replication, which is what the scientists in Soma are thinking about. If you’re cloned, then the other guy is you entirely, and at that very moment of cloning, you two are the same. You might be considered “one person,” even if you have two bodies. But then time happens, and shit happens, and you might go your separate ways and have a bunch of different experiences, and then you can’t really be said to be the same person anymore, since we recognize that our unique experience of ourselves is part of what makes up our identity, and now that clone is a separate creature. So if you’re replicating yourself in order to actually straight-out preserve your “self,” then how do you make sure your replication will keep being you? For some people, the answer was “suicide,” so that the divergence never occurs. I guess you could say it’s so that what happens at the end of the game doesn’t happen. (Or, possibly, depending on your choice, what happens in the middle of the game doesn’t happen.)

    I’m sorry if I’m rehashing what people have said earlier, but I really do want to get across that there was actual thought put behind this sort of thinking, and it wasn’t just desperation or a “cult” (though perhaps it was “cult-ish”). Philosophy is just kinda weird.
