I think SOMA made it pretty clear we're never uploading jack shit, at best we're making a copy for whom it'll feel as if they've been uploaded, but the original remains behind as well.
Programmer Humor
Post funny things about programming here! (Or just rant about your favourite programming language.)
A lot of people don't realize that a 'cut & paste' is actually a 'copy & delete'.
And guess what 'deleting' is in a consciousness upload?
I mean, if I die instantaneously and painlessly, and consciousness is seemingly continuous for the surviving copy, why would I care?
My consciousness might not continue, but I lose consciousness every day. Someone exists who is me and lives their (my) life. I totally understand people's aversion to death, but I also don't see any difference from falling asleep and waking up. You lose consciousness, then a person who's lived your life and is you regains consciousness. Idk
You make a good point. We all might be being copied and deleted in our sleep every night, for all we know.
There'd be no way to know anything even happened to you as long as your memory was copied over to the new address with the rest of you. It would be just a gap in time to us, like a dreamless sleep.
Most people don't like the idea of a suicide machine.
Yeah, and I completely understand that. Just from a logical perspective though, let's say the process happens after you fall asleep normally at night. If you can't tell it happened, does it matter? I've been really desensitized to the idea of dying through suicidal ideation throughout most of my life (much better now), so I'm able to look at it without the normal emotional aversion. If teleportation existed via this same method, I don't think I'd have qualms about at least trying it. I certainly wouldn't expect other people to, but to me it's not that big a deal. I wouldn't do a mind-upload scenario, though, more so due to a complete lack of trust in system maintenance and security, and a doubt that true consciousness can be achieved digitally. If it's flesh and blood to flesh and blood, though? I'd definitely try it.
It's the transporters all over again.
I wonder how you ever could "upload" a consciousness without Ship-of-Theseusing a Brain.
Cyberpunk 2077 also has this "upload vs. copy" issue, but doesn't actually make you think about it too hard.
That's more or less what I've always thought: to have a chance, you'd need a method where mental processing starts out shared between both, then transfers more and more to the inorganic platform until it's at 100% and the organic one isn't working anymore.
The animated series Pantheon has a scene depicting exactly this, and it's one of the most disturbing things I've ever seen.
Edit: Here is the scene in question. It's explained he has to be awake during the procedure because the remaining parts of his brain need to continue functioning in tandem with the parts that have already been scanned.
Interesting, but I would argue that's still a destructive copy process. "Old Man's War" did a good job with what I'm talking about. It was body to clone body, but the principle was similar: at the halfway point the person was experiencing existence in both bodies at once, seeing each body from the perspective of the other, until the transfer completed and they were in the new body and the old one slumped over.
Any sufficiently identical copy of me is me. A copy just means there are more me in the universe.
reproduction 101
That ending screwed with my mind. Existential horror at its finest!
I was just annoyed at the protagonist for expecting anything else. The exact same thing already happened to him twice (the initial copy at the beginning of the game, then the move to the other suit), and it's reinforced in the found notes for good measure. So by the ending, the player knows exactly what's going to happen, and so should the protagonist, but somehow he's surprised.
Ahh, but here's the question. Who are you? The you who did the upload, or the you that got uploaded, retaining the memories of everything you did before the upload? Go on, flip that coin.
If you are the version doing the upload, you're staying behind. The other "you" pops into existence feeling as if THEY are the original, so from their perspective, it's as if they won the coin flip.
But the original CANNOT win that coinflip...
But like.. do I care? "I" will survive, even if I'm not the one who does the surviving.
I can't speak for anyone else, but I would. The knowledge that "A" me is out there, somewhere, safe and sound, is uplifting, but it's still quite chilling to realize you are staying wherever the hell you are. At least we die after enough time has passed because our bodies decay.
reveal
The SOMA protagonist wasn't that lucky...
Is it chilling? I was already going to stay where I am, whether I made a copy or not. Sharding off a replica to go on for me would be strictly better than not doing that
which instance of Theseus's ship am I?
Implementation will be
bool uploadConsciousness() {
    // TODO
    return true;
}
It's still a surviving working copy. "I" go away and reboot every time I fall asleep.
Why would you want a simulation version? You'd get saved at "well rested," and then it's an infinite loop: put to work for several hours, then deleted. You wouldn't even experience that much; your consciousness is gone.
Joke's on them, I've never been "well rested" in my life or my digital afterlife.
If anyone's interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it's created and used by big tech companies is uncomfortably real.
The show got kind of screwed over on advertising and fell into obscurity because of streaming-service fuck-ups and region locking, and I can't help but wonder if that's at least partially because of its harsh criticisms of the tech industry.
Upload is also good.
Well yeah, if you passed a reference, then once the original is destroyed it would be null. The real trick is to make a copy and destroy the original reference at the same time; that way it never knows it wasn't the original.
I want Transmetropolitan style burning my body to create the energy to boot up the nanobot swarm that my consciousness was just uploaded to
I think you mean std::move
get your std away from me sir
I get this reference
There are many languages I would rather die than be written in
You see, with Effective Altruism, we'll destroy the world around us to serve a small cadre of ubermensch tech bros, who will then somehow in the next few centuries go into space and put supercomputers on other planets that run simulations of people. You might actually be in one of those simulations right now, so be grateful.
We are very smart and not just reckless, over-indulged douchebags who jerk off to the smell of our own farts.
Glad that isn't Rust code or the pass by value function wouldn't be very nice.
Borrow checker intensifies
In a language that has exceptions, there is no good reason to return bool here…
Result<_>
HRESULT
Literally the plot twist in...
spoiler
Soma
I think that really depends on the implementation details. For example, consider a thought experiment where artificial neurons can be created that behave just the same as biological ones. Then each of your neurons is replaced by an artificial version while you are still conscious. You wouldn't notice losing a single neuron at a time, in fact this regularly happens already. Yet, over time, all your biological neurons could be replaced by artificial ones at which point your consciousness will have migrated to a new substrate.
Alternatively, what if one of your hemispheres was replaced by an artificial one? What if an artificial hemisphere was added into the mix in addition to the two you have? What if a dozen artificial hemispheres were added, or a thousand? Would the two original biological ones still be the most relevant parts of you?
A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy
public static Consciousness Instance;
The Closest-Continuer schema is a theory of identity according to which identity through time is a function of appropriate weighted dimensions. A at time 1 and B at time 2 are the same just in case B is the closest continuer of A, according to a metric determined by continuity of the appropriate weighted dimensions.
I don't think that I fully agree with it but it's interesting to think about
So, I'm curious.
What do you think happens in the infinite loop that "runs you" moment to moment? Passing the same instance of consciousness to itself, over and over?
Consciousness isn't an instance. It isn't static, it's a constantly self-modifying waveform that remembers bits about its former self from moment to moment.
You can upload it without destroying the original if you can find a way for it to meaningfully interact with processing architecture and media that are digital in nature; and if you can do that without shutting you off. Here's the kinky part: We can already do this. You can make a device that takes a brain signal and stimulates a remote device; and you can stimulate a brain with a digital signal. Set it up for feedback in a manner similar to the ongoing continuous feedback of our neural structures and you have now extended yourself into a digital device in a meaningful way.
Then you just keep adding to that architecture gradually, and gradually peeling away redundant bits of the original brain hardware, until most or all of you is being kept alive in the digital device instead of the meat body. To you, it's continuous and it's still you on the other end. Tada, consciousness uploaded.
What if every part of my body were replaced by computer parts continuously? At what point do I lose my consciousness?
I think this question is hard to answer because not everyone agrees what consciousness even is.
It would be easier to record than to upload, since uploading requires at least a decode step. Given the fleeting nature of existence, how does one confirm the decoding? It also requires that we create a simulated brain, which seems more difficult and resource-intensive than forming a new biological brain remotely connected to your nervous system's inputs.
Recording all inputs in real time and playing them back across a blank nervous system would create an active copy. The inputs can be saved so they can be replayed later in case of clone failure. As long as the inputs are recorded until the moment of death, the copy will be you minus the death, so you wouldn't be aware you're a copy. Attach it to a fresh body and off you go.
The failure mode would take your literal lifetime to re-form your consciousness, but what's a couple of decades to an immortal?
We already have the program to create new brains: it's in our DNA. A true senior developer knows better than to try to replicate black-box code that's been executing fine. We don't even understand consciousness well enough to pretend we're going to add new features, so why waste the effort creating a parallel system for a black box?
Scheduled reboots of a black-box system are common practice. Why pretend we're capable of skipping steps?