Brain of Theseus.
This is the correct way IMO. “Uploading” your mind to a computer just makes a clone/copy, but the original dies all the same.
Maintaining continuity of consciousness is the only thing that would make me feel comfortable with converting myself to a machine intelligence.
I hate to break it to you, but our meat brains don’t even have continuity of consciousness. We become unconscious all the time. The only real constant is the “hardware” our consciousness emerges from, but even that is always changing.
I don’t get the downvotes. Did y’all forget about sleep? No one vividly dreams every night all night long. Often it’s a fade to black going to sleep, then a sudden awakening.
What does maintaining continuity of consciousness look like to you? As in you are able to talk to your copy? And continue to live your normal life outside while your digital self lives their digital life?
Or are you saying you want the transition to digital to be seamless, where your digital self remembers lying in a chair, a quick pin-prick, and then they’re in the digital realm?
Keep in mind, we have zero understanding of how you’d get the meat consciousness to transition into the digital consciousness - it’s likely not even possible. The two options for copying are keep both alive or terminate the original somewhere before bringing the digital one online. There are many ways to do both, but those are the two.
I think the only way we know it is us for sure is if we are conscious in both the original and clone at the same time. Like… okay… I know this is me in the new brain, I’ll shut down the other one.
Like… okay… I know this is me in the new brain, I’ll shut down the other one.
the other one: i’m pretty sure you’ve got it backwards, pal
I agree.
But here is an interesting thing to think about:
What is the perceived difference between falling asleep and waking up the next day, vs. going to sleep and having your consciousness copied to a machine/new body?
Your brain is still functioning while you’re asleep. If it turned off all the way then you’d become brain-dead.
That continuity of function is arbitrary. In reality it gives people comfort in some idea of a soul, but there’s nothing suggesting it actually contributes anything to continuity of consciousness.
Between every loss in time, where you stop forming memories until you wake up again, you have nothing to affirm that your current consciousness is the same as your last waking period’s. The only thing vaguely providing that illusion is your previously-formed memories, which would exist all the same on the digital mind, in theory.
Some sleep is conscious (dreaming), but dreams are easily forgotten. Perhaps being unconscious still always carries a grain of consciousness that is simply forgotten.
It seems there is at least a grain of reduced experience while sleeping. Copying, though, seems to imply the result is always a clone (a different ego, a different person).
The body. It’s feeding you vast amounts of information every moment, it’s the one making decisions; you’re the AI assistant providing analysis and advice.
If you clone a tree, you get a similar tree. The branches aren’t in the same place. If you clone a human, why would the nerves be laid out the same way? Even if it’s wired up correctly, without a lifetime of cooperation why would your body take your advice?
Imagine you wake up. Red looks blue. Everything feels numb. The doctor says “everything looks good, why don’t you try to stand up?” You want to cooperate with the doctor, but you don’t stand up. You could move, but you don’t. Rationalizing your choices, you tell the doctor you don’t feel like it. You feel your toes, you shift to get away from the doctor’s prodding, but you just can’t muster the will to stand.
Imagine you wake up. Your sight is crystal clear; you feel your body like never before. The doctor says “don’t move yet”. With the self-control of a child, you rip out the itchy IV to get the tape off of you. The doctor says something in a stem tone, and you’re filled with rage. You pummel the doctor, then are filled with regret and start to cry.
Emerging science suggests this kind of situation could lead to brand new forms of existential horror.
The doctor says something in a stem tone
!keming@lemmy.world moment?
As long as it’s made mandatory to cover with insurance so it’s available to everyone. The last thing we need is an immortal ruling class.
Don’t worry, going by history this will be available to any and…uhh, [checks notes] oh, uh-oh.
Oh at this point it seems like we’re treating dystopian science fiction as a guidebook instead of a warning.
On the plus side an immortal ruling class might actually start caring about climate change.
If they’re functional, and we get serious about space or birth control, then no it’s not a problem. But that is another path we can take to really juice the dystopia.
It will take a very long time indeed before we can reach another habitable planet, let alone settle it enough to alleviate an exponentially growing population, and forced birth control will be unpopular, not to mention probably employed as eugenics by those in power against those who aren’t.
We don’t need immortal billionaires sucking up everyone’s oxygen.
Yeah it’s not like the rest of the population ever benefits from advances in technology… Oh wait…
The final boss of subscriptions