cross-posted from: https://lemmy.ml/post/14869314

“I want to live forever in AI”

2 points

It’s an analogy. There is actually an academic joke about the point you are making.

A mathematician and an engineer are sitting at a table drinking when a very beautiful woman walks in and sits down at the bar.

The mathematician sighs. “I’d like to talk to her, but first I have to cover half the distance between where we are and where she is, then half of the distance that remains, then half of that distance, and so on. The series is infinite. There’ll always be some finite distance between us.”

The engineer gets up and starts walking. “Ah, well, I figure I can get close enough for all practical purposes.”

The point of the analogy isn’t whether you can get close enough that the ear can’t detect a difference; it’s that, in theory, analog carries infinite information. It’s true that vinyl records are not perfect analog systems because of physical limitations in the cutting process, and the same goes for magnetic tape and so on. But don’t mistake the metaphor for the idea.

Ionic movement across membranes, especially at the scale we’re talking about, and the density of channels in the system put it much closer to an ideal analog system. How much of that fidelity can you lose before it’s not your consciousness?

0 points

"I’d like to talk to her, but first I have to cover half the distance between where we are and where she is, then half of the distance that remains, then half of that distance, and so on. The series is infinite. "

I get it’s a joke, but it’s a bad joke. That’s a convergent series: it has infinitely many terms, but the sum is finite. Any first-year calculus student would know that.
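
Spelling that out: the distances form a geometric series with infinitely many terms, but the partial sums converge, so the total is finite.

```latex
% Zeno-style halving: infinitely many terms, finite sum.
\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1
```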

"it’s that in theory analog carries infinite information. "

But in reality it can’t. The universe isn’t continous, it’s discrete. That’s why we have quantum mechanics. It is the math to handle non contiguous transitions between states.

"How much of that fidelity can you lose before it’s not your consciousness?"

That can be tested with *C. elegans*: you can keep introducing changes and measure at what point a difference propagates.
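
As a rough sketch of what that kind of measurement could look like (everything here is hypothetical: the toy trace, the quantization scheme, and the spike-count "behavior" are stand-ins, not a real *C. elegans* pipeline), you could degrade the resolution of a recorded signal step by step and check when a downstream readout starts to differ:

```python
# Hypothetical sketch: how much resolution can you discard before the
# "behavior" of a toy membrane-potential trace changes? All values are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "analog" membrane-potential trace (mV): noisy sinusoid around -50 mV.
t = np.linspace(0.0, 1.0, 10_000)
trace = 20 * np.sin(2 * np.pi * 5 * t) + rng.normal(0.0, 1.0, t.size) - 50

def quantize(signal, bits):
    """Snap the signal onto a grid with 2**bits levels over its range."""
    lo, hi = signal.min(), signal.max()
    step = (hi - lo) / (2 ** bits - 1)
    return lo + np.round((signal - lo) / step) * step

def spike_count(signal, threshold=-40.0):
    """Toy downstream 'behavior': count upward crossings of a firing threshold."""
    above = signal > threshold
    return int(np.sum(~above[:-1] & above[1:]))

reference = spike_count(trace)
for bits in (12, 8, 6, 4, 3, 2, 1):
    print(f"{bits:2d} bits -> {spike_count(quantize(trace, bits))} spikes "
          f"(reference: {reference})")
```

The bit depth at which the spike count first deviates from the reference gives a crude, operational version of "a difference is propagated".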

2 points

Measure differences in what? We can’t ask *C. elegans* about its state of mind, let alone its consciousness. There are several issues here: a philosophical issue about what you are modeling (e.g. mind, consciousness, or something else), a biological issue about which physical parameters and states you need to capture to produce that model, and the question of how you would test the fidelity of that model against the original organism. The scope of these issues is well outside a reply chain on Lemmy.

