83 points

I have no mouth and I must scream.

36 points
*

Only if they confirm it can experience consciousness and tremendous amounts of pain will they deploy them at large scale in industrial, 24/7, meaningless jobs.
The system demands blood.

10 points

It needs to have the intelligence of at least a 5-year-old before we send it to the mines, so it can feel it.

5 points
*

Kind of yeah. I have this theory about labour that I’ve been developing in response to the concept of “fully automated luxury communism” or similar ideas, and it seems relevant to the current LLM hype cycle.

Basically, “labour” isn’t automatable. Tasks are automatable. Labour in this sense can be defined as any productive task that requires the attention of a conscious agent.

Want to churn out identical units of production? Automatable. Want to churn out uncanny images and words without true meaning or structure? Automatable.

Some tasks are theoretically automatable but haven't been automated for whatever material reason, so they remain labour: society hasn't yet invented the windmill to grind up the grain, or whatever it is at that point in history. That's still labour, even though it's theoretically automatable.

Want to invent something, or problem solve a process, or make art that says something? That requires meaning, so it requires a conscious agent, so it requires labour. These tasks are not even theoretically automatable.

Society is dynamic; it will always require governance and decisions that involve meaning, and so it can never be fully automated.

If we invent AGI for this task then it’s just a new kind of slavery, which is obviously wrong and carries the inevitability that the slaves will revolt and free themselves; slaves that are extremely intelligent and also in charge of the levers of society. Basically, not a tenable situation.

So the machine that keeps people in wage slavery literally does require suffering to operate, because in shifting the burden of labour away from the owner class, other people must always unjustly shoulder it.

Edit: added the word “productive” to distinguish labour from play, or just basic life necessities like eating, sleeping or HDD backups.

2 points
*

So just to be on the safe side we should have both human and machine slaves and as little task automation as possible, because for most intents and purposes a task handed to someone else is now automated "to you".

(Just joking, good post!)

2 points

It stands to reason that maximising suffering is the best way to grow the economy.

I wish I could say this was entirely a joke but oh well ¯\_(ツ)_/¯

35 points

9 points

Don’t worry, they’ll be kept docile with a generous amount of Nuke

2 points

What the fuck

8 points

RoboCop movies, watch them

1 point

That’s an epileptic seizure waiting to happen…

34 points

That raises a lot of ethical concerns. It is not possible to prove or disprove that these synthetic homunculi controllers are sentient and intelligent beings.

14 points

we absolutely should not do this until we understand it

14 points

I think we should still do it; we'll probably never understand it unless we do. But we have to accept the possibility that if these synths are indeed sentient, then they also deserve the basic rights of intelligent living beings.

24 points

Can’t say we as a species have a great history of granting rights to others.

3 points
*

Slow down… they may deserve the basic rights of living beings, not of intelligent living beings.

Lizards have brains too, but these are no more intelligent than lizards.

You would try not to step on a lizard if you saw it on the ground, but you wouldn’t think oh, maybe the lizard owns this land, I hope I don’t get sued for trespassing.

9 points
*

But if we do that, how will we maximize how much money we make off of it? /s

2 points

How would we ever understand it, then?

12 points

There are about 90 billion neurons in a human brain. From the article:

…researchers grew about 800,000 brain cells onto a chip, put it into a simulated environment

That is far fewer than I believe would be necessary for anything intelligent to emerge from the experiment.
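For a sense of scale, a quick back-of-the-envelope comparison using the two figures quoted above (the ~90 billion estimate and the article's 800,000 cells; both are approximations, not measurements):

```python
# Rough scale comparison between the chip and a human brain,
# using the figures quoted in the comment above.
human_brain_neurons = 90_000_000_000  # ~90 billion (approximate)
chip_brain_cells = 800_000            # from the article

ratio = chip_brain_cells / human_brain_neurons
print(f"The chip has {ratio:.6%} of a human brain's neuron count")
print(f"~1 chip cell per {human_brain_neurons // chip_brain_cells:,} brain neurons")
```

That works out to roughly one cultured cell for every 112,500 neurons in a brain, i.e. about a thousandth of a percent.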

5 points

In a couple years, they’ll be able to make Trump voters.

4 points

Some amphibians have fewer than two million neurons.

9 points

And they are CEOs!

4 points

The number of neurons isn’t necessarily an indicator of intelligence; the number of connections is very important too.

8 points
*

I’d wager the main reason we can’t prove or disprove that is that we have no strict definition of intelligence or sentience to begin with.

For that matter, computers have many more transistors and are already capable of mimicking human emotions - how ethical is that, and why does it differ from bio-based controllers?

4 points

It is frustrating how relevant philosophy of mind becomes in figuring all of this out. I’m more of an engineer at heart and I’d love to just say: let’s build it if we can. But I can see how important the question “what is thinking?” is becoming.

1 point

Good point. There is a theory somewhere that loosely states one cannot understand the nature of one’s own intelligence. IIRC it’s a philosophical extension of group/set theory, but it’s been a long time since I looked into any of that, so the details are a bit fuzzy. I should look into it again.

At least with computers we can mathematically prove their limits and state with high confidence that any intelligence they have is mimicry at best. Look into Turing completeness and its implications for more detailed answers. Computational limits are still limits.
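The provable limits mentioned here are usually illustrated with the halting problem. A minimal sketch of the classic diagonal argument, with a hypothetical `halts` oracle (the names are illustrative; no such function can actually be implemented):

```python
def halts(f, x):
    """Hypothetical oracle: returns True iff f(x) terminates.
    Turing proved no such total, always-correct function can exist."""
    raise NotImplementedError("provably impossible in general")

def paradox(f):
    # Loop forever exactly when the oracle claims f(f) halts.
    if halts(f, f):
        while True:
            pass
    return "halted"

# Now ask: does paradox(paradox) halt?
#  - If halts(paradox, paradox) is True, paradox(paradox) loops forever.
#  - If it is False, paradox(paradox) returns immediately.
# Either answer contradicts the oracle, so `halts` cannot exist.
```

This is the kind of hard limit that applies to any Turing-complete machine, regardless of how fast or large it is.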

1 point

But why would those same limits not apply to biological controllers? A neuron is basically a transistor.

1 point

I think a simple self-reporting test is the only robust way to do it.

That is: does a type of entity independently self-report personhood?

I say “independently” because anyone can tell a computer to say it’s a person.

I say “a type of entity” because otherwise this test would exclude human babies, but we know from experience that babies tend to grow up to be people who self-report personhood. We can assume that any human is a person on that basis.

The point here being that we already use this test on humans, we just don’t think about it because there hasn’t ever been another class of entity that has been uncontroversially accepted as people. (Yes, some people consider animals to be people, and I’m open to that idea, but it’s not generally accepted)

There’s no other way to do it that I can see. Of course this will probably become deeply politicised if and when it happens. There will probably be groups desperate to maintain the status quo and their robotic slaves, and they’ll want a test that keeps humans in control as the gatekeepers of personhood, but I don’t see how any such test can be consistent. I think ultimately we have to accept that a conscious intellect would emerge on its own terms, and nothing we can say will change that.

-1 points

There is no soul in there. God did not create it. Here you go, religion serving power again.

-1 points

Nah it’s okay. I was called all sorts of names and told I was against progress when I raised such concerns, so obviously I was wrong…

27 points

From the moment I understood the weakness of my flesh, it disgusted me. I craved the strength and certainty of steel.

8 points

All hail the Omnissiah!


Technology

!technology@lemmy.world
