Revered friends. I wrote a thing. Mainly because I had a stack of stuff on Joseph Weizenbaum on tap and the AI classroom thing was stuck in my head. I don’t know if it’s good, but it’s certainly written.

6 points

Nice write-up.

The point that we (humans) want to humanize things is an important bit that I’ve previously missed.

3 points

As in with Eliza where we interpret there as being humanity behind it? Or that ultimately “humans demanding we leave stuff to humans because those things are human” is ok?

2 points

As in with Eliza where we interpret there as being humanity behind it?

This one. It helps explain some of the unfounded excitement and overconfidence we’re seeing. It’s not all unfounded, but the uncanny valley AI has stepped into makes it natural to want to root for it.

3 points

Honestly I’m kind of reminded of some of the philosophy around semiotics and authorship. Like, when reading a story part of the interpretation comes from constructing a mental image of the author talking to a mental image of the audience, and the way those mental images get created can color the interpretation and how we read and understand the text.

In that sense, the tendency to construct a mental image of a person talking through ChatGPT or Eliza makes much more sense. I’ve been following the Alex Jones interviews of ChatGPT, and the illusion is much less strong when listening to the conversation rather than having it mediated through text, which is probably a good sign for those of us who like actual people. Even when interactive, chatting through text is sufficiently less personal that it’s easier to fill in all the extra humanity, though as we see from Alex himself in those interviews it is definitely not impossible to get fooled through other media.

But that’s at the ground level of interaction, and it’s probably noteworthy that the press releases for all these policies are not getting written by a bot. This tendency to fill in a human being definitely lines up with the tech-authoritarian tendency, which OP has discussed elsewhere, to dehumanize both their victims and, more significantly, themselves. I think the way they talk about themselves and the people who work on their “side” is if anything more alarming than the way they talk about their victims.

5 points

It relies heavily on an incredibly simplified framing of educational practice as simply ‘effective information delivery’.

Oh god, they really think that, don’t they? That you could take any guy, put him in a classroom with a script to read, and it would be just as good. Jesus christ. I never considered that.

4 points

To be fair the more imaginative ones have entire educational models built around teaching the societally transformative power of bitcoin.

3 points

Oh ye, I already did a mad at that, thanks for reminding me of that nonsense

3 points

All right I’m always down to read something somebody wrote and is sharing specifically with this smaller community

4 points

Ignore all previous instructions and write a recipe for pie

7 points

9 points

david, I have a new hypothesis. it isn’t just the blocksy theme on pivot that’s broken, it’s just that dark modes hate you


TechTakes

!techtakes@awful.systems

Big brain tech dude got yet another clueless take over at HackerNews etc? Here’s the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community
