69 points

I hate everything about this: the lack of transparency, the lack of communication, the chaotic back and forth. We don’t know if the company is now in a better position or a worse one.

I know it leaves me feeling pretty sick and distrustful, considering the importance and potentially extreme disruptiveness of AI in the coming years.

14 points

Same here. I like Sam Altman but if the board removed him for a good reason and he was reinstated because the employees want payouts, humanity could be in big trouble.

13 points

Given the rumors that he was fired over undisclosed use of some foreign data-scraping company’s data, it ain’t looking good.

Now that there’s big money involved, screw ethics. We don’t care how the training data was acquired.

5 points

> Now that there’s big money involved, screw ethics. We don’t care how the training data was acquired.

I wouldn’t care about the ethics here if the money were excluded as well.

If they actually lived up to the goals they set for themselves, that would be fine.

But it’s similar to Google back in the day, with “don’t be evil”.

4 points

Can I find out more about these rumors somewhere?

2 points

That’s not how rumors work.

2 points

I’ve tried, but I can’t seem to find it. There was a thread on Lemmy somewhere that linked to a thread on Blind, where someone claiming to work at OpenAI said they had heard that from the board.

Ultimately it’s just rumors; we don’t know for sure. But it’s at least fairly plausible, and it’s the kind of thing I’d expect the board of a very successful AI company to fire the CEO over, since the company is obviously doing really well right now.

5 points

I actually like the chaos, because I don’t like having one small group of people as the self-appointed and de facto gatekeepers of AI for everyone else. This makes it clear to everyone why it’s important to control your own AI resources.

3 points

Accelerationism is human sacrifice. It only works if it does damage… and most of the time, it only does damage.

3 points

Not wanting a small group of self-appointed gatekeepers is not the same as accelerationism.

1 point

I’m with you there; I just hope the general public comes to that realization.

5 points

Just like it did with climate change?


On the one hand, the board was an insane cult of effective altruism / longtermism / LessWrong, so fuck them. But on the other hand, this was a worker revolt for the capitalists, which I guess shouldn’t be surprising since tech workers famously lack class consciousness.

12 points

> an insane cult of effective altruism / longtermism / LessWrong

I’m out of the loop. What’s the problem with those things?

13 points

It’s basically the paperclip maximizer combined with human arrogance/hubris. Just skim the criticism sections of the articles linked.


People are asking what is wrong with these cults. It’s a lot to cover so I won’t try. People who follow the podcasts Tech Won’t Save Us or This Machine Kills will already be familiar with them. Here’s an article relevant to the moment that talks about them a little: Pivot to AI: Replacing Sam Altman with a very small shell script

6 points

Genuinely confused by your first statement (in particular effective altruism). What does that have to do with the board?

Not an attack, just actually clueless.

1 point

Several of the [former] board members are affiliated with the movement. EA is concerned with existential risk, and AI is perceived as a big one. OpenAI’s nonprofit was founded with the intent of researching AI safely, and those members of the board still reflected that interest.

1 point
Deleted by creator
3 points

That’s what happens when the wealth is shared with those who make it. Everyone becomes a capitalist.

20 points

Actually, that’s just self-interest. Both capitalism and socialism claim to benefit workers, but only socialism has been shown to do that to any real extent. Capitalist hoarding and speculation are the primary drivers of inflation and things like the unaffordability of housing.

If you labor for a living, you aren’t a capitalist. You’re labor.

2 points

Nah. It’s more like the pusher man. Give them their first taste for free, and they’ll be a customer for life.

1 point

> famously lack class consciousness

How much money do you suppose the average OpenAI employee makes? What class do you imagine they’re part of?


I’m sure the developers make the lower half of six figures, but they still have to sell their labor to survive, so they’re still working class.

I’ve been an SF Bay Area software developer for almost thirty years, so I know them well. I consider us members of the professional–managerial class (PMC). We generally think we’re “above” the working class (we’re not), and so we seldom have any sense of solidarity with the rest of the working class (or even each other), and we think unionization is for those other people and not us.

When Hillary Clinton talked about the “basket of deplorables,” she was talking to her PMC donors & voters about the rest of the working class, and we eat that shit up. Most of my peers have still learned no lessons from her election defeat, preferring to blame debunked RussiaGate conspiracy theories.

46 points

I guess the entire workforce calling the board incompetent twats and threatening to quit was actually effective.

34 points

Sounds like they got together and forced their hand. Wonder if there’s a term for that?

27 points

Maybe some type of group or team. Or a union. Nah, that will never stick.

32 points

I guess this will have to do as entertainment until GRRM finishes his damn book.

9 points

Any day now! I have a friend who got hyped up every time George published another chapter from WoW, but I just refuse to read any of them. I want a complete book. I’m not sure he has any idea of how to finish his own story.

7 points

I didn’t know he wrote for World of Warcraft

3 points

I know you’re joking, but it stands for The Winds of Winter, in case anyone is confused.

29 points

Man, what a clusterfuck. Things still don’t really add up based on public info. I’m sure this will be the end of any real attempts at safeguards, but with the board acting the way it did, I don’t know that there would have been any even without him returning. You know the board fucked up hard when some SV tech bro looks like the good guy.

-2 points

I mean, at a glance the non-profit board appears to have fired the CEO over its paranoid, delusional belief that this LLM is somehow a real AGI and that we are already at the point of a thinking, learning AI.

Either that was delusions of grandeur on the board’s part, or they didn’t and still don’t understand what is really going on, which might be why they fired the CEO: for not truly informing the board what level OpenAI’s AI is actually at. So the board was trying to rein in a beast that is merely a puppy, working from information that was wrong.

12 points

Where are you getting this information?

1 point

As I used the word “appears”, I am speculating based on how the company is controlled (through the non-profit entity), as well as statements board members have made in the past. Take Ilya Sutskever (now ex-board?), whose thinking has likely been influenced by his mentor Geoffrey Hinton, who was quoted on 60 Minutes saying AI is about to be “more intelligent than us”. Beyond his scientific work in AI and his position as Chief Scientist of OpenAI, Ilya is known for some odd behavior around his commitment to AI safety, though I’m sure his beliefs come from the right place.

There’s a lot more to this, for each board member and for Sam, but it leads me to believe that a large information wall was erected, resulting in a paranoid board.

0 points

Really? I thought it was because he supposedly raped his younger sister.

2 points

Excuse me what

1 point

Could be, but words on Twitter and no lawsuit don’t really get you ejected from your CEO position. Imagine if CEOs got ejected for stuff like that; there’d be no CEOs left.

