The AI boom is screwing over Gen Z | ChatGPT is commandeering the mundane tasks that young employees have relied on to advance their careers. That’s going to crush Gen Z’s career path.

40 points

So what would that mean for the company itself long-term? If they’re not training up their employees, and most of the entry-level work is replaced by text-generator output, there would be a hole as executives and managers move out of the company.

It seems like a recipe for the company to implode after a few years/decades, assuming that the managerial/executive positions aren’t replaced as well.

49 points

What are these decades? Is that something longer than next quarter?

8 points

we are going to hold the month open a few more days

19 points

there would be a hole as executives and managers move out of the company.

And why would those executives and managers care about that? They just need to time their departures to be early enough that those holes don’t impact the share price. Welcome to modern capitalism, where the C-suite’s only goal is to make sure they deploy their golden parachute while the company still has enough cash left over to pay them.

4 points

Yeah, it should be obvious by now, after three decades of 1980s-MBA-style corporate management (and a financial crash that happened exactly along those lines), that “the bonus comes now, the problems come after I’ve moved on” measures will always get the go-ahead from the suits on the top floor.

7 points

That sounds like someone else’s problem.

3 points

It’s a Tragedy of the Commons situation: each market actor expects to get the benefits of automating away entry-level jobs, and expects it’s going to be somebody else who keeps training people through their junior years so that mid- and senior-level professionals are available in the job market.

Since most market actors have those expectations, and even those who don’t are pushed by competitive pressure to follow suit (paying for junior positions makes them less competitive than those who automate that work away), the tragedy will eventually ensue once that “field” has been overgrazed for long enough.

1 point

Why would you want to train people to do it wrong? If you had to train someone tomorrow would you show them the email client or give them a cart and have them deliver memos for a week?

Right now we’ve handed some of the more basic tasks over to machines. Train the next generation to treat that automation as a given.

0 points

It’s not the tasks themselves that matter, it’s the understanding of the basics, the implications of certain choices, and the real-life experience in things like “how long I thought it would take vs. how long it actually took” that comes with doing certain things from start to finish.

Some things can’t be learned theoretically; they have to be learned as painful real-life lessons.

So far there seems to be an ill-defined boundary between what AI can successfully do and what it can’t, and sadly you can’t really start teaching people past that point. It isn’t even a point, it’s an area where you already have to know enough to spot the AI-made stuff that won’t work, and the understanding there and beyond is built on a foundational understanding of how to use the simpler building blocks and what the implications of doing so are.

2 points

And?

If it makes you feel better, the alternative can be much worse:

People are promoted to their level of incompetence. Sure, she’s a terrible manager, but she was the best at sales and is the most senior. Let’s have her check that everyone filled out their expense reports instead of selling.

You don’t get the knowledge sharing that comes from people moving around. The rival spent a decade of painful trial and error to settle on a new approach, but you have no idea, so you’re going to reinvent that wheel.

People who do well on open-ended creative tasks never get the chance, because they failed to rise above repetitive procedural tasks. Getting started in the mailroom sounds romantic, but it’s maybe not the best place to learn tax law.

The tech, corporate, and general operational knowledge drifts further and further away from the rest of the industry. Eventually everyone is on ancient IT systems that sap (yes, pun intended) efficiency. Parts and software break that are hard to replace. And eventually the very systems that were meant to make things easier become burdens.

For us humans there really is no alternative to work, and thinking is the hardest work of all. You need to consistently reevaluate what the situation calls for, and any kind of rigid rule system for promotion and internal training won’t perform as well.

4 points

I think those in charge often don’t care. A lot of them don’t actually have any incentive for long-term performance. They just need short/medium-term stock performance, and later they can sell. Heck, they’ll even get cash bonuses based solely on short-term performance. Many C-suite execs aren’t in it for the long haul. They’ll stay for maybe 5-10 years tops and then switch jobs, possibly when they see the writing on the wall.

Even the owners are often hoping to just survive until some bigger company buys their business.

And when the company does explode… they’ll just declare bankruptcy and later start a new company. The kinds of people who create companies rarely do it just once. They do it over and over, somehow managing to convince investors every time.

146 points

Bullshit. Learn how to train new hires to do useful work instead of mundane bloat.

10 points

Exactly this.

60 points

100%. If an AI can do the job just as well (or better), then there’s no reason we should be making a person do it.

44 points

Part of the problem with AI is that it requires significant skill to understand where AI goes wrong.

As a basic example, get a language model like ChatGPT to edit writing. It can go very wrong: removing the wrong words, changing the tone, and making mistakes that an unlearned person doesn’t recognize. I’ve had foreign students use AI to write letters or responses, and often the tone is all off. That’s one thing, but the student doesn’t understand that they’ve written a weird letter. The same goes for grammar checking.

This sets up a dangerous scenario where, to diagnose the results, you already need a deep understanding. This is in contrast to non-AI language checkers, which are simpler to understand.

Moreover, as you can imagine, the danger is that the people making decisions about hiring and restructuring may not understand this issue.

15 points

The good news is this means many of the jobs AI is “taking” will probably come back when people realize it isn’t actually as good as the hype implied.

2 points

And AI is not always the best solution. One of my tasks at my job is to respond to website reviews. There is a button I can push that will generate an AI response. I’ve tested it. It works… but it’s not personal. My responses directly address things reviewers say, especially if they have issues. The AI’s responses are things like, “Thanks for your five-star review! We really appreciate it, blah blah blah.” Like a full paragraph of boilerplate bullshit that never feels like the review is actually addressed.

You would think responding to reviews properly would be a very basic function an AI could do as well as a human, but at present, no way.

2 points

This.

In accounting, 10 years ago, a huge part of the job was categorising bank transactions according to their description.

Now AI can kinda do it, but even providers that have many billions of transactions to use as training data still have a very high error rate.

It’s very difficult for a junior to look at the output and identify which ones are likely to be incorrect.
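
To give a rough idea of why the output is so hard for a junior to sanity-check, here’s a toy sketch in Python (the rules, categories, and transaction descriptions are all made up, and a real provider’s model is obviously far more sophisticated than keyword matching). The point is that wrong answers come out looking exactly as tidy as right ones:

```python
# Toy stand-in for "the AI": a naive keyword categoriser.
# Every rule, category, and description below is invented for illustration.
RULES = {
    "UBER": "Travel",
    "AMZN": "Office supplies",
    "SHELL": "Fuel",
}

def categorise(description: str) -> str:
    """Return the first category whose keyword appears in the description."""
    upper = description.upper()
    for keyword, category in RULES.items():
        if keyword in upper:
            return category
    return "Uncategorised"

# Wrong answers are indistinguishable from right ones unless you already
# know what the transaction really was.
print(categorise("UBER BV TRIP 4821"))          # Travel          -- correct
print(categorise("UBER *EATS ORDER"))           # Travel          -- actually a meal expense
print(categorise("AMZN MKTP UK OFFICE CHAIR"))  # Office supplies -- correct
print(categorise("SHELL ENERGY RETAIL LTD"))    # Fuel            -- actually a utility bill
```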

2 points

Exactly!

4 points

The problem is really going to be the number of jobs that are left with 40 hours of work to do.

4 points

I fully agree; however, doing some mundane work for a few weeks while you learn is useful. You can’t just jump straight into the deep work.

26 points

They don’t want to train new hires to begin with. A lot of the work that new hires relied on to get a foothold in a job is bloat and chores that nobody wants to do, because they aren’t trusted to take on more responsibility than that yet.

Arguably whole industries exist around work that isn’t strictly necessary. Does anyone feel like telemarketing is work that is truly necessary for society? But it provides employment to a lot of people. There’s much that will need to change for us to dismiss these roles entirely, but people need to eat every day.

2 points

Indeed: at least in knowledge-based industries, everybody starts by working at a level of responsibility where the natural mistakes of someone still learning have limited impact.

1 point

One of my interns read the wrong voltage and it took me ten minutes to find his mistake. Ten minutes with me and multiple other senior engineers standing around.

I congratulated him, and damn it, I meant it. This was the best possible mistake for him to make. Everyone saw him do it, he gets to know he held everything up, and he has to just own it and move on.

4 points

The “not willing to train” thing is one of the biggest problems IMO. But also not a new one. It’s rampant in my field of software dev.

Most people coming out of university aren’t very qualified. Most have no understanding of how to actually program real-world software, because they’ve only ever done university classes where the environments are usually nice and easy (possibly already set up), the projects are super tiny, they can actually read all the code in the project (you cannot do that in real projects – there’s far too much code), and problems are usually kept minimal with no red herrings, unclear legacy code, etc.

Needless to say, most new grads just aren’t that good at programming in a real project. Everyone in the field knows this. As a result, many companies don’t hire new grads. Their advertised “entry level” position is actually more of a mid-level position, because they don’t want to deal with this painful training period (which takes a lot of their senior devs’ time!). But it ends up making the field painful to enter. Reddit constantly has threads from people lamenting that the field must be dying, and every time it’s some new grad or junior. IMO it’s because they face this extra barrier. By comparison, senior devs get daily emails from recruiters asking if they want a job.

It’s very unsustainable.

1 point

This is not going to turn out well for a lot of people. Soon human beings will be made obsolete in the name of AI.

3 points

could you please elaborate? do you mean in the workplace?

325 points

The fucked up part isn’t that AI work is replacing human work, it’s that we’re at a place as a society where this is a problem.

More automation and fewer humans working should be a good thing, not something to fear.

162 points

But that would require some mechanism for redistributing wealth and taking care of those who choose not to work, and everyone knows that’s communism.

37 points

So much this. The way headlines like this frame the situation is so ass-backwards it makes my brain hurt. In any sane world, we’d be celebrating the automation of mundane tasks as freeing up time and resources to improve our health, happiness, and quality of life instead of wringing our hands about lost livelihoods.

The correct framing is that the money and profits generated by those mundane tasks are still realized; they’re just no longer going to workers, but funneled straight to the top. People need to get mad as hell not at the tech, but at those who are leveraging that tech specifically to deny them opportunity rather than improve their lives.

I need a beer. 😐

5 points

money and profits generated by those mundane tasks are still realized, it’s just that they are no longer going to workers, but funneled straight to the top

Workers should be paid royalties for their contributions. If “the top” is able to reap the rewards indefinitely, so should the folks who built the systems.

18 points

I think you misspelled “taxes,” but it’s possible your spelling will turn out to be more accurate.

8 points
Deleted by creator
1 point

some sort of A Better World?

19 points

Wait you expect a wealthy mammal to share?

69 points

Exactly. This has nothing to do with AI and everything to do with UBI.

But the rich and plebes alike will push AI as the boogeyman, a distraction from the real enemy.

21 points

There’s this bizarre right-wing idea that if everyone can afford basic necessities, they won’t do anything. To which I say, so what? If you want to live in shitty government housing and survive off of food assistance but not do anything all day, fine. Who cares? Plenty of other people want a higher standard of living than that and will have a job to do so. We just won’t have people starving in the street and dying of easily fixable health problems.

8 points

We also have to be careful about how people define this sort of thing, and about how the scale of our current wealth inequality affects how something like UBI would be implemented.

In the rich’s eyes, UBI is already a thing and it’s called “welfare”. As if it’s not enough that people on welfare can barely survive on the poverty-level pittance the government provides, both the rich and the slightly-better-off put these people down as “mooching off the system” and “stealing from the government”, pushing for ever more draconian laws that punish their situation even further. It’s a caste of people portrayed as even lower scum than “the poors”, right down to segregating where they live into “Section 8” housing as a form of control.

UBI is not about re-creating welfare. It’s about providing a comfortable safety net while reducing the obscene wealth gap as technology drives unemployment even higher. Without careful vigilance, the rich and powerful will use this as another wedge issue to create another class of people to hate (their favorite pastime), and push to drive the program down just as hard as they do welfare.

43 points

It’s not even a new thing either.

It used to be that every single piece of fabric was handmade, every book handwritten.

Humans have been losing out on labor since they realized Og was faster at bashing rocks together than anyone else.

It’s just a question of whether we redistribute the workload, like turning “full time” down to six days a week and eventually five, or the working day from 12+ hours to 8, which increases the number of jobs to match the available workforce.

Every single time, the wealthy say we can’t. But eventually it happens, and the longer it takes, the less likely it is to be peaceful.

0 points

Where are you that 7 days a week of 12-hour days is full time? That’s literally just always working. Standard full time in the States is a 40-hour work week.

2 points

The past. You should probably read their comment again.

0 points

But eventually

There’s no “eventually”; people have been killed, murdered, and harassed while fighting to make it a reality. Someone has to fight to make it happen, and an “eventually” diminishes the effort and risks put forth by labor activists all over the world throughout history. It didn’t happen magically; people worked really hard to make it so.

0 points

It sounds like you just don’t know what the word eventually means…

17 points

The problem, as it almost always is, is greed. Those at the top are trying to keep for themselves the value derived from the additional efficiency that AI is going to bring.

2 points
Deleted by creator
21 points

But how will the rich people afford more submarines to commit suicide in?

27 points

There are both dystopian (a tiny elite owns the automatons and gets all the gains from their work while a massive unemployed underclass barely survives) and utopian (the machines do almost everything for everybody) outcomes for automation, and we’re firmly on the path to dystopia.

9 points

This was exactly the problem that Charles Murray pointed out in The Bell Curve. We’re rapidly increasing the complexity of the available jobs (and in the world of computers, successful people can output 1,000 to 1,000,000 times more than simple labor). It’s the same concept as the Industrial Revolution, but to a greater degree.

The problem is that we’re taking away the vast majority of the simple jobs. Even working at a fast food place isn’t simple.

That alienates a good chunk of the population from being able to perform useful work.

8 points

That book is shit and should not be cited in any serious discussion. Here’s a good video explaining why the book is full of racist shit: https://youtu.be/UBc7qBS1Ujo

5 points

Here is an alternative Piped link(s): https://piped.video/UBc7qBS1Ujo

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source, check me out at GitHub.

0 points

If it were full of shit, then you wouldn’t be discussing the exact problem he pointed out in this book.

There is some racist discussion in there, but that’s secondary and doesn’t detract from or impact his main point about what increasingly complex labor does to a society.

1 point

Precisely: the hill to die on is socializing the profits, not demanding we keep the shitty, injurious, repetitive jobs that break a person’s back by 35.

You don’t protest streetlights to keep the lamplighters employed. The economy needs to change fundamentally to accommodate the fact that many citizens won’t have jobs yet will still need income. It won’t change, but it needs to.

So we’ll keep blaming the wrong thing, technology that eases the labor burden on humanity, instead of destroying the wealthy class that demands it be the sole beneficiary of said technology and its implementation, in perpetuity, to the detriment of almost everyone outside the owner class. Because if we did that, we’d be filthy dirty marxist socialist commies that hate freedumb, amirite?!

50 points

In an ideal world, people would start receiving better and more fulfilling opportunities when their mundane tasks are automated away. But that’s way too optimistic, and the world is way too cynical. What actually happens is they get shitcanned while the capitalists hoard the profits.

We need a better system. One that, instead of relentlessly churning toward the impossibility of infinite growth and funneling wealth upwards, prioritizes personal financial stability and enforces economic equilibrium.
