I think this article misses the forest for the trees. The real “evil” here is capitalism, not AI. Capitalism encourages a race toward optimality with no care for what happens to workers. Just like the invention of the car put carriage makers out of business, AI will be used by company owners to cut costs if it serves them. It has been like this for over 100 years; AI is just the latest technology to come along. I’m old enough to remember tons of these same doom-and-gloom articles about workers losing their jobs when the internet revolution hit in the late ’90s. And probably many people did lose jobs, but many new jobs were created too.
This person explains all her failures: instead of adopting ChatGPT herself, reducing prices, and finding more clients, she did nothing.
She was writing the most boring kind of text that no one reads (corporate blog posts and spam emails).
She refused to learn new things that would have kept her employed.
Yes, some jobs disappear, others appear. I believe that 90+% of today’s jobs didn’t exist even 50 years ago. And none of that works without the will to learn new ways of doing things. Imagine a farmer with the knowledge of 100 years ago. Or a hotel front-desk worker without a computer and a telephone.
For mid-level writers, which she was, using AI doesn’t work. The few remaining clients you have specifically don’t want AI to be used. So you either lie and deceive them or you stay away from AI.
And using AI to lower prices and find new clients also doesn’t work. Writers are already competing against writers from nations with a much lower cost of living who do the same work for a fraction of the cost. But the big advantage that domestic writers had was a better grasp of the language and culture. These advantages are mostly lost if you start using AI. So if that’s your business plan, you are in a race to the bottom. It’s not sustainable, and you will be out of a job in maybe 3-5 years.
Thank you for the good insight. I was just thinking: if all of her clients are satisfied with AI, then
“The few remaining clients you have specifically don’t want AI to be used.”
is not completely true.
At the end of the day if an AI can do the job to an acceptable standard a human doesn’t need to be doing it.
As you say it’s happened to countless industries and will continue to happen.
Except that the ‘AI’ is fed by the work of actual humans, and as time goes on, they will be trained more and more on the imperfect output of other AIs, which will eventually result in their output being totally bizarre crap. Meanwhile, humans have stopped training at whatever task, since they can’t be paid to do it anymore, so there’s no new human material.
Wow, you clearly have a very good understanding of the economy and of how our species has been evolving over the last few hundred years.
You are the same as the people who didn’t want to lose their jobs in the coal mines and in the oil rigs. BeCauSE wE wON’t HavE JOooOBs…instead of diving into the ones created by renewables.
You prefer stable, shitty conditions to a turbulent path toward improvement.
I’m really having a hard time thinking about what jobs this would create though. I get the internet thing, as people needed to create and maintain all aspects of it, so jobs are created. If some massive corporation makes the AI and all others use the AI, there’s no real infrastructure. The same IT guys maintain the systems at AI corp. What’s left to be done with it/for it by “common folk?”
There are plenty of companies out there (and growing daily) who want to do AI in-house and can’t (or don’t want to) send their data to some monolithic, black-box company which has no transparency. The finance industry, for example, cannot send any data to some third-party company like OpenAI (ChatGPT) for compliance reasons, so they are building teams to develop and maintain their own AI models in-house (SFT, RLHF, MLOps, etc).
There are lots of jobs being created in AI daily, and they’re generally high paying, but they’re also very highly skilled, so it’s difficult to retrain into them unless you already have a strong math and programming background. And the number of jobs being created is definitely a lot, lot less than the potential number of jobs lost to AI, but this may change over time.
Despite what the pseudo-intellectuals will tell you, ChatGPT is not some all-powerful, do-everything AI. Say you want to use GPT to create your own chatbot for your company, to give company-specific info to people at your company. You can’t just take existing ChatGPT and ask it “how do I connect to the wifi” or “is the office closed on Monday”; you need an in-house team of people to provide properly indexed information, train and test the bot, update it, handle error reports, etc.
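To make the “properly indexed information” point concrete, here is a deliberately toy sketch of a company bot grounded on an in-house index. Everything in it (the FAQ entries, the keyword-matching scheme, the function names) is a hypothetical illustration, not how any real product works; real systems use far more sophisticated retrieval, but the need for curated, indexed company data is the same.

```python
# Toy sketch: a "company chatbot" that can only answer questions
# covered by an in-house index that someone had to build and maintain.
# All data and names here are hypothetical illustrations.

faq_index = {
    "wifi": "Connect to the 'CorpNet' network; the password is on the intranet.",
    "office": "The office is closed on public holidays; check the HR calendar.",
}

def answer(question: str) -> str:
    """Naive keyword retrieval: return the first indexed entry whose
    keyword appears in the question, else a fallback response."""
    q = question.lower()
    for keyword, text in faq_index.items():
        if keyword in q:
            return text
    return "Sorry, I don't have that information; please contact IT."
```

The point of the sketch is that the model is only as useful as the index behind it: if nobody curates the entries, the bot can only fall back to “contact IT.”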
AI is not magic, it’s literally just an advanced computer script, and if your job can be replaced by an AI then it could have been replaced by a regular computer script or program; there just weren’t enough buzzwords and media hype to convince your boss to do it.
Not optimality. Maximum profit. Very different from any definition of optimal I would personally use.
Well, in business school they teach you that running a company is an exercise in maximising profits as a constrained optimisation problem, so optimality for a classical company (not one of those weird startups that doesn’t make money for 10+ years) almost always is maximum profit.
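The business-school framing above can be written out in symbols. This is a generic textbook sketch (the price, cost, and capacity symbols are my own illustration, not anything from the comment):

```latex
% Profit maximisation as a constrained optimisation problem:
\max_{q \ge 0} \; \pi(q) = p\,q - C(q)
\quad \text{subject to} \quad q \le \bar{q}
% where q is output, p the unit price, C(q) the cost of producing
% q units, and \bar{q} a capacity constraint. "Optimality" in this
% framing just means the profit-maximising output q^*.
```

On this definition, “optimal” and “maximum profit” coincide by construction, which is exactly the parent comment’s complaint.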
I honestly can’t tell if you’re being serious. The ‘evil’ is the same force that replaced carriages with cars? The world would be better if carriage-making was still a critical profession?
It’s comical how she uses the example of the printing press in her introduction. Are we really sad that we don’t have to rely on monks copying books?
Yeah, not to mention, do we really need human labor for the jobs she was doing: “I’d work on webpages, branded blogs, online articles, social-media captions, and email-marketing campaigns.”
Email marketing campaigns? Social media captions? Branded blogs? You’d think she’d be happy to be free of it.
I imagine the prestige of being able to tell people she was a “professional writer” was worth something to her mentally, but c’mon… she was a marketing droid. She’s just been replaced by another marketing droid.
Maybe she should pivot to using ML tools to produce the same content she was already writing, but faster.
Yes, we do still need to have Monks copying books, but not for the latest Romance Novel. Let the machine do what it does well, and crank out millions of copies of dreck. However the remaining monks might still find good employment going upscale, competing for prestige and quality, rather than quantity or turnaround time.
This author wants to keep turning out quantities of dreck, but now there’s a cheaper way, yet she doesn’t seem interested in trying to upscale to a product where humans are still better than AI (I assume that’s what she means by “funnels”).
I’m in the tech field, so my point of comparison is outsourcing. We had a couple of decades where management decided the most profitable way to do business was outsourcing quantities of dreck to the lowest-priced providers in third-world countries. That even drove racism that hadn’t previously existed. However, more recently the companies I work for are more likely to be looking for quality partners or employees in different time zones and at different price points. Suddenly results are much better now that our primary concern is no longer the lowest price. Don’t be a monkey banging on a typewriter for an abusive sweatshop in a third-world country, replaceable by someone or something yet cheaper; upscale to being a respected engineer in a different time zone making a meaningful contribution to the technical base.
To be fair, passing out free samples at a grocery store is also trivial to automate.
Don’t worry, only jobs that aren’t soul crushing will be automated. How else will people build their little wannabe corporate fiefdoms lording over the miserable peasants?
For the past several years I worked as a full-time freelance copywriter; I’d work on webpages, branded blogs, online articles, social-media captions, and email-marketing campaigns.
Turns out when all you need is low-quality product, and a machine can do it cheaper, that’s what people will choose. It’s shitty that this affects people’s livelihood in the short term, but this is what happens in capitalism.
all you need is low-quality product
Isn’t this the real problem? Maybe my outside perspective is wrong, but it really seems like companies have changed what they want from writing to mass quantities of eye-catching dreck, rather than useful, informative, well-written articles. I’m not just talking Buzzfeed either; this illness has infected news, marketing, and tech docs as well.
A friend who works for a consulting company has talked about how, when he is between gigs, he works internally on improving their doc generator. This is a high-end, expensive consultancy, and part of what you get is mass quantities of generated dreck.
Humans can still create better writing in many ways, but how can we fix society to value that?
THE POORLY WRITTEN SENTENCE with the typo right at the punchline doesn’t help her case: “The contract was six months, because that’s how long it’d take the AI would learn to write just like me but better, faster, and cheaper.” Yep. Better than that.