JDubbleu
Almost all of the people who are fearful that AI is going to plagiarize their work don't know the difference between statistical analysis and generative artificial intelligence. They're both AI, and unfortunately, in those circles, anything even AI-related seems to be automatically bad without any further thought.
I also think that when people are much better than average at one thing, they assume it makes them better than everyone else at other things too. A great example I see all the time in tech circles is software engineers who believe that because they can write code, they must be intelligent in all areas.
Small personal anecdote: I'm pretty damn good at math, computers, and software engineering. It comes extremely naturally to me, has made me a better problem solver, and has allowed me to have a good career. Outside of computer-based problem solving, though (something I've spent tens of thousands of hours developing), I'm pretty average. My ability to grasp chemistry and other hard sciences outside of computer science (which is a horribly named applied math field) is probably about what you'd expect from an average college graduate. My writing and linguistics abilities are definitely below average: while I can write pretty decently, I don't know the technicalities of why certain things are correct and others aren't, other than that one sounds better than the other.
It's also pretty easy for someone to latch onto the idea that they're "gifted" when they only consider the things they regularly practiced and became good at. Even then, I think the spirit of "gifted" in the OC is those kids who become adept at anything within a few days and can keep that pace up well into mastery, not just someone practicing and following a normal progression curve.
lemmy.world kept going down every other hour, and I figured, being a software engineer, this instance made the most sense.
There's a subset of artificial intelligence called unsupervised learning, which is a form of statistical analysis in which you let an agent find patterns in data for you, as opposed to trying to drive the agent toward a desired outcome. I'm not 100% sure that's what the website author was using, but it sounded pretty close to it. It's extremely powerful and nothing like the generative LLMs most people now think of when the words "AI" are thrown around.
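To make the distinction concrete, here's a toy sketch of one classic unsupervised technique, k-means clustering, written from scratch. The data and cluster count are made up for illustration, and this is just a minimal example of the general idea (finding structure in unlabeled data), not the website author's actual method:

```python
# Tiny from-scratch k-means: given unlabeled points, find k cluster
# centers with no "right answer" provided up front. All data below is
# invented for demonstration purposes.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # start from k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Step 1: assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for x, y in points:
            i = min(range(k),
                    key=lambda c: (x - centers[c][0]) ** 2 + (y - centers[c][1]) ** 2)
            clusters[i].append((x, y))
        # Step 2: move each center to the mean of its assigned points.
        for i, cluster in enumerate(clusters):
            if cluster:
                centers[i] = (sum(p[0] for p in cluster) / len(cluster),
                              sum(p[1] for p in cluster) / len(cluster))
    return centers, clusters

# Two obvious blobs; the algorithm separates them without ever being
# told which point belongs to which group.
data = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4),
        (5.1, 5.0), (4.9, 5.3), (5.2, 4.8)]
centers, clusters = kmeans(data, k=2)
```

Nobody labels the points: the structure emerges from the assign/update loop alone, which is exactly the contrast with supervised training toward a known target.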
I agree though, it sucks the project got killed; it seemed super interesting and insightful.
Pretty much. All US veterans who have died in Ukraine were volunteers. Just about everything we've given Ukraine is old military equipment we don't need, and it accounts for such a small share of the total budget while absolutely fucking over the greatest threat to Europe at the moment. It might be the best ROI we've ever gotten from anything, ever.
This is coming from someone who is extremely anti-war, but that doesn’t make me anti-defend yourself.
I think this article goes to show just how out of touch execs are with employees. The overwhelming majority of people resented RTO and were very vocal about it, execs pushed it hard, and now they're all surprise Pikachu face that attrition skyrocketed and morale plummeted. People on my team used to somewhat give a shit about our work, but now not a single one of us has the motivation to go the extra mile after we've been shown our well-being doesn't mean shit.