What you’re describing is called the Singularity, and it requires AGI; we’re not even close to anything remotely similar to that. I’m not even sure we’ll get there in my lifetime.
Don’t be a condescending little prick, mate.
I’m not talking about an AGI or a singularity. There’s a long way between what we have now and that.
What I’m talking about will happen in the meantime, and it will finally let me stop dealing with prima donnas who think they’re the last Coca-Cola in the desert because they can copy-paste code from Stack Overflow.
You’re talking about a computer you can ask to do anything, where it not only understands what you ask (computers can do that now) but what you mean (you need AGI for that).
You can already replace anyone whose sole job is to copy-paste stuff from Stack Overflow, but that’s not all a programmer does.
There’s an excellent demonstration of what being a programmer is that some teachers do in a programming 101 class: they have the students describe, step by step, how to do a day-to-day task, and invariably people skip steps or miss corner cases. Being a programmer is knowing how to explain things to a computer in an unambiguous way, and until computers have general intelligence they’ll either fail at ambiguous tasks or make wrong assumptions about them. If LLMs become advanced enough that you could “prompt” the computer to do anything, the prompt would have to be very specific, and written in a very specific way, which would essentially make it a programming language.
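A minimal sketch of that classroom point, using a made-up everyday task (splitting a bill) as the example: the spec sounds trivial when you say it aloud, but the corner cases people skip are exactly what the code has to spell out.

```python
# "Split a bill evenly" sounds like a one-line instruction until you have
# to state every step unambiguously. The two corner cases below are the
# kind people routinely skip when describing the task out loud.

def split_bill(total_cents: int, people: int) -> list[int]:
    """Split total_cents among people so the shares sum exactly to the total."""
    if people <= 0:
        raise ValueError("need at least one person")  # skipped corner case #1
    base, remainder = divmod(total_cents, people)
    # Skipped corner case #2: the cents don't divide evenly, so the first
    # `remainder` people each pay one cent more than the rest.
    return [base + 1] * remainder + [base] * (people - remainder)

print(split_bill(1000, 3))  # [334, 333, 333]
```

Everything the verbal description glosses over ("what if it doesn’t divide evenly?") has to be decided explicitly, which is the whole exercise.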
You’re stuck in the current paradigm of how software works. What I’m talking about is not the current paradigm, and it’s not AGI.
We don’t need AGI for what I’m talking about. You’re fixated on programmatically telling a computer how to do something, and I’m not sure whether you’re just being difficult or genuinely can’t grasp or imagine what I’m imagining.
We already have useful LLMs for different tasks. Heck, my team is developing software that uses LLMs for tasks we’d be so fucked trying to program from scratch! And that’s right now, less than two years after the first version of ChatGPT was released. Do you think this technology will stay the same, or will it keep being developed into something most of us can’t comprehend, or will even deny, like you’re doing now?
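The kind of LLM-backed software described above might look like this sketch: fuzzy text classification that would be miserable to hand-code but is a one-prompt job for a model. `call_llm`, the labels, and the ticket text are all hypothetical; here the function is stubbed so the sketch runs without a real API.

```python
# Hypothetical sketch: delegate a fuzzy task (support-ticket triage) to an
# LLM instead of hand-coding the classification logic. `call_llm` stands in
# for whatever client library a real team would use.

LABELS = ["billing", "bug report", "feature request"]

def call_llm(prompt: str) -> str:
    # Stub standing in for a real model call, so the example is runnable.
    # A real client would send `prompt` to a hosted model and return its reply.
    text = prompt.lower()
    if "refund" in text or "charged" in text:
        return "billing"
    if "crash" in text or "freezes" in text:
        return "bug report"
    return "feature request"

def classify_ticket(ticket: str) -> str:
    prompt = (
        f"Classify this support ticket as one of {LABELS}. "
        f"Reply with the label only.\n\nTicket: {ticket}"
    )
    label = call_llm(prompt).strip().lower()
    # Guard against the model replying with something outside the label set.
    return label if label in LABELS else "feature request"

print(classify_ticket("I was charged twice, please refund me"))  # billing
```

The programming effort shifts from encoding the fuzzy rules themselves to writing the prompt and validating the model’s output, which is the shift the comment is pointing at.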
It’s your right to disagree with me, and I accept that, but don’t say I’m wrong, mate; you can’t possibly know!
And don’t talk about AGI or the Singularity like it’s the next step; you’re doing yourself a disservice.