I have been working at a large bank for a few years. Although some coding is needed, the vast majority of my time is spent on server config changes, releasing code to production, chasing approvals from other people, managing auth roles, and of course tons of meetings with end users to find out what they need.
I guess when I was a junior engineer I spent more time looking at code, but back then I also worked for small companies. So it's hard for me to judge whether the extra time spent coding was because I was a junior or because it was a small company.
The kicker is that when we interview devs, most of the interview is about coding. Very little of it is about the stuff I listed…
I’m sure that a few years from now nobody will code anymore and you will just tell the AI what you want implemented.
Same as nobody writes actual machine code anymore and everyone only uses higher-level languages.
As a senior you still have to understand the code; you don’t want to merge code when you don’t know what it’s doing (AI extinction, anyone, haha?). And I actually doubt that nobody will code anymore in a few years, as some stuff requires quite a bit of creativity that LLMs are nowhere close to (even GPT-4). I’ll admit, though, that probably 90+% will be able to be written by AI (and in some ways already is, if it’s relatively repetitive code). So yeah, the main dev role will likely become “prompt engineer”. But it’ll be interesting; the fast progress in AI never ceases to surprise. And GPT-4 is definitely able to think somewhat abstractly (and with correction writes quite good code). DeepMind also recently released AlphaDev, which improved the state of the art in sorting…
Higher-level languages compile (pretty much) reproducibly into machine code, so the same C code will always produce the same executable as long as the same compiler version is used.
Current AI chatbots will produce completely different results if given the same prompt twice. In their current form they can’t be used in a reliable toolchain for anything.
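To illustrate the point, here’s a toy sketch (made-up vocab and logits, not any real chatbot’s internals): with temperature > 0 the model samples from a probability distribution over next tokens, so two runs on the same prompt can diverge, while greedy decoding (temperature 0) is deterministic. Hosted chatbots typically sample with a nonzero temperature by default, which is where the run-to-run variation comes from.

```python
import numpy as np

# Toy sketch of next-token decoding. The vocab and logits are made up;
# this is only meant to show why sampling is non-reproducible.
vocab = ["foo", "bar", "baz", "qux"]
logits = np.array([2.0, 1.5, 1.0, 0.5])

def generate(temperature, steps=5):
    rng = np.random.default_rng()
    if temperature == 0:
        # Greedy decoding: always pick the most likely token -> deterministic.
        return [vocab[int(np.argmax(logits))] for _ in range(steps)]
    # Temperature sampling: scale logits, softmax, then draw randomly.
    probs = np.exp(logits / temperature)
    probs /= probs.sum()
    return [vocab[rng.choice(len(vocab), p=probs)] for _ in range(steps)]

print(generate(temperature=1.0))  # usually differs between runs
print(generate(temperature=1.0))
print(generate(temperature=0))    # identical every run
print(generate(temperature=0))
```

The two sampled lines will usually come out different each time you run the script, while the greedy lines never change; that’s the gap between a compiler-style deterministic tool and a sampling-based chatbot.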