> You're gonna have to do a lot of work to convince me that people who only know how to drive an LLM are learning how to adapt to sweet fuck all
Driving an LLM properly requires knowing how to evaluate whether the results are correct. People can certainly try to submit generated code as a PR, but even a single round of code review or a debugging session should reveal whether the person understood what they were doing.