Hacker News

ChatGPT is a godsend for junior developers, but it isn't great at providing coherent answers to more complex or codebase-specific questions. Maybe that will change with time, but right now it's mostly useful as a learning aid.


For complex codebases it’s better to use Copilot, since it does the work of providing context to GPT for you. Copilot X will do a lot more, but it’s still behind a waitlist signup. You could hack something together yourself using the API. The quickest option is just to paste the relevant code into the chat along with your prompt.
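A minimal sketch of that hack-it-together approach, assuming the official `openai` Python package; the model name, file names, and helper function here are illustrative, not anything from this thread:

```python
# Sketch of the "hack something together yourself using the API" idea:
# paste the relevant source files into the prompt and ask a question.
# All names below (build_messages, the example file, the model) are
# hypothetical placeholders.
import os

def build_messages(code_snippets, question):
    """Combine (path, source) pairs and a question into a chat message list."""
    context = "\n\n".join(
        f"### {path}\n```\n{source}\n```" for path, source in code_snippets
    )
    return [
        {"role": "system",
         "content": "You are a code assistant. Answer using only the provided files."},
        {"role": "user",
         "content": f"{context}\n\nQuestion: {question}"},
    ]

messages = build_messages(
    [("billing/invoice.py",
      "def total(items):\n    return sum(i.price for i in items)")],
    "Why might total() raise an AttributeError?",
)

# The actual API call (only runs if a key is configured):
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI
    client = OpenAI()
    reply = client.chat.completions.create(model="gpt-4", messages=messages)
    print(reply.choices[0].message.content)
```

The context-stuffing step is the whole trick: Copilot does roughly this for you automatically, while the manual route lets you control exactly which files get sent (which matters for the IP concerns raised downthread).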


I tend to use both. Copilot is vastly better for helping scaffold out code and saves me time as a fancy autocomplete, while I use ChatGPT as a "living" rubber-duck debugger. But I find that ChatGPT isn't good at debugging anything beyond the common issues you could find answers to by Googling (though for those it's usually faster and more tailored to my specific situation). That's why I think it's mostly beneficial to junior devs. More experienced devs will find they can't get good answers to their issues, and they simply aren't running into the stuff ChatGPT is good at resolving, because they already know how to avoid it in the first place.


And this gets into why companies are banning it, at least for the time being: developers, and especially junior developers, think nothing of uploading the entire contents of the internal code base to the AI if it gets them a better answer. It isn't just what it can do right this very second that has companies worried.

It isn't even about the AI itself; the problem is uploading your code base, or whatever other IP, anywhere not approved. If mere corporate policy seems like a fuddy-duddy reason to be concerned, there are also a lot of regulations in a lot of places. Up to this point, while employees had to be educated to some extent, there wasn't this attractive nuisance sitting out on the internet asking to be fed swathes of data with the promise of making your job easier, so it was generally not an issue. Now there is this text box just begging to be loaded with customer medical data, or your internal finance reports, or random data that happens to contain information the GDPR requires special treatment for, even if that wasn't what the employee "meant" to use it for. You can break a lot of laws very quickly with this text box.

(I mean, when it comes down to it, the companies have every motivation for you to go ahead and proactively do all the work to figure out how to replace your job with ChatGPT or a similar technology. They're not banning it out of fear or something.)



