
if the golden rule is that code is a liability, what does this headline imply?


The code would be getting written anyway; that's an invariant. The difference is less time wasted typing keys (albeit a small amount of time) and, more importantly (in my experience), it helps a lot with discoverability.

With g3's immense amount of context, LLMs can vastly help you discover how other people are using existing libraries.


My experience dabbling with AI and code is that it is terrible at coming up with new stuff; it only produces things that already exist somewhere.

As regards how others are using libraries, that's where the technology will excel: rewriting code. Once it has a stable AST to work with, the problem it is solving is a refactor.

Until it has an AST that solves the business need, the game is just prompt spaghetti until it reaches enough altitude to be able to refactor.
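To make the distinction concrete, here is a toy sketch of the kind of mechanical, AST-level rewrite the parent comment is describing, using Python's stdlib `ast` module. The helper names `old_fetch`/`new_fetch` are hypothetical, purely for illustration:

```python
import ast

# Toy refactor: rename every reference to a (hypothetical) deprecated
# helper `old_fetch` to `new_fetch`. Because the edit operates on the
# parsed AST rather than on raw text, it is a well-defined
# transformation, not string munging.
class RenameCall(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "old_fetch":
            return ast.copy_location(ast.Name(id="new_fetch", ctx=node.ctx), node)
        return node

source = "result = old_fetch(url) + old_fetch(backup)"
tree = ast.parse(source)
new_source = ast.unparse(RenameCall().visit(tree))
print(new_source)  # result = new_fetch(url) + new_fetch(backup)
```

Contrast this with "prompt spaghetti": the transformation above is deterministic and checkable, which is why rewriting existing, parseable code is the easier target.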


Nothing at all. The headline talks about the proportion of code written by AI. Contrary to what a lot of comments here are assuming, it does not say that the volume of code written has increased.

Google could be writing the same amount of code with fewer developers (they have had multiple layoffs lately), or their developers could be focusing more of their time and attention on the code they do write.


Well, either they just didn't spend as much time writing the code or they increased their liability by about 33%.

The truth is likely somewhere in between.


I'm sure google won't pay you money to take all their code off their hands.


But they would pay me money to audit it for security.


Yup, you can get paid all kinds of money to fix/guard/check billion- or trillion-dollar assets.



