
There was a post a few weeks back (or a reply to a post) showing an app made entirely with an LLM. It was a 3D globe built with Three.js, and I believe the poster had created it locally on his M4 MacBook with 96 GB RAM? I can't recall which model it was or what else the app did, but maybe someone knows what I'm talking about?


That was with Qwen 32B, which is still the best coder model for its size. You could run that just fine on a Mac with 36 GB of RAM.
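A back-of-envelope calculation shows why 36 GB is plausible: a 32B-parameter model quantized to roughly 4.5 bits per weight (typical of a mid-range 4-bit quant) needs about 18 GB for the weights, plus a few GB for KV cache and runtime overhead. The exact bits-per-weight and overhead figures here are rough assumptions, not measurements.

```python
# Rough memory estimate for running a 32B-parameter model locally.
# Assumes ~4.5 bits per weight (a typical 4-bit quantization) and a
# flat ~4 GB allowance for KV cache and runtime overhead.
params = 32e9            # 32 billion parameters
bits_per_weight = 4.5    # assumed quantization level
weights_gb = params * bits_per_weight / 8 / 1e9
total_gb = weights_gb + 4
print(weights_gb, total_gb)  # 18.0 22.0 -- comfortably under 36 GB
```

At full 16-bit precision the same model would need ~64 GB for weights alone, which is why quantization is what makes these models fit on consumer Macs.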


Is it actually able to generate all the files, directory structure, etc.? Or did the author just take all the responses he got from multi-shot prompting and eventually assemble them into the final product? I believe I used LlamaCoder some time ago to build the scaffolding of an app, but I'm not sure about the state of that project now.
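One common way to get real files and directories out of a raw chat model, without any agent tooling, is to prompt it to emit each file behind a marker line and then split the response mechanically. The `### FILE:` marker convention below is purely illustrative (not any tool's actual format), as a minimal sketch:

```python
import re
from pathlib import Path

# Hypothetical convention: prompt the model to emit each file as
#   ### FILE: relative/path
#   followed by a fenced code block containing the file's contents.
FILE_BLOCK = re.compile(
    r"### FILE: (?P<path>\S+)\s*\n```[^\n]*\n(?P<body>.*?)\n```",
    re.DOTALL,
)

def write_scaffold(response: str, root: Path) -> list[Path]:
    """Split a marked-up model response into real files under `root`,
    creating intermediate directories as needed."""
    written = []
    for match in FILE_BLOCK.finditer(response):
        target = root / match["path"]
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(match["body"] + "\n")
        written.append(target)
    return written
```

This sidesteps the assembly-by-hand problem the comment describes: one long response becomes a working directory tree in a single pass, though in practice you'd still iterate on the prompt when the model drifts from the marker format.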


I imagine _a lot_ of iteration was involved regardless of the model.



