I think they're spiritually related, but my understanding is that Marvin actually invokes the LLM at runtime to execute an AI Function (which is why it can perform, e.g., sentiment analysis). Maccarone invokes the LLM to generate code during development.
Ha, I just want to clarify that the quoted text isn't actually from the README or something like that, I'm not quite that crazy.
But no real argument with the concern. An LLM will generate bugs, and that may be a reason this kind of thing never makes sense in practice (isn't that an argument against copilot, too, though?).
> You should be squeamish about running the code without reading it first, given that you're pair-programming with a bot.
It's funny, the first version of this project[0] let you do exactly that, e.g.,
    def main(path: str):
        #<<filenames = a list of filenames under path>>
        for fn in filenames:
            #<<size = size of fn in bytes>>
            print(fn, size)

    #<<use argparse and call main>>
and then run that program like any other Python script (using import magic).
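For illustration, here is a hand-written sketch of what the expanded program might look like once the `#<<...>>` blocks are filled in; this is my own guess at plausible generated code, not Maccarone's actual output (I've made the `path` argument optional, defaulting to the current directory):

```python
import argparse
import os

def main(path: str):
    # filenames = a list of filenames under path
    filenames = [
        f for f in os.listdir(path)
        if os.path.isfile(os.path.join(path, f))
    ]
    for fn in filenames:
        # size = size of fn in bytes
        size = os.path.getsize(os.path.join(path, fn))
        print(fn, size)

if __name__ == "__main__":
    # use argparse and call main
    parser = argparse.ArgumentParser()
    parser.add_argument("path", nargs="?", default=".")
    args = parser.parse_args()
    main(args.path)
```

The appeal (and the risk) is that the generated body is ordinary Python you can read and diff, rather than something resolved at runtime.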
Angaza enables life-changing products, such as solar + battery home energy systems, to be sold _on payment plans_ in off-grid regions across Africa and Asia.
We've reached millions of people who now have electricity for the first time: