Hacker News

Poor prompting, then. One should narrow the question down to the particular circumstance it should apply to.


This is exactly what you can't do as an amateur, though. You don't know enough to provide the relevant context for your circumstances.


Well no, not really. I'm replying in the context of the parent comment, which painted a scenario where the GPT would reply with information relevant to different countries.

It doesn't need to be advanced prompting; it's enough to provide sufficient information and ask it to "provide advice relevant to the current and applicable codes and regulations".

It's the first question a forum dweller would ask in reply to a poorly articulated post.


And how would the LLM know that a comment in English on a random forum is applicable to one specific country but not another?

The UK, US, and Australia all have different rules and regulations, but their websites don't exactly advertise their location; it's implied that you visit the ones for your own country. The UK uses a weird mix of metric and imperial, so you can't even use the units to figure it out! It's not always easy, even for a human.


I think you're exaggerating. There are several ways an LLM can discern the applicability of a given piece of data to a context: document metadata such as the TLD, cross-checking against the applicable authoritative codes and regulations, and being wary of random forum posts in the first place.

I'm pretty sure an LLM can provide pretty refined answers given some reasonable context, e.g. "I want to rewire a socket in my apartment, which is in London, UK. What do I need to do?"
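That kind of context can be supplied mechanically. A minimal sketch, assuming nothing about any particular LLM API (the `build_prompt` helper is hypothetical), of prepending location and a codes-and-regulations instruction to the user's question:

```python
def build_prompt(question: str, city: str, country: str) -> str:
    """Wrap a DIY question with location context and a jurisdiction instruction."""
    return (
        f"My location: {city}, {country}.\n"
        f"Question: {question}\n"
        "Please answer only with advice relevant to the current and applicable "
        "codes and regulations for my location, and say if the work legally "
        "requires a certified professional."
    )

prompt = build_prompt(
    "I want to rewire a socket in my apartment. What do I need to do?",
    "London", "United Kingdom",
)
```

The resulting string would be sent as the user message to whatever model is in use; the point is only that the jurisdiction question gets asked up front rather than left for the model to guess.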




