Hacker News

It's about as feasible as telling how many characters of HTML produced this comment by looking at a screenshot. An LLM doesn't see characters, tokens-as-numbers, or its own activations. The LLM is the "set of rules" component in a Chinese room scenario; anything an operator of that room does happens at a lower level.
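A toy sketch of why character counts aren't directly visible to the model: the vocabulary and `encode` helper below are hypothetical, not any real tokenizer, but they show that the model's input is a list of IDs from which character-level facts can't be read off.

```python
# Toy illustration: the model receives token IDs, not characters.
# The vocabulary below is hypothetical, not any real tokenizer's.
vocab = {"straw": 101, "berry": 102}

def encode(text, vocab):
    """Greedy longest-match tokenization over the toy vocabulary."""
    ids, i = [], 0
    while i < len(text):
        for piece, pid in sorted(vocab.items(), key=lambda kv: -len(kv[0])):
            if text.startswith(piece, i):
                ids.append(pid)
                i += len(piece)
                break
        else:
            raise ValueError(f"no token covers position {i}")
    return ids

print(encode("strawberry", vocab))  # [101, 102]
# From [101, 102] alone there is no way to recover "10 characters";
# that fact lives in the tokenizer, outside the model's input.
```

The point is that the character count is a property of the mapping between text and IDs, and that mapping sits outside what the model is given.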

GGP's idea suggests that the LLM, taken as the whole room, receives something like: "hey, look at these tokens: <tokens>, please infer the continuation." That puts it in the position of a nested room's operator, which (1) it is not, and (2) there is no nested room.


