hbarka | 6 days ago | on: GPT-5.3-Codex
How can they be diverging? LLMs are built on similar foundations, i.e. the Transformer architecture. Do you mean the training method (RLHF) is diverging?
iranintoavan | 6 days ago
I'm not OP, but I suspect they mean the products / tooling / company direction, not necessarily the underlying LLM architecture.