
I had to skim through this a couple times before I realized that I still need to run an MCP server locally. This is basically a proxy between an LLM and the proposed protocol.

It’s a nice proof of concept.

And it makes sense that the goal would be for LLM clients to adopt and support the standard natively. Then the proxy won't be necessary.
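
For anyone trying to picture the proxy's shape, here is a rough TypeScript sketch, not the actual implementation: the port, the use of the ws package, and the message shapes are all assumptions. It reads MCP's newline-delimited JSON-RPC from the LLM client on stdio and relays each request to a browser page connected over WebSocket:

    // Rough sketch of the local proxy (assumed port and message shapes).
    import { WebSocketServer, WebSocket } from "ws";
    import * as readline from "node:readline";

    const wss = new WebSocketServer({ port: 8765 }); // assumed local port
    let page: WebSocket | null = null;

    // Match replies from the page back to the JSON-RPC id they answer.
    const pending = new Map<number | string, (reply: string) => void>();

    wss.on("connection", (socket) => {
      page = socket;
      socket.on("message", (data) => {
        const msg = JSON.parse(data.toString());
        pending.get(msg.id)?.(data.toString());
        pending.delete(msg.id);
      });
    });

    // MCP's stdio transport is newline-delimited JSON-RPC: read a request
    // from the LLM client, forward it to the page, write the reply back.
    const rl = readline.createInterface({ input: process.stdin });
    rl.on("line", (line) => {
      const req = JSON.parse(line);
      if (!page || req.id === undefined) return; // sketch: drops notifications
      pending.set(req.id, (reply) => process.stdout.write(reply + "\n"));
      page.send(line);
    });

An MCP client would launch this as an ordinary stdio server; the page connects to ws://localhost:8765 and answers the forwarded calls.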



Couldn't ChatGPT connect to it directly over a WebSocket in your browser?
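
For what it's worth, a page can only initiate WebSocket connections; browsers can't listen for incoming ones, so something local still has to hold the server end. A hypothetical page-side counterpart to the sketch above, with the endpoint, method name, and message shape all assumed:

    // Hypothetical page-side script: connect out to the local proxy
    // and answer forwarded tool calls.
    const ws = new WebSocket("ws://localhost:8765"); // assumed endpoint

    ws.addEventListener("message", (event) => {
      const req = JSON.parse(event.data);
      // Answer one assumed tool call by returning the page's visible text.
      if (req.method === "getPageText") {
        ws.send(JSON.stringify({
          jsonrpc: "2.0",
          id: req.id,
          result: document.body.innerText,
        }));
      }
    });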



