Hacker News

In that scenario, how do you keep cold startup as fast as possible?

The nice thing about JS workers is that they can start really fast from cold. If you have low or irregular load, but latency is important, Cloudflare Workers or equivalent is a great solution (as the article says towards the end).
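For context, a Cloudflare Worker in module syntax is just an object with a `fetch` handler — there's no container or VM to boot, which is why cold starts are typically a few milliseconds. A minimal sketch (the request path echoed back is only for illustration):

```javascript
// Minimal Cloudflare Worker (module syntax). The isolate spins up the
// handler directly, with no container image to pull or VM to boot.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    // Echo the path back, just to show the handler shape.
    return new Response(`hello from ${url.pathname}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};

export default worker;
```

The whole deployable unit is a few KB of JS, which is what makes scale-to-zero with low cold-start latency viable.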

If you really need a full-featured container with AOT compiled code, won't that almost certainly have a longer cold startup time? In that scenario, surely you're better off with a dedicated server to minimise latency (assuming you care about latency). But then you lose the ability to scale down to zero, which is the key advantage of serverless.



Apparently not nice enough, given that they rewrote the application in Go.

Serverless with containers is basically managed Kubernetes, where someone else has the headache of keeping the whole infrastructure running.



