I'd buy some of these explanations, except the depth estimation, colorization, and super-resolution ML models they use in the app DO run locally and are still subscription-gated.

Apple has been doing on-device machine learning for portrait blurs and depth estimation for years now, though based on the UI, this might use cloud inference as well.

Granted, these aren't the super-heavy ones like generative fill / editing, and I understand that cloud inference isn't cheap. A subscription for cloud-based ML features is something I'd find acceptable, and that's what has launched today... The real question is what they plan to do with this in 2-5 years. Will more non-"AI" features make their way into the pro tier? Only time will tell!
