Hacker News

This seems to be the biggest deal, a few links away.

> Reading data in a Cloud Storage bucket located in a multi-region from a Google Cloud service located in a region on the same continent will no longer be free; instead, such moves will be priced the same as general data moves between different locations on the same continent.

If I understand correctly (do I?), this means that storing frequently used data in a multi-region bucket is suddenly very expensive — we go from paying $0 to $0.02/GB. Reading 10TB / hour goes from $0/year to $1.75M/year.

We can switch to single-region buckets, but it's quite an effort to move all the data.
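The arithmetic above can be sketched as a quick back-of-envelope check (assumptions: the $0.02/GB rate quoted in the comment, decimal terabytes, and a non-leap year):

```python
# Estimate the annual cost of reading from a multi-region bucket
# under the new pricing, per the figures in the comment above.
rate_per_gb = 0.02            # $/GB, assumed rate for same-continent moves
gb_per_hour = 10 * 1000       # 10 TB/hour, decimal TB
hours_per_year = 24 * 365     # 8,760 hours

annual_cost = gb_per_hour * rate_per_gb * hours_per_year
print(f"${annual_cost:,.0f}/year")  # → $1,752,000/year
```

That matches the ~$1.75M/year figure: a workload that was previously free becomes a seven-figure line item.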



I'd love to be on a team that's reading 10TB per hour and has to explain that huge bill to executives!


I'm no GCP user, but if you've planned for "schema on read" and throw a bunch of poorly indexed/partitioned/compressed files in there, you could probably get there pretty quickly...


Just gunning for the enterprise.

Who cares about DR and having 3x copies of your data 100 mi apart from each other? Small startups, or enterprises? Enterprises can just push those costs to their DR budget.


The fire at OVH last year was a striking demonstration that keeping your data in only one region is a bad idea. So don't do that; stick with multi-region.


In that case, data in a single zone is a problem, but a single region (with multiple zones) isn't, is it?


Well, that depends. IIRC, OVH's fire hit multiple floors, and each floor was a "zone" - the power, networking, etc. were all independent, but they still shared a single core dependency: the building itself.


Oh it's bad.



