I think it’s been changed since, but wow was it weird finding out that instead of taking photos, the Android app used to essentially take a screenshot of the camera view.
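For anyone wondering what that looks like in practice: if the Android preview is drawn into a TextureView, “taking the photo” can be as simple as copying the current preview frame. A minimal sketch, assuming a TextureView viewfinder (the function and file names are illustrative, not the actual app’s code):

    import android.graphics.Bitmap
    import android.view.TextureView
    import java.io.File
    import java.io.FileOutputStream

    // Copy whatever the viewfinder is currently showing and write it out as a JPEG.
    // Preview resolution only, but available instantly, with no shutter lag.
    fun savePreviewFrame(viewfinder: TextureView, outFile: File) {
        val frame: Bitmap = viewfinder.bitmap ?: return  // null if the surface isn't ready yet
        FileOutputStream(outFile).use { stream ->
            frame.compress(Bitmap.CompressFormat.JPEG, 90, stream)
        }
    }

The full-resolution path (Camera.takePicture() in that era) is what carried the shutter lag mentioned further down the thread.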
I worked on the camera in Instagram for iOS for a while. There, at least, there could be a 5,000 ms latency delta between the “screen preview” and the actual full-quality image asset coming out of the camera DSP in the SoC.
I don’t know a thing about the Android camera SDK, but I can easily see how this choice was the right balance of performance and quality on the old hardware of the time (I’m thinking 2013 or so).
Users didn’t want the full quality at all; they’d never zoom. Zero latency was far more important for fueling the viral flywheel.
I worked on the Snapchat Android app back in 2017. It's only weird for people who have never had to work with cameras on Android :) Google's done their best to wrangle things with CameraX, but there are basically a bajillion phones out there with different performance and quality characteristics. And Snap is (rightfully) hyper-fixated on the ability to open the app and take a picture as quickly as possible. The trade-off they made was a reasonable one at the time.
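For reference, CameraX now exposes that exact trade-off as a capture-mode flag; a generic sketch of configuring it (nothing Snapchat-specific here):

    import androidx.camera.core.ImageCapture

    // Prefer shutter speed over image quality -- the same trade-off the old
    // preview-grab approach made, just expressed through the modern API.
    fun buildFastImageCapture(): ImageCapture =
        ImageCapture.Builder()
            .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
            .build()

Swapping in CAPTURE_MODE_MAXIMIZE_QUALITY gets you the opposite end of the trade.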
Things have improved since then, but as I understand it, the technical reason is that, at the time, only the camera viewfinder API was universal across devices. Every manufacturer implemented their cameras differently, so developers had to write per-model camera handling to take high-quality photos and video.
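That per-model handling typically meant branching on Build.MANUFACTURER / Build.MODEL around the pre-Camera2 android.hardware.Camera API. A rough sketch of the pattern; the specific quirks below are invented for illustration:

    import android.hardware.Camera
    import android.os.Build

    // Apply device-specific workarounds before attempting a full-quality capture.
    // Real apps shipped long lists of entries like these.
    @Suppress("DEPRECATION")
    fun applyDeviceQuirks(camera: Camera) {
        val params = camera.parameters
        when {
            // Hypothetical: some vendors mis-report supported focus modes.
            Build.MANUFACTURER.equals("samsung", ignoreCase = true) ->
                params.focusMode = Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE
            // Hypothetical: others choke on high JPEG quality settings.
            Build.MODEL.startsWith("Nexus") ->
                params.jpegQuality = 85
        }
        camera.parameters = params
    }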
:) This is exactly how we used to do it even on iOS, back in the days when the camera APIs weren't yet public, but Steve Jobs personally allowed such apps to be published in the iOS App Store (end of 2009) ...
That was the only way to avoid the insane shutter lag that was very common on Android phones at the time. It's called Snapchat, not HoldStillForAMinuteChat, so it made sense.
Blame Google if you want to blame anyone. They could have mandated maximum shutter lag times (maybe they do now, I don't know).