Their cameras have been terrible for a long time (comparatively speaking). I switched from a Pixel 6a to an iPhone 16 and was shocked at how bad my pictures are now. I get the feeling that iPhone users just don't know any better because they've always had an iPhone.
Seems like it just has to do with what you're expecting from a smartphone camera. I feel like the Google Pixel is doing even more photo-editing magic than the iPhone.
Which is literally what Apple announced in this video:
"and the 2x telephoto has an updated photonic engine, which now uses machine learning to capture the lifelike details of her hair and the vibrant color of her jacket"
"like the 2x telephoto, the 8x also utilizes the updated photonic engine, which integrates machine learning into even more parts of the image pipeline. we apply deep learning models for demosaicing"
They've been using that terminology for like a decade. They take multiple photos and use ML to figure out how to layer them together into a final image where everything is adequately exposed, then apply denoising. Google has done the same thing on Pixels since they've existed.
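To be clear about why multi-frame merging isn't "AI inventing detail": if sensor noise is independent across frames, simply averaging aligned frames cancels it out. Here's a toy NumPy sketch of that core idea (nothing like the real Photonic Engine or HDR+ pipelines, which also do alignment, per-frame weighting, and tone mapping):

```python
import numpy as np

def merge_frames(frames):
    """Average a stack of already-aligned noisy exposures.

    Toy illustration of multi-frame denoising: independent sensor
    noise averages out, so N frames cut noise by roughly sqrt(N).
    No content is guessed or generated; every output pixel is a
    plain average of the captured pixels at that location.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate one "true" scene plus independent per-frame sensor noise.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(4, 4))
frames = [scene + rng.normal(0, 20, scene.shape) for _ in range(16)]

merged = merge_frames(frames)
err_single = np.abs(frames[0] - scene).mean()  # one noisy capture
err_merged = np.abs(merged - scene).mean()     # merged stack
print(err_merged < err_single)  # merged frame is closer to the true scene
```

The ML parts of real pipelines mostly decide *how* to weight and align those frames (and, per the quotes above, demosaic); that's a very different job from generating pixels wholesale.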
That's very different from taking that final photo and then running it through generative AI to guess what objects are. Look at the images in that article. It made the stop sign into a perfectly circular shiny button. I've never seen artifacting like that on a photo before.