
Apple isn't actually designing its own sensors, is it? I expected they would be using an off-the-shelf Sony sensor just like (most) everyone else.

I generally disagree with your premise here. Yes, you can do some pretty amazing things with image processing in software, but you have to have a decent starting point. A good example of this is just how minimal the image processing improvements are in the M1 MacBooks compared to their Intel counterparts — marginally better, but nothing to write home about.

I'm not at all an expert in optics, but I would expect the biggest constraint is the available depth for the lens and camera sensor within the upper clamshell of a laptop vs a phone. It doesn't seem like it should matter much, but an extra millimeter or so makes a big difference in the size of the sensor that can be used (rough numbers sketched below).
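To put a rough number on that geometry: the sketch below assumes a simple thin-lens model, an ~80° diagonal field of view, and that the available module depth roughly bounds the focal length. All the specific figures are illustrative assumptions, not measurements of any actual Apple module.

    import math

    def max_sensor_diagonal(module_depth_mm, diag_fov_deg=80.0):
        # Thin-lens approximation: for a fixed diagonal field of view,
        # the sensor diagonal the lens can cover scales with focal length,
        # and focal length is bounded by the depth available for the module.
        focal_length_mm = module_depth_mm  # assume depth ~ focal length
        return 2 * focal_length_mm * math.tan(math.radians(diag_fov_deg) / 2)

    for depth_mm in (3.0, 4.0):  # hypothetical lid depths, one millimeter apart
        d = max_sensor_diagonal(depth_mm)
        area_ratio = (d / max_sensor_diagonal(3.0)) ** 2
        print(f"{depth_mm:.1f} mm depth -> ~{d:.1f} mm sensor diagonal, "
              f"~{area_ratio:.1f}x the light-gathering area")

Under those assumptions, going from 3 mm to 4 mm of depth grows the sensor diagonal by a third and the light-gathering area by roughly 1.8x, which is why a single millimeter matters so much.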



You assume that this year's M1 MacBooks have a different camera package from the Intel MacBooks prior to them, and/or that the camera package used in Intel MacBooks could be improved by Apple Silicon.

I think the iPhone/iPad camera package requires Apple Silicon, and that Apple simply hasn't installed it (or one like it) in their MacBooks yet. So it doesn't matter whether the machine has Intel or Apple Silicon; the camera still looks terrible because it hasn't been upgraded to one that takes advantage of Apple Silicon.

I don't know who makes the sensor inside that package, but I don't assume the sensor is the sum total of the package. Apple is all about bizarre integrations and coprocessors (such as the iOS/bridgeOS/whatever chip inside their Lightning-to-HDMI dongle). Without a teardown showing that the two packages are the same despite a significant quality improvement, I'm simply not willing to bet that the Apple Silicon package could operate on an Intel machine, or that the improvement is just processing of the same sensor.



