I think the parent comment is implying that if the source code is released at the end of the device's supported life, it will be much easier for hackers to find vulnerabilities. Then users who aren't paying attention will continue running that last version, and hackers will attack them using those now-public vulnerabilities.
So you'd still need some mechanism to force-update devices in response to vulnerabilities found in open-source end-of-support firmware.
What I mean is that reverse engineering takes time, effort, and specialized talent; hence your job. That's the small security moat manufacturers get by not releasing source code, or at least not the latest running version. Of course a well-maintained, audited open-source codebase is better than a closed one, but a lot of this stuff isn't well-maintained.
Also, there are high-profile instances of hardware security that rely on obscurity, like secure enclaves or the iPhone passcode unlock. They tend to get cracked eventually, but it's still hard.
Releasing source code could lower the barrier a bit, but the main thing I was calling out is releasing the signing keys; maybe those could be transferred to a trusted custodian instead.
In certain cases probably yes, but maybe still worth it? Even if you have the keys, you still need to get your maliciously manipulated build onto the customer's device... And this is assuming the manufacturer even bothered signing and verifying in the first place.
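To make the "signing and verifying" point concrete, here's a minimal sketch of the kind of check a bootloader performs before accepting an update. I'm using HMAC from Python's standard library as a stand-in for the asymmetric signature schemes real secure boot uses (on a real device only the public key would live on the device); the key and names here are purely illustrative:

```python
import hashlib
import hmac

# Illustrative shared key; a real scheme would bake the PUBLIC half of
# an asymmetric keypair into the bootloader and keep the private half
# with the manufacturer (the "keys" being discussed above).
firmware_key = b"example-signing-key"

def sign_firmware(image: bytes) -> bytes:
    # Manufacturer side: attach an authentication tag to the image.
    return hmac.new(firmware_key, image, hashlib.sha256).digest()

def verify_firmware(image: bytes, tag: bytes) -> bool:
    # Device side: refuse to install images whose tag doesn't check out.
    expected = hmac.new(firmware_key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

official = b"official firmware v1.2"
tag = sign_firmware(official)
print(verify_firmware(official, tag))            # True
print(verify_firmware(b"malicious build", tag))  # False
```

The point being: leaked keys only matter if this check exists and the attacker also has a delivery path onto the device; absent the check, nothing stopped malicious builds in the first place.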
So this would be bad for manufacturers releasing secure, well-designed devices without security vulnerabilities... But if you think about it for a second, isn't this good? As long as there's no known vulnerability, the manufacturer can say the device is still supported, and it costs them nothing, since they have no reason to release an update. And if there is a security issue, then it might be better to have the source and keys after all?
How secure or insecure a device is has nothing to do with whether its source code is public.
Disclosure: I might be biased on this, as I'm a reverse engineer.