If security sees someone carrying a gun in surveillance video, on a gun-free campus, and police verify it, then yes, that's justified by all aspects of the law. There are countless examples of surveillance of illegal activity resulting in police action.
Nobody saw a gun in a video. Nobody even saw something that looked like a gun. A chip bag, at most, is going to produce a bulge. No reasonable human is going to look at a kid with a random bulge in their pocket and assume gun. Otherwise we might as well start sending our kids to school naked; this is the kind of paranoia that brought us the McMartin Preschool nonsense.
The presence or absence of human review is irrelevant. A system with stupid humans is just as bad as a system with stupid machines. They are complementary, really.
Nobody saw a gun. We know this because there was no gun.
They didn't see that, though. They saw a kid with a bulge over their pants pocket, suggesting that something was in the pocket. The idea that any kind of algorithm can accurately predict that an amorphous pocket bulge is a gun is just bonkers stupid.
(Ok, ok, with thin, skin-tight, light-colored pants, maybe -- maybe -- it could work. But if it mistook a crumpled-up Doritos bag for a gun, clearly that was not the case here.)
Are you suggesting it's not?