School Shooting Data Analysis and Reports

Submitted by
Style Pass
2024-06-05 13:00:02

More than a dozen companies market software that identifies weapons visible on CCTV at schools, businesses, and public buildings. The concept is pretty simple: AI software can spot a weapon faster than a human and immediately alert police. The problem is that these models can’t reliably tell the difference between weapons and weapon-shaped objects (e.g., umbrellas, water guns, sticks, cellphones, mops, or basically anything with a straight line resembling a gun barrel).

As I wrote in March (How do AI security products being sold to schools really work?), an image classification model looks only for a specific object and doesn’t understand anything about the context of the whole scene.

AI weapon classification software doesn’t see a kid playing with a water gun because it doesn’t know what a kid or playing is. The classification model only sees an object with a high probability of matching the characteristics of guns in its training dataset.
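To make that concrete, here is a minimal sketch (the labels, scores, and threshold are all hypothetical, not taken from any real product) of the kind of decision these systems reduce to. The model outputs only a per-object confidence score; there is no input representing "kid," "playing," or any other scene context, so a toy that shares a gun's visual features scores much like the real thing.

```python
# Hypothetical alert logic for a weapon-classification pipeline.
# The model only produces a "gun-likeness" score per detected object;
# it has no notion of scene context (a kid, a game, a toy).
ALERT_THRESHOLD = 0.8  # assumed cutoff for illustration

def should_alert(detections):
    """Return the objects whose gun score crosses the alert threshold."""
    return [d["id"] for d in detections if d["gun_score"] >= ALERT_THRESHOLD]

# Gun-shaped objects can score nearly as high as an actual firearm,
# because all the classifier sees is visual similarity to training images.
frame = [
    {"id": "actual firearm", "gun_score": 0.91},
    {"id": "water gun (toy)", "gun_score": 0.88},
    {"id": "umbrella", "gun_score": 0.83},
]
print(should_alert(frame))  # all three trigger an alert
```

Nothing in this decision rule distinguishes the toy or the umbrella from the firearm; lowering the threshold misses real weapons, and raising it multiplies false alarms, which is the bind the article describes.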

To get around this problem, some companies have human reviewers check and reject flagged images, which adds time, bias, and error that undermine the advantage of having AI instantly identify a threat.