Recently, national news buzzed about the San Francisco police's plan to deploy robots and permit them to use deadly force in some situations. Serious public concern ultimately led to restrictions on police use of robots and a pause on the plan.
However, this is not the only controversial technology that poses a risk to individuals' rights.
North Carolina community pushback on gunshot detection
Several Fayetteville residents have raised concerns about the local police department's contract with ShotSpotter and its use of the company's technology to detect gunshots. According to Carolina Public Press, ShotSpotter uses acoustic sensors and artificial intelligence (AI) to detect and locate loud noises within a three-square-mile area.
The worries primarily center on:
- Violations of residents’ rights
- Violations of privacy
- Racial bias
These worries are not unfounded. Studies have found that AI algorithms, because they are created by humans, can carry biases that influence how the technology works. Additionally, many institutions, including Stanford University, have highlighted the risks of misusing and overusing AI.
Even so, the most immediate danger such systems pose is an increased risk of facing criminal charges.
What are the risks?
In most cases, the discharge of a firearm is a felony in North Carolina. Simply firing a weapon in certain occupied areas could result in a Class D or Class E felony charge, punishable by up to 160 months in prison and significant fines.
An AI-powered detection system could increase the risk of facing such charges. Moreover, what happens if the "loud noise" the system detects is not, in fact, a gunshot? The police could still appear at your doorstep. In most cases, they will still need a warrant to search your property.
Even so, the use of AI systems could still put your rights in jeopardy. To protect yourself, it is crucial to fully understand your rights under the Constitution and North Carolina law.