San Francisco’s Municipal Transit Authority has announced it will deploy 288 surveillance cameras equipped with special software to monitor suspicious behavior in the city’s buses, trains, and 12 transport hubs.
The cameras will be outfitted with BRS Labs’ AISight artificial intelligence software, which is programmed to recognize “normal” behavior. Activities that fall outside those learned patterns are deemed suspicious and trigger a text message or phone call to MTA guards. Each camera is reportedly able to monitor 150 suspects simultaneously.
The AISight software issues real-time text alerts when cameras detect strange behavior. That raises concerns among privacy experts such as Harley Geiger, a senior policy counsel at the Center for Democracy & Technology, a nonprofit public policy organization that deals with technology’s impact on privacy. “It will lead to more harassment of individuals making minor mistakes or exhibiting unusual but completely legal behavior,” Geiger said.
‘They Will Feel Less Free’
David Gerulski, a spokesman for BRS Labs, says AISight can detect an object left behind, a car parked somewhere it shouldn’t be, or abnormal activity by people. Although ostensibly AISight is being used primarily to prevent terrorist attacks, the software can also detect a person who has fallen at the bottom of an escalator or off a rail platform onto subway tracks.
Geiger said AISight text alerts sent to security personnel in a post-9/11 world are not likely to be taken lightly. Nobody in law enforcement wants to be the person who ignored the text message that turns out to be the one that could have prevented an attack, Geiger noted.
Officers will be compelled, alert after alert, to check out completely legal activity, Geiger said. “That legal activity will feel a lot less legal when law enforcement checks up on it every time. This will change the way people behave. They will feel less free,” Geiger said.
Reasonable Privacy Expectations
The AISight software is being implemented at a time when the nation’s courts are still determining how technological advances affect citizens’ privacy.
Katherine Strandburg, a professor of law at New York University, says video surveillance of public places is essentially unregulated by statute. “If information from a video surveillance system like that in San Francisco is pieced together to track a particular individual, a warrant may be required,” Strandburg says, because such tracking might violate a reasonable expectation of privacy.
Gerulski, however, says that’s not an issue, because AISight “makes no attempt to identify specific persons or objects. The system does not recognize individual faces or read vehicle license plates.” He added, “As far as personal liberties go, AISight has no prejudices. It cannot profile. AISight cannot tell the difference between Catholic and Muslim. It does not have facial recognition algorithms.”
Instead, Gerulski said, AISight “learns,” which helps prevent false positives such as alerts about events that are not important. “It ‘learns’ that cars are allowed to move in one direction through a parking lot exit and not the other way,” Gerulski said. “It knows not to alarm on rain and snow because it sees this as something that is not abnormal.”
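Gerulski’s description amounts to frequency-based anomaly detection: the system counts what it repeatedly sees, treats common events as “normal,” and alerts on events far outside that learned distribution. A minimal sketch of the idea follows; all names and the threshold are hypothetical, and BRS Labs’ actual proprietary model is far more sophisticated.

```python
from collections import Counter

class NormalBehaviorModel:
    """Toy frequency-based anomaly detector (illustrative only)."""

    def __init__(self, min_fraction=0.05):
        self.counts = Counter()
        self.total = 0
        # Events rarer than this fraction of observations count as abnormal.
        self.min_fraction = min_fraction

    def observe(self, event):
        # "Learning" phase: tally how often each event occurs in the scene.
        self.counts[event] += 1
        self.total += 1

    def is_anomalous(self, event):
        # An event never seen, or seen only rarely, would trigger an alert.
        if self.total == 0:
            return True
        return self.counts[event] / self.total < self.min_fraction

model = NormalBehaviorModel()
for _ in range(100):
    model.observe("car_exits_lot_forward")   # the usual traffic direction
for _ in range(20):
    model.observe("rain")                    # weather becomes "normal" too

print(model.is_anomalous("car_exits_lot_forward"))    # frequent: not flagged
print(model.is_anomalous("rain"))                     # frequent: not flagged
print(model.is_anomalous("car_enters_lot_via_exit"))  # never seen: flagged
```

This mirrors the parking-lot example: a car moving the wrong way through the exit has never been observed, so it is flagged, while rain, seen often, is not.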
Where There’s Smoke
Gerulski says studies have shown a person can watch a monitor for about 22 minutes before becoming ineffective. If a site has 100 cameras, it is nearly impossible for a guard to observe every dangerous situation, he said.
“How about a group of people who suddenly disperse [throughout] an area?” asked Gerulski. “AISight can tell if smoke suddenly appears in a room, subway platform, or building. A guard watching 32 cameras at one time may miss these events. AISight truly helps public safety.”
A general rule of Fourth Amendment law sometimes permits warrantless searches in cases of “special need,” such as terrorism prevention, Strandburg notes. That special needs doctrine is why police can perform random backpack searches in the New York subway that would ordinarily require a warrant, and it could permit warrantless surveillance as well, she added.
“The special needs doctrine is a balancing test, so whether a ‘smart’ surveillance system qualifies depends on specifics: How much does it serve the ‘special need’ of terrorism prevention as opposed to the law enforcement need of investigation? How intrusive is it? How effective is it?” Strandburg said.
Tom Gantert ([email protected]) is senior capitol correspondent for the Mackinac Center for Public Policy in Midland, Michigan.