San Francisco Woman Pulled Out of Car at Gunpoint Because of License Plate Reader Error
A lawsuit over the use of license plate readers in San Francisco illustrates how dangerous it can be when police officers turn off their eyes, ears, and brains and rely instead on imperfect technology to tell them who’s up to no good.
On March 30, 2009, Denise Green, a 47-year-old black woman, was pulled over by multiple SFPD squad cars. Between four and six officers pointed their guns at her (one had a shotgun, she says) and told her to raise her hands above her head and exit her car. She was ordered to kneel, and she was handcuffed. Green, who suffered from knee problems, complied with all of their orders. Four officers kept their guns trained on her as she stood handcuffed, she says. Officers then searched her car and her person, finding nothing incriminating. After about 20 minutes, the police let her go.
It turns out that Denise Green was stopped because police, acting on a tip from a controversial piece of law enforcement surveillance technology, mistakenly thought she was driving a stolen car. A license plate reader had misread her plate and alerted officers that her car, a Lexus, was stolen. But if officers had performed the most basic visual check to confirm the reader’s information, they would have realized that her plate was not a match, and that the stolen vehicle in question was a gray GMC truck, not the burgundy Lexus she was driving.
The unfortunate chain of events began when SFPD officers received a license plate reader alert identifying Green’s car as stolen. The machine had misread her plate by a single character, seeing a 7 where there was actually a 3. Without having visually confirmed that Green’s plate was a match (it wasn’t), officers radioed in the plate number the reader had generated, not the number actually on Green’s car, and dispatch confirmed that a car bearing that plate had indeed been reported stolen. Dispatch also told them that the stolen car with the offending plate was a gray truck, not a dark sedan. Meanwhile, Sergeant Kim of the SFPD heard the radio chatter about the dark Lexus and the stolen car, saw Green’s car pass him, and began following her while radioing for backup.
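A hotlist lookup of this kind is, at bottom, an exact string match: the plate as read either hits or it doesn’t, with no built-in doubt about the camera’s own character recognition. Here is a minimal Python sketch of that failure mode; the plate values and function name are invented for illustration and don’t reflect any real ALPR vendor’s system.

```python
# A minimal sketch of an exact-match "hotlist" lookup, with invented plates.

STOLEN_HOTLIST = {"5ABC747"}          # hypothetical plate of the stolen gray GMC truck

def hotlist_hit(plate_as_read: str) -> bool:
    # Exact string match: the system has no notion of "almost" a stolen plate,
    # and no sense of how confident its character recognition was.
    return plate_as_read in STOLEN_HOTLIST

actual_plate = "5ABC347"              # hypothetical plate on Green's burgundy Lexus
misread_plate = "5ABC747"             # the camera reads the 3 as a 7

assert not hotlist_hit(actual_plate)  # a correct read would raise no alert
assert hotlist_hit(misread_plate)     # the one-character misread fires an alert
```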
While following Green’s car, Sergeant Kim confirmed that the first three characters on her license plate matched the stolen GMC truck’s plate, but he never checked whether the rest of the plate matched. Instead, he radioed the officers who had put out the initial alert about her car and confirmed that the dark Lexus had set off the system. Once backup arrived, Kim and the other officers pulled Denise Green over, guns drawn.
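Simple arithmetic shows how little a prefix check proves. In the rough sketch below, which assumes seven-character plates drawn from 36 possible characters and ignores real-world plate formats, more than 1.6 million distinct plates share any given three-character prefix:

```python
# Rough arithmetic on how weak a three-character prefix check is, assuming
# seven-character plates over 36 characters (A-Z and 0-9).

chars = 36            # possible values per plate position: A-Z plus 0-9
unchecked = 7 - 3     # plate positions never compared against the hotlist entry

print(chars ** unchecked)  # 1679616 plates share any given three-character prefix
```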
Green later sued the department, the city, and Sergeant Kim, alleging that they had violated her "Fourth Amendment rights on the grounds that the incident constituted an unreasonable search and seizure and a de facto arrest without probable cause and involved an unreasonable use of force," as well as bringing various similar claims under California state law.
A district court found against her, ruling that Sergeant Kim had reasonably assumed his colleague had visually confirmed Green’s plate number, and that the stop was therefore justified—as far as the officer knew at the time. Kim’s was a "good faith, reasonable mistake," the court ruled. "[N]o reasonable jury could find that Kim lacked reasonable suspicion to conduct an investigatory stop," the judge found.
This week the Ninth Circuit Court of Appeals disagreed and reinstated Green’s suit, finding that a reasonable jury could indeed conclude her rights were violated. According to the three-judge panel, which unanimously supported the suit’s reinstatement, "the question [is] whether it was reasonable as a matter of law for Sergeant Kim to effect the stop without making an independent visual verification of the license plate."
That’s a hugely important question, and it will only grow more pressing as law enforcement agencies across the country adopt technologies that help determine whom police stop, for what alleged offenses, and when. With tools like license plate readers and pre-crime ‘intelligence-led’ policing algorithms, officers are relying more and more on computers to tell them who is dangerous, who is wanted for crimes, and who is suspect. Police officers can make mistakes all on their own, without any help from malfunctioning technology, but the buck must stop with the flesh-and-blood officer on the scene.
As Denise Green’s frightening experience with SFPD officers attests, information that comes out of a computer is not always accurate or reliable, and should never be assumed to be, particularly when high-stakes stops and drawn guns are involved. Real human beings need to make sure the information fed to them by increasingly complex computerized policing systems holds up against what they can see in the real world, right in front of them.
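In software terms, that rule is easy to state. The sketch below is purely illustrative, with invented field names and values; it encodes nothing more than the requirement that a machine-generated hit be corroborated by direct observation before anyone acts on it.

```python
# A minimal sketch of the cross-check argued for above: refuse to act on a
# machine hit until everything the officer can directly observe agrees with it.
# Field names and values are invented for illustration.

dispatch_record = {"plate": "5ABC747", "type": "truck", "color": "gray"}       # the stolen GMC
observed_vehicle = {"plate": "5ABC347", "type": "sedan", "color": "burgundy"}  # Green's Lexus

def hit_is_corroborated(record: dict, observed: dict) -> bool:
    # Every observable field must agree before the hit can justify a high-risk stop.
    return all(record[field] == observed[field] for field in record)

print(hit_is_corroborated(dispatch_record, observed_vehicle))  # False: no basis for the stop
```

In Green’s case, the vehicle type and color alone would have failed such a check, long before anyone drew a weapon.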
Just because a computer tells you something doesn’t make it so. When police are wearing Google Glass-style headgear and scanning faces in a crowd for supposed threats, that will be all the more important to remember.
Originally posted on the ACLU of Massachusetts Privacy SOS blog.