Today we travel to a future where your face is constantly being scanned and tracked everywhere you go.
Guests:
- Brandeis Marshall — data scientist & professor at Spelman College
- Matt Cagle — attorney, ACLU Northern California
- Keith Kaplan — councilman, Teaneck, NJ
- Ankita Shukla — PhD candidate, IIIT Delhi
Further Reading:
- Man arrested twice for bank robbery sues Denver, police and FBI for $10 million
- TALLEY v. CITY AND COUNTY OF DENVER
- Face Recognition primer by EFF
- Facial recognition is increasingly common, but how does it work?
- FTC: Best Practices for Common Uses of Facial Recognition Technologies (2012)
- Facial recognition’s ‘dirty little secret’: Millions of online photos scraped without consent
- The FBI Has Access to Over 640 Million Photos of Us Through Its Facial Recognition Database
- Why facial recognition’s racial bias problem is so hard to crack
- Facial recognition gives police a powerful new tracking tool. It’s also raising alarms.
- Gender and racial bias found in Amazon’s facial recognition technology (again)
- Facial Recognition Is Accurate, if You’re a White Guy
- When the Robot Doesn’t See Dark Skin
- Amazon’s Face Recognition Falsely Matched 28 Members of Congress With Mugshots
- The NYPD uses altered images in its facial recognition system, new documents show
- San Francisco’s facial recognition technology ban, explained
- “Stop Secret Surveillance” ordinance
- San Francisco Is Right: Facial Recognition Must Be Put On Hold
- ACLU Community Control Over Police Surveillance
- Primate Face Identification in the Wild
- Face Recognition: Primates in the Wild
- Facial recognition tool tackles illegal chimp trade
- Emerging technology could identify cattle through facial recognition
- Facial recognition tool ‘could help boost pigs’ wellbeing’
- Does A Pregnant Giraffe Deserve Privacy?
Actors:
- Evan Johnson as Mr. Morton
- David Romero as David
- Ash Greenberg as Ash
- Santos Flores as Santos
- Charlie Chalmers as Charlie
- Grace Nelligan as Grace
- Ava Ausman as Ava
- Sidney Perry-Thistle as Sidney
- Arthur Allison as Arthur
Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky. Special thanks to Veronica Simonetti and Erin Laetz at the Women’s Audio Mission, where all the intro scenes were recorded this season; check out their work and mission at womensaudiomission.org. Special thanks also to Evan Johnson, who played Mr. Morton and also coordinated the actors of the Junior Acting Troupe who play the students in the intros this season. Here’s a fun fact about the intros this season: the actors, the teens, actually wrote their own scripts. I assigned them sides, and they had to come up with their arguments, so I actually did not know what they were going to say until we got into the studio.
If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.
And if you want to support the show, there are a few ways you can do that too! Head to www.flashforwardpod.com/support for more about how to give. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help.
That’s all for this future, come back next time and we’ll travel to a new one.
FULL TRANSCRIPT BELOW
▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹
Rose: Hello and welcome to Flash Forward! I’m Rose and I’m your host. Flash Forward is a show about the future. Every episode we take on a specific possible… or not so possible future scenario. We always start with a little field trip to the future, to check out what’s going on, and then we teleport back to today to talk to experts about how that world we just heard might really go down. Got it? Great!
This episode we’re starting in the year 2060.
***
[Kids Chatting]
Mr. Morton: Okay! Alright, it’s 3:35, a few more stragglers are going to be joining us potentially a little later here, but find your seats. I want to start relatively on time today because it’s our first real debate day! Very exciting, yeah!
So, last week you heard about the five cases we’re going to argue about this semester, and you all picked out of the HAT OF DOOM [kids cheer] to determine what sides you’re on. I know some of you didn’t love which side you got, but the point here isn’t that YOU believe that you’re right, it’s that you can convince OTHER people that you’re right. So even if you’re arguing for something you don’t believe, try to be convincing, okay?
Oh yeah, remember these are real cases from history, all the stuff we’re doing this semester, this all really happened.
Okay, case one, the cheating case. This one is from 2020. A woman is caught cheating on her wife via facial recognition, via the systems around town. The victim files for divorce. The cheater sues the city, arguing that she should have the right to privacy moving around in her own town, and that there shouldn’t be so much facial recognition around town. Who is right?
So, David, Ash, you guys are up.
[they walk up to the mics]
Let’s start with David, are you ready? Go for it!
David Romero: Anyone has the legal right to videotape you in public. Why is facial recognition so different? The answer is: it’s not.
In fact, it’s not that different from many identification systems we have already. When you go to the airport, some dude looks at your ID, then looks at your face, and then, if they match, lets you through. That could be automated by facial recognition. At bars, people check your face against an ID to determine you are who you say you are and are over 21, and let you through. In police work, police try to identify criminals that they have pictures of to catch them. All of that
Mr. Morton: No phones in class please.
David: could be automated by facial recognition.
Now you might be scared about error rates: what if the facial recognition does not identify you correctly? Well, in that case, all that would happen is the same thing that happened before facial recognition: an airport staff member would check your face against an ID, a bartender would check your face against an ID, or a police officer would investigate the lead that they got from facial recognition, and find out that it wasn’t you who committed the crime. Or they find out that you did.
Because facial recognition has already been used to catch a lot of criminals. There are so many cases where, without facial recognition, the culprit would still be on the run. Facial recognition can save lives of future victims and catch current criminals. And it does. Currently.
It’s at the point where a ban just wouldn’t work. You can’t put the genie back in the bottle; you can’t ban a new technology that has so much incentive for use in both the private sector and the government. And if you do ban it, and the companies still use it but secretly, then they will still be able to track you if you are cheating on your wife, they just won’t tell you how they did it. Unless, of course, you have the money to stop them, the money to stop the companies from tracking you. Maybe you even own some of those companies.
The powerful would be free from facial recognition and the average American would still be tracked, just secretly. However, if we didn’t ban it, we could instead have everyone, including people with money and power, registered to a face database, and then regulate how that database is used. We could ban only very specific uses of the technology and require companies and government to be very transparent with how their facial recognition is used. We can make this world run faster, safer, and more transparent if we don’t try to shut this technology down in a futile attempt to stop what is already here, and get rid of what is woven into our society. We can’t, and shouldn’t, stop this technology, but we can shape it.
Thank you.
Mr. Morton: Okay thank you David, very interesting and very researched, I appreciate that, thank you.
Ash Greenberg: That was good. Spoiler alert, my speech is not that good.
Mr. Morton: Okay, Ash, let’s have you give your side please
Ash: Okay, hello! People who are watching this, in this classroom, anyway. So, we all know the topic we are discussing is facial recognition! I will be arguing for the side that we should not be using it, alright? Now, let’s get started. Alright, so my first point is that it can be used to pretty much stalk people. Having facial recognition in all places has the potential for people to hack and use it. According to an article in Whole Earth, the potential for abuse is astronomical: pervasive automatic face recognition could be used to track individuals wherever they go. Systems operated by different organizations could easily be networked to cooperate in tracking an individual from place to place, whether they know the person’s identity or not, and they can share whatever identities they do know. This tracking information can be used for many purposes. This shows that facial recognition has the potential to be abused in ways that could harm others.
Anyway, so, my second point is that facial recognition systems have the potential to have major flaws and errors, which can cause innocent people to possibly be charged with things like crimes that they didn’t commit. According to another article on Geek.com, facial recognition technology trialled by the Metropolitan Police is reportedly 81 percent inaccurate. The system, according to a study by the University of Essex, I don’t know how to pronounce it, mistakenly targets four out of five innocent people as wanted suspects. So with a very high rate of failure, it seems like facial recognition systems are not very reliable.
My third and final point against using facial recognition systems is that it’s pretty discriminatory, in the fact that it has a harder time identifying the faces of women and people of color. According to, yet again, another article, although the tech is pretty good at identifying white male faces, because those are the sort of faces it’s been trained on, it often misidentifies people of color and women. That bias could lead to them being disproportionately held for questioning when law enforcement agencies put the tech to use. So even if you are going to use facial recognition systems to identify criminals and identify people and things like that, they better be white males, otherwise there’s a high chance that it will not work.
And in conclusion, that is why facial recognition technology should be banned. Thank you.
Mr. Morton: Wow, okay, great, great, great, great. So, uh, next you each get to ask each other one follow-up question. And, um, uh, why don’t you go first, Ash, and David, you can respond.
Ash: Alright, so, my question is: how do you expect to catch criminals using facial recognition if it has an 81 percent fail rate?
David: Uh, well, the thing about, uh, catching criminals is, like, in some of the articles that you were talking about, or at least some articles that I’ve read, like, the police chiefs and stuff, they said, we do not, uh, arrest people solely based on facial recognition. Which makes sense. Uh, before facial recognition, if you had a picture of the suspect you’d still try to ID them; this is just more of an automation of that process. Um, as good police officers, you follow up on that, and so it gives you leads when there are none, but that doesn’t mean that, oh, if it says that this person has the same face as the criminal, you automatically arrest him. You go and investigate, you don’t just rely on the facial recognition. However, the facial recognition can still produce really good leads, uh, that when you do follow up, when you do investigate, then you’ll find that it is a good lead, that, um, that person is the person you’re looking for.
Mr. Morton: Alright, absolutely, thank you, very clear and concise. David, it’s time for you to ask your question to Ash.
David: Okay, I have to choose between two, because I thought that we had two.
Mr. Morton: One question just because of time.
David: Okay, um, what would you tell the victims of the crimes that could have been solved by facial recognition, and how do you justify letting the culprits go free?
Ash: Well… there were times when people, hm, did not have facial recognition and there were, and criminals were caught, and also… yeah… yeah! End of sentence.
David: There were also times when people didn’t have medicines that saved many lives and yet some people still lived until they were like forty.
Mr. Morton: That is true.
Ash: Yeah
Mr. Morton: You guys, thank you for being our first guinea pigs and starting our semester off with an incredibly lively and well-researched debate. Let’s give it up for Ash and David.
Ash: Good job, you did good.
[applause]
***
Rose: Okay, so this season is all about the future of CRIME, and we are kicking this season off with facial recognition, a technology that you’ve probably heard a lot about recently. I feel like I can’t finish this episode fast enough because there are constantly new cases to talk about here. And in fact, I did a facial recognition episode way back in 2015, in the first season of this show. I’ll link to that episode in the show notes. Over the past four years, a lot has changed both on Flash Forward and on the facial recognition front, so I think it’s time to revisit the topic, and talk about how our faces might or might not be used against us in the future.
Oh, and a quick note about the intro that you just heard: this season I’m doing something different with those fictional scenes from the future. Stick around to the end of the episode and I’ll explain more about that. It’s fun, and weird, and I’ll play you some surprise tape.
Now, since this is the CRIME mini-season, let’s start… with a crime.
[Dramatic music]
Just kidding; this mini-season is about crime, but I promise I will not go full-on true-crime melodramatic on you. We won’t be talking about any gruesome murders or serial killers, and I’m not going to do any Nancy Drew cosplay and try to solve a cold case and somehow make the entire series about my personal journey instead of the victims. If you love true crime, there are plenty of podcasts out there for you.
But we are going to start every episode this mini-season with a case. And today’s is a robbery.
At 11:57am on May 14, 2014, there was an armed robbery at a bank in Denver. There’s surveillance footage of the robbery: the perpetrator is wearing a black baseball hat and a red windbreaker. And he’s almost caught; the security guard at the bank tries to grab the robber as he leaves, but he trips and falls and our thief gets away. For months, police have no leads on this case. Then, our robber strikes again. On September 5th, he robs another bank in Denver. Again, he’s caught on video surveillance cameras. This time he’s wearing sunglasses, another black baseball hat, and a black jacket.
The Denver police shared images from both those surveillance videos with their Crime Stoppers network, basically asking the community for tips, seeing if anybody recognized this guy. And after that second robbery, two different anonymous tipsters called in with a name: Steven Talley. From there, they found Talley’s estranged ex-wife, and showed her the pictures, and she said, yep, that’s my ex-husband.
So the police arrested Steven Talley. And Talley says they arrested him pretty brutally, broke some ribs, knocked out some teeth, it was the kind of break down the door arrest that you see on TV crime shows.
The problem is, Talley had an alibi. He worked as a financial analyst for a company called Transamerica Capital. Here is Talley talking to the local CBS station in Denver.
Steve Talley: My job was to be on the phone all day. That was my job, to sell mutual funds. I knew that they were going to have a recorded conversation because I was on the phone all day.
Rose: And indeed, his company did have a recording of a call from him at the exact time of the first robbery.
Steve Talley: Hi Nicole, Steve Talley with TransAmerica, how are you doing today?
Rose: Plus, when the detectives showed a mugshot lineup to the bank tellers and the security guard from the first robbery, they couldn’t identify Talley. So after two months in jail, he was let go.
But the story doesn’t end there. For some reason, the Denver police were convinced that Talley had robbed at least one of these banks. So they couldn’t get him for the first one, but they tried again to charge him for the second bank robbery. And this time, they used facial recognition.
In April of 2015, Detective Jeffrey Hart of the Denver Police Department requested that the FBI forensic analysis unit do a facial recognition comparison between Talley and the surveillance footage that they had. Now, I want to be clear: this facial recognition analysis was manual, as in, done by a human, not a computer. But it’s the same basic method, measuring distances and elements of the face to determine if someone is a match. The FBI analysis didn’t find a perfect match, but they did find, quote, “multiple corresponding characteristics such as the overall shape of the head, jaw line, chin, nose, lips and mouth, moles/marks on the face and neck and specific features in the left ear.” Ultimately the analysis concluded that the person, quote, “appears to be Talley,” end quote.
So, a year after getting out of jail for one robbery he didn’t commit, Talley was arrested again for the second robbery. But he didn’t do that one either. Cell phone records showed that he was at a food bank during the second robbery, getting food. And the bank teller from that second bank who had originally identified Talley as the perpetrator, Bonita Shipp, retracted her statement. Here she is talking to KDVR in Denver:
Bonita Shipp: He was an innocent man, he didn’t do it.
Rose: Bonita had noticed something about the robber’s hands when she was giving him the money.
Bonita Shipp: He had worn surgical gloves, which you could see through the gloves and you could see that he had marks like moles or brown spots on his hands.
Rose: Talley didn’t have those marks on his hands. And his face had an important difference too. Talley has a mole on his cheek, and the robber doesn’t. Oh, and Talley is three inches taller than the robber on the video. It wasn’t him, still.
And so on April 21st, 2016, two years after the first robbery, all charges were finally dropped against Steve Talley. But being charged with two bank robberies, spending months in jail while you wait to be acquitted, that can ruin your life.
Steven Talley: Destroyed my life, destroyed my family. I haven’t seen my kids. Destroyed my career, destroyed my health, almost killed me. So what did it not do? And I’m still living like this, still living on the streets. I still haven’t even gotten an apology. They know they got the guy the first time, I still haven’t gotten a simple apology.
Rose: Talley is currently suing the city of Denver for how they handled his case, and that lawsuit is ongoing. I tried to get Steve Talley and his lawyers on the show, but I did not hear back from them, probably because the case is still active.
Today, five years after Talley was mistakenly identified by facial recognition techniques, a lot has changed on this front. Our faces are being scanned constantly — your phone might use your face as a biometric password to unlock. Students in school all over the world are being watched on cameras equipped with facial recognition systems. Airlines are asking people to scan their face to get on the plane. Amazon is selling their facial recognition system to police departments across the country.
Now, you might be like “well wait, Steven Talley’s case isn’t about the same kind of facial recognition as the stuff you just mentioned.” But I wanted to start with this story to show that facial recognition, even when done manually, by a person who is in theory paying attention to every detail, can be wrong. And being wrong about this kind of thing can have huge implications. It can ruin people’s lives. And now, we’re taking that system, and we’re removing humans from the equation.
But we’re getting ahead of ourselves. Let’s first establish how facial recognition works.
Brandeis Marshall: In essence, what it does is try to find a map for your face: the distance between your eyes, the distance between your eyes and your nose, the distance between your nose and the top of your lip, between the lips and the chin.
Rose: This is Brandeis Marshall, a data scientist and professor at Spelman College.
Brandeis: So it’s trying to find markers on your face, to identify where all the main portals of your face exist, and finding distance. And then trying to take how your face is mapped digitally and compare it to other faces that have been mapped digitally.
Rose: So these systems look at pictures or videos of your face, measure a bunch of stuff to turn your face into a series of numbers, and then compare those numbers to other faces the system knows. Now, we’ve talked about algorithms on this show before, so you probably already know this, but to train an algorithm to do something like this, you need to feed it data to learn from. So if you want it to be able to recognize faces, you need to give it faces. A lot of faces.
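If you want a concrete picture of that measure-then-compare step, here’s a minimal sketch using the open-source face_recognition Python library, which turns each face into a 128-number encoding and compares encodings by distance. To be clear: this is a toy illustration of the general idea, not the code any vendor or police department actually runs, and the image file names are made up.

```python
# A toy sketch of the measure-then-compare pipeline described above, using
# the open-source face_recognition library (pip install face_recognition).
# The image file names are hypothetical stand-ins.
import face_recognition

# Step 1: turn each face into a series of numbers (a 128-dimensional encoding).
known_image = face_recognition.load_image_file("mugshot.jpg")
unknown_image = face_recognition.load_image_file("surveillance_still.jpg")

known_encoding = face_recognition.face_encodings(known_image)[0]      # assumes a face
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]  # was detected

# Step 2: compare the numbers. A "match" is just the distance between two
# encodings falling under a tunable threshold; there is no certainty involved.
distance = face_recognition.face_distance([known_encoding], unknown_encoding)[0]
is_match = face_recognition.compare_faces(
    [known_encoding], unknown_encoding, tolerance=0.6
)[0]

print(f"distance: {distance:.3f}, match: {is_match}")
```

The entire decision reduces to a distance score and a threshold, and moving that threshold just trades false matches for missed ones. And to get a lot of faces to train these models in the first place, companies pull from some places that might surprise you.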
Brandeis: They’re pulling from datasets that are freely available, because all of us have taken a picture with the cameras on our phones and posted it online. So they are scraping the Internet: that’s Facebook, that’s Flickr, Twitter, or Instagram.
Rose: That’s right, you yourself are helping train these systems with your face! And sometimes, you’re actually helping train the system with other people’s faces, without their permission. If you’ve ever uploaded a group photo to Flickr, you probably didn’t ask all the people in that photo, “Hey, in a couple of years, is it cool if IBM uses your face to train their algorithms to create a facial recognition system that they will then sell to the United Arab Emirates so they can crack down on dissidents?” And yet, that could have happened. IBM did scrape images from Flickr, and they did sell their facial recognition system to the UAE, a dictatorship known for throwing critics into secret prisons.
Matt Cagle: Many of those people uploading their photos and giving them a sort of license were thinking that maybe somebody would take their photo and share it in a collage, or they would repurpose the photo for some other sort of creative art product.
Rose: This is Matt Cagle, an attorney at the ACLU in Northern California.
Matt: I think very few of those folks understood that their photos might be available for training facial recognition systems that governments and maybe law enforcement agencies might want to use.
Rose: Some researchers have called this image scraping the “dirty little secret” of facial recognition. Sometimes companies will literally go into Google and type in the name of a famous person, a politician, an athlete, a singer, and just download all the photos they can find of their face, to use in their algorithms. And even if you have a copyright to your images, that doesn’t always stop people from using your photo. Academics can, and do, use copyrighted photos to train their algorithms. They claim that because they’re not doing commercial research, they don’t have to respect copyright claims for your photos.
This is why people get nervous about things like FaceApp, the app that was recently making the rounds online, where you could see what you look like as an old person, or as a baby. If you look at the agreement you sign to use the app, you’re handing over access to your face for them to use for whatever they want.
Brandeis: I’m very particular about my own face, and what imagery is out there that shows my face. So I have, like, three images that I, you know, know are out in the ethos. Any other image, I’d be like, “Well, I didn’t give permission for that.”
Rose: Uploading photos to any app, with any technology company these days, requires a whole lot of trust. And that trust hasn’t necessarily been earned.
Brandeis: Historically, within at least the United States, I’m not going to talk about globally, but at least within the United States, there is a lack of trust between marginalized communities and the mainstream. So if someone’s going to use my face, I want to know how it’s being used, where it’s being used, with what frequency, and if they’re making money off of my face. And if they’re making money off my face, I want that money.
Rose: Now, not all photographs that these companies use are shadily scraped from the web. Some of them are taken specifically for training purposes. But those tend to be smaller, and more homogenous; as in, the photos tend to show a less diverse set of people. Which is a problem, because if you don’t have a diverse set of faces in your training data, then your system doesn’t learn how to recognize a diverse set of people in the real world.
Brandeis: Let’s say it’s all women. Then you’re going to look at the features of just women’s faces, and you’re gonna be excluding men’s faces, which have a different bone structure, and therefore a different way that the algorithm would tune, or not tune, to a male face.
Rose: And this gets us to probably one of the most famous problems with facial recognition: it’s not good at recognizing non-white faces. And that’s because the data it has is mostly of white male faces; one of the most commonly used datasets in this field is 75 percent male and more than 80 percent white. So if you’re a white guy, you have the best chance at the system matching you! Congrats, I guess? If you’re not a white guy… well… too bad for you.
What this means in practice is that when you actually plug these systems into the real world, they’re wrong a LOT. One analysis of the system that London’s Metropolitan Police uses found that it was wrong 81 percent of the time. The ACLU did an analysis of Amazon’s Rekognition system, and it mistakenly flagged 28 members of Congress as “matches” with people who have been arrested for a crime. And most of the politicians who were flagged were not white.
On top of these algorithms being wrong all the time, in most places there are basically no rules about how these systems can and should be implemented. In New York City, for example, the police once uploaded a photo of the actor Woody Harrelson into their system, because eyewitnesses said that the suspect looked like Harrelson. And based on this upload, they actually arrested someone, and charged them with petty larceny. In other cases, the New York City police would alter the images they had of suspects to make them work with the system. Remember, you need a photo that shows enough of the face for the algorithm to read; if some part of the person’s face is covered or blurry or in some way not compatible, it won’t work. And in a training presentation about these systems, the New York City police actually suggested that officers alter the images they had of potential suspects by copying and pasting pieces of stock images onto those photographs. So if the only photo they had of a potential suspect was one with an open mouth, they would go and get a closed mouth from another photo of a completely different person, paste it onto the picture, and then run it through the database to try and find a match. And that is totally allowed!
Now, when I first started reading about how bad facial recognition is, how wrong it is all the time, it sounded kind of weird to me. Because we’re not used to hearing about how bad a technology is. In fact, we usually hear about the opposite, when it comes to surveillance tech. We’re used to hearing about how GOOD this tech is, and that’s part of why it’s scary. We hear about how Facebook can identify us without even our faces, how cops can see through walls, or about how our phones are tracking our movements down to the footstep. We’ve sent a rover to Mars! Why… is facial recognition so hard to do?
Brandeis: I think there’s two prongs to this issue. The first one is just the cost of the technology. So yes, there are these, like, great algorithms, quote unquote, that are supposed to really help with facial recognition. But you have to have a really good system. You have to have very good high-def video, and that is not cheap.
Rose: This problem is slowly going away, as high def video cameras get cheaper and cheaper. But the second problem, that one isn’t going away any time soon.
Brandeis: The other issue is, people on the video move, right? So most of the experiments done, you have, like, a 2D face where the eyes are looking directly at you. You might have a side image of someone, or, you know, a profile image of a person, but it’s pretty much a 2D image. What happens inside video is that it’s 3D. And then the person is moving, and then you’re on a system that doesn’t have high def. There’s all these other factors that go into the actual video that aren’t accounted for inside of the research.
Rose: But I think, personally, that the argument about facial recognition gets a little bit too caught up on this question of it not being very accurate. Because that argument kind of assumes that if this worked 100% perfectly every time, it would be fine to use. And that… is not actually a given.
Matt: Even if this technology were perfectly accurate, even if you could identify all of us without making any errors, even if it could do it from a distance, even with blurry images, that is still a world that we shouldn’t want to live in. Because when the government can perfectly track us, the government has an outsized power to know who we associate with, where we go, and to enforce laws in ways that could be biased and that could impact particular communities. Particularly immigrants, activists, and people of color, who in the past have been disproportionately targeted by government surveillance.
Rose: And this is kind of the fundamental question about not just facial recognition, but tracking technologies in general. Should you expect to be constantly scanned and tracked everywhere you go? Do we no longer think that humans should expect privacy in public and private spaces?
Brandeis: What will happen in this environment of facial recognition by machines is that there are cameras everywhere now. Every business has some type of camera, and that means that now the person being screened has no idea that they’re being screened, and they have no idea that they may have a certain percent match as far as facial recognition. They don’t even know they’re a target. And when you don’t even know you’re a target, that means you have lost a certain amount of privacy. Going back to trust.
Rose: Do we trust law enforcement, or companies like Amazon or Facebook with this kind of power? Have they earned that trust from us? Most people from marginalized groups would say… no! Not even a little bit! And for good reason. Facial recognition has already been used by police during Black Lives Matter protests, and to track and out sex workers. Trans people also worry that the system might be used to out them to employers, friends, or enemies and harassers.
And in fact, it’s not just marginalized groups who are wary of this kind of surveillance. In a March 2019 poll that the ACLU commissioned, they found that 76 percent of California voters strongly or somewhat supported a law that would “require public debate and a vote by lawmakers before any surveillance technology is obtained or used by government and law enforcement.” And 82 percent of Californians said they strongly or somewhat disagreed with this statement: “Government should be able to monitor and track who you are and where you go using your biometric information.” The poll also found that these opinions did not break along party lines; both Republicans AND Democrats felt the same way. There is an overall consensus, at least in California, that this surveillance technology is suspect.
And as a result, some cities have decided to ban this technology from use by police and government entities. And when we come back we’re going to visit two very different places, and hear how they’re tackling facial recognition technology head on. Face on? Whatever… first a break.
[[AD BREAK]]
Rose: So let’s recap what we’ve learned so far about facial recognition: it’s biased, often inaccurate, and even if it worked perfectly would still plug into our biased and problematic society that already over-polices certain communities.
Matt: Even if face surveillance technology becomes perfectly accurate, we have to remember that it will be deployed in a very imperfect world.
Rose: That’s Matt Cagle again, with the ACLU.
Matt: We think facial recognition will supercharge the over-enforcement of laws in communities that are already subject to biased law enforcement practices, that are already living with a significant police presence in their neighborhoods.
Rose: Just to give you one statistic here, out of MANY: studies show that black and white people in the United States are equally likely to smoke weed, but black Americans are almost four times more likely to be arrested for marijuana possession. That’s because cops are more likely to patrol those communities, and are more likely to arrest a black person for possession, where a white person might just get a verbal warning.
Now, one argument you hear a lot in regards to law enforcement and technology is: well, if you have nothing to hide, you’re fine. Only people who are doing crimes should worry; if you don’t do crimes, you shouldn’t have anything to worry about. We’ve talked about this argument before on the show, and again I’ll recommend the book Nothing to Hide: The False Tradeoff between Privacy and Security by Daniel Solove. But here’s what Matt had to say about that line of thinking:
Matt: It’s not about having something to hide, it’s about having something to protect. In some communities, that’s going to be your ability to go to a political protest without being tracked and identified as present by local law enforcement who want to put you on some sort of list of protesters or activists.
Rose: Conversations about privacy and governments aren’t new, but what is new is a series of ordinances banning facial recognition around the country. And earlier this year, the ACLU actually succeeded in pushing through a new law in San Francisco called the “Stop Secret Surveillance” ordinance.
Matt: It is a ban. It says San Francisco, at the heart of Silicon Valley and technological innovation on the west coast, has decided to step back and draw a line in the sand for this very dangerous technology.
Rose: And it’s notable that this is the first ban on facial recognition in the entire country.
Matt: This is a city government and a population that is really technically talented and understands how this technology works. So I don’t think it’s a surprise that the supervisors at the San Francisco Board of Supervisors saw the dangers coming down the road with this technology and decided to draw a line in the sand.
Rose: The ban is coming from inside the house, so to speak.
Now, the ban only applies to certain organizations in San Francisco:
Matt: So the San Francisco surveillance technology ordinance applies to city departments. So that will be law enforcement agencies, the Recreation and Parks Department, any agency or actor working for the city or county government. But there is an entire additional industry of corporate players who want to be using face surveillance technology to scan your face when you walk into a store, when you enter a stadium.
Rose: The gym, or the mall, or the private school your kid goes to: they can all use facial recognition. This ordinance doesn’t stop them. But Matt is actually hopeful that public opinion has turned against facial recognition enough that people might actually demand that these private businesses stop doing that.
Matt: And there’s going to be a lot of pushback, I think, against companies who seek to secretly scan people’s faces when they walk into a store, and then use that to make money or to serve ads.
Rose [on the phone]: So you’re actually hopeful, that people are going to reject this.
Matt: Oh yeah.
Rose: The San Francisco ban was passed nearly unanimously, eight to one. And it sparked a lot of conversation and a handful of similar measures: nearby Oakland banned the technology right after San Francisco. Berkeley is currently considering a similar ordinance. Somerville, Massachusetts passed a ban unanimously at the end of June.
Matt: You also see states proposing legislation to regulate and rein in facial recognition technology. California’s considering a ban on the use of facial recognition technology with officer-worn body cameras. Massachusetts is considering a moratorium. Michigan and New York are also considering, or at least have proposed, legislation.
Rose: Another place inspired by the San Francisco ban is a much smaller town, in a much smaller state.
Keith Kaplan: We’re in Bergen County. We’re about five minutes, without traffic, from the George Washington Bridge, just outside New York City. We’re a municipality of about 40,000 people.
Rose: This is Keith Kaplan, and he’s a councilman in Teaneck, New Jersey, not that far from where I grew up. And Keith is working on introducing a facial recognition ban in his town as well.
Keith: I knew I wanted to do it, and I wasn’t exactly sure the best way to do it. And once I saw the ACLU working on it with San Francisco, obviously not having to recreate the wheel from scratch is a wonderful thing.
Rose: But Teaneck, New Jersey and San Francisco, California are pretty different places.
Keith: The San Francisco rules, for those that don’t know, are extremely complicated, and, I mean, look, they’re a city of millions of people; it’s going to require layers of complexity. We have 40,000 people, and, you know, our police department is one hundred people strong.
Rose: For comparison, San Francisco employs over 2,000 police officers. Keith’s version of this measure has three parts. First, he wants to ban the use of facial recognition for now.
Keith: The tech doesn’t really work at the moment. It targets a lot of the wrong people.
Rose: Second, he wants a public comment period before this kind of technology can be adopted in town in the future. And third, if it does get used, he wants there to be a specific plan for how to measure its impact. And this is something that rarely happens when places implement a fancy new technology. Take automatic license plate readers, for example. There are cameras on most cop cars, and even on those little parking enforcement buggies that go around town. And those cameras are constantly scanning license plates and checking those plates against a database of stolen cars or getaway vehicles. Precincts all over the US installed these systems, which cost a decent amount of money.
Keith: And then years later, when I asked, “What is the revenue coming in, you know, for the readers? Are they paying for themselves? Are we finding a lot of these out-of-date registrations, and, you know, are cars that are vastly out of date and causing all this pollution now getting registered and processed? Are we finding people with these warrants out for them?” It turns out we weren’t keeping records based on it in a way that can be reviewed to see if these things were meeting their objectives.
Rose: Keith wants to avoid that with facial recognition: if they ARE going to use it, they have to actually have a way to track whether or not it is working.
Just as a quick aside, these automatic license plate readers are also systems that can get things wrong. In fact, just a few weeks ago I came home and there were four cop cars parked on my street at weird angles, like they had raced there and skidded to a halt. It turned out that a parking enforcement buggy’s license plate scanner had mistakenly matched the plates on my neighbor’s car with a stolen vehicle, which triggered an alert and sent four cop cars to the house. And this isn’t an isolated incident.
Keith: There was one last year, in California, where they had the right number, but it was just bad data. A rental car had been listed as stolen, and they never updated it when it was brought back. This guy Brian Hofer gets pulled over, and police are holding him at gunpoint. And, you know, in 2014 there was an LAPD license plate reader that confused this guy Arthur Eric’s license plates; it was the right number, but it was the wrong state.
Rose: And I’ll just say that, if a license plate reading system can’t get things right, how do we expect a facial recognition system to be better? Also, wouldn’t you think that a human might review the license plate in question and double-check before sending a whole slew of cops? I don’t know, just a thought.
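To see why a bare database hit can go so wrong, here’s a rough sketch of the hot-list lookup a plate reader performs. The plate numbers, records, and schema are invented for illustration, and real systems add OCR on camera frames and much larger databases, but the core check really is this simple.

```python
# A rough sketch of an automatic license plate reader "hot list" check.
# The plate numbers, records, and schema here are invented for illustration.
from typing import Optional

# Plates flagged as stolen. In the rental-car case Keith describes, an entry
# like this was simply never removed after the car was returned.
HOT_LIST = {
    "ABC1234": "reported stolen (record never cleared)",
}

def check_plate(plate_number: str, plate_state: str) -> Optional[str]:
    """Return an alert reason if the scanned plate number is on the hot list."""
    # Note what this lookup ignores: the plate's state, the car's make and
    # model, and whether the record is still current. A bare number match
    # is enough to fire an alert.
    return HOT_LIST.get(plate_number)

# Right number, wrong state, stale record: the alert fires anyway.
alert = check_plate("ABC1234", "CO")
if alert:
    print(f"ALERT: {alert}")
```

Nothing in that lookup encodes human judgment; any double-checking has to happen after the alert goes out, which is exactly the review step missing in the cases above.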
Anyway. Right now, Keith is working on basically taking the San Francisco rule, and stripping it down to its most essential parts to introduce in Teaneck.
But what about bad guys, you might ask? Don’t we want to catch bad guys? Doesn’t this help catch bad guys? In the intro you heard David argue this, right? What would you tell all the families of people who might never get justice because we didn’t use this technology? The problem is that we don’t actually have a lot of evidence that facial recognition helps us catch more bad guys than we already do without it. Police departments already have pretty advanced techniques for tracking someone or trying to locate a person. Here’s Matt Cagle again.
Matt: San Francisco stood back and said, “You know what we can do? We can keep the community safe, and we can enhance public safety, without recklessly deploying a new surveillance technology in the community.”
Rose: And even if this technology did work to find some extra baddies, it might still not be worth using, given the potential downsides.
Keith: We have rights, and I’m sure it would be easier to go and track every single person 24/7, 365, to find all the bad people. But we don’t do that here. We tell people that they have the rights that are listed in the Constitution, and we need to abide by that in our deeds and in our words.
Rose: Keith is going to introduce the new rule to the town council in September. And he encourages you to tune in, if not to the Teaneck meetings, at least to your own:
Keith: I mean, let’s face it, there are not a huge amount of people that rush out to town council meetings across the land, although if anyone’s listening to this, seriously, it’s must-see TV. And if they’re not broadcast, ask your local legislators to at least give you internet streaming of their meetings. The open comment period is worth it at least. My wife loves watching.
Rose: I’ll be keeping an eye on what’s going on in Teaneck and I’ll keep you all up to date on how things go. And if you are interested in introducing a facial recognition ordinance in your town, the ACLU actually has some resources to get you started. They have a program called Community Control Over Police Surveillance, or CCOPS, that includes model bills and plans. I’ll link to that in the show notes.
Okay, we’re going to take one more quick break, and then we’re headed to India, to talk about a completely different kind of facial recognition. But first, another word from our sponsors.
[[AD BREAK]]
Rose: Alright, facial recognition, we’ve covered how it works, how it can be used, where it might fall down, and why some cities have banned it. But there’s another application of facial recognition that I am really fascinated by. You see, humans are not the only creatures with faces.
Ankita Shukla: In the case of primates, we see that we have very similar face arrangements. Like, you have a flat surface, you have two eyes, one nose, one mouth.
Rose: This is Ankita Shukla. She’s a PhD student at IIIT Delhi, in India. And she works on applying facial recognition to primates. And the basic principle is the same here: they have faces just like we have faces. And the challenges are similar too: you need good photos of those faces to train your model on.
Ankita: Capturing that kind of data is pretty difficult. When you are capturing these images in the wild, in the case of monkeys or any other primate, they are pretty much moving everywhere. You have to make sure that when you’re clicking those pictures you get enough information showing in the picture; at least half of the monkey’s face should be visible when you are clicking such pictures.
Rose: If you’ve ever tried to wrangle a small child into taking a passport photo, you know this struggle.
Ankita: Guy, can you just sit down so I can take different pictures of you in different poses? That just cannot happen.
Rose: Now, you might be asking: why? Why would we want to use facial recognition on monkeys? Are they stealing things from the corner store? Did monkeys rob that bank in Denver? No, not yet, but kind of. You see, monkeys are a problem in some parts of India.
Ankita: There’s one species of monkey, which is rhesus macaques, which are known to be overpopulated in India because they can just adjust to urban settings. They can come to your houses, steal food. They can eat anything that humans can eat.
Rose: Rhesus macaques in some parts of India are EVERYWHERE, and they’re not afraid of humans at all. According to one organization that represents farmers, these monkeys cause up to $400 million in crop losses and diverted labor every year. People who live in Delhi report coming home to macaques basically living in their houses: opening up their refrigerator, eating all their food, leaving a mess. Oh, and they bite. Basically it’s like having a pack of furry teenagers swinging in and out of your windows all the time. And they must be stopped.
Ankita: So they wanted to start a sterilization operation, and in that case…
Rose: So these organizations would go out, catch monkeys, and sterilize them. But then they encountered a problem: how do you know if you’ve already sterilized a monkey?
Ankita: For now, the way they were doing it, they were, like, going manually, picking up a monkey, doing the sterilization, and again leaving it, putting up a mark on the body of the monkey. But we do not want to do that to any animal. We definitely do not want to leave a mark on the body of the animal.
Rose: They also want to minimize the interactions people have with monkeys, because, remember, they bite, and transmit herpes, and also they get really stressed out by being handled. So they turned to Ankita and her team and asked them if they could develop a facial recognition system for these macaques.
Ankita: You scan towards a monkey and you say, oh, this guy is John, and this is already sterilized.
Rose [on the phone]: Do they have names?
Ankita: You can name them. It’s fun to name them.
Rose: Ankita recently published a paper where they used photos taken from a wildlife preserve and from zoos to build a facial recognition algorithm that could identify two different types of primates: rhesus macaques and chimpanzees. And they got it to work really well.
Ankita: Ninety-eight percent of the time, we were able to correctly identify that this is that monkey.
Rose: So in the case of the macaques, the idea is to build a system that can identify whether you’re looking at a sterilized macaque without having to interact with it. But in the case of other primates, like chimpanzees, it’s more for biological research purposes. If you install camera traps around the forest equipped with this kind of facial ID system, you could get a better sense of which animals are moving where, without having to radio-collar them or try and track them yourself through the forest. And there are teams hoping to use facial recognition to catch poachers who are trying to illegally smuggle chimpanzees. The idea is to tag chimps in the system, so that when photos of these chimps show up on social media, chimp rescue groups can catch them. And yes, that is a thing that happens! People post photos of their pet chimpanzees on Facebook!
Now, if you’ve been listening to the show for a while, you know that I’m… sort of obsessed with animals, and how we think about animals, and how we talk about the ways that animals and technology interact. And this primate face system got me thinking about these questions again. This whole episode has been about how important it is for us to push back on surveillance, about how we have this right to privacy, this right to move around in our spaces without being flagged and tracked and monitored. We don’t extend that right to animals right now, but should we? If we don’t think it’s okay to use facial recognition on humans all the time, why should we use it on animals?
Ankita: I don’t think it’s a concern in that way for privacy of animals. I would rather see that having the least amount of interaction with the animal is a good idea, so that they could dwell in the natural way that they want to carry on. This way of doing facial recognition is, I think, a step in that direction.
Rose [on the phone]: Yeah, it’s interesting, it’s like the opposite of what people say about humans. Like, in humans they say you should know, you should be allowed to know exactly when you’re being, when the cameras are on or whatever. Whereas with this, you want to be as minimally invasive as possible, so you’re almost trying to disappear. You’re trying to make sure they don’t know that this is happening.
Ankita: Yes, yes.
Rose: There are a lot of questions here. Do monkeys even care about being watched? Probably, at least a little. Research has shown that when chimpanzees know they’re being observed by humans, they change their behavior. Which of course is an argument for these passive systems, but it also suggests that chimps know when they’re being watched and might have thoughts about it, just like people do.
So just, indulge me in some dystopian thinking here for a second. Let’s say there is a primate that does something bad to people. Maybe it steals something or it hurts a kid, or something like that. Which does happen. Maybe that thing is bad enough that people want revenge, they want that animal apprehended and punished. If we have facial recognition for that creature, if we can tag him as Joe the Baboon, and we can track him through the forest and find him and punish him, maybe even kill him… should we? Or what if poachers get access to this network and can use it to track their targets?
There’s a whole body of research on this topic of animals and surveillance that we don’t have time to get into, but I’ll talk more about it on the bonus podcast this week. Yes, I’m going to talk about the monkey selfie, don’t worry, I can talk about the monkey selfie for hours. But I’ll end this episode with this interesting nugget. I was curious what people thought about surveillance and chimps and facial recognition, so I polled Twitter. Yes, it’s a VERY biased poll, but I was still kind of surprised by the results. 239 people responded, and 54 percent of them said that yes, we should use facial recognition on non-human primates. But 45 percent said no, we shouldn’t. And honestly I’m kind of surprised that it was that balanced; I thought more people would be like “yeah, you dummy, of course we should use this on primates.” But I’m not alone in my questions about the ethics here! Hooray! Not alone!
What do you think? I’m serious, I want to know. You can get in touch via email at info@flashforwardpod.com, or you can send a voice memo to that email address. Or you can call and leave the show a voicemail at 347-927-1425. I might play some of your answers next week, if we have time.
For now, here’s what it sounds like when two barn owls discover a hidden camera that is spying on them, get very angry, and destroy the camera.
[Owls screeching]
[music up]
That’s all for this episode!
Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky. If you have ever marveled at how amazing the art is for Flash Forward, you have Matt to thank; they do incredible art for each episode. And I just want to do a quick shoutout: Matt works with The Nib, an amazing publisher of political cartoons, comics journalism, humor, and non-fiction, and they just lost their funding. So if you like those things, or if you just like the illustrations of Flash Forward, please go check out The Nib and potentially support them. That’s TheNib.com, where you can become a member; if you become a supporter you get stickers, and they just do great work, so you should definitely check them out, especially if you’ve ever admired the art for Flash Forward. Special thanks to Veronica Simonetti and Erin Laetz at the Women’s Audio Mission, where all the intro scenes were recorded this season. Special thanks also to Evan Johnson, who played Mr. Morton and also coordinated the actors of the Junior Acting Troupe who play the students in the intros this season. Today’s debaters were played by David Romero and Ash Greenberg.
Okay, so here’s the deal with the intro scenes this mini-season: they’re not quite as fictional as they usually are on the show. What you heard were real teens, who were basically playing themselves. I gave each of them a side to debate, and they prepared their own statements and questions, and then really did improvise the answers. Before we started recording, I had no idea what they were going to say. And they did an amazing job: they researched their positions and wrote speeches, and then had long, really interesting discussions as a group after each debate, which I couldn’t always include in the episode just for time. So if you want to hear those longer discussions, all you have to do is become a Patron at five dollars or more per episode, and you get the bonus podcast. Over the course of this mini-season, each bonus podcast is going to include the full cut of the discussion that these teens had about these topics. They’re so interesting, insightful, and full of opinions, and it was so fun to have them in the studio. Here’s a taste, from today’s episode:
David: In my opinion, I feel like I can’t bring myself to care enough about, like, the loss of privacy that I know I should. Because when I just think about it, I think, like, privacy is dying, the government already tracks you in so many ways, I personally don’t really care about that.
Ava: David, what you said kind of sounds like what people say about global warming. I know it’s a big deal, I just don’t care very much.
David: Global warming could cause the end of the human race.
Ava: Corruption by the government, with our facial recognition, could cause the end of the United States of America, but I don’t care.
David: But I don’t think that’s something that’s really going to happen…
To hear their whole conversation, head to patreon.com/flashforwardpod and sign up as a patron at $5 per episode or more. That gets you access to the bonus podcast, which will include these clips, plus all kinds of other insights about how Flash Forward works, other things I cut from episodes, and other updates on the Flash Forward universe.
Speaking of Patreon! Flash Forward is mainly supported by Patrons! If you like this show, and you want it to continue, the very best way to make that happen is by becoming a Patron. Even a dollar an episode really helps. You can find out more about that at flashforwardpod.com/support. If financial giving isn’t in the cards for you, the other great way to support the show is by heading to Apple Podcasts and leaving us a nice review, or just telling your friends about us. The more people who listen, the easier it will be for me to get sponsors and grants to keep the show going.
If you want to suggest a future I should take on, send me a note on Twitter, Facebook or by email at info@flashforwardpod.com. I love hearing your ideas! If you want to discuss this episode, or just the future in general, with other listeners, you can join the Flash Forward FB group! Just search Facebook for Flash Forward Podcast and ask to join. And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.
That’s all for this future, come back next time and we’ll travel to a new one.
[music out]