Three situations have come to light in the past week that have reminded me we need to be ever-aware that we all have the ability to make smart choices and use our tech and photo tools for good.
Facial Recognition Isn’t Racist on Purpose, But…
I recently gave a talk on Artificial Intelligence and Photography to a photography group¹ and received some feedback that a segment of my presentation wasn’t really relevant to photographers.
We were discussing facial recognition technology. The non-controversial part was the discussion of how facial recognition in programs such as Apple Photos and Adobe Lightroom makes it easy for us to filter and find our images. What once required us to manually assign keywords can now be done automatically. After I’ve told Lightroom that a few images are of my wife, Lightroom uses AI to identify other images that have my wife in them. It’s great.
But a discussion of facial recognition technology also needs to acknowledge that, beyond those of us with photography as a hobby or profession, the other major users of facial recognition are government and law enforcement agencies, which can be problematic on a couple of levels.
- Facial recognition as developed by the big tech companies is less accurate at identifying non-white faces than white or Caucasian-appearing faces. This is because the training data sets used to build the software contain far more white faces than faces of people of color, so the software performs best on the kind of data it saw most during development. If law enforcement uses facial recognition to identify people and take enforcement action, this racial disparity puts people of color at a disadvantage.
- As demonstrated repeatedly over the past month, US law enforcement has shown little regard for equality or a desire to ensure that their processes and technologies aren’t racially biased.
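The training-data imbalance described above can be sketched with a toy simulation. Everything here is invented for illustration — the 2-D “embeddings,” the photo counts, and the nearest-template matcher bear no resemblance to how a real face recognition system works — but it shows the mechanism: people represented by fewer training photos get noisier reference templates, and noisier templates get misidentified more often.

```python
# Toy sketch of how an imbalanced training set degrades accuracy for the
# under-represented group. All numbers and the "embedding" model are
# invented for illustration; real systems are vastly more complex.
import random

random.seed(42)

def make_person():
    # a person's "true face" as a random 2-D point
    return (random.uniform(-50, 50), random.uniform(-50, 50))

def photo(face, noise=4.0):
    # one photo = the true face plus capture noise
    return (face[0] + random.gauss(0, noise), face[1] + random.gauss(0, noise))

def enroll(face, n_photos):
    # template = average of the training photos; fewer photos -> noisier template
    shots = [photo(face) for _ in range(n_photos)]
    return (sum(x for x, _ in shots) / n_photos,
            sum(y for _, y in shots) / n_photos)

def identify(probe, templates):
    # match the probe photo to the nearest stored template
    return min(templates, key=lambda pid: (probe[0] - templates[pid][0]) ** 2
                                        + (probe[1] - templates[pid][1]) ** 2)

faces = {pid: make_person() for pid in range(40)}
# imbalanced training data: the well-represented group gets 25 photos per
# person, the under-represented group only 2
n_photos = {pid: 25 if pid < 20 else 2 for pid in faces}
templates = {pid: enroll(faces[pid], n_photos[pid]) for pid in faces}

def accuracy(pids, trials=200):
    hits = 0
    for _ in range(trials):
        pid = random.choice(pids)
        hits += identify(photo(faces[pid]), templates) == pid
    return hits / trials

# With a fixed seed the exact gap varies; the trend is what matters.
print(f"well-represented group accuracy:  {accuracy(list(range(20))):.2f}")
print(f"under-represented group accuracy: {accuracy(list(range(20, 40))):.2f}")
```

The point of the sketch is that no one wrote “be worse at group B” into the code; the disparity falls out of the data the system was given.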
Companies such as Clearview AI and Amazon² offer facial recognition products marketed for law enforcement.
To speak only of how facial recognition benefits my photo library but not also acknowledge that it is being used in a discriminatory fashion is to only tell part of the story. Whether photography audiences want to hear it or not, we need to discuss the full picture (pun intended). You can bet that there’s a nontrivial chapter in my book on AI and photography devoted to nefarious use of the technology.
The Power of the Camera
Yesterday, I shared this on an Instagram story:
DEAR.ALL.PEOPLE. If you see a cop talking to any African American person, I KNOW YOU BETTER STOP WALKING. You better stay right there till that cop gets back in their car and drives away. Let them know WE ARE WATCHING THEM and they are not ALONE. This is the LEAST we can do.
The text I added when I shared the story says “Don’t just stop walking, start recording.”
I got a message in response from someone saying “This is bad. Don’t be the problem.”
The problem isn’t that I’d suggest recording law enforcement. The problem is that law enforcement has given us countless reasons why they need to be recorded. Abusive and racist practices by law enforcement aren’t new… what’s new is that we’re paying attention, and much of that attention is because of video.
We don’t know about George Floyd’s murder just because he was killed. We know about it because there was a 9-minute video of the event.
Most encounters between law enforcement and the public are relatively uneventful, but far too many still lead to abuse, injury, or death.
The least that we can do as bystanders (especially as photographers) is to get out our phones and press Record. It’ll keep the honest cops honest, and help tell the truth if something goes sideways. To stand by, or walk by, and ignore the situation is to be complicit in the outcome.
Are Private Communications Done?
Earlier this week, backed by Attorney General Bill Barr, Senate Republicans introduced the Lawful Access to Encrypted Data Act, which would mandate that technology companies break encryption, adding backdoors that (in theory) would only be used by law enforcement.
But… that’s not how encryption, or the law, works.
If there’s a backdoor, it’s not truly encrypted.
I’m not going to repeat all of the details behind this (check that link just above), but this law would mean the end of internet security and private messaging.
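To see why a backdoor and real encryption can’t coexist, consider a toy sketch of “key escrow,” one of the mechanisms such mandates imply: the message is encrypted normally, but a copy of the session key is also encrypted to a master key held by a third party. The XOR cipher below is a stand-in for real cryptography — this is an illustration of the structure, not an implementation of it.

```python
# Toy "key escrow" sketch: a second, mandated copy of the session key means
# the message is only as private as the master key. XOR stands in for a real
# cipher here; do not use this for actual encryption.
import os

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

BACKDOOR_KEY = os.urandom(16)   # held by "law enforcement"... and anyone who steals it

def send(message: bytes):
    session_key = os.urandom(16)              # the key the two parties actually share
    ciphertext = xor(message, session_key)
    escrow = xor(session_key, BACKDOOR_KEY)   # the mandated backdoor copy
    return ciphertext, escrow

ciphertext, escrow = send(b"meet at 6pm")

# Anyone holding the backdoor key -- an agency, a rogue insider, a thief --
# can recover every session key, and with it every message:
recovered_key = xor(escrow, BACKDOOR_KEY)
print(xor(ciphertext, recovered_key))  # b'meet at 6pm'
```

The scheme protects nothing against whoever holds (or compromises) `BACKDOOR_KEY` — which is exactly the point: a system with a backdoor isn’t private, it just has a different owner.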
And while it’s becoming a bit of a recurring theme in this article and in our lives right now, law enforcement organizations have demonstrated their willingness to bypass and violate the law for various purposes unrelated to their jobs. We simply cannot give them another tool that could be abused. Everyone in the US ought to be contacting their senators and urging them to oppose this legislation that would strip us of online communications privacy.
– –
The turmoil in our current reality has led to many changes across all walks of life. For those of us who aren’t facing immediate threats to our safety and livelihood, we can amplify the voices of those who are, and we owe it to the world to push for responsible technology choices.
Misuse of technology leads to abuse of people. And that’s not okay.
- I’ve given three such talks in the last couple of months to groups in two states and one province; the particular group is not relevant to this story. ↩
- Amazon has announced a one-year pause on selling its technology to police. ↩