Do you have an “expectation of privacy” when it comes to your facial features?
Not in the good ol’ USA. In this “land of the free,” the Supreme Court has ruled that you have no expectation of privacy regarding your “personal characteristics” if you’re in public.
It’s why you can’t sue anyone for capturing your image as you walk down the street: anyone with a camera or smartphone can legally take your picture. And, perhaps more chillingly, there are no federal laws regulating biometric data collection.
At the same time, face recognition technology has improved enormously since computer scientist Woodrow Bledsoe pioneered it in the 1960s.
This technology can admittedly be very convenient, with the ability to use your facial image to unlock your smartphone, load an app, or open your home’s front door. But face recognition is also increasingly used for surveillance, law enforcement, and marketing purposes.
For instance, US Customs and Border Protection (CBP) scanned the faces of 23 million travelers entering the country between October 2019 and September 2020. The agency’s announced goal was to detect people using fake identity documents. But it didn’t catch a single imposter at any American airport during this period.
Meanwhile, CBP’s sister agency, the Transportation Security Administration (TSA), has announced an initiative to use face recognition to verify the identity of 100% of passengers boarding international flights by the end of the year.
Airlines are only too happy to cooperate. Delta, JetBlue, British Airways, Lufthansa, and American Airlines are integrating face recognition into their check-in process. Naturally, it’s pitched as a matter of “security” along with “passenger convenience.” In some cases, you needn’t even show your passport or boarding pass to board your flight. You just stare into a camera.
But facial recognition at the airport is only the beginning. Millions of closed-circuit television (CCTV) cameras are in place across America, and many of the faces they’ve recorded have made their way into archives compiled by law enforcement. The FBI can now search over 400 million photos from this source as well as driver’s license photos, passport photos, and visa application databases. It can also search photo archives of social media giants like Facebook, although law enforcement agencies need a warrant to retrieve photos or other data not posted for anyone but “friends” to view.
Speaking of Facebook: as of 2013, its members had uploaded more than 250 billion photos to the social media platform, with an estimated 350 million more added each day. Of course, not all the photos are of faces, but many (perhaps most) are. At that rate, Facebook now maintains an archive on the order of a trillion photos.
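Taking the 2013 figures at face value, a quick back-of-the-envelope calculation shows how fast that archive compounds. (The steady upload rate and the eight-year window are assumptions for illustration; Facebook hasn’t published current totals.)

```python
# Rough estimate of Facebook's photo archive, assuming the
# 2013 upload rate of ~350 million photos per day held steady.
BASE_2013 = 250e9        # photos already uploaded as of 2013
DAILY_UPLOADS = 350e6    # estimated photos added each day
YEARS_ELAPSED = 8        # roughly 2013 -> 2021 (assumption)

added_since = DAILY_UPLOADS * 365 * YEARS_ELAPSED
total = BASE_2013 + added_since
print(f"~{total / 1e12:.1f} trillion photos")  # prints "~1.3 trillion photos"
```

Even under these rough assumptions, the daily uploads alone dwarf the 2013 baseline within a few years.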
Also consider Clearview AI, a company whose purported mission is to “quickly, accurately, and efficiently identify suspects, persons of interest and victims of crime.”
That sounds wonderful until you understand how they do it. Using the Clearview app, you take a picture of a person, upload it, and obtain a near-instant match from a database of more than three billion photos the company retrieved from Facebook, YouTube, Twitter, Venmo, and millions of other websites.
Clearview has sold the app to hundreds of law enforcement agencies, plus an undetermined number of private companies. The company claims it has a First Amendment right to collect photos that web users have posted in public forums. And it could well prevail on this claim in court because there are legal precedents allowing companies to retrieve or “scrape” data from websites.
But at what cost? Will law enforcement agencies entrusted with this tool refrain from using it to stalk political dissidents? Will intelligence agencies use it to unearth secrets about their citizens and then blackmail them? Will hackers succeed in taking over Clearview’s database of three billion photos? (They’ve already stolen Clearview’s customer list.)
Even worse, there’s PimEyes, a face-recognition search engine anyone can use. The company retrieves images from a range of websites, including company, media, and even pornography sites. Upload your photo to PimEyes and you can find images with your face that appear online. The BBC calls it “facial recognition on steroids.”
But the person uploading the photo to retrieve matching images doesn’t need to be you. It could be anyone who finds a picture of you online and wants to know more (although this is a violation of the company’s terms of service). While PimEyes doesn’t identify anyone by name, it takes only a few clicks of a mouse to connect a face with an identity.
And if you don’t want PimEyes to display your photos? Well, you’re in luck! Just subscribe to the company’s PROtect premium service ($79.99 per month) and PimEyes won’t display photos with your face in its search results. Oh, and the PROtect service also gives you four hours per month of professional services in which its agents try to remove your photos from the websites the company retrieved them from.
Let’s review this business model. PimEyes compiles a massive database of photos from websites, including pornographic sites, and allows anyone with an internet connection to retrieve them. It then charges $79.99 per month to not display the photos it’s found.
Or as PimEyes described it on its website last July: “Upload your photo and find where your face image appears online. Start protecting your privacy.”
Ask yourself: when did you give the entire world permission to upload your facial image to the internet and retrieve every photo ever posted of you online?
The answer, of course, is that you never did. And while there are ongoing efforts to ban or restrict face recognition, they’re unlikely to stop law enforcement agencies or private companies from using your facial images for their own purposes.
It’s not easy to protect your privacy against this technology, but we have a few suggestions.
Don’t renew your driver’s license until it expires. Driver’s license photos taken more than a decade or so ago aren’t necessarily in digital form and are harder to match. A few states even allow you to cite your religious beliefs to avoid having a photo appear on your driver’s license at all.
Quit Facebook and other social networks. If you do use these networks, don’t post photos of yourself. And remove photos and videos you’ve already posted.
Wear head coverings. A hat will prevent an overhead camera from capturing a clear image of your face unless you look directly at it. If you’re a Muslim woman or don’t mind dressing as one, a burqa will obscure your entire face.
If you’re a man, grow a beard. Like hats or other head coverings, a full beard hides enough of your face to make face recognition more difficult.
Lastly, you might want to keep wearing the masks you acquired last year when the COVID pandemic raged through the country. The National Institute of Standards and Technology (NIST) recently tested 89 commercial face-recognition algorithms and found an error rate of 5%-50% in identifying digitally masked faces.
It’s not a perfect solution, but it just might be the start of your personal Plan B to thwart the use of your facial image without your knowledge or consent.