In a bid to wrap up the race for the Tin Ear of the Year Award before June 1, Facebook has begun asking its 2.2 billion users to discreetly share their indiscreet nude photos with the company. The plan, the company says, is to train its systems to block the images you never want to see on Facebook, in cases such as revenge porn.
The company is partnering with several third-party groups – such as the Cyber Civil Rights Initiative and the National Network to End Domestic Violence – to distribute review forms to those who’ve had to deal with former sexual partners improperly posting their sensitive images.
Requesters are given a one-time upload link to send those images to Facebook, where they are reviewed by “a handful of specially trained members” of the company’s burgeoning content-review team.
Those team members will create what’s effectively a digital fingerprint of the images so that Facebook’s systems can automatically recognize and block the images before they can be seen by anyone outside the company. The program is undergoing trials in the United States, United Kingdom and a couple of other territories.
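The "digital fingerprint" described here is, in all likelihood, a perceptual hash: a compact signature that stays nearly the same when an image is resized, recompressed, or lightly edited, so re-uploads can be matched without storing the photo itself. Facebook hasn't published its exact algorithm, but a minimal sketch of one simple perceptual-hashing technique (average hashing) looks like this. The tiny hard-coded pixel grids are illustrative stand-ins for a real, already-decoded and downscaled grayscale image:

```python
def average_hash(pixels):
    """Compute a simple perceptual 'average hash' of a grayscale image.

    pixels: 2D list of brightness values (0-255), e.g. an 8x8 downscaled image.
    Returns an int whose bits mark which pixels are brighter than the image's
    mean, so visually similar images produce hashes with few differing bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")

# Two nearly identical 4x4 "images" hash close together...
original = [[10, 10, 200, 200]] * 4
recompressed = [[12, 9, 201, 198]] * 4
# ...while a very different image hashes far apart.
inverted = [[200, 200, 10, 10]] * 4

assert hamming_distance(average_hash(original), average_hash(recompressed)) <= 2
assert hamming_distance(average_hash(original), average_hash(inverted)) > 8
```

In a system like Facebook's, only hashes of this kind would need to be kept and compared against new uploads; a new image whose hash lands within some small Hamming-distance threshold of a flagged fingerprint can be blocked automatically, which is why reviewers need the original image only once.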
This all sounds pretty good. Except remember that this is the same company that has spent the past couple of years struggling to control what happens to personal data on its site, and what gets shared with outside companies.
You may recall that little kerfuffle last year when it became clear that as many as 80 million people had their data improperly shared with, or exposed to, third-party providers during the 2016 elections as part of relatively routine Facebook operations.
Beneficiaries of that abused and prodigal data included the now-bankrupt Cambridge Analytica, whose work may have helped sway the outcome of both Brexit and the 2016 U.S. presidential election. If you’re James Clapper, the former U.S. Director of National Intelligence, you believe (and wrote in a book released this week) that companies such as Cambridge Analytica helped Russian trolls harness Facebook data to throw the election.
Amid the slew of fallout since the Russian meddling and Cambridge Analytica's involvement surfaced, Facebook has instituted a range of reforms, cutting off data-sharing agreements with some partners and forcing changes on others. The company has also committed to hiring 4,000 more members of its content-review team.
And maybe, if someone shared deeply personal naked photos with Facebook, the putatively chastened tech giant would now do a better job protecting that material.
Given, however, the frosty reaction European legislators had this week when Mark Zuckerberg delivered pablum-filled non-answers to their questions about how Facebook handles data, I’m skeptical about how much to trust the company.