Facebook Inc. keeps offering inconsistent responses to privacy and content-moderation controversies on the social network, with recent flip-flops indicating that Supreme Ruler Mark Zuckerberg is making things up as he faces new crises seemingly every day.
In the most recent round of issues for the embattled company, it has refused to take down an altered video of a prominent politician, said in court the exact opposite of what it told lawmakers, and fought its own shareholders to maintain Zuckerberg’s complete control over every move the company makes. And all those moves appear to be focused on finances, not the greater good.
On Thursday, MarketWatch reported that Facebook announced internally it would explore the need for a new policy on manipulated media, just two days before a doctored video of House Speaker Nancy Pelosi generated millions of views on social media.
The company has steadfastly refused to take down the faked Pelosi video, in which the third-highest-ranking U.S. politician appears to be drunk or out of it, slurring her words, because the video has been slowed down. On Wednesday, the California Democrat told KQED Radio in San Francisco that the company’s refusal to take down something that is known to be false shows that “they were willing enablers of the Russian interference in our election.”
Those were fighting words, but Facebook is not getting much sympathy right now. In an interview with CNN’s Anderson Cooper last Friday, a Facebook executive defended the company’s stance but seemingly contradicted its own policies. Cooper noted that Facebook’s community guidelines are geared toward stopping behavior or activity that misleads people in an attempt to get likes, clicks and shares.
Facebook’s response was that if the misinformation is related to safety, “we can and we do remove it,” said Monika Bickert, Facebook’s head of global policy management. She repeated that anyone viewing the video on Facebook sees that it has been marked as false, although it has not been taken down.
“I actually think Facebook is taking an interesting approach here,” Corynne McSherry, legal director of the Electronic Frontier Foundation, a nonprofit focused on defending civil liberties in the digital world, said in an email. “It shows that companies have options in handling content on their platforms. Too often, platforms and users assume the only option is to take content down or leave it up. But there are other options — like providing additional information. We also need more tools for users, so they have the ability to control their internet experience based on their own judgment.”
Of course, Facebook could just change its mind, or say something completely different in another situation. For example, lawyers representing Facebook in a class-action lawsuit filed by users over the Cambridge Analytica scandal seemed to suggest in a hearing on Wednesday that privacy should not be expected when posting to the social network, which goes against Zuckerberg’s major talking points in congressional hearings last year, as well as other public statements.
“What you are saying now sounds contrary to the message that Facebook itself disseminates about privacy,” Judge Vince Chhabria told Facebook’s attorney Wednesday, according to the Recorder, during its motion to dismiss the case.
Zuckerberg has made a big pivot toward privacy, promising that messaging on Facebook and Instagram in the next few years will be encrypted, as it is now on WhatsApp. Encrypting messages could also give the company another way out of controversial issues: It won’t be able to see private messages and therefore it won’t be able to act on them or remove them. Zuckerberg said that the company’s new tools will help it look at clusters of activity, not just the content.
Of course, it is always more helpful to look at what an executive does rather than what he or she says. During Facebook’s annual shareholders meeting Thursday, for instance, Zuckerberg said that while it appears that the company makes its decisions around content based on business reasons, that is not the case.
But Amy Webb, founder of the Future Today Institute, a future-forecasting firm in Washington, said that money is exactly why the Pelosi video remains up on Facebook.
“There is a financial disincentive for them to clean this up, which is why they have tags on there,” she said.
And don’t expect anything to change. At Thursday’s shareholder meeting, nearly every proposal mentioned Facebook’s spate of scandals and its role in spreading hate speech and violence or in influencing elections, and some sought to wrest control from Zuckerberg. All were rejected.
Instead, Zuckerberg insisted that the government needs to step in — “One of the conclusions we have come to is there needs to be an updated regulatory framework” regarding content moderation around the world, he said earlier this year — while continuing to promise a new entity to do what Facebook’s board has failed to do.
“We are not waiting for regulation, we are setting up an independent oversight board, if we make a decision, you disagree, you can appeal to this independent body,” Zuckerberg said, adding that the oversight board’s decision will be binding. “It doesn’t matter what I think.”
But depending on Facebook to do all of its content policing itself also means relying on its increasingly large group of human moderators and artificial intelligence, both of which have their own issues. Webb, whose most recent book, “The Big Nine,” is a call to arms about the broken nature of artificial intelligence, said that there is no turnkey solution for Facebook in using AI to solve the problem.
“Facebook is an enormous company,” she said. “My assumption is that there are a lot of ‘Franken-algorithms’ embedded in their system because it’s so big and so spread out. It’s not like there is a single Facebook algorithm. They may not know how to deal with these things, but that doesn’t mean they can’t figure it out.”
She isn’t sure the oversight board — which Facebook says will be made up of 40 individuals — will have much of an impact either.
“The board will make suggestions — however they’re not in charge of actually implementing them. This is a problem at other companies that have established ethics and oversight boards. Their recommendations aren’t self-executing.”
The only sure thing in this mess is that Zuckerberg will control every little move that Facebook makes as it continues through the current maelstrom.
“Mr. Zuckerberg cannot fix Facebook alone,” said Natasha Lamb, managing partner at Arjuna Capital, which filed a proposal along with the state of New York requesting that Facebook publish a report on its policies for governing content. One shareholder Thursday even directly asked the lead independent director, Susan Desmond-Hellmann, whether she would request an executive session of the board without Zuckerberg’s presence.
“The answer is no, I don’t have an intention of calling such a meeting. That is not a direction we want to take the company,” Desmond-Hellmann said.
That could be exactly what Facebook needs at this point: more board members asking tougher questions instead of rubber-stamping everything Zuckerberg wants. But unless he is willing to relinquish his control, that’s never going to happen.