Ultimately, the issue came up at a March 2022 meeting with Clegg, who seemed surprised by the board members’ frustration. He promised to break the deadlock, and a few weeks later the board finally got the tool they should have had from the start. “We had to fight them to get it, which was confusing,” says Michael McConnell, a Stanford law professor who is a co-chair of the board. “But we did it.”
No sooner was this skirmish settled than another incident stirred the waters. When Russian troops invaded Ukraine last February, Facebook and Instagram were quickly flooded with questionable, even dangerous, content. Posts that promoted violence, like “Death to the Russian invaders,” were a clear violation of Meta’s policy, but banning them could suggest the company supported those invaders. In March, Meta announced that it would temporarily allow such violent remarks in this narrow case. It turned to the board for support and asked for an advisory opinion. The board accepted the request, eager to weigh in on the human rights issues involved. It prepared a statement and made appointments to brief reporters on the upcoming case.
But just before the board announced its new case, Meta abruptly withdrew the request. The reason given was that an investigation could put some Meta employees at risk. The board publicly accepted that explanation but objected to it in private meetings with the company. “We made it very clear to Meta that it was a mistake,” says Stephen Neal, the chair of the Oversight Board Trust, who noted that if security was indeed the reason, those concerns would have been apparent before Meta requested the advisory opinion.
When I asked Neal if he suspected that the board’s opponents wanted to prevent it from weighing in on a sensitive issue, he didn’t deny it. In what appeared to be an implicit rebuke, the board took up a case that addressed the very issues raised by Meta’s retracted request. It involved a Russian-language post by a Latvian user that showed what appeared to be a dead body on the ground and quoted a famous Soviet poem: “Kill the fascist so he will lie on the ground’s backbone… Kill him! Kill him!”
Other members also noted the mixed feelings within Meta. “There are a lot of people in the company who find us more of an annoyance,” says McConnell. “No one really likes having someone look over their shoulder and criticize them.”
Since the board members are savvy people who were probably chosen in part because they aren’t bomb throwers, they’re not the type to directly declare war on Meta. “I don’t approach this job thinking Meta is evil,” says Alan Rusbridger, a board member and former editor of The Guardian. “The problem they are trying to solve is one that no one on Earth has ever attempted to solve. On the other hand, I think there is a pattern of them kicking and screaming before giving us the information we are looking for.”
Worse than no information is wrong information. In one instance, Meta gave the board inaccurate information – which could soon lead to the board’s most devastating decision yet.
During the Trump case, Meta’s researchers had mentioned a program called Cross Check to the board. Essentially, it gave special treatment to certain accounts held by politicians, celebrities, and the like. The company described it to the board as a limited program involving only “a small number of decisions.” Some board members saw this as inherently unfair and, in their recommendations on the Trump case, asked Meta to compare error rates in its Cross Check decisions with those on regular posts and accounts. Basically, the members wanted to make sure that this unusual program wasn’t a get-out-of-jail-free card for the powerful.