Jeff Horwitz for Wall Street Journal:
The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted”—rendered immune from enforcement actions—while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.
At times, the documents show, XCheck has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users. In 2019, it allowed international soccer star Neymar to show nude photos of a woman, who had accused him of rape, to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook’s fact checkers deemed false.
Content moderation is tough. I was surprised to find this well-written investigation in the WSJ – an outlet I don’t always equate with “high quality journalism”, but rather with a content farm operating under the guise of a newspaper. (I am only expressing my opinion, precisely the way the WSJ does.)
I link to paywalled articles because paywalls ensure the long-term viability of published content and keep it from succumbing to link rot.
Facebook’s mischaracterisations are too many to mention. The company is actively engaged in changing social behaviour and seems beyond regulatory oversight. I doubt it will be broken up, because it allows everyone to “communicate at scale”. I am not venturing into political debate here; this post is only a note of caution – Facebook is an unreliable medium for anything, and there are better ways to seek information. Its platforms have been described as walled gardens, but a better term would be walled rabbit-holes of misinformation.
Facebook has “whitelisted” specific users on its platform:
“Facebook currently has no firewall to insulate content-related decisions from external pressures,” a September 2020 memo by a Facebook senior research scientist states, describing daily interventions in its rule-making and enforcement process by both Facebook’s public-policy team and senior executives.
A December memo from another Facebook data scientist was blunter: “Facebook routinely makes exceptions for powerful actors.”