
Technology

Meta modifying its special handling of VIP posts

Agence France-Presse
This photo illustration taken in Melbourne on Feb. 24, 2023 shows a message on Facebook introducing the Meta Verified service. Facebook and Instagram began a week-long rollout of their first paid verification service on February 24, testing users' willingness to pay for social media features that until now have been free.
AFP/William West

SAN FRANCISCO, United States — Meta on Friday said it will modify its criticized special handling of posts by celebrities, politicians and other big-audience Instagram and Facebook users, taking steps to keep business interests from swaying decisions.

The tech giant promised to implement in full or in part most of the 32 changes to its "cross-check" program recommended by an independent review board that it funds as a sort of top court for content or policy decisions.

"This will result in substantial changes to how we operate this system," Meta global affairs president Nick Clegg said in a blog post.

"These actions will improve this system to make it more effective, accountable and equitable."

Meta declined, however, to publicly label which accounts get preferred treatment in content filtering decisions, nor will it create a formal, open process for getting into the program.

Labeling users in the cross-check program could make them targets for abuse, Meta reasoned.

The changes came in response to the oversight panel's call in December for Meta to overhaul the cross-check system; the panel said the program appeared to put business interests over human rights by giving special treatment to rule-breaking posts by certain users.

"We found that the program appears more directly structured to satisfy business concerns," the panel said in a report at the time.

"By providing extra protection to certain users selected largely according to business interests, cross-check allows content which would otherwise be removed quickly to remain up for a longer period, potentially causing harm."

Meta told the board that the program is intended to avoid content-removal mistakes by providing an additional layer of human review to posts by high-profile users that initially appear to break rules, the report said.

"We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or external pressure," Meta said in its response to the oversight board.

"While we acknowledge that business considerations will always be inherent to the overall thrust of our activities, we will continue to refine guardrails and processes to prevent bias and error in all our review pathways and decision making structures."
