Meta to Change Cross-check Feature for VIP Posts
On Friday, Meta said it would change the way it privately handles posts by celebrities, politicians and other high-profile Instagram and Facebook users, and would take steps to prevent commercial interests from influencing those decisions.

The tech giant promised to implement most or all of the 32 changes to its "cross-check" program recommended by an independent review board it funds as a sort of supreme court for content and policy decisions.

"This will lead to significant changes in how we operate this system," Nick Clegg, head of global affairs at Meta, said in a blog post. "These actions will improve this system to make it more effective, accountable and fair."

But Meta declined to publicly label which accounts receive preferential treatment in content-moderation decisions, and it will not create a formal, open process for entering the program. Meta reasoned that labeling users in its cross-check program could make them targets for abuse.

The changes came in response to the oversight board's call in December for Meta to overhaul the cross-check system. The board said the program puts business interests ahead of human rights by giving special treatment to rule-breaking posts from certain users.

"We found that the program appears to be more directly structured to address commercial concerns," the board said in a report at the time. "Cross-check provides extra protection to specific users, selected largely based on business interests, allowing content that would otherwise be quickly removed to remain up longer, potentially causing harm."

Meta told the board that the program aims to prevent erroneous content removals by providing an additional layer of human review for posts by high-profile users that initially appear to break the rules, according to the report.

"We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or external pressure," Meta said in its response to the oversight board.