Humans at the center of effective digital defense

Content moderation (the monitoring of user-generated content, or UGC) is crucial to the online experience. In his book Custodians of the Internet, sociologist Tarleton Gillespie writes that despite the “utopian notion” of an open internet, effective content moderation is essential for digital platforms to function. “There is no platform that does not impose rules, to some degree; not to do so would simply be untenable,” he writes. “Platforms must, in some form or another, moderate: both to protect one user from another, or one group from its antagonists, and to remove the offensive, vile, or illegal, as well as to present their best face to new users, to their advertisers and partners, and to the public at large.”

Content moderation is used to address a wide variety of content across industries. Skillful moderation helps organizations keep their users safe, their platforms usable, and their reputations intact. A best-practices approach to content moderation pairs human skill and judgment with increasingly sophisticated and accurate technical solutions.
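To make that division of labor concrete, below is a minimal, hypothetical sketch (in Python) of the human-in-the-loop pattern such an approach implies: an automated classifier scores each piece of content, clear-cut cases are actioned automatically, and uncertain cases are escalated to human moderators. The thresholds, the toy keyword “classifier,” and all names here are illustrative assumptions, not any platform’s actual system.

```python
from dataclasses import dataclass

# Hypothetical thresholds; real platforms tune these per policy area.
AUTO_REMOVE = 0.95   # near-certain policy violation: act without human review
AUTO_APPROVE = 0.05  # near-certain benign content: publish without human review

@dataclass
class Post:
    post_id: str
    text: str

def violation_score(post: Post) -> float:
    """Stand-in for an ML classifier returning P(violation).
    Here it is a toy keyword check, purely for illustration."""
    words = set(post.text.lower().split())
    if words & {"counterfeit", "scam"}:
        return 0.99   # strong signal of a violation
    if words & {"replica", "unbranded"}:
        return 0.50   # ambiguous signal: needs human judgment
    return 0.01

def route(post: Post) -> str:
    """Send each post to an automated action or to the human review queue."""
    score = violation_score(post)
    if score >= AUTO_REMOVE:
        return "remove"        # automation handles the clear-cut violation
    if score <= AUTO_APPROVE:
        return "approve"       # automation handles the clearly benign post
    return "human_review"      # human skill and judgment decides the rest

if __name__ == "__main__":
    posts = [
        Post("1", "Arrived on time, works as described"),
        Post("2", "Cheap counterfeit watches, message me"),
        Post("3", "Selling a replica jersey, barely worn"),
    ]
    for p in posts:
        print(p.post_id, route(p))
```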

Content moderation is a rapidly growing industry, critical to all organizations and individuals who gather in digital spaces (which is to say, more than 5 billion people). According to Abhijnan Dasgupta, practice director specializing in trust and safety (T&S) at Everest Group, the industry was valued at roughly $7.5 billion in 2021, and experts anticipate that number will double by 2024. Gartner research suggests that nearly one-third (30%) of large companies will consider content moderation a top priority by 2024.

Content moderation: More than just social media

Content moderators remove hundreds of thousands of pieces of problematic content every day. Facebook’s Community Standards Enforcement Report, for example, documents that in the third quarter of 2022 alone, the company removed 23.2 million incidences of violent and graphic content and 10.6 million incidences of hate speech, as well as 1.4 billion spam posts and 1.5 billion fake accounts. But while social media may be the most widely reported example, a huge number of industries rely on UGC, for everything from product reviews to customer service interactions, and consequently require content moderation.

“Any site that allows the input of information that isn’t internally produced has a need for content moderation,” explains Mary L. Gray, a senior principal researcher at Microsoft Research who also serves on the faculty of the Luddy School of Informatics, Computing, and Engineering at Indiana University. Other sectors that rely heavily on content moderation include telehealth, gaming, e-commerce and retail, and the public sector and government.

In addition to removing offensive content, content moderation can detect and eliminate bots, identify and remove fake user profiles, address phony reviews and ratings, delete spam, police deceptive advertising, mitigate predatory content (especially content that targets minors), and facilitate safe two-way communication in online messaging systems. One area of serious concern, particularly on e-commerce platforms, is fraud. “There are a lot of bad actors and scammers trying to sell counterfeit products, and there’s also a big problem with fake reviews,” says Akash Pugalia, global president of trust and safety at Teleperformance, which provides content moderation support to global brands. “Content moderators help ensure products follow the platform’s guidelines, and they also remove prohibited goods.”
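As a purely illustrative sketch of how a couple of the fake-review signals mentioned above might be checked, the snippet below flags reviews that share identical text across accounts or that come from brand-new accounts. Both heuristics and all field names are hypothetical; production systems combine far richer signals with human review.

```python
from collections import Counter

def fake_review_signals(reviews: list[dict]) -> list[dict]:
    """Flag reviews exhibiting simple, illustrative fraud signals:
    duplicated text across accounts, or a reviewer account created
    just before posting. Heuristics are assumptions, not a real API."""
    text_counts = Counter(r["text"] for r in reviews)
    flagged = []
    for r in reviews:
        reasons = []
        if text_counts[r["text"]] > 1:
            reasons.append("duplicate_text")   # identical copy from several accounts
        if r["account_age_days"] < 2:
            reasons.append("new_account")      # account barely older than the review
        if reasons:
            flagged.append({"review_id": r["review_id"], "reasons": reasons})
    return flagged

if __name__ == "__main__":
    sample = [
        {"review_id": "a", "text": "best product ever", "account_age_days": 1},
        {"review_id": "b", "text": "best product ever", "account_age_days": 0},
        {"review_id": "c", "text": "solid, arrived on time", "account_age_days": 400},
    ]
    print(fake_review_signals(sample))  # reviews "a" and "b" are flagged
```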

Download the report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
