Mark Zuckerberg has positioned Threads, Meta's Twitter-like app, as a "friendly" haven for online public dialogue, setting it apart from Elon Musk's more combative Twitter.
"We are definitely focusing on kindness and making this a friendly place," Meta CEO Zuckerberg said on Wednesday, shortly after the service's launch.
Keeping Threads true to that idealistic aim is another matter entirely, now that the service has garnered more than 70 million users in its first two days.
Meta Platforms is certainly no novice when it comes to managing the smut-posting, rage-baiting online masses. The company has said that the same rules it enforces on Instagram, its photo- and video-sharing platform, will also apply to users of the new Threads app.
In an effort to lean more towards entertainment and away from news, the owner of Facebook and Instagram has also been actively embracing an algorithmic approach to serving up content, giving it greater control over the kind of fare that succeeds.
However, the appeal of microblogging to news junkies, politicians, and other fans of rhetorical combat, along with Threads' connection to other social media services such as Mastodon, means Meta is also courting new challenges with Threads and attempting to chart a fresh course through them.
For starters, the company will not extend its existing fact-checking programme to Threads, spokesperson Christine Pai said in an emailed statement on Thursday, eliminating a distinguishing feature of how Meta has handled misinformation on its other apps.
Pai added that posts on Facebook or Instagram rated as false by fact-checking partners, which include a unit at Reuters, will carry their labels over if shared on Threads as well.
Meta declined to comment when asked why it was treating misinformation on Threads differently.
Adam Mosseri, the head of Instagram, acknowledged in a New York Times podcast on Thursday that Threads was more "supportive of public discourse" than Meta's other services and therefore more likely to draw a news-focused crowd, but said the company would aim to focus on lighter subjects such as sports, music, fashion, and design.
But right away, Meta's ability to remain above controversy was called into question.
Within hours of launch, Threads accounts reviewed by Reuters were discussing the Illuminati and "billionaire satanists," while other users traded insults and comparisons to Nazis over topics ranging from gender identity to violence in the West Bank.
Conservative figures, including the son of former U.S. President Donald Trump, complained of censorship after labels appeared warning would-be followers that they had posted false information. A second Meta spokesperson said those labels were an error.
Further content moderation challenges are in store once Meta links Threads to the so-called fediverse, where users on servers operated by other, non-Meta entities will be able to communicate with Threads users. Pai said Instagram's rules would apply to those users as well.
"If an account or server, or if we find many accounts from a particular server, is found violating our rules then they would be blocked from accessing Threads, meaning that server's content would no longer appear on Threads and vice versa," she said.
But online media experts said the devil will be in the details of how Meta approaches those interactions.
Alex Stamos, director of the Stanford Internet Observatory and Meta's former head of security, posted on Threads that the company would face greater challenges in performing key types of content moderation enforcement without access to back-end data about the users who post banned content.
"With federation, the metadata that big platforms use to tie accounts to a single actor or detect abusive behavior at scale aren't available," said Stamos. "This is going to make stopping spammers, troll farms, and economically driven abusers much harder."
He predicted that Threads would reduce the visibility of fediverse servers with a high proportion of abusive accounts and impose more severe sanctions on users who publish unlawful content like child pornography.
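Stamos did not spell out how that down-ranking would be computed. One minimal, purely hypothetical way to weight a server's visibility by its share of abusive accounts might look like this; the formula and figures are assumptions, not a known Meta policy.

```python
# Illustrative only: down-weighting fediverse servers by their share of
# abusive accounts, in the spirit of Stamos's prediction.

def server_visibility_weight(abusive_accounts: int, total_accounts: int) -> float:
    """Return a multiplier in [0, 1]; a higher abuse rate means lower visibility."""
    if total_accounts <= 0:
        return 0.0
    abuse_rate = abusive_accounts / total_accounts
    return max(0.0, 1.0 - abuse_rate)

# Example: a server where 30% of accounts are abusive gets a 0.7 weight,
# so its posts would surface less prominently in feeds and search.
print(server_visibility_weight(300, 1000))  # 0.7
```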
But the interactions themselves pose challenges of their own.
"There are some really weird complications that arise once you start to think about illegal stuff," said Solomon Messing of the Center for Social Media and Politics at New York University. He cited examples like child exploitation, nonconsensual sexual imagery and arms sales.
"If you run into that kind of material while you're indexing content (from other servers), do you have a responsibility beyond just blocking it from Threads?"
(Source: www.tbsnews.com)
"We are definitely focusing on kindness and making this a friendly place," Meta CEO Zuckerberg said on Wednesday, shortly after the service's launch.
It's another matter entirely to keep Threads true to that idealistic aim after it garnered more than 70 million users in its first two days.
Meta Platforms is certainly not a novice when it comes to controlling the smut-posting, rage-baiting online masses. The business declared that the same guidelines it enforces on Instagram, a social media platform for sharing photos and videos, would also apply to users of the new Threads app.
In an effort to veer more towards entertainment and away from journalism, the owner of Facebook and Instagram has also been actively embracing an algorithmic approach to putting up content, giving it greater influence over the kind of fare that succeeds.
The appeal of microblogging to news junkies, politicians, and other aficionados of rhetorical conflict, however, as well as the connection of Threads to other social media platforms like Mastodon, mean that Meta is also courting new obstacles with Threads and attempting to forge a new course through them.
First off, according to company spokesperson Christine Pai in an email statement on Thursday, the company would not expand its current fact-checking programme to Threads. This gets rid of a unique aspect of how Meta handled false information on its other apps.
Pai stated that Facebook or Instagram posts that have been labelled as fake by fact-checking partners, such as a division at Reuters, will also receive the same designation if they are shared on Threads.
When questioned by the media about its decision to treat false information on Threads differently, Meta refuses to comment.
Adam Mosseri, the CEO of Instagram, acknowledged in a New York Times podcast on Thursday that Threads was more "supportive of public discourse" than Meta's other services and thus more likely to attract a news-focused audience, but claimed the company aimed to concentrate on lighter topics like sports, music, fashion, and design.
But right away, Meta's ability to remain above controversy was called into question.
Within hours of its inception, accounts on Threads reviewed by Reuters were discussing the Illuminati and "billionaire satanists," while other users engaged in arguments and comparisons to Nazis over issues ranging from gender identity to violence in the West Bank.
After labels emerged alerting potential followers that they had posted fake information, conservative figures, including the son of former U.S. President Donald Trump, complained of censorship. Those labels were incorrect, according to a different Meta representative.
Once Meta connects Threads to the so-called fediverse, where users from servers run by other non-Meta organisations will be able to speak with Threads users, additional difficulties in content moderation are in store. According to Meta's Pai, those users would be subject to Instagram's policies as well.
"If an account or server, or if we find many accounts from a particular server, is found violating our rules then they would be blocked from accessing Threads, meaning that server's content would no longer appear on Threads and vice versa," she said.
However, experts in online media stated that the devil will be in the specifics of Meta's strategy for these encounters.
Without access to back-end information about users who post prohibited content, Alex Stamos, the director of the Stanford Internet Observatory and a former head of security at Meta, wrote on Threads that the firm would have more difficulty carrying out important sorts of content moderation enforcement.
"With federation, the metadata that big platforms use to tie accounts to a single actor or detect abusive behavior at scale aren't available," said Stamos. "This is going to make stopping spammers, troll farms, and economically driven abusers much harder."
He predicted that Threads would reduce the visibility of fediverse servers with a high proportion of abusive accounts and impose more severe sanctions on users who publish unlawful content like child pornography.
However, the contacts themselves present difficulties.
"There are some really weird complications that arise once you start to think about illegal stuff," said Solomon Messing of the Center for Social Media and Politics at New York University. He cited examples like child exploitation, nonconsensual sexual imagery and arms sales.
"If you run into that kind of material while you're indexing content (from other servers), do you have a responsibility beyond just blocking it from Threads?"
(Source:www.tbsnews.com)