As US Election Day approaches, social networking startup Bluesky, now flush with new capital, hopes to prove that its platform can serve as a more reliable, better-moderated alternative to Elon Musk's X. While Musk has thrown his support behind the Trump campaign, Bluesky leans to the left, thanks to an influx of disgruntled former Twitter users who dislike the platform's new direction. Now, with the US election approaching, Bluesky is preparing for its biggest test yet: handling the misinformation that could mislead users during a crucial national event, including posts intended to disrupt the voting process or those that use new technologies, such as artificial intelligence, to confuse the electorate.
Coinciding with X's recent changes to its block functionality, which have angered some users, Bluesky may be poised to once again benefit from another exodus of X users.
To manage its election operations, Bluesky earlier this year hired a prominent former Twitter leader, Aaron Rodericks, as its head of trust and safety. Rodericks brings experience with the policies, tools, and teams needed to manage election integrity from Twitter, where he co-led the Trust and Safety team and once made headlines as the target of a right-wing campaign on X after announcing on LinkedIn that he was looking to hire more staff for the 2024 election season. The executive later lost his job at X when Musk cut the election integrity team in half after promising to expand it.
Now at Bluesky, the team led by Rodericks has announced how it is preparing to handle the US presidential election, including how it will review content for potential misinformation and deal with other unverified reports and claims.
In a series of posts, the Bluesky Safety team has detailed its election safety plans, reminding users that they can report misleading, illegal, or urgent content to Bluesky's moderation service by clicking the three-dot menu next to any post or account. Election-related reports will also be given a priority queue in its system.
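Under the hood, reports filed from the app's three-dot menu correspond to the AT Protocol's `com.atproto.moderation.createReport` XRPC endpoint. As a rough sketch, a client flagging a post for election misinformation would build a request body like the one below; the post URI, CID, and helper function here are hypothetical placeholders for illustration, not Bluesky's actual client code.

```python
import json

# Endpoint path for filing moderation reports per the AT Protocol lexicon.
REPORT_ENDPOINT = "/xrpc/com.atproto.moderation.createReport"


def build_misleading_report(post_uri: str, post_cid: str, details: str) -> dict:
    """Construct a createReport body flagging a post as misleading.

    The URI/CID pair is a 'strong ref' that pins the report to one
    exact version of the post. (Hypothetical helper for illustration.)
    """
    return {
        # reasonMisleading is the lexicon's category for impersonation
        # and false claims, e.g. incorrect voter ID requirements.
        "reasonType": "com.atproto.moderation.defs#reasonMisleading",
        "reason": details,
        "subject": {
            "$type": "com.atproto.repo.strongRef",
            "uri": post_uri,
            "cid": post_cid,
        },
    }


# Example payload; a real client would POST this with auth headers.
report = build_misleading_report(
    post_uri="at://did:plc:example123/app.bsky.feed.post/3kexamplepost",
    post_cid="bafyreibexamplecid",
    details="Post shares incorrect voter ID requirements.",
)
print(json.dumps(report, indent=2))
```

In a real client this body would be POSTed to the moderation service with the user's session token; the sketch stops at constructing the payload.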
To keep the process “safe and accurate,” Bluesky says it will also remove any content that “encourages or glorifies intimidation or disruption to voting, tabulation, or certification.” It also plans to label posts that contain misleading claims about voting, such as posts that share incorrect voter ID requirements, as well as manipulated media.
Meanwhile, “emerging” election-related reports that cannot be immediately verified will be classified as “unconfirmed.” For example, if someone reported long lines or other incidents at their polling place, the post would likely be labeled as unconfirmed at the time of review. (The company did not share whether or how it would update these labels if national media later confirmed the reports.)
The company says its moderation plans extend beyond Election Day, as it will also work to identify and address any disruptions to the “peaceful transition of power.”
Additionally, Bluesky says it reserves the right to roll out further safeguards in the coming days, if necessary, to ensure the integrity of elections on the platform.
Unlike X and Threads, where moderation is handled solely by the company itself, the promise of the decentralized Bluesky is that anyone can run their own Bluesky server and their own dedicated moderation service. Users can also subscribe to multiple moderation services to customize their experience to their liking.
“Our online experience shouldn’t depend on billionaires making unilateral decisions about what we see,” the company explained in March. “On an open social network like Bluesky, you can shape your own experience.” In other words, if you don’t like the way Bluesky runs its own app, you can create your own. And if you don’t like Bluesky’s moderation options, you can build your own independent service instead.
Bluesky’s moderation team has also been expanded with additional hires following two recent surges that brought more users to the service. Although the company did not disclose the size of its moderation team today, CEO Jay Graber hinted at it in an interview on Nilay Patel’s Decoder podcast in March, when she said, “We’re about 18 people in engineering and operations, and then we have about that number in support and moderation.”