Don't worry, militia members will have to wait until after Election Day to be algorithmically pointed to Facebook groups of like-minded individuals.
At Wednesday's Senate hearing on (at least in theory) Section 230, Facebook CEO Mark Zuckerberg let slip a slight behind-the-scenes change his company has made in the lead-up to Nov. 3. Specifically, Zuckerberg offhandedly mentioned that Facebook has temporarily stopped recommending political issue Facebook groups to its users.
Of course, Facebook intends to spin this presumably dangerous — or, at the very least, worrisome — recommendation feature right back up again after the election. So reports BuzzFeed News, which was able to confirm that the new policy is only temporary.
"This is a measure we put in place in the lead-up to Election Day," Facebook spokesperson Liz Bourgeois told the publication. "We will assess when to lift them afterwards, but they are temporary."
Because obviously we won't have any social media-juiced instances of violence after the election. Heavens no.
Notably, this move comes at a time when Zuckerberg — as expressed in his Thursday earnings call — is "worried that with our nation so divided and election results potentially taking days or weeks to be finalized, there is a risk of civil unrest across the country."
Facebook, which recently attempted to ban QAnon conspiracy groups, has particular reason to be concerned about the upcoming election and possible associated violence. Well, concern for its reputation, anyway. The platform has served as a breeding ground for violent conspiracy theories for years, and a simple QAnon ban isn't going to change that.
There is a real possibility that the next Kenosha-style tragedy is already being planned, coordinated, or hyped with Facebook tools — only now with an Election Day twist. Facebook's attempt to cool things down by pausing an element of its own recommendation system calls attention to the simple fact that Facebook itself is fundamentally problematic.
Facebook knows this. In May of this year, the Wall Street Journal reported that Facebook had ignored its own internal research showing that its algorithms were making the site more divisive.
"Our algorithms exploit the human brain's attraction to divisiveness," read a slide from a 2018 presentation. "If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."
No temporary pause of a single recommendation feature, no matter how well intentioned, is going to change that.