Forget about media outlets and Facebook — worry about readers.
Facebook's 2 billion monthly users have come to rely on the social network as a way to keep up with the news. Now Facebook is limiting the reach of news publishers, leaving a vacuum to be filled with... well, it's anyone's guess.
The change is simple. Facebook is going to show users more posts that their friends and family have either created, shared, or commented on. In turn, Facebook is reducing the reach of pages including news outlets. That may sound innocuous, but the shift turns up the dial on the signals that help amplify fake news. And there's no way to tell how bad this is going to get — not even Facebook knows for sure.
We're already seeing this in action.
CEO Mark Zuckerberg claimed the changes will put more "meaningful" content into people's feeds. But previous tweaks and tests have shown there's plenty of downside to showing people less news. The New York Times published a story over the weekend that highlighted how Facebook's changes have had particularly negative consequences in countries where journalists are at risk and news media is censored. Publishers are having trouble reaching people with real news while fake news spreads.
This follows a report that Facebook was used to spread misinformation and propaganda in Myanmar, where the United Nations says the government is participating in the ethnic cleansing of Rohingya Muslims.
Now, the social network is kneecapping publishers, leaving its News Feed open for whatever posts can get the most comments.
It's unclear how users are supposed to deal with this change. The average person is probably not aware of the changes to Facebook, let alone aware that they're going to see less news directly from publishers. A person who pulls up Facebook in the coming weeks isn't going to think, "OK, there are fewer news stories directly from the publishers I trust. I should be careful about what I'm seeing, and maybe go to news outlets directly."
Zuckerberg says that the changes are meant to improve "well-being." Facebook's research shows that "meaningful" posts from friends and family on social media can do this. How does Facebook tell what's "meaningful"? Facebook executives have said that comments are the leading indicator, especially longer comments.
"The research shows that when we use social media to connect with people we care about, it can be good for our well-being. We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos -- even if they're entertaining or informative -- may not be as good," Zuckerberg wrote in a Facebook post.
People may end up feeling better in the coming months after visiting Facebook, but it's hard to see how they won't be subjected to more misinformation. People aren't going to stop using Facebook as a news source just because of this tweak. They're going to keep on visiting Facebook and expecting to see news.
And they will see some news. There will be stories that your friends or family share and comment on. That sounds fine, except that this is exactly how fake news spreads on Facebook — and how politicians and interest groups have been trained by Facebook to maximize their reach.
It's no accident that Facebook became a destination for news and politics. The social network works closely with political campaigns in the U.S. and abroad, convincing them to spend big money to push their messages. That's included working with Rodrigo Duterte, the president of the Philippines who has been accused of carrying out extrajudicial killings and shuttering news organizations. Facebook has a team dedicated to developing tools for politicians.
Meanwhile, Facebook's embrace of publishers made it a destination for news. Almost half of U.S. adults get news from Facebook. There's a good chance you're reading this after coming from Facebook.
What we already know about how fake news spreads on Facebook makes this a scary proposition.
Here's how: a group wants to spread a particular piece of misinformation or propaganda for whatever reason. It pays Facebook to show that content to people who are likely to share it, and those people then spread it around their networks.
Examples of this tend to center on politics and elections, but there are other types of scams circulating on Facebook. Right now, bitcoin and cryptocurrency are particularly hot.
We are left with a scary timeline: Facebook makes itself into a news destination. Facebook makes itself into a way for propaganda to spread. Facebook removes news and boosts signals that help propaganda spread.
It's hard to imagine Facebook hasn't taken this into consideration, but the past few years are littered with examples of the company not quite realizing what it has created or what's happening on its network, especially with regard to fake news. One former Facebook employee familiar with the News Feed said that the company's system is so complex that even its engineers don't know what will happen when they tweak the system.
So there you have it. Readers, who were trained to get their news from Facebook, are now going to see a bunch of posts based on signals that are perfect for the spread of fake news — after Facebook explicitly pushed governments to use its tools to get their message out.
That's bad news for readers — and their well-being.