In July, Danielle Bostick joined a nationwide movement against sexual violence on school campuses when she made a Facebook page for her daughter. "Justice for Francesca" is meant to raise awareness about the 15-year-old, who was sexually assaulted last summer by a classmate she didn't know. For weeks, Bostick used the page to share articles about Francesca's case and those of other students in similar circumstances.
But recently, Bostick began running into problems with Facebook’s platform. The social network wouldn’t let her share several news stories about Francesca, instead marking them as spam or abuse. The issue is the result of a mistake on Facebook’s part, but the incident highlights the unexpected, local impact that a global news and social platform can have on a community.
Facebook barred Bostick from posting two local news articles published by The Winchester Star, a daily newspaper in the Virginia city where she and her family live. The first, published in June, was about a school board meeting during which Francesca and her family spoke out about her case. Six weeks later, the outlet published another story, this time concerning the appointment of Winchester's new Title IX officer, whom Bostick had publicly criticized. She couldn't share either to the social network.
"I tried it from my computer, I tried it from Chrome, I even tried Internet Explorer. I was just trying all these different ways, it just wasn't working," says Bostick. "I just started getting really suspicious that I couldn't share these articles anymore. This is just a flawed system where you're unable to share links from legitimate news sources."
The Winchester Star is the kind of regional newspaper that many US towns and cities are rapidly losing as newsrooms shut down or shed staff across the country. And it’s the kind of hyper-local news outlet that Facebook said in January it would give priority to in its algorithmically generated News Feed.
And yet, the platform blocked users from sharing the Winchester Star stories about Francesca's case. When WIRED tried to share the two articles to Facebook on Tuesday, one was marked as being against Facebook's Community Standards and the other was removed as spam.
"It's been an annoyance more than a problem. We don't think certain stories are being targeted; it seems kind of random," says Brian Brehm, a reporter at The Winchester Star and the author of the stories. He says several other articles from his publication have also experienced issues with Facebook. "I wrote a story about a dog who died, and it got blocked."
After WIRED reached out to Facebook with links to the stories, the articles were no longer blocked as of Wednesday. Facebook says they were flagged as a result of a problem with its spam detection efforts.
"We maintain a set of anti-spam systems to identify potentially harmful links and stop them from spreading in an effort to help keep spam off of Facebook," a spokesperson for the company said in a statement. "In this case, our automated systems incorrectly blocked these links. We worked to fix this issue as quickly as possible and the URLs should now be able to be posted. We're very sorry about this error and any inconvenience it may have caused."
Facebook says the issue is separate from a similar one it experienced on August 24, which erroneously caused some posts to be marked as spam, reportedly including an opinion piece written by a New York Post columnist.
The Winchester Star is not the only news outlet that has struggled with Facebook's filters in recent weeks. Earlier this month, the company mysteriously removed the English-language page for Telesur, a state-owned Latin American news network, according to The Intercept. The social network reportedly took down the page because it detected "suspicious activity." It has since been restored.
Part of what may have tripped up The Winchester Star stories is the newspaper's lack of encryption. The site doesn't use HTTPS, a secure communication protocol utilized by most major websites, including mainstream news outlets.
Facebook is also under tremendous pressure to rid itself of the kind of fraudulent news sites that spread misinformation during the lead-up to the 2016 presidential election. The company has recently taken a more proactive approach to weeding out fake accounts and activity; there were bound to be some false positives.
Earlier this month, Facebook also announced it would begin assigning users a "reputation score" based on how trustworthy they are when it comes to reporting fraudulent news stories. It's one measurement the company plans to use to weed out genuinely fake stories from ones a particular user simply doesn't like. The scores are not made public, and how exactly they are calculated remains opaque.
While every online platform will inevitably make errors, the stakes are uniquely high for Facebook, which has come to play a role in millions of Americans' civic lives. Facebook is where a reported 45 percent of Americans get their news, according to the Pew Research Center. We also often implicitly trust the social network to work properly, sometimes struggling to assign blame for what is very likely a genuine technical mistake.
When Bostick noticed she couldn't share the stories about her daughter, she feared the worst. She worried the school district had hired a public relations firm to try to curb the reach of negative coverage, or that someone in her town was purposely reporting the stories to Facebook to have them taken down. Facebook is designed to bring people together, but when the platform makes mistakes, it can end up sowing distrust. Bostick's suspicion was also fueled by the fact that she was unable to reach someone at Facebook who could explain to her why the articles were being blocked.
"They can say their system is not perfect, but what is their system? There doesn't seem to be a human element," she says.
If anything, Bostick's experience highlights how Facebook has become a necessary part of how millions of Americans interact with their own communities. Protesting a local school board decision or sharing a news article often now intrinsically involves the social network. Bostick and her daughter live across the country from the company's headquarters in Menlo Park, but they can be impacted when its employees, or its algorithms, make an error.