After the scandal in May over allegations that Facebook was suppressing conservative news in its trending section, the team of editors that previously checked stories for accuracy and helped determine their popularity was dismissed in favor of a hopefully unbiased group of bots that would find the most popular news stories and bring them to the public’s eye. In place of the editors, there is now a team of engineers who work on the bots rather than the stories. However, this change has led to a series of problems for the social media site. The trending news tab has begun to collect and present hoax stories as if they were real. The algorithm works by looking through several news stories on a specific subject and using a sentence from one of those stories as the main headline, which is what people see when they click on the word or hashtag in the trending tab. The problem is that the algorithm sometimes picks stories that are connected to the topic but are not true. This is a serious problem because, while hoaxes and satirical stories are not inherently malicious, people who believe them because of their trust in Facebook can be seriously misled.
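To make the failure mode concrete, here is a minimal, hypothetical sketch of the headline-selection step described above. The function name, data shape, and popularity scoring are all assumptions for illustration, not Facebook's actual code; the point is only that a selector driven purely by popularity has no notion of accuracy.

```python
# Hypothetical sketch: pick a trending headline from a cluster of stories
# on the same topic. Nothing here checks whether the chosen story is true.

def pick_trending_headline(stories):
    """Return a headline sentence for a trending topic.

    Each story is a dict with 'text' and 'shares'. This toy version takes
    the first sentence of the most-shared story -- so a viral hoax beats
    an accurate but less-shared report.
    """
    top_story = max(stories, key=lambda s: s["shares"])
    first_sentence = top_story["text"].split(". ")[0]
    return first_sentence

stories = [
    {"text": "Kelly remains at the network. Ratings are steady.", "shares": 1200},
    {"text": "Kelly fired as a traitor. Sources say she backed Clinton.", "shares": 9800},
]
print(pick_trending_headline(stories))  # the hoax wins on popularity alone
```

Under this (assumed) design, no engineer ever looks at the winning sentence; removing the hoax requires someone to notice it after it is already trending.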
An example of this is a story about Fox News anchor Megyn Kelly, which claimed that Kelly had been called a “traitor” and fired from Fox News for supporting the Democratic nominee, Hillary Clinton. Although the story was false and riddled with typos, it held a spot as one of the highest-trending stories of the day for several hours. Facebook eventually removed the false story, but the damage had already been done. Nor was it the only story to slip through Facebook’s trending section: another fake story trended claiming that 9/11 was a conspiracy and that the World Trade Center was actually blown up with bombs.
The use of less-than-perfect bots is a problem throughout the internet. On YouTube, for example, video creators have had their videos taken down because of bogus copyright strikes moderated by ineffective bots. At this point, it is not a good idea to rely almost entirely on bots to do the delicate work these sensitive areas require. Instead of getting rid of bias, it is creating falsehoods.
Facebook’s attempt to get rid of human bias is a noble effort, but it is not yet effective. If Facebook’s algorithm evolves to a point where it is mostly safe from false stories, using it in place of potentially biased editors would be a great idea. For now, though, Facebook needs to bring humans back into the mix until the technology is ready to work properly. And to help Facebook weed out fake or satirical stories, Facebook users need to stop sharing and promoting stories that are not true.