Facebook users, unfortunately, are not the best at telling the difference between what is true and what is not. It is very common for users to share any news they see, whether or not it is actually reliable, without stopping to verify it first. This happens for two main reasons. First, some users want the false information to be taken as true. Second, many simply don't know the difference.
Widespread media illiteracy is the main factor that needs attention. It has been a challenge for too long and will likely last until either the end of Facebook or the end of time, depending on which comes first. Mark Zuckerberg, the founding father of Facebook, has decided to make it his mission to correct this digital media illiteracy by giving more prominence to trustworthy sources.
Zuckerberg shared in a Facebook post that he has asked his product teams to place special emphasis on ensuring that news is local, trustworthy, and informative. He also admits that there is far too much misinformation and polarization in today's world, and that his social media platform enables users to share information faster than ever before. Why this has only now become an important issue remains questionable.
There is no doubt that action needs to be taken. The difficult part is deciding which news sources should be considered reliable, especially in a world as divided as today's. Facebook could have tried to make this decision itself, but said it was not comfortable doing so. It considered hiring experts, who would have the responsibility of deciding what is objective and what is not. The other option was to ask the Facebook community itself to rank the reliability of news.
Facebook chose the latter, deciding to rely entirely on its users, along with their biases and subjectivity. Facebook explains that it will prioritize certain media outlets by surveying its 2 billion users on questions such as how familiar they are with a given source and whether or not they trust it. The concept behind this method is that some news organizations are trusted solely by their own readers, while others are trusted throughout society as a whole, even by people who do not directly follow them.
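Facebook has not published how these survey answers would be combined into a ranking, but the distinction it describes can be sketched in a few lines. The scoring formula below is purely an illustrative assumption, not Facebook's actual method: it counts trust only among respondents familiar with a source, then scales by how widely known the source is, so a niche outlet trusted only by its own readers does not outrank a broadly trusted one.

```python
def trust_score(responses):
    """Hypothetical trust score for a news source.

    Each response is a dict: {"familiar": bool, "trusts": bool}.
    Trust is measured only among respondents who know the source,
    then scaled by the source's reach (fraction of all respondents
    who are familiar with it). This formula is an assumption for
    illustration; Facebook's real weighting is unpublished.
    """
    if not responses:
        return 0.0
    familiar = [r for r in responses if r["familiar"]]
    if not familiar:
        return 0.0
    trust_rate = sum(r["trusts"] for r in familiar) / len(familiar)
    reach = len(familiar) / len(responses)
    return trust_rate * reach

# A niche source: trusted by everyone who knows it, but known to few.
niche = [{"familiar": True, "trusts": True}] * 2 + \
        [{"familiar": False, "trusts": False}] * 8

# A broadly known source: most respondents know it and most of them trust it.
broad = [{"familiar": True, "trusts": True}] * 7 + \
        [{"familiar": True, "trusts": False}] * 1 + \
        [{"familiar": False, "trusts": False}] * 2

print(trust_score(niche))  # 0.2  (perfect trust, but low reach)
print(trust_score(broad))  # 0.7  (high trust and high reach)
```

Under this toy scoring, the broadly known source wins even though the niche source has a perfect trust rate among its readers, which matches the distinction Facebook describes.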
Either Facebook is unaware of, or just not quite ready to handle, the fact that people inevitably have biases and loyalties toward certain sources. A user who likes to share a source's content will very likely vote that the source is highly trustworthy; otherwise, they would be openly admitting that they both read and share false information, and most of us have egos far too big to settle for that. The surveys, according to a Facebook representative, are intended to ensure that people see more content from their preferred sources, which should, in theory, also mean more trusted sources.
Facebook is giving all of the responsibility to its community and its algorithm, and is fairly reluctant to take any further measures, which could cause discomfort and a negative response from its users.
How the surveys will actually work has yet to be released, and it is still unclear what the effect will be on new or smaller sources. It seems that as long as Facebook exists, this problem will too.