Facebook Uses Automated Tools to Scan Your Messages for Abuse

Think your conversations on Facebook, Inc. (NASDAQ: FB) are private? Think again. 

Facebook has confirmed that it uses automated tools to scan Messenger conversations for malware links and images of child exploitation. It also allows users to report conversations that may violate its community standards. Through these channels, the company's moderators can review any messages flagged by users or by the automated systems.
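To make the reporting concrete, here is a minimal illustrative sketch of what this kind of automated scanning can look like: links are checked against a blocklist of known-malicious domains, attached images against a database of known-bad hashes, and any match is flagged for human review. The domain list, hash set, and function names below are hypothetical examples; Facebook's actual system is not public, and production systems rely on curated threat feeds and perceptual image hashing rather than simple digests.

```python
import hashlib
import re

# Hypothetical blocklist and hash database for illustration only.
MALICIOUS_DOMAINS = {"malware.example", "phish.example"}
KNOWN_BAD_IMAGE_HASHES: set[str] = set()  # in practice, populated from a hash database

URL_PATTERN = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)


def scan_message(text: str, image_bytes: bytes | None = None) -> list[str]:
    """Return reasons this message should be flagged for moderator review."""
    reasons = []

    # Check every linked domain against the blocklist.
    for domain in URL_PATTERN.findall(text):
        if domain.lower() in MALICIOUS_DOMAINS:
            reasons.append(f"link to blocked domain: {domain}")

    # Check an attached image against the set of known-bad hashes.
    if image_bytes is not None:
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in KNOWN_BAD_IMAGE_HASHES:
            reasons.append("image matches known-bad hash")

    return reasons


if __name__ == "__main__":
    flagged = scan_message("check this out: http://malware.example/payload")
    if flagged:
        print("Flagged for moderator review:", flagged)
```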

Facebook has long told its users that its workers can review posts to ensure they adhere to its community standards; however, many users assumed their chats on Messenger were private.

Facebook said in a statement on Thursday that keeping messages private is a high priority, but that ensuring the safety of its users and the wider community is also important, defending the automated tools as being “very similar to those that other internet companies use today.”

“The content of the messages between people is not used for ads targeting,” a company spokesperson said. “We do not listen to your voice and video calls.”

Facebook has come under intense public scrutiny after news confirmed that Cambridge Analytica, a data firm with ties to President Donald Trump's campaign, may have obtained and used information from millions of Facebook users without their knowledge. The revelations have raised questions about the social media platform, fueled demands for tougher regulation, and prompted calls for Facebook to be more candid about how it handles user data.

Messenger, Facebook's standalone messaging app that lets users text one another, came into focus this week after Chief Executive Officer Mark Zuckerberg said the company had “detected” that “sensational messages” were being sent via Messenger in Myanmar. Human rights advocates and journalists believe Facebook was being used to spread misinformation in the country, further inflaming ethnic violence against a Muslim minority group, the Rohingya.

“In that case, our systems detect that that's going on,” Zuckerberg said. “We stop those messages from going through.” Facebook clarified in a statement on Thursday:

“In this particular instance, a number of people reported receiving these messages which prompted us to begin investigating.”

Zuckerberg admitted on Wednesday that the company could do a better job of explaining what it does with user data. “[There are] many misperceptions about what we actually do,” he said. Zuckerberg is scheduled to answer questions from two U.S. congressional panels next week about how his company manages its user data.
