Facebook took a hit on Wall Street today, with its stock tanking by 7% and losing billions in market value, and as the company continues to lose the trust of its users, the fall may not stop there. Facebook is facing mounting criticism over its handling of user data, over the billions of dollars it earns from ads that promote discrimination, and over the fake news rampant on the network.
As if being linked to influencing the 2016 election were not bad enough, Facebook is now linked to a truly horrific act of violence: a genocide. Recent reporting by the Observer, the Guardian, and the New York Times revealed that Cambridge Analytica — the social media monitoring firm that publicly acknowledged it helped put Trump in the White House — gained access to the data of 50 million Facebook users with impunity. And Reuters reported earlier this month that UN officials said Facebook played a role in a possible genocide against Rohingya Muslims in Myanmar. I cannot help but wonder what other horrors Facebook may be linked to. What will it take for its CEO and its board to take definitive action and better protect its users?
Researchers want more data — but Facebook is unwilling to share it.
A team of researchers from the Max Planck Institute and Northeastern University has examined these issues and published a paper. They believe they may have figured out how to end ad discrimination on Facebook once and for all with a few simple steps. ProPublica has reported that Facebook has not stopped advertisers from targeting ads by race, gender, and political leanings, potentially violating discrimination laws around housing and employment. In a paper last month, the researchers laid out how such a system could be applied, with the hope that these steps could redefine how ads are targeted online. They suggest starting by judging an ad based on the overall set of users it reaches, rather than the targeting inputs used to reach them. But they immediately ran into a problem: the lack of transparency at Facebook itself. For the system to work, the next step would be to find out how many Facebook ads might end up blocked by the new anti-discrimination rules, but Facebook's current data policy means that quest is met by an inscrutable wall. All ad buyers can see the targeting tools, but only Facebook can see how advertisers are actually using them.
“We just don’t know how the ad-targeting system is being used,” says Northeastern University professor Alan Mislove, who co-authored the paper. “Only Facebook has that data, which is an unfortunate situation.”
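To make the researchers' outcome-based idea concrete, here is a minimal sketch of what such a check might look like. Everything in it — the function names, the attribute labels, and the cutoff (loosely inspired by the four-fifths rule used in US employment law) — is hypothetical and not drawn from the paper; the point is only that the test looks at who an ad actually reaches, not at the targeting inputs.

```python
from collections import Counter

def demographic_skew(targeted_users, reference_users, attribute):
    """For each value of a demographic attribute, compute its share in the
    targeted audience divided by its share in a reference population."""
    t_counts = Counter(u[attribute] for u in targeted_users)
    r_counts = Counter(u[attribute] for u in reference_users)
    t_total = sum(t_counts.values())
    r_total = sum(r_counts.values())
    skew = {}
    for value, r_count in r_counts.items():
        t_share = t_counts.get(value, 0) / t_total
        r_share = r_count / r_total
        skew[value] = t_share / r_share
    return skew

def is_discriminatory(targeted_users, reference_users, attribute, threshold=0.5):
    """Flag an ad if any group's representation in the reached audience
    falls below `threshold` times its share of the reference population.
    The 0.5 cutoff is an illustrative assumption, not from the paper."""
    skew = demographic_skew(targeted_users, reference_users, attribute)
    return any(ratio < threshold for ratio in skew.values())

# Example: an ad whose audience almost entirely excludes one group.
reference = [{"group": "A"}] * 50 + [{"group": "B"}] * 50
targeted = [{"group": "A"}] * 95 + [{"group": "B"}] * 5
print(is_discriminatory(targeted, reference, "group"))  # True
```

Note that even this toy version needs the one thing Facebook does not share: data on which users each ad actually reached.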
This opacity may appeal to those with less-than-honorable intentions, but to its critics, the social network spreads conspiracy theories, is complicit in Russian influence campaigns, and plays an active role in undermining democracy around the globe.
The lack of transparency isn't limited to targeted ads; it extends to bots and dummy accounts. Business Insider reported that there are more than 20 million dummy accounts on Facebook.
At this point, the question all Facebook users should ask themselves is whether Facebook can be trusted with our personal information. Unfortunately, based on the latest revelations, the answer seems to be a resounding no.
The Scope Weekly contacted Facebook for comment but had not heard back from the company at the time of publishing. If new information is provided, the article will be updated accordingly.
Remember to follow The Scope Weekly on Twitter, Facebook, and Instagram. If you would like to become a contributor to The Scope Weekly, read our submission guidelines and apply. For product reviews, click here. We welcome your ideas and recommendations.