Much noise has rightly been made about the role Facebook played in the 2016 US presidential election. Critics have pointed to targeted ad campaigns by Russian organizations as proof that the Menlo Park-based company wasn't minding the store, and alleged that disaster followed as a result.
But that argument overlooks one key point: In displaying microtargeted "dark ads" to users, Facebook was doing exactly what it was built to do. The larger problem is not these specific Russian ads (which Facebook refuses to disclose to the public), or even that Donald Trump was elected president, but the very system upon which the company is built.
Mark Zuckerberg's plan to increase transparency around political advertisements, while welcome, falls into the same trap. Yes, more disclosure is good, but what remedy can it offer when the underlying architecture itself is gangrenous?
Zeynep Tufekci, author of Twitter and Tear Gas and associate professor at the University of North Carolina at Chapel Hill, made this point painfully clear in a September TED Talk that dove into how the same methods designed to better serve us ads on platforms like Facebook can be deployed for far darker purposes.
"So Facebook's market capitalization is approaching half a trillion dollars," Tufekci told the gathered crowd. "It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change."
Tufekci further argued that when machine learning comes into play, humans can lose track of exactly how algorithms work their magic. And, she continued, not fully understanding how the system works has potentially scary consequences, such as advertising Las Vegas trips to people about to enter a manic phase.
This concern is real. Facebook can now infer all kinds of data about its users, from their political views, to religious affiliations, to intelligence, and much more. What happens when that power is made available to anyone with a small marketing budget? Or, worse, an oppressive government?
"Imagine what a state can do with the immense amount of data it has on its citizens," noted Tufekci. "China is already using face detection technology to identify and arrest people. And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads."
Facebook bills itself as a company trying to bring "the world closer together," but the truth of the matter is far different. It is, of course, a system designed to collect an endless quantity of data on its users with the goal of nudging us toward whatever behavior the company believes is in its best interest, be that purchasing an advertised item, voting, or being in a particular mood.
That's a fundamental problem that cuts to Facebook's very core, and it is not one that a new political advertisement disclosure policy will fix.