Facebook can’t pin the blame on its machine-optimized algorithms. Humans are responsible for managing the equations and policing their validity. A recent study also found that it is humans, not bots, who spread fake news.
Data is the new oil
Even worse, says Tufekci, the precedent sets the stage for those in power to leverage data to their own advantage:
We’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it.
But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.
Tufekci paints a picture of a haunting dystopia at our doorstep. And it’s the social networks, which started off so benign, that may be opening the maw of hell.