“People don’t really check that things are working,” she tells Fast Company. “They don’t even know how to ask the question.”
For the logo, Cathy O’Neil requested the designer Katie Falkenberg make it look “fat and fierce.” I think they just about nailed it.
Right now, the seal is a simple ring design with ORCAA’s killer whale logo and text that reads, “Algorithm audited for accuracy, bias, and fairness,” with the date. Falkenberg hopes to one day update it so it is timestamped from the date it’s uploaded to a company’s website. Because algorithms are constantly changing, Falkenberg wants the seal to let users know when an algorithm was last certified. O’Neil says algorithms should be regularly audited, perhaps once every two years or so, depending on the complexity of the code. Falkenberg also hopes to link the seal to O’Neil’s website so users can understand exactly what it means when they see it.
“Another flaw in human character is that everybody wants to build and nobody wants to do maintenance,” said Kurt Vonnegut.
Everybody wants to start something, but rarely wants to maintain it.
The problem with growth at any cost is that it blinds companies to integrity. Instead of leading by example, the race to the bottom unearths the deepest greed.
“The selfish reason to be ethical is that it attracts the other ethical people in the network.” Naval Ravikant
That’s the lesson of Facebook, the so-called ‘behavior modification empire.’ The social network cut corners on data collection to make another buck. No, Facebook, we will not answer any more questions “to help people get to know us.” Replace the word “people” with “the attention merchants.”
The Cambridge Analytica scandal was the nudge Facebook needed to become more accountable. Seizing the data of others and building on top of it contorts the machinery of morality. Sometimes the genie of innovation has to be kept in the bottle.
#Zuckerberg starts spewing technical lingo that these senators do NOT understand. It makes him SOUND smart & like he's answering the question. The senators don't want to look dumb by challenging him, so advantage to Zuck.🙄
This was interesting. An older senator wanted Zuckerberg to pay lip service to American exceptionalism. He wasn't interested. Indicative maybe of a broader generational gap—millennials not so moved by such needless piety. pic.twitter.com/wCRdut9nWX
Facebook can’t pin the blame on the machine-optimizing algorithms. It’s humans who are responsible for managing the equations and policing validity. A recent study also showed that it is humans, not bots, who spread fake news.
Data is the new oil
Even worse, says Tufekci, the precedent sets the stage for those in power to leverage data to their own advantage:
We’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it.
But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.
Tufekci paints the picture of a haunting dystopia at our doorstep. And it’s the social networks, which started off so benign, that may be opening the maw of hell.
Outside parties were abusing stolen Facebook data to develop psychological profiles of voters. The data mining company Cambridge Analytica was central to the information warfare. They allegedly worked with Russians to stoke fears in the UK and America on immigration and other polarizing issues. So people got fake news and conspiracy theories in their feeds, which helped fuel Brexit and Trump.
Facebook is like an adult video game. People are obsessed with the sensational. And reality pays the price of fabricated events.
‘Move fast and break things’ may be a popular hacker’s motto, but it’s shown to breed more carelessness than good. Thankfully, Facebook, Instagram, Twitter, and YouTube are facing up to the truth that while their tools bring us closer together, they also tear the world apart.
The damage has been done. The question now is how will they fix it? Some argue that the crackdown on Cambridge Analytica is just the start. Others like Om Malik are less optimistic. Pumping users and engagement is in Facebook’s DNA, regardless of the consequences. Om writes:
Facebook is about making money by keeping us addicted to Facebook. It always has been — and that’s why all of our angst and headlines are not going to change a damn thing.
Screens are contagious. If we see one person look at their phone, we emulate them, much as we do when we catch someone yawning.
But the addiction is not totally our fault. With the vibrant colors of apps, the dopamine of Facebook likes and news alerts, on top of serving as a consolidated utility of our camera, wallet, and communications device, our phones are designed to hook us.
It’s amazing that in this post-internet world of surfeit information and 24/7 conversation we can even concentrate at all. We’ve numbed our thumbs from excessive use.
We’ve lost those little gaps of solitude and doing nothing where we reaped the benefits of a wandering imagination.
Can we get our bored minds back?
There are plenty of options other than riding the Facebook or Google monopoly on our attention. For as many tricks as these companies play on us, there are as many tips to get away from them: turning our screens grayscale, just sitting and staring out the window, and, at the most extreme, throwing our phones into the ocean.
We can only harvest quality attention if we can escape the torment of distraction and the external stimuli fighting for space inside our heads. The world around us already creates a theater of its own. We see the world once, with an intrinsic pair of eyes, with no need to record the outside world with a third eye.
“Attention is a form of prayer,” wrote French philosopher Simone Weil. We should insist on slowing down if we’re to restrengthen the human will.
“Reality is an activity of the most august imagination,” said poet Wallace Stevens.
What we call reality emerged from human ingenuity. So if we can take today’s tools and use them for good, we’ll naturally have a better future.
Instead, we are building technology that paints a future dystopia. Hackers hijacked Facebook, Google, and Twitter and filled them with fake news during the 2016 election. What did we think was going to happen with free-flowing information? “The art of debugging a computer program is to figure out what you really told the computer to do instead of what you thought you told it to do,” quipped Andrew Singer, director of electrical and computer engineering at the University of Illinois. Meanwhile, Amazon is replacing its workers with bots.
While we can expect software manipulation to continue, there are still reasons to be hopeful. As Tim O’Reilly points out, we should be looking at ways to work with artificial intelligence to fuel productivity and innovation.
We have to make it new. That’s a wonderful line from Ezra Pound that’s always stuck in my brain: “Make it new.” It’s not just true in literature and in art, it’s in our social conscience, in our politics. We have to look at the world as it is and the challenges that are facing us, and we have to throw away the old stuck policies where this idea over here is somehow inescapably attached to this other idea. Just break it all apart and put it together in new ways, with fresh ideas and fresh approaches.
We have a choice: we can deny optimism and permit darkness or we can build a brighter future. For every time Google chooses to be evil, or Facebook invades our privacy in an attempt to make stockholders happy, there’s another rocket Elon Musk is building that takes us from New York to Shanghai in 39 minutes.
There’s a lot to be hopeful for, and experimentation should continue to be encouraged. The real question is how we can create a society that accommodates both rapid technological advancement and reflexive sociopolitical change. How do we ‘make it new’ without throwing out the stuff that made it good in the first place?
It is a canard to think that math can’t fail. All you need to do is look at the way society constructs algorithms, from job and college applications to Facebook feeds, to see that sorting can be wrong and biased.
In the case of the 2016 election, algorithms did more harm than good. Facebook fed the internet silos with fake news. As Cathy O’Neil, author of Weapons of Math Destruction, puts it in a 99% Invisible podcast: “The internet is a propaganda machine.”
We’ve adopted the factory mindset of mass-sorting, leaving the anxiety of decision-making up to machines. Humans become pieces of data, waiting to be sorted into buckets like ‘least valuable candidate’ or ‘least valuable customer.’
There are too many of us and not enough time to make individual considerations. But a conversation around algorithmic frailty might do us some good. Making generalizations impedes the magic of discovering an outlier.
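The kind of quiet bias O’Neil warns about is easy to reproduce. Here is a minimal, entirely hypothetical sketch in Python: an applicant-scoring function that never sees a protected attribute, yet still discriminates because it weights zip code, a common proxy for historical inequality. The function name, zip codes, and income figures are all invented for illustration, not drawn from any real system.

```python
# Hypothetical illustration of proxy bias in algorithmic sorting.
# The model never looks at race, gender, or age -- but by rewarding
# a "neighborhood" feature, it quietly encodes existing inequality.

# Made-up average incomes per zip code (the proxy feature).
ZIP_INCOME = {"10001": 90_000, "60624": 30_000}

def score_applicant(years_experience: int, zip_code: str) -> float:
    """Score a job applicant; higher is 'better' to the sorter."""
    experience_points = years_experience * 10
    # This term rewards applicants from wealthier zip codes,
    # penalizing otherwise identical candidates for where they live.
    neighborhood_points = ZIP_INCOME.get(zip_code, 50_000) / 10_000
    return experience_points + neighborhood_points

# Two applicants with identical qualifications, different addresses:
a = score_applicant(5, "10001")  # wealthy zip code
b = score_applicant(5, "60624")  # poorer zip code
print(a, b)  # same experience, yet different scores
```

An audit of the sort O’Neil’s ORCAA performs would flag exactly this: the inputs look neutral, but the outcomes diverge along lines the designer may never have intended.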
Social networks are specifically designed to keep us hooked as long as possible. No matter how aware we are of the entrapment, the exit door never tempts us enough to permanently leave.
Yet we ‘users’ are literally the ones being used and tracked so Big Brother can sell our data to advertisers. We permit cookies to follow our behavior all over the web, from shopping to googling health-related issues.
The internet offers us a marketplace of ideas, yet we find ourselves collecting resources and stealing other people’s opinions to reconfirm our own.
Rather than tapping into the common goodness of those who disagree with us, we pat the backs of and converse with those who serve us their boilerplate bullshit.
Stuck in boxes, spied on and taxed, it’s no wonder the internet makes us so antsy we can’t get along. Writer Noah Smith puts it best: “15 years ago, the internet was an escape from the real world. Now, the real world is an escape from the internet.”
The only threat to the longevity of Facebook is that it makes people feel like shit.
Facebook’s relationship with its users, the product, is deeply psychological. It wants us to post whatever we want, but all we end up doing is comparing our lives to other people’s from inside our own cocoons. We are ambiently aware of what everyone in our feed is doing.
The internet is a vast space of potential connectedness yet our relationships are usually with like-minded people. Our ideological bunkers reconfirm our beliefs, whether the content is real or fake.
The benefit of connectedness is proximity at scale — we can chat with a friend from the couch while Facebook surrounds us with ads like we’re standing in the middle of Times Square. Facebook is surveillance, and we give Big Brother the benefit of the doubt in selling our information to marketers in exchange for the ease of communication with so-called ‘friends.’
Facebook wants us to present our best selves online. It couldn’t care less about authenticity, since it is our curated selves that generate clicks and thereby give Facebook Ads a chance to make more money.
Facebook purports to be the social network that upholds your real identity, but its attention-based algorithm is psychologically damaging. The platform profits from fantasy, loneliness, and mimetic desire. Facebook persuades us to live the life we don’t want, thereby infringing on the personal liberty of making decisions that are close to our heart. Impressing others drains the soul of what we really want to do: express our uniqueness.
Facebook is the world’s biggest copy machine. It tries to box us in and disregard the person we really want to be. We are hooked on its expectations of conformity and insularity.
The software and hardware companies like Apple, Google, and Facebook want us to trust them. The theory is that our information is better kept stored with them in a private cloud rather than with the government. Outside America, however, the NSA can collect our information without a search warrant.
The internet companies are not only American-based; their manifest destiny makes them look like hegemonic colonizers.
“This is a dilemma of the feudal internet. We seek protection from these companies because they can offer us security,” says Maciej Cegłowski. “But their business model is to make us more vulnerable, by getting us to surrender more of the details of our lives to their servers, and to put more faith in the algorithms they train on our observed behavior.”
We are all citizens of tech companies, trading privacy for free communication. But the users are the ads and the coders are the kings, the latter converting our interests and attention into ad revenue.
Technology platforms appear to be doing more harm than good. Most recently, they’ve facilitated fake news and ushered in FOMO-driven mental health issues.
The internet is as indispensable as water, but it’s also a perceptible threat when the few who run the show are creating new problems while hesitating to solve them.
Facebook makes you unhappier because it produces envy. We always want in our feeds what we don’t have in real life: a stable relationship, a high-paying job, a weekend vacation in the Caribbean, a beautiful house, a new car, the latest gadgets; the list goes on.
But social media is edited real life. We tend to over-post happiness and under-post negativity. Who’s going to share their mental illness, a divorce, or a death in the family? That’s sad stuff, even if Facebook allows you to respond with a weepy face instead of a thumbs up.
We usually post things as we wish they were, not as they are. Social media presents the best of the best, an online Truman Show that excludes the beautiful struggle in between. At the very least, social media is pseudo-news that often omits context.
For many people, Facebook is their sole newspaper. One of the primary roles of a newspaper is to validate events rather than spread false rumors.
But fake news runs rampant on the platform because anyone can post it without consequence. Facebook does nothing to validate sources, especially since it fired its human curators and replaced them with an algorithm that amplifies noise, true or false. Twitter is equally culpable.
Should we believe anything on social media platforms? Probably not. But the press isn’t exactly trustworthy either. It also has an agenda, one that revolves around whatever drives the most site traffic and clicks.
Misinformation and lies are at the root of chaos. Even the smartest people can often be the most gullible, duped by comedians faking death.
If marketers are liars and social media is edited real life, people must also interpret the news with a grain of salt. Doubt everything.