We’ve gone from the invention and proliferation of 15th-century mirrors to human iPhone selfies, to the Google Street View robot taking pictures of itself in mirrors.
Everything starts on paper.
Whether you are using Post-it notes or loose leaf, paper is ideal for getting thoughts down and mapping out ideas quickly. In fact, some Google employees ban phones from brainstorms and use paper exclusively. The magic of writing in analog lies in its controlled speed, flexibility, and focus.
“Everyone can write words, draw boxes, and express his or her ideas with the same clarity.”
If computers are a bicycle for the mind, as Steve Jobs once proclaimed, then writing on paper is like taking a walk. Paper jogs the mind: it is slow yet methodical, letting us connect the dots between disparate things.
“As with music, so with thought: when you want clarity, you seek out paper. Paper is the slow food of thought.”
As much as technology facilitates creativity, it can also distract from it. Various studies show that taking notes by hand helps students remember more. Physical books, like vinyl, are still hanging around despite the popularity of e-readers. Meanwhile, handwritten letters are considered more meaningful because of the perceived effort that went into writing and mailing them.
Digital abundance drives up the value of scarce objects like paper. Paper is proving its longevity not just as a nostalgic medium but also because it benefits the process of thinking and planning.
“As long as everyone is thinking and writing stuff on paper, you’re on the golden path.”
The smartphone hypnotizes us into screen-glaring addicts.
We have zero control over our attention, and it makes us feel like we’re losing our minds. Farhad Manjoo writes in his piece “We Have Reached Peak Screen”:
Screens are insatiable. At a cognitive level, they are voracious vampires for your attention, and as soon as you look at one, you are basically toast.
There are studies that bear this out. One, by a team led by Adrian Ward, a marketing professor at the University of Texas’ business school, found that the mere presence of a smartphone within glancing distance can significantly reduce your cognitive capacity. Your phone is so irresistible that when you can see it, you cannot help but spend a lot of otherwise valuable mental energy trying to not look at it.
Apple and Google, the companies that got us hooked in the first place, are now trying to reduce screen time by outsourcing tasks like to-dos to voice assistants such as Siri.
If Apple could only improve Siri, its own voice assistant, the Watch and AirPods could combine to make something new: a mobile computer that is not tied to a huge screen, that lets you get stuff done on the go without the danger of being sucked in. Imagine if, instead of tapping endlessly on apps, you could just tell your AirPods, “Make me dinner reservations at 7” or “Check with my wife’s calendar to see when we can have a date night this week.”
That candy-colored rectangular glow is too seductive, a trap that leads into a ludic loop of distraction. It’s about time the tech companies, like carmakers did with seat belts, did something to preserve our neurological safety.
The internet owns our words.
Anyone can pull up an old Tweet or Facebook post and show you ‘this is what you said.’ The internet makes permanent the written word.
But such posts are usually “naked and without context.”
It’s not that people don’t look at the time stamp; it’s that words get unstuck from time. They are instantly indexable and can be copy-pasted with a click, reemerging from the abyss of dormancy.
Writes Peter Pomerantsev in his article “Pay For Your Words”:
“There is a sense that words have slipped the leash. We think we’re expressing ourselves, but actually we’re just leaving a data imprint for someone else to make use of. Whether we write an email, a Facebook message, store content on a Google drive, or type out a text, all of what we write is sucked into a semantic web.”
But a photo lives and dies from the second it’s taken. It’s born with a frozen setting, a time and a place. Our eyes taste pictures with the past, even before we gaze to analyze them.
“But you can push away from the photo of yourself: it was a younger you, you look different now. Words are different. They feel ever-present, always as if you’ve just said them. It’s harder to disentangle yourself. ‘You will pay for those words’ goes the banal phrase – no one ever says ‘you will pay for that photo’.”
If we are accountable for what we say, why write anything at all if it can come back to bite you? The durability of the written word appears riskier than ever.
Are we selling our souls for ads?
Technosociologist Zeynep Tufekci seems to think so. The Cambridge Analytica-Facebook debacle demonstrates the Wild West of data exploitation.
Facebook can’t pin the blame on the machine-optimizing algorithms. It’s humans who are responsible for managing the equations and policing validity. A recent study also found that it is humans, not bots, who spread fake news.
Even worse, says Tufekci, the precedent sets the stage for those in power to leverage data to their own advantage:
We’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it.
But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.
Tufekci paints the picture of a haunting dystopia at our doorstep. And it’s the social networks, which started off so benign, that may be opening the maw of hell.
“Reality is an activity of the most august imagination,” wrote poet Wallace Stevens.
What we call reality emerged from human ingenuity. So if we can take today’s tools and use them for good, we’ll naturally have a better future.
Instead, we are building technology that paints a future dystopia. Hackers hijacked Facebook, Google, and Twitter and filled them with fake news during the 2016 election. What did we think was going to happen with free-flowing information?
“The art of debugging a computer program is to figure out what you really told the computer to do instead of what you thought you told it to do,” quipped Andrew Singer, director of electrical and computer engineering at the University of Illinois. Meanwhile, Amazon is replacing its workers with bots.
While we can expect software manipulation to continue, there are still reasons to be hopeful. As Tim O’Reilly points out, we should be looking at ways to work with artificial intelligence to fuel productivity and innovation.
We have to make it new. That’s a wonderful line from Ezra Pound that’s always stuck in my brain: “Make it new.” It’s not just true in literature and in art, it’s in our social conscience, in our politics. We have to look at the world as it is and the challenges that are facing us, and we have to throw away the old stuck policies where this idea over here is somehow inescapably attached to this other idea. Just break it all apart and put it together in new ways, with fresh ideas and fresh approaches.
We have a choice: we can deny optimism and permit darkness, or we can build a brighter future. For every time Google chooses to be evil, or Facebook invades our privacy in an attempt to make stockholders happy, there’s another rocket Elon Musk is building that takes us from New York to Shanghai in 39 minutes.
There’s a lot to be hopeful for, and experiments should continue to be encouraged. The real question is how we can create a society capable of both rapid technological advancement and reflexive sociopolitical change. How do we ‘make it new’ without throwing out the stuff that made it right in the first place?