Humans are thinking creatures. Without thought, the only difference between us and other animals would be bigger brains and the power of speech.
But we exercise less of that brainpower every day, leaning instead on calculators, Google, and self-driving cars. We're not lazy; we simply prefer to offload the chores so we can carry on with the business of living. What we risk skipping, though, are the lessons in between, the ones that give neurons a chance to make new synaptic connections.
When we want to recall a statistic or dig a word out of our vocabulary, the brain combs through its library of facts and pictures and jogs the mind's memory.
Thinking is a bicep curl for the mind.
Yet today, we're more likely to outsource that chance to think, choosing machine exactitude rather than admitting our weaknesses and coping with uncertainty.
Nevertheless, what most digital naysayers don't realize is that new technology, whether the rise of machines in the Industrial Revolution or the smart computers driving artificial intelligence, always brings new things to learn, such as coding. Coding feeds the machines and tells them what to do. Still, we should resist becoming the tools of our tools, as Thoreau admonished.
We're both winning and losing at the neurocognitive level even as we advance society. The hard part will be holding on to ambiguity, the strange space in between, because data will always feel the need to identify and fix things. Most importantly, what thinking teaches us is that it's OK to be wrong.