Oct/Nov 2018 Salon

You Think What You Are

by Thomas J. Hubschman

Public domain image adapted by Tom Dooley

We think that we think with our prefrontal lobes, our so-called conscious mind. But it ain't so. We think with our entire body. That's why the idea that Artificial Intelligence or computers could replace us is absurd. Unless we made a computer out of material and in a form identical to the flesh-and-blood ones we already have—in which case it would be human—AI can never replace us.

We imagine our minds as the experience that takes place between our ears, especially the rational and conscious part of that experience. But our mental states are the products of our physical reality, what we condescendingly call our "body," all our body. Even abstract thought as an attempt to stand outside the reality of our bodies and the world beyond relies on our physical experience to find expression for what we call "ideas."

When I listen to music, I don't just do so with my ears and brain. Every organ in my body is involved. The very structure of the music, no matter whether it's a Beethoven sonata or a bit of punk rock, derives from the way my GI tract, my respiration, my heart, and most importantly, my sexual organs and endocrine systems work. The musical climax is a facsimile of the sexual one in all its stages and varies from composer to composer as much as a sneeze varies from one individual to another. The sense of longing, of pain, of satisfaction derives from my experiences of those feelings in my body.

The same thing is true for our so-called abstract thinking. We are simply not aware of how it follows the same patterns as our more obvious flesh-and-blood experience, just as we are not aware how it occurs before we realize or express it. Consider how we talk. We don't generate ideas and sentences in our conscious mind. The words come tumbling out without effort for the most part, already formed, charged with feeling no matter how dry the subject matter may appear to be. Thought is as emotional as anything else we experience. It involves anticipation, controlled release, catharsis. It wouldn't be thought without those feelings, and those feelings are all cognate with what happens in my gut when I'm hungry or gaseous and in my lungs when I climb several flights of stairs.

In how many other ways do we experience tension and release? An itch and scratch, a good cough, eating and drinking... the list is almost endless. And all these experiences are part and parcel of the shape of our minds, which vary from individual to individual much more than we would like to believe, despite a common basic structure. What we think of as mental or emotional, i.e., non-physical (whatever that means), is the same as what we experience in our muscles, bowels, and other body parts, or in our bodies as a whole (i.e., in our selves).

Our experience of time, of succession and duration, is a given for the way we think about anything. It's why we're so addicted to narrative, including the most abstract kind. And time is a construct of what we experience through our senses. There is no "objective" reality to time. If we and other sentient things did not exist, neither would time. The rocks and stars don't experience time. We learn time through the beat of our heart and the breaths of our lungs. Trees know it in their own ways, fruit flies in theirs. To speak of the first few seconds of the universe, except as a feeble metaphor, is to engage in anthropomorphism with a cosmic hubris. But the only way we can experience anything is by anthropomorphizing it, however elegant our mathematical or scientific model.

Yet we do make sense of things like the Big Bang or how to get from Philadelphia to St. Louis by using those very faculties that form nothing more than models of—what?—the "real" world? There is no such thing, only perceptions of it—ours, earthworms', insects' (Drosophila melanogaster, the common fruit fly, apparently dreams). We only need to know enough so that the knowledge works for us. Dogs and cats don't need to know what we do, and vice versa. Our brains were constructed over millions of years to be good at what they do, which is very good indeed but superficial in terms of how the world actually works.

When Isaac Newton introduced the theory of gravity, he was roundly criticized for appealing to the occult and setting back science hundreds of years. Newton was apologetic but insisted such a force was at work and that his mathematics proved it. What all human beings and other animals understand from our earliest years is that, in order to make something move, something else has to come in contact with it. We have no experience of motion taking place by attraction at a distance. That's because we don't need to know. We know things fall. That knowledge has gotten us where we are very nicely without our having to come up with an explanation for the tides or why the heavenly bodies revolve around one another.

We no more understand gravity today than Newton did three and a half centuries ago, just as we don't and never will understand Relativity or Quantum phenomena. We can make use of our knowledge of how they operate, but our brains are just not configured to comprehend them in the sense we do motion-by-contact. Rats can solve intricate maze puzzles in order to get food, but if a rat had to understand the concept of prime numbers to negotiate a maze, it would go hungry.

We like to believe we humans transcend the physical world. But we are immersed in it as much as any other creature. We can react with indignation and denial, or we can accept and revel in our humanity's scope and limits. When we experience the pleasure of a tasty dessert or the joy of a new idea, one is not a "physical" delight and the other a disembodied one. They are both physical and both "spiritual," if you will.

Further Thoughts:

The clever people who are developing Artificial Intelligence use terms like "neural networks" and "reward" when they talk about how their machines operate. But they acknowledge that what they are putting together has nothing to do with the nerve cells in our brains or the emotion we feel when we are given something as a result of completing a task successfully. They just have no other vocabulary, i.e., no human experience, to describe cybernetic phenomena other than metaphors drawn from their own human physiology and personal history. What actually goes on inside computers is extremely complicated, too much so sometimes for their makers to understand, but it is essentially stupid, to a degree that no being, never mind the human species, could ever have been and survived.

How do I know so much about computers, I who have trouble keeping my laptop in functioning condition and understand next to nothing about either hardware or software? I don't, but that may also be why I am not impressed by the amazing things they can do and why I do not believe they can replace us or even replace the abilities of the simplest one-celled animal. I get reinforcement for such conclusions from the likes of Richard Feynman, whose lecture on YouTube I encourage you to watch if you lie awake nights worrying that computers will one day replace us. He explains with great clarity that computers are just filing clerks, extremely stupid filing clerks (again, to use a human metaphor for something that actually has nothing in common with any kind of living thing). In effect, they do only one thing, file and retrieve data, but they do it very, very fast.

Programmers devise code to make computers more and more useful without, I suspect, ever expanding the very simple operation they are capable of. By increasing the sophistication of the operations the machines perform, they are making those machines capable of more and more advanced tasks. Those tasks indeed mimic and in some cases far exceed what human minds can do in specific disciplines, but computers no more think or have a mind than a photograph does, however realistic the image may seem. The abacus, after all, performs a similar function, as does the apparently simple task of writing, though we rarely hear anyone refer to the abacus or a book as a form of artificial intelligence. Writing was deplored early on by those who believed it would lead to a deterioration of human memory. It probably has, and no doubt the abacus put out of work clever individuals who could add and subtract in their heads rapidly and correctly. But does anyone seriously claim human beings have been replaced by adding machines and print?


The Japanese have a demographic problem: too many old people and not enough young to take care of them. They will have a shortage of doctors, nurses, and other caregivers, including family members, in a nation that prides itself on caring for its elderly at home. One solution they are coming up with is the development of robots to act not just as physical aids but as companions to the old. They have already deployed some of them. The robots carry on conversations with their human charges, remember their favorite songs, and greet them by singing those songs to them. Some of those old people are growing fond of their robots. You don't have to be 80 or 90 to feel there is a living being in a machine when it responds to you in a human-like way.

But isn't this a dangerous track for us to be on? Is there not even something sinister, whatever the good intentions, in substituting a piece of software for a human heart and brain? It would be kinder to give those old folks cats. At least animals have real feelings. I find it sad to think of an old person whose mind might already be compromised being fooled into believing a piece of plastic and silicon can care about them.

