Our Bodily Bias
How much does a mind’s embodiment have to do with its recognition of other minds? The question might seem to come out of science fiction, which is full of examples of intelligence with a physical incarnation different from our own. Most of the time, the denizens of science fiction universes seem to have little trouble recognizing other intelligent entities as intelligent, regardless of their appearance. The bodies range from the nearly identical, as with the humanoid replicants in Blade Runner, to the gargantuan and fantastic, as with the vessel that houses HAL in 2001. In most cases, the illusion is believable. Give an object a voice, preferably one that seems to correlate with some movement, and the audience is easily led to think some mental process is behind it.
Not so when the audience leaves the theater and, with it, their suspension of disbelief. We deal with robots all the time. I’m interacting with one while I type this, when I use an ATM, or when I call the automated voice that tells me what numbers to press to pay my phone bill. At no point in my regular interaction with these devices am I convinced that what I’m dealing with has intelligence. No artificial entity in the world today can convince us that there is consciousness behind the voice and movement.
I’ve written about this before—it was the topic that introduced my philosophy of mind class a year and a half ago. My interest in embodied cognition came about because I wanted to investigate the question. Then, as now, I predicted that our recognition of any differently embodied mind (artificial or otherwise) would be clouded by bias. We want to look into eyes like our own and perceive, somehow, a spark that appeals to our intuition.
The paper I wrote focused on how an approach informed by theories of embodied cognition might help us understand the way our bodily bias affects our recognition of minds embodied differently from our own. It made heavy use of Nagel’s famous “What Is It Like to Be a Bat?” and Andy Clark’s work, as well as some sources from professors at UCSD’s embodied cognition lab. It was a solid final paper for my first philosophy of mind class.
This semester, I’ve been auditing a graduate class on embodied cognition at the University of Edinburgh. Today was the second-to-last seminar of the term, and I couldn’t stop wondering about my original question afterwards. Now, near the end of the term, how much closer am I to answering it?
I think that I now have a better understanding of the question itself. The problem of other minds has many facets, but my question concerns one in particular. We intuitively recognize cognition, to different degrees, in many places external to ourselves. Sometimes, this intuition is overanalyzed. This is the mistake Wittgenstein says wayward philosophy makes when it declares that animals do not talk because they do not think, instead of considering that “they simply do not talk” (Philosophical Investigations §25). It is easy to ascribe a measure of cognition to a dog, and the reality of other minds is presupposed by many of our interactions—at least those interactions with things that have, in an intuitively obvious fashion, cognition.
That, of course, is the rub. Some dogs have big, watery eyes. They make sounds and assume postures similar to our own when we feel a certain way. It is fairly easy to simulate, in one’s own mind, what the dog might be feeling. What goes into this simulation? We can’t have the same phenomenal experience as a dog, after all. How much is our biology directly responsible for that instant, non-theory-laden simulation (if that is indeed what happens) of what it is like to be a dog? To what degree does our environment contribute? These sorts of questions are central to embodied cognition theorists, many of whom have their sights set on the higher question of what, exactly, constitutes cognition.
The course has thoroughly covered Jesse Prinz’s theory of emotions, body image, body schema, and, lately, the phenomenology of agency—all topics that can be informed by embodied cognition theory. The problem of other minds has appeared periodically in all of my classes this semester, and I get the feeling that philosophers are moving past the apparent truth of “they simply do not talk” to what happens when we attribute agency, or consciousness, to ourselves and others. Hopefully, once we understand how we recognize consciousness in beings embodied similarly to ourselves, we will also be on the way to an understanding of consciousness that allows us to see through our bodily bias.