26 Functionalism
Introduction: Two Monsters We Must Avoid
While passing through the Strait of Messina, between mainland Italy and the isle of Sicily, Homer has Odysseus come upon two monsters, Scylla and Charybdis, one on either side of the strait. If Odysseus is to pass through the strait, he must choose between two very unhappy options; for if he avoids one along the way, he will move within the other’s monstrous reach. On one side is roaring Charybdis, who would surely swallow Odysseus’s entire ship in her colossal whirlpool. (Have you ever been faced with an option so bad that you cannot believe you have to seriously consider it? Well, this is Odysseus’s bleak situation.) On the other side of the strait, things fare little better for Odysseus and his war-weary crew: there is vicious Scylla, who, only by comparison to Charybdis, looks like the right choice. The ship makes it through, Homer tells us, minus those who were snatched from the ship’s deck and eaten alive. Six are taken, we are told, one for each of Scylla’s heads. By comparison only, indeed.
In this chapter we consider the theory of mind known as functionalism, the view that minds are really functional systems like the computing systems we rely on every day, only much more complex. The functionalist claims to sail a middle path between materialism (discussed in Chapter 2), or the joint thesis that minds are brains and mental states are brain states, and behaviourism (also discussed in Chapter 2), or the thesis that mental states are behavioural states or dispositions to behave in certain ways.
Avoiding Materialism
On the one side we have materialism, which we must avoid because there appears to be no strict identity between mental states and brain states. Even though human Freya differs from a wild rabbit in many interesting ways, we think they can both be in physical pain. Suppose that while restringing her guitar, Freya lodges a rogue metal splinter from the D string in the top of her ring finger. She winces in pain. Physiologically and neurologically, a lot happened, from the tissue damage caused by the metal splinter to Freya’s finally wincing from the sensation. But it only took milliseconds.
Now suppose that while out foraging and hopping about, the wild rabbit mishops onto the prickly side of a pinecone. The rabbit cries out a bit, winces hard, and hops off fast. A very similar physiological and neurological chain of events no doubt transpired, from the mishop on the pinecone to the hopping off fast in pain. But as interestingly similar as the wild rabbit’s brain is to human Freya’s, it is not plausible to think that Freya and the wild rabbit entered into the same brain state. We do want to say they entered into the same mental state, however. That is, they were both in pain. Since the same pain state can be realized in multiple kinds of brains, we can say that mental states like pain are multiply realizable. This is bad news for the materialist; it looks like brain states and mental states come apart.
Avoiding Behaviourism
Now we look bleary-eyed in the direction of behaviourism. But here, too, we find a suspicious identity claim, this time between mental states, like Freya’s belief that her house is gray, and behavioural states or dispositions to behave in certain ways in certain circumstances. For example, if Freya were asked what color her house is, she would be disposed to answer, “Gray.” But just as with mental states and brain states, Freya’s belief that her Colonial-period house is painted the original gray it received when it was first built in 1810, and her dispositions to behave accordingly, come apart, showing that they could not be identical.
Suppose Freya wants to throw a housewarming party for herself and includes a colorful direction in the invitation that hers is the “only big gray Colonial on Jones St. Can’t miss it.” We say that Freya would not sincerely include such a thing if she did not believe it to be true. And we have no reason to suspect she is lying. We can go further. We want to say that it is her belief that her Colonial is big, is gray, and the only one like it on Jones Street that causes her, at least in part, to include that direction in the invitation. But if it is her mental state (her belief) that caused her behaviour, then the mental state and the behavioural state (her including the colorful direction in the invitation) cannot be strictly identical.
Freya might very well have been disposed to give just such a colorful direction to her home, given her beliefs, as the behaviourist would predict; and this disposition might even come with believing the things Freya does. But if we want to refer to Freya’s beliefs in our explanation of her behaviour—and this is the sort of thing we do when we say our beliefs and other mental states cause our behaviour—then we must hold that they are distinct, since otherwise our causal explanation would be viciously circular.
It would be circular because the thing to be explained, her Colonial-describing behaviour, is the same thing as the thing that is supposed to causally explain it, her Colonial-descriptive beliefs; and the circle would be vicious because nothing would ever really get explained. So the behaviourist, like the materialist, seems to see an identity where there is none.
No Turning Back: The Mind is Natural
The goal is to formulate an alternative to the above two theories of mind, one that nevertheless keeps a promise both of them make and that is worth keeping: to treat the mind as something wholly a part of the natural world. From the failures of materialism and behaviourism, we must not turn back to a problematic Cartesian dualist view of mind and matter (discussed in Chapter 1), on which it would again become utterly mysterious how Freya’s beliefs about how her Colonial looks could possibly influence her physical behaviour, since her beliefs and her physical behaviour would exist on different planes of existence, as it were. But there is a third way to view beliefs like Freya’s.
Functionalism as the Middle Path
Our way between the two monsters is to take seriously the perhaps dangerous idea that minds really are computing machines. In England, Alan Turing (1912-1954) laid the groundwork for such an idea with his monumental work on the nature of computing machines and intelligence (1936, 230-265; 1950, 433-460). Turing was able to conceive of a computing machine so powerful that it could successfully perform any computable function a human being could be said to carry out, whether consciously, as in the math classroom, or at the subconscious level, as in the many computations involved in navigating from one side of one’s room to the other.
A Turing machine, as it came to be called, is an abstract computer model designed with the purpose of illustrating the limits of computability. Thinking creatures like human beings, of course, are not abstract things. Turing machines are not themselves thinking machines, but insofar as thinking states can be coherently understood as computational states, a Turing machine or Turing machine-inspired model should provide an illuminating account of the mind.
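To make the idea a little more concrete, here is a minimal sketch, in Python, of the kind of rule-following device Turing described: a finite table of instructions operating on a tape of symbols. The particular state names, symbols, and the unary-increment task are our own illustrative choices, not anything drawn from Turing’s paper.

```python
# A minimal Turing machine sketch (illustrative only): a finite table of
# rules plus a tape of symbols. The example machine appends a '1' to a
# block of '1's (unary increment); the rule format and state names are
# our own assumptions for the sake of the example.

from collections import defaultdict

def run_turing_machine(rules, tape, state="start", head=0, halt_state="halt"):
    """Run rules given as a dict: (state, symbol) -> (new_symbol, move, new_state)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # blank cells read as "_"
    while state != halt_state:
        symbol = cells[head]
        new_symbol, move, state = rules[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Rules for unary increment: scan right past the 1s, write one more, halt.
rules = {
    ("start", "1"): ("1", "R", "start"),   # keep moving right over 1s
    ("start", "_"): ("1", "R", "halt"),    # first blank: write a 1 and halt
}

print(run_turing_machine(rules, "111"))    # -> "1111"
```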
Turing’s ideas were developed in the United States by philosopher Hilary Putnam (1926-2016). Functionalism treats minds as natural phenomena contra Cartesian dualism; mental states, like pain, as multiply realizable, contra materialism; and mental states as causes of behaviour, contra behaviourism. In its simple form, it is the joint thesis that the mind is a functional system, kind of like an operating system of a computer, and mental states like beliefs, desires, and perceptual experiences are really just functional states, kind of like inputs and outputs in that operating system. Indeed, often this simple version of functionalism is known as “machine” or “input-output functionalism” to highlight just those mechanical features of the theory.
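A toy example may help. In the spirit of machine functionalism, we can write down a small “machine table” in which a state such as being in pain is characterized entirely by what the system does, and which state it moves into next, given an input. The particular states, inputs, and outputs below are invented purely for illustration; no functionalist is committed to this specific table.

```python
# A toy "machine table" in the spirit of input-output functionalism:
# each mental-state label is defined entirely by the outputs it produces
# and the state transitions it makes, given an input. The entries below
# are illustrative assumptions, not anyone's considered analysis of pain.

MACHINE_TABLE = {
    # (current state, input)        : (output,    next state)
    ("calm",    "tissue damage")    : ("wince",   "in pain"),
    ("in pain", "tissue damage")    : ("cry out", "in pain"),
    ("in pain", "damage subsides")  : ("relax",   "calm"),
    ("calm",    "damage subsides")  : ("carry on","calm"),
}

def step(state, stimulus):
    output, next_state = MACHINE_TABLE[(state, stimulus)]
    return output, next_state

# Whatever realizes this table -- neurons, silicon, or something else --
# counts as "in pain" when it occupies the state so labelled.
state = "calm"
for stimulus in ["tissue damage", "tissue damage", "damage subsides"]:
    output, state = step(state, stimulus)
    print(stimulus, "->", output, "| now:", state)
```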
Nothing’s Shocking: The Functionalist Mind is a Natural Mind
The functionalist says if we conceive of mental stuff in this way, namely, as fundamentally inputs and outputs in a complex but wholly natural system, then we get to preserve the reality of the mind, and the reality of our mental lives. We get to avoid any genuine worries about mental stuff being too spooky, or about how it could possibly interact with material stuff, as one might genuinely worry on a Cartesian dualist theory of mind, where we are asked to construe mental stuff and material stuff as fundamentally two kinds of substances. With functionalism, the how-possible question about interaction between the mental and the material simply does not arise, any more than it does for the interaction between software and hardware in a computer. So, on the functionalist picture of the mind, the mysterious fog is lifted, and the way is clear.
Multiple Realizability
Let us use a thought experiment of our own to illustrate the functionalist’s theory of mind. Imagine Freya cooks a warm Sunday breakfast for herself and sits at a patio table in the spring sun to enjoy it. Freya’s belief that “my tofu scramble is on the table before me” is to be understood roughly like this: as the OUTPUT of one mental state, her seeing her breakfast on the table before her, and as the INPUT for others, including other beliefs Freya might have or come to have by deductive inference (“something is on the table before me,” and so on and so forth) and behaviours (e.g., sticking a fork into that tofu scramble and scarfing it down). Note well: we have not mentioned anything here about the work Freya’s sensory cortex or thalamus is doing, or the role the rods and cones in her retina are playing, in getting her to believe what she does; her belief is identified only by its functional or causal role. This seems to imply that Freya’s breakfast belief is multiply realizable, like pain is.
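Put schematically, and only as a rough sketch of this causal-role picture rather than an analysis anyone has offered, Freya’s breakfast belief can be pictured as a node whose identity is fixed by what produces it and what it goes on to produce. The function names and strings below are our own illustrative placeholders.

```python
# A rough sketch of the causal-role picture of Freya's breakfast belief.
# Nothing here mentions neurons, thalami, or retinas; the belief figures
# only as the output of perception and the input to inference and action.

def perceive(scene):
    # INPUT side: seeing the breakfast yields the belief
    if "tofu scramble on table" in scene:
        return "I believe my tofu scramble is on the table before me"

def infer(belief):
    # OUTPUT side (1): further beliefs arrived at by deductive inference
    return belief.replace("my tofu scramble", "something")

def act(belief):
    # OUTPUT side (2): behaviour the belief helps cause
    return "stick a fork in and scarf it down" if belief else "keep looking"

belief = perceive({"tofu scramble on table", "spring sun"})
print(infer(belief))   # "I believe something is on the table before me"
print(act(belief))     # "stick a fork in and scarf it down"
```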
Recall our earlier discussion of the important difference between rabbit-brain stuff and human-brain stuff. Nevertheless, we wanted to say that both Freya and the wild rabbit could be in pain. Pain, we said, is multiply realizable. This is another way of saying that being in pain does not require any one specific means of realization, just some adequate means or other. The point also strongly implies that the means of realization for Freya’s breakfast belief, no less than her pain, need not be a brain state at all. This signals a major worry for the materialist. Since our beliefs, desires, and perceptual experiences are identified by their functional or causal roles, the functionalist has no problem accounting for the multiple realizability of mental states.
Real Cause: The Functionalist Mind Causes Behaviour
Finally, we saw that our mental states cannot be counted as the causes of our behaviour on a behaviourist view, since on that view of mind, mental states are nothing over and above our behaviour (or, dispositions to behave in certain ways in certain circumstances). In an effort to disenchant the mind in general and individual minds in particular, and move mental states like beliefs and pain into scientific view, the behaviourist recoiled too far from spooky Cartesian dualism, leaving nothing in us to be the causes of our own behaviour. The functionalist understands, like the behaviourist, that there is a close connection between our beliefs, desires, and pains, on the one hand, and our behaviour, on the other. It is just that the connection is a functional, or causal, one, not one of identity. Since mental states (like Freya’s belief that “my tofu scramble is on the table before me”) are identified with their functional or causal role in the larger functional system of inputs and outputs, other mental states and behavioural states, the functionalist has no problem accounting for mental states playing a causal role in the explanations we give of our own behaviour. On the functionalist theory of mind, mental states are real causes of behaviour.
Objections to Functionalism
Now that we have seen some of the major points in favor of the theory, let us have a look at some of the worries that have been raised against functionalism.
The Chinese Room
John Searle argues against a version of functionalism he calls “strong” artificial intelligence, or “strong AI.” In “Minds, Brains, and Programs” (1980), Searle develops a thought experiment designed to show that having the right inputs and outputs is not sufficient for having mental states, as the functionalist claims. The specific issue concerns what is required to understand Chinese.
Imagine that you, someone who does not understand Chinese, are put in a room and tasked with sending out Chinese symbols in response to other Chinese symbols, according to purely formal rules given in an English-language manual. So, for example, a person outside can write some Chinese symbols on a card and place it in a basket on a conveyor belt that leads into and out of the little room you are in. Once you receive it, you look at the shape of each symbol, find it in the manual, and read which Chinese symbols to find in the other basket to send back out. Imagine further that you get very good at this manipulation of symbols, so good in fact that you can fool fluent Chinese speakers with the responses you give. To them, you function every bit like someone who understands Chinese. It appears, however, that you have no true understanding at all. Therefore, Searle concludes, functioning in the right way is not sufficient for having mental states.
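The purely formal, shape-matching character of the rule book can be made vivid with a small sketch. The particular symbols and canned replies below are placeholders of our own choosing; the point is only that nothing in the lookup grasps what any symbol means, yet the replies can look fluent from outside.

```python
# A bare-bones sketch of the Chinese Room's rule book: purely formal
# symbol-to-symbol lookups, with no grasp of meaning anywhere in the code.
# The entries are invented placeholders, not anything from Searle's paper.

RULE_BOOK = {
    "你好吗": "我很好，谢谢",   # if you receive this shape, send back that shape
    "你是谁": "我是一个朋友",
}

def room_reply(incoming_symbols):
    # The occupant just matches shapes against the manual; no step in this
    # lookup understands what any of the symbols mean.
    return RULE_BOOK.get(incoming_symbols, "请再说一遍")

print(room_reply("你好吗"))   # fluent-looking output, zero understanding
```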
The functionalist has replied that, of course, as the thought experiment is described, the person in the room does not understand Chinese. But also as the case is described, the person in the room is just a piece of the whole functional system. Indeed, it is the system that functions to understand Chinese, not just one part. So it is the whole system, in this case, the whole room, including the person manipulating the symbols and the instruction manual (the “program”), that understands Chinese.
The Problem of Qualia
The splinter Freya picked up from her D string caused her a bit of pain, and, as we saw earlier, it caused the behaviourist a bit of a headache too. One major worry for the functionalist is that there seems to be more to Freya’s pain than its just being the putative cause of some pain-related behaviour, where this cause is understood to be a further mental state, presumably not itself identified with the behaviour at all. (Remember, the functionalist wishes to avoid the vicious circularity that plagued the behaviourist’s explanations of behaviour.)
There is an undeniable sensation to pain: it is something you feel. In fact, some might argue that at the conscious level, that is all there is to pain. Sure, there is the detection of tissue damage and the host of physiological and neurological events transpiring, and yes, there is the pain-related behaviour, too. However, we must not leave out of our explanation of pain the feel of pain. Philosophers call these felt qualities of experience qualia, and they call mental states like pain, which have them, fundamentally qualitative states. Other qualitative mental states might include experiences of colored objects, such as those a person with normal color vision has every day.
In seeing a Granny Smith apple in the basket on a dining room table, Freya has a visual experience as of a green object. But the functionalist can only talk about the experience in terms of the function or causal role it plays. So, for example, the functionalist can speak to Freya’s green experience as being the cause of her belief that she sees a green apple in the basket. But the functionalist cannot speak to the feeling Freya (or any of us) has in seeing a ripe green Granny Smith. We think there is a corresponding feeling to color experiences like Freya’s over and above whatever beliefs they might go on to cause us to have. Since mental states like pain and color experiences are identified solely by their functional roles, the functionalist seems to be without the resources to account for these qualitative mental states.
The functionalist might reply by offering a treatment of qualia in terms of what such aspects of experience function to do for us. The vivid, ripe greenness of the Granny Smith functions to inform Freya about a source of food in a way that pulls her visual attention to it. Freya’s color experiences allow her to form accurate beliefs about the objects in her immediate environment. It is certainly true that ordinary visual experiences provide us with beautiful moments in our lives. However, they likely function to do much more besides. Likewise, it is more likely that there is a function for the qualitative or feeling aspects of some mental states, and that these aspects can be understood in terms of their functions, than it is that these aspects float free above the causal order of things. So, the functionalist who wishes to try to account for qualia need not remain silent on the issue.
Conclusion
We have not considered all the possible objections to functionalism, nor have we considered more sophisticated versions of functionalism that aim to get around the more pernicious objections we have considered. The idea that minds really are kinds of computing machines is still very much alive and as controversial as ever. Taking that idea seriously means having to wrestle with a host of questions at the intersection of philosophy of mind, philosophy of action, and personal identity.
In what sense is Freya truly an agent of her own actions, if we merely cite a cold input to explain some behaviour of hers? That is to say, how does Freya avow her own beliefs on a merely functionalist view? If minds are kinds of computers, then what does that make thinking creatures like Freya? Kinds of robots, albeit sophisticated ones? These and other difficult questions will need to be answered satisfactorily before many philosophers will be content with a functionalist theory of mind. For other philosophers, a start down the right path, away from Cartesian dualism and between the two terrors of materialism and behaviourism, has already been made.
References
Putnam, Hilary. (1960) 1975. “Minds and Machines.” Reprinted in Mind, Language, and Reality, 362-385. Cambridge: Cambridge University Press.
Searle, John. 1980. “Minds, Brains, and Programs.” Behavioral and Brain Sciences 3(3): 417-457.
Turing, Alan M. 1936. “On Computable Numbers, with an Application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society 42 (1): 230-265.
Turing, Alan M. 1950. “Computing Machinery and Intelligence.” Mind 59 (236): 433-460.
Further Reading
Block, Ned, ed. 1980a. Readings in the Philosophy of Psychology, Volumes 1 and 2. Cambridge, MA: Harvard University Press.
Block, Ned. 1980b. “Troubles With Functionalism.” In Block 1980a, 268-305.
Gendler, Tamar. 2008. “Belief and Alief.” Journal of Philosophy 105(10): 634-663.
Jackson, Frank. 1982. “Epiphenomenal Qualia.” Philosophical Quarterly 32: 127-136.
Lewis, David. 1972. “Psychophysical and Theoretical Identifications.” In Block 1980a, 207-215.
Lewis, David. 1980. “Mad Pain and Martian Pain.” In Block 1980a, 216-222.
Nagel, Thomas. 1974. “What Is It Like To Be a Bat?” Philosophical Review 83: 435-450.
Putnam, Hilary. 1963. “Brains and Behavior.” Reprinted in Putnam 1975b, 325-341.
Putnam, Hilary. 1967. “The Nature of Mental States.” Reprinted in Putnam 1975b, 429-440.
Putnam, Hilary. 1973. “Philosophy and our Mental Life.” Reprinted in Putnam 1975b, 291-303.
Putnam, Hilary. 1975a. “The Meaning of ‘Meaning.’” Reprinted in Putnam 1975b, 215-271.
Putnam, Hilary. 1975b. Mind, Language, and Reality. Cambridge: Cambridge University Press.
Shoemaker, Sydney. 1984. Identity, Cause, and Mind. Cambridge: Cambridge University Press.
Shoemaker, Sydney. 1996. The First-Person Perspective and Other Essays. Cambridge: Cambridge University Press.