Measuring the IQ of Mind and Machine:
an Examination of Functionalism as Represented by Fodor and Searle
~ Shanaree Sailor ~
Fodor begins his article on the mind-body problem with a review of the current theories of dualism and materialism. According to dualism, the mind and body are two separate entities, the body physical and the mind nonphysical. If this is the case, though, then there can be no interaction between the two: the mind could not influence anything physical without violating the laws of physics. The materialist theory, on the other hand, states that the mind is not distinct from the physical. In fact, supporters of the materialist theory believe that behavior does not have mental causes. When the materialist theory is split into logical behaviorism and the central-state identity theory, the foundation of functionalism begins to form. Logical behaviorism states that every mental feeling has the same meaning as an if-then statement. For example, instead of saying "Dr. Lux is hungry," one would say "If there were a quart of macadamia brittle nut in the freezer, Dr. Lux would eat it." The central-state identity theory states that a certain mental state equals a certain neurophysiological state. The theory works in a way similar to Berkeley's representation of objects: both mental states and objects are a certain collection of perceptions that together identify the particular state or object.
Fodor develops the idea of functionalism by combining certain parts of logical behaviorism and the central-state identity theory. From logical behaviorism, Fodor incorporates the idea that mental processes can be represented by physical if-then statements. As such, behavior and mental causation are no longer distinct and unable to interact. Logical behaviorism also provides a way for mental causes to interact with other mental causes, which may in turn result in a behavioral effect. This last point is also a characteristic of the central-state identity theory. One doctrine of the central-state identity theory is called "token physicalism," which states that all mental states that currently exist are neurophysiological. Because it identifies only particular mental states with particular physical states, token physicalism places no restriction on the type of substance capable of having mental properties. When the points of logical behaviorism and the central-state identity theory, as described here, are combined, functionalism is the result. The theory of functionalism supposes that a mental state depends upon how a system is put together rather than upon the material that composes the system. Functionalism also states that the output of the system is related to both the input and the internal state of the system at a given time.
Based on this definition of functionalism, the mental processes of a human are not distinct from the systemic processes of a machine. Mental processes are defined as operations on symbols that yield certain results. Thus, if the same symbols yielded the same results in two separate systems, then the mental states can be seen as similar, or even identical. In this vein, consider a computer programmed with the same reasoning process as a mind. When the input "B" is entered, the output depends both upon "B" and upon the state of the system resulting from the computation of "A." If the computer were programmed with exactly the same reasoning process as a mind, then the result would be the same, and the mental state of the mind would be indistinguishable from the systemic state of the computer. The computer metaphor upholds the theory of functionalism because the output is the result of interaction between the input and the current state of the system. The metaphor also demonstrates the insignificance of the physical state of the system when determining whether two mental states are alike: the processes, rather than the composition, of the system determine the mental state.
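The functionalist point above can be sketched in a few lines of code. The example below is my own illustration, not anything from Fodor's article: two small systems with different internal "compositions" (a dictionary in one, a list in the other) realize the same functional organization, so every output, which depends on both the current input and the accumulated internal state, is identical across the two.

```python
# Two systems with different physical realizations but the same
# functional organization: output = f(input, internal state).

class DictAdder:
    """Keeps a running total in a dict slot."""
    def __init__(self):
        self.state = {"total": 0}

    def step(self, x):
        self.state["total"] += x        # output depends on input AND prior state
        return self.state["total"]

class ListAdder:
    """Keeps the same running total as a history of inputs."""
    def __init__(self):
        self.inputs = []

    def step(self, x):
        self.inputs.append(x)
        return sum(self.inputs)         # same input/output mapping, different internals

a, b = DictAdder(), ListAdder()
for x in [3, 5, 2]:
    assert a.step(x) == b.step(x)       # functionally indistinguishable
```

On a functionalist reading, these two systems are in "the same state" at every step, even though nothing about their internal material (here, their data structures) is shared.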
Searle disagrees with the view that the physical composition of the system does not influence the mental state of the system. To support this, he develops the Chinese room argument. Suppose a computer program is written that simulates an understanding of Chinese. Thus, when the computer is presented with a question in Chinese, it searches its memory and answers appropriately in Chinese. If the program is written well enough, the answers may be indistinguishable from a native speaker's answers. According to functionalism, then, the computer and the native speaker have the same mental state, since the results would be identical. Suppose, though, that a person is placed in a room without knowing any Chinese. They are given a list of what symbols they should collect and how to order them in response to the presentation of a different set of symbols. Around the room, there are baskets containing Chinese symbols. When this person is given a set of symbols, they would be able to consult their list, gather and order their symbols accordingly, and pass them back through the door. If the list were comprehensive enough, then their answers would also be on a par with a native speaker's answers. Again, according to functionalism, the native speaker and the person who does not understand a word of Chinese would have the same mental state. Having a mental state is not simply ordering symbols, though. It involves the interpretation of symbols and an understanding of the meanings attached to them. Since the operation of a computer is based solely on its ability to implement programs, the computer only works with symbols without actually understanding what the symbols mean. Thus, this argument helps to demonstrate that syntax alone is not sufficient for semantics.
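The person's rulebook can itself be sketched as code, which makes Searle's point vivid: the entire procedure is symbol matching, with no semantics anywhere in it. The rulebook entries below are invented for illustration; they are not from Searle's text.

```python
# A toy Chinese room: answers are produced by pure symbol lookup.
# The program matches the shape of the input symbols and returns the
# listed output symbols; no meaning is consulted at any point.

RULEBOOK = {
    "你好吗?": "我很好。",      # "How are you?" -> "I am fine."
    "你叫什么?": "我叫王。",    # "What is your name?" -> "My name is Wang."
}

def room(question: str) -> str:
    # Syntax only: look up the symbol string, hand back the paired string.
    return RULEBOOK.get(question, "对不起。")   # fallback: "Sorry."

print(room("你好吗?"))   # fluent-looking output, zero understanding
```

From the outside, a sufficiently large rulebook would be conversationally indistinguishable from a speaker; yet nothing in the program, and nothing in the person executing it, understands a word of the exchange.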
When we developed the theory of functionalism, we asked the question, "Could a machine think?" In the sense that a machine is just a physical system capable of performing operations, the answer is "Yes." This answer is not satisfactory, though, because we generally do not define thinking as simply being able to perform an operation. When we ask, "Is implementing a computer program with the right input and output enough to constitute thinking?" the answer is clearly "No." Thinking refers to an understanding rather than to something defined syntactically. No matter how fast or correct the program is, it still defines its operations by syntax rather than by consciousness and emotion. Thus, the computer is unable to duplicate a mind, although it may simulate one. The idea of simulation is key here. We are able to simulate everything on a computer, from the hypothetical path of a hurricane to the probable course of the stock market. We do not, however, believe the simulations to be real. In other words, we do not evacuate a town or invest our life's savings based on a simulation. It may be a good approximation of the course of events, but it is not the actual course of events. If this is the case, then, why do we even consider whether the simulation of mental processes produced by a computer program is a real mental process?
At the end of chapter two, Searle summarizes his criticism of functionalism in the following way. The mental processes of a mind are caused entirely by processes occurring inside the brain; no external cause determines what a mental process will be. Also, there is a distinction between the identification of symbols and the understanding of what the symbols mean. Computer programs are defined by symbol identification rather than by understanding, whereas minds define mental processes by the understanding of what a symbol means. The conclusion that follows is that computer programs by themselves are not minds and do not have minds. In addition, a mind cannot be the result of merely running a computer program. Therefore, minds and computer programs are not entities with the same mental state. They are quite different, and although both are capable of input and output interactions, only the mind is capable of truly thinking and understanding. This quality is what distinguishes the mental state of a mind from the systemic state of a digital computer.