No Final Theory
(revised 26 September 1999)

Free Will and the Computational Mind

The philosopher Aaron Sloman points out that the assumption underlying most discussions of free will is that "there is a well-defined distinction between systems whose choices are free and those which are not."  He argues, however, that "if you start examining possible designs for intelligent systems in great detail you find that there is no one such distinction. Instead there are many 'lesser' distinctions corresponding to design decisions that a robot engineer might or might not take--and in many cases it is likely that biological evolution tried ... alternatives." He goes on to give some twenty or so such distinctions to make his point. Here are a few examples.

Design distinction 1.
(a) an agent that can simultaneously store and compare different motives.
(b) an agent that has only one motive at a time.

Design distinction 2.
(a) agents all of whose motives are generated by a single top level goal (e.g. "win this game")
(b) agents with several independent sources of motivation, e.g. thirst, sex, curiosity, political ambition, aesthetic preferences, etc.

Design distinction 3.
(a) an agent whose development includes modification of its motive generators in the light of experience,
(b) an agent whose generators and comparators are fixed for life (presumably the case for many animals).

Design distinction 4.
(a) an agent whose generators change under the influence of genetically determined factors  (e.g. puberty),
(b) an agent for whom they can change only in the light of interactions with the environment and inferences drawn therefrom.

Design distinction 5.
(a) an agent whose motive generators and comparators are themselves accessible to explicit internal scrutiny, analysis and change,
(b) an agent for which all the changes in motive generators and comparators are merely uncontrolled side effects of other processes, such as addictions, habituations, etc.

Sloman suggests that the useful question to raise is "What kinds of designs are possible for agents and what are the implications of different designs as regards the determinants of their actions?"
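Several of Sloman's distinctions can be made concrete in code. The following is a minimal sketch of my own devising (the names `Motive`, `ReflectiveAgent`, and `revise_comparator` are illustrative assumptions, not Sloman's terminology): an agent that stores several motives at once (distinction 1a), draws them from independent sources (distinction 2b), and can inspect and replace its own comparator (distinction 5a).

```python
from dataclasses import dataclass

@dataclass
class Motive:
    source: str      # independent source, e.g. "thirst", "curiosity"
    urgency: float   # how pressing the motive currently is

class ReflectiveAgent:
    def __init__(self):
        self.motives = []                      # several motives held simultaneously (1a)
        self.comparator = lambda m: m.urgency  # default policy: act on the most urgent

    def adopt(self, motive):
        self.motives.append(motive)

    def choose(self):
        # compare stored motives rather than acting on one motive at a time
        return max(self.motives, key=self.comparator)

    def revise_comparator(self, new_comparator):
        # distinction 5a: the comparator itself is open to scrutiny and change
        self.comparator = new_comparator

agent = ReflectiveAgent()
agent.adopt(Motive("thirst", 0.9))
agent.adopt(Motive("curiosity", 0.4))
print(agent.choose().source)   # -> thirst

# After reflection, the agent decides to privilege curiosity:
agent.revise_comparator(lambda m: m.urgency + (1.0 if m.source == "curiosity" else 0.0))
print(agent.choose().source)   # -> curiosity
```

The point of the sketch is only that each design decision is an ordinary engineering choice: an agent built with a fixed, inaccessible comparator (distinction 5b) would simply lack the `revise_comparator` method, with no further metaphysical difference.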

The best formulation of Sloman's thesis can be found in Stan Franklin, Artificial Minds (MIT Press, 1995), pp. 35-39.
 

Discussion
It may be objected that Sloman succeeds in refuting the notion of free will only by implicitly appealing to it, or that some of his contrasting design solutions themselves imply free will. The appeal of a design where "motive generators and comparators are themselves accessible to explicit internal scrutiny, analysis and change" may lie partly in its lack of specification, which allows us to apply an intuitive psychology of free will. An agent "that can simultaneously store and compare different motives" comes very close to a folk description of free will, so calling this a "design solution" begs the question.

Conversely, if Sloman's designs are interpreted not to imply free will, they may appear less intuitively convincing. If we construct a computational model that specifies the algorithms handling the "internal scrutiny, analysis and change," the design might seem further from our intuitive notion of free will and thus a less adequate refutation of the utility of this notion.

Sloman's thinking helps to clarify that much of what we think of as free will is a set of complex design features. The "problem" of free will is then made somewhat more tractable by taking each of these design features and producing a computational model of it. However, the question is whether the problem has really been simplified into a set of component processes, and whether these components together add up to our intuitive notion of free will.

We could limit the use of the computational model to representing how specific behavioral decisions are reached, and might then legitimately conclude that in each case some causal computational process was responsible. To make free will obsolete, however, one would have to generalize from this to an overall model -- which would simply be the claim that all of cognition is computational. But "all of cognition" is an intractable problem that should not be addressed.

The notion of free will may be tied to sentience, which in turn is the locus of ethical value. If we found that we could produce an exhaustive description of some organism, one that would allow us to reproduce it at will, I suggest it would cease to have ethical value. A purely computational model thus runs into major ontological opposition, and the question would arise whether one ontology can be demonstrated to be ultimately valid at the expense of another.

The difficulty here is that what is at stake is really different modes of construal of reality rather than different theories as such. Modes of construal include thinking of reality in terms of objects, of living things, or of information processing. One view of evolutionary psychology would suggest that evolution has produced a range of inference systems with primitives that are not reducible to each other or to a common denominator; these underlie our modes of construal. If so, it would not be possible to decide between different modes of construal -- ontologies -- with respect to truth. Such modes of construal produce selective constructions patterned by their contingent history. They may be more or less useful (pragmatically true) in any given situation, but they are not true in an unqualified sense.

Francis Steen
26 September 1999
 
 

Science as Perception

Bohm and Hiley (1993:321) contains one of the clearest discussions of the epistemology of science I have come across, and I present the main sections here. The discussion occurs in the context of Bohm's "ontological" interpretation of the quantum theory:

"[I]t is evident that there is no way to prove that any particular aspect of our knowledge is absolutely correct. Indeed no matter how long it may have demonstrated its validity, it is always possible that later it may be found to have limits. For example, Newtonian physics, which held sway for several centuries, was found to be limited and gave way to quantum theory and relativity. Yet just before this revolution took place at the end of the nineteenth century, Lord Kelvin, one of the leading physicists at the time, advised young people not to go into physics because it was finished and only subject to refinement in the next decimal places. Nevertheless he did point to two small clouds on the horizon, the negative results of the Michelson-Morley experiment and the problems of black body radiation. It must be admitted that he chose his clouds correctly, though he totally underestimated their importance [the first gave rise to Einstein's Theory of Relativity, the second to the Quantum Theory]. Nowadays physicists are similarly talking about an ultimate unified theory including the four forces and perhaps strings, supersymmetry etc. that would explain everything. However, it can hardly be said that the clouds in this picture are only two and restricted to being small.

Our notion of the qualitative infinity of nature can usefully be brought out by considering the role of the atomic hypothesis of Democritus. Not only did this hypothesis show the importance of speculative new concepts going far beyond what could be demonstrated at the time of its proposal, but it also exhibited a further key feature that we would like to emphasize here. This is that deeper explanations often imply the limited validity of what were previously accepted as basic concepts, which later are then recovered only as approximations or limiting cases.

This relationship of concepts is indeed typical of the whole of our experience. For example, as we go round a circular table, what we see immediately is an ever changing elliptical shape. But we have learned to regard this changing shape as a mere appearance, while the essence, i.e. the true being, is considered to be a rigid circular object. This latter is known first in thought, but later this thought is projected into our immediate experience so that the table even appears to be circular.

But further investigation shows that this object is not solid and eventually discloses an atomic structure. The solid object now reverts in our thought to the category of an appearance, while the essence is the set of atoms out of which it is constituted. However, deeper studies then showed that even the atom is an appearance and that the essence is a nucleus surrounded by moving planetary electrons. And later still even these particles were seen to be appearances, while the essence was a set of yet more fundamental particles such as quarks, gluons, preons, or else sets of excitations of strings, etc. But, in all of this development of our knowledge, it seems that whatever we have thought of as matter is turning more and more into empty space with an ever more tenuous structure of moving elements. This tendency is carried further by quantum field theory which treats particles as quantised states of a field that extends over the whole of space.

What has been constant in this overall historical development is a pattern in which at each stage, certain features are regarded as appearance, while others are regarded as of an essence which explains the appearance on a qualitatively different basis. But what is taken as essence at any stage is seen to be appearance of a still more fundamental essence. Ultimately everything plays both the role of appearance and that of essence. If, as we are suggesting, this pattern never comes to an end, then ultimately all of our thought can be regarded as appearance, not to the senses, but to the mind.

What science is aiming for is that these appearances be correct, i.e. that the actions flowing from them, such as experiments, be coherent with what the appearances would imply. Incorrect appearances are thus either mistaken or illusory. The progress of science then implies the elimination of mistakes and illusions and the development of new sets of related appearances at various levels. Thus with the table to have the appearance of an elliptical object is correct only in a very limited sense. By bringing in the thought that the object is a rigid circle, we obtain a more nearly correct overall set of appearances. The thought of the atomic structure then makes the whole set still more nearly correct and so on. Thus what we are doing is constantly extending the appearances with the aid of thought. The ultimate reality is unlimited and unknown, but its successive appearances serve as an ever more accurate guide to coherent action in relation to this reality.

Another way of looking at the notion of appearances is to note that appearances basically are what arise in our perception of the world. As we have seen, the appearances in sense perception give rise to inferences about an essence that might be their origin, but this essence, which is seen in thought, turns out to be yet another appearance and therefore still part of our overall perception. No matter how far we go, we are therefore involved basically in perception. Our theories are not primarily forms of knowledge about the world but rather, they are forms of insight that arise in our attempts to obtain a perception of a deeper nature of reality as a whole. As such we do not expect their development ever to come to an end, any more than we would look forward to a final sense perception."

Bohm and Hiley (1993:321)
 

Literature

Albert, David Z. (1994). Bohm's Alternative to Quantum Mechanics. Scientific American, May 1994.  A relatively non-technical presentation, understandable to the layman.

Bohm, David, and Hiley, Basil J. (1993). The Undivided Universe: An Ontological Interpretation of Quantum Theory. Routledge: London and New York.

For a historical presentation of the mind-body problem, see Robert H. Wozniak's site Mind and Body: Rene Descartes to William James at Bryn Mawr College.
 

Quantum Mechanics and the Limits of Human Modes of Construal
(revised April 25, 1998)

Theories of matter are at the center of a theory of mind. Scientific theories of mind tend to be materialistic or dualistic, and the notion of matter appears to be derived from our primitive theories of objects (see Spelke's work on object perception). At the quantum level, however, matter behaves in ways that are difficult to reconcile with our intuitive notions of local causality, and it seems likely that what we call "matter" is something much more complex than the model laid at the basis of cognitive science generally allows. In fact, on the analogy of Lord Kelvin's view of physics at the end of the last century, we can now say that Cognitive Science has two dark clouds on the horizon: consciousness and ethics (Kelvin identified the ether and the black body problem, work on which gave rise to relativity and quantum theory respectively). An understanding of these two phenomena may require as radical a shift in our conceptual framework for both matter and mind as the theory of relativity and quantum mechanics did for physics.

One of the crucial issues in the debate between classical (Newtonian) physics and quantum mechanics is whether states can be exactly measured and predicted -- the so-called measurement problem. Specifically, are pure states (i.e., states undisturbed by avoidable noise) such that the outcome of every measurement can be exactly predicted? Classical physics is based on the proposition that the answer is yes. Orthodox quantum mechanics is based on the proposition that the answer is no, and that we can make precise quantitative statements only about probabilities, a limitation due to an essential interaction between the observer and that which is being measured. This has consequences for a theory of mind, since in this view the act of conscious observation has a direct effect on the measurement made. The view has been criticized by David Bohm for not providing a coherent ontology of either matter or mind.

Sheldon Goldstein at Rutgers University, in the first of a two-part review of the current state of the development of a quantum theory without observers (Physics Today 51: 3 (March 1998): 42-46), suggests that despite the claims of most of the originators of quantum theory, the appeal at a fundamental level to observers and measurement, which is so prominent in orthodox quantum theory, is not needed to account for quantum phenomena. Referring to the classical Bohr-Einstein debate, Goldstein says the debate has already been resolved in favor of Einstein. What Einstein desired and Bohr held impossible -- an observer-free formulation of quantum mechanics in which the process of measurement can be analyzed in terms of more fundamental concepts -- does in fact exist, and there are many such formulations, several of which have the potential to become a serious program for the construction of a quantum theory without observers.

In the second part of the review (Physics Today 51: 4 (April 1998): 38), Goldstein makes the following points:

1. Several current quantum theories without observers are completely well defined and hence provide a conclusive refutation of Bohr's claim that such a theory is impossible.

2. The paradoxes of quantum theory can be resolved in a surprisingly simple way: by insisting that particles always have positions and that they move in a manner naturally suggested by the Schroedinger equation (e.g., the quantum mechanics of David Bohm as amplified by John Bell).

3. The possibility of a deterministic reformulation of quantum theory has been regarded by many physicists as conclusively refuted, particularly by John von Neumann's 1932 proof, but the von Neumann proof is flawed, and subsequent "refutations" are not convincing.

4. Bohmian mechanics is by far the simplest and clearest version of quantum theory.

5. Although none of the quantum theories without observers is Lorentz invariant, Goldstein believes such a theory is possible, and that the three approaches of decoherent histories (which takes the wave function alone as a complete description of a physical system), spontaneous localization (which assumes spontaneous and random collapse of wave functions), and Bohmian mechanics (which assumes the wave function provides only an incomplete description of a system and governs the motion of more fundamental variables) have much to teach us about finding such a theory.

Queries: Sheldon Goldstein, Rutgers University New Brunswick, tel. (908) 932-8789.
 

Literature

Goldstein, Sheldon. "Infinite Potential: The Life and Times of David Bohm." Science 275, 5308 (March 28, 1997):1893-4 (book review).

Goldstein, Sheldon. The Undivided Universe: An Ontological Interpretation of Quantum Theory. Physics Today 47, 9 (Sept 1994):90 (book review).

Goldstein, Sheldon. David Joseph Bohm. (Obituary) Physics Today 47, 8 (August, 1994):72 (2 pages).

Goldstein, Sheldon. The Quantum Theory of Motion: An Account of the de Broglie-Bohm Causal Interpretation of Quantum Mechanics. Science 263, 5144 (Jan 14, 1994):254-5 (book review).

Smolin, Lee. A Theory of the Whole Universe. In John Brockman, The Third Culture. Simon & Schuster, 1995.
 

Einstein on Science and the Mind:

"Science is not just a collection of laws, a catalogue of unrelated facts. It is a creation of the human mind, with its freely invented ideas and concepts. Physical theories try to form a picture of reality and to establish its connection with the wide world of sense impressions. Thus the only justification for our mental structures is whether and in what way our theories form such a link... The psychological subjective feeling of time enables us to order our impressions, to state that one event precedes another. But to connect every instant of time with a number, by the use of a clock, to regard time as a one-dimensional continuum, is already an invention. So also are the concepts of Euclidean and non-Euclidean geometry, and our space understood as a three-dimensional continuum. Physics really began with the invention of mass, force, and an inertial system. These concepts are all free inventions. They led to the formulation of the mechanical point of view. For the physicist of the early 19th century, the reality of our outer world consisted of particles with simple forces acting between them and depending only on the distance. He tried to retain as long as possible his belief that he would succeed in explaining all events in nature by these fundamental concepts of reality. The difficulties connected with the deflection of the magnetic needle, the difficulties connected with the structure of the ether, induced us to create a more subtle reality. The important invention of the electromagnetic field appears. A courageous scientific imagination was needed to realize fully that not the behavior of bodies, but the behavior of something between them, that is, the field, may be essential for ordering and understanding events."
-----------
A. Einstein and L. Infeld. The Evolution of Physics (Simon & Schuster, New York, 1938)


Page maintained by Francis F. Steen, Department of English, UC Santa Barbara