Cariani P. (2012) Infinity and the Observer: Radical Constructivism and the Foundations of Mathematics. Constructivist Foundations 7(2): 116–125. https://cepa.info/254

Problem: There is currently a great deal of mysticism, uncritical hype, and blind adulation of imaginary mathematical and physical entities in popular culture. We seek to explore what a radical constructivist perspective on mathematical entities might entail, and to draw out its implications for how we think about the nature of such entities. Method: Conceptual analysis. Results: If we want to avoid introducing entities that are ill-defined and inaccessible to verification, then formal systems need to avoid the introduction of potential and actual infinities. If decidability and consistency are desired, formal systems must be kept finite. Infinity is a useful heuristic concept, but it has no place in proof theory. Implications: We attempt to debunk many of the mysticisms and uncritical adulations of Gödelian arguments and to ground mathematical foundations in the intersubjectively verifiable operations of limited observers. We hope that these insights will be useful to anyone trying to make sense of claims about the nature of formal systems. If we return to the notion of formal systems as concrete, finite systems, then we can be clear about the nature of the computations that can be physically realized. In practical terms, the answer is not to proscribe notions of the infinite, but to recognize that these concepts have a different status with respect to their verifiability. We need to demarcate clearly the realm of free creation and imagination, where platonic entities are useful heuristic devices, from the realm of verification, testing, and proof, where infinities introduce ill-defined entities that create ambiguities and undecidable, ill-posed sets of propositions.
Constructivist content: The paper attempts to extend the scope of the radical constructivist perspective to mathematical systems, and to discuss the relationships between radical constructivism and other allied yet distinct perspectives in the debate over the foundations of mathematics, such as psychological constructivism and mathematical constructivism.

Abstract: Under the aspect of constructivism, evolution generates the varying boundary conditions to which evolution itself is then subject. This applies to organic as well as cognitive evolution. The currently valid conditions for cognitive evolution we describe as laws of nature brought about by an independent reality. Within the constructivist evolutionary epistemology (CEE), however, the regularities we perceive, and which we condense into the laws of nature, are seen as the invariants of phylogenetically formed cognitive operators. The extension of the inborn operators by means of experimental operators (i.e., by measurement facilities) will lead to the consolidation of the classical world picture if both are commutable. Otherwise there will be invariants which cannot be described in classical terms and which, therefore, will require non-classical theories. Likewise, mathematical and logical structures can be seen as invariants of cognitive operators. It is shown that the propositions of Gödel deal with what can be considered the analogue of non-classical phenomena in physics. To renounce reality as an element of physical metatheory requires some rearrangement of those notions which explicitly refer to reality, such as acting and perceiving, learning and adapting, and, partially, language. It turns out that the distinction between acting and perceiving is not as unambiguous as it is in the “theory of reality.” Similarly, we can see learning as a process of adaptation to a given environment as well as an independent development into something for which an appropriate environment or application still has to be found. It will be shown that both “adaptive” and “initiative” evolution occur in organic as well as in cultural evolution. Within CEE, language is seen as a “generative” theory rather than as a tool to portray independently existing facts. Its competence is based on the fact that it is generated by mechanisms closely related to those generating our physical perceptions. A similar genetically grounded relationship between mental operators enables mathematics to compress empirical data into a generating theory and then, on the basis of this theory, to extrapolate them (the problem of induction). The linguistic equivalent of algorithmic data compression and subsequent extrapolation is the recognition of a text’s meaning and the subsequent drawing of conclusions from it – or, as it is proposed here to say, semantic extrapolation. Accordingly, communication can be defined. Some parallels are discussed between verbal, cultural and genetic communication.

The constructivist evolutionary epistemology (CEE) has taken up the demand of modern physics that theoretical terms have to be operationalizable (i.e., the description of nature should comprise only quantities, variables or notions which are defined by means of measurement facilities or other physical processes) and extended it by the idea that operationalisation is something general which must also be the constituting basis for observational terms. This is realised by considering the regularities we perceive, and which we condense into the laws of nature, as the invariants of phylogenetically formed mental cognitive operators. Experimental operators (i.e., measurement facilities) can be seen as extensions of these inborn operators. This will lead to the consolidation of the classical world picture if the mental and the experimental operators involved are commutable. Otherwise there will be invariants which cannot be described in classical terms and which, therefore, will require non-classical approaches such as the uncertainty principle in quantum mechanics enunciated by Heisenberg. As the development of experimental facilities will never be completed and, therefore, will continue to bring about novel invariants, the evolution of science cannot converge towards what many physicists envisage as the “theory of everything” describing definitively the structure of reality (Feynman 1965; Hawking 1979). So both organic and scientific evolution are entirely open and non-deterministic. When we also see mathematical objects and structures as invariants of mental operators, we must expect similar phenomena. Indeed: just as experimental operators, though constructed entirely according to the rules of classical physics, may lead to results which cannot be described in classical terms, there are also mathematical calculi which, though based entirely on well-tested axioms, can lead to statements which cannot be proven within the context of these axioms, as shown by Gödel.
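The "commutability" invoked here can be read in the standard operator-algebra sense; a minimal sketch of the intended analogy (the quantum-mechanical case is textbook material, while its application to cognitive operators is Diettrich's conjecture):

```latex
% Two operators A and B are commutable iff their commutator vanishes:
[A, B] = AB - BA = 0 .
% In that case they admit a common set of invariants (eigenstates),
% and one "classical" description covers both.
% The canonical non-commuting counterexample (Heisenberg):
[\hat{x}, \hat{p}] = i\hbar \neq 0 ,
% so position and momentum share no common invariants, forcing a
% non-classical theory -- the analogue CEE draws for the case where
% experimental operators fail to commute with the inborn cognitive ones.
```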

Diettrich O. (1994) Is there a theory of everything? Bulletin of the Institute of Mathematics and its Applications 80: 166–170. https://cepa.info/5339

It is widely understood in physics that evaluation criteria for empirical theories are determined by what are called the objective structures of an outside and real world, and on this basis discussions ensue as to whether our scientific efforts to condense observations into theories will eventually result in a “theory of everything” (Feynman 1965; Hawking 1979; Barrow 1990; Chalmers 1982) reflecting precisely these structures. “Unless one accepts that the regularities (we perceive) are in some sense objectively real, one might as well stop doing science” (Davies 1990a). That is, reality is seen as a prerequisite for a non-arbitrary and reasonable development of theories. Without reality “anything goes” – which is downright unacceptable in physics. On the other hand, if regularities are objective in the sense that they depend on the structures of an objective outside world, it remains unclear why mathematics, which obviously does not include any information on these structures, is nevertheless so helpful in describing them, in such a way that purely mathematical extrapolations will lead to correct predictions. This is the old question about “the unreasonable effectiveness of mathematics in the natural sciences” (Wigner 1960), or, as Davies (1990b) put it, why “the universe is algorithmically compressible” (i.e., why the obviously complex structure of our world can be described in so many cases by means of relatively simple mathematical formulae). This question is closely linked to the question of why induction and, therefore, science at all, succeeds. It is difficult to avoid asking whether mathematics, as the outcome of human thinking, has its own specificity which, for whatever reason, fits the specificity of what man would see or experience. As long as this question is not comprehensively answered, science may explain much – but not its own success. But how can such entirely disparate categories as perceiving and thinking be linked with each other?
This question will be discussed here in the context of a new constructivist version of evolutionary approaches to epistemology (Diettrich 1991, 1993), which will lead to a revised notion of reality, as well as to some rather unexpected links between the phenomena of non-classical physics and the mathematical findings of Gödel.

Diettrich O. (1995) A constructivist approach to the problem of induction. Evolution and Cognition 1(2): 11–30. https://cepa.info/4261

The unsolved problem of induction is closely linked to “the unreasonable effectiveness of mathematics in the natural sciences” (Wigner 1960) and to the question “why the universe is algorithmically compressible” (Davies 1990). The problem of induction is approached here by means of a constructivist version of evolutionary epistemology (CEE), considering both the perceived regularities we condense into the laws of nature and the mathematical structures we condense into axioms as invariants of inborn cognitive and mental operators. A phylogenetic relationship between the mental operators generating the perceived and the mathematical regularities, respectively, may explain the high suitability of mathematical tools for extrapolating observed data. The extension of perceptional operators by means of experimental operators (i.e., by means of measurement devices) would lead to the completion of the classical world picture if the cognitive and the physical operators are commutable in the sense of operator algebra (quantitative extensions). Otherwise the physical operators will have invariants which can no longer be described in classical terms and, therefore, would require the formation of non-classical theories (qualitative extensions) exceeding the classical world picture. The mathematical analogue would be the algorithmic extension of elementary mathematical thinking exceeding the previously established axiomatic basis, in accordance with Gödel’s incompleteness theorem. As a consequence, there will be neither a definitive set of axioms in mathematics, nor will there be a definitive theory of everything in physics.

Diettrich O. (1997) Kann es eine ontologiefreie evolutionäre Erkenntnistheorie geben? [Can there be an ontology-free evolutionary epistemology?]. Philosophia naturalis 34(1): 71–105. https://cepa.info/3914

Most of what nowadays is called evolutionary epistemology tries to explain the phylogenetic acquisition of inborn ‘knowledge’ and the evolution of the mental instruments concerned – mostly in terms of adaptation to external conditions. These conditions, however, cannot be described except in terms of what is provided by the mental instruments which are said to have been brought about by just these conditions themselves, so they cannot be defined in an objective and non-circular way. This problem is approached here by way of what is called Gödel’s incompleteness theorem. The ontological prerequisites that form the basis of the various epistemologies discussed in the philosophy of science are replaced by the requirement of consistency: our cognitive phenotype has to bring about a world picture within which the cognitive phenotype itself can be explained as the result of an abiotic, then biotic, organic, cognitive and eventually scientific evolution. Any cognitive phenotype reproducing in this sense (together with its organic phenotype) represents a possible and consistent world together with its interpretation and mastery – and none of them is ontologically privileged.

Diettrich O. (2001) A physical approach to the construction of cognition and to cognitive evolution. Special issue on “The impact of radical constructivism on science” edited by A. Riegler. Foundations of Science 6(4): 273–341. https://cepa.info/4500

It is shown that the method of operational definition of theoretical terms applied in physics may well support constructivist ideas in the cognitive sciences when extended to observational terms. This leads to unexpected results for the notions of reality and induction, and for the problem of why mathematics is so successful in physics. A theory of cognitive operators is proposed: operators implemented somewhere in our brain which transform certain states of our sensory apparatus into what we call perceptions, in the same sense in which measurement devices transform the interaction with the object into measurement results. Then perceived regularities, as well as the laws of nature we would derive from them, can be seen as invariants of the cognitive operators concerned and are thereby human-specific constructs rather than ontologically independent elements (e.g., the law of energy conservation can be derived from the homogeneity of time and thereby depends on our mental time-metric generator). So reality, insofar as it is represented by the laws of nature, no longer has an independent ontological status. This is opposed to Campbell’s ‘natural selection epistemology.’ From this it is shown that an incompleteness theorem holds for physical laws similar to Gödel’s incompleteness theorem for mathematical axioms, i.e., there is no definitive or objective ‘theory of everything.’ This constructivist approach to cognition allows a coherent and consistent model of both cognitive and organic evolution, whereas the classical view sees the two evolutions rather dichotomously (for example, most scientists see cognitive evolution converging towards a definitive world picture, whereas organic evolution obviously has no specific focus, no ‘pride of creation’).

A view of the sources of mathematical knowledge is sketched which emphasizes the close connections between mathematical and empirical knowledge. A platonistic interpretation of mathematical discourse is adopted throughout. Two skeptical views are discussed and rejected. One of these, due to Maturana, is supposed to be based on biological considerations. The other, due to Dummett, is derived from a Wittgensteinian position in the philosophy of language. The paper ends with an elaboration of Gödel’s analogy between the mathematician and the physicist.

Kauffman L. H. (1998) Virtual logic – self-reference and the calculus of indications. Cybernetics & Human Knowing 5(2): 75–82.

This is the fifth column in this series on “Virtual Logic.” In this column we shall begin by recalling the “Descent into the Form” that was described in the previous column, and how this descent makes it clear that the mark of distinction can be seen as self-referential. We then relate this direct appearance of self-reference to other models and to Gödel’s Incompleteness Theorem. The author apologizes beforehand for the ending of Part III of this essay with its modulation on the final lines of “Little Gidding” by T. S. Eliot.

Kauffman L. H. (1998) Virtual logic – The Smullyan Machine. Cybernetics & Human Knowing 5(4): 71–80. https://cepa.info/3119

This is the seventh column in this series on “Virtual Logic.” In this column I will discuss an imaginary machine devised by the logician Raymond Smullyan. Smullyan managed to compress the essence of Gödel’s theorem on the incompleteness of formal systems into the properties of a devilish machine. This column consists of two parts. In the first part we find a story/satire about such a machine, with the Smullyan structure at its core. In this story, the protagonist is bent on detecting a flaw in the machine, and he operates with strict two-valued logic. In such logic a statement is either true or false. Thus we call the statement “If unicorns can fly then all numbers are less than pi” true, because it is not definitely false. In general, “A implies B” is taken to be false only if A is true and B is false. This is the one significant case where “A implies B” must be false. All other cases, such as A false and B true, are taken to be true. This is the classical logical convention. It works quite well in its own domain, but it has its limits. One of these limits occurs when there is a gradation of qualities: for example, in statements about tall and short, the truth is relative to your idea of this discrimination. Another limit is in the realm of self-referential statements. Certainly the Liar Paradox – “This statement is false” – is neither true nor false in any timeless sense.
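The classical convention Kauffman describes can be stated mechanically; a minimal sketch in Python (the function name `implies` is ours, chosen for illustration):

```python
def implies(a: bool, b: bool) -> bool:
    """Classical material implication: false only when a is True and b is False."""
    return (not a) or b

# Enumerate the full truth table of the classical convention.
for a in (True, False):
    for b in (True, False):
        print(f"{a!s:5} -> {b!s:5} : {implies(a, b)}")

# The unicorn example: "unicorns can fly" is false, so the whole
# implication is (vacuously) true, whatever the consequent says.
assert implies(False, False)
```

Note that the one false row, A true and B false, is exactly the case the column singles out; the Liar Paradox, by contrast, cannot be assigned either value in this two-valued scheme at all.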