The work of physicist and theoretical biologist Howard Pattee has focused on the roles that symbols and dynamics play in biological systems. Symbols, as discrete functional switching-states, are seen at the heart of all biological systems in the form of genetic codes, and at the core of all neural systems in the form of informational mechanisms that switch behavior. They also appear in one form or another in all epistemic systems, from informational processes embedded in primitive organisms to individual human beings to public scientific models. Over its course, Pattee’s work has explored (1) the physical basis of informational functions (dynamical vs. rule-based descriptions, switching mechanisms, memory, symbols), (2) the functional organization of the observer (measurement, computation), (3) the means by which information can be embedded in biological organisms for purposes of self-construction and representation (as codes, modeling relations, memory, symbols), and (4) the processes by which new structures and functions can emerge over time. We discuss how these concepts can be applied to a high-level understanding of the brain. Biological organisms constantly reproduce themselves as well as their relations with their environs. The brain similarly can be seen as a self-producing, self-regenerating neural signaling system and as an adaptive informational system that interacts with its surrounds in order to steer behavior.
W. Ross Ashby was a founder of both cybernetics and general systems theory. His systems theory outlined the operational structure of models and observers, while his cybernetics outlined the functional architecture of adaptive systems. His homeostat demonstrated how an adaptive control system, equipped with a sufficiently complex repertoire of possible alternative structures, could maintain stability in the face of highly varied and challenging environmental perturbations. The device illustrates his ‘law of requisite variety’, i.e. that a controller needs at least as much variety of internal states as the system being controlled can present. The homeostat provided an early example of how an adaptive control system might be ill-defined vis-à-vis its designer, yet nevertheless solve complex problems. Ashby ran into insurmountable difficulties when he attempted to scale up the homeostat, and consequently never achieved the general-purpose, brainlike devices that he had initially sought. Nonetheless, the homeostat continues to offer useful insights as to how the large analogue, adaptive networks in biological brains might achieve stability.
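The homeostat’s core strategy lends itself to a compact illustration: whenever an ‘essential variable’ drifts outside its limits, the device randomly re-selects its internal parameters and retains whatever configuration turns out to be stable. The following Python sketch is a hypothetical toy version of that selection-by-random-reconfiguration idea, not a reconstruction of Ashby’s electromechanical device; the dynamics, bounds, and parameter names are assumptions chosen for clarity.

```python
import random

# A minimal, hypothetical sketch of the homeostat's selection principle:
# coupled variables evolve under a weight matrix, and whenever any
# "essential variable" leaves its bounds, the unit re-randomizes its
# weights (the uniselector step) until a configuration is found that
# keeps all variables within bounds.

N_UNITS = 4      # number of coupled units (Ashby's homeostat had four)
BOUND = 1.0      # limits on the essential variables (arbitrary choice)

def random_weights():
    # Random inter-unit couplings, including self-feedback, standing in
    # for the stepping uniselectors that re-wire the device.
    return [[random.uniform(-1.0, 1.0) for _ in range(N_UNITS)]
            for _ in range(N_UNITS)]

def random_state():
    return [random.uniform(-0.5, 0.5) for _ in range(N_UNITS)]

def simulate(max_steps=20000, stable_window=500):
    """Return how many random re-selections occur before the system
    stays within bounds for `stable_window` consecutive steps."""
    weights, state = random_weights(), random_state()
    resets, steps_since_reset = 0, 0
    for _ in range(max_steps):
        # Damped linear coupled dynamics (an assumption made for simplicity).
        state = [0.9 * state[i]
                 + 0.1 * sum(weights[i][j] * state[j] for j in range(N_UNITS))
                 for i in range(N_UNITS)]
        steps_since_reset += 1
        if any(abs(x) > BOUND for x in state):
            # Essential variable out of bounds: re-select a new configuration.
            weights, state = random_weights(), random_state()
            resets += 1
            steps_since_reset = 0
        elif steps_since_reset >= stable_window:
            break  # a stable configuration has been found and retained
    return resets

if __name__ == "__main__":
    print("re-selections before settling:", simulate())
```

Even in this toy form, one would expect the number of re-selections needed to grow rapidly as more units are coupled or the bounds are tightened, which hints at the kind of scaling difficulty described above.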
Cariani P. (2015) Sign functions in natural and artificial systems. In: Trifonas P. P. (ed.) International handbook of semiotics. Springer, Dordrecht: 917–950.
This chapter outlines a broad theory of sign use in natural and artificial systems that was developed over several decades within the context of theoretical biology, cybernetics, systems theory, biosemiotics, and neuroscience. Different conceptions of semiosis and information in nature are considered. General functional properties of and operations on signs, including measurement, computation, and sign-directed actions, are described. A taxonomy of semiotic systems is built up from combinations of these operations. The respective functional organizations and informational capabilities of formal systems, computational systems, empirical-predictive scientific models, percept-action systems, purposive goal-seeking systems, and self-constructing systems are discussed. Semiotic relations are considered in terms of the Morrisean semiotic triad of syntactics, semantics, and pragmatics. Analysis of state-transition structure is used to demarcate functional boundaries, such as epistemic and control cuts. Capabilities for open-ended behavior, combinatoric and emergent creativity, and umwelt expansion are taken up. Finally, basic problems of neurosemiotics, neural coding, and neurophenomenology are outlined.
It is generally agreed that organisms are Complex Adaptive Systems. Since the rise of Cybernetics in the middle of the last century, ideas from information theory and control theory have been applied to the adaptations of biological organisms in order to explain how they work. This does not, however, explain functionality, which is widely but not universally attributed to biological systems. There are two approaches to functionality, one based on etiology (what a trait was selected for), and the other based on autonomy. I argue that the etiological approach, as understood in terms of control theory, suffers from a problem of symmetry, by which function can equally well be placed in the environment as in the organism. Focusing on the autonomy view, I note that it can be understood to some degree in terms of control theory in its version called second order cybernetics. I present an approach to second order cybernetics, due to Hooker, Penfold and Evans, that seems plausible for organisms with limited computational power. They hold that this approach gives something like concepts – certainly abstractions from specific situations – a trait required for functionality in its system-adaptive form (i.e., control of the system by itself). Using this cue, I argue that biosemiotics provides the methodology to incorporate these quasi-concepts into an account of functionality.
Goldspink C. & Kay R. (2003) Organizations as self-organizing and sustaining systems: A complex and autopoietic systems perspective. International Journal of General Systems 32(5): 459–474. https://cepa.info/3951
Many alternative theories about organization exist. Despite this, or perhaps because of it, adequate explanation of the relationship between macro and micro processes of organization, and of organizational dynamics, remains elusive. In the recent past there has been growing interest in two areas of systems science that offer a different basis for understanding the generative and dynamic qualities of organizations. These are autopoietic theory and complex adaptive systems theory. In this paper, we outline a theory of organization built on a synthesis of these two theoretical strands. It is argued that the approach provides an improved framework for understanding the nature and dynamics of organizational phenomena, and as such a more rigorous basis upon which to ground future organizational research.
Juarrero A. (2015) What does the closure of context-sensitive constraints mean for determinism, autonomy, self-determination, and agency? Progress in Biophysics and Molecular Biology 119(3): 510–521. https://cepa.info/4662
The role of context-sensitive constraints – first as enablers of complexification and subsequently as regulators that maintain the integrity of self-organized, coherent wholes – has only recently begun to be examined. Conceptualizing such organizational constraints in terms of the operations of far-from-equilibrium, nonlinear dynamic processes rekindles old metaphysical discussions concerning primary and secondary relations, emergence, causality, and the logic of explanation. In particular, far-from-equilibrium processes allow us to rethink how parts-to-whole and whole-to-parts – so-called “mereological” – relationships are constituted. A renewed understanding of recursive feedback and the role context-dependence plays in generating the boundary conditions and the internal organization of complex adaptive systems in turn allows us to redescribe formal and final cause in such a way as to provide a meaningful sense of heretofore seemingly intractable philosophical problems such as autonomy, self-determination, and agency.
Roberts J. W. (2018) The Nonmodern Ontological Theatre. Review of The Cybernetic Brain: Sketches of Another Future by Andrew Pickering, 2010. Constructivist Foundations 13(3): 398–401. https://cepa.info/5312
Upshot: Andrew Pickering’s book traces an early history of British cybernetics. Pickering draws connections between the “brain” in its many forms – from performative organ to strange machine, to larger social developments – as adaptive systems coupled to humans and artifacts. Focusing on the work of early British cyberneticians, Pickering argues for a performative paradigm as opposed to strictly representational approaches to the world, and therein locates an “ontology of cybernetics” that challenges modern modes of knowledge production and serves as a necessary component of cybernetic projects.
Umpleby S. A. (2008) Ross Ashby’s general theory of adaptive systems. International Journal of General Systems 38(2): 231–238. https://cepa.info/892
In the 1950s and 1960s Ross Ashby created a general theory of adaptive systems. His work is well-known among cyberneticians and systems scientists, but not in other fields. This is somewhat surprising, because his theories are more general versions of the theories in many fields. Philosophy of science claims that more general theories are preferred because a small number of propositions can explain many phenomena. Why, then, are Ashby’s theories not widely known and praised? Do scientists really strive for more general, parsimonious theories? This paper reviews the content of Ashby’s theories, discusses what they reveal about how scientists work, and suggests what their role might be in the academic community in the future. Relevance: Since Ashby defines a system as a set of variables selected by an observer, his work is quite compatible with second order cybernetics even though Ashby never directly addressed the issue of the observer or second order cybernetics.