Anthropocene, exosomatization and negentropy
On transition: in response to Antonio Guterres
Abstract
The industrial economy took shape between the late eighteenth century and the nineteenth century, initially in Western Europe and then in North America. Besides technical production, it involves technological production – the integration of sciences in order to produce industrial goods –, to the strict extent that, as Marx showed, capitalism makes knowledge and its economic valorization its primary element.

Newton’s physics and the metaphysics that goes with it originated the epistemic (in Michel Foucault’s sense) and epistemological (in Gaston Bachelard’s sense) framework of this great transformation. In this transformation, otium (productive leisure time) submits to negotium (worldly affairs, business). All along, mathematics has been applied with ever more powerful and performative calculating machines.

After precursors such as Nicholas Georgescu-Roegen, himself inspired by Alfred Lotka, we maintain that political economy in what is now called the Anthropocene (whose features were delineated by Vladimir Vernadsky in 1926) is a challenge that requires a fundamental reconsideration of these epistemic and epistemological frameworks. With Darwin, living beings became part of a historical process of becoming. In humans, knowledge is a performative part of this process that shapes and reshapes lifestyles in order to tame the impact of technical novelties.
Chapter 1
Anthropocene, exosomatization and negentropy
The industrial economy took shape between the late eighteenth century and the nineteenth century, initially in Western Europe and then in North America. Besides technical production, it involves technological production – the integration of sciences in order to produce industrial goods –, to the strict extent that, as Marx showed, capitalism makes knowledge and its economic valorization its primary element.
Newton’s physics and the metaphysics that goes with it originated the epistemic (in Michel Foucault’s sense) and epistemological (in Gaston Bachelard’s sense) framework of this great transformation. In this transformation, otium (productive leisure time) submits to negotium (worldly affairs, business). All along, mathematics has been applied with ever more powerful and performative calculating machines.
After precursors such as Nicholas Georgescu-Roegen, himself inspired by Alfred Lotka, we maintain that political economy in what is now called the Anthropocene (whose features were delineated by Vladimir Vernadsky in 1926) is a challenge that requires a fundamental reconsideration of these epistemic and epistemological frameworks. With Darwin, living beings became part of a historical process of becoming. In humans, knowledge is a performative part of this process that shapes and reshapes lifestyles in order to tame the impact of technical novelties.
A brief historical introduction: knowledge and technics
The intellectual context of the industrial revolution is the idea that science and the economy, especially trade, would become the new basis of legitimacy, security, justice, and peace. For example, Hume argued that the gold standard spontaneously adjusts the balance of payments between states. The underlying scientific paradigm is Newtonian: deterministic mathematical laws are the ultimate embodiment of knowledge. Under this perspective, equilibrium and optimization follow from the relations between the parts of a system. Studies describe spontaneous, optimal equilibria and therefore promote the withdrawal of rational supervision once the intended dynamic takes place; further intervention would break the balance of these equilibria. Along these lines, scientific and technological developments yield progress through the optimization of processes and the providence of spontaneous balances. However, by construction, such analyses neglect the context of a situation even when this context is the condition of possibility of that very situation. Moreover, following the same rationale, both in science and in industry, complicated situations are reduced to combinations of simple elements that can be known and controlled. For example, the production of a single craftsman can be decomposed into simple tasks performed by several specialized workers and, eventually, by machines. This method entails the progressive loss of workers’ knowledge through its transfer to the technological apparatus; this was first described by Adam Smith and later by Karl Marx, who named this trend proletarianization. This loss of knowledge is a critical component of a more general process of denoetization, that is, the loss of the ability to think (noesis). Technics has become technology, and like technics, technology is a pharmakon: like drugs, it can lead to both positive and toxic outcomes.
At the same time as these events took place, major new scientific ideas emerged. Darwin’s views on biological evolution provided a historical framework for understanding living beings. Darwin’s framework has been interpreted by some as another instantiation of the Newtonian model of science, while others emphasized the originality of historical reasoning in natural science. In this Darwinian framework, the living world is no longer a static manifestation of divine order; instead, current life forms stem from a process of historical becoming. This change of perspective raised the question of the becoming of humankind and of the role played by human intelligence in this process; notably, eugenics and social Darwinism emerged, against Darwin’s own view, which embraced the singularity of human societies.
Another scientific framework appeared on the scene. With the industrial revolution, heat engines were developed, raising theoretical questions that gave birth to thermodynamics. Physicists developed the concept of entropy and showed that entropy can only increase in isolated systems. In physics, energy is conserved by principle, but an increase in entropy means that energy becomes less usable for performing macroscopic tasks. In a nutshell, the increase of entropy in a physical system is the passage from less probable to more probable macroscopic states. It follows that the increase of entropy is the disappearance of improbable initial features and their replacement by more probable features; this means the erasing of the past. This notion departed from the reversibility of classical mechanics – the latter lacks an objectivized arrow of time – and brought about the cosmological perspective of the heat death of the universe. It goes hand in hand with Poincaré’s discovery of chaotic dynamics and the refutation of Laplace’s view that mathematical determinism entails predictability, thus undermining, in principle, the notion of mathematical predictability and control of natural phenomena. In particular, Poincaré’s work applies to the solar system, whose stability cannot be ascertained. These scientific developments provide a precarious view of the cosmos.
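As a reminder, in standard textbook notation, Boltzmann’s formula relates the entropy of a macroscopic state to the number $W$ of microscopic configurations that realize it, and the second law orders the evolution of isolated systems:

$$ S = k_B \ln W, \qquad \Delta S \geq 0 \quad \text{(isolated system)} $$

Moving toward higher entropy thus means moving toward macroscopic states realized by overwhelmingly more microscopic configurations, that is, from less probable to more probable states.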
Nevertheless, in the twentieth century, determinism sensu Laplace found a second wind with mathematical logic and the computer sciences that followed. These developments took place as industrial production shifted to consumer capitalism, a framework driven by mass consumption. Mass media are designed to trigger standard responses from consumers. As a result, the trend of denoetization extends to consumers as such – for example, processed foods led to a loss of folk cooking knowledge and contributed to the pandemic of non-communicable diseases such as obesity.
In this context, the lax notion of information became central. Shannon coined a precise concept of information in order to understand the transmission of written or audio messages through noisy channels of communication. A very different concept was proposed by Kolmogorov to describe how hard the generation of a given sequence of characters is for a computer program. Specifically, Shannon’s theory states that information means improbability. This idea becomes absurd when used to assess meaning rather than to face transmission difficulties (noise), which was Shannon’s original motivation. For example, a constant binary sequence has maximal information sensu Shannon, while a random sequence has maximal information sensu Kolmogorov (i.e., it cannot be generated by a program shorter than itself), and both limit cases have more information, in their respective senses, than a play by Shakespeare of the same length. Despite the incompatibility of these frameworks and their limits, the received view in current cognitive sciences – themselves dominating representations in digital capitalism – is that intelligence is information processing, that is to say, a computation. Similarly, information plays a central role in molecular biology despite the failure to characterize it theoretically. Last, ignoring early criticism by authors such as Poincaré, the economy has been conceptualized as a process of spontaneous, mathematical optimization by “rational” agents, with possibly biased information processing due to “imperfect” cognition.
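This contrast can be made concrete with a small computational sketch (ours, for illustration; the function names are ours). Under a fair-coin source, the improbability of a string’s overall pattern captures information as improbability in Shannon’s sense, while compressed length serves as a computable stand-in for Kolmogorov complexity, which is itself uncomputable:

```python
import math
import random
import zlib

def type_improbability_bits(s: str) -> float:
    """Improbability (self-information, in bits) of the overall pattern of a
    binary string under a fair-coin source: -log2(C(n, k) / 2^n), where k is
    the number of ones. The constant string is the least probable pattern."""
    n, k = len(s), s.count("1")
    return n - math.log2(math.comb(n, k))

def compressed_size(s: str) -> int:
    """Compressed length in bytes: a computable stand-in for Kolmogorov
    complexity, which is uncomputable."""
    return len(zlib.compress(s.encode(), 9))

n = 4096
constant = "0" * n                                      # maximally regular
rand = "".join(random.choice("01") for _ in range(n))   # typical coin flips

for name, s in [("constant", constant), ("random", rand)]:
    print(f"{name:8s} pattern improbability = {type_improbability_bits(s):7.1f} bits"
          f"  compressed = {compressed_size(s):4d} bytes")
# The constant string is maximally 'informative' as improbability yet compresses
# to almost nothing; the random string has a typical, probable pattern yet is
# nearly incompressible. Neither measure says anything about meaning.
```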
At the beginning of the twenty-first century, computer use has spread in diverse forms (personal computers, smartphones, tablets). Their connection in networks has deepened and transformed the role of media. Private interests compete to catch and retain the attention of users. With these technologies, the services provided to users depend on users’ data, and at the same time, service providers use these data to capture users’ attention. These transformations led to a further wave of automatization. Algorithms like those used in social networks formalize and automatize activities that were foreign to the formal economy. These changes lead to further losses of knowledge and to a denoetization in which attention itself is disrupted. Since the received view in cognitive sciences is that intelligence is information processing, many scientists regard these algorithms as artificial intelligence and neglect the conditions of possibility of human intelligence, such as attention. At the same time, management, as well as commercial platforms, decomposes humans into tables of skills, interests, and behaviors that feed algorithms, drive targeted political and commercial marketing, and shape training and recruitment policies.
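A deliberately minimal sketch (ours; the users, features, and weights are hypothetical) of the decomposition just described: persons flattened into tables of weighted features, then ranked by a similarity score for targeting:

```python
import math

# Hypothetical persons and ad, flattened into tables of weighted features.
users = {
    "alice": {"cooking": 0.9, "politics": 0.1, "sports": 0.3},
    "bob":   {"cooking": 0.2, "politics": 0.8, "sports": 0.5},
}
ad = {"cooking": 1.0, "sports": 0.2}  # an ad described in the same feature space

def cosine(u: dict[str, float], v: dict[str, float]) -> float:
    """Cosine similarity between two sparse feature vectors."""
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in set(u) | set(v))
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

# Target whoever scores highest: whatever is singular about the person has
# already disappeared into the table of weights.
print(max(users, key=lambda name: cosine(users[name], ad)))  # -> 'alice'
```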
The same trend occurs in the sciences: knowledge tends to be balkanized into ever more specialized fields of investigation, and scientific investigations tend to be reduced to the deployment of new observational apparatuses and new information processing on the data obtained. By contrast, theorization is a necessary process for science: it is a synthetic activity that reevaluates the concepts and history of a field, empirical observations, and the insights of other fields. With the emergence of data mining, Chris Anderson advocated the end of theory. This perspective has been accurately criticized; however, the decline of theorization in the sciences seems to come mostly from another path. Following society’s general trend, it comes as the indirect result of institutional restructurings and the increasing weight of scientific marketing, both in publications and in funding decisions. It also comes from an insufficient critical assessment of digital technologies and their consequences for scientific activities; as a result, the academic appropriation of these technologies to mitigate their toxic consequences and push forward scientific aims is lacking (except for purely mathematical questions).
Now, at the beginning of the twenty-first century, we are also witnessing a rising awareness of the consequences of human activities on the rest of the planet, leading to the definition of a new era: the Anthropocene. The Anthropocene is characterized by human activities that tend to destroy their own conditions of possibility – including both biological organizations (organisms, ecosystems) and the ability to think (noesis). In this context, the ability to generate knowledge that mitigates the toxicity of technological innovations is deeply weakened, to the extent that the problem of this toxicity is seldom raised as such by governments and societies.
Entropies and the Anthropocene
Energy and mineral resources, such as metals, are conserved quantities from the perspective of physics; nevertheless, there is some truth in saying that these resources are becoming scarce. A crucial concept for understanding this situation is entropy. Entropy describes configurations and is directly related to our ability to use such resources. For example, ore deposits exist at improbably high concentrations, generated by far-from-equilibrium geological and atmospheric processes, and human activities concentrate them further by the use of free energy. For these resources, the critical concepts are the dispersion and, conversely, the concentration of matter – that is, the entropy of their distribution on Earth. However, a straightforward accounting of entropy is not conceptually accurate, and it is necessary to provide a finer-grained discussion of the articulation between entropy and the living, including the special case of human societies.
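The standard formula for the entropy of ideal mixing makes the stakes of dispersion explicit (textbook notation; the application to ores is our illustration):

$$ \Delta S_{\text{mix}} = -nR \sum_i x_i \ln x_i $$

Dispersing a metal from concentrated deposits into the crust at low mole fractions $x_i$ increases this term; recovering it then costs at least the corresponding free energy, on the order of $T\,\Delta S_{\text{mix}}$. Dilution, not disappearance, is what makes a conserved resource scarce.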
From the perspective of thermodynamics, biological situations are not at a maximum entropy and do not tend towards maximum entropy. The low and sometimes decreasing entropy of biological objects seems to “contradict” the second principle of thermodynamics, which states that entropy cannot decrease in an isolated system. However, biological situations, including the biosphere as a whole, are not isolated systems. Biological situations are open; they use flows of energy, matter, and entropy. At the level of the biosphere, the sun is the primary provider of free energy that is used by photosynthetic organisms. Therefore, biological situations do not contradict the second principle. A consequence is that biological organizations and, by extension, social organizations, are necessarily local and depend on their coupling with their surroundings. In organisms, the relationship between the inside and the outside is materialized and organized by semi-permeable membranes.
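In the standard bookkeeping of non-equilibrium thermodynamics, this openness is written as a split of the entropy variation of a system into an internal production term and an exchange term:

$$ dS = d_i S + d_e S, \qquad d_i S \geq 0 $$

Only the internal production $d_i S$ is constrained by the second law; the exchange term $d_e S$ can be negative, so that an organism may keep its entropy low by exporting entropy to its surroundings while dissipating the free energy it receives.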
How can we move forward in understanding biological situations and their articulation with thermodynamics? Predicting requires theoretically singling out one situation among many others: typically, the state that the changes of the object will bring about. Entropy maximization singles out a macroscopic state: the one that maximizes entropy. Functions performing this role in physics are called potentials. Equilibrium thermodynamics offers a diversity of potentials – different variants of free energy that involve entropy and whose relevance depends on the coupling between the system studied and its surroundings. However, in the case of systems far from thermodynamic equilibrium – situations that require flows with the surroundings to last, like organisms –, there is no consensus on the theoretical existence of such a function or family of functions. For example, Prigogine’s fundamental idea is that the rate of entropy production (i.e., the rate of energy dissipation) could play the theoretical role of a potential; however, this idea is valid only in particular open systems. It follows that the ability to understand general far-from-equilibrium systems by calculus is not theoretically justified. From a less technical perspective, Schrödinger introduced the idea that the problem in biology is not to understand order from disorder, as in many physical situations, but instead to understand order from order. To capture this idea, he proposed to look into negative entropy, an idea later elaborated by Brillouin, who named the corresponding quantity “negentropy.”
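For reference, two standard equilibrium potentials illustrate how the relevant function depends on the coupling with the surroundings (textbook notation):

$$ F = U - TS \quad \text{(minimized at fixed } T, V\text{)}, \qquad G = U + pV - TS \quad \text{(minimized at fixed } T, p\text{)} $$

Each potential singles out the equilibrium state under its own coupling conditions; the point above is precisely that no function with a comparable theoretical status is guaranteed far from equilibrium, which motivates Schrödinger’s shift to order from order and to negentropy.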
However, negative entropy does not precisely reflect biological organizations. Entropy can be lowered just by decreasing temperature, while biological organizations persist only within a range of temperatures: a major glaciation would decrease entropy, but it would also destroy biological organizations. Moreover, functional parts of biological organizations often involve a local increase of entropy in order to be functional. For example, the diffusion of a compound from its production location to the rest of the cell is a process of physical entropy production; nevertheless, this process leads the compound to reach locations where it can play a functional role. It follows that articulating entropy and biological organizations requires a careful analysis. In a nutshell, biological organizations maintain themselves far from maximum-entropy configurations thanks to fluxes from their surroundings. At a given time, they actively sustain this situation through the interaction between their parts and these fluxes. The necessary coupling between organisms and their surroundings takes place in ecosystems that are themselves embedded in larger levels, up to the biosphere. The viability of living situations stems from the systemic properties of these various levels and, at the same time, from the underlying history that originated organizations in their respective past contexts. More generally, the way biological organizations sustain themselves is fundamentally historical, i.e., it stems from natural history. This historicity implies a particular vulnerability to fast anthropogenic changes that disrupt biological organizations at various levels simultaneously; examples are climate change, at the level of ecosystems, and endocrine disruptors, at the level of organisms. Moreover, life forms continue to change over time by generating new structures and functions. More than individual species, biologists emphasize the conservation of biodiversity and of the branching process of evolution, which we may call biodiversification; this process is itself the object of anthropogenic disruptions. In sum, biological organizations are precarious because the existence and nature of their parts are fundamentally contingent, and these parts need to be actively sustained. Organizations sustain themselves in ways that stem from past contexts and can reorganize given sufficient time; however, both processes are disrupted by anthropogenic changes. This argument is well accepted in state-of-the-art biological knowledge, yet these matters remain insufficiently theorized.
A possible strategy for going further in this analysis is to propose a concept complementary to entropy (and to its mathematical opposite, negentropy). Bailly, Longo, and Montévil proposed such a concept, called anti-entropy, which refers to biological organizations (organs, functions …). In contrast to (digital) information, which is a one-dimensional notion (Shannon’s and Kolmogorov’s alpha-numeric strings), anti-entropy is not one-dimensional: the geometry and dimensions of organizations do matter. A living organism produces entropy by transforming energy, sustains its anti-entropy by continually setting up and renewing its organization, and produces anti-entropy by generating organizational novelties.
Anti-entropy aims to accommodate biological organizations in their historicity. Current life forms sustain themselves by the use of functional novelties that appeared in the past (anti-entropy) and by the production of functional novelties (anti-entropy production). These novelties are unpredictable and unprestatable a priori (i.e., their nature cannot be stated in advance). At the same time, they are not generic random outcomes: they are specific, because they contribute to the ability of biological objects to last over time by contributing to their organization in a given context (a context that this organization may in turn impact). Entropy depends on the coupling of a system with its surroundings. Similarly, anti-entropy is relative to an organization, and not all objects are organized. For example, considered alone, a heart has no function; it is only at the level of the organism that it is endowed with a function. As a result, all discussions of anti-entropy are relative to an intended organized object, that is to say, to a specific locality.
As pointed out by Lotka, a specificity of human societies is the importance of inorganic objects in their organizations, such as tools, written texts, or computers. These objects are shaped and maintained by human activities. The constitution of objects theoretically analogous to organs outside organic bodies is called exosomatization by Lotka, and this process underlies how humans’ ways of living evolve.
In order to enable these inorganic objects to have a functional role and to limit the destabilization they introduce, evolution as well as developmental and physiological plasticity play a role in the process of exosomatization. For example, reading recruits the plasticity of several brain areas, which depend on the writing system. However, these purely biological responses are insufficient, and noetic activities are required to complete the process of exosomatization. For example, philosophy can be interpreted as a reaction to writing and to its use by the sophists, with possibly catastrophic consequences for the polis. In contemporary terms, for a technic to become desirable, it is far from sufficient that it find a market through marketing. It is also necessary to find variations and uses that mitigate the toxicity of these technics – especially in the perspectives of climate change, the decline of biodiversity, and denoetization. In other words, more work is required to single out exosomatic novelties (i.e., technics and technologies) that are compatible with a desirable future for humankind. In this perspective, knowledge in all its forms plays a special role: knowledge prescribes variants and uses for the novelties introduced by exosomatization and is tied to ethics.
Computers participate in this process and can be defined as automatic rewriting systems. With the increase of their speed and of their inputs (data), computers’ ability to process information and perform categorization increases dramatically. However, the tasks they can perform are not equivalent to the novelties produced by human work. In the latter, meanings are produced that are neither in the initial data nor in their combinations by algorithmic methods. For example, the principle of inertia describes a very exotic situation on Earth, one where no forces are exerted on an object (e.g., no friction and no gravitation): it cannot be derived from data but was posed by Galileo as an asymptotic principle, a way to “make sense” of all movements at once and to analyze what may affect them, namely frictions and gravitation. Similarly, equal rights between citizens and gender equality are political principles that trigger a departure from former situations and reshape social organizations; they cannot be deduced from the former situations. These examples are historically significant in their respective domains; however, such processes are, in a sense, ordinary in human activities. They define work, by contrast with labor: the former is also the permanent “invention of a new configuration of sense.” The current trend, however, is unfortunately not to develop work in this sense; instead, it is a convergence between algorithms and human activities. This convergence means a sterilization of work by its standardization – its transformation into generic information processing.
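The definition of computers as automatic rewriting systems can be illustrated with a toy example (ours; the rules are arbitrary). Rules are applied mechanically until none matches; the machine only ever rewrites within its pregiven alphabet and rules, and positing a new principle, in the sense of the examples above, is not an operation of this kind:

```python
# Toy rewriting system (ours; the rules are arbitrary): tally marks are
# mechanically paired up until no rule applies.
rules = [("11", "2"), ("22", "4")]

def rewrite(word: str, rules: list[tuple[str, str]]) -> str:
    """Apply rules leftmost-first, one at a time, until none matches."""
    changed = True
    while changed:
        changed = False
        for lhs, rhs in rules:
            if lhs in word:
                word = word.replace(lhs, rhs, 1)
                changed = True
                break  # restart from the first rule after each rewrite
    return word

print(rewrite("1111", rules))  # '1111' -> '211' -> '22' -> '4'
```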
The scientific consensus is that the current path of civilization leads to its destruction; identifying anti-entropy, extended to social organizations, with information amounts to a one-dimensional flattening that contributes to this path. Work invents new tools and uses, and thus constructs new configurations and senses for human and ecosystemic interactions. It thereby departs from alpha-numeric combinatorics within a pregiven set of possibilities (computational data processing), and it is required at all levels of society to face the current crisis.
References
- Bailly, F. and G. Longo (2009). Biological organization and anti-entropy. Journal of Biological Systems, 17(1):63–96. doi: 10.1142/S0218339009002715.
- Bizzarri, M., A. Soto, C. Sonnenschein, and G. Longo (2017). Why Organisms? Organisms. Journal of Biological Sciences, 1(1):1–2. doi: 10.13133/2532-5876_1.1.
- Georgescu-Roegen, N. (1993). The entropy law and the economic problem. In Valuing the Earth: Economics, ecology, ethics, pages 75–88. MIT Press, Cambridge, MA.
- Kauffman, S. A. (2019). A World Beyond Physics: The Emergence and Evolution of Life. Oxford University Press.
- Longo, G. and M. Montévil (2014). Perspectives on Organisms: Biological time, symmetries and singularities. Lecture Notes in Morphogenesis. Springer, Dordrecht. ISBN 978-3-642-35937-8. doi: 10.1007/978-3-642-35938-5.
- Longo, G., P.-A. Miquel, C. Sonnenschein, and A. M. Soto (2012). Is information a proper observable for biological organization? Progress in Biophysics and Molecular Biology, 109:108–114.
- Montévil, M. (submitted). Entropies and the Anthropocene crisis. AI & Society.
- Nicolis, G. and I. Prigogine (1977). Self-organization in non-equilibrium systems. Wiley, New York.
- Schrödinger, E. (1944). What Is Life? Cambridge University Press.
- Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27:379–423.
- Stiegler, B. (2017). Automatic Society, Volume 1: The Future of Work. Polity Press.
- Stiegler, B. (2010). Taking Care of Youth and the Generations. Stanford University Press.
- Stiegler, B. (2018). The neganthropocene. Open Humanities Press.
- Supiot, A. (2019). “Homo faber: continuité et ruptures.” In Le travail au XXIe siècle: Livre du centenaire de l’Organisation internationale du Travail. Éditions de l’Atelier.