Earlier this year I discussed how I was inspired by Bertalanffy’s humanistic approach to systems theory.1
A few weeks ago, I read a paper that explores the historical relationship and philosophical tensions between humanistic General System Theory as envisioned by Bertalanffy and the cybernetic approach to systems research pioneered by Norbert Wiener and his colleagues.2
It’s a long paper with plenty of great content for anyone interested in the historical development of systems science. But this week I’d just like to highlight one aspect — Bertalanffy’s firm belief that systems research should not treat men as machines.
Men are not Machines
The authors quote Bertalanffy frequently to help illustrate his concerns about how cybernetics and technology-focused concepts were exerting an outsized influence on the then-nascent disciplines of systems science and systems theory.
“The student in ‘systems science’ receives a technical training which makes systems theory – originally intended to overcome current overspecialization – into another of the hundreds of academic specialties…
…systems science, centered in computer technology, cybernetics, automation and systems engineering, appears to make the systems idea another – and indeed the ultimate – technique to shape man and society ever more into the ‘megamachine.’” (General System Theory, p. xxxi)3
Based on my experience in my systems science master’s program, his fears were very justified.
While the technical training I’m receiving for computer-aided interdisciplinary investigation is incredibly valuable and enjoyable, I’ve been surprised by the lack of vibrant philosophical discussion in my classes about pressing human and social concerns from a systems perspective.
I can’t help but wonder if the practical successes of the cybernetic approach in systems science have played a significant role in leading the field, and society as a whole, directly into the “megamachine” future feared by Bertalanffy.
Indeed, the founding cyberneticists explicitly embraced and argued for the necessity of the machine metaphor which Bertalanffy despised.
“We believe that men and other animals are like machines from the scientific standpoint because we believe that the only fruitful methods for the study of human and animal behavior are the methods applicable to the behavior of mechanical objects as well.
Thus, our main reason for selecting the terms in question was to emphasize that, as objects of scientific enquiry, humans do not differ from machines.” (Purposeful and Non-purposeful Behavior, p. 326)4
Honoring the Human Component
Bertalanffy vehemently opposed the idea of treating humans as machines or robots — he advocated instead for an organismic approach to understanding human behavior.
While he acknowledged the importance of incorporating mathematics along with pure and applied science into systems theory, he believed that the humanistic and philosophical aspects must also be considered to ensure general system theory wasn’t “limited to a restricted and fractional vision.”
He held a bleak yet prescient view of humanity’s future under a systems approach grounded in mechanistic worldviews.
“Moreover, the dangers of “systems” are apparent. Systems designers, analysts, researchers, behavioral engineers, and members of similar professions…contribute to or even lord over the industrial-military establishment. Elaborating weapons systems, dominating advertising, mass media, and propaganda, and in general preparing a cybernetic society of the future, they must of necessity tend to exclude or suppress the “human element.”
For this human element, individualistic, capricious, often unforeseeable, is precisely the unreliable part of the “megamachine”…of the present and future; hence, it should be either replaced by computers and other hardware or else made “reliable,” that is, as machinelike, automated, uniform, conformist, and controlled as possible. ‘Systems’ thus appears to be the epitome of the automated wasteland of Brave New World and 1984.”5
To be fair, I should acknowledge that I haven’t read much Wiener. I’m aware that he, too, advocated for a humanistic approach to systems research and voiced concerns similar to Bertalanffy’s; reading his work may give me a greater understanding of how a cybernetic approach doesn’t necessarily imply an entirely mechanistic worldview.
It seems that Wiener’s answer to these problems was “to have a society based on human values” that went beyond reliance on pure market mechanisms as a universal norm of societal organization. He was well aware of the potential use of cybernetics for good and for evil.6
However, as I prepare to spend the next semester deeply immersed in the world of deductive mathematical systems theory grounded in cold hard logic, it feels important to reaffirm my commitment to Bertalanffy’s humanistic approach.
I’m eager to gain a deeper understanding of the mathematical techniques which have had such a profound impact on the evolution of systems theory and science, but I will constantly be asking myself:
How do I keep the human component, and humanity as a whole, at the forefront of my mind?
Shingai. (2024, January 2). Humanistic Systems Research [Substack newsletter]. System Explorers. https://systemexplorers.substack.com/p/humanistic-systems-research
Drack, M., & Pouvreau, D. (2015). On the history of Ludwig von Bertalanffy’s “General Systemology”, and on its relationship to cybernetics – part III: Convergences and divergences. International Journal of General Systems, 44(5), 523–571. https://doi.org/10.1080/03081079.2014.1000642
von Bertalanffy, L. (1969). General System Theory – Foundations, Development, Applications (14th paperback printing, 2003). New York: George Braziller.
Rosenblueth, A., & Wiener, N. (1950). Purposeful and non-purposeful behavior. Philosophy of Science, 17, 318–326. https://www.jstor.org/stable/185931
von Bertalanffy, L. (1969). “General Systems Theory and Psychiatry – An Overview.” In General Systems Theory and Psychiatry.
Wiener, N. (1948; 2nd ed., 1961). Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press. https://direct.mit.edu/books/oa-monograph/4581/Cybernetics-or-Control-and-Communication-in-the
Western nineteenth- and twentieth-century thinking and reasoning, with its industrial machine metaphors, shows up everywhere. Break things into parts so that you can understand and fix them. Complex things need to be divided into even smaller parts. Divide knowledge into scientific, humanistic, and religious. Separate humans from nature. The university is designed as a factory system where the highest honors go to specialists. From that worldview, when the feedback loops and control systems first designed by engineers and mathematicians for guiding rockets were recognized throughout nature, it was logical to see people as machines.
Shift to a systemic worldview, describe feedback loops as one of the many interacting “systems processes” that make up all systems — that describe how all systems work — and you’ve moved away from the machine metaphors. Machines are poorly conceived and designed compared to the networked, self-organizing, and emergent systems of nature. The current problems with AI, cryptocurrencies, climate and environmental destruction, and the like exist because we haven’t yet conceived of how to design deeply ethical systems: systems that support the systems that make up their environments, systems that organize themselves to ensure the health of peer systems and suprasystems. We can create really great sales and marketing algorithms, but we haven’t yet created ethical algorithms.