As soon as something implicit intrudes on consciousness, human thought undergoes a radical change. The introduction of any new tool or code brings a shift in cognition; every micro-step layering new semiotic forms within each macroevolutionary stage has buttressed a new semantic leap. The mechanization of everyday life and the tech-systems we interact with are reshaping communication, cultural norms and values, market aesthetics, and economics in societies at large.
Undergirded by a survey of the role and significance of tools in human evolution, this study arrives at what is already a well-entrenched new era: the digital, screen-mediated age. Revolutionized by the algorithm introduced by computers, this age is dominated by the addictive quality of instant contact, unlimited information, virtual gaming, and titillating service-forms, all at our fingertips.
Aside from the interpersonal impact on the new humans growing up with devices in hand, how does this disembodied, digital code-form through which our interactions are mediated condition human cognition? How does its seductive efficiency interfere with how we relate, feel, assign meanings, and think? Rooted in the macro-evolutionary principles of Code Biology and in psychoanalysis, this paper examines the algorithm itself and takes a sweeping interdisciplinary approach to the developmental, psychosocial, and cognitive implications for the human mind/brain as it interacts with its technological extension.
Correspondence: papers@team.qeios.com — Qeios will forward to the authors
It was the success of the simplest tools that started the whole trend of human evolution and led to the civilization of today.
S. Washburn 1960, 63
We need first to understand that the human form—including human desire and all its external representations – may be changing radically, and thus must be re-visioned … five hundred years of humanism may be coming to an end as humanism transforms itself into something we must helplessly call posthumanism.
I. Hassan, 1977, 212
In this essay, the idea that new coding forms mark the advent of new eras (Barbieri, 2015) is taken up and brought up to date by examining the code that has reconfigured human life and is defining a new era - the algorithm. I chose tool-making to trace the adaptive course of human ingenuity. This revealed three broad categories present from the start, for the hand that flaked the stone is also the hand that recorded lunar cycles and painted cave walls: Tools, Signs, and Representation interweave, engaging eyes, movements, and thought in feedback loops. Perception, kinetics, and cognition together evolved the human brain, mind, and intelligence.
Throughout, I am deeply interested in the semiotic dimension, in all its implications for interaction, communication, invention, and representation, believing that semiosis evolved through micro-biological pathways involving neural connections that unite body and mind, and that it retains these connections in a sensorimotor matrix. This exploration also considers the social adjustments created by new technologies through diverse eras: from small agrarian groups to city-states to nations; astrology to mathematics; cuneiform tablets to the alphabet to the Gutenberg press; books to the telegraph and telephone; from screens to today's global web. Each of these developments radically changed the fabric of how we live, communicate, assign meaning, and think. And each new coding form led to new semiotic systems.
* * * *
During the Second World War, in the months preceding the London Blitz, as Britain braced for a German invasion, huge teams of the finest minds in electrical engineering, physics, mathematics, and the sciences, on both sides of the Atlantic, gathered to create air-defense systems and ways to intercept German planes and decipher coded communications. The first radar and ground-to-air missiles were two inventions from this time. Later came penicillin and a decoding machine that detected patterns far more quickly and effectively than humans could. Turing introduced the idea and possibility of computer science; John von Neumann would later build the first stored-program computer, soon transforming the atomic age into a new, computerized world order.
* * * *
N. Wiener, C. Shannon, J. von Neumann
The complexity of human artifacts finds its explanation in human intelligence.
D. Berlinski, 2000, 314
Among the many post-war inventions – radar, lasers, nuclear bombs and power plants, batteries, transistors, space rockets and satellites – the most important breakthroughs were in electronics and biotechnology. Consumer electronics now mass-produced watches, calculators, televisions, portable radios, and appliances; overshadowing them all was the computer. Computers themselves, though not task-specific, can be programmed to process all sorts of 'information' because all input is transformed into binary digits, strings of ones and zeros. We had entered the age of 'information.'
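To make the point concrete, here is a minimal sketch (in Python; the example is my own illustration, not drawn from this paper's sources) of how even a single word is reduced to strings of ones and zeros under one common text-encoding convention:

```python
# A minimal illustration: text, like all computer input, is stored as
# binary digits. UTF-8 is one common encoding convention among several.
word = "code"
bits = " ".join(format(byte, "08b") for byte in word.encode("utf-8"))
print(bits)  # 01100011 01101111 01100100 01100101
```

Every picture, sound, and keystroke undergoes a comparable reduction before the machine can process it.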
This paradigm shift was not lost on N. Wiener, professor at MIT, mathematician, and scientist, who in 'The Human Use of Human Beings' (1950) recounts that he had begun working on a 'theory of messages' as a "means of controlling machinery and society, the development of computing machines, and other such automata" (15), reflecting on the nervous system and drawing parallels between the "internal transforming powers of the apparatus, whether alive or dead" (26). It is not possible in a short space to cover the range of his ideas (entropy, feedback, self-replicating machines, and so on), but importantly he couched these in a probabilistic framework. Looking for a name to subsume this complex of ideas, Wiener landed on "Cybernetics," derived from the Greek kubernētēs, "steersman," a word that would spawn a veritable lexicon of cyber-offspring, especially in pop culture! From the start he coupled communication with control, claiming that "society could only be understood through a study of messages and the means that facilitate them," whether these are "between human and machine, machine and human, or machine and machine" (16). His vision was prescient in that he foresaw the growing role all forms of communication would play in the future.
The ramifications of this cutting-edge conception spread, engaging many avant-garde minds: mathematicians, physicists, biologists, philosophers, epistemologists, sociologists, psychologists, and a psychoanalyst participated in lively multidisciplinary discussions, out of which evolved a solid conceptual foundation. The cybernetic framework took hold and exploded; Wiener's conceptual vision was vast, moving from electronic engineering to the life sciences. Likened to a human nervous system, the machine would yield output from input, much as we learn from experience. Mathematicians think in abstract terms, and once ideas are rarefied at high levels of abstraction it does not matter whether the input goes through a human sensorium or mechanized 'sensors'; what matters is input and output. "To me," writes Wiener, "…the fact that the signal in its intermediate stages has gone through a machine rather than a person is irrelevant" (1950, 16). This is a loaded statement! A signal transmits via semiotic instruments - signs/symbols; cognitively, these function to represent the 'thing' itself and how we refer to it. Semiosis is therefore our intermediary between interlocutors as well as our mediation between 'reality' out there and our minds, processed via the human sensorium and nervous system at different levels of developmental organization - nothing like programmed machines. The line between living and non-living systems was already blurring at the outset.
The purpose of Cybernetics was to develop a vocabulary and technique that would enable humans to "attack the problems of control and communication in general" (Wiener 1950, 17) and properly classify their manifestations within a unified framework. In this he succeeded supremely well, setting in motion a conceptual revolution, for good and ill, that continues to this day in the far-flung ambition of creating a 'master learner' with encyclopedic knowledge. Wiener's redefinition of information as the "content of what is exchanged with the outer world" (1950, 117), in accordance with the input-output formula, fit with Shannon's (1948) reification and mathematization of information. Working at the Bell Telephone Laboratories, the mathematician, electrical engineer, and cryptographer C. Shannon, inspired by a purely practical transmission problem, devised a theorem that solved it, earning him the title of "father of information theory." Together, they had endowed information with its modern form. The next great breakthrough came from the physicist, mathematician, engineer, polyglot, and computer pioneer John von Neumann, among the first to conceive of computers as devices that could be used to solve specific problems through applied mathematics. As early as 1945, he demonstrated that a computer could have a simple, fixed structure, yet execute any kind of computation with properly programmed controls and without hardware modification, thereby introducing the first stored computer program along with his famous game theory. In line with the philosophy of Leibniz, Wiener's intellectual synthesis forged a way through mathematical abstraction to quantify mechanisms of change within and between systems. This foundation, infused by experts from various sciences and launched by Shannon's theory, was put into practice by von Neumann's contribution, the whole becoming the nascent field of computer science.
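Shannon's quantification can be stated compactly. As a sketch of the now-standard textbook formula (not a quotation from this paper's sources), the average information, or entropy, of a source X emitting symbols x with probabilities p(x) is

$$H(X) = -\sum_{x} p(x)\,\log_{2} p(x) \quad \text{bits per symbol.}$$

A fair coin toss thus carries exactly one bit, a loaded coin less; once information could be counted in this way, it could also be stored, priced, and sold, a consequence taken up below.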
With a ubiquitous coding system, the mechanical fundaments of computer software could be modelled computationally, efficiently and potentially with unlimited, even self-replicating, capacities. But what then is this magical 'computation'? How is it modelled, and what does it do? Enter the Algorithm! A human artifact into which is locked a fool-proof mathematical formula that performs a task more perfectly and consistently than any human could. An algorithm is a sequenced set of instructions to achieve a specific result. Once coded and programmed, it is reliable and infallible in delivering these results. The term algorithm originates in the name of the 9th-century Persian-Arab mathematician who invented algebra, Muḥammad ibn Mūsā al-Khwārizmī, coming down to us through the Latinized Algorismus. We use algorithmic thinking all the time, following a recipe or going through steps to accomplish a task. But the theorems devised by programmers accomplish astonishing tasks, and computers execute them infinitely more quickly and better. Those who manipulate the symbolic vocabulary that codes programs are the artifact-makers of our digital age; computer programmers are its wizards.
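A worked miniature may help fix the idea. Below is a hedged sketch (my own illustration, not taken from the sources) of one of the oldest algorithms of all, Euclid's procedure for the greatest common divisor: a finite, unambiguous sequence of steps that reliably terminates with its result, exactly the 'recipe' quality described above:

```python
# Euclid's algorithm: a finite, unambiguous recipe that always
# terminates with the greatest common divisor of two positive integers.
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b  # replace the pair until the remainder is zero
    return a

print(gcd(1071, 462))  # prints 21
```

The machine adds nothing to the logic; it only executes these steps faster and more consistently than any human hand.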
The mythos that exploded around the advent of the computer not only fed the libertine drug culture of the sixties and seventies, with its euphoric promise of a freewheeling alternate cyberspace behind the screen, beyond reach of conventional norms; it also captured the imagination of creative minds. With their antennae into the culture, artists produced books, plays, images, and space-age movies, proliferating the illusion of a virtual 'other' world and enhancing possibility with fanciful fantasies. The psychedelic guru of this era was T. Leary, a hyperbolic cult-like figure spreading the word that virtual reality was better than LSD, that the cyber-era upon us was sidelining the real world. New and unforeseeable forms of human-machine symbiosis were envisioned, even encouraged, advocating for organisms and machines to merge, creating 'new forms of life' by entering into 'temporary unions.'
If the boundaries of the cybernetic field were fuzzy to begin with, all boundaries were dissolving in popular culture, where thinking machines were seen not just to be changing what we do but who we become. Analogies comparing the body to manufactured artifacts were not new, and Wiener himself had used the human nervous system to model the basics of cybernetics: input-output, feedback, central regulation, and so on. Yet constant slippage between the 'thinking machine' idea and a human-mind equivalence in mechanized functioning keeps creeping in. Brain and machine are said to overlap, the brain functioning much like a machine, the premise being that the structure of a machine or an organism determines what it may perform. Leaving out volition and purpose, as computer science evolves, these notions only become more engrained.
The cyber-prefix mushroomed into cyborg, cyberdelic, cyberpunk, cyberspace, and, more seriously, cyberwar. This was definitely not what Wiener had intended but what he most feared, and he said so. He foresaw that by ushering in more mechanization the 'second industrial revolution' could displace many workers, creating massive unemployment. And having witnessed what his fellow scientists had wreaked in creating the atomic bomb, he greatly feared and mistrusted human Promethean hubris. Gravely misconstrued as the 'inventor' of a machine-era takeover, he saw the real danger to society not in machines themselves but in what humans would make of them. He feared how future generations could misuse his creation and abhorred the idea of push-button warfare, anticipating the mischief that could be wrought by those in power to control populations and dominate them by means of machines. But moralizing speculations aside, once communication had become 'a means of control' and 'information' an 'entity,' and once T. Berners-Lee had written the code for the World Wide Web, there was no stopping what Wiener, Shannon, and von Neumann had let loose on humanity.
Wiener's 'message' stands pristine, isolated, cleared and coded by Shannon's brilliant theorem, cleaned of all the muddy murk of ambiguity or innuendo, of colorful tone in smirk of mockery or mirth. Lost are both context and meaning, all that makes human communication human and that defines a co-created semantic field. In their place is an inanimate interface, fielded by a screen, its coded core responding to every command, emitting perfect, unlimited information on where and what to buy, how to get anywhere, watch a movie, play a game, tell a story, or post a lie on social media: whatever, whenever you need it. And Shannon's great contribution meant that if information could be computed and quantified, it could also be commercialized as a commodity and monetized. Worse, it could be stolen, falsified, manipulated, distorted, decontextualized, weaponized, and sliced and spliced into unrecognizable fragments of falsehood.
With the rapid rise of the PC, the tensions between progress and trepidation that were there at the beginning only grew as the culture accommodated this pervasive new instrument with the miraculous mathematical genie inside it. On one side, the computer would open up a connected world, an equalized platform, a virtual space where all would be intertwined and all voices heard; it was the next frontier of limitless possibilities, freeing humanity from drudgery, leading us to extended minds, alternate selves, Artificial Intelligence. On the other, the computer was a devious master-distractor forcing us into its digital language, robbing our exchanges of human nuance, isolating and insulating us; it would weaken memory and impulse control, steal our time, control us, sending us along addictively to keep following its clever enticements, its 'virtual' space an unregulated alleyway ripe for mischief and deviance.
Vacillations between euphoria and dystopia have typified all major innovations throughout history. The introduction of new tools, except perhaps in farming, has always aroused suspicions that weigh what is gained against what is lost. Consider the shift from the socially engaging poetic, tonal rhetoric of the oral tradition to writing, privately created and consumed; from the lilting carriage to the puffing, smoky locomotive cutting noisily through the fields; from horses, warm, alive, physically connected, to automobiles, cold, fast, and loud; from artisanship to assembly lines. Yet each of these spread to the many what had been reserved for a privileged few, widening the orbits of knowledge, travel, and goods. Humans have always co-evolved with their tools. Our instruments empower us, as does new knowledge, feeding into the next generation's cognitive and socio-cultural changes. Yet the power of this new coding vehicle, its prodigious potential, pervasive presence, and the fear of enmeshed dependence it unleashed, is unprecedented. Is it the lure of the device itself, or the allure of the multiple uses it instantiates via a weird interior of wires and electrical nets, chips and perceptrons, operating a digital system guarded by a stern binary sentinel: on/off, 0/1, yes/no, with no shades of maybe, perhaps, what if, why not?
A mystique evolved around the algorithm itself. After all, here was this clean theorem, a humanly computed artifact, that made it all work: invisible, yet all-powerful, with god-like, omnipotent qualities. Was this genie divine or demonic? A force for good or evil? Descriptives like Cathedral and Fortress, Titans of technology, and the spell of a gospel of big data grew into a modern orthodoxy around the mysteries of computer programming, conjuring medieval alchemists mixing pungent vapors that either turn into gold or explode! Would this new coding system free us or entrap us, aid us or usurp us?
Biologists are deciphering the mysteries of the human body,…in particular of the brain and human feelings…computer scientists are giving us unprecedented data-processing power. When the biotech revolution merges with the infotech revolution, it will produce Big Data algorithms that can monitor and understand my feelings much better than I can, and then authority will probably shift from humans to computers.
Y. N. Harari, 2019, 49-50
The advent and rise of the algorithm ushered in a modern mathematical universe, a world running silently on a multitude of equations beholden to a "theology of big data," in Finn's (2017, 16) stark words, what Hayles (1991), thirty years ago, labelled a "regime of computation." A gigantic Leviathan lurks behind the unassuming algorithm. Would that the already massive commercialization of human attention were enough! But human ingenuity never rests; competition fuels creativity, and the acceleration of "progress" pushes on at great speed to the next quest. Consider: if a computed sequence of 'weighted' numbers could give us Google, just imagine what the ultimate encyclopedic learner run by a Master Algorithm could do! To inform us of the race for an all-knowing, supreme algorithm that learns from other algorithms, there is no one more committed and impassioned to remake our world than the award-winning computer scientist P. Domingos (2015).
So enormous is this ambition that it wants to capture a supraordinate level of transcendent knowledge in AI: "If we can design machines that are more intelligent than us, they should…be able to design machines that are more intelligent than them, and so on ad infinitum, leaving human intelligence far behind" (286), he writes, confidently conflating big-data learners with human intelligence while envisioning the beginning of machine procreation. Like several other minds in the computer-science literature, Domingos' imagination is liable to take flight! In his vivacious description of different approaches to the quest for the ultimate learner, he takes us to an ancient city with five gates, one for each of the Five Tribes: the Symbolists, Connectionists, Evolutionaries, Bayesians, and Analogizers, each of which models itself according to certain principles; the Symbolists endorse inverse deduction, the Connectionists backpropagation, the Evolutionaries genetic algorithms, the Bayesians probabilistic inference, and the Analogizers support vector machines (Domingos 291). Note that in nature all these components have probably been evolving together for millennia and must eventually come together as facets of one universal master learner. The story is worth recounting less for its heroic form than for how it illuminates the turgid tensions, the complex twists and hurdles, confronting a computer scientist's mind.
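For readers who want one tribe's principle in miniature, here is a hedged toy sketch (my own, not Domingos' code) of the Connectionists' simplest ancestor, the perceptron, which 'learns' a weighting from examples by nudging its weights after every error; backpropagation generalizes this idea to deep networks:

```python
# Toy perceptron, the Connectionists' simplest learner.
# It learns the logical AND function from four labeled examples.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, rate = [0.0, 0.0], 0.0, 0.1

for _ in range(20):  # a few passes over the data suffice here
    for (x1, x2), target in examples:
        predicted = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        error = target - predicted      # -1, 0, or +1
        w[0] += rate * error * x1       # nudge weights to reduce error
        w[1] += rate * error * x2
        b += rate * error

print(w, b)  # a learned weighting separating AND-true from AND-false
```

Nothing in the loop understands conjunction; it merely adjusts numbers until the errors stop, which is the modest mechanical core beneath the grand quest.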
In Domingos' tale the supreme learner is a continent, the five tribes its territories, and the Master Algorithm its capital city where they all meet. Constructed of three concentric circles, each bounded by a wall (much like ancient Constantinople), the outermost is the Optimization Tower, higher up the Citadel of Evaluation, and above them all the ruling Tower of Representation, issuing immutable laws regarding what can and cannot be done within the formal language through which learners express their models. Above its utmost tower flies a black and red flag with a five-pointed star. The arduous trail ascends steeply as we are led through intertwined streets and alleyways, past the Cathedral of Bayes' theorem, past the Squared Error and Posterior Probability gates, on through dense labyrinthine calculations and seemingly impassable checkpoints, even passing a statue of Aristotle. And still, tension rising, we strive computationally upward, arriving finally at the narrow Gate of Accuracy, at the door to the Tower of Support Vectors, and are asked for the password: "Kernel" is blurted out, and we are in the Tower of the Master Algorithm! With a spiral staircase at its center, this large pentagonal chamber has a door in each wall: we run excitedly through each of these doors, finding the Tower of Logic, then the Tower of Genetic Programs, and on to the Tower of Graphical Models, observing all the rules. But, Domingos continues, by then exhausted, we have fallen asleep, only to be awakened by a hydra-headed monster of complexity. Armed with the sword of learning, and vanquishing this last trial, we climb even higher, where a wedding is taking place, uniting Praedicatus, Lord of Logic, and the Princess of Probability, Empress of networks, Markovia. At this point the inscription on the five-pointed-star flag becomes a formidable equation, and Domingos (246) unveils his insight using Markov logic networks, thereby uniting logical and probabilistic models (239-244). The ultimate formula found, the goal is spelled out: to reach a point where "machine intelligence exceeds human intelligence" (286). Not surprisingly, Domingos' team called this learner 'Alchemy,' and he democratically tells us where and how to download it.
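The 'formidable equation' on the flag corresponds, in the standard published form of Markov logic networks (Richardson and Domingos), to a weighted marriage of logic and probability. As a sketch of that well-known formula, the probability of a possible world x is

$$P(X = x) \;=\; \frac{1}{Z}\,\exp\!\Big(\sum_{i} w_{i}\, n_{i}(x)\Big),$$

where each logical formula i carries a weight w_i, n_i(x) counts the formula's true groundings in world x, and Z normalizes the whole into a probability distribution: Praedicatus supplies the formulas, Markovia the weights.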
After several more pages of mathematical complexity, I confess, I fled this citadel through the Gate of Utter Despair, comforted only by the assurance that the humanities would be spared this computational agony, resurfacing alive and well, supported by everything that cannot be understood without human experience (278): our metaphors and poetry, our dreams and spontaneous, uncontrived human imagination, set free. From a distance, I look back at this citadel of the future bemused by a story by F. Riley from the 1950s, in which live judges are being replaced by mechanized 'Cyber-judges' for their unerring, fact-based precision rulings. One remaining live judge, who values emotions and empathy in human judgment, remains doubtful; the story ends with the mighty infallible machine crashing ignominiously when asked to calculate the "magnitude of dreams" (in Rid, 2016, 88).
Most features of human cognition cannot be quantified; they are part of human intellection, into which converge perception, attention, emotions, and meanings, accrued via layers of constantly renewed sensorimotor learning experiences, far beyond fixed calculations but, perhaps, not beyond the efficacy of codes in storing memories. As the seed from which representation and reference, semiosis proper, grow, code-form compresses information; it may thus be the form on which the evolution and development of mind depend. Freud's (1900) structural account of the dream, in fact, affirms that the 'core ideas' originating the dream are already there, 'in a ready-made structure,' before pictorial re-presentation or linguistic interpretation.
Barbieri's (2015) vision is of a biology in which organic codes are the artifacts of nature. The thrust behind his macro-evolutionary theory, originating in DNA, is that codes bring about absolute novelties. Algorithms are human artifacts, and they are changing us. We are being swept along by a rip-current of mathematical logic, of computability, of formulas that have become the rushing force of change in our daily lives, in the world we live in, and, most importantly, in the way we see this world. The empowering ideals of a free society, the autonomy valued by Emerson's 'self-reliance,' and even the Hellenic/psychoanalytic principle 'Know thyself' are being eroded by the certainty that A.I. machines can do us better, that we are being watched, known, pursued, and maybe, eventually, even conditioned, cured, and surpassed by inanimate, programmed computers. The extraordinary efficacy of these computer codes, and our drive to abstract and miniaturize our artifacts, tempts us to believe that the universe is indeed operating according to reducible mathematical principles and that the human brain, too, contains some form of bio-physical coding system, like the engram, that condenses and stores sensory input from its functional parts. Certainly, once linguistic re-presentation has set in, the 'word' sparks multiple levels of sensory associations, condensing and abstracting the references contained therein.
Just as computer scientists strive to imitate and surpass the human brain, neuroscientists are borrowing computer analogies to piece together the puzzle of memory. Gallistel (2020), a controversial, avant-garde neuroscientist, notes: "The search for the engram doesn't include the notion of a code. But the notion of a code is at the core of information theory and molecular biology" (April 20, 2020). Barbieri's code-poetic theory is "both rigorous and open," says neuroscientist Recchia-Luciani, in that it "corresponds to two dominant neuro-scientific hypotheses of cognition - the computational model, as in computers, and the connectionist model, as in neural networks" (personal communication, Dr. A. N. M. Recchia-Luciani, 2022). Parallels between algorithmic efficiency and cerebral abstractions are readily found, but the latter originate in biological beginnings and continue to issue from a lifelong sensory-motor matrix. In a book peppered with soaring fantasies and deep equations, it is Berlinski (2000) who most explicitly links the algorithm to DNA, the foundational code that brings about 'absolute novelties': he depicts molecular transcription and replication as conveyers of "secrets, not from one molecule to another, but from past into the future" (292). But then he errs drastically in paralleling hardware and software, the machine hosting its algorithm as "the human being…his mind" (xii). Cerebral-neural functioning is essential to mind, but mind is not its brain/matter; mind is created, composed, and cognized out of meanings, felt, semiotically represented, referenced linguistically, and constantly re-elaborated by an evolving, living, whole organism, moment to moment. Domingos (2015) goes even further: "Think of big data as an extension of your senses and learning algorithms as an extension of your brain" (277). Not only does he assert that we are all cyborgs already, but he anticipates that algorithms can 'take over' human sensory experience. Really? And what of the dream - the deep unconscious processes of meaningful felt experience? Sensory processing is as essential to human cognition as turning the switch on is to machine functioning. No algorithm will ever 'develop' as we do, from biological origins, with tight-knit sensori-motor-emotive cognitive faculties.
The cybersphere is embedded in orbits of abstraction: human abstraction, whether linguistic or numerical, is always articulated by a mind that has arrived at abstraction bottom-up, via a semiotic scale rooted in the body, achieving higher cognitive levels through its own developmental cognitive efforts, nothing that computation or a programmed machine can accomplish. Buried under the strata of abstraction, in humans, is the moist, fertile soil of the five senses. From their seeds grow experience and language, so that into the word flows a confluence of embodied signifiers, immediate and past, fitted to context, colored by purpose, delivered with inflections of tone, volume, and rhythm, innuendos implied or specific, in expressive verbal sequence. By contrast, the algorithmic word spurts quick and cold, clean and clear, but sense-less, aseptic, without context, mediation, or purpose, posing performatively as if in dialogue but actually an imposter with whom we engage in good faith. The computer bewitches us, sucks us in, urging us to merge with its disembodied, calculated form and enticements. This creates a tilted interface that tricks us into feeling that we are 'interacting' with intelligent machines. That may be fine for sensible adults who know when to switch off. But for the young, the new humans growing up relating to 'behind the screen,' the merger presents a hazardous trap.
* * * *
Freud first conceived of drives in instinctual, biological terms, serving survival needs. But later, deepening his study of human nature, he conceptualized two polarized high abstractions, Eros and Thanatos -- Love and Destructiveness. Each has derivatives and amalgams of both. In this study two such derivatives stand out: the drive 'to know' and the drive to birth a 'facsimile of ourselves,' an artifacted life. This Frankensteinian fantasy has reached its apotheosis. With the advent of computers that think and remember for us, A.I., and robots that accomplish tasks more precisely and consistently than we do, we are certainly at a disadvantage. In the final section I explore the consequences of living in a computer-mediated world.
We have modified our environment so radically that we must now modify ourselves in order to exist in this new environment.
N. Wiener, 1950, 46
The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly.
J. C. R. Licklider, 1960
A cellphone in every pocket, a PC on every desk: these are the tools of modern life. Open the slim silver box, flick the switch, and the screen will unlock unlimited possibilities: you can chat with friends, write an email, read the news, buy or sell anything, ask a question - Google will answer - play music or a game, read books, watch movies, find a flight, rent a room, plan a trip - book it, find a doctor - speak to one! A cornucopia of 'information' for all is what the gadgets of this age provide. Information is today's currency. What can be wrong? Nothing, if you are among the few who can detect mis- or disinformation and who can turn the gadgets off. But most cannot. The gadgets are addictive. The convenience of a multitasking handheld tool is too great a temptation, and, as always in human affairs, this instrument of instantaneous transmission can be misused. Mega-minded algorithm-programmers prey on addictive tendencies, targeting the young and naïve, the greedy and needy, the fame-seekers and disenfranchised, the angry and the lonely. Everyone wants to be 'seen,' 'heard,' 'plugged in.' Children cannot detach from the screens they see adults tethered to. The medium has usurped the message: behold human life, mediated by screens.
We have become a species of information mongers, knowing more and more about less and less, oblivious to the difference between information and knowledge, between digitalized data and human intelligence. Flooded by so much information, with no time to process it or relay it to long-term memory, it, like time itself, floats away, forgotten, consumed in hours of viewing. We lose time absorbed in 'screen time'; no time zone matters, any time is all time, anywhere, anytime! The shrunken globe defies temporal and geographical boundaries, wrapped in an "internet" providing a mind-boggling supply of information while severely constricting the plane of interaction to a flat screen. As Carr (2020) points out, our attention is grabbed only to be scattered (118): "We get the data but lose the meaning" (228). There are of course indisputable gains, like the immediacy of email communications, the benefits of Zoom meetings that would otherwise require expensive travel, the opportunities afforded by courses and learning from home, and instant answers to all manner of questions in the expansion of knowledge.
Countless books have been written by enthusiastic computer scientists and programmers extolling the great advantages of A.I., improving on human skills, freeing us from labor, painting pictures of soaring advancements just ahead. But countless others have been written, from as early as the 1990s, foreseeing dark clouds on a horizon beyond which lies an abyss toward which we humans, as we have known ourselves to be, are being driven through our own contrivance. The agitated tension behind these books moves between blind faith in the unlimited beneficial powers of computational technology and a genuine fear that the binary footprint, the way computational systems work, will come to dominate the ways we think and interact: hence Hayles's (1999) 'How We Became Posthuman' and Finn's (2017) ominous theological vision of big-data Titans "bringing in the gospel of computation" (16). The fear of introjecting computer-style communication is justified: we do internalize the functional forms of our tools. But the greater fear, present from the outset, was how humans might exploit its powerful spread. It is the dark side of the web that is dangerous: its misuse in spying, hacking, and fomenting falsities, its potential for seducing and conditioning.
In 'The Shallows' (2020), Carr provides ample neurocognitive research regarding the addictive impact of dependency on computers and its damaging effects on concentration, memory, and thought. To his credit, he starts from his own experience: his slackening focus, his shortened attention span darting from one thing to another, his waning mental acuity. Startled by the notion that "fiddling with a computer, a mere tool, could alter in any deep or lasting way what was going on inside my head" (38), he suspends all computer addictions and writes the book. But as he is finishing it, as though to underscore his own thesis, he is already sneaking back to his previous habits! In her classic tome, Hayles (1999) is more consistent. Her prescient view of computer-driven decline is dire and dystopian; like Wiener before her, she cautions that use of this machine could gradually override our most basic human traits and vital cognitive functions; if abused, it could overrun our better judgment. Coming from the previous century, her voice, like all such voices, has gone largely unheeded. Experiment after experiment has shown that the tools we use are mapped into our brains, becoming extensions of our hands, arms, and legs, imprinted in our minds; they become embodied. But never has there been a tool that supplements, extends, surpasses, and supplants our mental faculties. And if this is a problem for adults, it is a serious problem for the developing mind.
For the very young, still psychically barely differentiated, entanglement is a given; they will merge with their gadgets' images and sounds. Like the Pied Piper of Hamelin, the algorithm will lure and goad them on, to keep them staying on. For impressionable adolescents, absorbed by a second individuation phase, the risks of blurring boundaries between reality and virtual reality, between identifying-with and acting-in-imitation, are undeniable dangers. Designed to bring people together, social media paradoxically provided a means to hide behind a created 'persona' that may satisfy exhibitionistic impulses but further fragment a fledgling self, even more so for those chronically isolated, depressed, anxious, angry, or craving attention. Moreover, extended screen time detracts from live, face-to-face relating, from engaged slow process, and from dedication to the pursuit of anything that requires many hours (or years) of effortful practice. Mastery is achieved only by personal effort, nothing passive.
Because an invisible code makes things visible, it perpetuates fantasies of the genie in the box, an omnipotent puppeteer behind the screen commandeering a 'virtual' universe. Cartoons, video games, social media, chatbots, and other computer attractions stoke these fantasies at all levels of development, generating the mesmerizing effects of fairy-tales, religious ceremonies, cult rites, market-place gossip, exhibitionism, and addictive substances, all in one, gradually sliding into delusions of omnipotent control, since one can dominate a platform while also flicking it off with a dismissive click. Obsessive looking for calls or messages, feverish checking for "likes," and compulsive scanning for what's trending all consume time and attention, which become constantly divided and fragmented. Fewer are willing to sit quietly to reflect or read through a long book. Measuring what is gained against what is lost is difficult with an unbiased mind, but here is a try.
What is gained (computer/cellphone, etc.):
Constant contact.
What is lost:
Dangers:
We shape ourselves around the cultural reality of code, shoring up the façade of computation where it falls short and working feverishly to extend it to complete the edifice of the ubiquitous algorithm.
Finn, 2017, 190
What excited the first taste of the infinite possibility of 'other realities' in a computerized world has become a global reality in our time. Little thought was given then to the potential pitfalls of this 'adventure' when T. Leary (1992) pronounced, "The concept of cyberspace, creating realities on the other side of the computer screens, opens up a new and thrilling chapter in the human adventure." Drawing parallels between the altered reality of drug-induced psychedelic experiences and the imagined world behind computer screens, Leary's was the loudest voice of a counter-culture seeking anarchic freedom and universal "oneness," and seeing its promise in cyberspace. His brightest insight, however, in paralleling psychedelics with cyberspace, was that both needed to be accessed by 'codes' (Leary, 1984): drugs activated areas of the brain just as algorithms activated the screen. The communion of brain and machine was complete; they share access codes and addiction!
Artificial 'neural networks' learn skills by analyzing data; now they can even 'extrapolate.' The latest chatbots amaze users by explaining complex concepts in clear, concise, punctuated prose and by seemingly generating ideas from scratch. Computer scientists have created technology whereby humans cannot be sure whether they are chatting with a machine or a person. Tapping into these bots may feel like chatting with another person because they mimic conversation and appear smarter than they really are. But ask a complex question that requires subtle contextual assessment or deeper reasoning acumen and they reveal their imposter status! They seem to understand the situation, but they do not. They remain formulaic fake-responders, administering algorithmic linguistic recipes with formidable language skills unsupported by the reasoning capacities that distinguish fact from fiction, and without the background sensory experience needed even for basic common sense.
Early development unfolds in live, socializing interactions; care, feeding, and bathing are all accompanied by cooing chat, internalized and mimicked. The role of play in peekaboo and 'pretend' is crucial, as it establishes the distinction between what is real and what is 'make-believe.' But the distinction is still tenuous, and small children are distraught when confused as to which realm they are in. This distinction continues between 'primary process' and 'secondary process,' between play/fantasy and rule-based reality, between dreams and waking life. In the very young these are not yet solidly planted. The interpolation of a handheld gadget into this phase will have lasting impact, not only because it replaces real social interactions (just look at all those pram-pushers on their cellphones) but because it pulls away from verbal exchanges, isolating the child into disengaged proximity. Only through human interacting does a fledgling 'self' develop. Trained to occupy itself by distraction, the child learns to turn to the screen, behind which guaranteed, exciting entertainment lies at the flick of a switch. And this preference spreads: consider the difference between the aggressive excitement of video-gaming and the quiet innocence of building blocks, Legos, pick-up-sticks, puzzles, jacks, plasticine, coloring, or drawing, all requiring eye-hand coordination, focus, and manipulative dexterity. It is not only the ease and speed with which stimulation can be had: audio-visual entertainment overshadows careful, slow process, with its delays, setbacks, failures, and start-overs.
Of all that is lost, perhaps the most subtle yet consequential for early development, is the end of absence. Fraught as it is with emotional pitfalls, negotiating separation is the source-point for the nascent capacity to re-present, to erect an absent object in the mind’s eye. What is lost to the senses becomes mind; the same space that provides object constancy is the birth of sign and symbol. The whole developmental process of symbolization and reference is contingent on this cognitive step created by the space left by absence. Availability of constant contact in talk and image, the avoidance of absence, can only thwart or degrade this interpersonal process and its important sequelae in imaginative and creative play.
Moreover, young children relating through a screen do not realize that their interlocutor is a trickster, not an 'Other' at all. This imposter-interlocutor has been created and digitized into an algorithm by a 'programmer' who knows what children like but whose business mandate is to maximize time on the device. Those playing video games interact with a user interface that generates feedback. Again, the gaming partner is a fake; and while 'playing' may be sought to alleviate boredom or loneliness, to fritter away time, or to exert control over frustration, the level of stimulation distracts from other, more effortful pursuits. Sublimation, defined as the substitution of something attainable for an unattainable desire, is the channeling of this emotional longing into skills and adaptive activities of a higher order. An impulse derailed by the impossibility of its satisfaction is detoured into realizing something else, often creative. This cannot happen when there is no ungratifying, empty, silent, or idle time alone. Sublimation is lost where there is no privation. The overuse of technology and juvenile exploitation of social media, consumerism, the subliminal seduction of targeted advertisement, and the blurring of reality and fantasy in political theatre and the entertainment industries have taken their toll on societies at large.
The interpolation of any sign-vehicle into human exchange generates one more filter between us and nature and between each other. The current screen-driven, smartphone-abbreviated texting impoverishes vocabulary as it de-symbolizes language, breaking it up into fragments of concrete signals. And the image-heavy selfie trend included in exchanges encourages unmodulated exhibitionism. We live in an 'impulse-era,' a time of showing, posturing, and discharge, of image-bombardment and little deep reading, reflection, or thought. Visual information strikes instantly and more directly than any other sensory input. Drawing preceded language; pictured hieroglyphics came before the alphabet, when words began serving as symbols. 'An image is worth a thousand words,' whereas it takes longer to spin a sentence. This may explain the breathless speed and impatience with which contemporary discourse occurs. The loss of slow process, careful formulation, and diversely paced, face-to-face conversation reinforces a collective narcissism nourished by omnipotent control of the 'switch' that can, at will, turn you on or flick you off. There is nothing wrong with imagery; the problem is that it is replacing crafted cogitation in dysregulating ways. No wonder there is rampant degradation of form and regression of semiotic organization.
The gradual decline in careful linguistic articulation has led to an increase in unmediated primitive emotions. Raw acted-out affects eliminate the requisite space where semiotic-process generates thought, and hence reasoning dialogue. The public sphere has become prone to all manner of over-affect and/or affectless posturing-display; no nation is immune to this “primitivization” and resulting degradation of human behavior. Aggressive ideologically driven gatherings and social-media platforms provide a haven for feelings of ‘belonging,’ outlets for frustration, narcissistic exhibitionism, for the power-driven to appear as seductive omnipotent idols. There is no denying the potentially radicalizing effect of social media, manipulative videos and photos, misleading edits, fake news stories and deepfake images slanting reality. The fingerprint of misinformation and deceptive content is precisely its effort to appeal to crude emotions.
If Wiener stripped 'messages' of both interlocutor and content, and Shannon's theorem dealt a final blow by digitizing them, a psychoanalytic approach comes from the opposite direction. Communication between humans is mediated by emotional signals and semiotic means; it is polysemic, multilayered, occurring in specific semantic spheres, transmitted and received by people at different levels of psycho-cognitive organization. To interface with computers is to employ a tool that conditions us to adopt its form: in its disembodied interactive field, tilted and uneven, devoid of context, intent, or meaning, language comes at us via a digital code transformed into denatured type. The algorithmic word mimics the 'thing,' coded from its digital input, but cannot produce new symbols. The human verbal sign is embedded in a sensorium that has arrived at language via embodied experience that is ever replenishing symbols and meanings. Our interlocutor is a rendition, a facsimile of language, providing the illusion of dialogue, not the 'real thing.' How is an immature psyche to decipher the sense-less abstraction behind this exchange?
As the aspect of social interaction that anchors the self in an interpersonal matrix, language is an indispensable channel. The relation of the symbolic process to reality is that it links experiences in the environment to an inner pole of reference, and vice versa, like a bidirectional bridge connecting the inner world to people in outer reality. But symbolization is also the vehicle of thought, a tool of reflection, knowledge, understanding, reasoning, and personal integration, the wellspring of all re-presentation and the source of our versatile, sometimes sublime, means of expression. Representation began in the body, in the co-involvement of perception, sensation, emotion, and memory; only slowly do its yields become linguistically expressed. When interactions take place in a machine-made 'virtual' reality, what anchorage can a fantasy world mediated by screens provide? Only hunger for more digitally generated fantasies, like the 'Metaverse,' a space where singers' voices are spun up out of pixels into fantastical creatures over a psychedelic background projected onto giant screens in an all-virtual universe. Weirder becomes the next best thing.
The machine lacks judgment and does not grasp meanings. And here's the rub: bots, A.I., algorithmic intelligence cannot deal with concepts to which they have not been exposed before; they cannot generate symbols or thought as in human cognition, because human thought is embodied even when abstract, rooted in a sensory-motor core, embedded in context, intent, and semantic sphere. Human cognition incorporates knowledge that is alive, situational, derived from a sensorium nourished by motive and emotion. A computer may mince words to mimic a poem or a metaphor - computers already do - but an algorithm will never dream a real dream erupting unconsciously from fragmentary impressions, distant memories, layers of interpersonal lived experiences that created turmoil, joy, anguish, elation, and have deep emotional meaning for the dreamer.
The symbolic function is the sine qua non of our uniquely human adaptation, contingent on making tools, for sure, but even more on conceiving of them through the mind's medium. For only a mind that can draw concepts from experience can give them meaning in contemplation; and only a mind that can contemplate is able to formulate meanings via a primary-process pictorial idiom and articulate its understanding through language.
* * * *
In 2022, scientists determined that we are in a new evolutionary era, the "Anthropocene," the age of humanity. Yet we are becoming so removed from the natural world and our own nature that we risk losing that humanity altogether. We live in a world order run by computed algorithmic abstractions, conforming to their formulaic codes, internalizing their disembodied space that invites us to externalize many of our mental functions. The introduction of an innocent code has, once again, altered the course of human evolution, manifesting Barbieri's macro-evolutionary theory, which claims that broad evolutionary shifts are marked by the advent of new codes: to the molecular, neural, and cultural codes we can now add the computational - a world created by the algorithm.