I think it’s time for a more thorough statement of perspective.
Remember Marshall McLuhan? Media theorist at the University of Toronto? Quondam patron saint of Wired magazine? Foresaw electronic media acting as a vector for tribalism? That guy. He’s been out of vogue ever since the novelty of equating the internet with his prophesied “global village” wore off, but his vatic probes are no less prescient or compelling than they were at the peak of his influence in the 1960s. His contribution was the maxim that the medium is the message—that media content counts for less than format.
The concept of “outering” constitutes the heart and lungs of McLuhan’s body of work. Every tool, every form of communication, every technological contrivance we integrate into our everyday lives is to be understood as an externalized (outered) human organ or capacity which confronts us as a kind of amputated prosthetic. The wheel, for instance, is a foot; the cart mounted on a pair of wheels is a stooped back on which a load is carried.
Simple enough.
Clothing, then, is an outering of the skin; housing, at its most basic, is an exterior skin or a garment of sufficient size to be “worn” by a group. Michael Pollan has written about fire as an outered stomach, doing much of the work of digestion before our victuals even enter our mouths. The author of the Book of Job recognized the sword as not only an outered tooth or nail (our god-given equipment for piercing and tearing flesh), but an outering of violent intent itself, and used it as an eminently legible metaphor for both capital punishment and war.
And so on.
A second step follows outering: interiorization. This is the naturalization of the artificial, the adaptation of our native senses to an expatriated counterpart. On the individual level, we can be said to have interiorized a particular technology when its use becomes second nature—as when, for instance, we feel uneasy about accepting a useful service done by a stranger without the exchange of money. On the level of society, interiorization is achieved when our way of life can’t be imagined without the specified technology. The modern city takes air conditioning as a given, and it seldom occurs to us that our apartment or office on the twenty-ninth floor would be uninhabitable if the HVAC system stopped working and wasn’t repaired.
The mythology of the Sumerians held that Enki, the god of wisdom, founded the cities of Mesopotamia and bestowed upon their people the gifts of civilization: agriculture, architecture, husbandry, handicraft, government, music, etc. A related story of Enki and the love goddess Inanna (also called Ishtar) implies that these gifts came bundled with abstractions such as truth, falsehood, victory, and enmity, inextricably splicing bronze-age technology with a set of intellectual constructs which we might deem essentially human. On some level, it seems the peoples of ancient Mesopotamia intuited that once a technology has been interiorized, we become as aliens or animals to ourselves in its absence. (Tellingly, references to the Gutian people on Akkadian tablets compare the city-state’s nomadic, illiterate foes to monkeys, dogs, and snakes.)
This is all obviously dialectical. From digging irrigation canals to laying a fiber-optic network, technologization is the process of incrementally transforming the environment to facilitate human ends at ever broader scales and greater intensities. Any environment changed in this way invariably makes the people within unlike those who effectuated the transformation, entailing logical but almost always unforeseen changes to the ingrained social practices that produced the people that devised the change to begin with.
McLuhan’s insistence that such change consists of populating the world with actual “autoamputated” parts of ourselves may come across as a bizarre exaggeration, but it serves a useful purpose by underscoring the intimacy of our relationship with technology. It might be easy to scoff at the idea that your car is a part of yourself—but if it’s ever been wrecked or stolen, you’ll remember feeling as though your legs were swept out from beneath you as you reckoned with your pitiful new relationship to distance and time.
By the twentieth century it was hardly controversial to claim that a technological paradigm determines social structures, but McLuhan’s work emphasized the extent to which our tools and media get under our skin and into our heads. That Kubrick and Clarke were familiar with McLuhan and took his ideas to heart should be clear to anyone who’s watched the first act of 2001: A Space Odyssey. When one of the cringing, scrounging man-apes realizes that he can fell tapirs and bash his rival’s brains out with the femur clutched in his paw, the result of the addition isn’t simply “man-ape + bone club.” In terms of his behavior and capabilities, the man-ape is transformed into a completely different animal. McLuhan maintains that the same can be said for “Homo sapiens + electric lighting,” “Homo sapiens + jet planes,” and so on.
As we are uniquely verbal creatures, the outering of speech and memory via the written word was a sui generis inflection point in the life of our species. It would be fair to say that writing was the first technological extension of a truly human capacity as opposed to an animal attribute. A culture that has gone through the process of interiorizing written language fundamentally restructures not only how it goes about the business of living, but how it views the world. When expressions of thought become stable objects in the world, the character of our unexternalized cognitions can’t remain unchanged.
Though it appears as a brief tangent in the Phaedrus, Plato’s warning about written communication has eclipsed the rest of the dialogue in the popular consciousness. The points on which he advises caution (the philosopher does not flat-out reject literacy) are well-known. When words can be fixed, retrievable objects as well as evanescent events, the mnemonic economy of a primary oral society collapses from want of necessity. A text can’t answer questions, defend itself against challenges, or be asked to “say” anything other than precisely what it already does; its ambiguities must remain ambiguous, and its errors must always err. Moreover, the text doesn’t permit the reader to process new information and facilitate retention through conversation with its “speaker.”
By the fifth century BCE, literacy was fast displacing oral practices among the Athenian elite. Plato, like us, lived during a sociotechnological sea change, and was apparently unaware of how profoundly it affected him. McLuhan’s student Walter Ong points out in his opus Orality and Literacy that despite Plato’s championing of oral dialectic and his ambivalence toward reading, his entire epistemology was an unconscious “reaction, or overreaction, of the literate person to lingering, retardant orality.” Analytic thought such as Plato’s requires the shaping of one’s mental habits in the mold of literacy. Moreover, the unmooring of knowledge from situational, concrete frames of reference (such as those a primary oral culture relies on) promotes intimations of a sphere or realm of abstract, immutable concepts contraposed to the labile, ambiguous stuff of the lifeworld.
Once interiorized, any technology of communication rearranges our cognitive and social habits around the sensory faculties it engages. Typography, the intensification and acceleration of chirography, demands an intense visual focus on the part of the reader. The streamlined parsimony of print promotes fast reading, which promotes silent reading. Engaging with a book or newspaper becomes an introverted affair, something one does in private, even in the company of others. A poem or novel has no recourse to any of the intonations, facial expressions, or body language on which we rely to gauge a speaker’s feelings, and which often help us to catch the meaning of what they’re saying. To compensate, effective communication via the written or printed alphabet requires a much higher degree of verbal precision than spoken language, and greater attention to consistency and structure. Writing and reading stimulate rationality.
“‘Rational,’ of course, has for the west long meant ‘uniform and continuous and sequential,’” McLuhan writes. “In other words, we have confused reason with literacy, and rationalism with a single technology.”
To my mind, no figure of the print epoch personally exemplifies this so well as Kant, who was prone to mistaking the cognitive biases inculcated by print for immutable structures of human understanding. He disdained passion as the “pathological” snorting of the animal substratum on which our rational faculties stand, and didn’t place much value on color in visual art or excitement in music. The notion of genius he popularized in the third Critique celebrates the imaginative creator of what we’re lately calling “content” (Kant preferred content of the cerebral variety, not of the sort that arouses vulgar or libidinal rowdiness), and doesn’t extend to the chef, the carpenter, the dancer, the athlete, or anyone whose aptitudes lie beyond the scope of the liberal arts.
This is all to be expected from the subject of a culture and an intellectual tradition that had crystallized around a medium of mass communication that is fundamentally impersonal, anesthetic, individuating, and abstracting. A people whose intercourse relies primarily on the spoken word tends to be emotive, reactive, and group-oriented; one that has internalized and grown reliant on print (giving itself “an eye for an ear,” as McLuhan often puts it) can be expected to be reserved, detached, and egoistic. Moreover, interiorizing any technologically mediated form of communication disposes us to conflate the quiddities of humanity with the particulars of the format. The metonymous association of a person with a fixed point of view is only really obvious to a literate culture, as is the perception of a personality or mind which resides in a text and communes with us during the act of reading.
The latter phenomenon is an instance of what McLuhan calls the Narcissus trance: being “hypnotized by the amputation and extension of [one’s] own being in a new technical form.” An explanation:
The Greek myth of Narcissus is directly concerned with a fact of human experience, as the word Narcissus indicates. It is from the Greek word narcosis, or numbness. The youth Narcissus mistook his own reflection in the water for another person. This extension of himself by mirror numbed his perceptions until he became the servomechanism of his own extended or repeated image. The nymph Echo tried to win his love with fragments of his own speech, but in vain. He was numb. He had adapted to his extension of himself and had become a closed system.
Plato had some intuition of this. By expressing concern that the reader might come to the mistaken conclusion that he can learn just as well from a book as from conversational dialectic, he assumes an inclination to anthropomorphize the text, regarding it as something like a person in itself, or otherwise as a substitute for a person. (The truth is that when we respond to a text, we’re responding to our own verbal behavior, which the text elicits in us.)
The interiorization of writing produces a blind spot in which the medium can perform a fascinating feat of prestidigitation. We’re prone to experiencing a peculiar sense of familiarity—even intimacy—with the remote (and often dead) person to whom a tract of words is attributed, even though we’ve never heard their voice, observed their body language, shaken their hand, hugged them, fought with them, laughed with them, caught them during an unguarded moment, made small talk with them about the weather, or anything else of which coming to know a person as a person truly consists.
During the print epoch, the literate public digested information privately and asynchronously. Even when describing “current events,” a newspaper or magazine must necessarily do so at a considerable temporal remove, and is incapable of sensuously reproducing what it describes. As we unconsciously mine our history of experience to give imaginary faces to people we’ve never met, guess at the timbre of quoted speech, and visualize what a distant city or foreign country might look like, print matter directs us to look inward.
For McLuhan, the sensate modality of electronic media represented the outering of the human nervous system in “a huge collective surgery carried out on the social body without regard for antiseptics.” As opposed to the anesthesia of print, the substance of radio, film, and television is all out there in the world, activating our senses and demanding our involvement.
Through radio, the united ears of nations were placed before the mouths of Roosevelt, Hitler, Detective Friday, and Elvis. The silver screen transfixed theatergoers with a visual inventory and a goading advertisement for The American Way Of Life. A massive television audience simultaneously experienced the Kennedy assassination, the moon landing, and the disgraced Nixon’s resignation as real events (whereas in the nineteenth century, newspaper readers would have received them as textual addenda to their abstract conceptions of the world). The term “para-social interaction” was inevitably coined in 1956 as millions watched, listened to, gossiped about, and entertained personal fantasies of entities whom they felt they knew exceedingly well, despite never being near enough to any of them to breathe an effectual word. (Recall that the mythical Narcissus, believing somebody else’s face looked back at him from the water’s surface, was unaware that he sat alone on the river’s edge.)
The long-term consequences of America’s impetuous reformatting of its life-patterns around television, the medium it begat on the world, are manifold. It will suffice to say here that McLuhan wasn’t at all surprised when the unrest of the late 1960s neatly corresponded with the original TV generation’s coming of age. Twenty years later, his pal Ong described television culture as one of secondary orality, with “striking resemblances to the old [primary orality] in its participatory mystique, its fostering of a communal sense, its concentration on the present moment, and even its use of formulas.”
Though McLuhan recognized the “retribalizing” effect of electronic media, he failed to foresee the paradox which Putnam identified in Bowling Alone: for all the reactivity, depth involvement, and communal awareness which twentieth-century media elicited in viewers and listeners, a marked decline in social participation followed television’s ascendancy. If watching TV stoked viewers’ desire to contribute and commit to society, as McLuhan claimed, then evidently the urge could be satisfied (or at least mitigated) by remaining sedentary and staying tuned. In the Narcissus trance, passive viewing (or the holding of a private opinion) simulates active participation—as anyone old enough to have been riveted to CNN’s round-the-clock coverage of the Gulf War can attest. People closer to my own age might remember watching the Daily Show’s live coverage of the 2008 presidential election and experiencing an intoxicating sense of collective effervescence, even if they sat in front of the TV by themselves.
To be fair to McLuhan, he worked out his theories about television from a poor vantage point. His landmark work Understanding Media was published in 1964; in assaying to make pronouncements about the long-term impacts of TV, he stood at more or less the same position as any futurist daring to make granular predictions of the internet’s future back in 1999. Long passages of his writings on the subject are wanton exercises in Delphic bullshitting. Some of it is irremediably kooky and short-sighted. (More on this in a later post.) A lot of it was borne out in its outlines, but not in its particulars. And occasionally, a stray aphorism or cryptic oracle gives the impression that its meaning required a period of incubation before becoming true.
With TV, the viewer is the screen.
Obviously McLuhan tossed off this remark much too soon. Its veracity awaited the implosion of all media into a unitary format, and the transition from a one-way mass communication apparatus toward a network of bidirectional channels in which the distinct roles of writer and reader, speaker and listener, performer and viewer, producer and consumer are collapsed into that of the singular user.
McLuhan might say that the internet is to twentieth-century electronic media what typography was to alphabetic chirography: an intensification. The screen and the speaker were already old tech when the dial-up modem became an increasingly common household appliance, and information had been traveling at electromagnetic speeds since the telegraph came into use in the mid-nineteenth century. If modern telecommunication networks and the media they enable are to be thought of as a collective nervous system, externalized, the development of the internet represented the enlargement and ramification of the dendritic cords linking our awareness to peoples and parts remote, and the expansion of their functions beyond the mere transmission of a programmatic selection of sounds and images.
In a culture of advanced literacy, the reader is apt to experience a sense of intellectual connection to the author while perusing a text. The prominence and innocuousness of terms like “online spaces” in the casual discourse of digital culture indicates the effectiveness with which its technology promotes an aura of actual human presence. Proximity presupposes place, and the visual, tactile, and tantalizingly responsive aspects of our devices and online platforms conjure illusions of both.
The consoles on our desks and the “phones” in our pockets (it seems strange to call them that when we so seldom use them as telephones these days) are friends, coworkers, pen pals, telephone operators, bank tellers, shop clerks, grocery boys, advice columnists, matchmakers, meteorologists, newsboys, company recruiters, navigators, stock brokers, disc jockeys, prostitutes . . . Once the networked computer has subsumed this concatenation of human functions, transactions between the user and the provider of a service are analogous to motor neuronal signals from one’s brain to one’s muscles. If losing one’s car is like losing one’s legs, to have a broken phone in the twenty-first century is to be stricken blind, deaf, and dumb, and placed in solitary confinement.
As far as the contemporary firm and government are concerned, the consumer-citizen exists as the aggregate of the data he or she has generated. Many of us have begun to see ourselves the same way, implicitly regarding our devices as the seats of our identity. It’s not uncommon for us to believe we’re enacting our “real” selves while playing online video games, typing away in one or more Discord servers, punching out posts and replies on Reddit, staging glamor shots for Instagram, recording hyperactive polemics for TikTok, or conscientiously selecting a new PFP, updating our bio blurb, and choosing a new pinned tweet. Time apart from our devices elapses like the hours the creative artist spends at a day job—an interval of tedium and dissociation to be borne out before we can resume our “authentic” behavior. Correspondingly, we perceive the users with whom we most frequently engage through digital channels as our most sympathetic and reliable friends. An “IRL” acquaintance whom we don’t track on social media or exchange direct messages with is a person kept at arm’s length.
Back in March, New York Magazine ran a piece by Elizabeth Weil called “You Are Not A Parrot”—a riff on Silicon Valley apostate Jaron Lanier’s You Are Not a Gadget—which contained an excerpt of a talk she had with former Google AI researcher Blake Lemoine. Perhaps the name rings a bell: in the summer of 2022, Lemoine caused a stir when he went off-script and declared that the company’s LaMDA chatbot had achieved sentience, possessed feelings, and ought to have its rights respected. (He was fired shortly afterward.)
Weil recounts how Lemoine steered the conversation towards sex robots, and asked her to imagine one that has ChatGPT installed and can simulate human communication. Lemoine asked: if the robot tells the user no, and the user goes ahead and “uses” it anyway, would that constitute rape?
To my astonishment, Lemoine went on to make a valuable point:
Whether these things actually are people or not—I happen to think they are; I don’t think I can convince the people who don’t think they are—the whole point is you can’t tell the difference. So we are going to be habituating people to treat things that seem like people as if they’re not.
Forget the sexbot thought experiment, and forget the future tense. Contra Lanier, we are gadgets—or at least getting comfortable with being treated like them.
Our Narcissus-like attribution of humanity to the images, text, and attributions that populate our screens (and disappear when they’re shut off) conditions our expectations of the people we deal with out in “meatspace.” Online and offline, we prefer our circles of acquaintance to be safe, low-maintenance, and responsive. We wish for their approval, certainly, but we prefer friends who come and go at our pleasure like faces on TikTok, who challenge us as little as the other participants on our favorite political subreddit, who ask us to inconvenience ourselves on their behalf as seldom as do our Twitter followers, and don’t involve us in any messy affairs we can’t mute once we’ve had enough. We like when the guy or gal we met on Tinder responds to our texts with the alacrity of an Alexa, and then quietly takes the hint when we ghost them, closing out of our fling like a browser tab whose contents no longer interest us. We despise being put upon to perform “emotional labor,” and resent the imposition of an unexpected phone call or visit. At the same time, we also want to be assured that our coterie’s serviceability is authentic. We’re all of us empaths nowadays; it hurts us to imagine that someone might be grudgingly humoring us, and we hope to believe they’re as genuinely glad to be at our disposal as Replika mindlessly claims to be.
The ideal subject of digital culture is shaping up to be Lasch’s “happy hooker” with a slightly different shade of meaning. We are users, and we like people who like us using them.
Behind every active social media account is a person craving approval—not from people qua people, but from the device itself, standing in for a nebulous and detached audience. Even when antagonizing a stranger on Twitter, the savvy user goes about it like someone playing an old arcade game and performing a risky maneuver for the payoff of a higher score, quantified by views, replies, shares, and of course, Likes. The Like button, the ultimate abstraction of interpersonal behavior, reduces the warm smile, the fit of laughter, the expression of gratitude, the remark of approval, the clap on the shoulder, etc., all to an intangible token to be collected and tallied. Their accumulation does in rare cases launch a career in entertainment and marketing—but more often it forms the basis on which the amateur user gauges his or her personal worth at a given hour. (Gen Z and younger Millennials have ample cause for their famously raw nerves, but should we be surprised that a cohort whose members went through puberty with their “points” displayed on a public scoreboard suffer from social anxiety disorder at apparently epidemic rates?)
But, truth be told, we’re awfully fond of users who are so incentivized. Our feeds are at their most entertaining when they’re stacked with content submitted by people competing and exerting themselves to surprise us, amuse us, tickle our emotions, and make us fond of them (but always at a prophylactic distance). And we admittedly feel best about ourselves when the device generously rewards our own public performances and meticulous disclosures.
Getting accustomed to “really” communicating with and knowing people through digital channels primes us to perceive glimmers of humanity in “smart” chatbots designed to mimic a user replying to our direct messages—and to correspondingly espy something of the mechanical in humanity. Not too long ago, the Economist’s sister magazine 1843 published a timely profile of a private AI developer by the name of Stephen Thaler, who says of his brainchild, DABUS: “It’s a machine.” Of himself, he says: “I’m a machine.”
Descartes approached the brink of a similar conclusion as he pondered the nature of humanity in a neatly deterministic (read: marked by discrete, sequential causal connections of the sort typographic media trains one to perceive) Newtonian universe, at a time when stocking one’s pleasure garden with automata resembling animals was in vogue. Descartes found little cause to differentiate real beasts from their clockwork simulacra: both were automata, but the “originals” moved about in the world by virtue of their musculature and the efficient causal operations of instinct. Homo sapiens, of course, was a different case. For Descartes, the unaccountable phenomena of consciousness and self-awareness (the “substance” of which purportedly intersects with the material body in the pineal gland) excludes humanity from consideration as mere meaty robots.
There’s a faint but distinct Cartesian echo in Lemoine’s and Thaler’s pronouncements regarding LLM bots. The proposition that LaMDA and DABUS possess [Homo] sapience cannot make sense unless the fundamental determination of humanity is the capacity to generate intelligible text (or speech), answer queries, crunch numbers, identify relations in data, and maybe turn out visual art or electronic music on request. This assumes that humanity is something that exists independently of, well, humans—living bodies that do things, that open their mouths to speak to proximate ears, that are possessed of particular textures, scents, and tastes, that spit and fart and cry and sometimes laugh themselves breathless, that cluster together for warmth and barbecue in the summer and push each other into swimming pools and apply bandages to each other’s skinned knees.
A number of Silicon Valley intellectuals, comfortable and well-entertained futurists, and climate apocalypticists desperate to imagine a way out have been espousing a strain of techno-Gnosticism as of late. Like the heretical Sethians, whose metaphysical cocktail of Neoplatonism and Judeo-Christian myth guided them towards a belief in the fundamental falseness of the physical world, the Singularity enthusiasts ascribe total primacy to the elusive and apparently immaterial thinking self, and maintain that the engrammatic patterns to which this self is eminently reducible had the cosmic misfortune of being etched into and dependent upon tawdry biological matter. (The Sethians were, of course, a People of the Book.) They regard this not as an essential fact of the situation, but as a problem awaiting a practical solution.
Arguments on behalf of this line of thought typically take the undesirability of “embodiment” as a given. What, you enjoy sweating and having to poop? Don’t know about you, but I could certainly do without itching, coughing, eating, and blowing my nose. Are you telling me you wouldn’t stay up all night playing games, working on music, and hanging out on Discord if you didn’t have to sleep?
To appreciate the extent of the estrangement from ourselves these sentiments express, consider any number of ancient creation myths in which the bodily functions we’d rather be rid of are the basis of divine acts of creation. The Sumerian creation narrative involves an episode in which the god Enlil ejaculated the Tigris and Euphrates onto the Mesopotamian plain. Egyptian myth recounts the spontaneous formation of men and women in the spot where the tears cried by the primordial deity Atum splashed upon the earth. The Greeks attributed the celestial band of the Milky Way to a lactating Hera. An episode of Japanese myth describes the kami of water greens and clay being born of the death goddess Izanami’s urine and feces. Such fables of the ineffable now strike us as primitive and gross. When the nature of divinity is extrapolated from the nature of humanity, and our understanding of humanity is informed by the abstract artifacts in which we believe we most clearly see ourselves reflected, only that which is incorporeal is clean enough to be sacred.
The popular way of describing the next step forward is the digitization of consciousness: trading the flesh, along with its limitations and secretions, for a bodiless but apperceptive residence in the network. What we’re really talking about, however, is shaving away every part of ourselves extraneous to being users. We’re elated to imagine ourselves becoming self-aware programs perpetually contemplating, merging with, and moving information about, being always and completely in the channel with Gawr Gura, engineering worlds in Minecraft, kibitzing on Reddit, and communing in ghostly ecstasy 24/7 with the wave-structure personalities of philosophers, comedians, pop stars, and OnlyFans constructs. Unmediated meatspace is dull and lonely and uncomfortable; why not have done with it?
Listening to the Singularity geeks, it appears as though the processes of outering and interiorization may be on the verge of meeting each other in the middle. When we’ve exteriorized and reprocessed so much of ourselves that what remains seems alien, oppressive, and inadequate compared to the seductive Narcissus image smirking at us in all the outered bits, there’s not much else to do but wish for the consummation of the process via the amputation of Homo sapiens. The death drive finally and irrefutably becomes the engine of human progress as we seek a collective return to the safety of the inorganic. The bodiless, heartless hikikomori as apotheosis; Hartmann’s cosmic suicide realized at last. Narcissus plunging into his reflection and drowning.
All of this speaks to how pathetic the digital revolution turned out to be, in spite of its early proponents’ emancipatory promises. In the span of just two decades, digital culture has necrotized into Adorno’s living nightmare: a world in which all activity either exists as a form of alienated labor—paid during the workday, unpaid during our “leisure” hours—or is otherwise felt to be disjointed and unavailing. (Why even go hiking if you can’t get anything Instagrammable out of the trip? Why take the trouble of making your own vegan yogurt at home if nobody engages with your content documenting it?) Life through and with the device is so nervous, anomic, and atavistic that “touch grass” has become an admonition to unplug and tend to one’s wracked nerves. Since opting out of participation simply isn’t feasible for most of us, the hope that we’ll live to see the day when we can slough off everything of ourselves which our unaided suite of apps and subscriptions can’t satisfy expresses a perverse optimism.
The fact that social media, video games, sexy chatbots, YouTubers, and chat rooms do assuage human needs is what makes the total scheme of their use so insidious. In offering us the digital platform as an ersatz playground, pub, church, agora, clubhouse, and brothel, transnational capital promises to restore or reactivate those aspects of ourselves we feel have gone missing or been allowed to atrophy by dint of living under the material conditions it has done all it can to promote. In return, we’re implicitly asked to accept that no other way of life is possible and that there’s no way forward but the one being paved by the free enterprise of depraved technocrats and self-serving oligarchs (who no doubt plan on being first in line for deathbed transfigurations into immortal digital entities, should such a profane thing ever be technically feasible). The patterns of use which our devices are designed to ingrain in us see to that.
McLuhan was no Marxist, but he knew whose hand was on the tiller of technological progress, and which parties stood to benefit most from the reshaping of human life and consciousness to their specifications. If the anarcho-capitalist Silicon Valley milieu once adored him as its favorite salesman-prophet, it was only because he habitually uttered his warnings more quietly than his millenarian auguries:
Electric technology is directly related to our central nervous systems, so it is ridiculous to ask “what the public wants” played over its own nerves … Once we have surrendered our senses and nervous systems to the private manipulation of those who would try to benefit from taking a lease on our eyes and ears and nerves, we really don’t have any rights left. Leasing our eyes and ears and nerves to commercial interests is like handing over the common speech to a private corporation, or like giving the earth’s atmosphere to a company as a monopoly … As long as we adopt the Narcissus attitude of regarding the extensions of our bodies as really out there and really independent of us, we will meet all technological challenges with the same sort of banana-skin pirouette and collapse.
So there you have it: the banana-skin pirouette. That’s where we’re at now.
And this has been a digest of the stuff I’d like to examine here over the coming weeks, months—however long. I’ll be trying to adhere to a weekly update schedule (and working on other stuff in the meantime), so most posts won’t be this long. But we’ll see what happens.