I feel that when we talk about the digital revolution, we can be prone to missing the .png for the pixels. And lo, we recollect, there was made an internet, and all the people did go on it and betook themselves to posting and scrolling through content…
This skips over the fact that the digital revolution was digitization per se. The web’s rise as a popular pastime and utility depended upon the reciprocal process of liquidating and consolidating all media into a homogeneous monad. Music became digital. Photography became digital. Art became digital. Television and film became digital. Books and periodicals became digital. Ultimately the public sphere became digital, and social identity became digital too. The result is perhaps most evident in the semantic change of the word content.
Lately I’ve been reskimming Jaron Lanier’s 2010 manifesto You Are Not a Gadget, and dwelt for a while on his complaint about the internet generation’s predilection for “retro, retro, retro,” miring digital culture in mashups, remixes, and nostalgic scavenging. While the observation was wholly accurate (on the whole, I’m still astounded by Lanier’s insight and prescience), he evidently forgot that the printing press was first employed in the reproduction of preexisting manuscripts, and that some of the most acclaimed programs of television’s early years were adaptations of radio shows. Given the cultural and technological environments in which the web came into use, it was to be expected that the creators and consumers of online content would be powerfully predisposed towards the repackaging, rearranging, and reexamination of what they already knew—and what most of them already knew best were television, movies, recorded music, video games, comic books, etc.
I’d have thought that a Silicon Valley oldhead like Lanier would have read and remembered the first chapter of Understanding Media:
The instance of the electric light may prove illuminating in this connection. The electric light is pure information. It is a medium without a message, as it were, unless it is used to spell out some verbal ad or name. This fact, characteristic of all media, means that the “content” of any medium is always another medium. The content of writing is speech, just as the written word is the content of print, and print is the content of the telegraph…
Let us return to the electric light. When the light is being used for brain surgery or night baseball is a matter of indifference. It could be argued that these activities are in some way the “content” of the electric light, since they could not exist without the electric light. This fact merely underlines the point that “the medium is the message” because it is the medium that shapes and controls the scale and form of human association and action. The content or uses of such media are as diverse as they are ineffectual in shaping the form of human association. Indeed, it is only too typical that the “content” of any medium blinds us to the character of the medium.
The fact is that the technology of the internet could not but impel the supposedly linear course of cultural progress to trace out epicycles, or even towards retrograde motion. There can be no explanation of our pop-cultural malaise in which the digital revolution and its consequences upon “human association and action” don’t play an instrumental role, and the retro explosion that irritated Lanier in 2010 was an early symptom of a bug that had yet to fully disclose itself. As McLuhan said, we were transfixed by the content when we might have more fruitfully attended to the format if we wanted to anticipate what was coming.
To Lanier’s credit, he discerns the problem (and indeed the character of the medium) in his remarks on flat information structures. While he’s a bit vague on what he means by “flat” (gosh, maybe he is a good McLuhanite after all), the upshot is that each quantum of digital cultural expression “is created using the same resources as every other one.” Content is content is content is content is content—nothing will come of nothing and content will come of content; nothing is not nothing unless it is content. Culture is content and people are content. Hail content, full of content, content is with thee…
Ahem. This is as good an excuse as any for another reminiscence about the oldweb from an oldman. I hope you’ll please indulge me.

If you were a Xennial or an older Millennial who played a lot of video games in elementary school, by the time you got to middle school you didn’t have many people to talk to about your hobby. Sure, you had your little circle of dorky friends who’d pore over the latest issue of GamePro in your social quarantine zone at the far end of a cafeteria table, but by that point most of the boys your age were more interested in sports, skateboarding, music, and girls. If any of the smart and/or popular kids were into video games, they generally didn’t talk about it because video games weren’t cool. Video games were a solitary hobby for solitary tweens and teens.1
For the most part, the mainstream media wasn’t interested in games, either. TV stations were happy to run ads for new releases, but that was about it. MTV never ran any half-hour “specials” promoting Metal Gear Solid or Tomb Raider. A cable executive proposing to devote airtime to competitive Mortal Kombat play and quippy roundtable discussions about Myst and Duke Nukem would have been fired for not taking his job seriously. Ecco the Dolphin, Kirby Super Star, Streets of Rage 2, and any number of other titles now remembered as classics were released without any fanfare outside of the gaming magazines. There were shoddy cash-grab adaptations, sure, but they didn’t count for much. The more you appreciated a Japanese game, the less likely you were to appreciate its American-made Saturday morning cartoon or feature film. Even when an early-nineties TV show had a gag spoofing a game (I’m thinking specifically of The Simpsons and The Critic here), your brain couldn’t help noticing that the writers’ ideas about video games were rather out-of-date. They hadn’t kept up with them; they neither played nor cared about them.2
That was just how it was—if you weren’t among the 23 percent of Americans who were on the internet in 1996, or the 36 percent who were on it in 1997 and 1998.3
In that case, before the end of the decade you’d discovered websites like the Metroid Database, the Castlevania Dungeon, EarthBound Central, Doomworld, RPG Classics, the Fighters Generation, GameFAQs, the Video Game Museum, Zany Video Game Quotes, The Grand List of Console Role-Playing Game Cliches, etc., etc.4 You’d read them as though they were digital zines, explored every section and subsection, and periodically checked back to see if they’d been updated. Probably you were a regular lurker or active participant on a message board dedicated to video games, or to a certain genre or franchise, and were reading webcomics parodying video games and poking fun at the people who played them. Maybe you’d made your own little website about your favorite underrated Sega CD game, your own webcomic about gamer culture, and had composed and submitted your own walkthrough of an NES game to GameFAQs.
At first it didn’t even occur to me to ask how the creators of the keystone fan sites were able to incorporate pixel-perfect graphics from the games into their pages’ GUIs, collect screencaps, and populate indices of characters, enemies, bosses, etc. with isolated sprite images. I must have assumed it involved some variety of peripheral equipment that only the Big Kids who were good at computers and made websites had access to.
Not much later, I got my answer: emulation.
At some point during the 1990s, some enterprising geeks devised freeware that imitated the functions of old gaming consoles—and they probably did it just for kicks. Even in the dial-up days, it wasn’t hard to find and download a program that ran a virtual Super Nintendo in a desktop window, and game files (ripped from the old cartridges) for it to run.5 And once you had Dracula X running on your PC, capturing and editing images from it was child’s play.
That’s how Kurt Kalata acquired the graphics and screencaps for the Castlevania Dungeon. That’s how Dave Anez and Brian Clevinger were able to make webcomics out of graphics from Mega Man and Final Fantasy. That’s how Seanbaby—an e-celeb precursor—was able to add such delightful visual flair to his proto-viral list of the worst NES games of all time.
Emulation was the secret sauce: without the possibilities it unlocked, the whole world of online video game fandom would have been unable to visually represent its fetish objects. Why was this so important? “Because for rational beings to see or re-cognize their experience in a new material form is an unbought grace of life,” McLuhan writes.
This is a typically cryptic and unfalsifiable claim of his—and I’m tempted to adduce block quotes from BF Skinner and Relational Frame Theory as evidence for it—but it is borne out by common experience. Star Wars fans who saw the movies a dozen times bought the novelizations; Harry Potter fans who read the books a dozen times bought tickets to see the movies. Sports fans would buy magazines and flip on the radio to read and listen to journalists summarize the football game they watched on live television the night before. A kinky guy with a camcorder might set the device on a tripod, tape himself having sex with his spouse, and then watch it later on by himself.
It was geek culture that colonized the early web; fans of video games, anime, comic books, collectible card games, etc. were, after all, already well-accustomed to staying indoors and obsessing over a hobby, and their interests tended to overlap. The earliest virtual geek communities were formed on newsgroups that predate the web; the monumental growth that occurred later was predicated on the ecosystem of “content sites” serving as honeypots and signposts towards message boards and IRC channels, and as incentives to contribute to the culture with one’s own review site, character shrine, game walkthrough, comic strip, fan art, fanfiction, fan translation, etc.
The proto-social media platforms (LiveJournal, Xanga, etc.) augured a turning point: the virtualization of social life and the social self. MySpace and Facebook began as compellingly fun ways of experiencing one’s relations in a new format. If you’re old enough to have signed up for MySpace when it first launched, or to have gotten a Facebook account when membership was still restricted to college students, try to recall how captivating it was to see your friendships for the first time. And just as graphic rips, page scans, screencaps, etc. entranced fans of video games, comic books, and anime as they browsed webpages, it was the platforms’ structural emphasis on digital photography that first set them apart from the text-oriented LiveJournal and Xanga.6
I won’t try to pin down the year NES nostalgia reached its zenith, but it was most assuredly A Thing during the aughts. People several years younger than me were buying up used NES consoles, collecting games, and flaunting their interest in them. When I saw dudes rocking T-shirts with manual art from the original Legend of Zelda, I always wanted to ask if they’d actually owned and played the golden cartridge when they were in elementary school. 8-bit-style pixel art came into hipsterish vogue. NES games were suddenly cool in a way they definitely weren’t back when they actually stood on the cutting edge of home entertainment, and internet culture was responsible for putting the new starch in the old hat.
The early efflorescence of geeky fan culture inspired by then-obsolete video games can easily be compared to a release of pent-up energy. A generation of people who loved this shit but had few people to talk to about it, hadn’t seen much onscreen meta-content about their hobby, and perhaps hadn’t previously known any outlets for expressing the ideas and feelings it planted in them (or at least none with any possibility of earning them social validation) came upon an ecosystem of interlinked websites and message boards. How could they not accept its enticements to participate in growing it?
The virtual community of video game fandom wasn’t totally oriented towards the past, of course. But within the fiefs of online video game fandom, “heritage” content was practically inescapable—and NES nostalgia tended to circulate farther outside the borders than, say, stuff about Touhou or Team Fortress, owing to its cozy familiarity and kitsch value. And it was out there that it took on a life of its own, more or less in the same way that any subcultural fashion catches on with people on its periphery and radiates outward from there.
But the “energy release” metaphor may be malapropos. It would be better, I think, to say that the swell of enthusiasm resulted from a metabolic conversion of content from one form to another—from the process of etherealizing the stuff of game cartridges and print matter (gaming magazines) into the universal medium of cyberspace.
While classic game nostalgia constituted an episode in the already-familiar retro cycle, it was far from a typical one. Owing to the position of the media artifacts it fetishized on the cusp of the digital revolution, it must be regarded as a sui generis phenomenon that cannot be repeated. It was one of the last retro fads that could involve a young cohort rediscovering and celebrating online a recollected moment in offline pop culture.
Past a certain date—maybe the mid-2000s, maybe the early 2010s—it’s difficult to point to any pop culture that wasn’t already digitally endogenous (or otherwise nativized). Even though it’s possible to, say, make a 40-minute video essay, a blog post, a longform informational comic, or a series of silly memes about a TV show or video game that captivated some smaller or larger cross-section of the public in 2010, it will never deliver the proportionate impact of Seanbaby’s satirical articles about Super Friends cartoons and NES games. It entails no translation from one mode of experience to another.7 The memorable thing from 2010 was likely consumed in a digital format to begin with, and already inspired a galaxy of online meta-content—in 2010. Maybe the kids on TikTok can find something fun and sort of different to do with it now, but there will be nothing like the scintillating quantum leap of the first wave, when old experiences were translated en masse into totally new formats.
Content is content is content is more content.
Bear in mind another crucial difference between digital culture and its predecessors: the eminent retrievability of digital artifacts. Forty years ago, all of this meta-content that acts as the currency of fan culture would eventually fall into the possession of hoarding collectors, and become practically impossible to find elsewhere. Think about the trouble you’d have to go through to find copies of 1977 magazines containing articles about Star Wars if you were to search for them in 1987. Consider the futility of seeking out VHS and cassette tape recordings of 1977 interviews with George Lucas and Carrie Fisher broadcast on television and radio. What about copies of Star Wars fanzines printed in the 1970s, or recordings of the 1978 Star Wars Holiday Special? How far would you have to travel to visit a convention where people peddling this stuff might congregate?
Today, on the other hand, more 2015 content about The Force Awakens than you’d ever care to parse is at your fingertips with just a few keystrokes. And when you’re finished browsing, you can watch the movie on your laptop or phone. You can take screencaps of closeups and tweet them as reaction images while it’s still playing. You can livestream yourself watching it. You can record a clip of an action scene, substitute the audio with the crescendo of your favorite song in a video editor, and upload it to TikTok. (Content begets content begets content.)
In Borges’ short story “Funes the Memorious,” a narrator recounts an interview with a young man who remembers every waking moment of his life in precise detail:
Locke, in the seventeenth century, postulated (and rejected) an impossible language in which each individual thing, each stone, each bird and each branch, would have its own name; Funes once projected an analogous language, but discarded it because it seemed too general to him, too ambiguous. In fact, Funes remembered not only every leaf of every tree of every wood, but also every one of the times he had perceived or imagined it. He decided to reduce each of his past days to some seventy thousand memories, which would then be defined by means of ciphers. He was dissuaded from this by two considerations: his awareness that the task was interminable, his awareness that it was useless. He thought that by the hour of his death he would not even have finished classifying all the memories of his childhood…
I suspect, however, that he was not very capable of thought. To think is to forget differences, generalize, make abstractions. In the teeming world of Funes, there were only details, almost immediate in their presence.
Twentieth-century retro fads were predicated on the evanescence of pop culture. The radio stopped playing old songs. LPs weren’t reissued on cassette tape or compact disc. Television shows went off the air. Films were never released on a home video format. Old magazines and posters got thrown out. This stuff had to disappear in order for it to be rediscovered, reconstituted, and reassessed. But the internet retains just about everything and ensures it remains accessible to everyone at all times—and all in the same format.
Xbox 360 and PlayStation 3 nostalgia—if that was ever a thing—never had a chance at catching fire the way cartridge nostalgia did. If it was a novel thing circa 2000 to discover a website dedicated to a particular video game that came out fifteen years earlier, by today the games of 2010 have already left enduring impact craters in our exteriorized collective memory. The gaming sites’ reviews and cultural analyses have been read, commented on, argued about, and rebutted, and you can revisit it all whenever you please. The walkthroughs are already posted. The first Let’s Plays and No Commentary Longplays have been on YouTube for ten years. The old memes have been seen by everyone who needed to see them, and any new memes are just more memes. The Fandom pages are probably as complete as they need to be, and the subreddits are still where we left them. Hell, the content mills probably ground out a wave of “five years on” retrospective pieces in 2015. Fan sites on Neocities would be redundant.
Digital culture can never be rediscovered by itself—only retrieved. Nor can it easily draw inspiration from other media when there are no other media. This is the character of the medium to which its content so effectively blinded us during the decades of adoption and interiorization: it flattens, it homogenizes, it merges ephemerality and permanence, action and reaction, distance and propinquity, cliché and originality, art and life. Once it has exhausted foreign contexts and vocabularies to add to its lexicon, it can no longer say anything new; all that’s left to it is analysis and permutation, colored by stochastic fads and backlashes against fads.
I’ll propose another analogy: digital culture is like an agoraphobic poet of the eighteenth or nineteenth century who never leaves his bedroom, and has never been outside of the house (that he can remember). He composes verse quickly, and in great abundance, writing about the sparse objects of his room, his feelings, what he sees outside his window, and the contents of the five or six or ten poetry collections on his bookshelf. He revisits his old poems; he writes poems about his old poems. He rebuts in a sonnet his cantos about another poet’s couplets; in a villanelle he meditates on his sestina about the dactyls he penned about the shadow of the window grille upon the wall, which looks the same to him now as it did before (though he does feel a little differently about it, having already written about it once).
He generates poems about poems about poems about poems, all in the same format, and the minor variations in his experience of the external world (the patter of the rain on the roof, the glimpse of a sparrow on the lawn, the audible sigh of the woman who delivers him food through a slot in the door) influence his work only to the extent of tincturing his mood or placing a particular metaphor in his mind. Month after month. Year after year.
Reading his own work, he often finds it tedious. Repetitive. Increasingly uninspired. But he cannot stop writing poems—what else would he do?—and he is incapable of going outside and seeking new experiences to write about. Probably if he did, his compulsion to convert them to verse in the moment would for all purposes spoil them.
Perhaps generative AI is the next logical step. Why not just automate the processes as they’re already scripted to run?
(1) Overwhelmingly male tweens and teens, I might add. But that’s a separate topic.
(2) It was probably worse for you if you were older. If the contemporary stereotype of the abdominous bearded man in his twenties or thirties gripping an Xbox controller with Dorito-stained hands is unsexy to you, picture that same dude in 1992 playing Sonic the Hedgehog by himself. Totally unfuckable.
Then again—the “Bonestorm” commercial in a late 1995 Simpsons episode indicates that someone in that writers’ room was doing their homework.
I need to bookmark one source for year-by-year internet use data. This one came from Pew.
I scrupulously capitalize video game titles, but not the names of websites—and here I’m giving precedence to the second rule. I’m so neurotic about this crap that I’m making a footnote saying so.
This was on par with Napster or BitTorrent in terms of legality. A lot of the sites hosting rom files had disclaimers saying something like “these are only for backup purposes, don’t download unless you own the cartridges!” Given that an aged NES console was prone to malfunctioning after years of regular use, the pretext was almost convincing.
Not that there weren’t other factors—but would “The Facebook” have held your interest in 2005 if it hadn’t any faces?
To be sure, the comparatively minuscule population of cyberspace in the early aughts was a factor: it’s easier to be a big fish in a small pond than in the ocean, after all.
You should read "The Machine Stops". It starts with a youtuber complaining that airports all look the same. It was written in 1909.
Like many from our generation, I went through this cycle of bringing the past forward into the digital world.
I will always remember my roommate and me, first-year university, downloading the entire NES rom library from a warez site (oh yeah, warez, baby!). Neither of us had touched a NES since we were kids, and we were currently running daily marathon sessions of Counter Strike 1.4. You have no idea how hyped we were as we got ready to stay up all night to conquer all our childhood favorites.
That thrill lasted all of about 5 minutes.
I realized a lot in those 5 minutes.
I immediately understood that nostalgia can only be remembered, not relived. Once you attempt to relive that experience it ceases to be nostalgia. You can no longer remember that experience through the rose colored glasses of memory. You are now forced to live the experience through your current frame of reference. The two experiences do not remain in parallel. The new overwrites the old instantaneously.
In those 5 minutes I immediately lost the entirety of my blissful NES youth. My older self found these games to be unbearably simple, repetitive, ugly, and generally stupid. How could I have ever thought this was worth anything? It dawned on me that I somehow innately understood this would happen, which is why I never liked (and still don't like) taking pictures at events. I'm like Bill Pullman's character in Lost Highway; I like to remember things the way I remember them, maybe not necessarily the way they happened.
To say that The Great Gatsby took on a whole new meaning would be an understatement.
I actually think that the generations born in the digital age may be free from the trappings of nostalgia, precisely because of all the things you've described about the digital age, and in my opinion that is one good thing this age may bring.