You have confused the true and the real.
– George Stanley, in conversation with Samuel Delany (epigraph to Dhalgren)
More than for its supernatural content, Stranger Things has garnered attention for its intoxicating re-creation of a 1980s childhood, with its roleplaying games, homemade mix tapes, and unsupervised night journeys on banana-seat bikes. As someone born in 1977, I found this to be the most infectious quality of the show. It got me thinking about what it was like to live in a time when PCs, mobile phones, and tablets weren’t around, and when electronic screens, while venerated in the form of those stacks of TVs that graced the front window of appliance stores, weren’t so present as to be deemed ubiquitous.
The Canadian indie rock band Arcade Fire recorded a song on this subject called “We Used to Wait.” It describes the era “before the flashing light settled deep” in our brains, when time seemed to stretch out to infinity.
Now it seems strange
How we used to wait for letters to arrive
But what’s stranger still
Is how something so small can keep you alive.1
Strange indeed. One of the great ironies of the digital era is that all those hours we save thanks to our digital devices are purchased with another form of time, one that can’t be tallied in hours and minutes. During slow time spent idling, during transitory or empty time spent waiting or “wasting hours just walking around” as the song puts it, one is left alone with one’s thoughts. In this form of time, the veil between mind and world begins to fray, and a spirit of free association sets in between the environment and the images it evokes in us. Utility, function, and signification cease to be the prime determinants of meaning, and with their retreat, more expansive or imaginal insights come to the fore. It isn’t that this form of time has ceased to exist today but that the human environment seems deliberately designed to eliminate it. Idle time now has to vie with that other, more productive time if it is to have a place in our lives at all. For young people who have spent every spare moment of their lives in front of screens, there is a danger that the vying never even registers, and that slow time for them manifests only as malignant time, useless and to be killed on sight.
It’s no secret that digital media has transformed even the most solitary moments of our lives into an opportunity to shop, socialize, or work. Socioeconomic pressures once restricted to the working hours are now active around the clock.2 According to the American Academy of Pediatrics, children in the United States spend an average of seven hours a day in front of screens; the figure increases to almost eleven hours a day for adults. What happened to these hours before all the screens lit up? In the Dungeons & Dragons scenes that open and close Stranger Things, we get a tangible sense of an important difference between the digital present and its analog past. There is little doubt that if the story had been set today, verisimilitude would have dictated that the kids be playing video games in those scenes, because the user interfaces that mediate our interactions with each other and the environment remain active even when we are physically together. The D&D scenes remind us that before the advent of digital life, people and especially children were largely left to their own devices—or lack thereof. Can Stranger Things be read as an attempt to reveal a lifeworld that the new age foretold by Apple in its 1984 ad seeks to replace? That’s my impression. I believe that the series shows us how pre-digital modernity, though itself obsessed with machines and yearning for much that we take for granted today, remained essentially caught in the tendrils of a primal or elemental plane of nature that the technological mind, true to its rationalist underpinnings, is committed to removing from the human experience altogether. As we shall see in the conclusion of this essay, the supernatural elements of the show can be interpreted as metaphors for this deeper level of nature.
The rapid progress of digital technology has been predicated on the ideals of innovation, efficiency, and control. Since these ideals naturally had to be in place before the progress got underway, it would be absurd to blame information technology alone for the death of slow time. When Benjamin Franklin coined the phrase “time is money” in the eighteenth century, he knew that most of his contemporaries would immediately catch his drift. The industrial revolution that spurred the development of a global technological infrastructure aspired to the same ideals that drive digital culture today. Even so, the cog-and-gear machinery of the industrial age had a clunkiness that acted as a constant reminder of the Promethean undertones of modernity’s efforts to remake nature in its image. Pre-digital technologies were too slow, too caught up in pneumatic chugging and pumping and spinning, to allow their users to overlook their essential artificiality. Through speed and efficiency, digital technologies give material substance to a picture of life that existed only in the realm of noble aspirations until they became commonplace. In a sense, the information age can be said to have turned a certain opinion about life into a fact of life. It has made an ought into an is.
Yet the differences between digital and analog go beyond their respective degrees of speed and efficiency. As its name implies, digital technology apprehends reality in terms of discrete units of data, whereas analog apprehends it as a continuous flow of indivisible change, which happens to be what reality actually is. A digital still camera transcodes the light that strikes its sensor into a discrete series of ones and zeros; continuous light is thereby broken up into a discontinuous series of fixed values, data which the camera’s software then interprets to form the image on a matrix made up of distinct pixels. Analog photography, by contrast, doesn’t “interpret data” at all. The chamber opens, light strikes the emulsion, and the crystals explode to generate a photographic image. Nor are the crystals in the emulsion separate from each other like the pixels in a dot matrix, for they burst into one another, becoming entangled and fractal. The chemical process by which analog cameras capture images is essentially no different from the process by which mirrors reflect objects, in that the light which makes an object visible in a mirror is the same light that makes one appear on a negative. The two processes are uniplanar; they occur on the plane of physico-chemical events. This isn’t the case with digital cameras, which must translate the source light prior to the production of the synthetic image. While the term “capture” is still used to denote the recording of an image or sound using digital devices, this term technically applies only to their analog predecessors. Digital media devices don’t capture, they reproduce. And they reproduce on the basis of a prior computation of sensible change as fixed, neutral data. The data is indifferent to whether the media is visual, aural, textual, or what have you.
A digital camera deals with light, a digital sound recorder deals with sound waves; yet both devices transform their material into the same type of binary data. Technically, it all has to turn into ones and zeros before being reconstituted as “content.”
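The reduction described above — continuous change collapsed into fixed, medium-indifferent values — can be sketched in a few lines of code. This is a toy illustration of sampling and quantization, not any actual device’s pipeline; the functions, sample count, and bit depth are all illustrative assumptions:

```python
import math

def quantize(signal, samples, levels=256):
    """Sample a continuous function at fixed points and round each
    sample to one of `levels` discrete values (8-bit by default)."""
    out = []
    for i in range(samples):
        t = i / samples                      # normalized time/position in [0, 1)
        v = max(0.0, min(1.0, signal(t)))    # clamp to the unit interval
        out.append(round(v * (levels - 1)))  # continuous value -> integer code
    return bytes(out)

# Two different "media": a light-intensity gradient and a sound wave.
light = lambda t: t                                           # rising brightness
sound = lambda t: 0.5 + 0.5 * math.sin(2 * math.pi * 4 * t)   # a 4-cycle tone

# Once digitized, both are the same kind of thing: a run of neutral bytes.
print(type(quantize(light, 64)) == type(quantize(sound, 64)))  # True
```

Whatever the source — light, sound, text — the output is an undifferentiated sequence of integer codes, which is the point the paragraph makes: the data retains no trace of the medium it came from.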
None of this means that digital photography and sound recording are intrinsically inferior to their analog counterparts. The idea that even the worst Polaroid is somehow “better” than the most sublime digital photograph is, of course, absurd. Through the swift development that is part and parcel of an information age, the digital has reached such high levels of sophistication as to be limited only by the imagination and vision of users. But that brings us right to the point. The ideals that guide the evolution of digital technology—innovation, efficiency, control—have nothing to do with imaginal vision at all. In fact, if we uphold Heidegger’s key distinction between techne and poiesis, they tend to work against it.3 It only follows that a culture spawned around this technology would serve to propagate its ideals, however implicitly. As a result of this message-in-the-medium effect which the proliferation of digital into every area of daily life only intensifies, the technological worldview increasingly appears to us as an accurate reflection of objective reality, whereas in fact it consists in a reduction of the Real to the categories of the human intellect.
The Desert and the Forest
The technological worldview that thrives in digital culture is what the American philosopher William James, at the turn of the twentieth century, called “intellectualism.” It consists in a radicalization of the rationalist belief that concepts have primacy over things, in that only the communicable aspect of things has objective value. In other words, the intellectualist believes that the Real is exhausted in its conceptualization. To borrow James’ words, he or she thereby carries out “a great transformation of the perceptual order” of lived reality into “a conceptual order” of fixed ideas, which is then (mis)taken for lived reality.4 Intellectualism takes for granted a dualism that was already present in Plato and Aristotle but became decisive only when Descartes split reality into two separate substances—mind and matter, subject and object—at the beginning of the modern age.
Alfred North Whitehead, a younger contemporary of James, saw the Cartesian split as the moment when everything went wrong with Western thought. “What I am essentially protesting against,” he wrote, “is the bifurcation of nature into two systems of reality which, in so far as they are real, are real in different senses.”5 The history of modern philosophy can be read as a relentless contest to determine which of the two sides in the split—the “objective” or the “subjective”—is the real one, and which is the illusion. In one camp, we have materialism, which insists on the necessity of the objective “out there” and the contingency of the subjective “in here.” In the other camp, we have idealism, which assumes the inverse position. But for all the furor with which the two positions have battled it out over the centuries, there is one point on which they are in perfect agreement. It is the pivot of the perpetual teeter-totter. According to both materialism and idealism, human reason is the arbiter of the Real. Both philosophies hold that the universe is rational at its core, and that the human intellect can therefore penetrate its deepest recesses. As a result, both materialism and idealism remain chained to a deeper dualism even if they claim that one term in the duality is unreal. Regardless of which argument one opts for, then, there is the same tremendous negation of the experiential world, the same abject denial of the paradoxical yet sensible universe of embodied existence. Concurrent with this negation is the conjuration of some “substance” that underlies the illusion, and which alone qualifies as real: mind or matter. Either way you cut it, the world of experience is hitched onto an abstract idea in the name of objective Reason.
If I have argued for the metaphysical implications of art in the past, it is because every great work of art cries out against this kind of world denial. As empirical as any natural science, art implicitly affirms the immediate reality of the appearances that make up the cosmos, all that John Locke dismissed as the “secondary qualities.” So far as it can be considered a work of art, Stranger Things is no different. But whereas many artworks affirm the reality of this world simply by their form, the Netflix series also does so in its content—at least if one is willing to go beneath the surface and make the symbols speak, conscious intentions on the creators’ part notwithstanding.
But before getting back to Stranger Things, I want to dig a bit deeper into how post-Cartesian dualism relates to the digital culture that arose in the final years of the last century. The upshot of the rational bifurcation of nature is that it subordinates all things to human interests and purposes. Since only humans seem to have the capacity to form communicable concepts, only humans can perceive reality. Reality belongs to humans—specifically, humans who enshrine reason as their modus operandi (the “civilized”). It isn’t hard to see the allure of such an outlook to a culture bent on planetary conquest such as that of early modern Europe. Intellectualism puts the whole slippery, turbid, amorphous business called nature into a manageable conceptual framework. As James puts it:
Sensible reality is too concrete to be entirely manageable—look at the narrow range of it which is all that any animal, living in it exclusively as he does, is able to compass. To get from one point in it to another we have to plough or wade through the whole intolerable interval.… But with the faculty of abstracting and fixing concepts we are there in a second, almost as if we controlled a fourth dimension, skipping the intermediaries as by a divine winged power, and getting at the exact point we require without entanglement with any context. What we do in fact is harness up reality in our conceptual system in order to drive it the better.6
Notice how this passage could almost have been taken from a text praising the virtues of digital media. Thanks to it, we can shop for groceries without getting out of bed, travel without consulting a map, make friends without going out. We used to wait but must wait no more.
In spite of his abhorrence of “vicious” intellectualism, William James was the first to admit that concepts are incredibly useful, even essential to life as we know it. Yet there is a difference between recognizing the purely practical advantages of conceptual thought and deciding that conceptuality defines reality as a whole. It is in the nature of the intellect to reduce things to concepts in order to better navigate among them, but only intellectualism can make the extra move of deciding that the reduction is really a penetration, and that what is left out of the equation is less real than what is retained. Because of its immersive and ubiquitous qualities, digital culture lures us into just such an intellectualist frame of mind. James contended that because the Real exceeds our intellectual capacities, the intellect ought not to be made absolute. My contention is that this is precisely what an uncritical engagement with the digital ends up doing.
In the ambiance of our times, it often seems as though only what is representable and communicable is real. This is because the digital absolutizes the intellect by supplanting experience with information. We can see the signs of this shift from experience to information all over the culture, from the widespread belief that no memorable event has really taken place unless it has been photographed and shared on social media, to the hitherto unimaginable phenomenon of people shambling into high-speed traffic while absorbed in a game of Pokemon Go. But one of the most compelling signs of the shift involves the science fiction trope of artificial memory. Philip K. Dick used it with characteristic prescience in “We Can Remember It for You Wholesale,” a 1966 short story set in a future when the memory of, say, a trip to Mars can be artificially implanted into the minds of people who can’t afford to travel there physically. Another memorable example occurs in The Matrix, when Neo uploads the skills of a kung fu master directly into his brain, thereby shirking the “intolerable interval” of having to learn the martial art through practice. In both cases, the assumption is that experience and information are interchangeable because experience is conceived as a mere configuration of neutral data sets. And that is intellectualism distilled to its essence. What it misses is the fact that in actually going through the motions of traveling to another planet or mastering a martial art, there is a literal infinitude of details, sensations, and singularities that the propositions “I have been to Mars” or “I know kung fu” could never encompass. Information is an entirely intellectual quantity. In perceiving lived experience as mere information intake, we effectively reduce experience to its infinitesimal computable fraction. Reality is lost in the process.
Because it surreptitiously upholds a view that places the conceptual intellect at the center of things, digital culture can be described as a concrete enactment of anthropocentric dualism. It takes for granted an absolute distinction between thought and matter. It implicitly conceives the universe as an expanse of inert data to be arranged into concepts no less inert. There is no hyperbole in describing such a place as a dead world. “Welcome to the desert of the real,” Morpheus says to Neo in The Matrix, that Gnostic paean to the computer age.7 According to digital culture’s latent metaphysics, the Real is a wasteland, a place devoid of meaning or significance until some mind shapes it into a meaningful mirage. Whether this mind belongs to a human being or a vast artificial intelligence is immaterial. When all experience is conceived as information, the universe appears to us as it would to any computer, namely a series of fixed states without interval, motion, or becoming—a zombie cosmos.
Fortunately, this dead world isn’t the real one. James again:
What really exists is not things made but things in the making. Once made, they are dead, and an infinite number of alternative conceptual decompositions can be used in defining them. But put yourself in the making by a stroke of intuitive sympathy with the thing and, the whole range of possible decompositions coming at once into your possession, you are no longer troubled with the question which of them is the more absolutely true. Reality falls in passing into conceptual analysis; it mounts in living its own undivided life—it buds and bourgeons, changes and creates.8
Beneath the conceptual overlay, reality remains what it is: not an orderly network of humanly comestible ideas, but a turbid, ever-changing, symphonic, indefinable process of becoming that is accountable to neither the predilections of reason nor the strictures of logical grammar. The conceptual order having been restored to its place as one facet of a pluralistic universe, the Real ceases to look like a desert and appears instead as a veritable forest, full of movement and teeming with strange forms of life.
The simple fact that no rational model has been able to account for the infinite richness of embodied experience is reason enough to hold that the Real is more poetic and imaginal than rationalistic and conceptual. Life isn’t an either-or but a both-and affair, the governing logic of which is analogical (as in a dream), not dialectical (as in a bank). Reality, in other words, is analog. After all, even the most sophisticated digital device is at bottom an analog machine, in that the forces that make stars explode and rivers flow are also responsible for firing up its electronic circuits. Only the spirit in which the apparatus was made tries to convince us otherwise. Digital media supports the dominant rationalist ethos by promulgating a mode of existence that perceives nature as eminently knowable and therefore susceptible to technological control. But the truth is that the Real cannot be known, and it will not be controlled. It outstrips the concepts we manufacture to navigate its depths. Indeed, as William James himself says, nature is definable only as excess.9 This makes it strange in the ontological sense of the word. Ours is a weird world. Accordingly, only portrayals of the world that embrace the Weird as constitutive of reality as such deserve to be called realistic.
References

1. Arcade Fire, “We Used to Wait”: http://www.songfacts.com/detail.php?lyrics=20359
2. See Jonathan Crary, 24/7: Late Capitalism and the Ends of Sleep (2013).
3. For Heidegger, the Greek term techne refers to the technical act of giving natural things a form adapted to our utilitarian needs. But techne implies a deeper though often concealed process of poiesis, which inheres in that poetic and creative revealing of the world that characterizes art. See “The Question Concerning Technology,” in Basic Writings (1993).
4. William James, A Pluralistic Universe, 232.
5. Alfred N. Whitehead, The Concept of Nature, 30.
6. James, op. cit., 247–48. (Emphasis added.)
7. With this line, the Wachowskis are quoting the work of the digital prophet Jean Baudrillard.
8. James, op. cit., 263–64.
9. James, A Pluralistic Universe, 286: “Only concepts are self-identical; only ‘reason’ deals with closed equations; nature is but a name for excess; every point in her opens out and runs into the more…” (Emphasis added.)