It is winter in upstate New York, on a morning so cold the ground squeaks loudly underfoot as sharp-finned ice crystals rub together. The trees look like gloved hands, fingers frozen open. Something lurches from side to side up the trunk of an old sycamore—a nuthatch climbing in zigzags, on the prowl for hibernating insects. A crow veers overhead, then lands. As snow flurries begin, it leaps into the air, wings aslant, catching the flakes to drink. Or maybe just for fun, since crows can be mighty playful.
Another life form curves into sight down the street: a girl laughing down at her gloveless fingers, which are busily texting on some handheld device. This sight is so common that it no longer surprises me, though strolling in a large park one day I was startled by how many people were walking without looking up, or walking in a myopic daze while talking on their “cells,” as we say in shorthand, as if spoken words were paddling through the body from one salt water lagoon to another.
We don’t find it strange that, in the Human Age, slimy, hairy, oozing, thorny, smelly, seed-crackling, pollen-strewn nature is digital. It’s finger-swiped across, shared with others over, and honeycombed in our devices. For the first time in human history, we’re mainly experiencing nature through intermediary technology that, paradoxically, provides more detail while also flattening the sensory experience. Because we have riotously visual, novelty-loving brains, we’re entranced by electronic media’s caged hallucinations. Over time, can that affect the hemispheric balance of the brain and dramatically change us? Are we able to influence our evolution through the objects we dream up and rely on?
We may possess the same brain our prehistoric ancestors did, but we’re deploying it in different ways, rewiring it to meet 21st-century demands. The Neanderthals didn’t have the same mental real estate that modern humans enjoy, gained from a host of skills and preoccupations—wielding laser scalpels, joyriding in cars, navigating the digital seas of computers, iPhones, and iPads. Generation by generation, our brains have been evolving new networks, new ways of wiring and firing, favoring some behaviors and discarding others, as we train ourselves to meet the challenges of a world we keep amplifying, editing, deconstructing, and recreating.
Through lack of practice, our brains have gradually lost their mental maps for how to read hoofprints, choose the perfect flints for arrows, capture and transport fire, tell time by plant and animal clocks, navigate by landmarks and the stars. Our ancestors had a better gift for observing and paying attention than we do. They had to: Their lives depended on it. Today, paying attention as if your life depends on it can be a bugbear requiring conscious effort. More and more people are doing all of their reading on screens, and studies find that they’re retaining 46 percent less information than when they read printed pages. It’s not clear why. Have all the distractions shortened our attention spans? Do the light displays interfere with memory? It’s not like watching animals in ordinary life. Onscreen, what we’re really seeing isn’t the animal at all, but just 300,000 tiny phosphorescent dots flickering. A lion on TV doesn’t exist until your brain concocts an image, piecemeal, from the pattern of scintillating dots.
College students are testing about 40 percent lower in empathy than their counterparts of 20 or 30 years ago. Is that because social media has replaced face-to-face encounters? We are not the most socially connected we’ve ever been—that was when we lived in small tribes. In our cells and instincts, we still crave that sense of belonging, and fear being exiled, because for our ancestors, living alone in the wild without the group protection of the tribe meant almost certain death. Those with a strong social instinct survived to pass their genes along to the next generation. We still follow that instinct by flocking to social media, which connects us to a vast multicultural human tribe—even though it isn’t always personal.
Many of our inventions have reinvented us, both physically and mentally. Through texting, a child’s brain map of the thumbs grows larger. Our teeth were sharper and stronger before we invented cooking; now, they’re blunt and fragile. Even cheap and easily crafted inventions can be powerful catalysts. The novelty of simple leather stirrups advanced warfare, helped to topple empires, and introduced the custom of romantic “courtly” love to the British Isles in the 11th century. Before stirrups, wielding either a bow and arrow or a javelin, a rider might easily tumble off his horse. Stirrups added lateral stability, and soldiers learned the art of charging with lances at rest, creating terror as their horses drove the lances home. Out of this specialized way of fighting, an aristocracy of well-armed and well-armored warriors emerged, and feudalism arose as a way to finance these knights, whose code of chivalry and courtly love quickly dominated Western society. In 1066, William the Conqueror’s army was outnumbered at the Battle of Hastings, but, by using mounted shock warfare, he won England anyway, and introduced a feudal society steeped in stirrups and the romance of courtly love.
Tinkering with plows and harnesses, beyond just alleviating the difficult work of breaking ground, meant farmers could plant a third-season crop of protein-rich beans, which fortified the brain, and some historians believe that this brain boost, right at the end of the Dark Ages, ushered in the Renaissance. Improved ship hulls spread exotic goods and ideas around the continents—as well as vermin and diseases. Electricity allowed us to homestead the night as if it were an invisible country. Remember, Thomas Edison perfected the light bulb by candle or gas-lamp light.
Our inventions don’t just change our minds; they modify our gray and white matter, rewiring the brain and priming it for a different mode of living, problem-solving, and adapting. In the process, a tapestry of new thoughts arises, and one’s worldview changes. Think how the nuclear bomb altered warfare, diplomacy, and our debates about morality. Think how television shoved wars and disasters into our living rooms, how cars and airplanes broadened everything from our leisure to our gene pool, how painting evolved when paints became portable, how the printing press remodeled the spread of ideas and the possibility of shared knowledge. Think how Eadweard Muybridge’s photographs of things in motion—horses running, humans broad-jumping—awakened our understanding of anatomy and everyday actions.
Or think how the invention of the typewriter transformed the lives of women, great numbers of whom could leave the house with dignity to become secretaries. Although they won the opportunity because their dexterous little fingers were considered better able to push the keys, working in so-called pools, they risked such bold ideas as their right to vote. Even the low-tech bicycle modified the lives of women. Straddling a bike was easier if they donned bloomers—large billowy pants that revealed little more than that they had legs—which scandalized society. They had to remove their suffocating “strait-laced” corsets in order to ride. Since that seemed wicked, “loose” women became synonymous with low morals.
In ancient days, our language areas grew because we found the rumpled currency of language lifesaving, not to mention heady, seductive, and fun. Language became our plumage and claws. The more talkative among us lived to pass on their genes to chatty offspring. Language may be essential, but the invention of reading and writing was pure luxury. The uphill march children face in learning how to read reminds us that reading may be one of our best tools, but it’s not an instinct. I didn’t learn to read with fluent ease until I was in college. It takes countless hours of practice to fine-tune a brain for reading. Or anything else.
Near- or farsightedness was always assumed to be hereditary. No more. In the United States, one-third of all adults are now myopic, and nearsightedness has been soaring in Europe as well. In Asia, the numbers are staggering. A recent study testing the eyesight of students in Shanghai and young men in Seoul reported that 95 percent were nearsighted. From Canberra to Ohio, one finds similar myopia, a generation of people who can’t see the forest for the trees. This malady, known as “urban eyes,” stems from spending too much time indoors, crouched over small screens. Our eyeballs adjust by changing shape, growing longer, which is bad news for those of us squinting to see far away. For normal eye growth, children need to play outside, maybe watching how a squirrel’s nest, high atop an old hickory tree, sways in the wind, then zooming down to the runnel-rib on an individual blade of grass. Is that brown curtsey at the bottom of the yard a wild turkey or a windblown chrysanthemum?
In the past, bands of humans hunted and gathered, eyes nimble, keenly attuned to a nearby scuffle or a distant dust-mist, as they struggled to survive. Natural light, peripheral images, a long field of view, lots of vitamin D, an ever-present horizon, and a caravan of visual feedback shaped their eyes. They chipped flint and arrowheads, flayed and stitched hides, and did other close work, but not for the entire day. Close work now dominates our lives, but that’s very recent, one of the Anthropocene’s hallmarks, and we may evolve into a more myopic species.
Studies also show that Google is affecting our memory in chilling ways. We more easily forget anything we know we can find online, and we tend to remember where online information is located, rather than the information itself.
Long ago, the human tribe met to share food, expertise, ideas, and feelings. The keen-eyed observations they exchanged about the weather, landscape, and animals saved lives on a daily basis. Now there are so many of us that it’s not convenient to sit around a campfire. Electronic campfires are the next best thing. We’ve reimagined space, turning the Internet into a favorite pub, a common meeting place where we can exchange knowledge or know-how or even meet a future mate. The sharing of information is fast, unfiltered, and sloppy. Our nervous systems are living in a stream of such data, influenced not just by the environment—as was the case for millennia—but abstractly, virtually. How has this changed our notion of reality? Without our brain we’re not real, but when our brain is plugged into a virtual world, then that becomes real. The body remains in physical space, while the brain travels in a virtual space that is both nowhere and everywhere at once.
One morning some birder pals and I spend an hour at Sapsucker Woods Bird Sanctuary, watching two great blue herons feed their five rowdy chicks. It’s a perfect setting for nesting herons, with an oak-snag overhanging a plush green pond, marshy shallows to hunt in, and a living larder of small fish and frogs. Only a few weeks old, the chicks are mainly fluff and appetite.
Mom and Dad run relays, and each time one returns the chicks clack wildly like wooden castanets and tussle with each other, beaks flying. Then one hogs Mom’s beak by scissoring across it and holding on until a fish slides loose. The other chicks pounce, peck like speed typists, try to steal the half-swallowed fish, and if it’s too late for that, grab Mom’s beak and claim the next fish. Sibling rivalry is rarely so explicit. We laugh and coo like a flock of doting grandparents.
At last Mom flies off to hunt, and the chicks hush for a nap, a trial wing stretch, or a flutter of the throat pouch. Real feathers have just begun to cover their down. When a landing plane roars overhead, they tilt their beaks skyward, as if they were part of a cargo cult, expecting food from pterodactyls. We could watch their antics all day.
I’m new to this circle of blue heron aficionados, some of whom have been visiting the nest daily since April and comparing notes. “I have let a lot of things go,” one says. “On purpose, though. This has been such a rare and wonderful opportunity.” “Work?” another replies. “Who has time to work?”
So true. The bird sanctuary offers a rich mosaic of live and fallen trees, mallards, songbirds, red-tailed hawks, huge pileated woodpeckers, and of course yellow-bellied sapsuckers. Canada geese have been known to stop traffic (literally)—with adults serving as crosswalk guards. It’s a green mansion, and always captivating.
However, we’re not really there. We’re all—more than 1.5 million of us thus far—watching on two live webcams affixed near the nest, and “chatting” in a swiftly scrolling Twitter-like conversation that rolls alongside the bird’s-eye view.
We’re virtually at the pond, without the mud, sweat, and mosquitoes. No need to dress, share snacks, make conversation. Some of us may be taking a coffee break, or going digitally AWOL during class or work. All we can see is the heron nest up close, and that’s a wonderful treat we’d miss if we were visiting on foot. In a couple of weeks the camera will follow the chicks as they learn to fish.
This is not an unusual way to pass time nowadays, and it’s swiftly becoming the preferred way to view nature. Just a click away, I could have chosen a tarantula-cam, meerkat-cam, blind-mole-rat-cam, or 24-hour-a-day Chinese-panda-cam from a profusion of equally appealing sites, some visited by tens of millions of people. Darting around the world to view postage-stamp-size versions of wild animals that are oblivious to the video camera is the ultimate cinema verité, and an odd shrinking and flattening of the animals, all of whom seem smaller than you. Yet I rely on virtual nature to observe animals I may never see in the wild. When I do, abracadabra, a computer mouse becomes a magic wand and there is an orphan wombat being fed by wildlife rescuers in Australia. Or from 308 photos of cattle posted on Google Earth I learn that herds tend to face either north or south, regardless of weather conditions, probably because they’re able to perceive magnetic fields, which helps them navigate, however short the distance. Virtual nature offers views and insights that might otherwise escape us. It also helps to satisfy a longing so essential to our well-being that we feel compelled to tune in, and we find it hypnotic.
What happens when that way of engaging the world becomes habitual? Nature now comes to us, not the other way round—on a small glowing screen. You can’t follow a beckoning trail, or track a noise off-camera. You don’t exercise as you meander, uncertain what delight or danger may greet you, while feeling dwarfed by forces older and larger than yourself. It’s a radically different way of being—with nature, but not in nature—and it’s bound to shape us.
Films and TV documentaries like Microcosmos, Winged Migration, Planet Earth, March of the Penguins, and The Private Life of Plants inspire and fascinate millions while insinuating environmental concerns into the living room. It’s mainly in such programs that we see animals in their natural settings, but they’re dwarfed, flattened, interrupted by commercials, narrated over, greatly edited, and sometimes staged for added drama. Important sensory feedback is missing: the pungent mix of grass, dung, and blood; drone of flies and cicadas, dry rustling of wind through tall grass; welling of sweat; sandpapery sun.
On YouTube I just glimpsed several icebergs rolling in Antarctica—though without the grandeur of size, sounds, colors, waves, and panorama. Oddest of all, the icebergs looked a bit grainy. Lucky enough to visit Antarctica years ago, I was startled to find the air so clear that glare functioned almost as another color. I could see longer distances. Some icebergs are pastel, depending on how much air is trapped inside. And icebergs produce eerie whalelike songs when they rub together. True, in many places it’s a crystal desert, but in others life abounds. An eye-sweep of busy seals, whales, penguins, and other birds, plus ice floes and calving glaciers, reveals so much drama in the foreground and background that it’s like entering a pop-up storybook. Watching icebergs online, or even at an IMAX theater, or in sumptuous nature films, can be stirring, educational, and thought-provoking, but the experience is wildly different.
Last summer, I watched as a small screen in a department store window ran a video of surfing in California. That simple display mesmerized high-heeled, pin-striped, well-coiffed passersby who couldn’t take their eyes off the undulating ocean and curling waves that dwarfed the human riders. Just as our ancient ancestors drew animals on cave walls and carved animals from wood and bone, we decorate our homes with animal prints and motifs, give our children stuffed animals to clutch, cartoon animals to watch, animal stories to read. Our lives trumpet, stomp, and purr with animal tales, such as The Bat Poet, The Velveteen Rabbit, Aesop’s Fables, The Wind in the Willows, The Runaway Bunny, and Charlotte’s Web. I first read these wondrous books as a grown-up, when both the adult and the kid in me were completely spellbound. We call each other by “pet” names, wear animal-print clothes. We ogle plants and animals up close on screens of one sort or another. We may not worship or hunt the animals we see, but we still regard them as necessary physical and spiritual companions. It seems the more we exile ourselves from nature, the more we crave its miracle waters. Yet technological nature can’t completely satisfy that ancient yearning.
What if, through novelty and convenience, digital nature replaces biological nature? Gradually, we may grow used to shallower and shallower experiences of nature. Studies show that we’ll suffer. Richard Louv writes of widespread “nature deficit disorder” among children who mainly play indoors—an oddity quite new in the history of humankind. He documents an upswell in attention disorders, obesity, depression, and lack of creativity. A San Diego fourth-grader once told him: “I like to play indoors because that’s where all the electrical outlets are.” Adults suffer equally. It’s telling that hospital patients with a view of trees heal faster than those gazing at city buildings and parking lots. In studies conducted by Peter H. Kahn and his colleagues at the University of Washington, office workers in windowless cubicles were given flat-screen views of nature. They enjoyed greater health, happiness, and efficiency than those without virtual windows. But they weren’t as happy, healthy, or creative as people given real windows with real views of nature.
As a species, we’ve somehow survived large and small ice ages, genetic bottlenecks, plagues, world wars, and all manner of natural disasters, but I sometimes wonder if we’ll survive our own ingenuity. At first glance, it seems like we may be living in sensory overload. The new technology, for all its boons, also bedevils us with speed demons, alluring distractors, menacing hijinks, cyber-bullies, thought-nabbers, calm-frayers, and a spiky wad of miscellaneous news. Some days it feels like we’re drowning in a twittering bog of information. But, at exactly the same time, we’re living in sensory poverty, learning about the world without experiencing it up close, right here, right now, in all its messy, majestic, riotous detail. Like seeing icebergs without the cold, without squinting in the Antarctic glare, without the bracing breaths of dry air, without hearing the chorus of lapping waves and shrieking gulls. We lose the salty smell of the cold sea, the burning touch of ice. If, reading this, you can taste those sensory details in your mind, is that because you’ve experienced them in some form before, as actual experience? If younger people never experience them, can they respond to words on the page in the same way?
The farther we distance ourselves from the spell of the present, explored by all our senses, the harder it will be to understand and protect nature’s precarious balance, let alone the balance of our own human nature. I worry about our virtual blinders. Hobble all the senses except the visual, and you produce curiously deprived voyeurs. At some medical schools, future doctors can attend virtual anatomy classes, in which they can dissect a body by computer—minus that whole smelly, fleshy, disturbing human element. Stanford’s Anatomage (formerly known as the Virtual Dissection Table) offers corpses that can be nimbly dissected from many viewpoints, plus ultrasound, X-ray and MRI. At New York University, medical students can don 3D glasses and explore virtual cadavers stereoscopically, as if swooping along Tokyo’s neon-cliffed streets on Google Maps. The appeal is easy to understand. As one 21-year-old female NYU student explains, “In a cadaver, if you remove an organ, you cannot add it back in as if it were never removed. Plus, this is way more fun than a textbook.” Exploring virtual cadavers offers constant change, drama, progress. It’s more interactive, more lively, akin to a realistic video game instead of a static corpse that just lies there.
When all is said and done, we only exist in relation to the world, and our senses evolved as scouts who work together to bridge that divide and provide volumes of information, warnings, and rewards. But they don’t report everything. Or even most things. We’d collapse from sheer exhaustion. They filter experience, so that the brain isn’t swamped by so many stimuli that it can’t focus on what may be lifesaving. Some of our expertise comes with the genetic suit, but most of it must be learned, updated, and refined, through the fine art of focusing deeply, in the present, through the senses, and combining emotional memories with sensory experience.
Once you’ve held a ball, felt its smooth contour, turning it in your hands, your brain need only see another ball to remember the feel of roundness. You can look at a Red Delicious apple and know the taste will be sweet, the sound will be crunchy, and feel the heft of it in your hand. Strip the brain of feedback from the mansion of the senses and life not only feels poorer, but learning grows less reliable. Digital exploration is predominantly visual, and nature, pixelated, engages only one of the five senses, offering a fifth of the information. Subtract the other subtle physical sensations of smell, taste, touch, and sound, and you lose a wealth of problem-solving and lifesaving detail.
When I was little, children begged to go outside and play, especially in winter when snow fell from the sky like a great big toy that clotted your mittens, whisked up your nose, slid underfoot, shape-shifted in your hands, made great projectiles, and outlined everything, linking twigs and branches, roofs and sidewalks, car hoods and snow forts with white ribbons. Some still do. But most people play more indoors now, mainly alone and stagestruck, staring at their luminous screens.
I relish technology’s scope, reach, novelty, and remedies. But it’s also full of alluring brain closets, in which the brain may be well occupied but has lost touch with the body, lost the intimacy of the senses, lost a visceral sense of being one life form among many on a delicately balanced planet. A big challenge for us in the Anthropocene will be reclaiming that sense of presence. Not to forgo high-speed digital life, but to balance it with slow hours of just being outside, surrounded by nature, and watching what happens next.
Because something wonderful always happens. When a sense of presence steals up the bones, one enters a mental state where needling worries soften, careers slow their cantering, and the imaginary line between us and the rest of nature dissolves. Then for whole moments one may see nothing but snow, gathering thick and wet along the limbs of an old magnolia. Or, indoors, one may watch how a vase full of tulips, whose genes have traveled eons and silk roads, arch their spumoni-colored ruffles and nod gently when the furnace gusts. On the periodic table of the heart, somewhere between wonderon and unattainium, lies presence, which one doesn’t so much take as steep in, like a romance, and without which one can live just fine, but not thrive. Those sensory bridges need to stay sharp, not just for our physical survival, but so we feel fully engaged and alive.
A digital identity in a digital landscape figures indelibly in our reminted sense of self. Electronic work and dreams fuel most people’s lives, education, and careers. Kindness, generosity, bullying, greed, and malice all blink across our devices and survive like extremophiles on invisible nets. Sometimes, still human but mentally fused with our technologies, we no longer feel compatible with the old environment, when nature seemed truly natural. To use an antique metaphor, the plug and socket no longer fit snugly. We’ve grown too large, and there’s no shrinking back. Instead, so that we don’t feel like we’re falling off the planet, we’re revising and redefining nature. That includes using the Internet as we do our other favorite tools, as a way to extend our sense of self. A rake becomes an extension of one’s arm. The Internet becomes an extension of one’s personality and brainpower, an untethered way to move commerce and other physical objects through space, a universal diary, a stew of our species’ worries, a hippocampus of our shared memories. Could it ever become conscious? It’s already the sum of our daily cogitations and desires, a powerful ghost that can not only haunt with aplomb but rabble-rouse, wheel and deal, focus obsessively, pontificate on all topics, speak in all tongues, further romance, dialogue with itself, act decisively, mumble numerically, and banter between computers until the cows come home. Then find someone to milk the cows.
It’s been suggested that we really have two selves now, the physical one and a second self that’s always present in our absence—an online self we also have to groom and maintain, a self people can respond to even when we’re not available. As a result everyone goes through two adolescences on the jagged and painfully exposed road to a sense of identity.
Surely we can inhabit both worlds with poise, dividing our time between the real and the virtual. Ideally, we won’t sacrifice one for the other. We’ll play outside and visit parks and wilds on foot, and also enjoy technological nature as a mental seasoning, turning to it for what it does best: illuminate all the hidden and mysterious facets of nature we can’t experience or fathom on our own.
Diane Ackerman is the author of The Human Age: The World Shaped By Us, The Zookeeper’s Wife, A Natural History of the Senses, and other books.
Excerpted from The Human Age: The World Shaped By Us by Diane Ackerman. Copyright © 2014 by Diane Ackerman. With permission of the publisher, W.W. Norton & Company, Inc. All rights reserved. This selection may not be reproduced, stored in a retrieval system, or transmitted in any form by any means without the prior written permission of the publisher.