Parenting can be neurotic. Part of my brain sounded the “You just scarred her for life” alarm whenever my daughter screamed out. When she was about one year old, I took her to a music class. She played with the maracas and gently beat a drum, but then she saw the teacher, a tall bearded man with a guitar. She went into a frenzy, crying and gasping with terror and desperation. I took her out of the room and she relented, but as soon as she saw the teacher again, she was inconsolable. I left the class early, let her flush the adrenaline out of her system, and played with her quietly for the rest of the day. I was certain the teacher had been the trigger, but I had no idea why. My wife took her to music class the next week, only to witness the exact same reaction. And for some months after that she became scared around any tall, swarthy, hirsute man, though none provoked her to the same degree that the music teacher had.
I was frustrated that I couldn’t understand the logic behind her reactions. I knew the stimulus that triggered them. By that age, I was inured to most of her cries and screams, which were her way of expressing various desires. But during the music class, I saw real suffering in her eyes and heard it in her howls for the first time—not just pain or a demand, but real mental anguish. I wanted nothing more than to save her from it, to debug it out of her system. And I couldn’t. All I could do was help her avoid the stimulus. Years later, my second child had an identical reaction to the same music teacher. Something was hard-coded into their genes to make sure they stayed away from tall, swarthy, bearded men with guitars.
Some stimuli are impossible to avoid. I want my children to grow up as free from the biases and preconceptions of society as possible. This is particularly true for daughters, since female social roles have historically been vastly more circumscribed and disadvantaged than male roles.
Children inherit our biases as soon as they are born. Some of these biases are matters of survival, pure and simple. Despite the moments of rebellion that are integral to growing up, most of childhood is spent absorbing one’s parents’ knowledge without question.
Received Ignorance
Adjustment to objective reality does not exist for the sake of adjustment in itself. All adaptations are regulated by needs.
—LEV VYGOTSKY
My heart sank the first time I went into a toy store as a parent and saw two distinct, gender-segregated aisles of toys. Dolls and dresses were on the left, cars and building blocks on the right. When I read my daughter a book about a princess who rebels against her parents after being told that she has to marry to be a proper princess, my daughter piped up indignantly and said, “Elsa [from Frozen] isn’t married!” That’s all well and good, but I would prefer to stick with Queen Alice of Through the Looking-Glass and Princess Ozma of the Oz books, who exist outside the dominant princess narratives altogether. The embrace of any dominant gender narrative, even one that seems innocuous or progressive, carries limitations with it.
I was surprised by how quickly children take up the taxonomies embedded within these narratives. By the age of three, my daughter had clear ideas of what girls and boys were like, how they differed, and how she was definitely the former. Animals, monsters, adults, dinosaurs, and pirates also all had certain characteristics that defined them. And then children add to those characteristics themselves, as my daughter was doing at age six:
HER: I think the rainbow should be split. Red, orange, and yellow are colors girls can wear, and blue, green, and purple are clothes boys can wear. [pointing to purple clothes] I feel sad I cannot wear these anymore.
ME: You know, I think there are no girls’ colors or boys’ colors.
HER: That doesn’t matter.
ME: But do boys even wear purple that often?
HER: Okay, but I can’t think of any better way to split it that’s simple.
Psychologist Susan Gelman makes the compelling case that humans have an innate tendency toward essentialism that is present from early childhood on. Essentialism is the belief that a particular entity has some underlying, essential set of characteristics that gives it its particular nature, without which it could not be that thing. Gelman found that “72% of four-year-olds and 73% of first-graders spontaneously mentioned inborn dispositions, intrinsic nature, or growth” when asked why particular animals possessed their typical properties and behaviors:
For example, preschoolers said that a rabbit has long ears because “the egg made the [rabbit’s] ears so that it had them when it hatched,” or birds fly “because that’s the way birds are made.”
Yet in reality, there are exceptions to every “essential” characteristic. Some birds don’t fly. Some rabbits don’t have long ears. Some girls don’t like dolls. Some boys run slower than girls. Some girls like math. Essentialist tendencies allow children to carve the world into “piecewise approximations of reality” (just as the DSM did) and lead them to stereotype and limit themselves. Even though, unlike computers, humans can accommodate contradictions and nuances, the intrinsic biases of our early taxonomies remain with us. Our received categories work well enough on a day-to-day basis, but in the end nearly every categorical schema is wrong, or more precisely, incomplete and approximate. But we need categories in order to reckon with the overwhelming amount of information we have to process. As Gelman puts it:
Essentialism is not strictly a childhood construction. It is a framework for organizing our knowledge of the world, a framework that persists throughout life.
I wanted my daughter to be free of categorical restrictions. I wanted her to make informed choices, whether gender-typical or not. I wanted her to have no preconceptions of what people of a particular race were like. But my idea of a purely rational choice of her self-definition was a fantasy. Children don’t start from nothing; they learn by experimenting. I had to let my daughter keep her preconceptions and hope that she would, over time, learn to see beyond them.
I avoided arguing with my daughter over her tastes. I didn’t tell her she couldn’t have a Barbie book she took a liking to. I didn’t stop her from dressing up as Elsa. (I did tell her that the My Little Pony soundtrack made me want to go Old Yeller on Twilight Sparkle and friends.*1) Instead, I tried to steer her toward more eclectic and obscure material, far from any mainstream. And I tried to flood her with as much material as I could to put her in mind of the multiplicity of narratives and taxonomies that existed. I read her Ellen Raskin’s and Daniel Pinkwater’s eccentric children’s books alongside Amulet and Elephant and Piggie. I showed her Animaniacs and Oh! Edo Rocket alongside My Little Pony and Moana. I took her to rock climbing and ballet. I had no particular agenda other than the rejection of others’ agendas. I wanted her to see that no one set of categories was trustworthy. I hoped that in seeing how all these taxonomies, schemata, and narratives were incompatible with one another, she would feel less bound to any single one. I couldn’t program her, but I could perplex her.
The Child as Network
[A child] begins to find that what these people about him say is the very best evidence of fact. So much so, that testimony is even a stronger mark of fact than the facts themselves, or rather than what must now be thought of as the appearances themselves….Thus, he becomes aware of ignorance.
—CHARLES SANDERS PEIRCE
It’s impossible to debug a child because it’s impossible to reset a life.
Programming is an iterative process. When I wrote software, I would write, test, and debug my code. After fixing a bug, I would recompile the code and start it again in its uncorrupted state, before the next bug emerged. The idea of initial conditions—the ability to restart as many times as you like—is integral to software development and to algorithms. An algorithmic recipe presumes a set of initial conditions and inputs. When an algorithm terminates, only the outputs remain. The algorithmic process itself comes to an end.*2 Every time an algorithm runs, it starts with virgin conditions that vary only with regard to the set of inputs. Colloquially, we can call this the reset button.
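To make the contrast concrete, here is a minimal sketch in Python—my own illustration, not anything drawn from the software described here—of the difference between an algorithm that gets the reset button on every run and a persistent system whose state is never cleared:

def algorithm(inputs):
    # A classical algorithm: fresh initial conditions on every run,
    # and only the output survives its termination.
    total = 0
    for x in inputs:
        total += x
    return total

class PersistentSystem:
    # A long-running system: every interaction mutates state that is
    # carried forward; there is no reset short of destroying the system.
    def __init__(self):
        self.state = []

    def interact(self, x):
        self.state.append(x)      # the past is never cleared
        return sum(self.state)    # the output depends on the whole history

print(algorithm([1, 2, 3]))   # 6, and 6 again on every rerun: the reset button
print(algorithm([1, 2, 3]))   # identical inputs, identical result

system = PersistentSystem()
print(system.interact(1))     # 1
print(system.interact(2))     # 3—the earlier input still shapes the result

The function can be rerun from scratch indefinitely; the object can only go forward, dragging its accumulated state along with it.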
The scientific process depends on the reset button: the ability to conduct an experiment multiple times from identical starting conditions. Where precisely identical starting conditions are impossible—whether in the study of distant stars, of extremely rare circumstances, or of endlessly varied human beings—the goal is to make the initial conditions as close to identical as possible in every relevant respect.
But I cannot reset a human being. A child is not an algorithm. It is a persistent, evolving system. Software too is becoming a persistent system. Algorithms themselves may remain static, but they are increasingly acting on large, persistent systems that are now as important to computing as the algorithms themselves. The names of these systems include Google, Amazon, Facebook, and Twitter. These companies write software, but the products they create are systems or networks. While Microsoft had to carry over a fair amount of code from one version of Windows to the next to ensure backward compatibility, each version of Windows was a discrete program. Every time a user started up Windows, the memory of the computer was cleared and reassembled from scratch, based on the state that had been saved to disk. If Windows got into a strange state and stopped behaving well, I could reboot and, more often than not, the problem fixed itself. In the worst cases, I could reinstall Windows and have a completely fresh start.
That’s not possible with systems. Constituent pieces of Google’s search engine are replaced, rebooted, and subject to constant failures, but the overall system must be up all the time. There is no restarting from scratch. Google, Amazon, and Facebook are less valuable for their algorithms than for their state: the sum total of all the data the system contains and manipulates. None of these companies can clear out their systems and “start over,” algorithmically. And neither can a child. A child starts up at birth, and her internal mechanisms produce a persistent, mutable system that is the child’s body, mind, and personality. There are algorithmic, information-carrying processes that exist within our bodies, chief among them the coding, replication, modification, and transmission of DNA (and RNA), and the sheer clarity of our DNA operations stands in stark contrast to the messiness and apparent aimlessness of our daily lives. One of the great appeals of evolutionary psychology, the field that gave us the selfish gene and the analogous idea of a meme (a cultural idea that evolves), is that it hints that there may be a single, overriding goal driving all our biological and cultural doings: passing on our genes. Our lives are ephemeral one-shot processes, but our genes promulgate themselves to live on further. Physicist Juan Roederer has proposed that any purposeful activity, be it the copying of DNA, an algorithm calculating my taxes, or my telling my daughter to go to bed, relies on the system in question being able to be reset, and that the existence of such resettable systems is a defining characteristic of biological life itself.*3 In other words, it’s the ability to start from scratch (or simulate starting over from scratch) that enables organisms to plan and execute, rather than dumbly follow the laws of nature like a rock or a planet. But even if the propagation of double helixes is the original goal of our existence, we still lack any greater understanding of the functioning of so much of the biology and culture that has built up around that goal.
In other words, what do our bodies and selves and societies do when they’re not propagating our genes—or trying to get into a situation of propagating them? Some clues may lie in thinking about what happens to software programs when we don’t shut them down and restart them, but let them linger on and evolve.
An algorithm is a finite, linear set of instructions that operates on a set of inputs to generate a particular output. Algorithmic systems like Google, Facebook, Amazon, and Twitter create a persistent system (or network) that modifies its behavior over time, in response to how it is used. In essence, these systems rely on feedback: their outputs affect the environment in which these systems exist, and the systemic environment—its users and also other algorithmic systems like it—provides new inputs that change the system further. Mathematician Norbert Wiener called such feedback systems cybernetic, drawing on the Greek word kybernan (κυβερνᾶν, meaning “to steer” in the sense of control, piloting, and governance). Algorithms establish and maintain these systems, but they can’t predict how a system will behave at a given point in time. For that, one must know the ongoing state of the system.*4 The result is an evolving ecosystem. Programmers can code, debug, and fix algorithms, but we acquire, train, and condition a network by having algorithms operate on it. For a child, which is also a kind of evolving ecosystem, these algorithms include intrinsic biological mechanisms, the physical effects of its surrounding environment, and other living creatures—for example, parents who may wish they could reset their child’s emotional valences, as I did on hearing my four-year-old sing this song when I asked her what she was sad about:
So many sad things, I can’t even tell you.
They are all squished up into a ball.
Squished into a ball.
And sometimes things fall off the ball
and they go into the trash.
And I really really really love TV
and I hope I can watch it tomorrow morning.
I can’t pull sadness out of my daughter’s brain. So I let her watch TV, and hope it ameliorates the ball of sadness.
There are portions of a system that may be resettable—we can blank our Facebook profiles or return our immune systems to rough homeostasis—but the overall system has an ongoing, linear continuity. And indeed, we now speak of resetting aspects of the human mind, particularly when it comes to trauma and addiction. The rhetoric around Eye Movement Desensitization and Reprocessing therapy, an unorthodox method that has shown promise in treating trauma and phobias, speaks not only of desensitization but of returning the mind to equilibrium, processing and resolving a bug in the system. The system doesn’t stop, nor does it return to a virgin state, but we hope that the network that makes up the human mind can be repaired on the fly.
“On the fly” is also the term used for modifying and fixing a computer program as it runs, without stopping and restarting it. While components of Google’s and Facebook’s networks are constantly shut down, modified, and restarted, the entire system persists and evolves. We dream of reset buttons for the soul and self, of ridding ourselves of addictions, phobias, bad habits, and the miscellaneous accumulated burdens of our lives. Now that we understand many mental illnesses to be neurochemical rather than cognitive, medicine aims to adjust the apportioning of serotonin and dopamine in order to correct imbalances. People speak of the shamanic psychedelic drug ayahuasca as a sort of bleach for the mind, washing out layers of sediment that have clogged the functioning of the brain and provoking imaginative powers beyond anything we might estimate. Psychologist Benny Shanon writes, “It may very well be that [the ayahuasca experience] is the creative ability of the mind but, if so, the mind’s ability to create surpasses anything we cognitive scientists ever think of.” One of the appeals of MDMA, both to recreational users and to some physicians, lies in its seeming ability to turn down the amygdala’s generation of negative emotion, an effect that has shown some promise for treating social anxiety in autistic patients: “MDMA administration acutely decreases activity in the left amygdala, a brain region involved in the interpretation of negative cues, and attenuates amygdalar response and emotional reactivity to angry faces.”
These treatments are not precise algorithms, but blunt hammers applied to shake up the mind’s innards in the hopes of producing a desired effect. The mind is a network over which we seek to gain control, and yet we find that our ability to affect it is as clumsy and indirect as the ability of a pinball player to affect a pinball machine by pushing buttons and shoving the housing. Likewise, as computer networks grow in complexity and endure over greater lengths of time, our degree of direct control over them diminishes.
In sum, an algorithmic system or algorithmic network is a persistent agent produced by an algorithm, situated in a responsive environment. Once a network is in play, evolving over time and never reset to its initial state, it gains a complex existence independent of the algorithms that produced it, just as our bodies and minds gain a complex existence independent of the DNA that spawned them. These independent systems are not coded. Rather, they are trained, and they learn.
This means that these networks are not fundamentally algorithmic. Rather, they are systems that grow and evolve over time, and they are systems that cannot be wholly reset, for to do so would be to return the system to its starting point of ignorance and inexperience. The process of creating artificial intelligence is coming to seem less a matter of coding up algorithms and more of applying algorithms to a growing system, like pouring water on a plant—or like educating a child. Systems like Google and Facebook are the first genuine digital children.
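The shape of that idea fits in a few lines of Python. What follows is a toy of my own devising, nothing like the machinery of any real network, but it shows the division of labor: the update rule is a small, fixed algorithm, while the system’s actual behavior lives in the state it has accumulated—state that, if wiped, would return it to ignorance and inexperience.

class TrainedSystem:
    def __init__(self):
        self.weights = {}            # persistent state: everything it has learned

    def respond(self, item):
        # Behavior is read out of accumulated state, not out of the code.
        return self.weights.get(item, 0.0)

    def feedback(self, item, signal):
        # The fixed algorithm: nudge the state in the direction of the feedback.
        self.weights[item] = self.weights.get(item, 0.0) + 0.1 * signal

net = TrainedSystem()
for _ in range(20):
    net.feedback("cat videos", +1)   # the environment keeps responding
net.feedback("spam", -1)

print(net.respond("cat videos"))     # high: learned, never explicitly coded
print(net.respond("spam"))           # low
print(net.respond("news"))           # 0.0: no experience, no opinion

Nothing in the code says that cat videos are good or that spam is bad; that judgment exists only in the trained state, built up one feedback signal at a time.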
Neural network pioneer Warren McCulloch wrote in 1951 that the distinction between machines and humans was that humans’ minds reacted and adapted to their environment with the purpose of thriving in it in a multiplicity of ways:
Why is the mind in the head? Because there, and only there, are hosts of possible connections to be formed as time and circumstance demand. Each new connection serves to set the stage for others yet to come and better fitted to adapt us to the world, for through the cortex pass the greatest inverse feedbacks whose function is the purposive life of the human intellect. The joy of creating ideals, new and eternal, in and of a world, old and temporal, robots have it not.