After dozens of pitch meetings, she managed to secure enough funding from well-known investors like Slow Ventures (which counts Evernote, Dropbox, and Pinterest in its portfolio) and Scott and Cyan Banister (the husband-and-wife team that sits on the boards of PayPal and Postmates) to bring Moxxly to market. “These other large [pump] companies have this relationship with insurance and health providers that is baked in, and they’re not incentivized to be more human-centered or focused on the modern woman,” says Enrique Allen of Designer Fund, which invested in an early round. “We often look for market opportunity where there are big, sleepy incumbents who don’t have a mobile strategy or a way to reach a new audience in the way that Moxxly does.”
But, Allen admits, his lactation learning curve was steep. “I didn’t realize nipples could get so sore,” he says. “I definitely have more empathy now.”
Other companies have seen the market opening as well, and Silicon Valley is finally—in industry parlance—disrupting the hell out of the space. The Willow, set to come out later this year, is a wearable pump that fits entirely inside a bra. It looks so sleek and Jobsian, you’d almost expect it to play Drake when you press the power button. The Naya pump uses water-based suction instead of air, which the company claims is quieter and more comfortable. Babyation, coming at the end of this year, is designed so the collection bottles don’t hang off the breasts.
For all their improvements, however, the new pumps don’t come cheap. At under $100, the Moxxly Flow attachment is one of the least expensive options. The Naya retails for a whopping $999, while the Willow, whose price is still being decided, is in the ballpark of $400 for the pump, which doesn’t include the cost of each single-use collection bag. So until insurance companies embrace pump 2.0, many women will have to stick with the free, bovine-esque milkers.
The Moxxly offices are located in an old factory in the Bayview neighborhood of San Francisco. Next to a 3-D printer and some special rulers that measure nipple and areola size is a cardboard box of “old test boobs.” It’s the domain of Jake Kurzrock, a mechanical engineer with a handlebar mustache who, in a past life, worked in fungal genes and fertilizer.
“First, we were using a stress boob for testing,” he said, pulling out a novelty stress ball colored to look like a Caucasian breast, with a baby bottle nipple crammed in the middle. Next to it sat an African American companion stress breast. “We tried these cutlets, too,” he said, slapping some mangled silicone bra inserts onto a worktable, invoking a Hannibal Lecter–ish experiment gone wrong. “Then I just went on Amazon and got a breast used by cross-dressers. You can get really nice ones, like hundreds of dollars. We got the $40 one. It worked fine.” He used it to make a mold for breasts they’d eventually affix to a dress form named Seraphina.
Several factors come into play when constructing a piece of hardware that needs to work across varying body types. There’s the fact that nipple shape and size vary by ethnicity (cocktail party factoid: Asian nipples tend to be longer). Plus, the rate and angle at which milk flows through the breasts differ for every individual (the Moxxly team found that, for modeling purposes, warm skim milk best mimics human milk). All calculations then have to account for, as one Google spreadsheet put it, “possible sagginess.” The result of all this research? “It makes boobs less fun,” Kurzrock said.
The women at the coffee shop, who’d collectively pumped enough ultralocal milk for several lattes, would likely disagree. As Sladden threw her bottles into her boob-print tote, one of the other women recalled the day she unthinkingly wore a shift dress and was forced to get virtually naked for her pumping sessions. Another remembered being on a work camping trip, topless and sliding down in her car’s front seat, away from the prying eyes of coworkers who meandered outside. Those humiliating pumping days—for this particular set of women, at least—seemed to be behind them.
JOHN LANCHESTER
The Case Against Civilization
from The New Yorker
Science and technology: we tend to think of them as siblings, perhaps even as twins, as parts of STEM (for “science, technology, engineering, and mathematics”). When it comes to the shiniest wonders of the modern world—as the supercomputers in our pockets communicate with satellites—science and technology are indeed hand in glove. For much of human history, though, technology had nothing to do with science. Many of our most significant inventions are pure tools, with no scientific method behind them. Wheels and wells, cranks and mills and gears and ships’ masts, clocks and rudders and crop rotation: all have been crucial to human and economic development, and none historically had any connection with what we think of today as science. Some of the most important things we use every day were invented long before the adoption of the scientific method. I love my laptop and my iPhone and my Echo and my GPS, but the piece of technology I would be most reluctant to give up, the one that changed my life from the first day I used it, and that I’m still reliant on every waking hour—am reliant on right now, as I sit typing—dates from the thirteenth century: my glasses. Soap prevented more deaths than penicillin. That’s technology, not science.
In Against the Grain: A Deep History of the Earliest States, James C. Scott, a professor of political science at Yale, presents a plausible contender for the most important piece of technology in the history of man. It is a technology so old that it predates Homo sapiens and instead should be credited to our ancestor Homo erectus. That technology is fire. We have used it in two crucial, defining ways. The first and the most obvious of these is cooking. As Richard Wrangham has argued in his book Catching Fire, our ability to cook allows us to extract more energy from the food we eat, and also to eat a far wider range of foods. Our closest animal relative, the chimpanzee, has a colon three times as large as ours, because its diet of raw food is so much harder to digest. The extra caloric value we get from cooked food allowed us to develop our big brains, which absorb roughly a fifth of the energy we consume, as opposed to less than a tenth for most mammals’ brains. That difference is what has made us the dominant species on the planet.
The other reason fire was central to our history is less obvious to contemporary eyes: we used it to adapt the landscape around us to our purposes. Hunter-gatherers would set fires as they moved, to clear terrain and make it ready for fast-growing, prey-attracting new plants. They would also drive animals with fire. They used this technology so much that, Scott thinks, we should date the human-dominated phase of Earth, the so-called Anthropocene, from the time our forebears mastered this new tool.
We don’t give the technology of fire enough credit, Scott suggests, because we don’t give our ancestors much credit for their ingenuity over the long period—95 percent of human history—during which most of our species were hunter-gatherers. “Why human fire as landscape architecture doesn’t register as it ought to in our historical accounts is perhaps that its effects were spread over hundreds of millennia and were accomplished by ‘precivilized’ peoples also known as ‘savages,’” Scott writes. To demonstrate the significance of fire, he points to what we’ve found in certain caves in southern Africa. The oldest, deepest strata of the caves contain whole skeletons of carnivores and many chewed-up bone fragments of the things they were eating, including us. Then comes the layer from when we discovered fire, and ownership of the caves switches: the human skeletons are whole, and the carnivores are bone fragments. Fire is the difference between eating lunch and being lunch.
Anatomically modern humans have been around for roughly 200,000 years. For most of that time, we lived as hunter-gatherers. Then, about 12,000 years ago, came what is generally agreed to be the definitive before-and-after moment in our ascent to planetary dominance: the Neolithic Revolution. This was our adoption of, to use Scott’s word, a “package” of agricultural innovations, notably the domestication of animals such as the cow and the pig, and the transition from hunting and gathering to planting and cultivating crops. The most important of these crops have been the cereals—wheat, barley, rice, and maize—that remain the staples of humanity’s diet. Cereals allowed population growth and the birth of cities, and, hence, the development of states and the rise of complex societies.
The story told in Against the Grain heavily revises this widely held account. Scott’s specialty is not early human history. His work has focused on a skeptical, peasant’s-eye view of state formation; the trajectory of his interests can be traced in the titles of his books, from The Moral Economy of the Peasant to The Art of Not Being Governed. His best-known book, Seeing Like a State, has become a touchstone for political scientists, and amounts to a blistering critique of central planning and “high modernism,” the idea that officials at the center of a state know better than the people they are governing. Scott argues that a state’s interests and the interests of subjects are often not just different but opposite. Stalin’s project of farm collectivization “served well enough as a means whereby the state could determine cropping patterns, fix real rural wages, appropriate a large share of whatever grain was produced, and politically emasculate the countryside”; it also killed many millions of peasants.
Scott’s new book extends these ideas into the deep past, and draws on existing research to argue that ours is not a story of linear progress, that the timeline is much more complicated, and that the causal sequences of the standard version are wrong. He focuses his account on Mesopotamia—roughly speaking, modern-day Iraq—because it is “the heartland of the first ‘pristine’ states in the world,” the term pristine here meaning that these states bore no watermark from earlier settlements and were the first time any such social organizations had existed. They were the first states to have written records, and they became a template for other states in the Near East and in Egypt, making them doubly relevant to later history.
The big news to emerge from recent archeological research concerns the time lag between “sedentism,” or living in settled communities, and the adoption of agriculture. Previous scholarship held that the invention of agriculture made sedentism possible. The evidence shows that this isn’t true: there’s an enormous gap—4,000 years—separating the “two key domestications,” of animals and cereals, from the first agrarian economies based on them. Our ancestors evidently took a good, hard look at the possibility of agriculture before deciding to adopt this new way of life. They were able to think it over for so long because the life they lived was remarkably abundant. Like the early civilization of China in the Yellow River Valley, Mesopotamia was a wetland territory, as its name (“between the rivers”) suggests. In the Neolithic period, Mesopotamia was a delta wetland, where the sea came many miles inland from its current shore.
This was a generous landscape for humans, offering fish and the animals that preyed on them, fertile soil left behind by regular flooding, migratory birds, and migratory prey traveling near river routes. The first settled communities were established here because the land offered such a diverse web of food sources. If one year a food source failed, another would still be present. The archeology shows, then, that the “Neolithic package” of domestication and agriculture did not lead to settled communities, the ancestors of our modern towns and cities and states. Those communities had been around for thousands of years, living in the bountiful conditions of the wetlands, before humanity committed to intensive agriculture. Reliance on a single, densely planted cereal crop was much riskier, and it’s no wonder people took a few millennia to make the change.
So why did our ancestors switch from this complex web of food supplies to the concentrated production of single crops? We don’t know, although Scott speculates that climatic stress may have been involved. Two things, however, are clear. The first is that, for thousands of years, the agricultural revolution was, for most of the people living through it, a disaster. The fossil record shows that life for agriculturalists was harder than it had been for hunter-gatherers. Their bones show evidence of dietary stress: they were shorter, they were sicker, their mortality rates were higher. Living in close proximity to domesticated animals led to diseases that crossed the species barrier, wreaking havoc in the densely settled communities. Scott calls them not towns but “late-Neolithic multispecies resettlement camps.” Who would choose to live in one of those? Jared Diamond called the Neolithic Revolution “the worst mistake in human history.” The startling thing about this claim is that, among historians of the era, it isn’t very controversial.
The other conclusion we can draw from the evidence, Scott says, is that there is a crucial, direct link between the cultivation of cereal crops and the birth of the first states. It’s not that cereal grains were humankind’s only staples; it’s just that they were the only ones that encouraged the formation of states. “History records no cassava states, no sago, yam, taro, plantain, breadfruit or sweet potato states,” he writes. What was so special about grains? The answer will make sense to anyone who has ever filled out a Form 1040: grain, unlike other crops, is easy to tax. Some crops (potatoes, sweet potatoes, cassava) are buried and so can be hidden from the tax collector, and, even if discovered, they must be dug up individually and laboriously. Other crops (notably, legumes) ripen at different intervals, or yield harvests throughout a growing season rather than along a fixed trajectory of unripe to ripe—in other words, the taxman can’t come once and get his proper due. Only grains are, in Scott’s words, “visible, divisible, assessable, storable, transportable, and ‘rationable.’” Other crops have some of these advantages, but only cereal grains have them all, and so grain became “the main food starch, the unit of taxation in kind, and the basis for a hegemonic agrarian calendar.” The taxman can come, assess the fields, set a level of tax, then come back and make sure he’s got his share of the harvest.
It was the ability to tax and to extract a surplus from the produce of agriculture that, in Scott’s account, led to the birth of the state, and also to the creation of complex societies with hierarchies, division of labor, specialist jobs (soldier, priest, servant, administrator), and an elite presiding over them. Because the new states required huge amounts of manual work to irrigate the cereal crops, they also required forms of forced labor, including slavery; because the easiest way to find slaves was to capture them, the states had a new propensity for waging war. Some of the earliest images in human history, from the first Mesopotamian states, are of slaves being marched along in neck shackles. Add this to the frequent epidemics and the general ill health of early settled communities and it is not hard to see why the latest consensus is that the Neolithic Revolution was a disaster for most of the people who lived through it.
War, slavery, rule by elites—all were made easier by another new technology of control: writing. “It is virtually impossible to conceive of even the earliest states without a systematic technology of numerical record keeping,” Scott maintains. All the good things we associate with writing—its use for culture and entertainment and communication and collective memory—were some distance in the future. For half a thousand years after its invention, in Mesopotamia, writing was used exclusively for bookkeeping: “the massive effort through a system of notation to make a society, its manpower, and its production legible to its rulers and temple officials, and to extract grain and labor from it.” Early tablets consist of “lists, lists, and lists,” Scott says, and the subjects of that record-keeping are, in order of frequency, “barley (as rations and taxes), war captives, male and female slaves.” Walter Benjamin, the great German Jewish cultural critic, who committed suicide while trying to escape Nazi-controlled Europe, said that “there is no document of civilization which is not at the same time a document of barbarism.” He meant that every complicated and beautiful thing humanity ever made has, if you look at it long enough, a shadow, a history of oppression. As a matter of plain historical fact, that seems right. It was a long and traumatic journey from the invention of writing to your book club’s discussion of Jodi Picoult’s latest.
We need to rethink, accordingly, what we mean when we talk about ancient “dark ages.” Scott’s question is trenchant: “‘dark’ for whom and in what respects”? The historical record shows that early cities and states were prone to sudden implosion. “Over the roughly five millennia of sporadic sedentism before states (seven millennia if we include preagriculture sedentism in Japan and the Ukraine),” he writes, “archaeologists have recorded hundreds of locations that were settled, then abandoned, perhaps resettled, and then again abandoned.” These events are usually spoken of as “collapses,” but Scott invites us to scrutinize that term, too. When states collapse, fancy buildings stop being built, the elites no longer run things, written records stop being kept, and the mass of the population goes to live somewhere else. Is that a collapse, in terms of living standards, for most people? Human beings mainly lived outside the purview of states until—by Scott’s reckoning—about the year 1600 A.D. Until that date, marking the last two-tenths of one percent of humanity’s political life, “much of the world’s population might never have met that hallmark of the state: a tax collector.”
The question of what it was like to live outside the settled culture of a state is therefore an important one for the overall assessment of human history. If that life was, as Thomas Hobbes described it, “nasty, brutish, and short,” this is a vital piece of information for drawing up the account of how we got to be who we are. In essence, human history would become a straightforward story of progress: most of us were miserable most of the time, we developed civilization, everything got better. If most of us weren’t miserable most of the time, the arrival of civilization is a more ambiguous event. In one column of the ledger, we would have the development of a complex material culture permitting the glories of modern science and medicine and the accumulated wonders of art. In the other column, we would have the less good stuff, such as plague, war, slavery, social stratification, rule by mercilessly appropriating elites, and Simon Cowell.