Archaeology from Space


by Sarah Parcak


  In tech evolution, 30 years is not very long. Remember that before the early 1990s, almost no one had heard of the internet. A century ago, the telephone had just started to appear in very wealthy people’s homes, and now, some 2.53 billion people own smartphones.3

  Believe it or not, inventors pore over science fiction novels hoping to glean insights into the next big thing that can make them billions.4 Employing science fiction to envision the future of archaeological exploration could be equally rewarding.

  For example, the idea of mapping and excavating an entire site in an hour—or at least the parts of a site that fit together well enough to extrapolate the whole puzzle—is at present the craziest idea of all. A single archaeological team might work at a site for more than 40 years, perhaps the director’s entire career, and they would barely scratch the surface.

  Let’s do some math, because math is fun. A 500-by-500-meter mound, 8 meters in height—not considering what’s below the current ground surface—leaves us with a general site volume of 2 million cubic meters. In a single season, an archaeologist and his or her local dig crew could excavate a 10-by-10-meter unit, going down 3.5 meters over the course of two months. This gives us a volume of 350 cubic meters.

  Four units that size might be excavated at one site in a season, so 1,400 cubic meters. Over 40 years, assuming a standard dig rate, we would see 56,000 cubic meters excavated, or just under 3 percent of the site. This would produce yearly excavation reports, hundreds of articles, several dozen doctorates, and academic books aplenty.

  So, to get to 100 percent, we multiply 40 years by 33. Which gives us 1,320 years of excavation, per site, to understand it fully. Even with year-round excavation, we then need to take into account intensive lab work, analysis, and publication preparation, which easily amount to four to eight additional months for each month in the field.
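
  If you would like to check that arithmetic, here is the same back-of-envelope calculation as a few lines of Python. The figures are exactly the ones quoted above; the small gap between the exact answer and the 1,320-year figure comes from rounding 2.8 percent up to 3 before multiplying.

```python
# Back-of-envelope dig arithmetic using the figures quoted above.
site_volume = 500 * 500 * 8           # mound footprint times height: 2,000,000 cubic meters
unit_volume = 10 * 10 * 3.5           # one 10-by-10-meter unit dug 3.5 meters deep: 350 cubic meters
season_volume = 4 * unit_volume       # four units per season: 1,400 cubic meters
career_volume = 40 * season_volume    # forty years of digging: 56,000 cubic meters

print(career_volume / site_volume)    # 0.028 -> just under 3 percent of the site
print(site_volume / season_volume)    # about 1,429 seasons; rounding 2.8% up to 3% gives roughly 1,320
```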

  Now, multiply that times x number of sites in a region.

  Despite all the advancements of our golden age of archaeology, it’s impossible. And here’s another blast of reality: archaeologists may not have the chance to work on a site for more than a few seasons. Funding and permitting issues that we’ve discussed aside, career changes inevitably cause directors to leave for other sites. Staying site-monogamous takes devotion, mainly for reasons of time commitment. I am going steady with Lisht, my favorite site in Egypt, but I keep finding other sites to map. I don’t know what that says about me, and maybe it’s better if I never find out.

  Where We Are Now

  Understanding a site in an hour is obviously closer to fantasy than to sci-fi. But comparing the status quo with the magic toy box the fictional Robbie has at his disposal in 2119 may tell us how close we are to self-driving miniature drones for mapping, surveying, and 3-D reconstruction.

  Already, remote-sensing tools are hosted on anything we can get off the ground, from satellites to helicopters and drones—also known as unmanned aerial vehicles, or UAVs. Typical drones used for archaeology measure about 50 centimeters in diameter, but the technology is getting smaller and smaller. You can now buy a palm-size drone as a toy and a novelty mini-drone the diameter of a soda can.5 Some even carry cameras.6

  Prior to 2015, the heavier the remote-sensing payload, the larger the drone, and sometimes you had to rely on an airplane or helicopter. Now a standard drone can easily lift a LIDAR system and a thermal infrared or hyperspectral camera, not unlike Robbie’s red bots that he used for his initial survey.7 All of these technologies have miniaturized drastically in the past decade: a good-quality thermal infrared camera is now the same size as a smartphone. Having fully miniaturized versions on palm-size drones a hundred years from now suddenly seems less fantastical.

  Each system could theoretically map subsurface features, site activity areas, topography, and relic river courses by 2119.

  Hyperspectral Imaging

  Something called hyperspectral imaging is an exciting new frontier for archaeological remote sensing. Instead of the standard four to eight bands of visible and near-infrared data that I’ve spent a lot of this book discussing, hyperspectral imagery can provide hundreds of bands of data, giving clues about the chemical composition of the terrain.8 It’s like going from 8 colors on your computer screen to 256: you’d be able to see far more details and subtleties in your photos.

  A handheld spectrometer,9 a machine the size of a standard high school microscope, can measure the spectral signature of any material based on its chemical makeup. Geologists use them to detect tiny differences in geological strata,10 but they are fairly new to archaeologists, and we have not yet fully exploited their capabilities. The first step would be to build up signature databases of archaeological sites and zones for comparative purposes.
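
  To make the idea of a signature database concrete, here is a minimal sketch, not any particular team's pipeline, of how a reading from a handheld spectrometer might be matched against reference spectra. The reflectance numbers and material names are invented for illustration; the comparison uses the standard spectral angle measure from hyperspectral work.

```python
import numpy as np

def spectral_angle(sig_a, sig_b):
    """Spectral angle (radians) between two reflectance spectra.

    Smaller angles mean more similar materials; this is the classic
    spectral angle mapper (SAM) comparison used in hyperspectral analysis.
    """
    a, b = np.asarray(sig_a, float), np.asarray(sig_b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def best_match(measured, library):
    """Return the library entry whose spectrum is closest to the field measurement."""
    return min(library, key=lambda name: spectral_angle(measured, library[name]))

# Hypothetical reference library: per-band reflectance of known materials.
library = {
    "mud brick":  [0.12, 0.18, 0.25, 0.31, 0.33],
    "limestone":  [0.40, 0.45, 0.48, 0.50, 0.51],
    "plain soil": [0.10, 0.14, 0.17, 0.20, 0.22],
}
field_reading = [0.11, 0.17, 0.24, 0.30, 0.32]   # from a handheld spectrometer
print(best_match(field_reading, library))        # -> "mud brick"
```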

  We already know that everything on the Earth’s surface has its own distinct chemical signature. As buried features on sites degrade, they release tiny pieces of building materials that slowly mix with the strata above. While this may not be visible to the naked eye, we can map these changes—enhanced by rainfall—using infrared data. That allows us to locate outlines of mud-brick buildings or settlement foundations.11 In the case of buried stone features, we can use mid-infrared data to make them more apparent.

  Hyperspectral data can also allow archaeologists to identify distinct activity areas on archaeological sites. Ceramic or metal production, for instance, requires burning at high temperatures and leaves clear chemical residues that indicate an industrial zone. The high bone content of cemeteries may change the mineral content of the soil and produce fragments that can often be observed on top of sites, creating distinct signatures. Each of these different areas would register as clear spikes across the light spectrum and may be far more visible in some bands than others.

  Thermal infrared imaging also offers an exciting new avenue of research for archaeologists. In any city during the hottest parts of the summer, the concrete absorbs heat during the day, and at night, when the temperature is cooler, the heat radiates outward. Urban temperatures during summer nights can be 3 to 4 degrees warmer than in areas that have more trees to shade them, which can make cities literally glow on satellite imagery at night.12 Buried archaeological features respond in similar ways, although the temperature differences are far more subtle.

  Archaeologists have already used thermal infrared cameras to detect underground ritual chambers known as kivas at Chaco Canyon in New Mexico.13 That makes it entirely possible that the same kind of imaging could be used to identify buried tombs in other desert environments—perhaps even in Egypt’s Valley of the Kings, where archaeologists have spent years searching for hidden burials. You just need to make sure you get the imagery from the right time of day, and from the right time of year, to capture the maximum temperature differences.
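
  As a toy illustration of that day-night principle, and not the method used at Chaco Canyon, imagine two co-registered thermal images of the same patch of desert, one from mid-afternoon and one from just before dawn. A buried feature that stores or sheds heat differently from the soil around it will cool at a different rate overnight, and a few lines of code can flag those pixels. The array names and the threshold here are assumptions for the sketch.

```python
import numpy as np

def thermal_anomalies(day_temp, night_temp, threshold=1.5):
    """Flag pixels whose overnight cooling differs from the scene's typical cooling.

    day_temp, night_temp: co-registered 2-D arrays of surface temperature
    (same units, e.g., degrees Celsius) from afternoon and pre-dawn passes.
    threshold: how many degrees of unusual cooling counts as an anomaly.
    """
    cooling = day_temp - night_temp              # how much each pixel cooled overnight
    deviation = cooling - np.median(cooling)     # difference from the typical pixel
    return np.abs(deviation) > threshold         # boolean mask of candidate buried features
```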

  Putting multiple sensors on the same drone is something we’ll see in the next few years, simply because efficiency cries out for it. And efficiency equals money saved, essential in ever-more-squeezed research budgets. Inevitably, more sensors will better target subsurface survey and excavation. Researchers already fly LIDAR systems and hyperspectral cameras on airplanes at the same time,14 and as more technology is miniaturized, a standard rig can include even more.

  Scanning Above and Below

  Let’s talk about Robbie’s green bots, which can scan each site within its surrounding landscape. Sites do not exist in a vacuum. We’ve seen how essential it’s become to know the availability of raw materials and the shifts of rivers or lakes for understanding the rise and fall of communities. For that reason, we survey around ancient sites to locate relic watercourses or sources15 and places where natural resources could have been mined or quarried.

  Today’s key tools are magnetometry, resistivity, and ground-penetrating radar. And these physical tools themselves are honking great bits of kit, involving backbreaking deployment as crews walk hundreds of miles in a standard survey season. But as these technologies improve, the mapping systems, like other tech, will get smaller and the parts lighter. We can hope—and what footsore, stooped magnetometry specialist wouldn’t—that systems could one day be loaded onto self-driving drones just like Robbie’s.

  Today, researchers wait until the end of work to download their device’s data into computers and to process it with software similar to remote-sensing programs. Already, data from these devices can be wirelessly transmitted to computers,16 but it is not widespread practice yet. Assuming subsurface-sensing and transmission technologies continue to be developed, we can probably expect automatic wireless upload, and easily imagine populating data instantly onto 3-D models of the sites to show fully rendered architecture as far beneath the surface as 5 to 8 meters. Just 50 years ago, we had no subsurface remote-sensing machines at all. In a hundred years, the ground that now hides secrets could well be wide open for us, with no digging involved.

  The spectacular 3-D reconstructions delivered by Robbie’s blue bots are also already in their infancy. You’ve seen ultrasonic waves reconstructing the environment in 3-D, but you might not have been aware of it. Bats and dolphins do it naturally, and we’ve finally cottoned on. Driverless cars send out waves to detect objects in their path, making tentative identifications and acting on them accordingly: person = stop, car = accelerate to avoid a collision.17 Scientists have mounted this technology on groups of drones,18 discovering many potential mapping applications. As sensors grow smaller and more sensitive, it’s feasible that they could be delivered underground, via a probe.

  Although our sci-fi scene is asking even more of in-situ scanning with regard to the discovery of the scrolls, major advances in the scanning and revelation of ancient art and writing are already taking shape. Scientists now use lasers to clear soot from tomb walls and reveal stunning paintings.19 Ancient manuscripts can be viewed in infrared light to find which ones are palimpsests—those with layers of words that have been sanded off and written over and are now hidden to the human eye.20

  And phase-contrast X-ray imaging can even peer into the burnt scrolls from the Italian site of Herculaneum, the less famous but even more fascinating cousin of Pompeii, destroyed by the same volcanic eruption of Mount Vesuvius in 79 AD. The tight rolls are too fragile to unwrap and read, but this technology is able to pick out words and letters hidden among the charred papyrus sheets. While the work only represents proof of concept so far, experts are confident that they will soon be able to read entire texts.21

  Machine Learning—a Cutting-Edge Frontier

  So the beginnings of extraordinarily advanced imaging technologies are ready for future development. But we are also getting closer to the connectivity of information that Robbie has at his fingertips, and everything it could contribute to targeting further investigation of a site with pinpoint precision. It’s called machine learning.

  Machine learning, often applied as computer vision, forms an essential part of most computer science programs today. It is the driving force behind things like facial recognition programs. The computer has access to thousands of labeled examples and, via neural networks, compares the pixels of each new image against what it has learned from those examples.

  This kind of software drives many of the apps on your phone—the ones you use to find out what song is playing in your favorite café, or what bird you just photographed.22 Machine learning represents, in some form, a type of augmented reality, in which we use computers to help us sort out signal from noise in our increasingly data-crowded lives.

  Satellite imagery represents the perfect type of data to use for machine learning. It took our team of three nearly six months of combing through large satellite data sets to map all of Egypt’s looting pits. Imagine how much faster we could have worked if we had been able to train a machine to detect potential areas of pits by comparing with known examples, and our job had involved confirming those pits rather than searching across hundreds of thousands of square kilometers ourselves. We probably could have finished Egypt in a week.
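
  A minimal sketch of what that could look like, assuming analysts have already labeled a set of small image patches as “looting pit” or “undisturbed.” The file names and patch size are hypothetical, and a random forest stands in for whatever model a real project would choose.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: small satellite-image patches (say, 32 x 32 pixels)
# that analysts have already labeled as looting pit (1) or undisturbed (0).
patches = np.load("labeled_patches.npy")    # shape: (n_samples, 32, 32)
labels = np.load("patch_labels.npy")        # shape: (n_samples,)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(patches.reshape(len(patches), -1), labels)

# New, unlabeled patches cut from fresh imagery: the model flags the likely pits,
# and a human analyst confirms only those instead of scanning everything by eye.
candidates = np.load("new_patches.npy")
likely_pits = model.predict_proba(candidates.reshape(len(candidates), -1))[:, 1] > 0.8
print(f"{likely_pits.sum()} patches flagged for human review")
```

  The point is the division of labor: the machine does the exhaustive scanning, and people do the confirming.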

  The possibilities for using machine learning to detect previously unknown archaeological sites in satellite data represent the most cutting-edge frontier in my field right now. If we could eliminate featureless areas automatically, we could zoom in on areas of interest that we might otherwise miss with tired eyes. Data scientists have already developed machine-learning algorithms to perform tasks such as searching satellite images of Greece to detect swimming pools that wealthy homeowners have hidden to avoid paying taxes.23

  My whole field is based on using the exact same iterative process on sites across the globe today, comparing similar features elsewhere in the results of surveys and remote sensing to identify the most promising hot spots for excavation in our own sites—we just do it the long way round. Machine learning could speed this up enormously, to better target coring and seismic survey. We can but hope.

  Other applications could give us a similar virtual hand post-excavation. At the conclusion of a season, what consumes the bulk of all archaeologists’ time is pinning down other, already-explored sites that corroborate or explain their own findings, with other occurrences of the same features and objects. It would be nice to have a machine do that for you.

  Search engines like the Google Ngram Viewer24 can already hunt through databases of millions of books to find the first instance of words or patterns of usage. A plagiarism software program using similar search protocols helped an amateur Shakespeare scholar find a book from which the Bard drew major inspiration for his plays.25

  Those same software principles could apply to finding any “like” things, from city plans, buildings, and walls to fragments of mystery artifacts. If the machine knew the material, shape, size, and technology, it could easily find parallels among objects in a database. Such faster-than-thought comparisons would also help generate complete 3-D reconstructions of the site or object, based on more fully excavated examples elsewhere.
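
  Here is a minimal sketch of that kind of “find me the parallels” search, assuming artifacts have already been reduced to simple feature vectors describing material, shape, size, and technology. The vectors and catalog names below are invented for illustration.

```python
import numpy as np

# Hypothetical artifact database: each row encodes one object's material,
# shape, size, and manufacturing technology as numbers.
catalog_ids = ["vessel_0412", "blade_0077", "figurine_0193"]
catalog = np.array([
    [0.9, 0.2, 0.5, 0.1],
    [0.1, 0.8, 0.2, 0.7],
    [0.4, 0.3, 0.9, 0.2],
])

def closest_parallels(query, features, ids, top_k=2):
    """Rank catalog entries by cosine similarity to a mystery artifact."""
    q = query / np.linalg.norm(query)
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    order = np.argsort(f @ q)[::-1][:top_k]
    return [ids[i] for i in order]

mystery_fragment = np.array([0.85, 0.25, 0.45, 0.15])
print(closest_parallels(mystery_fragment, catalog, catalog_ids))  # -> ["vessel_0412", ...]
```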

  Next Year, I Want a Digbot for My Birthday …

  But for the actual digging … we seem to be light-years from tiny digbots that might someday be capable of the excavation and 3-D scanning described in Robbie’s story. Robots and sensors, however, have already become an everyday part of our lives, and advances in robotics are now turning what was once imagination into reality.

  A Massachusetts Institute of Technology spinoff company called Boston Dynamics, for example, has created a series of viral videos of animal-like robots opening doors, walking up stairs, and doing backflips. It’s got to be said, the videos bring the words “I’ll be back” to everyone’s minds, and the robots themselves provoked widespread fear at TED in 2017, when I saw them in action.26

  We might not need to run from our robot overlords just yet. Another viral video of Roomba the housework robot spreading dog poop liberally across an entire house floor27 suggests some gadgets have not quite delivered on their promise. But they soon will.

  If DARPA, the US Defense Advanced Research Projects Agency, can develop tiny robots that zoom through buildings, mimicking insects,28 and robots can explore looting pits in Egypt, like the heavily looted site of El Hibeh in Middle Egypt,29 I can see tiny bots in the future not only doing the actual digging, but also scanning features below the surface in a way that does not disturb the ancient remains. If that seems like a bit of a leap, 3-D high-resolution scans of objects and skeletons are now becoming commonplace in museums and on archaeological sites. And since the digbots are down there, then of course it would make sense for them to take samples for chemical testing and DNA.

  DNA analysis is another revolutionary tool in archaeology that is already developing at a rapid pace. Some of you may have gotten your DNA sampled via 23andMe, Ancestry.com, or National Geographic’s Genographic Project. These efforts have analyzed the DNA of tens of thousands of individuals, and you can even use your results to track the paths your family took out of Africa into a bigger world.30 I found out that I am 3.7 percent Neanderthal and 0.9 percent Denisovan. That means I have Neanderthals as great-to-the-nth-factor grandparents, if you think about it. Maybe that accounts for my thick eyebrows.
  On a shorter, more recent timescale, DNA from long-dead tissues can assist with the reconstruction of complex family trees, as archaeologists discovered comparing the DNA of the royal mummies in Cairo’s Egyptian Museum.31 In the future, as more DNA tests are conducted on ancient and modern people, perhaps by robots like Robbie’s yellow bots, I have no doubt that this will create lineages potentially going back hundreds of thousands of years. We are all cousins, after all. Facilitating ever-larger sample sizes on skeletons at archaeological sites will help tie those populations to location, and to regional and international data sets.

  DNA testing in archaeology has already become so advanced that specific diseases are isolated by sampling ancient people’s dental plaque.32 Recent discoveries also include the skin color of a person whose only remains are a 10,000-year-old skeleton.33 As the field of medicine leaps forward, so, too, will the potential for determining ancient people’s appearances and physical histories.

  The Future Is Here

  In our story, Robbie finishes by reading over an analysis of the site from the computer, which suggests its full history with a high degree of confidence. This might be the part you find the most difficult to accept. Archaeologists need to spend decades honing their archaeological and interpretive skills in order to arrive at middle age, when we can finally make grand, sweeping pronouncements. (I’m just kidding. We start making those in grad school.) A dig director today may write a book about a site after working there for 30 or 40 years, only to see most of her theories disproved by her students 10 years later. Which is all as it should be.

 
