Saturday, December 5, 2020

Cut marks, saw marks, and forensic anthropology

Human skeletal remains are not a common occurrence at forensic scenes but, when present, they can preserve a great deal of information about the events surrounding the deposition of a corpse. Forensic anthropology, the field that specializes in the analysis of human skeletal remains in medico-legal contexts, has garnered a good deal of attention from the popular media in recent years. Most of that attention comes from Fox's drama series Bones, which ran from 2005 to 2017 and followed the exploits of forensic anthropologist Temperance Brennan (a character from the novels of forensic anthropologist Kathy Reichs). Suffice it to say, courses in forensic anthropology are now among the most popular offered in anthropology departments (including at UNCG where I teach), and undergraduate concentrations and graduate programs in forensic anthropology have popped up across the country.

Forensic anthropology encompasses a wide variety of methods and techniques, one of which is forensic taphonomy. Taphonomy, broadly defined, is the study of how organisms transition from living beings to static remains (like skeletons). In a forensic context, we ask, "What happens to a person between death and discovery?" A class of taphonomic data, known as bone surface modifications, or BSMs, is particularly useful for answering this question. Marks on bone surfaces, like carnivore tooth marks, saw marks, insect damage, and the like, can provide critical evidence concerning an individual's death and their body's journey after death. Some insects, for example, are only active on the surface, so if we find damage from such an insect on human bones, we can be sure that the bones were exposed on the surface, at least for some period of time. A few years ago, in a high-profile murder case in North Carolina, where I live, marks on the bones of a human skeleton were used to help identify the saw used to dismember a body.

Last fall, my friend and colleague, Travis Pickering, who has spent the better part of 25 years studying taphonomy, was solicited to write a review on the use of BSMs in forensic anthropology for the journal WIREs Forensic Science. He graciously invited me to collaborate on the undertaking, and he and I spent the last year researching how BSMs were integrated into the forensic sciences and how they've been used in crime scene reconstruction. The result of those efforts is an article titled "Cruel traces: bone surface modifications and their relevance to forensic science." (I wish I could take credit for the clever title, but that was all Travis.) As much work as they are, I enjoy writing review articles that cover such a wide swath of a particular field of inquiry because they force me to explore literature that I otherwise might not even have come into contact with. 

Several key themes emerge in our review, namely that: (1) the study of BSMs developed first in the paleontology and archaeology of the mid- to late-1800s and was later adopted by forensic investigators; (2) ultimately, a BSM's utility for forensics (or any other field) is only as good as our ability to link that BSM with a particular process, and that link can only be established through systematic observations of a process actually producing a specific BSM; (3) the features of the BSM itself are not always enough to provide a positive identification, and contextual information must always be used (e.g., serrated knives and shark teeth create very similar striations on bone surfaces, but if the bones were recovered far inland, a shark origin is much less likely); (4) the standards governing the admissibility of expert scientific testimony in federal courts (the so-called "Daubert" standards) privilege the validity of a scientific methodology over the expertise and experience of any one scientific expert.

As an example, consider a linear striation on the surface of a bone recovered from a crime scene. Now, we might logically interpret that mark to have been produced by a knife. That's all well and good, but lots of things can create a linear mark on a bone: a small, sharp sand grain on the ground, the tooth of a scavenging dog, and so on. How might we know that the knife is, in fact, the source of the mark? Well, for one, we need to conduct an experiment where a knife is observed to create a particular mark on a bone's surface (this can be done with donated human skeletal material or, more commonly, the bones of other mammals like deer or pigs). That way, we know for sure what a knife mark looks like. There are lots of different types of marks, though, and even the same knife can create slightly different marks depending on how it is wielded, so one experiment is typically not enough. However, what if a knife was found alongside the human skeletal material? This sort of contextual information, when considered along with the morphology of the mark itself, can provide additional clues and make our identification that much more probable. When this evidence is presented in court, its admissibility, at least according to federal standards, will have less to do with the experience of the expert witness than with the rigor of the methodology used to identify the BSM. 

Importantly, no methodology can definitively, with 100% confidence, identify the source of a particular BSM. Why not? Well, forensic investigators do not themselves witness a crime: the relevant details must be reconstructed. Hence the phrase "crime scene reconstruction" and, for good reason, its common association with archaeology. Forensic investigators are in many ways like archaeologists: whereas archaeologists use artifacts and architecture to reconstruct ancient cultures, forensic investigators use evidence to reconstruct a crime. This means, though, that we must work with probabilities rather than absolute certainties. When we are able to match up a BSM from a crime scene to an experimentally produced BSM, it becomes highly probable, though not absolutely certain, that the object that created the crime BSM is the same as the object that created the experimental BSM.
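This kind of probabilistic reasoning can be made concrete with a toy Bayesian update. All of the prior and likelihood values below are invented purely for illustration; they are not drawn from any real study or casework:

```python
# Toy Bayesian update for identifying the source of a linear mark on bone.
# All probabilities below are invented for illustration only.

# Prior beliefs about the mark's source, before examining it closely.
priors = {"knife": 0.4, "carnivore tooth": 0.4, "trampling/sand grain": 0.2}

# Likelihood of observing this mark's features (say, a narrow V-shaped
# cross-section with internal microstriations) under each hypothesis,
# as would be estimated from controlled experiments.
likelihoods = {"knife": 0.7, "carnivore tooth": 0.2, "trampling/sand grain": 0.1}

# Bayes' rule: posterior is proportional to prior * likelihood, normalized.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posteriors = {h: p / total for h, p in unnormalized.items()}

for hypothesis, p in sorted(posteriors.items(), key=lambda x: -x[1]):
    print(f"{hypothesis}: {p:.2f}")
```

Note that even with a likelihood heavily favoring the knife, the posterior never reaches 1.0: the identification stays probable, not certain, which mirrors the point above.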

A major hurdle for forensic investigators is that some BSMs can be difficult to tell apart from each other. The marks produced by different types of saws, for instance, share many features. So, too, do the marks produced by different types of knives. With so much overlap, how similar do the little bumps, grooves, and edges of a crime BSM have to be to an experimental BSM to be considered "the same"? What features should we even be looking at, and how do we define them? One analyst's "deep groove" may be another's "V-shaped striation." This sort of inconsistency makes it difficult for BSM analysis to attain Daubert-level methodological rigor. Travis and I conclude with the suggestion that computerized image analysis of BSMs coupled with statistical classification might get us closer to reaching that level of rigor. Either way, it will be interesting to see how the field progresses from here.
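One minimal way to picture what "statistical classification" of BSMs could look like is a nearest-centroid classifier that compares a questioned mark's measurements against experimentally produced marks of known origin. The feature names and all numeric values below are invented for illustration; real studies would use many more variables, often extracted from 3D surface scans:

```python
# Sketch of statistical classification of BSM measurements.
# Features are (groove width in mm, groove depth in mm); all values
# are hypothetical.

def centroid(samples):
    """Mean of each feature across a class's experimental marks."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(mark, training):
    """Assign a mark to the class whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {label: centroid(samples) for label, samples in training.items()}
    return min(centroids, key=lambda label: dist2(mark, centroids[label]))

# Experimentally produced marks, grouped by known source.
training = {
    "hacksaw":      [(0.9, 0.5), (1.0, 0.6), (1.1, 0.55)],
    "crosscut saw": [(1.8, 0.9), (2.0, 1.0), (1.9, 0.95)],
}

unknown = (1.05, 0.58)  # measurements from a hypothetical questioned mark
print(classify(unknown, training))  # classified as "hacksaw"
```

The appeal of such an approach for Daubert purposes is that the decision rule is explicit and repeatable, rather than living in any one analyst's eye.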

Saturday, March 7, 2020

Peccaries and sabertooths

I spent my spring break down in Florida--not at the beach, but at the Florida Museum of Natural History in Gainesville. The draw? Sabertooths. My colleagues and I have long been interested in the behavior of these large cats, largely because they shared the landscape with the early hominins that we study at places like Olduvai Gorge. With the exception of the La Brea Tar Pits in California, there is no better place in the world than Florida to uncover evidence for sabertooths. The state's karstic landscape contains an almost unrivaled collection of well-preserved Pleistocene fossil assemblages, many of which include the skeletal remains of sabertooths. One of these cats, Xenosmilus hodsonae, lived in Florida between about 2.5 and 1.5 million years ago. A nearly complete skeleton, and the species' holotype, was found a few miles west of Gainesville at a site called Haile 21A. We visited the museum's Florida Fossils exhibit to see a cast of the skeleton.

Mounted skeletons in the main hall of the Florida Fossils exhibit. The
Xenosmilus skeleton is just right of center.

Found alongside this important fossil, inside an ancient sinkhole, was a large collection of extinct peccaries. We are interested in this assemblage because it may teach us a great deal about sabertooth feeding behavior. Our working hypothesis is that the peccaries were victims of predation and that Haile 21A itself represents a Xenosmilus den. The fossil assemblage was excavated in the early 1980s and is now housed and curated by the Vertebrate Paleontology section of the Department of Natural History in Dickinson Hall. The collection is expertly managed by Richard Hulbert, Jr., who is one of the leading authorities on the vertebrate paleontology of Florida and a genuinely nice person.

Collections room of the Department of Vertebrate Paleontology in Dickinson
Hall.

My colleagues Manuel Domínguez-Rodrigo, Lucía Cobo Sanchez, Enrique Baquedano, and I were given access not only to the fossil collection, but also to a fully functional lab, a photography rig, and comparative skeletal material.

Our assigned lab space in the Vertebrate Paleontology
Section. 

We spent about eight hours a day for the past week looking through ~1,600 peccary specimens. Now, it's time to analyze the data, so stay tuned...

Friday, December 27, 2019

Bodies, race, and the history of anthropology

Another overdue post from our visit to Spain this past spring...

The Museo Nacional de Antropología in Madrid has a fascinating history. The first museum in Spain to be dedicated to the study of anthropology, it was founded in 1875 as the Museum of Anatomy by Pedro González de Velasco under the patronage of King Alfonso XII. Velasco's anatomical specimens and the ethnographic artifacts collected by the Spanish government from across its then-vast empire formed the museum's original collection. Off to the left of the main entrance lies a small, nondescript room that today displays some of these items as a record of, and a tribute to, the museum's past. For me, the most interesting and, in many ways, most poignant piece in the room was the skeleton of Agustín Luengo Capilla.

Skeleton of Agustín Luengo Capilla
What makes the skeleton so remarkable, and the reason why Velasco acquired it for the museum, is its size: when Capilla died at the tragically young age of 26, he stood 2.35 meters (~8 feet, 3 inches) tall. Known as the "Gigante Extremeño," or "Giant of Extremadura," Capilla earned a living as part of a circus act and, when traveling through or near Madrid in 1875, sought medical attention from Velasco. By that time he was very ill, and on December 31st he died. His mother, Josefa, apparently in gratitude for Velasco's medical care, donated her son's body to the museum for anatomical study.

Capilla's extreme stature is likely explained by pituitary gigantism, a condition caused by the excessive secretion of growth hormone and insulin-like growth factor 1. The disproportionately large mandible, abnormal bone growth around the knee joints, and enlarged hands are all symptoms of acromegaly, which is commonly associated with pituitary pathologies. Capilla is one of only a handful of acromegalic skeletons on display in western museums. Velasco also created a full body cast of Capilla's corpse before it was skeletonized (this is also on display in the museum).

Pelvic girdle of Agustín Luengo Capilla showing pathological proximal
femora (note antero-posteriorly compressed femoral heads) and acetabula.
While the skeletal material is very interesting from a paleopathological perspective, some point out, correctly I believe, how tragic Capilla's life was, and how tragic his afterlife remains. Displayed today as a museum oddity, Capilla's body, one can argue, has no more dignity now than Capilla himself had as a spectacle for gawking circus-goers when he was alive. While Capilla's body was donated, this was not the case for many other acromegalic skeletons, many of which were essentially stolen for inclusion in medical collections.

The room also houses some objects that encapsulate the rather sordid history of biological anthropology, especially its obsession with racial typology.

Die Proportionslehre der Menschlichen Gestalt,
Carl Gustav Carus, 1854. 

References:

Giménez-Roldán, S (2019). The Giant of Extremadura: acromegalic gigantism in the 19th century. Neurosciences and History 6: 38-52.

Tuesday, December 3, 2019

Forensic anthropology and political violence in Guatemala

Between 1960 and 1996, Guatemala experienced a brutal civil war that cost the lives of over 200,000 people. While the roots of the conflict stretch back nearly 500 years to the exploitative policies of the Spanish Empire, it was in 1960, in response to a CIA-backed coup that overthrew the country's democratically elected president, that a rebellion broke out to overthrow a repressive military regime. Among the most tragic features of this conflict was the systematic murder of ethnic Mayans, who were seen as rebel sympathizers.

While we were in Spain in the spring of 2019, my wife and I visited the Museo Nacional de Antropología in Madrid. The museum had an exhibition featuring a series of haunting photographs by Jonathan Moller, who documented the work of forensic anthropologists involved in the recovery, excavation, and identification of skeletal remains from the disappeared. This is taken from the exhibit:
Since 2000 a Guatemalan citizens' movement has sought justice and challenged impunity. The exhumations and subsequent forensic investigations have played a key role in that effort, as they offered survivors the opportunity to expose the truth, providing concrete evidence of the atrocities committed in the war. For many years, the government of Guatemala buried the truth about these killings and massacres, just as they buried hope for a better life in Guatemala. The perpetrators of the crimes have responded to the exhumations by unleashing a new wave of political assassinations and death threats to prevent the truth from coming to light.
The exhumations bring back the pain and horror, but at the same time they impart healing and closure to the surviving families. The survivors are finally reunited with their loved ones, and they are able to mourn and give them a proper burial in the village cemetery, to be at peace with them and with themselves, since many feel guilt for having survived. The exhumations, mourning and re-burials help the survivors recover their dignity.
These photographs were taken between 2000 and 2001, when Jonathan Moller was part of the forensic anthropological team of the Office of Peace and Reconciliation, which worked in the municipality of Santa María Nebaj. They represent a starting point to begin to tell the story of the repression and unspeakable violence suffered by the Mayan peoples of Guatemala, and they can only begin to convey the emotion and intensity of what was experienced at that time. Jonathan witnessed the families' pain and grief as they relived these horrible atrocities, and he also shared their joy and celebration at recovering the remains of their loved ones.
The recovery of historical memory allows us to know the past to understand the present and thus have references to build a just and peaceful society that values above all the life and dignity of people.
 The images are extremely powerful--here is just one that I snapped with the camera on my phone:

A member of the forensic team carefully lifts the remains of two men who
were killed in the violence in the 1980s. Jonathan Moller, Nebaj, 2001.

Just another example of how anthropology matters. You can read more about these efforts at the Fundación de Antropología Forense de Guatemala.

Tuesday, November 26, 2019

Empires across space and time

On Monday evening, October 28th, the North Carolina Humanities Corridor sponsored, and the University of North Carolina at Greensboro hosted, an interdisciplinary discussion on "Empires Across Space and Time." The discussion brought together archaeologists, historians, and literary critics to explore different perspectives on empires and how they construct imperial power and identity.

In her introductory remarks, Jill Bender of UNCG's Department of History emphasized the relevance of empire studies by pointing out that many millions of people have lived under these political entities over time, leaving deep-seated economic, social, and even psychological impressions that reverberate even today. The discussion was split into two sessions of three speakers, each of whom had five minutes to provide their particular perspective on empire.

The first session examined ancient empires. Robyn Le Blanc, an archaeologist and numismatics expert at UNCG, spoke about her work on the Roman Empire. Rome, Jupiter's "empire without end," was, according to Le Blanc, first and foremost an extractive political system that sought to squeeze goods and taxes from its provinces. A sense of belonging to the Roman state was of only secondary importance, especially for a population made up largely of non-citizens. However, even when Roman identity was constructed and/or fostered, it meant very different things to different people. She also emphasized that because our written records were produced for and by the political elite (read: middle-aged men), we must often rely on archaeology and material culture to reveal the lives of non-elites (read: 99.9% of the population). Enter "civic coins," which are coins minted and used in the provinces. While most of these coins portrayed images of their local founding myths, stylistic analysis also reveals how communities outside the Roman heartland of Italy saw themselves within the imperial web. Some coins, for example, used Latin inscriptions to show off their "Roman-ness," while others pictured Roman mythological founding rituals to rework their city's past through a Roman lens. When the empire was at its height, communities were especially active participants in Roman culture, a participation that was reflected through the use of Mediterranean (and, later, Christian) mythology.

The next discussant, my colleague Donna Nash, addressed her research with the Wari Empire, which rose ~600 CE and was South America's first imperial state. She is particularly interested in how expansive polities like the Wari integrated conquered people. One way to identify whether, or to what degree, local peoples adopted aspects of Wari ideology and identity is through the analysis of material culture like art. The appearance of Wari "style" in provincial art may indicate some level of influence by and, thus, integration into, Wari culture. Citing modern Americans' use of Chinese manufactured goods as an example, Nash nevertheless cautioned that simply using the motifs or goods of another culture does not necessarily signal cultural hegemony. Another possible source of data for cultural influence is architecture--there is some evidence that the provinces adopted imperial building layouts.

Malcolm Motley, a student in UNCG's Department of English who also studies the Roman Empire, began his discussion of Roman colonies by reminding us that the source of the word is the Latin colonus, which means "farmer." This highlights the major role of colonies in the empire: providing resources for the state. The goal, then, of the imperial officials was not necessarily to create "little Romes" out of the colonies through forced cultural assimilation but, rather, to extract resources from them. It is tempting, then, to view this relationship as one-sided. Motley pointed out, though, that many colonies actually benefited from their colonial status through access to lucrative trade routes. He argued that many colonies may, in fact, be best understood as communication hubs rather than collections of subjected peoples--a rather different conceptualization of imperial organization.

The audience's questions after this first session raised some interesting issues, too. One thing that was pointed out is that only the elites typically participate fully in imperial culture and view their identity in those terms. Villagers, on the other hand, tend not to do so. This makes sense because elites have the most to gain from assimilating. Someone asked about successful empires, and Nash pointed out that long-lived empires tend to show cultural flexibility (be yourself, we don't care) but maintain economic inflexibility (pay your taxes!).

The second session covered more recent empires, in this case the British Empire. Christopher Hodgkins of UNCG's Department of English began the discussion by exploring Elizabethan-era British literature. The revival and recovery of the Roman/Arthurian Empire (without the trappings of Catholicism or Popery, of course--this was, after all, just after the Reformation) was a consistent theme in the writing of the period.

Perhaps the most interesting story of the evening came from the research of Jill Bender, who is in UNCG's Department of History and focuses on the 19th-century British Empire. What, Bender asked, held the British imperial system together, and what happened when a group refused to participate? Her current project involves the government-assisted migration of poor Irish women from Irish workhouses to the empire's distant colonies where women were badly needed. In 1857, the British government recruited 50 women from workhouses in Cork--many of whom were forced into the workhouses because of the Potato Famine the previous decade--to migrate to South Africa. Days before they were scheduled to depart, however, officials learned that the women did not, in fact, wish to go. Suspecting some sort of coercion, officials questioned the women again, one by one, before an all-male panel. They all still refused to depart, with 47 out of 50 specifically citing religious concerns about the lack of a Catholic priest in the colony. This is the interesting part: the British acquiesced and were forced to find other recruits. This revealed, Bender argued, that the path of the British Empire could be shaped in very real ways by the decisions of individual actors rather than grand imperial strategy. This highlighted a general theme of the second session, namely that history can be told both from the "top-down" (that is, from the perspective of the elites who shaped imperial policy) and from the "bottom-up," through the lived experience of non-elites, especially the oppressed lower classes and the colonized.

Really enjoyable event, and I learned quite a bit.

Saturday, October 12, 2019

Changing the brain through instruction

I attended my second Transforming Online Pedagogy and Practices Symposium (TOPPS) this past May. The event is sponsored and hosted by UNCG's University Teaching and Learning Commons (UTLC). Like the 2018 symposium, this event was well worth it.

This year's theme was "Science and the Art of Changing the Brain through Instruction and Instructional Design," and the UTLC brought in Dr. Kristen Betts as the event's keynote speaker. Dr. Betts is a Clinical Professor in the School of Education at Drexel University and specializes, among other things, in how we can harness brain science to improve teaching. Her presentations, which revolved around the theme "Leveraging Psychology to Create Compelling Learning Experiences," were spread across two morning sessions on May 13th and 14th.

The first session was entitled "Neuropedagogy: Changing the Brain through Instruction and Instructional Design." Dr. Betts first asked us to go through a list of statements about learning and the brain and mark whether we thought the statement was correct or incorrect. Here they are:
  1. Listening to classical music increases reasoning ability. 
  2. Individuals learn better when they receive information in their preferred learning styles (e.g., auditory, visual, kinesthetic) 
  3. Some of us are "left-brained" and some are "right-brained" due to hemispheric dominance, and this helps explain differences in how we learn 
  4. We only use 10% of our brain 
  5. Normal development of the human brain involves the birth and death of brain cells 
  6. Learning is due to the addition of new cells to the brain 
  7. There are critical periods in human development after which certain skills can no longer be learned 
  8. Learning occurs through changes to the connections between brain cells
  9. The left and right hemispheres of the brain work together
  10. Production of new connections in the brain can continue into old age
Now, this screamed "top 10 myths about the brain and learning," so my guard was up immediately. Nevertheless, I was sure that 1-4 were incorrect since I had discovered previously that these, including approaches based on "learning styles" and "left hemisphere/right hemisphere," are among the most commonly debunked myths, and the 10% fallacy is also well-known. What is interesting, though, is that when an instructor invokes a "learning style" approach and, in doing so, validates a student's preconceived notions about their own learning style, they create a fixed mindset in the learner--"well, I'm a visual learner, so that's the only way I can learn." The other list she had us look at included statements that highlighted how emotions can affect cognitive processes and the fact that human memory does not operate like a recording device.

With learning myths suitably punctured and learning truths duly identified, Dr. Betts then moved on to a discussion of feedback. I think we all know how important meaningful feedback is--the peer-review process in academia is an excellent example--but I never really thought deeply about why it is so important. It turns out that feedback, when it is done right, helps us to identify and reduce the gap between our current performance and some desired goal, be it an A on a research paper or a scientifically legitimate research study. Dr. Betts asked the audience to define feedback and its relationship to assessment. Our group described feedback as a response that allows students to see the difference between what they do and what they are trying to do (or, at least, what we as instructors are trying to get them to do). We also argued that feedback helps students identify biases, incorrect statements, and their overall strengths and weaknesses. It also quickly became clear that, for us, feedback and assessment are very distinct processes: while the former tracks the development of a skill, the latter evaluates whether a skill has been attained or not. We wrapped up the discussion by identifying the challenges of giving feedback. The most commonly cited one is time: do we have enough of it to provide substantive feedback? Others included the question of whether or not learners take the feedback into account (if not, why waste the time giving it?) and whether or not instructors are able to craft an assignment that can produce meaningful feedback in the first place. (One suggestion to ensure that students heed instructor feedback is to require it as part of an assignment's rubric.) Dr. Betts then outlined best practices for instructor feedback, which should be:
  • Understandable. Use language that the student can understand.
  • Selective. Choose two or three issues that the student can do something about without feeling overwhelmed.
  • Specific. Point out the areas in the student's work to which the feedback applies.
  • Timely. It should be provided so that the student has enough time to improve for the next assignment. Students need time to reflect, address misunderstandings, and seek support from the instructor and/or academic services. A lack of feedback, or even delayed feedback, often leads to (di)stress, which in turn creates anxiety and poorer performance. 
  • Contextualized. Comments should be framed in reference to the assignment's learning outcomes and/or rubric. 
  • Non-judgmental. The focus should be on learning goals rather than just performance and comments should be descriptive rather than evaluative. Even the terms we use can impact the way that students perceive, and thus react, to our feedback: give "feedback" rather than "criticism"; provide "constructive" rather than "critical" feedback; areas of "weakness" can become areas "to develop"; and "rewrites" turn into "revisions." 
  • Balanced. Point out both the strengths and weaknesses.
  • Forward-looking. Provide suggestions for how future work can be improved.
  • Transferable. The focus should be on the process and skills, not just the knowledge content.
  • Personal. Refer to what is already known about the student based on previous assignments.
The problem for many students is that they have not received thoughtful feedback in an educational context. As instructors, we'd of course love to provide this for everyone, and we should strive to do so within the constraints set by our class sizes and learning objectives. This workshop was designed specifically for online instruction, and one practice that Dr. Betts engages in, and one that I never thought about, was recording her feedback and sending the video to students. It turns out that this can be accomplished in two to three minutes and might take less time than red-marking a paper.

Dr. Betts went on to share some basic principles of learning science during the second day:
  • Human brains are as unique as human faces.
  • Everyone's brain is also uniquely prepared, based on their personal experiences, to learn different tasks. This is significantly influenced by one's developmental environment. It has been shown, for instance, that chronic toxic stress can affect the ability of the prefrontal cortex to process information. Of particular importance to us as instructors is students' prior educational experience: while some students have had extensive feedback and have been held accountable, others have had little or no feedback and have not been held accountable; while some students have been held to high standards, others have not been held to high standards. This sets an individual's "default mode network," which projects that experience into the future (e.g., "I've never been held to high standards, so I won't ever be held to high standards.")
  • The brain changes in light of new experience, so neuroplasticity occurs throughout one's lifetime. Every time a new fact or skill is learned, the brain creates new neural pathways and, possibly, about 700 new neurons a day. Practice actually thickens the myelin sheaths surrounding the brain's axons, which in turn permits more efficient signal (that is, information) transfer. 
  • Learning cannot occur without some form of attention and memory.
  • The brain seeks novelty and patterns.
  • Repeated practice and rehearsal of learned material across multiple modalities helps to consolidate information in long-term memory.

The overall take-home message is that because the brain physically changes every time learning happens, we, as instructors, are quite literally brain changers. A powerful thought indeed.

Tuesday, October 8, 2019

Shopping for rocks in the Olduvai Basin

The invention and proliferation of stone tool technology was one of the most significant events in human evolution--the ability to use stones as tools and, eventually, the wherewithal to modify them into sharp-edged knives and other implements enabled our early ancestors to access foods that would have been difficult or impossible to obtain and consume with their relatively small, unspecialized teeth. If you spend some time working with stones, it eventually becomes apparent that not all of them are created equal: some break easily, others are tough to fracture; some produce razor-sharp edges, others generate dull ones; some are close by and/or easy to get a hold of, others are far away and/or difficult to access; some are durable and last a long time, others are brittle and must be discarded after a single use.

Now, we know that a modern human can learn to recognize these attributes and, what is more, they can (not to say that they necessarily do) plan their days with them in mind ("well, let's see...there are two ways to get to the pond for fresh water, Path #1 and Path #2, but only Path #1 has an outcrop of durable rocks on the way, so I'll kill two birds with one stone and take Path #1"). The question, then, is this: to what degree did our early human ancestors appreciate the sometimes subtle differences among rocks, and what can this tell us about their cognitive capacities?

Before we can even answer this very interesting question, however, we need to figure out a way to (1) rank rocks in terms of their usefulness, and (2) determine where on the landscape early humans were getting their rocks in the first place. There is a long history of research on these topics in Paleolithic archaeology, and my colleagues and I added some data to the debate in a recently published paper in Quaternary International. My interest in the topic goes back to the late 2000s, when David Braun wrote a couple of really interesting papers on the stone tools from Kanjera South, a two-million-year-old site in Kenya. Most studies on rock "usefulness" are based on rather subjective and imprecise categories. These categories, and the studies that utilize them, have provided key insights, including the fact that rock selection by early humans was not random. Braun, however, explored the possibility that materials science might provide some useful tools to help archaeologists objectively describe the characteristics of rocks.

As I mentioned above, there are a host of features that one might consider when selecting a rock. We chose to concentrate on fracture predictability, largely because the creation of many types of stone tools involves breaking a rock into smaller (and hopefully useful) pieces. If a rock breaks differently every time you hit it, there is no way to predict what you're going to end up with. Sure, it might be useful, but, then again, it might not. With a rock that fractures predictably, though, you can be reasonably sure that the time and energy you've expended will pay off with the production of a useful tool. Flint knappers have known for a long time that homogeneous rocks break more predictably than do heterogeneous rocks because they are stronger (they can resist fracture under stress) and more elastic (they can recover from deformation) when impacted by an outside force. Thankfully, a rock's strength and elasticity are highly correlated with its hardness, something that can be quickly assessed with a rebound hammer. These nifty handheld devices, which were originally designed for use on concrete, fire a spring-loaded plunger onto the surface of a stone. The plunger then bounces back, or "rebounds," after impact, and the distance of that rebound reflects the hardness of the stone. Braun and others have used this technique to estimate fracture predictability for the rocks available to early humans at several Pleistocene archaeological sites in Kenya.
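To make the logic concrete, here is a toy sketch of how repeated rebound readings might be summarized into a hardness ranking. All of the numbers and rock labels below are invented for illustration; they are not data from the study.

```python
from statistics import mean, stdev

# Hypothetical rebound-hammer readings for three rock types (invented values).
# A higher mean rebound suggests a harder rock and more predictable fracture.
readings = {
    "volcanic cobble": [52, 55, 50, 53, 54],
    "gneiss": [44, 41, 43, 45, 42],
    "quartz": [38, 35, 40, 36, 37],
}

def rank_by_hardness(readings):
    """Sort rock types from hardest (highest mean rebound) to softest."""
    return sorted(readings, key=lambda rock: mean(readings[rock]), reverse=True)

for rock in rank_by_hardness(readings):
    vals = readings[rock]
    print(f"{rock}: mean rebound {mean(vals):.1f} (sd {stdev(vals):.1f})")
```

Taking several readings per specimen and averaging them is the point of the exercise: any single strike can land on a flaw or an irregular surface, so the mean (with its spread) is a more trustworthy index of hardness than one hit.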

In 2014, we set out to produce comparable data in our neck of the woods, the Olduvai Basin of northern Tanzania. What is today a deep gorge surrounded by open grasslands was, about two million years ago, a stream-fed soda lake surrounded by lush vegetation. Largely unchanged, however, are the volcanic highlands that border the basin to the south and east and the numerous hills--remnants of Archean-aged metamorphosed bedrock--that rise above the plains. Importantly, both the volcanoes and the hills are made up of rocks from which stone tools can (and, in the past, could) be made.

A view of Olduvai Gorge in the foreground and, in the background, Naibor Soit, a granulite outcrop from which quartz could be procured (photo: Amy Schnell).

With the help of students from the UNCG Olduvai Gorge Paleoanthropology Field School and Earlham College's Summer Collaborative Research Program, my good friends and colleagues Cynthia Fadem and Ryan Byerly and I have been traipsing around the Olduvai Basin hammering as many rocks as we can get a hold of. Since 2014, we've accumulated a database of 110 specimens, and some interesting patterns have emerged. It turns out that the volcanic rocks that occur as rounded cobbles within the seasonal streams that drain the volcanic highlands have high rebound values and, thus, high fracture predictability, while the metamorphic rocks from the hills show either intermediate or low rebound values. Now, if early humans were selecting their rocks based on fracture predictability, we might expect that most of the artifacts from the archaeological sites would be made from volcanic sources. It turns out, however, that among Olduvai's artifact assemblages, volcanic rocks tend to be very rare, while metamorphic rocks, especially those made largely of quartz, are very common, which implies that fracture predictability was not a major concern. But why not? We might interpret this pattern to mean that early humans in the Olduvai Basin were not clever enough to recognize the value of predictably fractured volcanic rocks. We're skeptical of this hypothesis, though, because experimental work indicates that there are good reasons not to select volcanic rocks, since they:
  • usually occur as rounded cobbles, which are tough to flake because they don't have very many of the acute angles that make flake removal possible;
  • require more raw muscle power to flake; and
  • may not be as durable as other rock types.
What is more, the quartz-rich rocks in the Olduvai Basin:
  • are readily available from conspicuous landscape features that are very close to most of the archaeological sites; and 
  • are very friable, which, although reducing their fracture predictability, makes them relatively easy to smash into lots of small chunks, among which are typically a handful of useable tools.  
Finally, it's not as though volcanic rocks went completely unused. In fact, early humans appear in some cases to have selected them over metamorphic rocks when they wanted to create handaxes rather than simple flakes. This makes sense given that the more complex production sequence of a handaxe probably requires a more predictably fractured rock.

Unfortunately, fragile artifacts cannot be subjected to the impact of a rebound hammer. However, if we can determine where an artifact originally came from, we can assign it the hardness values of our geological specimens from that source without subjecting the artifact itself to any damage. This is where the other half of our study comes in: in addition to hammering our rocks, we also subjected them to X-ray fluorescence, which helps identify their chemical composition. The volcanic rocks are easily distinguished from the metamorphic rocks just by looking at them, but the metamorphic rocks themselves, even those from different hills, can look very similar to one another. Luckily, the chemical signatures of each metamorphic hill are distinct enough for statistical algorithms to match chunks of rock to the correct hill with 75-80% accuracy.
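To give a flavor of how this kind of chemical sourcing works, here is a minimal nearest-centroid sketch: an unknown chunk is assigned to the hill whose average chemical signature it most resembles. Naibor Soit is a real outcrop, but "Hill B," the choice of elements, and every number below are invented for illustration; the actual study used a larger suite of elements and more sophisticated multivariate methods.

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical XRF readings (say, ppm of Fe, Ti, Zr) for reference samples
# collected at two quartz-bearing hills. All values are invented.
reference = {
    "Naibor Soit": [(3100, 420, 95), (3050, 400, 90), (3200, 430, 100)],
    "Hill B": [(1800, 610, 150), (1750, 640, 160), (1900, 600, 155)],
}

def centroid(samples):
    """Element-wise mean composition of a list of sample tuples."""
    return tuple(sum(vals) / len(vals) for vals in zip(*samples))

centroids = {hill: centroid(samples) for hill, samples in reference.items()}

def classify(sample):
    """Assign a sample to the hill whose mean chemical signature is closest."""
    return min(centroids, key=lambda hill: dist(sample, centroids[hill]))

unknown = (3120, 415, 97)  # a chunk of unknown origin
print(classify(unknown))
```

The same logic scales up: with enough reference samples per hill, a classifier can be cross-validated to estimate how often it assigns rocks to the correct source, which is where figures like the 75-80% accuracy mentioned above come from.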

In the future, we should be able to match Olduvai's metamorphic artifacts to the hills from which they were being collected, which in turn will give us an idea of how far early humans travelled when shopping for their rocks.

References:

Egeland, CP, Fadem, CM, Byerly, RM, Henderson, C, Fitzgerald, C, Mabulla, AZP, Baquedano, E, Gidna, A (2019). Geochemical and physical characterization of lithic raw materials in the Olduvai Basin, Tanzania. Quaternary International. doi.org/10.1016/j.quaint.2019.09.036