Tuesday, November 15, 2016

Network analysis in archaeology, Part 1

The Archaeology Program here at UNCG applied (largely through the suggestion and hard work of our program chair, Dr. Maura Heyn) to host an Ashby Dialogue for the 2016-2017 academic year. The purpose of these endowed dialogues is to bring together students, faculty, and the community to investigate an interesting topic in an informal setting (I was involved in another one of these in 2014). The archaeology faculty thought it might be interesting to explore network analysis in archaeology, largely because none of us had any idea what the heck it really was. Essentially, it gives us an excuse to get together and informally discuss something that is currently hot and that might inform our own research. The official theme: "Exploring Connections in the Past: Archaeology and Network Analysis."

Our first session, held on Friday, Nov. 4, was meant to introduce everyone to network analysis. Dr. Heyn, who moderated the session, chose an article by Tom Brughmans (2010). We had a great turnout, including many students and our colleague Art Murphy, who utilizes network analysis in his work on the responses of contemporary human populations to disasters. We spent a lot of time discussing what network analysis actually is and defining basic parameters like m-slices (apparently, a smaller network nested within a larger one, defined by ties of at least a given strength). A couple of key points that arose:
  • Network analysis, contrary to what I had originally thought, is not really concerned with space (in the geographic sense). Relational data such as social ties, shared artifact types, and the like are the more common inputs to the approach.
  • It's important to remember that once you throw your data into a network analysis, the resulting pattern is not meaningful in and of itself: what it all means depends on a host of other contextual data.
  • Several different types of networks can result from the same data; it all depends on the parameters originally put into the construction of the model. It is up to the researcher to decide which one makes the most sense.
  • While it may seem a bit unscientific, it can be valuable to just throw your data into a model and see what emerges. It is certainly possible that otherwise obscure, fuzzy, or unknown relationships may emerge from a so-called "exploratory analysis" (see the rough sketch below).
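To make the "exploratory" idea a bit more concrete, here is a minimal sketch in Python using the networkx library. Everything in it is invented (the sites, the artifact types, and the tie-strength cutoff standing in, very loosely, for an m-slice); the point is only that the software will happily return a structure, and deciding what that structure means still depends on the contextual data mentioned above.

```python
# A minimal, entirely hypothetical sketch of "exploratory" network analysis:
# sites become nodes, and an edge links two sites that share at least one
# artifact type, weighted by how many types they share.
from itertools import combinations

import networkx as nx

# Made-up assemblages for illustration only.
assemblages = {
    "Site A": {"red slip ware", "obsidian blades", "figurines"},
    "Site B": {"red slip ware", "obsidian blades"},
    "Site C": {"figurines", "shell beads"},
    "Site D": {"shell beads", "obsidian blades"},
}

G = nx.Graph()
G.add_nodes_from(assemblages)
for (a, types_a), (b, types_b) in combinations(assemblages.items(), 2):
    shared = types_a & types_b
    if shared:
        G.add_edge(a, b, weight=len(shared), shared=sorted(shared))

# Which sites are most "connected"? (A description of structure, not meaning.)
print(nx.degree_centrality(G))

# A crude analogue of an m-slice: keep only ties of strength >= 2.
strong = nx.Graph()
strong.add_edges_from(
    (a, b, d) for a, b, d in G.edges(data=True) if d["weight"] >= 2
)
print(list(strong.edges(data=True)))
```

Measures like degree centrality only describe the shape of the graph; they don't interpret it.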
Looking forward to the next session!
References:

Brughmans, T. (2010). Connecting the dots: towards archaeological network analysis. Oxford Journal of Archaeology 29, 277-303.

Monday, October 17, 2016

Plants in space

The RISE Network at UNCG periodically hosts STEM-based presentations, and I was able to attend one recently (9.27.16) on "Space...the final frontier...for plants" by the new Dean of the College of Arts and Sciences, John Kiss. Kiss spoke about his work with NASA on plant biology. The highlights:
  • He started off by describing the importance of space as a laboratory for plant biology. Long distance (and long-term) space travel will likely require some sort of bioregenerative ecosystem, so it is critical to understand how plants grow in space. What's also interesting is that plants, by providing a tangible connection to Earth, appear to have a positive psychological effect on humans in isolated contexts (like space). 
  • While the space travel angle is all well and good, Kiss's academic interests lie in the effect of zero- and micro-gravity on plant physiology. More specifically, he and his colleagues are looking at a phenomenon called "red-light-induced phototropism." Phototropism is simply growth in response to light; we see this all the time when plants grow "towards" a source of light. As I understand it, most modern plants, especially flowering and seed-bearing plants, are sensitive to the wavelength of the light they respond to (light in the blue area of the spectrum seems to be preferred). Ancient lineages, on the other hand, appear not to care much about the wavelength. The effect may be very slight, however, and can be confounded by gravity. Kiss and his colleagues are using Arabidopsis in a zero-gravity environment to determine whether red-light-induced phototropism is present (it appears as if it is).
Fun stuff...

UPDATE (6.1.17): I just heard that Dean Kiss's experimental samples will be launched into space today. The project, dubbed Seedling Growth-3, aims to understand how plant growth is affected by gravity. The launch will be streamed live here.

Tuesday, September 20, 2016

Zika: public health, fear, and ethics

UNCG's Lloyd International Honors College sponsors a "Food for Thought" series each semester where experts from around campus speak informally to a small audience about provocative and/or timely issues. This past Wednesday (September 14), Dr. Rob Cannon and Dr. Janne Cannon, both experts in microbiology, stopped by to talk about "Zika: A virus at the intersection of public health, fear, and ethics." I admit that my knowledge of Zika was limited, basically, to the following: (1) it's transmitted by mosquitoes; (2) it can cause birth defects; and (3) it's pretty bad. What I didn't think much about was, as the Cannons' theme indicates, how this related to broader issues. Some highlights:
  • The Zika virus, which is carried by mosquitoes of the genus Aedes, is not new. In fact, it was first isolated in the mid-20th century. We have not paid much attention to it largely because very few human cases were reported until 2007. The range of these mosquitoes is expanding northwards as the warmer climates that support them also shift northwards. There are now confirmed cases in south Florida, which makes it an American issue as well.
  • The scariest thing about Zika is that it can lead to devastating defects in the developing fetuses of infected women, including microcephaly. Even more unsettling, though, is that there simply aren't enough data to determine the probability that the fetus of an infected woman will actually develop serious developmental defects.  
  • In February, President Obama requested 1.9 billion dollars to fight Zika. While one might expect this to be a no-brainer for Congress, a number of hangups have emerged about how the money would be distributed. Republicans have requested that the funding be tied to budget cuts elsewhere and, in addition, do not want clinics associated with Planned Parenthood to get funding because of the abortion services they provide. The latter sticking point doesn't make much sense because, although the legislation is complex, federal funding can't be used for abortion in most cases. Regardless, Democrats have vowed to block any Zika funding that includes such restrictions. Seems pretty short-sighted on all fronts... 
  • Part of that funding package would include monies to use genetically modified male mosquitoes to breed with, and shorten the lifespan of any offspring of, Zika-carrying female Aedes individuals. Although the strategy has been successful elsewhere, and despite FDA approval, some residents of southern Florida, where Zika cases have now been reported, have been resistant to the program.
  • There is also a socio-economic dimension to this issue. While the best way to avoid the virus is to stay inside, this is easier said than done for poorer people living in humid tropical environments without air conditioning. 

Saturday, August 13, 2016

Lilly Teaching Conference

I recently received an Online Learning Course (Re)Design grant from UNCG's University Teaching and Learning Commons (UTLC) to upgrade our department's Statistics for Anthropology course for online delivery in the Summer of 2017 (UPDATE 5.10.18: I finally got it online in the Spring of 2018). To help me with this transition, I attended an Online Learning Incubator workshop at UNCG this past June. The workshop, expertly facilitated by Brian Udermann, Director of Online Learning at the University of Wisconsin-La Crosse, taught me a ton about how to put together, deliver, and assess an online course.

UTLC's Coordinator of the Learning Innovations Office, Laura Pipe, then contacted me about joining a group of UNCG faculty that were attending the Lilly Conference on Designing Effective Teaching, held on August 1-3 in Asheville, North Carolina. Boy, am I glad that I accepted the invitation−there were some fantastic presentations, everyone was extremely pleasant, and the meals were outstanding. One of my main goals was to continue learning about online course delivery, but I also took part in a variety of other workshops. As usual, the highlights from my notes:
  • The first session that I attended was Taking the Flip: Plans, Tools, and Assessment Strategies for Creating a Flipped Classroom, by Jayme Swanke of Southern Illinois University, Edwardsville. I had heard of a flipped classroom before, but I really didn't know much about it. As Swanke explained it, the "Flipped Triad" consists of: (1) content delivery (readings, online lectures, etc.) outside of class; (2) assessments such as quizzes, either outside or at the beginning of class, to ensure student readiness and/or identify deficiencies in student understanding; (3) creative activities in the classroom that apply the course content. Here is a figure from her presentation:
The Flipped Classroom model. From newlandstandl.files.wordpress.  

This model, because most of the "homework" is conducted in the classroom with the instructor, is supposed to encourage peer-to-peer collaboration, independent learning, individualized attention, and overall engagement. Swanke talked about her experience using this model in her social work courses and found that it works best when organized around themes or units rather than individual topics. She also stressed the importance of clearly defining the objectives for the units and for each class session. Her recommendations for presenting content online were: GoAnimate (for creating animations; not free), CamStudio (for recording lectures; free), and Windows MovieMaker. To ensure and/or assess student readiness, she suggests Socrative for in-class quizzes, Canvas quizzes of the online lectures, and discussion board posts. Finally, for in-class activities Swanke uses discussions, problem-based learning, collaborative learning (VoiceThread, for example, allows students to post audio files to facilitate collaboration), and case studies.
  • Sally Blomstrom (Embry-Riddle Aeronautical University) and her colleagues summarized a project whereby student digital literacy is developed and assessed in a communications course through "audio tours." In this case, students choose a fish from a museum collection and record themselves discussing the species's movement from a biomechanical perspective (this is an engineering school, after all). One of the goals is to teach students how to engage an audience with vocal variety and enthusiasm. Blomstrom utilizes Sing&See, which is software designed to analyze singing voices, to actually visualize students' voices. I've had trouble tracking this skill objectively in my own speaking courses, so I was excited to learn about this new tool. Blomstrom had the students take before and after self assessments of their digital skill set and found that many students did notice a significant increase over the course of the project. The most important aspect of assessing this project is, as the presentation's title Using Structured Reflection to Improve Digital Literacy suggests, the formal reflection that students write at the end. This exercise forces students to identify what worked, what didn't, and what they truly learned from the experience. One interesting finding: many students not only became better users of digital tools but recognized the relevance of digital literacy.  
  • The first plenary presentation was given by Terry Doyle (Emeritus at Ferris State University) on Understanding How Students Learn: The First Step to Improving College Teaching Practices. Doyle is an advocate of Learner Centered Teaching, and his website is a fantastic (though overwhelming) resource for this approach. He began by asking us, as teachers, to ask ourselves the following three questions about our classes: (1) What do we want students to retain and apply from our classes A YEAR AFTERWARDS? The answer, he says, should guide decisions about content delivery: if you don't expect students to retain and apply a fact or concept for the long term, then it's probably best to spend more time on other things; (2) What can students do on their own, and what can they not do on their own? Information is EVERYWHERE nowadays, and teachers, as experts in their fields, need to concentrate on things that students need our help to learn rather than on things that they can look up and learn themselves; (3) What teaching actions optimize students' opportunities to learn and master course content? There is only so much time, and we want to ensure that this limited resource is being used well. We as teachers are obligated, he contends, to pay attention to and follow where the research on learning takes us, even if it makes us uncomfortable. Ultimately, we can't make informed decisions without knowing how people learn, which is what the bulk of his presentation was about. One thing that resonated with me is that teachers cannot control a whole slew of variables that impact a student's readiness to learn: genes, family life, sleep, stress levels, diet, hydration, and so on. What we can control is our own readiness to teach, the quality of our learning activities, the quality and timeliness of feedback, and accessibility to students. So, what does the research tell us about learning? Well, first of all, it is clear that, to quote Doyle, "It is the one who does the work who does the learning." No work, no learning. It is also becoming increasingly evident that movement (walking, running, or other forms of exercise) encourages thinking and learning. Can't solve a problem? Get up and go for a walk. Attention is also key. When you attend to a task, it physically alters the brain and prepares it to learn. This is why active engagement is so important−it helps maintain attention. Attention is also affected by the type of task (new and unfamiliar vs. old and automatic) and, perhaps most importantly, the relevance or meaningfulness of the task. A learner, in other words, needs a clear rationale for learning something. I think we all know this, and in our classes we try to tell students why it's important to learn what we're teaching them. Doyle provides some excellent rationales that apply to any course in any discipline. First, learning how to learn is critical; the rapidity of technological advances and ever-changing requirements in expertise, even within industries, will require people to literally remake themselves, perhaps several times, throughout their working lives. So, students must become life-long learners. Second, most classes teach writing, reading, problem-solving, collaboration, and the like, all of which are foundational skills that help students gain and keep employment. Finally, learned skills help meet survival needs (paying rent, buying food, etc.). There was a lot to digest from the presentation, but it really forced me to think about my teaching actions.
  • Steven Benko and Julie Schrock (Meredith College) facilitated a session entitled Redefining Participation: How Well Did You Do? How Much Did I Help? This workshop forced teachers to ask a very important, but often poorly defined, question: What is "good" participation? Like most teachers, I have an amorphous and, usually, presumed idea of what I consider to be good participation. While I consider participation to be important in my courses, to be honest I've never really sat down and thought deeply about the actual "deliverables" of participation, which is a prerequisite to any reasonable rubric for evaluating it. It is not unusual for instructors to informally gauge participation and use that impression to determine if students on the edge are bumped up or down on the grade scale. Most people in the session agreed that participation includes attendance, class preparation (by doing readings or exercises before class), and contributions to in-class discussions, and a balance should be struck between quantity, dependability, and quality. Benko and Schrock designed very detailed rubrics to measure these items. Contribution, for instance, was graded as Novice (did not raise questions about the readings, did not state a position during class discussion), Developing (rarely, i.e., no more than once, contributed), Proficient (occasionally, i.e., at least twice, contributed), Accomplished (regularly contributed), or Mastery (answered multiple follow-up questions, explained a position and provided reasoned justifications). This rubric was implemented through self-assessments. I admit I was a bit skeptical when I first heard this−allowing students to assess themselves? They found, however, that students were surprisingly honest about their participation, or lack thereof, and that the rubric, which was made available to the students, influenced how they prepared for and participated in classes. Importantly, the rubric included and valued behaviors beyond just oral participation, which allowed students who, for whatever reason, are reticent to speak up, to be rewarded for participation. Now, their class sizes at Meredith are no more than about 30 students, which makes it easier for teachers to track participation and thus call students out if needed, so this strategy may not be very successful in larger class settings. They have also used a token system whereby students are given a token each time they participate, with different colors representing different levels of contribution (comment, question, well-developed idea, etc.). Again, probably best for smaller classes. Nonetheless, this highlights the importance of setting clear expectations for participation.
  • While online courses permit a good deal of flexibility for learners, there are also a wide range of misconceptions among students about the format: they are easier, they take less time, etc. Saginaw Valley State is now implementing a "Digital Badge" system to encourage students to take an online tutorial to prepare them for the online learning environment. Students that successfully complete the tutorial receive a badge that appears on their Learning Management System profile so that instructors know that they have some background. UNCG has an optional set of modules called Ready to Learn that serves our community in this capacity. (I should point out, too, that online instructors−myself included−assume a level of technological savvy among millennials and post-millennials that often doesn't exist.)
  • Nearly everyone acknowledges the importance of team-based learning (i.e., group work), largely because it models the setting in which students will find themselves in the workforce. Students typically dislike these types of assignments, however, largely because they lose some control over their performance and, thus, their grade. The fear, of course, is being placed in a group with a slacker (or, to use the apparently technical term, "social loafer"). Completely understandable. A round table discussion I attended on Tuesday morning attempted to address this issue through the use of task management applications that track the contribution and workflow of group members. Here is a list of the discussed applications, some of which are free: Asana, Droptask, Trello, Teamweek, KanbanFlow, Meistertask, Freedcamp, Wrike, Teamwork, allthings, Zoho, Realtimeboard, and Stormboard. Several people mentioned that students are uncomfortable outing each other as social loafers, and when they do it is usually when the assignment is ready to be turned in. This puts the instructor in a difficult position when trying to modify grades and assess contribution. Because contributions can be tracked with these applications, it is easier for the instructor to identify social loafers early on. Michael Howell from Appalachian State University, who facilitated the roundtable, suggested that instructors take a class session (or shoot a video) to familiarize students with the use of the applications.
  • In How to Facilitate Engaging Discussions Using Research-Based Techniques, Kevin Kelly from the Association of College and University Educators provided a number of useful tips for discussions. He first stressed that discussions, just like any other assignment, should have specific objectives, what he called a "mission statement." We should also try to allow students to write down (individually or in groups) their response to a prompt before answering orally, which lets them stew and thus increases the chances of a meaningful response. He also suggested the "Hatful of Quotes" technique. I had never heard of it, but it goes like this: before class, select five or six passages from the text or an article and transfer them to small slips of paper, ensuring that each quote appears at least twice. Once you get to class, have individuals or groups draw a quote and give them several minutes to consider their response. Then, have them share with the class. I might try this one. Ever heard of a "Google Jockey"? Me neither. An example from Marsha Ratzel and Shelley Wright's post on the Voices from the Learning Revolution blog:
I facilitate the discussion by asking questions, while my students Google, looking for the information we need. As they come across links and videos that explain what we're learning about, my students send me links that I add to our Wiki. This process allows us to talk about the information, including how to research and find reputable information.
This example actually made me think of Terry Doyle's point about what students can and can't do on their own. Oftentimes, looking up information is not something they need our help with−it's determining what is reliable (and why) that they need help with.
  • The coolest session I attended was run by Michael Meyer, who directs the Center for Teaching and Learning at Michigan Technological University. Meyer, who also teaches physics at MTU, covered Seven Strategies for Seven Principles, the "seven principles" referring to the Seven Principles for Good Practice in Undergraduate Education. While we actually covered more than seven strategies, they were all very interesting. There is a movement among educators towards the integration of technology in the classroom. This is all well and good (I am one of them), but Meyer's first strategy was the use of good ol' fashioned whiteboards. In his physics courses, he divides the students (typically 70-80 of them) into groups of three and asks questions that can be answered collaboratively and displayed to the rest of the class. This method works great for graphs, diagrams, and mental maps in addition to traditional true/false, short answer, and multiple choice questions. Nearpod is a cool application that allows you to pose a variety of questions, including "draw-it" questions using a digital whiteboard, to students on mobile devices. If you want to pay a bit, the upgrade also allows the instructor to add "virtual field trips" and to control the screen of other digital devices so that, for example, everyone can be on the same webpage (and not on Facebook). Piazza is a nice (and free) online Q & A platform that allows, among other things, instructor-endorsed answers, customizable polls, and full integration with an LMS like Canvas. PhET has hundreds of interactive simulations for STEM concepts, and TagCrowd creates word clouds. A word cloud, I learned, is a display of the words in a section of text, with the font size of each word representing how frequently it occurs. This is especially useful when you want to track student responses to a question: put all the responses into a single text, paste it into TagCrowd, and look for patterns in their answers. This can help you identify whether folks are on the right track (see the short sketch at the end of this list).
  • Lisa Martino, also of Embry-Riddle Aeronautical University, spoke about online course culture. While the number of online learners is growing extremely rapidly, students continue to struggle with social isolation, coursework confusion, and the lack of teacher presence. Martino highlighted some ways in which teachers can create a culture within their online courses that can help alleviate these challenges, including introduction videos from the instructor, ice breakers for the students, regular office hours (yes, even online), and weekly video announcements. 
  • The second plenary presentation was by Claire Major (University of Alabama), who has written extensively on research-based assessments of various teaching strategies. She summarized results from the most recent meta-analyses and found that: (1) Students actually like lectures, as long as they are done well; (2) Knowledge retention in lectures is increased with guided note-taking, frequent in-class quizzing, active learning break-outs, and instructor signposts (e.g., "I need 500% of your attention, because this concept is important"); (3) In terms of information transfer, lecture-only classes are no better than other class models for short-term retention, but fall behind in skill development (critical thinking, for example) and long-term retention; (4) Courses that combine lecture with active learning activities result in higher exam scores than courses using a lecture-only format; (5) Failure rates are higher in courses using a lecture-only format than they are in courses that combine lecture with active learning activities; (6) Students in courses that use discussion outperform students in lecture-only courses on measures of critical thinking (synthesis and evaluation of information); (7) Students who participate in courses with collaborative learning have greater gains in team skills, self-confidence, and higher-order skills like problem solving than students in lecture-only courses; (8) There is no difference in exam performance between students who participate in games and students in lecture-only courses. She also argued that collaborative learning can be improved by ensuring that the activities have structure, mechanisms for individual and group accountability, and appropriately sized groups (five seems to be the magic number). There is no doubt that games are good motivators for students, but they can also be improved by making them collaborative and including instructor feedback.
  • Hands-down the most thought-provoking session was Fostering a Decolonized Education in an Inclusive Liberal Arts Education, hosted by Tiece Ruffin, Agya Boakye-Boaten, Trey Adcock, and Jeramias Zunguze, all of the University of North Carolina at Asheville. It was pointed out that much of the university (and especially K-12) curriculum in the U.S. is biased towards the Western (and, thus, White) tradition of knowledge and knowledge production. They also argued that it is time to "decolonize" education by considering alternative perspectives not only as "window dressing" for universities purporting to be inclusive, but as truly immersive experiences that force educators to step outside their cultural comfort zone. Tiece Ruffin related a story from one of her teacher education courses in which a student attended the Asheville Goombay Festival, which celebrates the African diaspora and Asheville's African American community. When asked to reflect upon her experience, the student, who was white and raised in rural North Carolina, wrote that she felt extremely uncomfortable and even expressed fear that she would be shot. This highlights the need among teachers for more exposure to diverse cultures (take an anthropology course!!). I would recommend the following articles, both of which speak to this issue in the context of white teachers and black students: What I Learned Teaching Black Students, and White Teachers I Wish I Never Had.
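Following up on the TagCrowd point from the Michael Meyer session above, here is a minimal Python sketch of the word counting that underlies a word cloud. The student responses and the stopword list are invented; a tool like TagCrowd does all of this (plus the visualization) for you.

```python
# A minimal sketch of the word counting behind a TagCrowd-style word cloud.
# The "responses" are invented; in practice you would pool a whole class's
# answers to a single prompt.
import re
from collections import Counter

responses = [
    "Natural selection acts on heritable variation in a population",
    "Selection favors traits that improve survival and reproduction",
    "Evolution is a change in a population over time",
]

STOPWORDS = {"a", "and", "in", "is", "on", "over", "that", "the"}

words = []
for response in responses:
    words.extend(
        w for w in re.findall(r"[a-z']+", response.lower()) if w not in STOPWORDS
    )

# In a word cloud, font size would scale with these counts; here we just
# print the most frequent words to see whether the class is converging on
# the key concepts.
for word, count in Counter(words).most_common(10):
    print(f"{word}: {count}")
```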

        

Wednesday, May 4, 2016

The origins of the Great War

On April 24th I attended a discussion at my local Durham County Library entitled "The Great War: From Single Political Event to Global War." The event was sponsored by Sister Cities of Durham and the Durham Library Foundation Humanities Society. The presenter was Richard Hill, who taught history in the Durham County Public School System for 32 years, is well traveled, and certainly knows his history.

The presentation was well-attended but I was by far the youngest person there (mean age I would say ~55). Any-who...

As Hill pointed out, most discussions of WWI begin with the assassination of Archduke Franz Ferdinand, heir to the Hapsburg Austro-Hungarian Empire, and his wife Sophie, Duchess of Hohenberg, in Sarajevo on June 28, 1914 by a 19-year-old student and Yugoslav nationalist, Gavrilo (Gabriel) Princip. The assassination is a fascinating story in and of itself, and involved a good deal of dumb luck, happenstance, and coincidence (the lead driver for the Imperial cavalcade, for example, made a wrong turn down a side street, which is where Princip, gun in hand, just happened to be). Hill then showed some movie clips of soldiers marching past cheering crowds from a site called CriticalPast, which has tons of historic stock footage, some of which dates to WWI. Here is some footage of the funeral for the Archduke:



Hill continued by showing a slide with a bunch of small, difficult-to-read text. This was intentional, though, as his point was to show that assassinations of political figures were actually extremely common between the 1890s and the first decades of the 20th century (the US was not exempt from this pattern−William McKinley, anyone?). So, the assassination itself was not to blame. A better question is why this assassination, at this time and in this place, was the only one to spark a major conflict. The answer, Hill argued, was a combination of:
  • Rising nationalism. It has to be remembered that some of the belligerents were only recently unified into nations that we would recognize today. Hill showed a map of Europe at the end of the 17th century to highlight the fact that Germany and Italy in particular were hodgepodges of associated but independent entities that were not unified until the 1870s. (In fact, the German Empire was officially declared in the Hall of Mirrors at Versailles in 1871 following the Prussian defeat of France.) With the rise of these states, an increased sense of nationalism and, thus, national honor, emerged. One manifestation of this was the growth of the German navy, which, it was hoped, would compete with Britain for mastery of the seas (suffice it to say, Britain felt quite threatened by this).
Europe on the eve of the Great War. Courtesy of europeaninstitute.org
  • Alliances. A complex web of alliances formed throughout the late 19th and early 20th centuries within (and outside) Europe. The major agreements were: (1) the Triple Alliance, first formed in 1882 between Austria-Hungary and the recently unified Italy and Germany, and, (2) largely in reaction to the Triple Alliance, the Triple Entente (or "Understanding"), which by 1894 involved France and Russia and, after 1907, Great Britain. It is important to note that the Entente should be recognized for what it was, an understanding, rather than as a binding treaty. It was not a given, for example, even up until the outbreak of hostilities, that Britain would come to the aid of France, Russia, and Belgium. Nevertheless, these (and other) agreements ultimately forced the powers to react.
  • Military strategy. Except in Britain, compulsory (and universal) military service was a part of life for citizens of many European countries, which meant that most of the major powers could field potentially massive armies at any time (in the millions). This created significant logistical problems, as these armies had to be called up, fed, equipped and, perhaps most importantly, transported to the right place at the right time. The timing and location of hostilities were in fact absolutely critical−the discussions among military leaders clearly reveal the obsession with getting the battle-ready troops to strategic points before their enemies. This was especially true for Germany, who expected to have to fight on two fronts. One could scarcely think of a situation more conducive to rash decisions. In fact, Germany planned to commit massive forces to the Western Front to quickly bring France to her knees before turning its attention to Russia in the east. France, expecting the Germans to attack from the north through Belgium (as they eventually did), hoped to initially defend against the German onslaught before thrusting across the frontier towards the Rhine. Russia was committed to an invasion of East Prussia (to prevent a massing of German arms to the west) and an attack to their southwest to defend Slavic populations from Austro-Hungarian aggression. Britain, for her part, hoped to depend heavily, and almost exclusively, on her powerful navy.
The Germans' planned envelopment of the French army (red) and France's thrust towards the Rhine (blue). Courtesy of westpoint.edu
  • Colonial networks. Hill stated that colonialism did play a subsidiary role in the war. Asian states, for example, were increasingly resisting the expansionist policies of European nations, a process that culminated in Russia's humiliating defeat by Japan in the Russo-Japanese War of 1904-1905. Russia also sought an outlet to the Mediterranean and was thus always looking to lop off pieces of the disintegrating Ottoman Empire, while Britain wanted control of the Suez Canal to ensure quick passage to her Indian possessions; both powers were intent on protecting these interests.
  • General factors. This last category was kind of a catch-all for other important causes. Hill first noted the incredible advances in communication technology−telephones, wireless devices, motion pictures−that permitted an unprecedented ability to disseminate information to massive audiences and, thus, to influence (and, in many cases, deceive) the public. Rapid advances in military technology also led to an arms race among the great powers (recall Germany's construction of a state-of-the-art navy to rival Britain's). 
So, what caused the cataclysm that became the Great War? Hill brought things full circle to argue that:

1. Europe lacked a system with the ability and authority to mediate international disputes. Woodrow Wilson's League of Nations proposal, part of his 1918 Fourteen Points peace plan, was meant to fill this void (the League was created in 1920, although the United States never joined). Closely tied to this factor was the inability (or refusal) of existing states to accommodate new political units.

2. The destructive power of modern mechanized warfare was tragically under-appreciated. Of course, it's not like there weren't warning signs: the horrifying capabilities of machine guns, for example, were evident both in Britain's colonial wars and in the Russo-Japanese War.

3. A willingness on the part of the great powers to go to war in order to maintain national unity.

During the Q & A session, one audience member referred to a documentary that he had seen claiming that a (if not the) main cause of the war was actually the Berlin-to-Baghdad railroad, which was being constructed by the Germans to ensure a steady supply of oil from the Middle East for her growing navy. There does seem to be some merit to this claim, and it does fit nicely into Hill's "Colonial networks" theme. UPDATE 10.14.16: I am currently listening to Chris Clark's The Sleepwalkers: How Europe Went to War in 1914, which is a really, really comprehensive look at the roots of the war. In the book, he relates how the Germans, under pressure from the British, gave up plans to run the line all the way to the Persian Gulf, which largely muted British opposition.

Really interesting event. Kudos to the sponsors and to Mr. Hill...

Friday, April 22, 2016

Paleoanthropology Society meetings 2016

I recently returned from Atlanta, which hosted the 2016 meeting of the Paleoanthropology Society. As usual, there were some really interesting talks, one of which (I hope) was a presentation I gave on behalf of my colleagues with some data we collected on stone raw materials in the Olduvai Basin. Here are my favorite talks as recorded in my notes:
  • The meeting's first presentation was by Catalina Villamil, a graduate student at NYU, who examined variability in the orientation of the foramen magnum across several groups of primates and marsupials. Basically, her analyses found that among primates a more anteriorly oriented foramen magnum tends to be associated with orthograde postures (that is, postures that involve a vertically oriented trunk) rather than bipedal locomotion. This is important since, as she and others point out, paleoanthropologists often look for an anteriorly oriented foramen magnum to identify bipedalism and, thus, a fossil's status as a hominin. It appears, then, that this is not necessarily a very good indicator. This calls into question inferences for bipedal locomotion for fossils like Sahelanthropus, whose purported hominin status rests partly on foramen magnum location.
  • Michael Pante and his colleagues presented data from high-resolution 3D scans of experimentally produced stone tool cutmarks, carnivore tooth marks, and crocodile tooth marks. This is really important, as there has been a good deal of disagreement over the identification of surface modifications on fossil bones, largely because of a lack of consistency and precision among analysts in the definition of features used to differentiate marks created by various agents. Intriguingly, they suggest that some ancient marks that are thought to have been created by stone knives show similarities with crocodile marks. Theoretically, one could scan an unknown mark, compare it to the 3D structure of marks of known origin, and determine the "fit" of the two in order to make a probabilistic assessment. The only drawback, as Pante stated, is the time it takes to adequately scan the marks (24 hours in some cases). So, this method cannot yet be applied to whole assemblages. Regardless, I am in total agreement that this is the direction bone surface modification studies should be going and, at the very least, we should subject questionable marks (or extremely important ones−Dikika anyone?) to this sort of analysis. In a similar vein, J.A. Harris et al. reported on a Bayesian approach to identifying surface marks, which also produces a probability that a mark belongs in a specific category (see the toy sketch at the end of this list for the general idea).
  • Several talks touched on stone tool production and/or use among modern humans and chimpanzees and their implications for understanding Paleolithic technologies. Susanna Carvalho summarized her team's work in Guinea on wild chimpanzee stone tool use. She reported, really interestingly, that young chimps start to learn nut-cracking by observing close kin (this makes sense as these are the individuals with whom they grow up), while older chimps slowly shift their observations to more skilled individuals. She also provided what I thought was a cool quote: [Chimpanzee] "tools are not modified prior to use, but by use." David Braun piggybacked nicely on this talk when he reported some fun experiments conducted with these same chimps. Some background: Dave has previously demonstrated that hominins at the ca. two-million-year-old site of Kanjera South (Kenya) were pretty "choosey" when it came to the types of rocks they used for stone tools. Specifically, it appears as if these hominins intentionally selected rocks for their durability (i.e., their ability to hold a sharp edge over time). Dietz Stout has documented a similar level of selectivity (in this case, for stone with little or no impurities) at the 2.6-million-year-old site of Gona (Ethiopia). While this implied some level of cognitive sophistication, Dave related to us that he was often asked how these hominins could have possibly known about such physical properties of stone; were they, in fact, the first engineers? To explore this further, he designed an elegant experiment in which he transported the same rock types being used by the Kanjera hominins a few thousand kilometers west for use by the chimps living at Carvalho et al.'s open-air laboratory in Guinea. They laid the stones on the ground and recorded the frequency with which they were chosen by the chimps for nut-cracking. They found that the chimps selected harder rocks for hammers (not too surprising) and a soft, chalk-like rock for anvils. This latter observation was mystifying at first−after all, wouldn't you want a hard rock on the bottom, too?−until it was realized that successive nut-cracking events with these more malleable stones eventually resulted in the production of a deep impression that prevented the nut from slipping out from under the hammer upon impact. Pretty cool. They also witnessed how juveniles tended to "recycle" tool sets from older, more experienced individuals (that is, they passively took control of hammers and anvils after they were abandoned by those who knew what they were doing), which essentially provided a mechanism for how knowledge of raw material characteristics could be transmitted within a group. The two major take-home messages here were: (1) the properties that we, as modern humans, think are relevant when selecting a rock for use as a potential tool are different than those identified by chimps (and presumably, ancient hominins), and (2) an awareness of raw material characteristics (hardness, durability, what have you) does not necessarily require extensive knowledge of stone engineering and/or geology, and this information can be transferred in an entirely passive manner. This runs counter to interpretations that consider selectivity and multi-stage reduction sequences among Oldowan hominins as evidence for sophisticated cognition and/or complex modes of learning (however, see here for an example of recent work indicating that active teaching was probably not necessary to transmit knapping skills).
The last lithic paper I'll mention came from Nada Khreisheh, who reported that an individual's skill with handaxe production correlated best with psychometric tests that involve planning.
  • David Patterson, a graduate student at George Washington University and the person largely responsible for manning the laptop to ensure that everyone's PowerPoints loaded properly, produced some fascinating data on stable isotopes across the ancient landscapes of Koobi Fora (Kenya) between 2 and 1.4 million years ago. Tons of data here, but what I found to be most interesting was the observation that one of the only taxa in the Turkana Basin showing a pattern of C4 enrichment was the genus Homo, which suggests that something special was going on with this bipedal ape over this time span: was it eating more C4 plants, or consuming C4-eating animals? This theme also appeared in a paper from Kay Behrensmeyer et al., who, in their analysis of the paleoecology and paleogeography of a slice within Koobi Fora's Okote Member (ca. 1.51-1.53 million years ago), found that hominins were so-called "transient" species−that is, they appear and then disappear within the sedimentary sequence. This could indicate that they utilized a greater diversity of habitats than other animals.
  • The last talk before lunch on the first day was given by Yohannes Haile-Selassie, who reported on a new species of australopithecine, Australopithecus deyiremeda, and its implications for diversity within the hominin family tree during the Middle Pliocene. Full disclosure here: even though this species was announced over a year ago, I had never heard of it. This just goes to show how difficult it is to keep up with the literature with a heavy teaching load and (most importantly) a toddler. In fact, I have so little time to go through the newest primary literature that I depend on conferences like this and Texas A&M's fantastic Anthropology in the News feed for the latest in paleoanthropology. Anyway, Haile-Selassie suggests that the hominin phylogeny may have been as diverse prior to three million years ago as it appears to have been afterwards.
  • Behrensmeyer's presentation also criticized the use of the term "mosaic" in paleoecological reconstructions, mainly because it is so imprecise. Many of the reconstructions, including those from Amy Rector and colleagues for Cooper's D and Kay Reed et al. for the lower Awash valley, tend to converge on some sort of "mixed" habitat (usually between forest and bushland, or bushland and grassland). In these cases, though, the imprecision reflects the type of data (in this case, mammal fossil assemblages).
  • Darryl de Ruiter presented on the new Homo naledi assemblage on behalf of the Rising Star team. As in their recent publications, de Ruiter maintains that the most likely explanation is that the bodies were deposited intentionally by members of the group using the cave system as a grave site of some sort. John Shea suggested that the assemblage could represent young, adventurous males that became stuck within the depths of the cave. An interesting hypothesis, although de Ruiter pointed out that it appears as if the assemblage includes individuals of both sexes and all ages. No news on the age of the assemblage...this is going to be very intriguing for some time.
  • Because of my work in Armenia, my ears perked up during Simonyan et al.'s presentation (given by Miriam Belmaker) on test excavations in southern Armenia. They reported the recovery of a large assemblage of obsidian artifacts that show techno-typological affinities to the late Lower and Middle Paleolithic. Obsidian hydration dates (which, as Belmaker acknowledged, can be pretty temperamental) indicate at least two separate occupations, one at ~120,000 and the other >200,000 years ago. I look forward to hearing more about this as research progresses.
  • My team, including my friends and colleagues Cynthia Fadem, Ryan Byerly, and Audax Mabulla, and UNCG undergraduate researcher Curran Fitzgerald, presented on our recent work with lithic raw materials in the Olduvai Basin. Building on the work of folks like Dave Braun, we reported on data we collected with a Proceq Schmidt Rock Hammer, which, by measuring rock hardness, provides an objective and quantitative proxy for raw material "quality." These sorts of data are critical if we want to start understanding why hominins chose the rocks they did for tool manufacture. We also collected a ton of additional samples to determine if the handful of macroscopically similar, but spatially discrete, quartzite outcrops in the basin could be distinguished based on their elemental composition. Encouragingly, this does appear to be the case, which means that determining the source of the quartzite artifacts from the Olduvai archaeological sites is a real possibility.
  • The most thought-provoking paper was given by Stanley Ambrose and his colleague Jibril Hibro. They argued that Neandertal-modern human admixture should not yet be viewed as a given. The weakest link in these studies is that African genetic diversity was woefully underrepresented, which means that genetic variants identified as deriving from Neandertals (or other archaic populations) could, in fact, have already been present (but unsampled) among African populations. So, the question that has not yet been adequately addressed is, as their paper's title suggests, Is Neandertal-human genetic admixture in Eurasians actually African ancestry? This paper certainly reminded me to be cautious in our interpretation of the genetic data.
  • The last paper I'll comment on was the very last of the conference, given by Randolf Donahue and colleagues. They introduced Fossilfinder.org, which allows anyone to examine high resolution photos taken by drones around Lake Turkana for fossils and other items of paleoanthropological interest. This forms part of the "Citizen Science" movement, which attempts to harness the public to essentially crowd source the collection of data. I have looked at some of the photos myself and, even at relatively high resolution, it is not easy to pick out materials. Nevertheless, I am anxious to see how this project progresses...
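Returning to the Pante and Harris et al. presentations above and the idea of probabilistic mark identification: here is a toy Python sketch of Bayesian classification from a single made-up measurement. The classes, distributions, and numbers are all invented, and the real analyses compare full 3D mark morphologies rather than one variable; the sketch only shows how a probability for each candidate agent can fall out of such a comparison.

```python
# Toy Bayesian classification of a bone-surface mark from a single
# (hypothetical) measurement. All numbers are invented for illustration;
# the real studies compare full 3D mark profiles, not one variable.
import math

# Hypothetical class-conditional distributions (mean, std. dev.) of some
# mark measurement, e.g. a depth-to-width ratio, for each known agent.
CLASSES = {
    "stone tool cutmark":   (0.20, 0.05),
    "carnivore tooth mark": (0.45, 0.10),
    "crocodile tooth mark": (0.40, 0.08),
}

def gaussian_pdf(x, mean, sd):
    """Likelihood of observing x under a normal distribution."""
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def classify(x, priors=None):
    """Return the posterior probability of each agent for measurement x."""
    priors = priors or {name: 1 / len(CLASSES) for name in CLASSES}
    weighted = {
        name: priors[name] * gaussian_pdf(x, mean, sd)
        for name, (mean, sd) in CLASSES.items()
    }
    total = sum(weighted.values())
    return {name: value / total for name, value in weighted.items()}

# An unknown mark with a measured value of 0.42 (again, hypothetical).
for agent, prob in classify(0.42).items():
    print(f"{agent}: {prob:.2f}")
```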
Lots of other great papers, just not enough energy to write about them all. First trip to Atlanta, too...seems like it has a lot going for it.

Wednesday, March 9, 2016

Giving a talk at the Virginia Museum of Natural History

I am travelling to Martinsville, VA, tomorrow to deliver a talk at the Virginia Museum of Natural History. I was contacted a few months ago by the museum's executive director, Dr. Joe Keiper, to participate in their 2nd Thursday Science Talks program. This year's theme is "From Cosmology to Conservation: Your World and Your Place in It." In the spirit of that theme, I will be discussing what I think Neandertals can teach us about the concept of "Race" among modern humans. Here are the talk's particulars:
A Neandertal's Perspective on the Existence of "Races" Among Modern Humans
For many years, anthropologists have grappled with the central paradox of "Race." On the one hand, the nature of human variation seriously undermines the biological reality of racial categories. On the other hand, it is evident that one's race, as a marker of status, identity, or heritage, is real and, thus, really matters. In this presentation, we will step back nearly 35,000 years, when the last Neandertals roamed Ice Age Eurasia, to explore what these extinct humans can teach us about race and, ultimately, what it means to be human in today's world.
Looking forward to it!