Thursday, March 31, 2016

Slavery carried bilharzia parasites from West Africa to the Caribbean, genomics confirms

Scientists used the full DNA sequences of Schistosoma mansoni parasites from Africa and the French Caribbean to discover the fluke's origins, map its historic transmission and identify the secrets of its success. Their findings show how the global slave trade transported the disease from Senegal and Cameroon to Guadeloupe. Further genomic comparison with a closely related schistosome species that infects rodents reveals how the parasite has adapted to infecting human beings.

Schistosoma mansoni is a blood fluke (flatworm) that infects more than 250 million people worldwide and causes more than 11,000 deaths each year. Six years ago the Sanger Institute published the parasite's first full DNA sequence (genome); this latest study used that 'genetic map' to construct and compare the genomes of S. mansoni parasites gathered from across Africa and the New World, the majority of which were held at the Schistosomiasis Collection in the Natural History Museum, London.

By analysing the differences between the human-infecting S. mansoni and its close relative, the rodent-infecting S. rodhaini, the scientists calculated that the two species evolved from a common ancestor approximately 107,000 to 148,000 years ago in East Africa. This finding suggests that the species is much 'younger' than previously thought.
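The dating rests on a standard molecular-clock idea: the more substitutions per site separate two genomes, the longer ago they diverged. A minimal sketch of that back-of-the-envelope calculation is below; all the numbers are hypothetical, for illustration only, and the study's actual 107,000-148,000-year estimate came from far more detailed models.

```python
# Illustrative molecular-clock estimate: divergence time from per-site
# genetic distance, an assumed mutation rate, and a generation time.
# Every input value here is hypothetical.

def divergence_time_years(genetic_distance, mutation_rate, generation_time):
    """Time since two lineages split, in years.

    genetic_distance: substitutions per site between the two species
    mutation_rate:    substitutions per site per generation
    generation_time:  years per generation
    """
    # Each lineage accumulates mutations independently after the split,
    # so the total distance reflects twice the elapsed generations.
    generations = genetic_distance / (2 * mutation_rate)
    return generations * generation_time

# Hypothetical values: 0.25% sequence divergence, a 1e-8 per-site
# per-generation mutation rate, and a roughly one-year generation time.
t = divergence_time_years(genetic_distance=2.5e-3,
                          mutation_rate=1e-8,
                          generation_time=1.0)
print(f"Estimated divergence: {t:,.0f} years ago")  # prints "Estimated divergence: 125,000 years ago"
```

With these made-up inputs the estimate lands at 125,000 years, inside the range the study reports; the point is only to show how distance, rate and generation time trade off against each other.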

"The timing of the separation of the two species coincides with the first archaeological evidence of fishing in Africa," explains Thomas Crellen, first author of the study from Imperial College London, the Sanger Institute and the Royal Veterinary College London. "The parasite develops in freshwater and infects people by burrowing through their skin. The introduction of fishing would have meant that people spent more time in the water, greatly increasing their chances of being infected."

Analysing the differences between genomes from different locations also revealed the darker side of human history.

"Comparing the S. mansoni genomes suggests that flukes in West Africa split from their Caribbean counterparts at some point between 1117 AD and 1742 AD, which overlaps with the time of the 16th-19th Century Atlantic Slave Trade," says Professor Joanne Webster from Imperial College London and the Royal Veterinary College. "During this period more than 22,000 African people were transported from West Africa to Guadeloupe by French slave ships, and the fluke was carried with them."

Comparing the genomes of S. mansoni with S. rodhaini also revealed the genetic variations that have been positively selected over time in the human-infecting fluke and have been "fixed" into its DNA. It is likely that these variations are the evolutionary adaptations that have occurred to enable the fluke to successfully tunnel into, and thrive within, human beings.

"When we looked for the differences between human-infecting S. mansoni DNA and its rodent-infecting cousin S. rodhaini, we found two important variations. We found that changes to two genes in S. mansoni's DNA -- VAL21 and an elastase gene -- appear to be important in allowing the fluke to enter and live in humans," says Dr James Cotton, senior author of the study from the Sanger Institute. "VAL genes produce proteins that cause allergic responses, so it is possible that the variation in VAL21 helps the fluke to hide from our immune systems. The elastase gene helps the parasite to burrow into the body, by breaking down elastin -- a major component of human skin."

It is hoped that, by exploring the genetic makeup of the fluke, it will be possible to discover more about the processes the parasite relies on to infect humans, and to open up new opportunities to develop preventive and therapeutic interventions.

Science Daily. 2016. “Slavery carried bilharzia parasites from West Africa to the Caribbean, genomics confirms”. Science Daily. Posted: February 16, 2016. Available online:

Wednesday, March 30, 2016

From washing machines to computers: how the ancients invented the modern world

True innovation is hard to find, as few things come out of nothing. Take the now ubiquitous selfie, for example. The format may have changed but the concept of making self-portraits is hundreds, if not thousands of years old. The same is true of many inventions that we typically think of as modern, some of which actually have precedents dating back over 1000 years.

A Roman washing machine

“Fulling” was a major occupation in the Roman world that involved cleaning cloth by trampling it in tubs containing an alkaline solution, such as water and urine or the mineral known as fuller’s earth. But in ancient Antioch, in what is now Turkey, evidence suggests the process may have been mechanised, meaning the Romans may have effectively created the world’s first washing machine as far back as the 1st century AD.

Traditionally thought of as a medieval invention, the mechanical fulling mill would likely have consisted of a waterwheel that lifted a trip-hammer, which would then drop to press the cloth. A fullers' canal mentioned in an inscription in Antioch would have supplied an estimated 300,000 m³ of water at almost a metre per second, far in excess of what was needed for regular foot-powered fulleries. The power this could generate means it could have supported fulling on an industrial scale with maybe 42 pairs of mechanical hammers.

An ancient Greek computer

In 1900, divers off the coast of the Greek island of Antikythera discovered something that changed our view of ancient science. The Antikythera mechanism is a bronze system of 30 gears that models the cycles of the sun and moon. It is effectively the first-known analogue computer, dating back to the 1st century BC. Set in a wooden box, the internal gears would have turned dials on the outside that showed the position of the sun and moon, as well as the rising and setting of specific stars and possibly the positions of Mars and Venus, too. Another dial could be moved to take into account leap years.

Although we now know that the Babylonians discovered how to use geometry to track the course of Jupiter in around 1800 BC, the Antikythera mechanism is the earliest known device that automatically calculates astronomical phenomena. We know of no other similar devices until the 8th century AD, several hundred years later, when mathematician Muhammed al-Fazari is said to have built the first Islamic astrolabe. And nothing as mechanically sophisticated would appear again until the European astronomical clocks of the 14th century.

The Great Roman Bake-Off

Bread was big business in the Roman world. It was given out by the state as part of a dole known as the annona. This meant that it was possible for people to make substantial amounts of money as bakers. One such person was Marcus Vergilius Eurysaces, a freedman (ex-slave) from Rome, who was so proud of his successful baking business that he commemorated it on his tomb. Today it is one of the most striking monuments from ancient Rome.

The top of the monument is decorated with a series of scenes that show a range of baking activities including the mixing and kneading of dough, the forming of loaves and the baked loaves being stacked in baskets. The most curious part, however, is the cylinders that make up the bulk of the monument. These features have baffled scholars for quite some time. One convincing theory argues that it is likely that these cylinders are related to baking and may well represent an early dough-mixing machine. The idea is that a rotating metal arm would have been attached to each cylinder in order to mix the dough.

The first state space project

Ninth-century Baghdad in what is now Iraq saw the rise of a growing scientific community, particularly in astronomy, centred around a library known as the “House of Wisdom”. The problem for these new scholars was that their books were written many centuries earlier and came from a wide range of different cultures – including Persian, Indian and Greek – that did not always agree. The Caliph al Ma’mun decided the only solution was to build an astronomical observatory so the city’s scholars could determine the truth.

Observatories weren’t new but a state-sponsored scientific institution was. It’s hard to be sure exactly which instruments were used in the al-Shammasiyya observatory, but they probably included a sundial, astrolabes and a quadrant set on the wall to measure the precise position of objects in the sky. The quadrant may have been the first of its kind to be used in astronomical observations. The scientists used these instruments to reassess Ptolemy’s Mathematical Treatise from the 2nd century AD, and to make numerous astronomical observations, including the latitudes and longitudes of 24 fixed stars.

Kamash, Zena. 2016. “From washing machines to computers: how the ancients invented the modern world”. The Conversation. Posted: February 15, 2016. Available online:

Tuesday, March 29, 2016

Language juggling rewires bilingual brain in a good way

Bilinguals use and learn language in ways that change their minds and brains, which has consequences -- many positive, according to Judith F. Kroll, a Penn State cognitive scientist.

"Recent studies reveal the remarkable ways in which bilingualism changes the brain networks that enable skilled cognition, support fluent language performance and facilitate new learning," said Kroll, Distinguished Professor, psychology, linguistics and women's studies.

Researchers have shown that the brain structures and networks of bilinguals are different from those of monolinguals. Among other things, the changes help bilinguals to speak in the intended language -- not to mistakenly speak in the "wrong" language.

And just as humans are not all the same, bilinguals are not all the same and the changes in the mind and brain differ depending on how the individual learned the language, what the two languages are and the context the languages are used in.

"What we know from recent research is that at every level of language processing -- from words to grammar to speech -- we see the presence of cross-language interaction and competition," said Kroll. "Sometimes we see these cross-language interactions in behavior, but sometimes we only see them in brain data."

Kroll presented recent findings about how bilinguals learn and use language in ways that change their minds and brains today (Feb. 13) at the annual meeting of the American Association for the Advancement of Science.

Both languages are active at all times in bilinguals, meaning the individuals cannot easily turn off either language and the languages are in competition with one another. In turn this causes bilinguals to juggle the two languages, reshaping the network in the brain that supports each.

"The consequences of bilingualism are not limited to language but reflect a reorganization of brain networks that hold implications for the ways in which bilinguals negotiate cognitive competition more generally," said Kroll.

Kroll was instrumental in establishing the first U.S. chapter of Bilingualism Matters at Penn State, within the University's Center for Language Science. Bilingualism Matters is an international organization that aims to bring practically applicable findings from current bilingual research to the public.

Science Daily. 2016. “Language juggling rewires bilingual brain in a good way”. Science Daily. Posted: Available online:

Monday, March 28, 2016

How learning languages translates into health benefits for society

The advantages of speaking a second language - for health and mental ability - are to come under the spotlight at an event at the AAAS annual meeting in Washington, DC.

Experts in bilingualism will examine how learning a second language at any age not only imparts knowledge and cultural understanding, but also improves thinking skills and mental agility. It can delay brain ageing and offset the initial symptoms of dementia.

During the symposium, researchers will examine how findings from bilingualism research are currently applied, and how they could best benefit society through education, policymaking and business. Experts will examine current research themes related to bilingualism from infancy to old age, and explore their implications for society.

Professor Antonella Sorace of the University of Edinburgh, who established and directs the Bilingualism Matters Centre, will focus on research on minority languages, such as Gaelic and Sardinian. She will discuss whether the benefits associated with minority languages are consistent with those of learning more prestigious languages.

Professor Sorace will be joined by researchers from San Diego State University, Pennsylvania State University, Concordia University, Nizam's Institute of Medical Sciences, the Chinese University of Hong Kong and the University of Connecticut.

The symposium, entitled 'Bilingualism Matters', is directly inspired by the Bilingualism Matters Centre at the University of Edinburgh, which is at the forefront of public engagement in this field and has a large international network. The event will take place from 1.30-4.30pm on Saturday 13 February in the Marshall Ballroom South, Marriott Wardman Park, Washington DC.

Professor Sorace, of the University of Edinburgh's School of Philosophy, Psychology and Language Sciences, said: "We are excited to reflect on Edinburgh's experiences in bilingualism as an international example of cutting-edge scientific research and public engagement, and to share the current state of research in this area and its relevance for the general public."

EurekAlert. 2016. “How learning languages translates into health benefits for society”. EurekAlert. Posted: February 13, 2016. Available online:

Sunday, March 27, 2016

'Lost' Roads of Ancient Rome Discovered with 3D Laser Scanners

Laser scans of Britain's terrain may reveal weathered Roman roads that have been hidden for centuries across the countryside of northern England.

Over the past 18 years, the U.K.'s Environment Agency has used a technology called lidar to collect data for more than 72 percent of England's surface. This remote sensing technique bounces laser light beams off the ground to make 3D terrain maps that can peer below vegetation and reveal the contours of every ditch and boulder below.
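At its core, each lidar measurement is a time-of-flight calculation: the scanner times a laser pulse's round trip and converts it to distance. The sketch below shows just that conversion, with hypothetical numbers; real airborne lidar also corrects for aircraft position, beam angle and atmospheric effects.

```python
# Minimal time-of-flight sketch behind a single lidar range measurement.
# The pulse travels out and back, so the one-way distance is
# half of (speed of light * round-trip time).

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_return_time(round_trip_seconds):
    """Distance to the reflecting surface, in metres, from the
    round-trip travel time of a laser pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after about 6.67 microseconds has hit something
# roughly one kilometre away.
print(range_from_return_time(6.67e-6))
```

Millions of such ranges, combined with the aircraft's GPS track, become the 3D terrain model; filtering out returns from vegetation is what lets the maps "peer below" tree cover.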

The U.K.'s lidar maps were used primarily for environmental purposes, such as for planning flood defenses or tracking eroding coastlines. But last summer, the agency dumped all 11 TB of its data sets onto the Survey Open Data website.

The maps grabbed the attention of archaeologists and history buffs — among them, David Ratledge, a 70-year-old retired road engineer who has spent nearly five decades searching for ancient Roman roads, The Times of London reported.

After the Romans invaded Britain in the first century A.D., they built an impressive network of roads to secure their occupation. You can walk in the footsteps of Roman soldiers on a few surviving sections of these ancient highways today, but many routes have been stripped of their stones or they have been obscured by development and farmland.

These "lost" roads left some gaps in the history of Roman Britain. One mystery for Ratledge was, how did the Romans get from Ribchester to Lancaster? With access to the new maps, Ratledge thinks he has solved the puzzle. He traced an 11-mile (17 kilometers) road from Ribchester to the main north-south road at Catterall that then led to Lancaster.

"The road takes a very logical and economical route to join the main north-south road at Catterall and hence on to Lancaster," Ratledge wrote on the website of the Roman Roads Research Association. "Years of looking for a road via Priest Hill, White Chapel, Beacon Fell, Oakenclough and Street proved to be time spent in the wrong place!"

Ratledge said a prominent stretch of a Roman rampart is even visible in Google Street View.

"How nobody — me included — spotted it is a mystery," he wrote.

Archaeologists Hugh Toller and Bryn Gethin have also used the lidar data to find four other roads, including a missing part of a Roman road called the Maiden Way, the U.K. Environment Agency said in a statement.

First developed in the 1960s, lidar has a variety of uses. In one of its best-known early applications, it helped NASA's astronauts study the surface of the moon during the Apollo missions. Today, it's been used to survey land for oil and gas companies, or to assess the damage of a disaster like the 2010 Haiti earthquake or Hurricane Sandy. It's even been used in an artistic capacity, to make haunting portraits of people in Ethiopia.

The technique has also become a useful tool for archaeologists who want to look for buried structures without breaking ground. In recent years, archaeologists have used lidar to discover the foundations of a lost city in the Honduran rainforest, map the sprawling ancient city of Angkor in Cambodia and reveal lost historic sites across New England.

In England, archaeologists aren't the only ones interested in the Environment Agency's terrain maps. The agency said utility companies might use the data to plan the construction of new infrastructure, and winemakers might even find the lidar maps useful when scouting potential plots for vineyards. "Minecraft" players have also requested the data sets to help them build virtual worlds.

Gannon, Megan. 2016. “'Lost' Roads of Ancient Rome Discovered with 3D Laser Scanners”. Live Science. Posted: February 11, 2016. Available online:

Saturday, March 26, 2016

How society deals with human suffering

Millions of people experience social suffering in their everyday lives. But how should we venture to understand these brute facts of modern existence? How do they impact upon our cultural beliefs, political outlooks and moral behaviours?

In a new book, entitled A Passion for Society: How We Think About Human Suffering, Dr Iain Wilkinson, of the University of Kent, and co-author Professor Arthur Kleinman, of Harvard University, examine the moral experience and public portrayal of human suffering and how these have changed through modern times.

The authors go on to investigate how the knowledge people acquire of the suffering of others holds the potential to inspire caring acts of compassion.

Taking an historical perspective, A Passion for Society further considers the development of social science, with a particular focus on how this has been shaped in response to problems of social suffering. The authors argue that social science's original concern with social suffering and its amelioration gave way to a professionalisation that espoused dispassionate enquiry above the pursuit of humanitarian social reform.

Dr Wilkinson and Professor Kleinman then chart the more recent recuperation of this lost tradition and explore some of the ways in which social inquiries coupled with caring actions for others are currently revitalising and remaking the discipline of social science.

The authors conclude by arguing for what they describe as an engaged social science that connects critical thought with social action and operates with a commitment to establish and sustain humane forms of society.

Iain Wilkinson is Reader in Sociology within Kent's School of Social Policy, Sociology and Social Research. Arthur Kleinman is Professor of Medical Anthropology within Harvard Medical School's Department of Social Medicine. A Passion for Society: How We Think About Human Suffering, was published in January 2016 by the University of California Press. See:

EurekAlert. 2016. “How society deals with human suffering”. EurekAlert. Posted: February 11, 2016. Available online:

Friday, March 25, 2016

Mysterious Graves Discovered at Ancient European Cemetery

One of the oldest cemeteries in Europe has recently been discovered, with graves dating back almost 8,500 years. Two of the most intriguing finds are the skeleton of a six-month-old child and a mysterious upright burial of a man in his early 20s.

The German cemetery, called Gross Fredenwalde after a nearby village, belongs to a time known as the Mesolithic, when Europe was populated by hunter-gatherers. At a press conference Thursday morning in Berlin, excavators announced that nine skeletons have been uncovered on the hilltop burial site so far, five of them children younger than 6 years old. And the researchers found ample evidence that more graves remain unexcavated.

"It's rare for the Mesolithic to find multiple graves in one place," says forensic anthropologist Bettina Jungklaus, who excavated one of the bodies. "They were mobile people, ranging over the landscape."

Excavations in 2013 and 2014 uncovered evidence of the prehistoric graveyard, found 50 miles north of Berlin on a hill 300 feet above the plains below. The hilltop's hard, rocky soil would have been a tough place to dig graves. With no water sources nearby, it would have been a bad place for a settlement, too.

In a paper published in the journal Quartär, Thomas Terberger, the archaeologist who led the recent dig, says the burials are evidence of careful planning. "It's not an accumulation of burials by accident, but a place where they decided to put their dead," says Terberger, of the Lower Saxony Department of Historic Preservation. "It's the first evidence of a true cemetery in northern Europe or Scandinavia."

That, colleagues say, makes the spot special. "It's a big surprise," says Erik Brinch Petersen, an archaeologist at the University of Copenhagen. "Hunter-gatherer people typically buried their dead right next to their houses. Here in northern Europe, a site like this is unique."

The infant skeleton is rare, too. Researchers say it's the earliest infant skeleton ever found in Germany, and one of the oldest in Europe. Excavators removed the fragile remains from the cemetery in a single, 660-pound (300 kilogram) block of earth, making it possible to carefully expose the 8,400-year-old skeleton in the controlled setting of a lab.

"It's really rare to find an intact burial like this, because an infant's bones are so small and fragile," says Jungklaus. Laid to rest not long after it turned six months old, the baby is almost perfectly preserved, its arms folded across its tiny chest. The bones and nearby soil are stained red from ochre pigment used to decorate the body for burial.

The excellent preservation offers researchers a wealth of information. Chemical signatures in the bones, for example, could show whether the infant was breast-fed; DNA could establish links to other skeletons in the cemetery and determine the infant's gender. Learning more about its short life and how it died could tell archaeologists more about what conditions were like for Europe's early inhabitants. "We can look at possible illnesses, and perhaps determine the cause of death," Jungklaus says. "Children are always the weakest link – they're the first victims when the environment or living situation changes."

While the infant burial is remarkable, the body of a young man found nearby has excavators puzzled – and excited. Buried more than 1,000 years after the infant, the man was entombed standing up, together with bone tools and flint knives. The man's skeleton suggests he lived a pretty easy life. It doesn't show signs that he did a lot of physically taxing labor.

"He looks like a flint knapper or experienced craftsman, rather than the strongest boy of the group," Terberger says. Stranger still, the vertical grave was filled in just as far as the man's knees at first. His upper body was allowed to partially decay and fall apart before the grave was filled in. At some point, a fire was built on top of the tomb.

One possible explanation comes from hundreds of miles to the northeast. Standing burials similar to the one at Gross Fredenwalde have been found in a cemetery called Olenij Ostrov in modern-day Russia, from about the same time. Researchers have long assumed culture flowed into ancient Europe from the south, but these odd burials suggest that there was active migration or communication across northern Europe as well. "This man is an indication of such eastern influences," Terberger says; DNA results from his bones might be able to tease out the connections.

From early analyses of his DNA and the grave goods he was buried with, it's clear the young man buried standing up was a hunter-gatherer, like the infant he shared the cemetery with. But he died about 7,000 years ago, meaning the hilltop cemetery was in use for more than a millennium. His death occurred about the same time the first farmers arrived in this part of Europe, part of a process that changed the face of the continent. The overlap might help researchers understand what happened when hunter-gatherers first encountered immigrants bringing new technologies and lifestyles from far to the south.

"Late hunter-gatherers and early farmers lived side-by-side," Terberger says. But the evidence from the graveyard suggests that relations were chilly. Archaeologists have found farmer settlements from the same time period just 7 miles (10 kilometers) away from the hunter-gatherer cemetery – but no signs that the people buried there had any meaningful contact with their neighbors. "They must have looked in each other's eyes, but not exchanged anything – neither goods nor genes," says Petersen.

Curry, Andrew. 2016. “Mysterious Graves Discovered at Ancient European Cemetery”. National Geographic News. Posted: February 11, 2016. Available online:

Thursday, March 24, 2016

Neanderthal DNA has subtle but significant impact on human traits

Since 2010 scientists have known that people of Eurasian origin have inherited anywhere from 1 to 4 percent of their DNA from Neanderthals.

The discovery spawned a number of hypotheses about the effects these genetic variants may have on the physical characteristics or behavior of modern humans, ranging from skin color to heightened allergies to fat metabolism... generating dozens of colorful headlines including "What your Neanderthal DNA is doing for you" and "Neanderthals are to blame for our allergies" and "Did Europeans Get Fat From Neanderthals?"

Now, the first study that directly compares Neanderthal DNA in the genomes of a significant population of adults of European ancestry with their clinical records confirms that this archaic genetic legacy has a subtle but significant impact on modern human biology.

"Our main finding is that Neanderthal DNA does influence clinical traits in modern humans: We discovered associations between Neanderthal DNA and a wide range of traits, including immunological, dermatological, neurological, psychiatric and reproductive diseases," said John Capra, senior author of the paper "The phenotypic legacy of admixture between modern humans and Neanderthals" published in the Feb. 12 issue of the journal Science. The evolutionary geneticist is an assistant professor of biological sciences at Vanderbilt University.

Some of the associations that Capra and his colleagues found confirm previous hypotheses. One example is the proposal that Neanderthal DNA affects cells called keratinocytes that help protect the skin from environmental damage such as ultraviolet radiation and pathogens. The new analysis found Neanderthal DNA variants influence skin biology in modern humans, in particular the risk of developing sun-induced skin lesions called keratosis, which are caused by abnormal keratinocytes.

In addition, there were a number of surprises. For example, they found that a specific bit of Neanderthal DNA significantly increases risk for nicotine addiction. They also found a number of variants that influence the risk for depression: some positively and some negatively. In fact, a surprising number of snippets of Neanderthal DNA were associated with psychiatric and neurological effects, the study found.

"The brain is incredibly complex, so it's reasonable to expect that introducing changes from a different evolutionary path might have negative consequences," said Vanderbilt doctoral student Corinne Simonti, the paper's first author.

According to the researchers, the pattern of associations that they discovered suggests that today's population retains Neanderthal DNA that may have provided modern humans with adaptive advantages 40,000 years ago as they migrated into new non-African environments with different pathogens and levels of sun exposure. However, many of these traits may no longer be advantageous in modern environments.

One example is a Neanderthal variant that increases blood coagulation. It could have helped our ancestors cope with new pathogens encountered in new environments by sealing wounds more quickly and preventing pathogens from entering the body. In modern environments this variant has become detrimental, because hypercoagulation increases risk for stroke, pulmonary embolism and pregnancy complications.

In order to discover these associations, the researchers used a database containing 28,000 patients whose biological samples have been linked to anonymized versions of their electronic health records. The data came from eMERGE - the Electronic Medical Records and Genomics Network funded by the National Human Genome Research Institute - which links digitized records from Vanderbilt University Medical Center's BioVU databank and eight other hospitals around the country.

This data allowed the researchers to determine if each individual had ever been treated for a specific set of medical conditions, such as heart disease, arthritis or depression. Next they analyzed the genomes of each individual to identify the unique set of Neanderthal DNA that each person carried. By comparing the two sets of data, they could test whether each bit of Neanderthal DNA individually and in aggregate influences risk for the traits derived from the medical records.
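The comparison described above amounts to an association test between variant carriership and a recorded trait. As a hedged sketch of the simplest version of that idea, the toy example below computes an odds ratio from a 2x2 contingency table; all the counts are hypothetical, and the actual study used far more sophisticated statistical models over the eMERGE data.

```python
# Toy association test: does carrying a given Neanderthal-derived
# variant track with a phenotype flagged in (hypothetical) records?
# An odds ratio above 1 means the trait is more common in carriers.

def odds_ratio(carrier_cases, carrier_controls,
               noncarrier_cases, noncarrier_controls):
    """Odds of the phenotype in variant carriers vs non-carriers."""
    carrier_odds = carrier_cases / carrier_controls
    noncarrier_odds = noncarrier_cases / noncarrier_controls
    return carrier_odds / noncarrier_odds

# Hypothetical counts: 30 of 130 carriers have the trait,
# versus 100 of 1,100 non-carriers.
ratio = odds_ratio(30, 100, 100, 1000)
print(f"odds ratio = {ratio:.2f}")  # prints "odds ratio = 3.00"
```

In practice each such ratio would also need a significance test and correction for testing many variants against many traits, which is where most of the real statistical work lies.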

"Vanderbilt's BioVU and the network of similar databanks from hospitals across the country were built to enable discoveries about the genetic basis of disease," said Capra. "We realized that we could use them to answer important questions about human evolution."

According to the evolutionary geneticist, this work establishes a new way to investigate questions about the effects of events in recent human evolution.

The current study was limited to associating Neanderthal DNA variants with physical traits (phenotypes) included in hospital billing codes, but there is a lot of other information contained in the medical records, such as lab tests, doctors' notes, and medical images, that Capra is working on analyzing in a similar fashion.

Reference: 2016. “Neanderthal DNA has subtle but significant impact on human traits”. Posted: February 11, 2016. Available online:

Wednesday, March 23, 2016

First Migrants to Imperial Rome ID'd by Their Teeth

Three adult men and a young adolescent of unknown gender buried in cemeteries outside Rome were likely migrants to the city, their teeth reveal.

The four immigrants all lived during the first to third centuries A.D. They are the first individuals ever to be identified as migrants to the city during the Roman Imperial era, which began around the turn of the millennium and ended in the fourth century.

This was a time when Rome was a thriving, complex metropolis, said study researcher Kristina Killgrove, a biological anthropologist at the University of West Florida.

"Up to a million people were living there," Killgrove told Live Science. "This population ebbed and flowed. You had people who were migrating in, and you had people who were dying and [people who were] migrating out."

Hidden history

Previous researchers have estimated that 40 percent of the people who lived in Rome during this period were slaves (some born locally and some imported), and about 5 percent were voluntary migrants to the city. But there was no census in Rome and no records of the comings and goings of individuals, Killgrove said.

She searched for evidence of these early travelers in two cemeteries right outside Rome's walls — Casal Bertone to the east, and Castellaccio Europarco to the south. To uncover people's origins, Killgrove and colleague Janet Montgomery of Durham University in the United Kingdom analyzed the isotopes in their molars. They focused on the first molar, which starts forming at birth and finishes forming at age 4. The enamel of this molar holds a record of what people ate and drank in their first years. 

"Teeth are kind of like little time capsules in your mouth," Killgrove said.

Isotopes are versions of the same element with different numbers of neutrons in their nuclei. The researchers analyzed strontium isotopes in molars from 105 skeletons from the two cemeteries, and further analyzed oxygen and carbon isotopes in a subset of 55 of those individuals. Strontium enters food and water through the weathering of rocks and indicates the geology of the area where a person spent his or her first years. Oxygen reflects the source of a person's drinking water, including meteorological factors like humidity and rainfall. Carbon can provide information about a person's diet, particularly whether it was rich in C4 plants (maize and millet, for example) or C3 plants (rice and wheat, among others), which leave distinct carbon isotope signatures.
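The logic of combining isotope signals into an origin assessment can be pictured as a simple decision rule: an individual whose strontium and oxygen values both fall outside the local baseline is a strong migrant candidate, while mixed signals stay ambiguous. The sketch below illustrates only that reasoning; the threshold values are hypothetical placeholders, not figures from the study:

```python
# Illustrative sketch of multi-isotope screening for non-local origin.
# All ranges are hypothetical placeholders, NOT values from the study.

LOCAL_SR_RANGE = (0.708, 0.710)   # assumed 87Sr/86Sr range for young volcanic geology
LOCAL_O_RANGE = (-7.0, -5.0)      # assumed delta-18O range for local drinking water

def classify_origin(sr_ratio, d18o):
    """Flag an individual based on whether each isotope value
    falls inside the assumed local baseline."""
    sr_local = LOCAL_SR_RANGE[0] <= sr_ratio <= LOCAL_SR_RANGE[1]
    o_local = LOCAL_O_RANGE[0] <= d18o <= LOCAL_O_RANGE[1]
    if sr_local and o_local:
        return "consistent with local origin"
    if not sr_local and not o_local:
        return "probable migrant (both signals non-local)"
    return "ambiguous (signals disagree)"
```

In practice the study's inferences were far more nuanced, since imported food and aqueduct water widen the "local" range, but the combine-multiple-lines-of-evidence structure is the same.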

Trading places

A combination of these isotopes revealed that two adult men who were between 35 and 50 when they died, one adult man older than 50, and a teenager between the ages of 11 and 15 almost certainly came to Rome from somewhere else. A couple of the men had high levels of certain strontium isotopes, indicative of starting life in a place where the rocks are old. Much of Italy is made of young, volcanic rocks, Killgrove said. The closest old rocks are in the Alps, or on some of the islands of the Tyrrhenian Sea. The oxygen isotope analysis also hinted that these two men could have come from an Alpine climate, though it's impossible to be sure, Killgrove and Montgomery reported.

The adolescent had low strontium isotope levels, suggestive of a home environment of young limestone or basalt. High oxygen isotope ratios pointed to a hot climate. Those clues suggest a possible North African origin for this young person.

Four other individuals (two 7- to 12-year-olds, a male between the ages of 11 and 15 and a female between the ages of 16 and 20) also had isotope signatures that suggested they may not have been native Romans, but the data were a bit ambiguous, Killgrove said. Figuring out whether people migrated to Rome is particularly difficult because people in the city ate imported food and drank water drawn from large areas through aqueducts, meaning their isotope ratios have a broader range than those of people living in a more self-contained city.

It's impossible to tell why the migrants found in the Roman cemeteries moved, Killgrove said. They may have been slaves, or they may have come to Rome for voluntary reasons. The burials appear to be those of lower-class people, Killgrove said, but that doesn't mean they weren't free. Notably, the immigrants' diets do seem to have changed when they moved to Rome. As children, they ate diets higher in C4 foods, probably millet, Killgrove said.

"When they came to Rome, that becomes more in line with the Roman diet, which is more wheat-based than millet-based," she said. (Killgrove has previously found class differences in the amount of millet and wheat eaten by Romans.)

Killgrove is now working at another cemetery site outside Rome and plans more isotope analysis, along with DNA studies. An understanding of migration can deepen an understanding of Rome's development, as well as Imperial Roman slavery, acculturation to Roman culture and even disease transmission.

"It all goes back to migration," Killgrove said.

Pappas, Stephanie. 2016. “First Migrants to Imperial Rome ID'd by Their Teeth”. Live Science. Posted: February 10, 2016. Available online:

Tuesday, March 22, 2016

Prelinguistic infants can categorize colors

A joint group of researchers from Chuo University, Japan Women's University and Tohoku University has revealed that infants aged between 5 and 7 months hold the representation of color categories in their brain, even before the acquisition of language.

This study is published online in the Proceedings of the National Academy of Sciences.

A long-held theory, the Sapir-Whorf hypothesis, claims that language shapes our perceptions. This theory is widely accepted in various fields of study, including psychology, linguistics and anthropology. Color perception is also considered subject to this theory, since colors are referred to by name in daily communication.

Numerous studies of the color lexicons of the world's languages have led researchers to consider categorical color perception to be strongly affected by language. On the other hand, the similarity of color categories across linguistic and cultural boundaries has been reported as strong evidence for the universality of color categories. Whether or not language shapes color categories has therefore been a central question in how we perceive color.

This new study reveals that the category of colors can be independent of language, at least in the early stage of development in an infant's visual system.

Infants aged 5-7 months were tested to see whether their brain activity differed for colors in different categories. Brain activity was measured with near-infrared spectroscopy, a technique that allows comfortable measurement of brain activity in infants.

The study found that brain activity increased significantly when blue and green stimuli were alternated, while there was no significant response to the alternation of different shades of green. The difference was observed in the occipito-temporal area in both the left and right hemispheres.

A similar difference was found in adult participants with no significant lateralization. Since language related cortical areas reside in the left hemisphere in most right-handed adults, the observed brain activity had no direct relation to language processing. In addition, brain activity caused by categorical color differences was not found in the occipital region, which is known to play a significant role in the early stage of visual processing.

These results show that color information is processed through multiple cortical stages in infants, in a way similar to adults. They suggest that different color categories are represented distinctly in the infant brain, even before the acquisition of language, and imply that color categories can develop independently of the acquisition of the relevant language.

Science Daily. 2016. “Prelinguistic infants can categorize colors”. Science Daily. Posted: February 10, 2016. Available online:

Monday, March 21, 2016

Ability to navigate between cultures is good for Mexican-American youth

Biculturalism is positively associated with prosocial behaviors such as helping others and self-esteem

Approximately 40 million foreign-born persons, representing about 13 percent of the population, live in the United States. Many Latino immigrants find it best to maintain their cultures and identities while acclimating to mainstream American culture, thereby becoming bicultural. New research from the University of Missouri points to biculturalism as an indicator of positive self-evaluation and prosocial tendencies, such as empathy towards others, for Mexican-American youth.

"Regardless of the nationality of a parent, one thing remains constant--parents want their children to have prosocial tendencies," said Gustavo Carlo, Millsap Professor of Diversity in MU's College of Human Environmental Sciences. "Parents want their kids to have self-esteem, to care for others and be confident: traits that lead to relatively high levels of well-being. This is particularly true for Latino immigrants working to make a better life for their children in the U.S."

Carlo says that for Latino youth in the U.S., biculturalism allows them to stay connected with both their culture of origin and the culture of their communities. Through his research, Carlo found that those with higher biculturalism scores had greater prosocial tendencies and positive self-evaluation. Moreover, he found that prosocial actions, such as caring for others and helping those in need, promote a better self-concept making it easier to maintain connection with one's culture of origin.

To study the impacts that biculturalism has on Latino youth, Carlo focused on the predicted positive associations biculturalism would have on positive self-evaluations and whether prosocial tendencies increase as biculturalism increases. Carlo surveyed 574 U.S. Mexican adolescents living in the greater Phoenix area. The survey consisted of questions related to ethnicity, language spoken at home, willingness to help others and self-esteem.

"We found that adolescents who can adopt both their culture of heritage and mainstream culture and those who can navigate between the two worlds are more likely to be confident, have higher self-esteem and help others," Carlo said. "However, not all adolescents have the luxury to navigate both worlds. For example, one may want to fit in with their peers but, for a variety of reasons, is unable to do so. Then the next best alternative is to remain connected with one's culture of origin to improve overall well-being."

To help Latino youth navigate between cultures, parents and teachers can play roles, Carlo said. He suggests that parents be open to their children's entering and adopting mainstream culture as well as teachers' supporting programs that promote inclusion and diversity.

The Journal of Latina/o Psychology will publish the study, "The Associations of Biculturalism to Prosocial Tendencies and Positive Self-Evaluations," this spring. The research was funded by the National Institute of Mental Health (MH068920). The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agency. Camille Basilio and George Knight from Arizona State University co-authored the study. Carlo's book, "Prosocial Development: A Multidimensional Approach," was recently published by Oxford University Press and received an award from the American Educational Research Association.

EurekAlert. 2016. “Ability to navigate between cultures is good for Mexican-American youth”. EurekAlert. Posted: February 10, 2016. Available online:

Sunday, March 20, 2016

9,200 year old site in Sweden indicates an early date for Nordic human settlement

The discovery of the world’s oldest storage of fermented fish in southern Sweden could rewrite Nordic prehistory, with findings indicating a far more complex society than previously thought. The unique discovery by osteologist Adam Boethius from Lund University was made while excavating a 9,200-year-old settlement at what was once a lake in Blekinge, Sweden.

“Our findings of large-scale fish fermentation, a traditional way of preserving fish, indicate that not only was this area settled at that time, it was also able to support a large community”, says Adam Boethius, whose findings are now being published in the Journal of Archaeological Science.

A different perspective

The discovery also indicates that Nordic societies were far more developed 9,200 years ago than previously believed. The findings are important because it is usually argued that people in the north lived relatively mobile lives, while people in the Levant – a large area in the Middle East – became settled and began to farm and raise cattle much earlier.

“These findings indicate a different time line, with Nordic foragers settling much earlier and starting to take advantage of the lakes and sea to harvest and process fish. From a global perspective, the development in the Nordic region could correspond to that of the Middle East at the time.

The discovery is unique as a find like this has never been made before. That is partly because fish bones are so fragile and disappear more easily than, for example, bones of land animals. In this case, the conditions were quite favourable, which helped preserve the remains”, says Boethius.

The fermentation process is also quite complex in itself. Because people did not have access to salt or the ability to make ceramic containers, they acidified the fish using, for example, pine bark and seal fat, and then wrapped the entire content in seal and wild boar skins and buried it in a pit covered with muddy soil. This type of fermentation requires a cold climate.

Past Horizons. 2016. “9,200 year old site in Sweden indicates an early date for Nordic human settlement”. Past Horizons. Posted: February 8, 2016. Available online:

Saturday, March 19, 2016

Humans have always been migrants

A short animated film commissioned by two University of Kent historians challenges the concept that migration at current levels is a new phenomenon.

With migration now a major topic of debate across Europe, Professor Ray Laurence and Dr Julie Anderson, working with the University of Reading's Dr Hella Eckardt, created a script and commissioned the film to provide the public, schools and policy makers with a better understanding of its history.

The 75 second animation draws from research on the Roman Empire and the First World War.

This includes recent developments in the chemical analysis of the teeth of skeletons from Roman-era Britain, which revealed that migrants from North Africa were living in York. The analysis also suggests that up to 30% of Britain's population during the Roman period came from abroad.

Research into the hidden histories of war graves in Britain has also identified migrants who took part in WWI. The conversion of Brighton Pavilion into a hospital for Indian troops provides further evidence of the important role they played.

With the topic of migration now included in the GCSE History curriculum of the exam board Oxford Cambridge and RSA (OCR), the film is expected to provide children with a far more complex view of Britain's population in the past.

Fleming, Sandy. 2016. “Humans have always been migrants”. Posted: February 9, 2016. Available online:

Friday, March 18, 2016

Late Antique Little Ice Age 1,500 years ago

Dendroclimatologist Ulf Büntgen and his fellow researchers were able for the first time to precisely reconstruct the summer temperatures in central Asia for the past 2,000 years. This was made possible by new tree-ring measurements from the Altai mountains in Russia. The results complement the climatological history of the European Alps, stretching back 2,500 years, that Büntgen and collaborators published in 2011 in the journal Nature Geoscience. “The course temperatures took in the Altai mountains corresponds remarkably well to what we found for the Alps,” says Büntgen. The combined findings make it possible, for the first time, to infer summer temperatures for large parts of Eurasia over the past two millennia.

Cold phase in 6th century

Tree-ring widths in old trees reflect the summer climate in any given year in the past. Looking at these, the researchers were particularly struck by a cold phase in the 6th century. It exhibited even lower temperatures, longer duration and larger expanse than the temperature drops in the Little Ice Age (13th to 19th centuries CE). “This was the most dramatic cooling in the Northern Hemisphere in the past 2,000 years,” explains Büntgen.

Climate and culture

In light of this, the researchers refer to the period from 536 to around 660 CE for the first time as the “Late Antique Little Ice Age” (LALIA). This was triggered by three major volcanic eruptions in 536, 540 and 547 CE, whose climatic impact was prolonged further by the retardant effect of the oceans and a minimum in solar activity.

According to the team of naturalists, historians and linguists, this period bore witness to a whole series of social upheavals. After famine, the Justinian plague established itself between 541 and 543 CE, killing millions of people in the centuries that followed and possibly contributing to the decline of the Eastern Roman Empire.


Proto-Slavic-speaking people migrated, supposedly from the Carpathian region, into the eastern areas of modern-day Europe that had been abandoned by the Romans, thereby forming the Slavic language area. According to the researchers, this period of cool temperatures may also have fostered the expansion of the Arab Empire in the Middle East. The Arabian Peninsula received more rain, growing more vegetation, which may have sustained larger herds of camels used by the Arab armies for their campaigns.

In cooler areas, various peoples also migrated east towards China, maybe driven away by a lack of pastureland in central Asia. As a result, hostilities broke out in the steppe regions of northern China between nomadic groups and the local ruling powers. Subsequently, an alliance between these steppe populations and the Eastern Romans conquered the Sasanian Empire in Persia, leading to its collapse.

Strategies for modern-day climate change

While the researchers stress that potential links between this period of cool temperatures and socio-political changes must always be treated with great caution, they write that “the LALIA fits in well with the main transformative events that occurred in Eurasia during that time”.

Ulf Büntgen points out that their study serves as an example of how sudden climatological shifts can change existing political systems: “We can learn something from the speed and scale of the transformations that took place at that time,” he says. “Knowledge about the effects of past climatic fluctuations could contribute to developing strategies for dealing with modern climate change.”

Past Horizons. 2016. “Late Antique Little Ice Age 1,500 years ago”. Past Horizons. Posted: February 8, 2016. Available online:

Thursday, March 17, 2016

Innate teaching skills 'part of human nature'

Some 40 years ago, Washington State University anthropologist Barry Hewlett noticed that when the Aka pygmies stopped to rest between hunts, parents would give their infants small axes, digging sticks and knives.

To parents living in the developed world, this could be seen as irresponsible. But in all the intervening years, Hewlett has never seen an infant cut him- or herself. He has, however, seen the exercise as part of the Aka way of teaching, an activity that most researchers - from anthropologists to psychologists to biologists - consider rare or non-existent in such small-scale cultures.

He has completed a small but novel study of the Aka, concluding that, "teaching is part of the human genome."

"It's part of our human nature," said Hewlett, a professor of anthropology at WSU Vancouver. "Obviously, teaching as it exists in formal education is way different than the way it exists in small-scale groups that I work with. The thing is, there does seem to be something going on there."

The Aka are among the last of the world's hunter-gatherers, but their way of life accounts for 99 percent of human history. That they teach, and how they teach, offers new insight into who we are as humans and how we might best learn.

Clearly, the Aka are not helicopter parents who would shudder at the thought of giving sharp objects to any children, let alone 1-year-olds. Rather, the Aka place a high value on individual autonomy, in addition to sharing and egalitarianism, so they're unlikely to intervene in one another's behavior.

"One does not coerce or tell others what to do, including children," Hewlett and co-author Casey Roulette write in Royal Society Open Science, an open-access journal by the world's oldest scientific publisher, The Royal Society of London.

After he saw the Aka teaching infants how to use various tools, he was told by social-cultural anthropologists that the activity was "just play." To their credit, said Hewlett, social-cultural anthropologists have recognized that teaching can be done outside a formal setting.

"The downside to that is they hadn't looked at teaching more broadly as part of human nature," he said.

But cognitive psychologists and evolutionary biologists suggested teaching is universal. Hewlett was particularly intrigued by the thinking of cognitive psychologists like Gyorgy Gergely of Central European University.

Gergely described an innate form of teaching called "natural pedagogy" in which a teacher directly demonstrates skills by, say, pointing, gazing or talking to a child. The learners in turn use the cues to imitate and learn about novel objects.

"It's important to remember that, cognitively, teaching occurs both in the teacher as well as in the child," said Hewlett. "The child needs to know that these particular cues mean something and the teacher knows how to use these particular cues to draw attention to knowledge that may not be clear to the learner. It's a co-evolution in the sense that it's happening both with the child and the so-called teacher."

Hewlett videotaped five male and five female 12- to 14-month-old infants for one hour each, usually in a naturalistic setting in or near their camp. He would have liked to videotape more but civil war in the Central African Republic made that impossible.

Later, Hewlett, Roulette and a person unfamiliar with the hypotheses coded the taped behavior of children and adults to identify moments when an adult modified his or her behavior to enhance learning, researchers' minimalist definition of teaching.

The researchers documented 169 discrete teaching events, like a caregiver demonstrating how to use a knife. Almost half lasted less than three seconds, with teachers giving positive and negative feedback, demonstrating activities, pointing, giving verbal instruction and "opportunity scaffolding": providing an object, like a digging stick, and the chance to use it.

Hewlett said he was surprised by how frequently the Aka teach their infants. More than 40 percent of the time, infants imitated skills to which they were exposed, and on average less than four minutes of teaching led to more than nine minutes of practice.

The teaching interventions were brief and subtle, and Hewlett came to appreciate the value of letting the child learn as much as possible on his or her own.

"We know learning can be very rapid when it is self-motivated," he said. "When you take away the autonomy of the child, that impacts the self-motivation of the child."

The technique gives the child more choices and serves as an alternative to helicopter parents who hover over an infant and say, "go do this, go do that, you need to do this, you need to do that."

"This way steps backward in the other direction," he said, as in, "I need to provide advice here or there but I don't have all the right answers for my child."

EurekAlert. 2016. “Innate teaching skills 'part of human nature'”. EurekAlert. Posted: February 8, 2016. Available online:

Wednesday, March 16, 2016

Agricultural policies in Africa could be harming the poorest

Agricultural policies aimed at alleviating poverty in Africa could be making things worse, according to research by the University of East Anglia (UEA).

Published this month in the journal World Development, the study finds that so-called 'green revolution' policies in Rwanda - claimed by the government, international donors and organisations such as the International Monetary Fund to be successful for the economy and in alleviating poverty - may be having very negative impacts on the poorest.

One of the major strategies to reduce poverty in sub-Saharan Africa is through policies to increase and modernise agricultural production. Up to 90 per cent of people in some African countries are smallholder farmers reliant on agriculture, for whom agricultural innovation, such as using new seed varieties and cultivation techniques, holds potential benefit but also great risk.

In the 1960s and 70s policies supporting new seeds for marketable crops, sold at guaranteed prices, helped many farmers and transformed economies in Asian countries. These became known as "green revolutions". The new wave of green revolution policies in sub-Saharan Africa is supported by multinational companies and western donors, and is impacting the lives of tens, even hundreds of millions of smallholder farmers, according to the study's lead author Dr Neil Dawson.

The study reveals that only a relatively wealthy minority have been able to keep up with the enforced modernisation, because the poorest farmers cannot afford the risk of taking out credit for the approved inputs, such as seeds and fertilizers. Their fear of harvesting nothing from the new crops, and the potential for the government to seize and reallocate their land, means many choose to sell up instead.

The findings tie in with recent debates about strategies to feed the world in the face of growing populations, for example the influence of wealthy donors such as the Gates Foundation, initiatives such as the New Alliance for Food Security and Nutrition, and multinational companies such as Monsanto in pushing agricultural modernisation in Africa. There have also been debates about whether small or large farms are best placed to combat hunger in Africa, while struggles to maintain local control over land and food production, for example among the Oromo people in Ethiopia, have been highlighted.

Dr Dawson, a senior research associate in UEA's School of International Development, said: "Similar results are emerging from other experiments in Africa. Agricultural development certainly has the potential to help these people, but instead these policies appear to be exacerbating landlessness and inequality for poorer rural inhabitants.

"Many of these policies have been hailed as transformative development successes, yet that success is often claimed on the basis of weak evidence through inadequate impact assessments. And conditions facing African countries today are very different from those past successes in Asia some 40 years ago.

"Such policies may increase aggregate production of exportable crops, yet for many of the poorest smallholders they strip them of their main productive resource, land. This study details how these imposed changes disrupt subsistence practices, exacerbate poverty, impair local systems of trade and knowledge, and threaten land ownership. It is startling that the impacts of policies with such far-reaching impacts for such poor people are, in general, so inadequately assessed."

The research looked in-depth at Rwanda's agricultural policies and the changes impacting the wellbeing of rural inhabitants in eight villages in the country's mountainous west. Here chronic poverty is common and people depend on the food they are able to grow on their small plots.

Farmers traditionally cultivated up to 60 different types of crops, planting and harvesting in overlapping cycles to prevent shortages and hunger. However, due to high population density in Rwanda's hills, agricultural policies have been imposed which force farmers to modernise with new seed varieties and chemical fertilisers, to specialise in single crops and part with "archaic" agricultural practices.

Dr Dawson and his UEA co-authors Dr Adrian Martin and Prof Thomas Sikor recommend that not only should green revolution policies be subject to much broader and more rigorous impact assessments, but that mitigation for poverty-exacerbating impacts should be specifically incorporated into such policies. In Rwanda, that means encouraging land access for the poorest and supporting traditional practices during a gradual and voluntary modernisation.

EurekAlert. 2016. “Agricultural policies in Africa could be harming the poorest”. EurekAlert. Posted: February 7, 2016. Available online:

Tuesday, March 15, 2016

From genes to latrines—Vikings and their worms provide clues to emphysema

In a paper published today in Scientific Reports a group of researchers led by LSTM have found that the key to an inherited deficiency, predisposing people to emphysema and other lung conditions, could lie in their Viking roots.

Archaeological excavations of Viking latrine pits in Denmark have revealed that these populations suffered massive worm infestations. The way that their genes developed to protect their vital organs from disease caused by worms has become the inherited trait which can now lead to lung disease in smokers.

Chronic obstructive pulmonary disease (COPD) and emphysema affect over 300 million people, or nearly 5% of the global population. The only inherited risk factor is alpha-1-antitrypsin (A1AT) deficiency, and this risk is compounded if individuals smoke tobacco.

A1AT protects the lungs and liver from enzymes called proteases that are produced by cells of the immune system, but also by parasitic worms. In the absence of A1AT these proteases can break down lung tissue leading to COPD and emphysema. Deficiency of A1AT is genetically determined and is due to deviants of A1AT that are surprisingly common, particularly in Scandinavia, where they evolved in Viking populations more than two thousand years ago. Why these disease-causing deviants of A1AT are so common in human populations today has long been a mystery.

LSTM's Professor Richard Pleass is senior author on the paper. He said: "Vikings would have eaten contaminated food and parasites would have migrated to various organs, including lungs and liver, where the proteases they released would cause disease."

In this latest paper the authors show that these deviant forms of A1AT bind an antibody called immunoglobulin E (IgE) that evolved to protect people from worms. The binding of A1AT to IgE prevents the antibody molecule from being broken down by such proteases.

"Thus these deviant forms of A1AT would have protected Viking populations, who neither smoked tobacco nor lived long lives, from worms," continued Professor Pleass. "It is only in the last century that modern medicine has allowed human populations to be treated for disease-causing worms. Consequently these deviant forms of A1AT, which once protected people from parasites, are now at liberty to cause emphysema and COPD."
Reference: 2016. “From genes to latrines—Vikings and their worms provide clues to emphysema”. Posted: February 4, 2016. Available online:

Monday, March 14, 2016

Practice makes perfect: Switching between languages pays off

Bilingual toddlers who obtain more practice in language switching are better at certain types of problem solving

It's estimated that half of the world's population speaks two or more languages. But are there hidden benefits to being bilingual? Research from Concordia reveals a new perk visible in the problem-solving skills of toddlers.

The results of a study recently published in the Journal of Experimental Child Psychology show that bilingual children are better than monolinguals at a certain type of mental control, and that those children with more practice switching between languages have even greater skills.

Bilingual speakers can thank the sometimes-arduous practice of switching from one language to another for this skill. "This switching becomes more frequent as children grow older and as their vocabulary size increases," says Diane Poulin-Dubois, a professor in Concordia's Department of Psychology and the study's senior author.

"Therefore, the superior performance on these conflict tasks appears to be due to bilinguals' strengthened cognitive flexibility and selective attention abilities as they have increased experience in switching across languages in expressive vocabulary."

Poulin-Dubois and Cristina Crivello, a graduate student with Concordia's Centre for Research in Human Development (CRDH), led a group of researchers in a longitudinal investigation that compared bilingual toddlers to their monolingual peers, tracking the tots as they gained greater vocabularies in each of their two languages.

For the study, the researchers assessed the vocabularies of 39 bilingual children and 43 monolinguals when they were aged 24 months, and then again at 31 months. During the second assessment, the researchers also had the young participants perform a battery of tasks to test their cognitive flexibility and memory skills.

"For the most part, there was no difference between the bilingual and monolingual toddlers," says Poulin-Dubois, who is also a member of CRDH. "But that changed dramatically when it came to the conflict inhibition test, and the differences were especially apparent in the bilingual toddlers whose vocabulary had increased most."

In this case, conflict inhibition refers to the mental process of overriding a well-learned rule that you would normally pay attention to.

To assess toddlers' abilities in this domain, Crivello, who undertook the research as part of her master's thesis and is the first author of the study, administered two tests:

1. Reverse categorization -- participants were told to put a set of little blocks into a little bucket and big blocks into a big bucket. Then the instructions were switched -- big blocks in the little bucket and little blocks in the big bucket.

2. Shape conflict -- participants were shown pictures of different sized fruit and asked to name them. Then a new series of images was shown, with a small fruit embedded inside a large one. Toddlers were asked to point to the little fruit.

It wasn't surprising to the researchers that the bilingual children performed significantly better on the conflict inhibition tasks than did their monolingual counterparts.

"Language switching underlies the bilingual advantage on conflict tasks," says Crivello. "In conflict inhibition, the child has to ignore certain information -- the size of a block relative to a bucket, or the fact that one fruit is inside another. That mirrors the experience of having to switch between languages, using a second language even though the word from a first language might be more easily accessible."

The unique feature of the study was the finding that the more language switching toddlers engaged in, the more it benefitted them. Within the bilingual group of toddlers, those who had amassed a greater number of "doublets" -- pairs of words in each language, such as dog/chien -- performed even better on the conflict inhibition tasks. "By the end of the third year of life, the average bilingual child uses two words for most concepts in his or her vocabulary, so young bilingual children gradually acquire more experience in switching between languages," says Poulin-Dubois.

Science Daily. 2016. “Practice makes perfect: Switching between languages pays off”. Science Daily. Posted: February 3, 2016. Available online:

Sunday, March 13, 2016

Difficult grammar affects music experience

Reading and listening to music at the same time affects how you hear the music. Language scientists and neuroscientists from Radboud University and the Max Planck Institute for Psycholinguistics published this finding in an article in Royal Society Open Science on February 3.

"The neural pathways for language and music share a crossroads," says lead author Richard Kunert. "This has been shown in previous research, but these studies focused on the effect of simultaneous reading and listening on language processing. Until now, the effect of this multitasking on the neural processing of music has been predicted only in theory."

Kunert therefore asked his subjects to read several easy and difficult phrases while they listened to a short piece of music, which Kunert composed himself. Afterwards, he asked the subjects to judge the closure, i.e. the feeling of completeness, of a chord sequence: did it stop before the end, or had they heard the entire sequence from beginning to end?

This is an example of the task 'How complete is this chord sequence?' First play fragment 1 and then fragment 2. At the end of fragment 1, you have the feeling that the music is not 'complete' yet; it feels a bit weird. Fragment 2 ends in a better way. Fragment 3 is where it gets interesting: when you listen to this chord sequence while reading an easy sentence (below), it seems more 'complete' than when you are reading a difficult sentence (below).


The | surgeon | consoled | the | man | and | the | woman | because | the | surgery | had | not | been | successful.


The | surgeon | consoled | the | man | and | the | woman | put | her | hand | on | his | forehead.

The experiment showed that the subjects judged the music to be less complete with grammatically difficult sentences than with simple sentences. The brain area that is the crossroads of music and language therefore has to do with grammar. "Previously, researchers thought that when you read and listen at the same time, you do not have enough attention to do both tasks well. With music and language, it is not about general attention, but about activity in the area of the brain that is shared by music and language," explains Kunert.

Language and music appear to be fundamentally more alike than you might think. A word in a sentence derives its meaning from the context. The same applies to a tone in a chord sequence or a piece of music. Language and music share the same brain region to create order in both processes: arranging words in a sentence and arranging tones in a chord sequence. Reading and listening at the same time overload the capacity of this brain region, known as Broca's area, which is located somewhere under your left temple.

Previously, researchers demonstrated that children with musical training were better at language than children who did not learn to play an instrument. The results of Kunert and colleagues demonstrate that the direction of this positive effect probably does not matter. Musical training enhances language skills, and language training probably enhances the neural processing of music in the same way. But engaging in language and music at the same time remains difficult for everyone -- whether you are a professional guitar player or have no musical talent at all.

Science Daily. 2016. “Difficult grammar affects music experience”. Science Daily. Posted: February 3, 2016. Available online:

Saturday, March 12, 2016

Powerful Women Buried at Stonehenge

The remains of 14 women believed to be of high status and importance have been found at Stonehenge, the iconic prehistoric monument in Wiltshire, England.

The discovery, along with other finds, supports the theory that Stonehenge functioned, at least for part of its long history, as a cremation cemetery for leaders and other noteworthy individuals, according to a report published in the latest issue of British Archaeology.

During the recent excavation, more women than men were found buried at Stonehenge, a fact that could change its present image.

"In almost every depiction of Stonehenge by artists and TV re-enactors we see lots of men, a man in charge, and few or no women," archaeologist Mike Pitts, who is the editor of British Archaeology and the author of the book "Hengeworld," told Discovery News.

"The archaeology now shows that as far as the burials go, women were as prominent there as men. This contrasts with the earlier burial mounds, where men seem to be more prominent."

Pitts added, "By definition -- cemeteries are rare, Stonehenge exceptional -- anyone buried at Stonehenge is likely to have been special in some way: high status families, possessors of special skills or knowledge, ritual or political leaders."

The recent excavation focused on what is known as Aubrey Hole 7, one of 56 chalk pits dug just outside of the stone circle and dating to the earliest phases of Stonehenge in the late fourth and early third millennium B.C.

Christie Willis of the University College London Institute of Archaeology worked on the project and confirmed that the remains of at least 14 females and nine males -- all young adults or older -- were found at the site. A barrage of high tech analysis techniques, such as CT scanning, was needed to study the remains, given that the individuals had been cremated.

Radiocarbon dating and other analysis of all known burials at Stonehenge reveal that they took place in several episodes from about 3100 B.C. to at least 2140 B.C. Long bone pins, thought to be hair pins, as well as a mace head made out of gneiss -- a striped stone associated with transformation -- have also been excavated at Stonehenge.

As for why no children’s remains were found during this latest excavation, both Willis and Pitts believe that such corpses must have been treated differently. Pitts suspects that infants and children were also cremated, but that their ashes were scattered in the nearby river Avon.

"There is a common association between late Neolithic religious centers and the sources or upper reaches of significant rivers," he explained.

Stonehenge’s location is also important because prior U.K. burial sites, which were often large mounds containing stone and timber chambers, tended to be erected on hilltops or other high ground, far away from where people lived.

While Stonehenge was also set apart from housing, it and other later cremation cemeteries tended to be on lower ground near rivers that locals must have frequented.

Pitts said this placement is "perhaps in line with a move from a focus on male lineage and hierarchy to both genders and family or class. This reflects a parallel shift from markers of territory and land (via the barrows) to commemorations of communities."

As for the culture(s) represented by Stonehenge, Willis said the monument was built about 1,000 years after agriculture arrived from the Middle East. The people had wheat, barley, cattle, pigs, sheep and goats, but no horses yet. They did not yet use wheels, but had well-crafted stone tools. Metalworking spread to Britain at around 2400 B.C., which was well after the early stages of Stonehenge construction.

Stonehenge, now a World Heritage Site, radiates timeless beauty and achievement, but it seems women's status proved to be more ephemeral.

Willis said that the role of women in society "probably declined again towards the 3rd millennium B.C…both archaeological and historical evidence has shown that women’s status has gone up and down quite noticeably at different times in the past."

Viegas, Jennifer. 2016. “Powerful Women Buried at Stonehenge”. Discovery News. Posted: February 3, 2016. Available online:

Friday, March 11, 2016

The power of the LGBTQ language experience

Celebrating its 23rd year at American University in Washington, D.C., Lavender Languages is North America's longest-running academic conference on language use in lesbian, gay, bisexual, transgender and queer life. The conference focuses on linguistic practices, and this year also attends to the growing tensions between (homo)sexuality and citizenship in U.S. settings and worldwide.

"The conference draws senior scholars and undergraduates, community members and political activists because of their shared interests in LGBTQ language. As this year's program demonstrates, those interests now span the globe," said William Leap, the conference founder and professor in AU's Department of Anthropology.

Scholars attending this year's conference hail from countries such as Scotland, Ireland, Colombia, Ecuador, Costa Rica, France, and South Africa. A panel on language, sexuality and national belonging will explore ways in which the relationships between national belonging and sexual subjectivity are mediated through linguistic practice. In many contemporary societies, one must identify and present as heterosexual in order to be recognized as a full member of the nation, which in many cases results in conflicted relationships between LGBT citizens and the state.

Some of the highlights of the conference include:

  • David Peterson will host a back-to-basics workshop on Critical Discourse Analysis. Peterson's popular course provides useful skills for language analysis in any setting.
  • Salvador Vidal-Ortiz and Leti Gomez will introduce, discuss and take questions from the audience about their new book Queer Brown Voices (University of Texas Press, 2015). This stunning collection of papers about LGBTQ Latino/a activism has gained much praise from critics. Gomez and Vidal-Ortiz will talk about the political and linguistic dimensions of this project.
  • Research Panel: The panel features projects located in the United States, Ireland, and Argentina. Session presenters will discuss how they use language as an entry point for exploring social voice and recovering social history and take questions about research design, procedure and practice.
  • Two sessions will explore how gender and sexuality influence language learning and language teaching in everyday life and in the classroom. These Saturday morning sessions will be of particular interest to teachers.
The conference runs this year from Feb. 12-14. While it is an academic conference, all sessions take an informal approach, and members of the public are always welcome to attend. Information on the conference can be found by visiting the Lavender Languages website at

EurekAlert. 2016. “The power of the LGBTQ language experience”. EurekAlert. Posted: February 2, 2016. Available online:

Thursday, March 10, 2016

Africa’s poison arrow beetles are key in traditional hunting method

Humans started hunting with bows and arrows tens of thousands of years ago. Then, at some point, we realized that the arrows were even more effective at bringing down large game if tainted with poison. There were plenty of plant extracts that served as sources for deadly chemicals, such as curare, used by native people in the Orinoco Basin of South America. And some Amazonian hunters discovered that brightly colored poison dart (or poison arrow) frogs could also be the source of useful toxins.

But budding mystery novelists shouldn’t overlook other poison sources, such as the Bushman poison arrow beetle (Diamphidia nigroornata) of southern Africa. For the last couple of centuries, anthropologists have been recording how the San people of the region use the beetle and local plants to create poisons for their arrows. Now Caroline Chaboo of the University of Kansas in Lawrence and colleagues have gone through those past records and visited with some of the last San hunters who still hunt using traditional methods, to document a fading practice. Their new study appears February 1 in ZooKeys.

The San people are traditionally hunter-gatherers, and there are about 113,000 living in Angola, Botswana, Namibia, South Africa, Zambia and Zimbabwe. Not all San groups use poison-tipped arrows, the researchers found, but those that do reserve them for hunting large game, such as eland, elephant, wildebeest or lion, taking smaller animals with traps and snares. And it is the Ju|'hoan San of northeast Namibia, one of the largest San groups, that get their arrow poison from beetles.

The Ju|'hoansi are also the only San group that is still allowed to subsistence hunt using traditional methods, Chaboo and her colleagues write. And some still hunt with poisoned arrows and pass down that practice to the younger generation. In many other San groups, the knowledge has been lost because people have been removed from traditional lands or hunting has been made illegal.

The poison doesn’t come from grown beetles but from beetle larvae. The larvae grow in the dirt surrounding Commiphora plants (the genus that includes myrrh) that the adult beetles feed on. Hunters keep track of where the plants grow so they know where to find beetles in the two months when the insects are in their larval stage. A hunter digs up beetle cocoons and returns home. There, he breaks open the cocoon and removes the larva, discarding any pupae or adults. He rubs the skin of the larva to break it open, then squirts the tissue into a mortar made of an old giraffe or kudu knuckle bone. When he has the tissue from 10 larvae, he mixes it with saliva and a roasted bean of a Bobgunnia madagascariensis tree. The hunter then applies the mixture with a twig to the sinew that attaches an arrow to its shaft and allows it to air dry.

The actual poison in the mixture is diamphotoxin, and it causes calcium ions to rush into cells. Red blood cells inside a poisoned animal rupture, and the animal will experience convulsions, paralysis and then death. This makes killing big game much easier, since a hunter just has to nick the animal instead of landing a killing blow.

And — here’s where mystery writers should take note — the poisoned arrows are not only used in hunting wild game; they are also “the most common weapon in family quarrels, suicides, homicides and warfare,” the research team writes. “The victim can die within one day if the wounded limb is not amputated.”

As more and more people leave (or are forced to leave) traditional lifestyles, Chaboo and her colleagues note, we are losing the knowledge acquired over thousands of years and may miss out on opportunities to understand important steps in the evolution of humankind. But, perhaps more importantly, we may also be missing out on clues to knowledge that could also be beneficial in the modern world, as medicines, pesticides and other useful chemicals have been derived from natural poisons and toxins. Perhaps a poison that once laced an arrow could lead to another one of these discoveries.

Zielinski, Sarah. 2016. “Africa’s poison arrow beetles are key in traditional hunting method”. Science News. Posted: February 10, 2016. Available online:

Wednesday, March 9, 2016

Humans evolved by sharing technology and culture

Our early ancestors, Homo sapiens, managed to evolve and journey across the earth by exchanging and improving their technology

Blombos Cave in South Africa has given us vast knowledge about our early ancestors. In 2015, four open access articles, with research finds from Blombos as a starting point, have been published in the journal PLOS ONE.

"We are looking mainly at the part of South Africa where Blombos Cave is situated. We sought to find out how groups moved across the landscape and how they interacted," says Christopher S. Henshilwood, Professor at the University of Bergen (UiB) and University of the Witwatersrand and one of the authors of the articles.

The technology of our ancestors

Since its discovery in the early 1990s, Blombos Cave, about 300 kilometres east of Cape Town, South Africa, has yielded important new information on the behavioural evolution of the human species. The cave site was first excavated in 1991 and field work has been conducted there on a regular basis since 1997 - and is on-going. Blombos contains Middle Stone Age deposits currently dated at between 100,000 and 70,000 years, and a Later Stone Age sequence dated at between 2,000 and 300 years.

The researchers from UiB and Witwatersrand have now been looking closer at technology used by different groups in this and other regions in South Africa, such as spear points made of stone, as well as decorated ostrich eggshells, to determine whether there was an overlap and contact across groups of Middle Stone Age humans. How did they make contact with each other? How would contact across groups affect one group? How did the exchange of symbolic material culture affect the group or groups?

Adapting and evolving

"The pattern we are seeing is that when demographics change, people interact more. For example, we have found similar patterns engraved on ostrich eggshells in different sites. This shows that people were probably sharing symbolic material culture, at certain times but not at others," says Dr Karen van Niekerk, a UiB researcher and co-author.

This sharing of symbolic material culture and technology also tells us more about Homo sapiens' journey from Africa, to Arabia and Europe. Contact between cultures has been vital to the survival and development of our common ancestors Homo sapiens. The more contact the groups had, the stronger their technology and culture became.

"Contact across groups, and population dynamics, makes it possible to adopt and adapt new technologies and culture and is what describes Homo sapiens. What we are seeing is the same pattern that shaped the people in Europe who created cave art many years later," Henshilwood says.

EurekAlert. 2016. “Humans evolved by sharing technology and culture”. EurekAlert. Posted: February 2, 2016. Available online:

Tuesday, March 8, 2016

Semantically speaking: Does meaning structure unite languages?

Humans' common cognitive abilities and language dependence may provide an underlying semantic order to the world's languages

We create words to label people, places, actions, thoughts, and more so we can express ourselves meaningfully to others. Do humans' shared cognitive abilities and dependence on languages naturally provide a universal means of organizing certain concepts? Or do environment and culture influence each language uniquely? Using a new methodology that measures how closely words' meanings are related within and between languages, an international team of researchers has revealed that for many universal concepts, the world's languages feature a common structure of semantic relatedness.

"Before this work, little was known about how to measure [a culture's sense of] the semantic nearness between concepts," says co-author and Santa Fe Institute Professor Tanmoy Bhattacharya. "For example, are the concepts of sun and moon close to each other, as they are both bright blobs in the sky? How about sand and sea, as they occur close by? Which of these pairs is the closer? How do we know?"

Translation, the mapping of relative word meanings across languages, would provide clues. But examining the problem with scientific rigor called for an empirical means to denote the degree of semantic relatedness between concepts.

To get reliable answers, Bhattacharya needed to fully quantify a comparative method that is commonly used to infer linguistic history qualitatively. (He and collaborators had previously developed this quantitative method to study changes in sounds of words as languages evolve.)

"Translation uncovers a disagreement between two languages on how concepts are grouped under a single word," says co-author and Santa Fe Institute and Oxford researcher Hyejin Youn. "Spanish, for example, groups 'fire' and 'passion' under 'incendio,' whereas Swahili groups 'fire' with 'anger' (but not 'passion')."

To quantify the problem, the researchers chose a few basic concepts that we see in nature (sun, moon, mountain, fire, and so on). Each concept was translated from English into 81 diverse languages, then back into English. Based on these translations, a weighted network was created. The structure of the network was used to compare languages' ways of partitioning concepts.
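The translation-based network construction described above can be sketched roughly as follows. This is a minimal toy illustration, not the study's actual method or data: the concept list, the three language codes, and the foreign words are all invented for the example. The idea is that an edge between two concepts gets weight equal to the number of languages that use a single word for both (polysemy), which is one simple way to quantify semantic relatedness.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical toy data: each concept mapped to its translation in a few
# languages (illustrative words only; the study used 81 languages with
# back-translation into English).
translations = {
    "sun":   {"es": "sol",      "sw": "jua",   "fi": "aurinko"},
    "moon":  {"es": "luna",     "sw": "mwezi", "fi": "kuu"},
    "month": {"es": "mes",      "sw": "mwezi", "fi": "kuukausi"},
    "fire":  {"es": "fuego",    "sw": "moto",  "fi": "tuli"},
    "hot":   {"es": "caliente", "sw": "moto",  "fi": "kuuma"},
}

def semantic_network(trans):
    """Weight an edge between two concepts by the number of languages
    in which both concepts are expressed by the same word."""
    weights = defaultdict(int)
    for a, b in combinations(trans, 2):
        shared = sum(
            1 for lang in trans[a]
            if lang in trans[b] and trans[a][lang] == trans[b][lang]
        )
        if shared:
            weights[frozenset((a, b))] = shared
    return dict(weights)

net = semantic_network(translations)
# In this toy data, 'moon' and 'month' are linked via Swahili 'mwezi',
# and 'fire' and 'hot' via Swahili 'moto'; 'sun' and 'moon' are not linked.
```

Clusters of densely connected concepts in such a network (like the water or earth-and-sky themes the team reports) could then be found with any standard community-detection method on the weighted graph.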

The team found that the translated concepts consistently formed three theme clusters in a network, densely connected within themselves and weakly to one another: water, solid natural materials, and earth and sky.

"For the first time, we now have a method to quantify how universal these relations are," says Bhattacharya. "What is universal -- and what is not -- about how we group clusters of meanings teaches us a lot about psycholinguistics, the conceptual structures that underlie language use."

The researchers hope to expand this study's domain, adding more concepts, then investigating how the universal structure they reveal underlies meaning shift. Their research was published today in PNAS.

Science Daily. 2016. “Semantically speaking: Does meaning structure unite languages?”. Science Daily. Posted: February 1, 2016. Available online:

Monday, March 7, 2016

Study suggests different written languages are equally efficient at conveying meaning

A study led by the University of Southampton has found there is no difference in the time it takes people from different countries to read and process different languages.

The research, published in the journal Cognition, finds the same amount of time is needed for a person from, for example, China to read and understand a text in Mandarin as it takes a person from Britain to read and understand a text in English -- assuming both are reading their native language.

Professor of Experimental Psychology at Southampton, Simon Liversedge, says: "It has long been argued by some linguists that all languages have common or universal underlying principles, but it has been hard to find robust experimental evidence to support this claim. Our study goes at least part way to addressing this -- by showing there is universality in the way we process language during the act of reading. It suggests no one form of written language is more efficient in conveying meaning than another."

The study, carried out by the University of Southampton (UK), Tianjin Normal University (China) and the University of Turku (Finland), compared the way three groups of people in the UK, China and Finland read their own languages.

The 25 participants in each group -- one group for each country -- were given eight short texts to read which had been carefully translated into the three different languages. A rigorous translation process was used to make the texts as closely comparable across languages as possible. English, Finnish and Mandarin were chosen because of the stark differences they display in their written form -- with great variation in visual presentation of words, for example alphabetic vs. logographic, spaced vs. unspaced, agglutinative vs. non-agglutinative.

The researchers used sophisticated eye-tracking equipment to assess the cognitive processes of the participants in each group as they read. The equipment was set up identically in each country to measure eye movement patterns of the individual readers -- recording how long they spent looking at each word, sentence or paragraph.

The results of the study showed significant and substantial differences between the three language groups in relation to the nature of eye movements of the readers and how long participants spent reading each individual word or phrase. For example, the Finnish participants spent longer concentrating on some words compared to the English readers. However, most importantly and despite these differences, the time it took for the readers of each language to read each complete sentence or paragraph was the same.

Professor Liversedge says: "This finding suggests that despite very substantial differences in the written form of different languages, at a basic propositional level, it takes humans the same amount of time to process the same information regardless of the language it is written in.

"We have shown it doesn't matter whether a native Chinese reader is processing Chinese, or a Finnish native reader is reading Finnish, or an English native reader is processing English, in terms of comprehending the basic propositional content of the language, one language is as good as another."

The study authors believe more research would be needed to fully understand if true universality of language exists, but that their study represents a good first step towards demonstrating that there is universality in the process of reading.

Science Daily. 2016. “Study suggests different written languages are equally efficient at conveying meaning”. Science Daily. Posted: February 1, 2016. Available online:

Sunday, March 6, 2016

Increase in volcanic eruptions at the end of the ice age caused by melting ice caps and erosion

The combination of erosion and melting ice caps led to a massive increase in volcanic activity at the end of the last ice age, according to new research. As the climate warmed, the ice caps melted, decreasing the pressure on the Earth's mantle, leading to an increase in both magma production and volcanic eruptions. The researchers, led by the University of Cambridge, have found that erosion also played a major role in the process, and may have contributed to an increase in atmospheric carbon dioxide levels.

"It's been established that melting ice caps and volcanic activity are linked - but what we've found is that erosion also plays a key role in the cycle," said Dr Pietro Sternai of Cambridge's Department of Earth Sciences, the paper's lead author, who is also a member of Caltech's Division of Geological and Planetary Science. "Previous attempts to model the huge increase in atmospheric CO2 at the end of the last ice age failed to account for the role of erosion, meaning that CO2 levels may have been seriously underestimated."

Using numerical simulations, which modelled various different features such as ice caps and glacial erosion rates, Sternai and his colleagues from the University of Geneva and ETH Zurich found that erosion is just as important as melting ice in driving the increase in magma production and subsequent volcanic activity. The results are published in the journal Geophysical Research Letters.

Although the researchers caution against drawing too strong a link between anthropogenic (human-caused) climate change and increased volcanic activity, since the timescales are very different, they say that because ice caps are now being melted by climate change, the same mechanism will likely operate at shorter timescales as well.

Over the past million years, the Earth has gone back and forth between ice ages, or glacial periods, and interglacial periods, with each period lasting for roughly 100,000 years. During the interglacial periods, such as the one we live in today, volcanic activity is much higher, as the lack of pressure provided by the ice caps means that volcanoes are freer to erupt. But in the transition from an ice age to an interglacial period, the rates of erosion also increase, especially in mountain ranges where volcanoes tend to cluster.

Glaciers are considered to be the most erosive force on Earth, and as they melt, the ground beneath is eroded by as much as ten centimetres per year, further decreasing the pressure on the volcano and increasing the likelihood of an eruption. A decrease in pressure enhances the production of magma at depth, since rocks held at lower pressure tend to melt at lower temperatures.

When volcanoes erupt, they release more carbon dioxide into the atmosphere, creating a cycle that speeds up the warming process. Previous models that attempted to explain the increase in atmospheric CO2 during the end of the last ice age accounted for the role of deglaciation in increasing volcanic activity, but did not account for erosion, meaning that CO2 levels may have been significantly underestimated.

A typical ice age lasting 100,000 years can be divided into periods of advancing and retreating ice - the ice grows for 80,000 years, but it only takes 20,000 years for that ice to melt.

"There are several factors that contribute to climate warming and cooling trends, and many of them are related to the Earth's orbital parameters," said Sternai. "But we know that warming that happens much faster than cooling can't be caused solely by changes in the Earth's orbit - it must be, at least to some extent, related to something within the Earth system itself. Erosion, by helping to unload the Earth's surface and enhance volcanic CO2 emissions, may be the missing factor required to explain such persistent climate asymmetry."

EurekAlert. 2016. “Increase in volcanic eruptions at the end of the ice age caused by melting ice caps and erosion”. EurekAlert. Posted: February 1, 2016. Available online:

Saturday, March 5, 2016

How Muslim Women Are Challenging the Status Quo

When one of Katherine Zoepf’s classmates was killed in the Twin Towers on 9/11, her own certainties about life vanished in the rubble. What had compelled 19 young Arab men to plot murder on the other side of the world? What was it like to live in a strict Muslim society? Her new book, Excellent Daughters: The Secret Lives of Women Who Are Transforming the Arab World, is the story of her journey to Saudi Arabia, Egypt, and other Middle Eastern countries, where she gained rare access to the lives and dreams of young women.

Talking from her home in New York’s East Village, she explains why, since the Arab Spring, female genital mutilation is on the rise in Egypt again; how growing up with a mother who was a Jehovah’s Witness enabled her to better comprehend Muslim fundamentalism; and why more understanding between the Islamic world and the West is essential.

You’re right. September 11th does feel in many ways like the start of this whole journey. I hope this doesn’t sound corny, but for me and a lot of Americans my age, September 11th was this moment where the world as we thought we understood it, the adult world which we were so excited to be entering, suddenly seemed to vanish before us.

I was 23. I had spent a year and a bit after I graduated from college working at an English language daily in Hanoi, the Vietnam News. I’d come back to New York and spent a couple of months looking for jobs. Everybody was living with roommates and meeting new people, going out for drinks in the evenings, chatting about all the exciting stuff they were doing. Suddenly all that was gone. One of my Princeton classmates was killed in the Twin Towers. And a compulsion to figure out what had happened to her was what drew me to the Middle East.

Saudi Arabia is a complicated country. Your views and freedoms as a woman depend very much on what city you come from and which community within that city. The girls I describe were among the most conservative. They were almost competitively pious with one another. [Laughs] They would attack each other if one of them seemed to be suggesting something that, by Saudi standards, was a bit daring. They particularly attacked a girl they perceived as being too sympathetic to what I represented as a Westerner. But as I got to know them a bit better, I did feel they were challenging the status quo to the extent that it was possible. Most of them studied law, and some of them had had to argue with their families pretty forcefully in order to be allowed to do so.

A lot of Saudis would argue that the reason they get young people married very early is because it is important to have an appropriate outlet for those feelings. Courtship, in the way we understand it, doesn't exist. But I met as many happily married Saudi women as I know happily married Western women. So I came to feel agnostic about arranged marriage.

Officially, Islam allows women to choose whom they marry. In Saudi Arabia, the woman gives her assent at what is called the shawfa, which literally means “viewing.” The prospective groom, his father, older brothers, and uncles come to the girl's father's home to propose. It's the first time that a properly brought-up Saudi woman will ever have been seen by a man outside her immediate family. And many of the young women I interviewed told me that it is very difficult for them to say no. At the same time, I kept hearing from women who had become engaged these incredibly romantic narratives built around the couple of minutes that they had seen their future husband for the first time.

When you're reporting on the Arab world, you're mostly talking about crisis journalism: stories of war or political upheavals. Those stories are very important and they're going to continue to dominate our news coverage and our understanding of the region. But I also came to feel that they can have a distorting effect. Everyday life, even in a country in crisis, is often pretty normal. During even the most dramatic moments on Tahrir Square, Egyptians a few blocks away were scarcely aware of what was going on.

So I wanted to show how individual decisions matter: how millions of young women making a slightly braver choice or stretching themselves a bit more or waiting to get married until after finishing their master’s degree—these small gestures have an aggregate effect that can be far-reaching. I thought for a long time that I had this additional sympathy for the position of many of these girls because I was wrestling with many of the same contradictions. But I think it also affected the questions I asked, especially my willingness to keep asking questions when the subject of belief came up.

I think reporters often get very uncomfortable when someone in the Middle East says, “You can't possibly understand what it’s like to wear the hijab.” Many westerners hear this and think, “Oh gosh, I'm not a Muslim, I can't talk about this.” So I think my background gave me the ability to question and try to understand religion, which is crucial. Sometimes I explicitly used my own experiences. If someone said, “It's impossible for you Westerners to understand us, when you're all partying and sleeping with your boyfriends at 13; what do you know of our lives and challenges?” I really pushed back on that.  

The literal meaning is chaos. But I've only ever heard it used to refer to sexual temptation. I've spoken to sociologists who argue that this is part of nomadic culture: that, when you are living in a very harsh environment, you have to police your bloodline very carefully.

I'm not a sociologist. But I have rarely seen a book about gender in Islam that doesn't refer to this idea. I think Muhammad's later teachings, or later revelations, which came at the time when he had many wives himself, are much harder on women. I'm not a Koranic scholar and I've not done the primary source work that allows me to speak about this with a great deal of confidence, but a number of writers have described how, as a much older man with a number of younger wives and concubines he was struggling to control, that this was when these revelations restricting women started to come.

Today, there's a lot of dispute about the degree to which you can reform Islam. Some Muslims, especially in the West, are now saying, let's look at these teachings in their context, and not take them quite so literally. They would not say they encourage honor killings. But they effectively protect men who kill their daughters, if they use honor as an excuse. Zahra was a young teenage girl from northeastern Syria who was kidnapped and raped. When her family discovered this, they set about trying to figure out how they could kill her to “wash away the shame” to their family honor.

Because she was raped, Zahra was interviewed by the police, who realized that her family was going to kill her because she was no longer a virgin. So they put her in a prison for juvenile delinquents for her protection. This is the only place they could put her, as independent shelters for women do not exist. But her family managed to get her sprung from that prison/shelter, by convincing the administrators they were going to marry her to a cousin. A month after the marriage, she was murdered by her brother, a brother she had previously been very close to and loved enormously. He went unpunished.

This just horrified me. So I went and talked to the administrators at the shelter and the girls at the shelter. I was basically told that no girl ever expects it: that even girls whose families had openly threatened to kill them or made attempts on their lives can't believe that the fathers and brothers they love so dearly could turn on them like this—even though they grow up in a society where young women whose honor has been “tarnished” constantly disappear.

Unfortunately yes, though I want to note that we're talking just about Egypt. The context in each country is very different. But in Egypt, laws with which the Mubarak government was strongly associated—a kind of nationalized feminism—were blamed on Hosni Mubarak’s wife, Suzanne Mubarak, who is half-British. As a result, she's not considered fully Egyptian. Many Egyptians say, “She doesn't understand Egyptian society and she influenced him [Mubarak] to make laws that we Egyptians are not comfortable with, like better custody rights for women.”

These laws benefitting women are referred to as “Suzanne's Law,” which is a term of derision. So, despite the fact that female and male protesters were in Tahrir Square together, there is a feeling that we're finally going to get rid of these “foreign” policies. Unfortunately, one of these is FGM [Female Genital Mutilation], which was banned under Mubarak. Since the Arab Spring, though, there has been a resurgence, which shows the paradoxical effects of progress.

I think it depends on whose standards of free and equal you're talking about. It also depends which society you're talking about. I think for this to be universally true there is going to have to be a bit more willingness on the part of Islamic scholars and Western governments to engage on issues of religious law. We have such a strong tradition of separating church and state, which I think is a good thing. But I do think we have to find ways to seriously engage with some of the more conservative religious scholars.

But I also tried very carefully in this book to write something that might help young men and women in the region ask more questions and feel more sympathy with where we're coming from, as Westerners.

Worrall, Simon. 2016. “How Muslim Women Are Challenging the Status Quo”. National Geographic News. Posted: January 31, 2016. Available online: