It is probably the most epic journey ever undertaken just to prove a point.
Thor Heyerdahl clung to Kon-Tiki, his balsa wood raft, for 4,300 miles to show that Polynesia could have been colonised from South America rather than Asia as commonly thought.
But despite achieving his goal – sustaining his 101-day voyage with sharks caught with his bare hands – the Norwegian failed to sway the scientific community.
Now, 64 years later, new research has finally proved the adventurer was at least partly right after all.
A team of scientists has tested the genetic make-up of descendants of the original islanders and found it includes DNA that could only have come from Native Americans.
That means that some time before the remote islands – including Easter Island – were colonised by Europeans, the locals had interbred with people from South America.
The Polynesian islands are some of the most remote in the world – lying thousands of miles west of South America and thousands of miles east of Asia.
The established theory has always been that Polynesia was colonised via Asia around 5,500 years ago.
This has been backed up by archaeology, linguistics and some genetic studies.
But in 1947, Heyerdahl controversially claimed that Easter Island's famous statues were similar to those at Lake Titicaca in Bolivia, and sailed a raft from Peru to French Polynesia to prove it could have been colonised from America.
Now Professor Erik Thorsby of the University of Oslo in Norway has found clear evidence to support elements of Heyerdahl's hypothesis.
In 1971 and 2008 he collected blood samples from Easter Islanders whose ancestors had not interbred with Europeans and other visitors to the island.
Prof Thorsby looked at genes that vary greatly from person to person.
Most of the islanders' genes were Polynesian, but a few of them also carried genes only previously found in indigenous American populations.
Prof Thorsby found that in some cases the Polynesian and American genes were shuffled together, the result of a process known as "recombination".
This means the American genes must have been present in the population for some generations before recombination could mix them in.
Prof Thorsby can't put a precise date on it, but says it is likely that Americans reached Easter Island before it was "discovered" by Europeans in 1722.
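The logic of the recombination signal can be shown with a toy simulation (my own illustration, not the study's actual analysis): markers from two ancestries only become interleaved on the same chromosome after crossover has had generations in which to act.

```python
import random

# Toy model: one "chromosome" carries only Polynesian-style markers (P),
# the other only American-style markers (A). Each generation a single
# crossover at a random point swaps the chromosomes' tails, gradually
# shuffling the two ancestries together on each chromosome.
random.seed(42)  # fixed seed so the run is reproducible

def recombine(chrom1, chrom2):
    """Apply one crossover at a random point between two chromosomes."""
    point = random.randint(1, len(chrom1) - 1)
    return (chrom1[:point] + chrom2[point:],
            chrom2[:point] + chrom1[point:])

pair = ("P" * 10, "A" * 10)
for generation in range(5):
    pair = recombine(*pair)

# After several generations the chromosomes are typically mosaics of
# both ancestries -- the kind of signature the team looked for.
print(pair)
```

Note that crossover only rearranges markers, it never creates or destroys them, so finding the two ancestries interwoven implies the American markers were already present generations earlier.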
Prof Thorsby believes there may have been a Kon-Tiki-style voyage from South America to Polynesia.
Alternatively, Polynesians may have travelled east to South America, and then returned.
However, Prof Thorsby said that his new evidence does not confirm Heyerdahl's theory that the islanders were originally all from South America.
The first settlers to Polynesia came from Asia, and they made the biggest contribution to the population, he said.
"Heyerdahl was wrong but not completely," he said.
The work was presented at a Royal Society talk in London and reported in the New Scientist.
______________
References:
Alleyne, Richard. 2011. "Kon-Tiki explorer was partly right – Polynesians had South American roots". Telegraph Science News. Posted: June 17, 2011. Available online: http://www.telegraph.co.uk/science/science-news/8582150/Kon-Tiki-explorer-was-partly-right-Polynesians-had-South-American-roots.html
Wednesday, June 29, 2011
Early experience found critical for language development
We know that poor social and physical environments can harm young children's cognitive and behavioral development, and that development often improves in better environments. Now a new study of children living in institutions has found that intervening early can help young children develop language, with those placed in better care by 15 months showing language skills similar to children raised by their biological parents.
The study, in the journal Child Development, was conducted by researchers at the University of Minnesota, Ohio University, The Ohio State University, the University of Virginia, Harvard Medical School and Children's Hospital Boston, the University of Maryland, and Tulane University.
Researchers studied more than 100 children who were part of the Bucharest Early Intervention Project, a longitudinal study of institutional and foster care in Romania. Historically, institutions there have provided very limited opportunities for language and social interaction among children. In this study, about half of the children were placed in foster homes at about 22 months, while the other half continued living in institutions. About 60 typically developing children who lived with their biological families in the same communities served as a comparison group.
"Because institutional care was the norm for these children, it was possible to create a natural experiment, comparing those in institutional care with those placed in foster care," according to lead author Jennifer Windsor, professor of speech-language-hearing sciences at the University of Minnesota.
The study found that children who were placed in foster care before they turned 2 had substantially greater language skills at age 3-1/2 than children who stayed in institutional care, with those placed by 15 months showing language skills similar to the comparison group. In contrast, children placed in foster care after they turned 2 had the same severe language delays as those who stayed in institutional care.
"This shows that not only is the change to high-quality foster care beneficial for these children, but the timing of the change appears to be important," according to Windsor.
The findings highlight the importance of intervening early to help young children develop language. They also provide insights for parents who adopt internationally. "Many infants and toddlers who are adopted from other countries and come to the United States develop language quickly," Windsor notes. "However, older children who have been living in poor care environments may be at high risk for language delays."
________________
References:
EurekAlert. 2011. "Early experience found critical for language development". EurekAlert. Posted: June 17, 2011. Available online: http://www.eurekalert.org/pub_releases/2011-06/sfri-eef060811.php
Tuesday, June 28, 2011
Prehistoric finds on remote St Kilda's Boreray isle
The remains of a permanent settlement which could date back to the Iron Age have been uncovered on a remote Scottish island, according to archaeologists.
It was previously thought Boreray in the St Kilda archipelago was only visited by islanders to hunt seabirds and gather wool from sheep.
Archaeologists have now recorded an extensive agricultural field system and terraces for cultivating crops.
They have also found an intact stone building buried under soil and turf.
St Kilda, a group of small islands, is the remotest part of the British Isles, lying 41 miles (66km) west of the Western Isles.
Hirta, the main island of St Kilda, was occupied until 1930 when the last islanders left after they asked to be evacuated because their way of life was no longer sustainable.
The National Trust for Scotland (NTS) said simple tools found on Hirta suggested Bronze Age travellers may have visited St Kilda 4,000 to 5,000 years ago from the Western Isles before people settled at an unknown date.
Boreray's sheer sea cliffs and sea stacs are home to thousands of seabirds and what land is available is grazed by hardy feral sheep. The island's soil is fertile because of the actions of burrowing seabirds.
Mullach an Eilein, the highest point on the isle, rises to just above 1,260ft (384m), making Boreray the smallest Scottish island to have a summit higher than 1,000ft (304m).
The discoveries by the Royal Commission on the Ancient and Historical Monuments of Scotland (RCAHMS) and NTS have suggested that Boreray, as well as Hirta, had settlers.
The survey team said the stone building found was among three ancient settlement mounds. It could contain Iron Age artefacts.
RCAHMS surveyor Ian Parker said the finds could change experts' understanding of the archipelago's history.
He said: "Until now, we thought Boreray was just visited for seasonal hunting and gathering by the people of Hirta.
"But this new discovery shows that a farming community actually lived on the island, perhaps as long ago as the prehistoric period.
"These agricultural remains and settlement mounds give us a tantalising glimpse into the lives of those who lived for a time on Boreray.
"Farming what is probably one of the most remote - and inhospitable - islands in the North Atlantic would have been a hard and gruelling existence."
St Kilda expert Jill Harden, who is contracted to NTS, said it was refreshing to know that there was still so much to learn about the islands.
The finds were made during a five-year project to produce the most complete mapping record of St Kilda's built heritage.
The survey, which began in 2007 and is to be completed later this year, uses satellite and digital technology to map traces of human occupation on the islands from early prehistory through to its modern military radar installations.
_______________
References:
2011. "Prehistoric finds on remote St Kilda's Boreray isle". BBC News. Posted: June 16, 2011. Available online: http://www.bbc.co.uk/news/uk-scotland-highlands-islands-13753643
Monday, June 27, 2011
Origins of the Japanese
A team of researchers have been delving into the origins of the Japanese people, with some interesting findings. The research was centred on a study of Japanese dialects with the aim of finding the roots of the language.
The language family is known as Japonic and this includes Japanese and a similar language called Ryukyuan, which is spoken in the chain of islands to the south of Japan.
Comparing the cultures
Genetically, the modern Japanese descend from two main migrant streams, the Jōmon culture and the Yayoi culture, but the linguistic roots have now been traced to the Yayoi.
Archaeologists have found evidence for two waves of migrants, a hunter-gatherer people who created the Jōmon culture and rice farmers who left remains known as the Yayoi culture.
The hunter-gatherers arrived in Japan before the end of the last ice age around 20,000 years ago, via land bridges that joined Japan to Asia’s mainland. They remained isolated until about 2,400 years ago when wet rice agriculture developed in southern China and was adapted to Korea’s colder climate.
Several languages seem to have been spoken on the Korean Peninsula at this time, but that of the Yayoi people is unknown. The work of two researchers at the University of Tokyo, Sean Lee and Toshikazu Hasegawa, now suggests that the origin of Japonic coincides with the arrival of the Yayoi.
The finding, if confirmed, indicates that the Yayoi people took Japonic to Japan, though still leaves unresolved the question of where in Asia the Yayoi culture or Japonic language originated before arriving in the Korean Peninsula.
The linguistic link was provided by a method known as Bayesian phylogenetics. This uses a computer to map possible language trees, employing a limited core vocabulary of approximately 200 words which are known to evolve slowly.
By feeding all the data from the dialect studies into this computer model, a date of 2,182 years ago was predicted for the origin of Japonic, and this fits with the arrival of the Yayoi.
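The study itself used Bayesian phylogenetic software, whose details are not given here. As a much simpler sketch of the underlying idea, the classic glottochronology formula dates a split from the fraction of slow-evolving core vocabulary two varieties still share, assuming a constant retention rate per millennium (the rate below, 0.805 for a 200-word list, is the traditional Swadesh value, not a figure from this study):

```python
import math

def divergence_time(shared_cognates, retention_rate=0.805):
    """Estimated millennia since two speech varieties split.

    shared_cognates: fraction of the core word list still cognate
    retention_rate: fraction of the list each variety retains per
    millennium (traditional value for a 200-word list: 0.805).
    Formula: t = ln(c) / (2 * ln(r)), the classic glottochronology
    estimate -- far cruder than Bayesian tree inference.
    """
    return math.log(shared_cognates) / (2 * math.log(retention_rate))

# Two hypothetical dialects sharing 65% of the core vocabulary:
print(round(divergence_time(0.65), 2), "millennia")
```

Bayesian methods improve on this by averaging over many possible trees and word-by-word rates rather than assuming one fixed clock, which is how the study arrives at a dated tree for the Japonic dialects.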
Whilst John B Whitman, of the National Institute for Japanese Language and Linguistics in Tokyo, refers to the results as “solid and reasonable”, other linguists are far more sceptical.
A question of identity
“There has been a gap in thinking,” said Hisao Baba, curator of anthropology at the National Science Museum in Tokyo. “Archaeology has made a lot of progress, but politics has made it difficult for the general public to take a critical look at their own past.”
The question of origin cuts to the core of Japan’s identity as they have long celebrated themselves as ethnically unique.
As such, archaeology in Japan until the 1950s had to conform to the accepted belief that all archaeological deposits in Japan, no matter how old, were left by ancestors of the modern Japanese. Japanese archaeologists said Japan’s gene pool had remained isolated since the end of the last ice age, over 20,000 years ago.
Confronted with evidence that a sudden change had swept Japan in about 400 BCE — replacing the millennia-old Jōmon hunter-gatherer culture with a society that could grow rice and forge both iron weapons and tools — archaeologists attributed it to nothing more than technological borrowing from the mainland rather than an influx of people. Even though recent analysis of skull shapes has shown the rice farmers who appeared 2,400 years ago were quite different from the hunters whom they replaced, it is still difficult for the Japanese to take this on board.
Tatetsuki, Okayama, Japan.
Direct comparisons between Jōmon and Yayoi skeletons show that the two peoples are noticeably distinguishable. The Jōmon tended to be shorter, with relatively longer forearms and lower legs, more wide-set eyes, shorter and wider faces, and much more pronounced facial topography. They also have strikingly raised brow ridges, noses, and nose bridges. Yayoi people, on the other hand, averaged an inch or two taller, with close-set eyes, high and narrow faces, and flat brow ridges and noses. By the Kofun period (250 to 538 AD) almost all skeletons excavated in Japan, except those of the Ainu and prehistoric Okinawans, resemble those of modern day Japanese.
Many Japanese people want to believe that their distinctive language and culture required uniquely complex developmental processes. To acknowledge a relationship of the Japanese language to any other language seems to constitute a surrender of cultural identity.
This recent study of linguistic evidence may be further proof of a more complex history, and genetic studies have suggested interbreeding between the Yayoi and Jōmon people, with the Jōmon contribution to modern Japanese being as much as 40 percent. However, it was the Yayoi language that prevailed, along with their agricultural technology.
_________________
References:
Past Horizons. 2011. "Origins of the Japanese". Past Horizons. Posted: June 15, 2011. Available online: http://www.pasthorizons.com/index.php/archives/06/2011/origins-of-the-japanese
Sunday, June 26, 2011
Shrunken Head DNA Proves Horrific Folklore True
A remarkably well-preserved shrunken head has just been authenticated by DNA analysis, which provides strong evidence that anecdotal accounts of violent head-hunting in South America were true.
The study, published in the latest issue of Archaeological and Anthropological Sciences, marks the first successful effort to unveil the genetic make-up of a shrunken head.
"The shrunken heads were made from enemies' heads cut on the battlefield," co-author Gila Kahila Bar-Gal told Discovery News. "Then, during spiritual ceremonies, enemies' heads were carefully reduced through boiling and heating, in the attempt to lock the enemy's spirit and protect the killers from spiritual revenge."
Kahila Bar-Gal is a senior lecturer in the Hebrew University of Jerusalem's Koret School of Veterinary Medicine. She is also a faculty member within the university's department of Agriculture, Food and Environment.
For the study, she and her colleagues used DNA testing and other techniques to examine the authenticity and possible cultural provenance of a shrunken head displayed at the Eretz Israel Museum in Tel Aviv. The head remains in an incredible state of preservation, with the deceased man's hair, facial features and other physical characteristics intact.
Many shrunken heads are forgeries, with some 80 percent suspected to be fakes. The late 19th through the 20th centuries saw a rise in manufacture of such fakes for profit.
The shrunken head at the Israeli museum, however, turns out to be legit.
"The shrunken head we studied was made from a real human skin," Kahila Bar-Gal said. "The people who made it knew exactly how to peel the skin from the skull, including the hair," she added, mentioning that it was also salted and boiled.
The researchers determined that the skin belonged to a man who lived and died in South America "probably in the Afro-Ecuadorian population." The genes reveal the victim's ancestors were from West Africa, but his DNA profile matches that of modern populations from Ecuador with African admixture.
According to the scientists, he was probably a member of a group that fought the Jivaro-Shuar tribes of Ecuador. These tribes also lived in Peru during the post-Columbian period, and were thought to make ritual shrunken heads out of their enemies.
Although Kahila Bar-Gal said the DNA could not pinpoint the exact age of the shrunken head, the scientists estimate the individual was killed between 1600 and 1898 A.D. The early date marks the entry of Africans into the region, while the latter date was when the last major nomadic populations of hunters and gatherers in Ecuador were thought to have existed.
Accounts of what happened to shrunken heads after the post-battle spiritual ceremonies vary. There are accounts that the Jivaro-Shuar warriors kept the shrunken heads as "keepsakes or personal adornments," even wearing them at certain times. Leonard Clark, who traveled to the region in 1948, however, said that he saw a shrunken head, called a "tsantsa," used in a ceremony and then stuffed in an old earthenware pot that was placed in the thatched ceiling of the house.
"Robbed of its soul, the savagely beautiful trophy no longer had any spiritual value," Clark wrote in a 1953 account.
Chuck Greenblatt, a professor in the Department of Microbiology and Molecular Genetics at Hebrew University's Hadassah Medical School, told Discovery News that "the ancient DNA techniques employed by the authors are appropriate and I have no doubt as to the authenticity of their results."
Kahila Bar-Gal hopes other museums will consider having certain objects genetically tested, as the method can reveal authenticity and uncover important historical information that may not otherwise be available.
__________________
References:
Viegas, Jennifer. 2011. "Shrunken Head DNA Proves Horrific Folklore True". Discovery News. Posted: June 14, 2011. Available online: http://news.discovery.com/history/head-hunting-dna-analysis-110614.html
Saturday, June 25, 2011
How serious is son preference in China?
Why are female foetuses aborted in China? Does an increase in the number of abortions of female foetuses reflect an increase in son preference? Sociologist Lisa Eklund from Lund University in Sweden has studied why families in China have a preference for sons.
At the time of the census in 2005, almost 121 boys were born for every 100 girls. Last year's census showed that sex ratio at birth (SRB) had improved somewhat. But it is still too early to celebrate, in Eklund's view: the narrowing of the gap does not necessarily mean that girls are valued more highly.
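As a back-of-the-envelope illustration of what these census figures imply, the gap between the reported ratio and the roughly 105:100 ratio that demographers usually treat as natural can be sketched in a few lines of Python (the 105:100 baseline is a standard demographic assumption, not a figure from the article):

```python
# Rough illustration of what an SRB of 121 boys per 100 girls implies,
# assuming a "natural" sex ratio at birth of about 105 boys per 100 girls.

def missing_girls_fraction(observed_boys, observed_girls, natural_ratio=1.05):
    """Fraction of expected girl births that are 'missing' relative to
    the natural ratio, given observed births in a cohort."""
    expected_girls = observed_boys / natural_ratio
    return (expected_girls - observed_girls) / expected_girls

# China's 2005 census figure: ~121 boys born per 100 girls.
print(round(missing_girls_fraction(121, 100), 3))  # → 0.132
```

By this crude measure, roughly 13 per cent of the girl births expected at the natural ratio are missing from the 2005 figures.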
Because of the high SRB, there has been a tendency to picture China as a country where son preference is strong and possibly increasing since the 1980s. However, Eklund argues in her PhD thesis that using SRB as a proxy indicator for son preference is problematic. She has therefore developed a model to estimate what she calls "son compulsion", where data on SRB and total fertility rate are used to estimate the proportion of couples who want to give birth to at least one son and who take action to achieve that goal. When looking at variation in son compulsion over time and between regions, Eklund finds that new patterns emerge that do not surface when using SRB as a proxy indicator. Contrary to popular belief, son compulsion remained steady in rural China (at around 10 per cent) while it increased in urban China in the 1990s (from 2.8 per cent to 4.5 per cent).
"This doubling concurred in time with cuts in the state welfare system in the cities, which meant that adult sons were given a more important role in providing for the social and financial security of the elderly", she says. Her findings call into question the assumption that son preference is essentially a rural issue. They also have implications for comparative perspectives: they suggest that son compulsion may be higher in other countries even though those countries exhibit a lower SRB.
When it emerged that far more boys than girls were being born in China, the Chinese government launched the Care for Girls Campaign to improve the value of the girl child and to prevent sex-selective abortion. Nonetheless, the imbalance between the sexes continued to increase. Eklund's findings suggest that the campaign may actually have done more harm than good. Families receive extra support if they have girls, and in rural areas exceptions to the one-child policy are made if the first child is a girl.
"By compensating parents of girls in various ways, the government reinforces the idea that girls are not as valuable as boys", says Eklund.
Eklund further challenges the notion that families in rural areas want sons because sons are expected to take over the farming.
"That is a weak argument", says Eklund. "Young people, both men and women, are moving away from rural areas. Of those who stay, women provide just as much help as men. In fact, it is the elderly who end up taking greater responsibility for the agriculture."
However, there are also other reasons why sons are seen as more important for families. Traditionally, a girl moves in with her husband's family when she gets married and she thus cannot look after her own parents when they grow old. Boys also play an important role in ancestor worship, and they ensure that the family name lives on.
Eklund further finds that both popular and official discourses stubbornly treat son preference as a matter of parents and grandparents, without examining the structural factors that help underpin the institution of son preference.
________________________
References:
EurekAlert. 2011. "How serious is son preference in China?". EurekAlert. Posted: June 14, 2011. Available online: http://www.eurekalert.org/pub_releases/2011-06/lu-hsi061411.php
Friday, June 24, 2011
Learning to count not as easy as 1, 2, 3
Working with larger numbers matters
Preschool children seem to grasp the true concept of counting only if they are taught to understand the number value of groups of objects greater than three, research at the University of Chicago shows.
"We think that seeing that there are three objects doesn't have to involve counting. It's only when children go beyond three that counting is necessary to determine how many objects there are," said Elizabeth Gunderson, a UChicago graduate student in psychology.
Gunderson and Susan Levine, the Stella M. Rowley Professor in Psychology, Comparative Human Development and the Committee on Education at the University, study how children develop an understanding of the connection between number words and their actual numerical value. That connection is known as the cardinal principle, which states that the size of a set of objects is determined by the last number reached when counting the set.
Learning to recite number words in order is not the same as understanding the cardinal principle, they point out. Research has shown that children who enter kindergarten with a good understanding of the cardinal principle do better in mathematics.
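The cardinal principle itself is easy to state in code: counting a set means stepping through its members with successive number words, and the last number reached gives the set's size. A minimal sketch (the object names are invented for illustration):

```python
def count_set(objects):
    """Count a set 'aloud': the last number word reached when counting
    the set is the set's size (the cardinal principle)."""
    last = 0
    for last, obj in enumerate(objects, start=1):
        print(last, obj)  # "1 crayon", "2 block", ...
    return last

count_set(["crayon", "block", "doll"])  # counts to 3, so the set has 3 objects
```

Reciting "one, two, three" is just the loop; understanding that the return value names the size of the whole set is the cardinal principle.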
Gunderson is lead author of a paper, "Some Types of Parent Number Talk Count More than Others: Relations between Parents' Input and Children's Cardinal-Number Knowledge," published in the current issue of the journal Developmental Science. Levine, a leading national expert on the early acquisition of mathematics, is co-author.
Levine's work has shown that exposure to language related to numbers improves mathematics comprehension; the latest paper goes a step further. It shows that children who are exposed to number words from four through 10, in addition to the number words from one through three, acquire an understanding of the cardinal principle before children who have little exposure to these higher number words.
To perform the study, team members made five home visits and videotaped interactions between 44 youngsters and their parents. The sessions lasted for 90 minutes and were made at four-month intervals, when the youngsters were between 14 and 30 months old. They coded each instance in which parents talked about numbers with their children.
When the children were nearly 4 years old, they were assessed on their understanding of the cardinal principle. The results were then compared to the records of their conversations about numbers with their parents.
Children whose parents talked about sets of four to 10 objects that the child could see were more likely to understand the cardinal principle, the research showed. Using smaller numbers in conversations and referring to objects the children couldn't see (such as "I'll be there in two minutes.") was not predictive of children's understanding of the cardinal principle. "The results have important policy implications, showing that specific aspects of parents' engagement in numerically relevant behaviors in the home seem to have an impact on children's early mathematical development," the authors point out.
Parents frequently do not realize the impact they can have on their children's understanding of mathematics and believe that a child's school is primarily responsible for the development of mathematical skills, research shows. Parents also frequently overestimate their children's understanding of mathematics.
Further studies could lead to suggestions of how parents and early childhood educators can best boost early mathematics learning, the authors point out.
__________________
References:
EurekAlert. 2011. "Learning to count not as easy as 1, 2, 3". EurekAlert. Posted: June 14, 2011. Available online: http://www.eurekalert.org/pub_releases/2011-06/uoc-ltc061411.php
Thursday, June 23, 2011
Language Learning
I am very passionate about languages. They are the doorways to culture. When you learn a language you learn about the people.
It was Charles Berlitz's grandfather, Maximilian Berlitz, who created the way we learn languages today. His schools used the direct method of language learning, which is conducted entirely in the target language and does not refer to the learner's native language at all. They also advocated the audio-lingual approach, which uses listening and speaking to teach the target language.
What has developed since is a pattern of language instruction that is almost universal today: a block of text introducing grammatical constructs and vocabulary in stages, followed by a list of vocabulary words, a grammar explanation and linked exercises. There are variants on the theme, but for adult learners it is important to use whichever method they learn by best.
Let's look at the Cree language. It is also known as Cree-Montagnais or Cree-Montagnais-Naskapi. Cree belongs to the Algonquian language family. It is spoken by approximately 117,000 people across Canada, from the Northwest Territories to Labrador, making it by far the most spoken aboriginal language in Canada.
While Western language learners struggle with the intricacies of feminine, masculine or neuter words in different languages, Cree uses animate and inanimate to order its words.
"Verbs contain most of the information. It contains obligatory reference to grammatical roles and number of its arguments (subject, direct and indirect object), and optionally also several valency-changing affixes (causative, applicative, detransitivizer, passive), gender-changing suffixes (from animate to inanimate, and the reverse) plus adverbial modifiers, tense, mood, aspect, Aktionsart, discourse markers, and further also incorporated nouns, classifiers, and diminutive suffixes. Even the stems are complex, most verbs consisting of at least two formative elements suggestive of a form of Aktionsart. Consequently, one Cree verb can sometimes be equivalent to a whole sentence in English."
I am most interested in the animate and inanimate. Consider:
A noun is a word or group of words used as the name of a class of people, places, or things, or of a particular person, place, or thing.
Nouns in Cree are categorised into two categories, animate (na) and inanimate nouns (ni). Animate nouns include people, animals, most plants/trees and other items.
Animate nouns (na) are nouns that fall under the gender of animacy.
Man = napéw
Woman = iskwéw
A saulteaux person = nakawihiniw
Duck = sísíp
Dog = atim
Moose = moswa
Blueberry bush = sípíhkominátik
Poplar tree = mitos
Labrador tea bush = maskíkopakwáhtik
Stone/rock = asiniy
Pipe = ospwakan
Sock = askikan
Certain items fall in the animate category while other items of the same kind fall in the inanimate category. Consider the following examples:
Some berries are inanimate while other berries are animate:
Saskatoon berry = misáskwatómin (ni)
Goose berry = sápómin (ni)
Apple = caspimin/wásaskwécós (ni)
Strawberry = otihimin (ni)
Raspberry [s] = ayoskan [ak] (na)
Grape [s] = sóminis [ak] (na)
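Because animacy in Cree is lexical rather than fully predictable from meaning, a learner (or a program) essentially has to store the gender of each word. A toy sketch using some of the nouns listed above, with diacritics omitted for simplicity:

```python
# Animate (na) vs. inanimate (ni) gender must be memorised per word:
# raspberries and grapes are animate, while strawberries are not.
CREE_NOUN_GENDER = {
    "misaskwatomin": "ni",  # saskatoon berry (inanimate)
    "sapomin": "ni",        # gooseberry (inanimate)
    "otihimin": "ni",       # strawberry (inanimate)
    "ayoskan": "na",        # raspberry (animate)
    "sominis": "na",        # grape (animate)
    "asiniy": "na",         # stone/rock (animate)
}

def is_animate(noun):
    """Look up a noun's animacy class; raises KeyError for unknown words."""
    return CREE_NOUN_GENDER[noun] == "na"

print(is_animate("ayoskan"))   # raspberry → True
print(is_animate("otihimin"))  # strawberry → False
```

The dictionary lookup mirrors what the learner must do: there is no semantic rule that predicts why one berry is animate and another is not.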
Source: Our Languages ~ Plains Cree.
You can learn more by clicking this link for an informative pdf.
Thoughts within thoughts make us human
Cogito ergo sum - I think, therefore I am - was coined by René Descartes in 1637. He was struggling to find a solid philosophical basis for how we know about reality and truth.
This also turns out to be one of the most famous examples of recursion, the process of embedding ideas within ideas that humans seem to do so effortlessly. So effortlessly and so skilfully, in fact, that it's beginning to look like the one true dividing line between animals and humans that may hold up to close scrutiny.
That's the hope of Michael Corballis, professor emeritus of psychology at the University of Auckland, New Zealand. His new book, The Recursive Mind: The origins of human language, thought, and civilization, is a fascinating and well-grounded exposition of the nature and power of recursion.
In its ultra-reasonable way, this is quite a revolutionary book because it attacks key notions about language and thought. Most notably, it disputes the idea, argued especially by linguist Noam Chomsky, that thought is fundamentally linguistic - in other words, you need language before you can have thoughts.
Chomsky's influential theory of universal grammar has been modified considerably since its origins in the 1960s, but it is still supported by many linguists. Its key idea is that the human mind has evolved an innate capacity for language and that all languages share some universal forms, constrained by the way we think. Corballis reckons instead that the thought processes that made language possible were non-linguistic, but had recursive properties to which language adapted: "Where Chomsky views thought through the lens of language, I prefer to view language though the lens of thought." From this, says Corballis, follows a better understanding of how humans actually think - and a very different perspective on language and its evolution.
So how did recursion help ancient humans pull themselves up by their cognitive bootstraps? It allowed us to engage in mental time travel, says Corballis, the recursive operation whereby we recall past episodes into present consciousness and imagine future ones, and sometimes even insert fictions into reality.
We are on our own with this degree of recursion. Chimps, bonobos and orangutans just don't tell stories, paint pictures, write music or make films - there are no great ape equivalents of Hamlet or Inception. Similarly, theory of mind is uniquely highly developed in humans: I may know not only what you are thinking, says Corballis, but also that you know what I am thinking. Most - but not all - language depends on this capability.
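This kind of nested theory of mind is itself a textbook case of recursion: each level wraps the previous thought in another "X thinks that..." clause. A toy sketch (the example sentence is invented):

```python
def embed(thought, depth):
    """Recursively embed a thought in alternating levels of
    'I think that...' / 'you think that...' clauses."""
    if depth == 0:
        return thought
    frame = "I think that" if depth % 2 == 0 else "you think that"
    return f"{frame} " + embed(thought, depth - 1)

print(embed("it will rain", 2))
# → I think that you think that it will rain
```

The function calls itself on a smaller problem until it bottoms out, which is exactly the "ideas within ideas" structure Corballis has in mind.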
If he's right, Corballis's theories also help make sense of apparent anomalies such as linguist and anthropologist Daniel Everett's work on the Pirahã, an Amazonian people who hit the headlines because of debates over whether their language has any words for colours, and, crucially, numbers. Corballis now thinks that the Pirahã language may not be that unusual, and cites the example of other languages from oral cultures, such as the Iatmul language of New Guinea, which is also said to lack recursion.
The emerging point is that recursion developed in the mind and need not be expressed in a language. But, as Corballis is at pains to point out, although recursion was critical to the evolution of the human mind, it is not one of those "modules" much beloved of evolutionary psychologists, many of which are said to have evolved in the Pleistocene. Nor did it depend on some genetic mutation or the emergence of some new neuron or brain structure. Instead, he suggests it came of progressive increases in short-term memory and capacity for hierarchical organisation - all dependent in turn on incremental increases in brain size.
But as Corballis admits, this brain size increase was especially rapid in the Pleistocene. These incremental changes can lead to sudden, more substantial jumps - think water boiling or balloons popping. In mathematics these shifts are called catastrophes. So, notes Corballis wryly, "we may perhaps conclude that the emergence of the human mind was catastrophic".
Let's hope that's not too prescient.
______________
References:
Else, Liz. 2011. "Thoughts within thoughts make us human". New Scientist. Posted: June 3, 2011. Available online: http://www.newscientist.com/blogs/culturelab/2011/06/thoughts-within-thoughts-make-us-human.html
Wednesday, June 22, 2011
Kids own up to ownership
Young children are possessed by possessions. Preschoolers argue about what belongs to whom with annoying regularity, a habit that might suggest limited appreciation of what it means to own something.
But it’s actually just the opposite, psychologist Ori Friedman of the University of Waterloo in Canada reported on May 28 at the Association for Psychological Science annual meeting. At ages 4 and 5, youngsters value a person’s ownership rights — say, to a crayon — far more strongly than adults do, Friedman and psychology graduate student Karen Neary found.
Rather than being learned from parents, a concept of property rights may automatically grow out of 2- to 3-year-olds’ ideas about bodily rights, such as assuming that another person can’t touch or control one’s body for no reason, Friedman proposed.
“Parents and adults may teach kids when it’s appropriate to disregard personal ownership,” he said. One such instance would involve a mother’s advice on when to lend a toy to another child who wants to borrow that item.
Friedman’s team presented a simple quandary to 40 preschoolers, ages 4 and 5, and to 44 adults. Participants saw an image of a cartoon boy holding a crayon who appeared above the word “user” and a cartoon girl who appeared above the word “owner.” After hearing from an experimenter that the girl wanted her crayon back, volunteers were asked to rule on which cartoon child should get the prized object.
About 75 percent of 4- and 5-year-olds decided in favor of the owner, versus about 20 percent of adults.
A second experiment involved more than 100 kids, ages 3 to 7, and 30 adults. In this case, participants saw the same cartoon boy and girl but were told that the crayon belonged to the school that the two imaginary children attended.
Nearly everyone, regardless of age, said that the user should keep the crayon for as long as needed in this situation. In other words, kids distinguished between people using an owned or a nonowned object.
In a final experiment that presented two cartoon adults, one using a cell phone that the other owned, most 4-year-olds but only a minority of adults declared that the device should be returned to its owner even before the borrower had a chance to use it. Children showed some flexibility in allowing borrowers to keep the phone — say, if it was needed for an emergency — but adults adjusted their opinions more readily to such circumstances.
It’s hard to know how children reasoned about experimental ownership scenarios, remarked psychologist Dan Ariely of Duke University. Perhaps preschoolers thought that, relative to the boy using a crayon, the girl who owned that crayon liked it more or got more pleasure from using it, Ariely suggested.
That possibility hasn’t been studied. What’s clear is that learning apparently plays little role in early thinking about possessions, Friedman asserted.
“A concept of ownership rights may be a product of the way we naturally think early in life,” he said.
_______________
References:
Bower, Bruce. 2011. "Kids own up to ownership". Science News. Posted: June 2011. Available online: http://blog.sciencenews.org/view/generic/id/74983/title/Kids_own_up_to_ownership
Tuesday, June 21, 2011
In Search of English
Historians at Northumbria University are embarking on a groundbreaking project to explore why “Englishness” has been overlooked in America, while other ethnic groups are celebrated and well-known.
The team, led by Professor Don MacRaild, Dr Tanja Bueltmann and Dr David Gleeson, argue that the existence of English cultural communities in North America has been largely ignored by traditional historians who see the English as assimilating into Anglo-American culture without any need to overtly express a separate English ethnicity.
Their initial research has found that from the late eighteenth century and throughout the nineteenth century, North American towns and cities boasted organisations such as the Sons of St George, where traditional English food and folk culture were maintained. The evidence suggests that the English were distinctly aware of being an ethnic group within the emerging settlements at the time, exhibiting and maintaining their ethnicity in similar ways to the Irish, Scottish and German colonists. Yet this does not appear to be recognised in history.
The three-year project, entitled ‘Locating the Hidden Diaspora: The English in North America in Transatlantic Perspective, 1760-1950’, has received £286,000 from the Arts and Humanities Research Council (AHRC). It aims to take a fresh look at English ethnicity using thousands of untapped sources, including manuscripts and newspaper articles from this period. The team believes that their research will have wider-reaching implications, shedding light on current debates in UK identity politics and Englishness.
Professor MacRaild said: “It struck us as highly surprising that, though the English in North America formed an array of ethnic clubs and societies, such as the St George’s Society, no one has shown much interest in these associations, their activities and English cultural legacies.
“The English were one of the largest European groups of immigrants in the US yet, while they settled alongside the other migrants who powerfully exerted ethnic awareness, the English are not ascribed the attributes of ethnicity associated with other immigrants.
“The Irish, Scots, Germans, and many other European ethnic groups have been subjected to dozens if not hundreds of studies, but not so the English. The standard historian’s answer has been that the English assimilated more easily to Anglo-American culture so removing the need for ethnic expression. However, far from being an invisible group within a world of noticeably ethnicised European immigrants, the English consciously ethnicised themselves in an active way.”
Evident expressions of Englishness are found in English immigrants to America celebrating St George’s Day, toasting Queen Victoria, marking Shakespeare’s birthday, and Morris dancing. Benevolence was also of great importance, with many English associations being involved in providing charity – from meal tickets to ‘Christmas cheer’ – towards English immigrants experiencing hardships.
The team believe that Englishness has been overlooked by historians because, as the founding colonists, the English were the benchmark against which all other ethnic groups measured themselves.
Ironically, England’s relatively recent decline in global influence and the cultural changes produced by mass immigration and regional devolution have sparked increasing attempts to rediscover and define Englishness – seen in calls to celebrate St George’s Day as a national holiday and the rise of the English Defence League (EDL).
“At present,” Professor MacRaild argues, “Englishness in England is bedevilled with fears about right-wing extremists, football hooligans, and the uses and abuses of the now prevalent St George’s flag. We hope a project which will demonstrate the vibrancy of Englishness beyond England’s shores will contribute to debates about how Englishness fits into today’s multi-ethnic and increasingly federal political culture.”
Dr Tanja Bueltmann, an expert in the history of ethnic associations in the Scottish and English diasporas, added: “The growing movement for an independent Scotland has raised the issue of ‘Britishness’ and ‘Englishness’ in the wider society and influenced national debate about identity.
“Englishness as an ethnicity is now being rediscovered as a result of a crisis of confidence, partly influenced by the increasing fluidity of national borders and migration. Englishness is again being defined in opposition to other competing groups.”
Dr David Gleeson, historian of nineteenth-century America, said: “The project also has implications for the other side of the Atlantic. Recognising the English as a distinct diaspora gives us a clearer picture of the development of an American identity in that it complicates the idea of a coherent ‘Anglo’ cultural mainstream and indicates the fluid and adaptable nature of what it meant and means to be an American or Canadian.”
The research project will produce books, articles, an exhibition, and a series of public lectures to expatriate community groups throughout North America. The team will also work with local folk groups, including the Hexham Morris Men, and Folkworks at the Sage, Gateshead, to disseminate their findings to the wider public. International partners also working on the project are based in Guelph and Kansas Universities and from the College of Charleston.
Dr Gleeson added: “Perhaps English-Americans and Canadians will make a ‘Homecoming’, similar to the one organised by the Scottish government in 2009 for those of Scottish background, to re-establish connections with the land of their ancestors.”
_____________
References:
Past Horizons. 2011. "In Search of English". Past Horizons. Posted: May 27, 2011. Available online: http://www.pasthorizons.com/index.php/archives/05/2011/in-search-of-the-english
Monday, June 20, 2011
Campaign to save Australia's rock art
A national campaign to protect Australia’s rock art led by Griffith University’s Professor Paul Tacon was recently launched in Sydney. Professor Tacon, who is Australia’s first Chair in Rock Art, warned that in less than 50 years half of Australia’s valuable rock art sites could vanish.
He said without a national heritage register of rock art, Australia’s estimated 100,000 sites were in danger of being destroyed by vandalism or industrial and urban development.
“We need a fully-resourced research institute and national register-archive to bring together diverse forms of information about rock art sites and to plan for the future management and conservation of sites.”
“Australian rock art is priceless and older than the majority of priceless artworks in galleries around the world. It offers a unique visual archive of Australia’s history going back at least 15,000 years and it tells us about our recent and ancient pasts.”
Rock art consists of paintings, drawings, engravings, stencils and figures made of native beeswax in rock shelters and caves, on boulders and platforms.
He said South Africa led the world in national rock art archives with a major repository at the Rock Art Research Institute in Johannesburg.
“Many other nations have smaller scale national registers and archives but Australia has never had one. We are way behind other countries like China, South Africa, France, India and Spain in managing rock art heritage.”
The register and digital archive will be a joint initiative between Griffith University’s Place, Evolution and Rock Art Heritage Unit and the Australian National University’s Rock Art Research Centre.
It will have strong links with Indigenous communities, museums, and other universities. Collaborative fieldwork between Aboriginal Australians, archaeologists and other scientists will take place across Australia.
For more information visit the Protect Australia’s Spirit website
________________
References:
2011. "Campaign to save Australia's rock art". Past Horizons. Posted: June 1, 2011. Available online: http://www.pasthorizons.com/index.php/archives/06/2011/campaign-to-save-australias-rock-art
Sunday, June 19, 2011
Hybrid Mammoth DNA Found
Mitochondrial DNA analysis of a Columbian mammoth (Mammuthus columbi) found in Utah suggests that its mitochondrial DNA was nearly identical to that of the woolly mammoth (Mammuthus primigenius).
[Ranges of Columbian and Woolly mammoths (Genome Biology).]
"We think this individual may have been a woolly-Columbian hybrid," said Jacob Enk of the McMaster Ancient DNA Centre, the group that led the research, which was recently published in Genome Biology.
"Living African elephant species interbreed where their ranges adjoin, with males of the bigger species out-competing the smaller for mates," he explained in a press release. The mitochondrial genomes in the smaller females then show up in populations of the larger species. "Since woolly and Columbian ranges periodically overlapped in time and space, it's likely that they engaged in similar behaviour and left a similar genetic signal," Enk said.
Modern examples of this can be seen where two varieties of elephant in Africa encounter each other. The larger savanna elephant (Loxodonta africana africana) and the smaller forest elephant (Loxodonta africana cyclotis) are capable of interbreeding. Genetic evidence has fueled a debate over whether these two modern elephants are in fact separate species.
The hybridization of mammoths may explain other fossils that look like intermediates between the two species. These fossils were sometimes assigned to the species Mammuthus jeffersonii, but further research may show them to be hybrids of the woolly and Columbian mammoths.
The woolly mammoth was a smaller, furrier beast that lived in the north, closer to the glaciers of the Ice Ages, from Alaska through Canada and east to the Great Lakes and New England. The larger Columbian mammoth lived further south. It inhabited the western and southern portion of the U.S. as far south as Florida, and nearly to Chiapas in Mexico.
The mammoths should not be confused with the American mastodon (Mammut americanum), another ancient elephant from North and Central America.
_______________
References:
Wall, Tim. 2011. "Hybrid Mammoth DNA Found". Discovery News. Posted: June 1, 2011. Available online: http://news.discovery.com/animals/hybrid-mammoth-mix-breeds-possible-110601.html
Saturday, June 18, 2011
Human Brain Limits Twitter Friends to 150
The number of people we can truly be friends with is constant, regardless of social networking services like Twitter, according to a new study of the network.
Back in the early 1990s, the British anthropologist Robin Dunbar began studying the social groups of various kinds of primates. Before long, he noticed something odd.
Primates tend to maintain social contact with a limited number of individuals within their group. But here's the thing: primates with bigger brains tended to have a bigger circle of friends. Dunbar reasoned that this was because the number of individuals a primate could track was limited by brain volume.
Then he did something interesting. He plotted brain size against number of contacts and extrapolated to see how many friends a human ought to be able to handle. The number turned out to be about 150.
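Dunbar's extrapolation is, at heart, a one-variable regression: plot group size against relative brain size across primate species on log-log axes, fit a line, then read off the prediction at the human value. Here is a minimal sketch of that calculation; the species numbers below are invented purely to illustrate the shape of the method (Dunbar's own analysis used published neocortex ratios for dozens of primate species).

```python
import math

# Toy, illustrative data: (neocortex ratio, mean group size) pairs.
# These numbers are invented for this sketch; Dunbar's analysis used
# published values for dozens of primate species.
data = [(1.7, 11), (2.1, 19), (2.6, 35), (3.0, 56), (3.4, 90)]

# Fit a straight line to log(group size) vs. log(neocortex ratio).
xs = [math.log(r) for r, _ in data]
ys = [math.log(g) for _, g in data]
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def predicted_group_size(neocortex_ratio):
    """Extrapolate the fitted line to a given neocortex ratio."""
    return math.exp(intercept + slope * math.log(neocortex_ratio))

# Extrapolating to a human-like neocortex ratio (roughly 4) lands in the
# neighbourhood of 150 with these toy numbers.
print(round(predicted_group_size(4.1)))
```

The point is not the toy numbers but the leverage of the method: a line fitted on species with groups of a few dozen, pushed out to the human brain size, predicts a group of about 150.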
Since then, various studies have actually measured the number of people an individual can maintain regular contact with. These all show that Dunbar was just about spot on (although there is a fair spread in the results).
What's more, this number appears to have been constant throughout human history--from the size of neolithic villages to military units to 20th century contact books.
But in the last decade or so, social networking technology has had a profound influence on the way people connect. Twitter, for example, vastly increases the ease with which we can communicate with and follow others. It's not uncommon for tweeters to follow and be followed by thousands of others.
So it's easy to imagine that social networking technology finally allows humans to surpass the Dunbar number.
Not so, say Bruno Goncalves and buddies at Indiana University. They studied the network of links created by 3 million Twitter users over 4 years. These tweeters sent each other a whopping 380 million tweets.
But how to define friendship on Twitter? Goncalves and co say it's not enough simply to follow or be followed by somebody for there to be a strong link.
Instead, there has to be a conversation, an exchange of tweets. And these conversations have to be regular to be a sign of a significant social bond, so occasional contacts don't count.
Goncalves and pals used these rules to reconstruct the social network of all 3 million tweeters and studied how these networks evolve.
It turns out that when people start tweeting, their number of friends increases until they become overwhelmed. Beyond that saturation point, the conversations with less important contacts start to become less frequent and the tweeters begin to concentrate on the people they have the strongest links with.
So what is the saturation point? Or, in other words, how many people can tweeters maintain contact with before they get overwhelmed? The answer is between 100 and 200, just as Dunbar predicts.
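The friendship criterion described above (a link counts only when replies flow in both directions, and regularly enough) is easy to make concrete. In the toy sketch below, the reply log and the minimum-exchange threshold are invented for illustration; the real study applied analogous rules to 380 million tweets from 3 million users.

```python
from collections import Counter

# Toy log of directed replies: (sender, recipient) pairs. Invented for
# illustration; the real study parsed reply tweets from 3 million users.
replies = [
    ("alice", "bob"), ("bob", "alice"), ("alice", "bob"),
    ("alice", "carol"), ("carol", "alice"),
    ("alice", "dave"),  # one-way only: dave never replies
    ("bob", "carol"), ("carol", "bob"),
]

MIN_EXCHANGES = 1  # minimum replies required in *each* direction

counts = Counter(replies)

def friends_of(user):
    """Users with whom `user` has a reciprocated conversation."""
    others = {r for s, r in counts if s == user} | {s for s, r in counts if r == user}
    return {
        o for o in others
        if counts[(user, o)] >= MIN_EXCHANGES and counts[(o, user)] >= MIN_EXCHANGES
    }

# The size of this set, tracked over time, is the quantity whose
# saturation the study measured.
print(sorted(friends_of("alice")))  # dave is excluded: no reply ever came back
```

Run per user over a growing tweet log, a count like this rises at first and then flattens; the study found the plateau between 100 and 200 friends.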
"This finding suggests that even though modern social networks help us to log all the people with whom we meet and interact, they are unable to overcome the biological and physical constraints that limit stable social relations," say Goncalves and co.
The bottom line is this: social networking allows us to vastly increase the number of individuals we can connect with. But it does nothing to change our capacity to socialise. However hard we try, we cannot maintain close links with more than about 150 buddies.
And if Dunbar is correct, that's the way it'll stay until somebody finds a way to increase human brain size.
The abstract of the paper: Validation of Dunbar's number in Twitter conversations.
______________
References:
KFC. 2011. "Human Brain Limits Twitter Friends to 150". Technology Review. Posted: May 30, 2011. Available online: http://www.technologyreview.com/blog/arxiv/26824/
Friday, June 17, 2011
Roman Ship Carried Live Fish Tank
The ancient Romans might have traded live fish across the Mediterranean Sea by endowing their ships with an ingenious hydraulic system, a new investigation into a second century A.D. wreck suggests.
Consisting of a pumping system designed to suck the sea water into a fish tank, the apparatus has been reconstructed by a team of Italian researchers who analyzed a unique feature of the wreck: a lead pipe inserted in the hull near the keel.
Recovered in pieces from the Adriatic sea in 1999, the ship was carrying a cargo of processed fish when it sank six miles off the coast of Grado in northeastern Italy.
The small trade vessel, which was 55 feet long and 19 feet wide, was packed with some 600 vases called amphoras. They were filled with sardines, salted mackerel, and garum, a fish sauce much loved by the Romans.
Now the archaeologists suspect that the ship might also have carried some 200 kilograms (440 pounds) of live fish, placed in a tank on the deck in the aft area, during its sailing life.
"The apparatus shows how a simple small cargo vessel could have been turned into one able to carry live fish. This potentiality, if confirmed by future studies, shows that trading live fish was actually possible in the Roman world," Carlo Beltrame, a marine archaeologist at the Ca' Foscari University of Venice in Italy, told Discovery News.
Indeed, a number of historical accounts have suggested that the Romans might have transported live fish by sea. For example, the scientist and historian Pliny the Elder (23 – 79 A.D.), wrote that live parrotfish were shipped from the Black Sea to the Neapolitan coast in order to introduce the species into the Tyrrhenian Sea.
Measuring 51 inches in length and featuring a diameter of at least 2.7 inches, the unique lead pipe was located in a sort of "small bilge-well" and would have been connected to a hand-operated piston pump (which was not found within the wreck).
Ending with a hole right in the hull, the pipe intrigued the researchers.
"No seaman would have drilled a hole in the keel, creating a potential way for water to enter the hull, unless there was a very powerful reason to do so," Beltrame and colleagues reported in the International Journal of Nautical Archaeology.
According to the researchers, the reason wasn't the need for removing bilge water from the bottom of the boat through the pipe.
Indeed, bucket chain pumps were able to discharge bilge water from the side in a much safer way, possibly recovering between 110 and 225 liters (30 to 60 gallons) of water per minute.
"It seems unlikely that sailors aboard the small Grado ship abandoned the usual chain-pumping apparatus in favor of the more complex bilge pump," Beltrame said.
Rather than serving as a bilge pump to send water out of the ship, the pipe could have supported a sucking pump to bring water onto the vessel, the researchers argued.
But what could have been the purpose of such an unusual hydraulic system?
According to Beltrame and colleagues, the ship was too small to justify a pump for washing the decks or extinguishing fires (similar piston-driven suction systems were employed on warships such as Horatio Nelson's HMS Victory).
"Given the ship's involvement in the fish trade, the most logical hypothesis is that the piston pump worked to supply a fish tank with oxygenated water," said Beltrame.
The researchers calculated that the small trade vessel could have carried a tank containing around 4 cubic meters (141 cubic feet) of water.
This water mass would have created no problems for stability while housing some 200 kilograms (440 pounds) of live fish, such as sea bass or sea bream.
Connected to the lead pipe, the hand operated piston pump would have easily allowed the necessary exchange of the water mass.
According to the researchers, the water would have needed to be replaced once every half an hour in order to provide a constant oxygen supply.
"With a flow of 252 liters (66 gallons) per minute, the piston pump would have filled the tank in 16 minutes," Beltrame said.
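The reported fill time follows directly from the tank volume and the pump's throughput. The figures below are taken from the article; the script is just illustrative arithmetic:

```python
# Reported figures from the Grado wreck study
tank_volume_l = 4 * 1000        # 4 cubic metres of water, in litres
pump_flow_l_per_min = 252       # hand-operated piston pump throughput

# Time to fill the tank from empty
fill_time_min = tank_volume_l / pump_flow_l_per_min
print(f"Fill time: {fill_time_min:.0f} minutes")   # ~16 minutes

# The water had to be replaced every 30 minutes, so the pump
# needed to run only about half the time to keep the fish alive.
duty_cycle = fill_time_min / 30
print(f"Pump duty cycle: {duty_cycle:.0%}")
```

In other words, the crew would have had slack: pumping roughly 16 minutes out of every 30 would have sufficed.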
According to Rita Auriemma, a marine archaeologist at the University of Salento, it is plausible that the hydraulic system in the Grado ship served the live fish trade.
"The context in which the ship operated makes this the most logical explanation," Auriemma told Discovery News.
"The near Istria coast was known for numerous vivaria, large enclosures to breed fish. It is possible that the Grado ship transported live fish from these vivaria to large markets in the high Adriatic," Auriemma said.
Indeed, it would have taken about 10 hours to cross the nearly 30 miles of sea that separated the Istria vivaria from the river port of Aquileia, one of the richest Roman towns during the imperial period.
"Such a trip could have been sustained by the live fish only through an apparatus of continuous water exchange similar to that of the Grado ship," Beltrame said.
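A quick back-of-the-envelope check on that crossing, using the article's 30-mile distance, 10-hour duration and 30-minute exchange interval (all figures from the text):

```python
trip_distance_miles = 30
trip_hours = 10
exchange_interval_min = 30

# Implied average speed: about 3 mph, a plausible pace
# for a small Roman coastal trader.
speed_mph = trip_distance_miles / trip_hours
print(f"Average speed: {speed_mph:.1f} mph")

# Complete water changes needed over the voyage
exchanges = trip_hours * 60 // exchange_interval_min
print(f"Water exchanges per trip: {exchanges}")   # 20 complete changes
```

Twenty full changes of a four-tonne water mass per trip underlines why a dedicated pump, rather than buckets, would have been needed.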
___________
References:
Lorenzi, Rossella. 2011. "Roman Ship Carried Live Fish Tank". Discovery News. Posted: June 3, 2011. Available online: http://news.discovery.com/archaeology/roman-shipwreck-fish-tank-110603.html
Thursday, June 16, 2011
Intel anthropologist: Fieldwork with the silicon tribe
Anthropologist Genevieve Bell gives the chip maker insight into how people experience new technologies
How did you come to work as an anthropologist with a tech firm?
My mother was an anthropologist. I grew up in Australia on field sites and Aboriginal settlements, running around with no shoes and killing my supper. My father and grandfather were engineers, so being surrounded by people who constantly take things apart is very much part of my life.
What does your work involve?
At Intel, my job is to bring humanity back into the technology equation. I talk about what people care about, what motivates them, and then think about how understanding their everyday practices might generate new forms of technology.
At Stanford you studied Native American culture. Are there parallels with your work today?
Initially, Intel was like a field site. It was so profoundly foreign. That let me ask naive questions and it created a headspace I could work in. They also knew anthropology was interesting and believed you could apply it to anything. It was liberating. I was able to do research that I could not have done had I stayed in a university.
How does your work fit in at Intel?
It can drive silicon in the making. We studied why people love TV and learned it's because you press one button for a story - it is straightforward, flexible and lets you get away from the everyday.
The engineers said they didn't care. I said, "You do." A traditional microprocessor needs a fan to cool it down - which has implications for designing chip-based consumer electronics. This noise is going to kick in and ruin the mood. If television is about telling a story, nothing should get in the way of that story.
Do you often knock heads with engineers?
Engineering tends to start with what is technologically possible. Part of my job is about how you talk about experiences as a starting point instead. Taking a shower, for example: you don't need to know how plumbing works, but what people love about showering. This approach creates very different solutions.
Who are you designing for?
To design for real people, you've got to think of messy apartments where everything is plugged into the same electrical outlet. As an engineer you tend to imagine you're designing into a blank space. It's a different problem to think about how to create technology so compelling that a person is willing to give something up, to unplug it to plug your thing in.
Now that we have touchscreens, will user interfaces change again?
At the moment, we're thinking about how control by voice and gesture will materialise. The engineers I used to work with would tell me how they were going to make all televisions with voice recognition. I thought, as soon as you have more than one person in a room there is no way you want voice recognition. It would mean the television has to solve the problem of who's in charge. Remote controls make it easy.
Your new book is about the rising role of computers in everyday life. Why did you pick this topic?
Paul Dourish and I have been writing papers together for nearly a decade. He comes from a computer science background, and we've shifted each other's perspectives considerably. We thought the conversation about where technology development has come from and where it's going should have a broader audience: the field has moved on so much.
Your book deals in depth with privacy. How big an issue is it now?
Privacy was a big issue a decade ago. Today, people are more worried about reputation. We tested people with future scenarios, such as if your smart television could update your Facebook page about what you're watching. No one liked it. People said things like, "My girlfriend put the show on and left the room" or "I've only ever watched it once". We talk about the content we watch as part of who we are. One of the biggest anxieties we have about these technologies is that they reveal what we're really up to - what dreadful dorks we are.
Genevieve Bell is director of Interaction and Experience Research at Intel. She holds a PhD in cultural anthropology from Stanford University, California. Her first book, Divining a Digital Future: Mess and Mythology in Ubiquitous Computing is co-written with Paul Dourish and published by the MIT Press
____________
References:
Webb, Jeremy. 2011. "Intel anthropologist: Fieldwork with the silicon tribe". New Scientist. Posted: May 31, 2011. Available online: http://www.newscientist.com/blogs/culturelab/2011/05/intel-anthropologist-fieldwork-with-the-silicon-tribe.html
Wednesday, June 15, 2011
Bilingualism no big deal for brain, Kansas researcher finds
How do people who speak more than one language keep from mixing them up? How do they find the right word in the right language when being fluent in just one language means knowing about 30,000 words?
That's what science has wondered about for decades, offering complicated theories on how the brain processes more than one language and sometimes theorizing that bilingualism degrades cognitive performance.
But University of Kansas psycholinguist Mike Vitevitch thinks that complicated explanations of how the brain processes two or more languages overlook a straightforward and simple explanation.
"The inherent characteristics of the words — how they sound — provide enough information to distinguish which language a word belongs to," he said. "You don't need to do anything else."
And in an analysis of English and Spanish, published in the April 7 online edition of Bilingualism: Language and Cognition, Vitevitch found few words that sounded similar in the two languages.
Most theories of how bilingual speakers find a word in memory assume that each word is "labeled" with information about which language it belongs to, Vitevitch said.
But he disagrees. "Given how different the words in one language sound to the words in the other language, it seems like a lot of extra and unnecessary mental work to add a label to each word to identify it as being from one language or the other."
Here's an analogy. Imagine you have a bunch of apples and oranges in your fridge. The apples represent one language you know, the oranges represent another language you know and the fridge is that part of memory known as the lexicon, which contains your knowledge about language. To find an apple you just look for the round red thing in the fridge and to find an orange you just look for the round orange thing in the fridge. Once in a while you might grab an unripe, greenish orange mistaking it for a granny smith apple. Such instances of language "mixing" do happen on occasion, but they are pretty rare and are easily corrected, said Vitevitch.
"This process of looking for a specific piece of fruit is pretty efficient as it is — labeling each apple as an apple and each orange as an orange with a magic marker seems redundant and unnecessary."
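The intuition can be caricatured in a few lines of code. This is not Vitevitch's actual method, and the cue lists below are invented for illustration, but it shows how surface patterns alone can sort words into languages without any stored "label":

```python
# Toy illustration: crude spelling/sound cues separate the two
# vocabularies. Cue lists are hypothetical, chosen for the demo.
SPANISH_CUES = ("rr", "ll", "ñ")         # patterns rare or absent in English
ENGLISH_CUES = ("th", "gh", "ck", "wh")  # patterns rare or absent in Spanish

def guess_language(word):
    """Classify a word by its surface patterns alone, no label needed."""
    w = word.lower()
    if any(cue in w for cue in SPANISH_CUES):
        return "es"
    if any(cue in w for cue in ENGLISH_CUES):
        return "en"
    return "unknown"  # ambiguous forms exist -- hence occasional "mixing"

for word in ["perro", "llave", "niño", "thought", "check", "whale"]:
    print(word, "->", guess_language(word))
```

The "unknown" branch corresponds to the rare greenish-orange-versus-granny-smith confusions in the fridge analogy.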
Given how words in one language tend to sound different from words in another language, parents who speak different languages should not worry that their children will be confused or somehow harmed by learning two languages, said Vitevitch.
"Most people in most countries in the world speak more than one language," said Vitevitch. "If the U.S. wants to successfully compete in a global economy we need people who can communicate with potential investors and consumers in more than one language."
_______________
References:
EurekAlert. 2011. "Bilingualism no big deal for brain, Kansas researcher finds". EurekAlert. Posted: May 31, 2011. Available online: http://www.eurekalert.org/pub_releases/2011-05/uok-bnb053111.php
Tuesday, June 14, 2011
Marlborough mound mystery solved – after 4,400 years
For generations, it has been scrambled up with pride by students at Marlborough College. But the mysterious, pudding-shaped mound in the grounds of the Wiltshire public school now looks set to gain far wider acclaim as scientists have revealed it is a prehistoric monument of international importance.
After thorough excavations, the Marlborough mound is now thought to be around 4,400 years old, making it roughly contemporary with the nearby, and far more renowned, Silbury Hill.
The new evidence was described by one archeologist, an expert on ancient ritual sites in the area, as "an astonishing discovery". Both neolithic structures are likely to have been constructed over many generations.
The Marlborough mound had been thought to date back to Norman times. It was believed to be the base of a castle built 50 years after the Norman invasion and later landscaped as a 17th-century garden feature. But it has now been dated to around 2400BC from four samples of charcoal taken from the core of the 19-metre-high hill.
The samples prove it was built at a time when British tribes were combining labour on ritual monuments in the chalk downlands of Wiltshire, including Stonehenge and the huge ditches and stone circle of Avebury.
History students at the college will now have the chance to study an extraordinary example just a stone's throw from their classroom windows. Marlborough's Master, Nicholas Sampson, said: "We are thrilled at this discovery, which confirms the long and dramatic history of this beautiful site and offers opportunity for tremendous educational enrichment."
The Marlborough mound has been called "Silbury's little sister", after the more famous artificial hill on the outskirts of Avebury, which is the largest manmade prehistoric hill in Europe.
Marlborough, at two-thirds the height of Silbury, now becomes the second largest prehistoric mound in Britain; it may yet be confirmed as the second largest in Europe.
Jim Leary, the English Heritage archeologist who led a recent excavation of Silbury, said: "This is an astonishing discovery. The Marlborough mound has been one of the biggest mysteries in the Wessex landscape. For centuries, people have wondered whether it is Silbury's little sister, and now we have an answer. This is a very exciting time for British prehistory."
The dating was carried out as part of major conservation work amid concerns that tree roots could be destabilising the structure.
________________
References:
Kennedy, Maev. 2011. "Marlborough mound mystery solved – after 4,400 years". Guardian. Posted: May 31, 2011. Available online: http://www.guardian.co.uk/science/2011/may/31/malborough-mound-wiltshire-silbury-neolithic
Monday, June 13, 2011
Ancient War Revealed in Discovery of Incan Fortresses
Incan fortresses built some 500 years ago have been discovered along an extinct volcano in northern Ecuador, revealing evidence of a war fought by the Inca just before the Spanish conquistadors arrived in the Andes.
"We're seeing evidence for a pre-Columbian frontier, or borderline, that we think existed between Inca fortresses and Ecuadorian people's fortresses," project director Samuel Connell, of Foothill College in California, told LiveScience.
The team has identified what they think are 20 fortresses built by the Inca and two forts that were built by a people from Ecuador known as the Cayambe. The volcano is called Pambamarca.
The team's research was presented in March at the 76th annual meeting of the Society for American Archaeology (SAA), in Sacramento, Calif.
"We know that there are many, many fortresses throughout northern Ecuador that haven't been identified one way or the other," said Chad Gifford, of Columbia University, who is also a project director.
Spanish folklore?
The discoveries suggest that there is a ring of truth to stories that Spanish chroniclers told when they penetrated into South America during the 16th and 17th centuries.
According to these stories, Incan ruler Huayna Capac sought to conquer the Cayambe. Using a "very powerful army," he was hoping for a quick victory but ended up getting entangled in a 17-year struggle.
"Finding that their forces were not sufficient to face the Inca on an open battlefield, the Cayambes withdrew and made strongholds in a very large fortress that they had," wrote Spanish missionary Bernabe Cobo in the 17th century in his book "History of the Inca Empire," translated by Roland Hamilton and published in 1983 by the University of Texas Press. "The Inca ordered his men to lay siege to it and bombard it continuously; but the men inside resisted so bravely that they forced the Inca to raise the siege because he had lost so many men."
Finally, after many battles, the Inca succeeded in driving the Cayambe out of their strongholds and onto the shores of a lake.
Cobo wrote that "the Inca ordered his men to cut the enemies' throats without pity as they caught them and to throw the bodies into the lake; as a result the water of the lake became so darkened with blood that it was given the name that it has today of Yahuarcocha, which means lake of blood."
Signs of War
The newly discovered Inca fortresses are built out of stone, contain platforms called ushnus, and are located on ridges about 10,000 feet (3,000 meters) above sea level.
The soldiers who lived in them were clearly prepared for battle.
"The site of Quitoloma has well over 100 structures for people living inside," said Connell. "Those structures are filled with Inca weaponry. We find quite a few sling stones stored in these houses as if they were lying in wait for the enemy to attack, or were about to storm down the hill."
The two Cayambe forts, by comparison, are made out of a tough volcanic material called cangahua. They are sizable fortresses with people likely having lived both inside and outside their walls. "There are fewer of them but plenty big," Gifford said.
One of the forts had evidence for a battle with two types of ammunition (sling stones and bola stones) found outside its walls. Both fortifications housed pottery designed using Ecuadorian rather than Incan styles.
More excavation needs to be done to unravel the full story of these fortresses, but so far the team has found no evidence of post-conflict slaughter at the Cayambe sites. "We see the apparent continued settlement in the area, which runs counter to this idea of [a] lake of blood," Connell said.
Cayambe pottery continued to be used in the region, suggesting that their culture carried on, at least on some level. "It could be that some peoples decided after many years of resistance and warfare to simply lay down their arms or become allies with the Inca," Connell said.
There certainly would have been a need for them to become friends.
In the decades after the war, large numbers of Spanish would penetrate into Ecuador and Peru. Smallpox ravaged the local population and the Inca would find themselves fighting an enemy equipped with gunpowder. Against these pressures they fell back, with their last stronghold at Vilcabamba falling in 1572.
The conquest was nothing short of a disaster for people living in Ecuador. When the Spanish took over they built estates called haciendas. The descendants of the Cayambe would be forced to labor for the Spanish, doing work like processing wool. Connell said that they worked in "very severe conditions," sometimes in windowless rooms. It was a difficult time for a people who, just decades earlier, had fought a war for their freedom.
_____________
References:
Jarus, Owen. 2011. "Ancient War Revealed in Discovery of Incan Fortresses". Live Science. Posted: May 31, 2011. Available online: http://www.livescience.com/14370-incan-fortresses-ecuador-ancient-battles.html
Sunday, June 12, 2011
New Collections come to Enrich the Memory of the World
The Director-General of UNESCO, Irina Bokova, has recently endorsed recommendations by the International Advisory Committee of the Memory of the World Programme to inscribe 45 new documents and documentary collections from all over the world on the Memory of the World Register, which now numbers a total of 238 items.
“By helping safeguard and share such a varied documentary heritage, UNESCO’s Memory of the World Programme reinforces the basis for scholarship and enjoyment of the creative wealth and diversity of human cultures and societies,” said the Director-General.
The inscriptions were recommended by the International Advisory Committee of the Memory of the World Programme, which met in Manchester (UK) from 22 to 25 May 2011.
The 11th century Enina Apostolos is the most ancient extant Slavonic copy of the Acts and Epistles.
UNESCO launched the Memory of the World Programme in 1992 to guard against collective amnesia through the preservation of valuable archive holdings and library collections all over the world, and to ensure their wide dissemination.
The UNESCO website explains the decision to create such a list:
“Listing of items such as these on the Memory of the World Register is intended to generate interest and help with the conservation of documentary heritage which helps us to understand our society in all its complexities. However war, social upheaval, looting, illegal trading, destruction, inadequate conservation and lack of funding have all had a disastrous effect on the conservation of our documentary heritage. A growing awareness of this, together with UNESCO’s belief that the world’s documentary heritage belongs to all and should be preserved and protected, led to the establishment of its Memory of the World programme in 1992.”
The programme works to identify and facilitate the preservation of valuable archive holdings and library collections worldwide, and assists with their dissemination. Inscription of a collection in the Memory of the World register, created in 1995, is part of the process.
The Memory of the World Register covers all types of material and media, including stone, celluloid, parchment, audio recordings and more.
Mainz Psalter - The earliest document of a printed text featuring multicoloured printed decoration, and the first example of a book produced entirely by means of mechanical methods.
New items inscribed:
Austria: Mainz Psalter at the Austrian National Library; Arnold Schönberg Estate
Barbados, Jamaica, Panama, Saint Lucia, the United Kingdom and the United States of America: Silver Men: West Indian Labourers at the Panama Canal
Bolivia: Documentary Fonds of Royal Audiencia Court of La Plata (RALP)
Brazil: Fonds of the Network of information and counter information of the military regime in Brazil
Bulgaria: Enina Apostolos, Old Bulgarian Cyrillic manuscript (fragment) of the 11th century
China: Ben Cao Gang Mu (Compendium of Materia Medica); Huang Di Nei Jing (Yellow Emperor’s Inner Canon)
Czech Republic: Collection of 526 prints of university theses from 1637-1754
Denmark: MS. GKS 4 2°, vol. I-III, Biblia Latina. Commonly called “the Hamburg Bible”, or “the Bible of Bertoldus”
Fiji, Guyana, Suriname, Trinidad and Tobago: Documentary Heritage of the Indian Indentured Labourers
France: Bibliothèque de Beatus Rhenanus
Germany: Construction and Fall of the Berlin Wall and the Two-Plus-Four-Treaty of 1990; Patent DRP 37435 “Vehicle with gas engine operation” submitted by Carl Benz, Mannheim (29 January 1886)
India: ’Laghukalachakratantrarajatlka’ (Vimalprabha); Tarikh-E-Khandan-E-Timuriyah
Indonesia and the Netherlands: La Galigo
Iran: A Collection of Nezami’s Khamseh; Al-Tafhim li Awa’il Sana’at al-Tanjim (The Book of Instruction in the Elements of the Art of Astrology)
Italy: Lucca’s Historical Diocesan Archives (ASDLU): Early Middle Ages documents
Japan: Sakubei Yamamoto Collection
Korea, Republic of: Human Rights Documentary Heritage 1980 Archives for the May 18th Democratic Uprising against Military Regime, in Gwangju, Republic of Korea; Ilseongnok: the Records of Daily Reflections
Mexico: Sixteenth to eighteenth century pictographs from the record group “Maps, drawings and illustrations”
Mongolia: Lu. “Altan Tobchi” – Golden History written in 1651; Mongolian Tanjur
Morocco: Kitab al-ibar, wa diwan al-mobtadae wa al-khabar
Netherlands: Desmet Collection
Netherlands, Brazil, Ghana, Guyana, Netherlands Antilles, Suriname, United Kingdom, United States of America: Dutch West India Company (Westindische Compagnie) Archives
Netherlands, Curacao and Suriname: Archive Middelburgsche Commercie Compagnie (MCC)
Norway: Thor Heyerdahl Archives
Philippines: Presidential Papers of Manuel L. Quezon
Poland: Archive of Warsaw Reconstruction Office
Russian Federation: Ostromir Gospel (1056-1057); Leo Tolstoy’s Personal Library and Manuscripts, Photo and Film Collection
Saint Kitts and Nevis: Registry of Slaves of Bermuda 1821 -1834 (an addendum to Slaves of the British Caribbean 1817-1834, inscribed in 2009)
Sweden: Stockholm City Planning Committee Archives; Codex Argenteus – the ‘Silver Bible’
Switzerland: Les Collections Jean-Jacques Rousseau de Genève et de Neuchâtel
Thailand: The Epigraphic Archives of Wat Pho
Trinidad and Tobago: The Constantine Collection
Tunisia: Privateering and the international relations of the Regency of Tunis in the 18th and 19th centuries
United Kingdom: Historic Ethnographic Recordings (1898 – 1951) at the British Library
Viet Nam: Stone Stele Records of Royal Examinations of the Le and Mac Dynasties (1442-1779)
Eleven countries enter the Memory of the World Register for the first time with the new inscriptions: Bulgaria, Fiji, Guyana, Ireland, Japan, Mongolia, Morocco, Panama, Suriname, Switzerland, Tunisia.
Visit the Memory of the World site.
_________________
References:
2011. "New Collections come to Enrich the Memory of the World". Past Horizons. Posted: May 30, 2011. Available online: http://www.pasthorizons.com/index.php/archives/05/2011/new-collections-come-to-enrich-the-memory-of-the-world
Saturday, June 11, 2011
Climate played big role in Vikings' disappearance from Greenland
The end of the Norse settlements on Greenland likely will remain shrouded in mystery. While there is scant written evidence of the colony's demise in the 14th and early 15th centuries, archaeological remains can fill some of the blanks, but not all.
What climate scientists have been able to ascertain is that an extended cold snap, called the Little Ice Age, gripped Greenland beginning in the 1400s. This has been cited as a major cause of the Norse's disappearance. Now researchers led by Brown University show the climate turned colder in an earlier span of several decades, setting in motion the end of the Greenland Norse. Their findings appear in Proceedings of the National Academy of Sciences.
The Brown scientists' finding comes from the first reconstruction of 5,600 years of climate history from two lakes in Kangerlussuaq, near the Norse "Western Settlement." Unlike ice cores taken from the Greenland ice sheet hundreds of miles inland, the new lake core measurements reflect air temperatures where the Vikings lived, as well as those experienced by the Saqqaq and the Dorset, Stone Age cultures that preceded them.
"This is the first quantitative temperature record from the area they were living in," said William D'Andrea, the paper's first author, who earned his doctorate in geological sciences at Brown and is now a postdoctoral researcher at the University of Massachusetts–Amherst. "So we can say there is a definite cooling trend in the region right before the Norse disappear."
"The record shows how quickly temperature changed in the region and by how much," said co-author Yongsong Huang, professor of geological sciences at Brown, principal investigator of the NSF-funded project, and D'Andrea's Ph.D. adviser. "It is interesting to consider how rapid climate change may have impacted past societies, particularly in light of the rapid changes taking place today."
D'Andrea points out that climate is not the only factor in the demise of the Norse Western Settlement. The Vikings' sedentary lifestyle, reliance on agriculture and livestock for food, dependence on trade with Scandinavia and combative relations with the neighboring Inuit, are believed to be contributing factors.
Still, it appears that climate played a significant role. The Vikings arrived in Greenland in the 980s, establishing a string of small communities along Greenland's west coast. (Another grouping of communities, called the "Eastern Settlement" also was located on the west coast but farther south on the island.) The arrival coincided with a time of relatively mild weather, similar to that in Greenland today. However, beginning around 1100, the climate began an 80-year period in which temperatures dropped 4 degrees Celsius (7 degrees Fahrenheit), the Brown scientists concluded from the lake readings. While that may not be considered precipitous, especially in the summer, the change could have ushered in a number of hazards, including shorter crop-growing seasons, less available food for livestock and more sea ice that may have blocked trade.
"You have an interval when the summers are long and balmy and you build up the size of your farm, and then suddenly year after year, you go into this cooling trend, and the summers are getting shorter and colder and you can't make as much hay. You can imagine how that particular lifestyle may not be able to make it," D'Andrea said.
Archaeological and written records show the Western Settlement persisted until sometime around the mid-1300s. The Eastern Settlement is believed to have vanished in the first two decades of the 1400s.
The researchers also examined how climate affected the Saqqaq and Dorset peoples. The Saqqaq arrived in Greenland around 2500 B.C. While there were warm and cold swings in temperature for centuries after their arrival, the climate took a turn for the bitter beginning roughly 850 B.C., the scientists found. "There is a major climate shift at this time," D'Andrea said. "It seems that it's not as much the speed of the cooling as the amplitude of the cooling. It gets much colder."
The Saqqaq exit coincides with the arrival of the Dorset people, who were more accustomed to hunting from the sea ice that would have accumulated with the colder climate at the time. Yet by around 50 B.C., the Dorset culture was waning in western Greenland, despite its affinity for cold weather. "It is possible that it got so cold they left, but there has to be more to it than that," D'Andrea said.
_____________
EurekAlert. 2011. "Climate played big role in Vikings' disappearance from Greenland". EurekAlert. Posted: May 30, 2011. Available online: http://www.eurekalert.org/pub_releases/2011-05/bu-cpb052611.php
Friday, June 10, 2011
The Bilingual Advantage
A cognitive neuroscientist, Ellen Bialystok has spent almost 40 years learning about how bilingualism sharpens the mind. Her good news: Among other benefits, the regular use of two languages appears to delay the onset of Alzheimer’s disease symptoms. Dr. Bialystok, 62, a distinguished research professor of psychology at York University in Toronto, was awarded a $100,000 Killam Prize last year for her contributions to social science. We spoke for two hours in a Washington hotel room in February and again, more recently, by telephone. An edited version of the two conversations follows.
Q. How did you begin studying bilingualism?
A. You know, I didn’t start trying to find out whether bilingualism was bad or good. I did my doctorate in psychology: on how children acquire language. When I finished graduate school, in 1976, there was a job shortage in Canada for Ph.D.’s. The only position I found was with a research project studying second language acquisition in school children. It wasn’t my area. But it was close enough.
As a psychologist, I brought neuroscience questions to the study, like “How does the acquisition of a second language change thought?” It was these types of questions that naturally led to the bilingualism research. The way research works is, it takes you down a road. You then follow that road.
Q. So what exactly did you find on this unexpected road?
A. As we did our research, you could see there was a big difference in the way monolingual and bilingual children processed language. We found that if you gave 5- and 6-year-olds language problems to solve, monolingual and bilingual children knew, pretty much, the same amount of language.
But on one question, there was a difference. We asked all the children if a certain illogical sentence was grammatically correct: “Apples grow on noses.” The monolingual children couldn’t answer. They’d say, “That’s silly” and they’d stall. But the bilingual children would say, in their own words, “It’s silly, but it’s grammatically correct.” The bilinguals, we found, manifested a cognitive system with the ability to attend to important information and ignore the less important.
Q. How does this work — do you understand it?
A. Yes. There’s a system in your brain, the executive control system. It’s a general manager. Its job is to keep you focused on what is relevant, while ignoring distractions. It’s what makes it possible for you to hold two different things in your mind at one time and switch between them.
If you have two languages and you use them regularly, the way the brain’s networks work is that every time you speak, both languages pop up and the executive control system has to sort through everything and attend to what’s relevant in the moment. Therefore the bilinguals use that system more, and it’s that regular use that makes that system more efficient.
Q. One of your most startling recent findings is that bilingualism helps forestall the symptoms of Alzheimer’s disease. How did you come to learn this?
A. We did two kinds of studies. In the first, published in 2004, we found that normally aging bilinguals had better cognitive functioning than normally aging monolinguals. Bilingual older adults performed better than monolingual older adults on executive control tasks. That was very impressive because it didn’t have to be that way. It could have turned out that everybody just lost function equally as they got older.
That evidence made us look at people who didn’t have normal cognitive function. In our next studies , we looked at the medical records of 400 Alzheimer’s patients. On average, the bilinguals showed Alzheimer’s symptoms five or six years later than those who spoke only one language. This didn’t mean that the bilinguals didn’t have Alzheimer’s. It meant that as the disease took root in their brains, they were able to continue functioning at a higher level. They could cope with the disease for longer.
Q. So high school French is useful for something other than ordering a special meal in a restaurant?
A. Sorry, no. You have to use both languages all the time. You won’t get the bilingual benefit from occasional use.
Q. One would think bilingualism might help with multitasking — does it?
A. Yes, multitasking is one of the things the executive control system handles. We wondered, “Are bilinguals better at multitasking?” So we put monolinguals and bilinguals into a driving simulator. Through headphones, we gave them extra tasks to do — as if they were driving and talking on cellphones. We then measured how much worse their driving got. Now, everybody’s driving got worse. But the bilinguals, their driving didn’t drop as much. Because adding on another task while trying to concentrate on a driving problem, that’s what bilingualism gives you — though I wouldn’t advise doing this.
Q. Has the development of new neuroimaging technologies changed your work?
A. Tremendously. It used to be that we could only see what parts of the brain lit up when our subjects performed different tasks. Now, with the new technologies, we can see how all the brain structures work in accord with each other.
In terms of monolinguals and bilinguals, the big thing that we have found is that the connections are different. So we have monolinguals solving a problem, and they use X systems, but when bilinguals solve the same problem, they use others. One of the things we’ve seen is that on certain kinds of even nonverbal tests, bilingual people are faster. Why? Well, when we look in their brains through neuroimaging, it appears like they’re using a different kind of a network that might include language centers to solve a completely nonverbal problem. Their whole brain appears to rewire because of bilingualism.
Q. Bilingualism used to be considered a negative thing — at least in the United States. Is it still?
A. Until about the 1960s, the conventional wisdom was that bilingualism was a disadvantage. Some of this was xenophobia. Thanks to science, we now know that the opposite is true.
Q. Many immigrants choose not to teach their children their native language. Is this a good thing?
A. I’m asked about this all the time. People e-mail me and say, “I’m getting married to someone from another culture, what should we do with the children?” I always say, “You’re sitting on a potential gift.”
There are two major reasons people should pass their heritage language onto children. First, it connects children to their ancestors. The second is my research: Bilingualism is good for you. It makes brains stronger. It is brain exercise.
Q. Are you bilingual?
A. Well, I have fully bilingual grandchildren because my daughter married a Frenchman. When my daughter announced her engagement to her French boyfriend, we were a little surprised. It’s always astonishing when your child announces she’s getting married. She said, “But Mom, it’ll be fine, our children will be bilingual!”
______________
References:
Dreifus, Claudia. 2011. "The Bilingual Advantage". New York Times. Posted: May 30, 2011. Available online: http://www.nytimes.com/2011/05/31/science/31conversation.html?_r=1
Q. One of your most startling recent findings is that bilingualism helps forestall the symptoms of Alzheimer’s disease. How did you come to learn this?
A. We did two kinds of studies. In the first, published in 2004, we found that normally aging bilinguals had better cognitive functioning than normally aging monolinguals. Bilingual older adults performed better than monolingual older adults on executive control tasks. That was very impressive because it didn’t have to be that way. It could have turned out that everybody just lost function equally as they got older.
That evidence made us look at people who didn’t have normal cognitive function. In our next studies, we looked at the medical records of 400 Alzheimer’s patients. On average, the bilinguals showed Alzheimer’s symptoms five or six years later than those who spoke only one language. This didn’t mean that the bilinguals didn’t have Alzheimer’s. It meant that as the disease took root in their brains, they were able to continue functioning at a higher level. They could cope with the disease for longer.
Q. So high school French is useful for something other than ordering a special meal in a restaurant?
A. Sorry, no. You have to use both languages all the time. You won’t get the bilingual benefit from occasional use.
Q. One would think bilingualism might help with multitasking — does it?
A. Yes, multitasking is one of the things the executive control system handles. We wondered, “Are bilinguals better at multitasking?” So we put monolinguals and bilinguals into a driving simulator. Through headphones, we gave them extra tasks to do — as if they were driving and talking on cellphones. We then measured how much worse their driving got. Now, everybody’s driving got worse. But the bilinguals, their driving didn’t drop as much. Because adding on another task while trying to concentrate on a driving problem, that’s what bilingualism gives you — though I wouldn’t advise doing this.
Q. Has the development of new neuroimaging technologies changed your work?
A. Tremendously. It used to be that we could only see what parts of the brain lit up when our subjects performed different tasks. Now, with the new technologies, we can see how all the brain structures work in accord with each other.
In terms of monolinguals and bilinguals, the big thing that we have found is that the connections are different. So we have monolinguals solving a problem, and they use X systems, but when bilinguals solve the same problem, they use others. One of the things we’ve seen is that on certain kinds of even nonverbal tests, bilingual people are faster. Why? Well, when we look in their brains through neuroimaging, it appears like they’re using a different kind of a network that might include language centers to solve a completely nonverbal problem. Their whole brain appears to rewire because of bilingualism.
Q. Bilingualism used to be considered a negative thing — at least in the United States. Is it still?
A. Until about the 1960s, the conventional wisdom was that bilingualism was a disadvantage. Some of this was xenophobia. Thanks to science, we now know that the opposite is true.
Q. Many immigrants choose not to teach their children their native language. Is this a good thing?
A. I’m asked about this all the time. People e-mail me and say, “I’m getting married to someone from another culture, what should we do with the children?” I always say, “You’re sitting on a potential gift.”
There are two major reasons people should pass their heritage language onto children. First, it connects children to their ancestors. The second is my research: Bilingualism is good for you. It makes brains stronger. It is brain exercise.
Q. Are you bilingual?
A. Well, I have fully bilingual grandchildren because my daughter married a Frenchman. When my daughter announced her engagement to her French boyfriend, we were a little surprised. It’s always astonishing when your child announces she’s getting married. She said, “But Mom, it’ll be fine, our children will be bilingual!”
______________
References:
Dreifus, Claudia. 2011. "The Bilingual Advantage". New York Times. Posted: May 30, 2011. Available online: http://www.nytimes.com/2011/05/31/science/31conversation.html?_r=1
Thursday, June 9, 2011
Archaeological Study of Ancient Swahili Town
An archaeological dig at Songo Mnara, a World Heritage site on the southern coast of Tanzania, will enable researchers to explore aspects of ancient urban planning in coastal East Africa.
Large-scale funding from the Arts and Humanities Research Council (AHRC) and the National Science Foundation (NSF) will enable an exceptionally well-preserved example of an ancient Swahili stonetown on the coast of East Africa to be excavated this summer. The project is being undertaken by an international team of archaeologists from the Universities of York and Bristol in the UK and Rice in the USA.
Songo Mnara is recognised as the most impressive of all Swahili townscapes, including more than 40 coral-built houses and room-blocks, five mosques and multiple cemeteries.
Occupation of the site was brief, from the fourteenth to sixteenth centuries AD, coinciding with the golden age of Swahili stonetowns along the eastern African coast. Such towns were home to an indigenous and cosmopolitan form of urbanism that linked Africa with the Indian Ocean world system from AD 700 to 1500.
The research team, led by Dr Stephanie Wynne-Jones of the University of York and Dr Jeffrey Fleisher of Rice University, will also involve Bristol University’s archaeologists Dr Kate Robson-Brown and Professor Mark Horton.
The team will use archaeological techniques to investigate economic, social and ritual activities at Songo Mnara. During three seasons of fieldwork, they will study household activities through excavations within and around buildings, and public and communal practices in the open areas and monuments of the site. They will also accurately plot the site’s plan and its position in the island’s landscape.
Professor Mark Horton said, “It has always been my ambition to work at Songo Mnara, one of the very best preserved of the Swahili stonetowns. With this kind of support – the most ever given to examine these remarkable sites – we can hope to understand medieval world systems.”
Dr Stephanie Wynne-Jones added: “The project will allow us to collect lots of valuable information to build up a clear picture of how Swahili towns were both planned and unplanned. This will offer insight into how town plans emerged – through the efforts of powerful people, but also through the more basic and daily acts of those living in and moving through the town.
“Our work will also shed light on the organizational principles in ancient town plans more generally, and contribute to wider research on aspects of town layouts that were formed by movement, activity and use, rather than formal planning from above.”
The project will contribute to the conservation of Songo Mnara, which remains an ‘endangered’ World Heritage Site, by encouraging community involvement and providing educational opportunities demonstrating the unique role that archaeology can play in the site’s preservation and documentation.
________________
References:
Past Horizons. "Archaeological Study of Ancient Swahili Town". Past Horizons. Posted: May 29, 2011. Available online: http://www.pasthorizons.com/index.php/archives/05/2011/archaeological-study-of-ancient-swahili-town
Wednesday, June 8, 2011
Ancient world dictionary finished — after 90 years
It was a monumental project with modest beginnings: a small group of scholars and some index cards. The plan was to explore a long-dead language that would reveal an ancient world of chariots and concubines, royal decrees and diaries — and omens that came from the heavens and sheep livers.
The year: 1921. The place: The University of Chicago. The project: Assembling an Assyrian dictionary based on words recorded on clay or stone tablets unearthed from ruins in Iraq, Iran, Syria and Turkey, written in a language that hadn't been uttered for more than 2,000 years. The scholars knew the project would take a long time. No one quite expected how very long.
Decades passed. The team grew. Scholars arrived from Vienna, Paris, Copenhagen, Jerusalem, Berlin, Helsinki, Baghdad and London, joining others from the U.S. and Canada. One generation gave way to the next, one century faded into the next. Some signed on early in their careers; they were still toiling away at retirement. The work was slow, sometimes frustrating and decidedly low-tech: Typewriters. Mimeograph machines. And index cards. Eventually, nearly 2 million of them.
And now, 90 years later, a finale. The Chicago Assyrian Dictionary is now officially complete — 21 volumes of Akkadian, a Semitic language (with several dialects, including Assyrian) that endured for 2,500 years. The project is more encyclopedia than glossary, offering a window into the ancient society of Mesopotamia, now modern-day Iraq, through every conceivable form of writing: love letters, recipes, tax records, medical prescriptions, astronomical observations, religious texts, contracts, epics, poems and more.
Why is there a need for a dictionary of a language last written around A.D. 100 that only a small number of scholars worldwide know of? Gil Stein, director of the university's Oriental Institute (the dictionary's home), has a ready answer:
"The Assyrian Dictionary gives us the key into the world's first urban civilization," he says. "Virtually everything that we take for granted ... has its origins in Mesopotamia, whether it's the origins of cities, of state societies, the invention of the wheel, the way we measure time, and most important the invention of writing.
"If we ever want to understand our roots," Stein adds, "we have to understand this first great civilization."
The translated cuneiform texts — originally written with wedge-shaped characters — reveal a culture where people expressed joy, anxiety and disappointment about the same events they do today: a child's birth, bad harvests, money troubles, boastful leaders.
"A lot of what you see is absolutely recognizable — people expressing fear and anger, expressing love, asking for love," says Matthew Stolper, a University of Chicago professor who worked on the project on and off over three decades. "There are inscriptions from kings that tell you how great they are, and inscriptions from others who tell you those guys weren't so great. ... There's also a lot of ancient versions of 'your check is in the mail.' And there's a common phrase in old Babylonian letters that literally means 'don't worry about a thing.'"
There were omens, too — ways of divining the future by reading smoke patterns, the stars, the moon and sheep livers.
"Like all people at all times, they wanted to try to find some way of controlling their world," says Martha Roth, the dictionary's editor-in-charge and dean of humanities. "It's very difficult to draw the line between actually believing and being superstitious."
Robert Biggs, professor emeritus at the university, devoted nearly a half century to the dictionary, sometimes uncovering tablets on digs in the Iraq desert, sometimes poring over texts in museums in London and Baghdad. His specialty is Babylonian medicine. For almost an entire year, he studied thousands of references to sheep livers.
For example: If a sheep's gallbladder — part of the liver — was long and pointed, it meant the defeat of the enemy king. If there was a certain kind of crease on the liver, it could mean the king was going on a journey. A lunar eclipse could mean danger for a king.
But the tablets reached far beyond royalty. Biggs says they included everything from a disputed paternity case to agricultural loans to famine, where desperate people sold their children for cash. "Life was very fragile ... it was much more risky than it is now," he says.
Making sense of it all was painstaking work. Some of the wedge-shaped characters changed over the thousands of years, and the tablets excavated from ancient temples, palaces and cities were frequently crumbling. Often there was no punctuation, so it was hard to know where one word ended and the other began.
"You'd sit in a room with a good light and turn the tablet in various directions to see as much as possible," Biggs explains. "Quite often the tablets were broken so you might see part of a sign. And different people looking at the same thing would see something different because of the way you'd hold it."
"Sometimes it got to be very tedious," he adds. "Other times there was a sense of exhilaration if you could solve some problem or figure out what a rare word means."
Regardless, the work continued.
"You always saw the light at the end of the tunnel," Biggs says. "But the end of the tunnel kept getting further and further away."
An early 10-year completion deadline was soon deemed unrealistic. "Scholars always underestimated how difficult it would be," Roth says. "People always expected the project would end in their lifetime. What can I tell you? That's not always the way it goes."
There was much to research, much to record. By 1935, scholars already had 1 million index cards. It would take more than 30 years before the first of the 21 volumes was published. Most cover a single letter. The entire collection spans about 10,000 pages and 28,000 words. The definitions are more fitting for an encyclopedia; they provide cultural and historical context, similar to those in the Oxford English Dictionary.
"It's not just that a word means king," Roth says. "It's a matter of understanding the thousands and thousands of references to the word king in every document in every period."
Roth notes that after arriving at the university in 1979, she asked to work on the word witness or witnessing. That took four to five years. On the other hand, there might be just a dozen references to a jar holding grain and that research could be complete in an afternoon.
Now that the dictionary is finished, Roth says there's a feeling of tremendous accomplishment and "a little bit of a sense of loss.... This has occupied my waking and sleeping moments for 32 years. You dream this stuff."
The end also brings a realization that as more tablets are unearthed, more discoveries will be made.
"It's like driving a Porsche off the lot and looking in the Blue Book (listing a car's worth) and seeing how much value it's lost," Stolper says. "The moment it's done, it's out of date."
Biggs says the scholars are satisfied with the final version, but there is that lingering temptation.
"It might be nice to start over," he says, "but no one has the courage to do it anymore."
Online: Chicago Assyrian Dictionary
_______________
References:
Cohen, Sharon. 2011. "Ancient world dictionary finished — after 90 years". Yahoo News. Posted: June 4, 2011. Available online: http://news.yahoo.com/s/ap/20110604/ap_on_re_us/us_postcard_the90_year_dictionary_project