The Birds and the ‘Bots
Birds, music and artificial intelligence were among the topics featured in the news coverage of several papers published in PLoS ONE last week.
Coen Elemans and colleagues at the University of Pennsylvania and the University of Utah studied the European starling and the zebra finch and found that these songbirds control their songs with the fastest-contracting muscle type yet described. Such superfast muscles were previously known only from the sound-producing organs of rattlesnakes, several fish species and the ringdove, but the new study suggests they may be more common than once thought. The study was covered in the New York Times (Learning From a Muddy Muscle Master), National Geographic (Fastest Known Muscles Found in Songbirds' Throats), The Telegraph (Songbirds have superfast muscles) and The Independent (Songbirds develop super muscles for dawn chorus), among other places.
Birds were also the subject of a study entitled Birds Reveal their Personality when Singing, by Garamszegi and colleagues. The researchers used bird song as a model to investigate whether behavioural traits involved in sexual advertisement can serve as good indicators of personality in wild animals. They found that females preferred males who sang close to the ground, which may carry a higher predation risk because it offers less concealment and leaves males in full view of predators. Only prime-quality individuals can bear the costs of such exposed singing, while cheaters are eliminated by predators. The study was picked up by CBS News (Bold male bird gets the girl: study) and blogged by GrrlScientist (Singing the Praises of Mr Personality).
“Most musical, most melancholy bird,” said Samuel Taylor Coleridge of the nightingale, but whether birdsong can affect us in the same way as a beautiful sonata played by a human musician is another matter. Stefan Koelsch at the University of Sussex, meanwhile, investigated whether people respond in the same way to computerised music – particularly to unexpected chord changes – as they do to music played by humans. The researchers recorded the participants’ electrical brain responses and skin conductance responses and found that while the original, human-performed music elicited brain activity in the listeners and caused them to sweat, the modified music generated little response. The authors suggest that the brain is therefore more likely to search for musical meaning when music is played by a human pianist. Perhaps the computerised music in the study wasn’t quite as poignant as HAL’s rendition of Daisy in 2001: A Space Odyssey.
The best headline of the week must surely go to The Chronicle of Higher Education for its article on the study: Don't Cry For Me, R2D2. Other coverage included stories in The Telegraph (Sweaty music find could help develop new treatments), The Guardian (Music that brings a tear to the eye), Wired (Study: Computer Musicians Ain't Got No Soul) and PsychCentral (Computer Music Not As Calming).
Finally, a study by Sören Krach and colleagues investigated how increasing the human-likeness of an interaction partner modulates participants’ brain activity. In this study, participants played a simple computer game (the prisoner’s dilemma) against four different partners: a regular computer notebook, a functionally designed Lego robot, the anthropomorphic robot BARTHOC Jr. and a human. The fMRI study found that the more human-like the opponent, the more strongly the participants’ cortical regions associated with mental-state attribution were engaged, and the more the participants enjoyed the interaction. The study was blogged by io9 (Proof that the Brain Cannot Distinguish Between Human and Humanoid) and in the Chronicle of Higher Education (Our Brains Attribute Human Qualities to Humanoid Machines).
Fifty-three other papers were published in PLoS ONE last week (including an article by Laurie Graham and colleagues, which was covered by The Economist), and all can be read, rated and discussed on the journal website.