Scientists call it the "cocktail party problem." To understand the person talking to you in a noisy room, you've got to filter out all of the other conversations, clinking glasses, and background noise. Fortunately, our brains are up to the challenge, and now—thanks to a little help from a humanoid robot—researchers have found new clues to how we do it.
Then the researchers brought in the robot. Team leader Hirohito Kondo explains that he and his colleagues wanted to know whether head movements could reset the cocktail party effect—that is, once we've filtered out the background noise, does turning our heads bring back the cacophony?
Al Bregman, a hearing scientist at McGill University in Montreal, Canada, was impressed by the research but is reluctant to believe that the brain harbors such a flaw. Instead, he suggests, the problem may lie with the sound stimulus used in these sorts of studies.
"The system is so exquisite in its capabilities, able to detect sub-millisecond asynchronies between the signals at the two ears," he says, "that it is hard for me to believe that the Kondo et al. results reflect a crude flaw in the system."