Dogs and the language barrier
(appeared on 12th Jan 2022)


Does my dog understand language? Or does she only respond to sounds and gestures, asks S. Ananthanarayanan.

Many dog lovers believe that dogs understand ‘everything we say’. The more grounded think dog ‘obedience’ is a conditioned response to sounds, not comprehension. Laura V. Cuaya, Raúl Hernández-Pérez, Marianna Boros, Andrea Deme and Attila Andics, from the Department of Ethology, Eötvös Loránd University, Budapest, and the Lingual Articulation Research Group, a research programme in Budapest, describe in the journal NeuroImage work which shows that dogs can distinguish speech sounds from non-speech, and can make out the difference between languages.

The development of speech and language, which sets humans apart from other living things, has long been a subject of study. The general view is that a child learns its mother tongue by constant exposure and trial and error, and begins to grasp what different sounds convey and can convey, from the names of things and actions to the rules of grammar that bind words together.

And then, there are theories of how the rules of grammar arise. On one hand, there is the idea that all languages share the same basic grammar. Then, there is the observation that bird-song has a pattern, which is taught and learnt, that dolphins communicate using structured sounds, and so on. And at yet another level, there is the mechanics of language: programming a computer to read, speak, or translate. This last challenge is tackled by building computer structures that work the way neurons, or nerve cells, in the animal brain are believed to work.

Even if we accept that a child learns by repetition and trial and error, this does not address the question of what happens in the cells of the child’s brain while it learns. The development of electronic circuits that can calculate or make choices suggested how brain cells, which are activated by electrical signals, could be organised. And in turn, studies of how humans and other living things perceive and learn have led to computer programmes that mimic the brain.

The understanding is that when the infant brain receives stimuli, from the retina or from the ear-drum, for instance, brain cells react at random. When stimuli are associated with a pleasant event, like the presence of the mother, some brain cell responses to those stimuli are selected and get strengthened, and would then be repeated. Building on units of perception like this, the brain learns to associate shapes, sounds or actions with objects, people or events. And as the child grows, this progresses to skills, to language, reading, writing, and so on.
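The selective strengthening described above is often modelled with Hebbian learning, in which a connection grows whenever the cells at both its ends are active together. A minimal sketch, with invented numbers (the learning rate and the firing threshold of 1.0 are illustrative, not biological):

```python
# Toy Hebbian learning: the weight of a connection is strengthened
# whenever the input cell (stimulus) and the output cell (response)
# fire together. All numbers here are illustrative.
def hebbian_update(weight, pre_active, post_active, lr=0.1):
    return weight + lr * pre_active * post_active

weight = 0.0
for _ in range(20):                        # 20 paired presentations
    weight = hebbian_update(weight, 1, 1)  # stimulus and response co-occur

# After repetition the connection is strong enough that the stimulus
# alone can drive the response (taking 1.0 as the firing threshold).
print(round(weight, 6))  # 2.0, well above the threshold
```

Connections that are never active together stay at their initial weight, which is the sense in which some responses are "selected" and others are not.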

In the field of Artificial Intelligence, typical stimuli are broken down and represented by collections of numbers. An image, for instance, becomes the intensities of its pixels; a sound, the mix of frequencies it contains. The AI system then carries out a calculation on the numbers, to select from a set of results. If the images are of digits, for instance, the results could be the numbers from 0 to 9. There is then an arrangement of feedback, depending on whether the selection was correct or not, and based on the feedback, the system modifies the calculation, to come closer to the correct result.
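A concrete sketch of that loop: the toy "images" below are 3×3 grids of pixel intensities (invented for illustration, not real handwriting data), and a simple perceptron-style rule corrects its calculation whenever the feedback says the selection was wrong.

```python
# Two toy 3x3 "images", given as lists of pixel intensities.
ZERO = [1, 1, 1,
        1, 0, 1,
        1, 1, 1]   # a ring of bright pixels
ONE  = [0, 1, 0,
        0, 1, 0,
        0, 1, 0]   # a vertical bar

def predict(weights, bias, pixels):
    """The 'calculation on the numbers': a weighted sum, then a choice."""
    score = bias + sum(w * p for w, p in zip(weights, pixels))
    return 1 if score > 0 else 0

def train(samples, epochs=10, lr=0.5):
    weights, bias = [0.0] * 9, 0.0
    for _ in range(epochs):
        for pixels, label in samples:
            error = label - predict(weights, bias, pixels)  # the feedback
            # Modify the calculation in proportion to the error.
            weights = [w + lr * error * p for w, p in zip(weights, pixels)]
            bias += lr * error
    return weights, bias

weights, bias = train([(ZERO, 0), (ONE, 1)])
print(predict(weights, bias, ZERO), predict(weights, bias, ONE))  # 0 1
```

After a few rounds of feedback and correction, the same calculation that started from all-zero weights selects the correct digit for both patterns.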

The calculation used can be complex, to take into account many kinds of variable factors, and the process of feedback and correction can be repeated a huge number of times. The system then gets quite good at recognising specific shapes, be they handwritten digits, alphabets, objects or faces of people. With sounds, the same process can identify phonemes, or units of speech, and the system can put sounds together as words and write them out, to work as a dictation machine, or listen to a piece of music and write out the staff notation for each of the instruments.
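The "mix of frequencies" that such a system starts from can be computed directly. The sketch below builds an artificial "sound" out of two pure tones and measures the strength of individual frequency components using one bin of a discrete Fourier transform (the tones and amplitudes are invented for illustration):

```python
import math

# A "sound" as a list of samples: a mix of two pure tones,
# one at full strength and one at half strength.
N = 64  # samples in the analysis window
samples = [math.sin(2 * math.pi * 5 * t / N)
           + 0.5 * math.sin(2 * math.pi * 12 * t / N)
           for t in range(N)]

def tone_strength(samples, freq):
    """Amplitude of one frequency component: one bin of a discrete
    Fourier transform over the whole window."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq * t / n)
             for t, s in enumerate(samples))
    im = sum(s * math.sin(2 * math.pi * freq * t / n)
             for t, s in enumerate(samples))
    return 2 * math.sqrt(re * re + im * im) / n

print(round(tone_strength(samples, 5), 2),    # 1.0 - present, full strength
      round(tone_strength(samples, 12), 2),   # 0.5 - present, half strength
      round(tone_strength(samples, 9), 2))    # 0.0 - absent
```

A list of such strengths across many frequencies is the numerical representation of the sound that a phoneme-recognising system would then run its calculation on.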

There could even be a method of ‘parsing’ the words, that is, analysing the relationships of the nouns, the verbs, and so on. It would, however, be instructive to understand the mechanics of how the simplest components of language are processed in rudimentary brains, such as those of animals.

The response, and ‘trainability’, of animals, like dogs, even chickens and fleas, to sounds is understood as a case of conditioned response. In the classic experiment, a neutral stimulus, like a bell, was paired with a biological stimulus, food, which resulted in salivation. Repetition resulted in conditioning, so that the bell, a neutral stimulus by itself, led to salivation.
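The gradual build-up of such an association under repetition is commonly modelled with the Rescorla-Wagner rule, in which the bell's associative strength moves a fraction of the way toward the outcome on every pairing. A minimal sketch, with an illustrative learning rate:

```python
# Rescorla-Wagner model of classical conditioning: on each bell+food
# pairing, the bell's associative strength moves a fraction (the
# learning rate) of the remaining distance toward the outcome.
def condition(trials, lr=0.3, outcome=1.0):
    strength = 0.0      # the bell starts out neutral
    history = []
    for _ in range(trials):
        strength += lr * (outcome - strength)
        history.append(strength)
    return history

history = condition(10)
# The association rises quickly at first, then levels off near 1.0,
# matching the diminishing returns seen in conditioning experiments.
print(round(history[0], 2), round(history[-1], 2))  # 0.3 0.97
```

The same curve describes the dog learning "sit": each rewarded repetition adds a little less than the one before, until the response is reliable.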

To train a dog, the word, “sit”, for instance, is spoken, and the dog is encouraged to sit, perhaps by a tap on the hind quarters. If the dog sits, and every time she sits in response to the command, she is rewarded with a treat. As before, there is conditioning, and the dog learns to sit every time she hears the word. This, of course, may not amount to ‘understanding’ of language. But, in the case of the dog, the Budapest paper says, intense exposure to human speech creates powerful familiarity with a large number of words. “This makes dogs a useful comparison species for exploring the evolutionary bases of human voice and speech perception,” the paper says.

In the experiment, dogs that had been trained to lie still in an MRI scanner were played recorded speech, while the activity of the primary and secondary auditory cortex of their brains was scanned. The idea was to see which portions of the brain were excited when the dogs heard a familiar language, an unfamiliar language, or scrambled sounds derived from either language.

The results were that the primary auditory cortex showed different activation when what was heard was natural speech, as opposed to scrambled sounds. The effect was the same with both the familiar and the unfamiliar language. The languages, in fact, were Hungarian and Spanish, which, the paper says, have similar rhythms of vowels and consonants. The result hence indicates that the natural flow of speech had been internalised, and its absence in the scrambled sounds was detected. It was also found that longer-headed dogs were more adept, suggesting a physical basis.

The secondary auditory cortex, in contrast, showed different activations when what was heard was the familiar or the unfamiliar language. And here, the difference was more marked in older dogs. This suggests that dogs are able to learn the specific regularity, or rhythm, characteristic of a language through exposure.

The study shows that the dog brain has the capacity to detect speech naturalness and distinguish between languages. That there are different portions of the brain that handle the two kinds of discrimination may lead to greater understanding of how language is processed, by living things, or could be processed by a machine.

------------------------------------------------------------------------------------------
Do respond to: response@simplescience.in
-------------------------------------------