How do dogs see the world? Researchers decode visual cognition in the dog brain

Dogs and humans have co-evolved over the past 15,000 years, and today dogs commonly live in human homes as pets. Sometimes dogs even watch videos at home like people do, as if they understand what they are seeing.

So, what does the world look like in the eyes of dogs?

A study from Emory University has decoded visual images from a dog’s brain, revealing for the first time how a dog’s brain reconstructs what it sees. The research was published in the Journal of Visualized Experiments (JoVE).


Paper address: https://www.jove.com/t/64442/through-dog-s-eyes-fmri-decoding-naturalistic-videos-from-dog

The researchers recorded fMRI neural data from two awake, unrestrained dogs while they watched 30 minutes of video three times, for a total of 90 minutes. They then used machine learning algorithms to analyze patterns in the neural data.

“We can monitor a dog’s brain activity as it watches a video and reconstruct, to some extent, what it’s seeing,” said Gregory Berns, a professor of psychology at Emory University and one of the paper’s authors. “It’s amazing.”

Berns and colleagues pioneered the use of fMRI scans in dogs and trained them to be completely still and unrestrained while measuring neural activity. A decade ago, the team released the first fMRI brain images of a fully awake, unrestrained dog, opening the door to what Berns calls “The Dog Project.”


Berns and Callie, the first dog whose brain activity was scanned while fully awake and unrestrained.

Over the years, Berns’ lab has published several studies on how the canine brain processes vision, language, smell, and rewards such as receiving praise or food.

At the same time, advances in machine learning have allowed scientists to decode some of the human brain’s activity patterns. Berns then wondered whether a similar technique could be applied to a dog’s brain.

The new research builds on machine learning and fMRI. fMRI is a non-invasive neuroimaging technique that uses magnetic resonance imaging to measure hemodynamic changes caused by neuronal activity, and it plays an important role in localizing brain function. Besides humans, the technology has been applied to only a few other species, including some primates.

Research introduction
The experiment, conducted with two dogs, demonstrates that techniques such as machine learning and fMRI can be applied broadly to the study of canine cognition, and the researchers hope the work will help others gain a deeper understanding of how different animals think.

The experimental process is roughly as follows:

Experiment participants: Bhubo, 4 years old, and Daisy, 11 years old. Both dogs had previously participated in several fMRI sessions (Bhubo: 8, Daisy: 11), some of which involved viewing visual stimuli projected onto a screen. The two dogs were chosen for their ability to stay inside the scanner for extended periods and remain still even when their owner was out of sight.

Video recording: Videos were shot from a dog’s perspective to capture everyday scenes in a dog’s life, including walking, feeding, playing, and interactions with humans and other dogs. The footage was edited into 256 unique scenes, each depicting an event such as a dog being hugged by a human, a dog running, or a walk. Each scene was assigned a unique number and a label based on its content. The scenes were then edited into five larger compilation videos of approximately 6 minutes each.
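The scene-numbering and labeling step described above can be sketched as a small lookup structure. The scene IDs and label names below are illustrative placeholders, not the study's actual annotations:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    scene_id: int                                  # unique number assigned to each clip
    objects: list = field(default_factory=list)    # object-based labels (who/what appears)
    actions: list = field(default_factory=list)    # action-based labels (what is happening)

# Hypothetical annotations mirroring the kinds of events described above.
scenes = [
    Scene(1, objects=["dog", "human"], actions=["hugging"]),
    Scene(2, objects=["dog"], actions=["running"]),
    Scene(3, objects=["dog", "human"], actions=["walking"]),
]

# Collect every action label in use, e.g. to define classifier targets later.
action_vocab = sorted({a for s in scenes for a in s.actions})
print(action_vocab)  # ['hugging', 'running', 'walking']
```

Separating object labels from action labels in this way matters later, since the study evaluates object-based and action-based classifiers independently.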


Experimental design: The dogs were scanned with a 3T MRI while watching the compiled videos projected onto a screen behind the scanner bore. The dogs had been pre-trained to place their heads in a custom chin rest to maintain a stable head position.


The viewing was divided into three sessions of 30 minutes of video each, for a total of 90 minutes.

During the experiment, the dogs were simultaneously scanned with fMRI, and the resulting data were analyzed. The analysis used the Ivis machine learning algorithm, a nonlinear method based on siamese neural networks (SNNs) that has been used successfully to analyze high-dimensional biological data. In addition, the experiments used the scikit-learn library and a random forest classifier (RFC).
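As a rough illustration (not the authors' actual pipeline), decoding fMRI patterns into stimulus labels can be framed as supervised classification: voxel activity vectors are the features and scene labels the targets. The sketch below uses a scikit-learn random forest on synthetic data standing in for voxel time points; in the study, an Ivis embedding would first reduce the high-dimensional voxel data before classification.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: 600 time points x 200 voxels.
# Real data would be motion-corrected BOLD volumes masked to the brain.
n_samples, n_voxels, n_classes = 600, 200, 3
labels = rng.integers(0, n_classes, size=n_samples)  # e.g. 0=dog, 1=human, 2=play

# Give each class a distinct mean voxel pattern plus noise,
# so the classifier has real signal to find.
class_patterns = rng.normal(size=(n_classes, n_voxels))
X = class_patterns[labels] + rng.normal(scale=0.5, size=(n_samples, n_voxels))

# In the study, an Ivis (siamese-network) embedding would reduce
# dimensionality at this point; we skip that step for brevity.
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

Holding out a test split, as above, is what gives decoding accuracies like those reported later meaning: the classifier must generalize to brain responses it has never seen.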


Daisy being scanned, with her ears taped to hold the noise-cancelling earbuds in place.

The study compared how the human and dog brains work. Results from two human subjects showed that a neural-network model mapped brain data to both object- and action-based classifiers with 99 percent accuracy. Applied to the dogs, the same approach did not work for the object classifiers, but achieved 75%–88% accuracy when decoding the action classifiers. This points to a major difference in how human and dog brains work, as shown in the experimental results for humans (A) and dogs (B) below. Berns concludes: “We humans are very concerned about what we see, while dogs seem to care less about who or what they see and more about the action itself.”
