
Using AI to decode dog vocalizations

Jun 15, 2024 - news.umich.edu
Researchers at the University of Michigan have developed an AI tool that can identify different types of dog barks, as well as the dog’s age, sex, and breed. The tool was developed in collaboration with Mexico’s National Institute of Astrophysics, Optics and Electronics (INAOE) in Puebla. The AI models used were initially trained on human speech, and the researchers found that they could be repurposed to interpret animal communication. The lack of publicly available data on animal vocalizations has been a challenge, and the researchers worked around it by starting from an existing model designed for human speech.

The researchers used a dataset of vocalizations from 74 dogs of varying breed, age, and sex, recorded in different contexts. These recordings were fed to Wav2Vec2, a model pretrained on human speech, which was then used to interpret the vocalizations. The model succeeded on four classification tasks and outperformed models trained specifically on dog bark data, reaching up to 70% accuracy. This research could have significant implications for animal welfare, as understanding dog vocalizations could improve human interpretation of dogs' emotional and physical needs.
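The article does not include the researchers' code, but the general recipe of repurposing a speech model for dog vocalizations can be sketched with the open-source Wav2Vec2 implementation in Hugging Face Transformers. The checkpoint name, label set, and classification head below are illustrative assumptions, not the study's actual setup.

    # Minimal sketch: adapt a Wav2Vec2 checkpoint pretrained on human speech
    # to an audio classification task such as bark-context recognition.
    # Checkpoint, labels, and pipeline are assumptions for illustration only.
    import torch
    from transformers import AutoFeatureExtractor, Wav2Vec2ForSequenceClassification

    # Hypothetical label set for one classification task (bark context).
    labels = ["play", "alert", "distress", "attention"]

    feature_extractor = AutoFeatureExtractor.from_pretrained("facebook/wav2vec2-base")
    model = Wav2Vec2ForSequenceClassification.from_pretrained(
        "facebook/wav2vec2-base",
        num_labels=len(labels),
    )

    # A dummy 1-second clip at 16 kHz stands in for a recorded dog vocalization;
    # in practice the model would first be fine-tuned on labeled recordings.
    waveform = torch.randn(16000)
    inputs = feature_extractor(
        waveform.numpy(),
        sampling_rate=16000,
        return_tensors="pt",
    )

    with torch.no_grad():
        logits = model(**inputs).logits
    predicted = labels[int(logits.argmax(dim=-1))]
    print(predicted)

The key design choice reported in the article is starting from representations learned on human speech rather than training from scratch, which compensates for the small amount of labeled dog-bark audio available.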

Key takeaways:

  • An AI tool developed at the University of Michigan can identify different types of dog barks, as well as the dog’s age, sex, and breed.
  • The AI models were initially trained on human speech and then repurposed to understand animal vocalizations, in a collaboration with Mexico’s National Institute of Astrophysics, Optics and Electronics.
  • One of the main challenges in developing these AI models is the lack of publicly available data on animal vocalizations, which are difficult to collect.
  • The research has implications for animal welfare, as understanding dog vocalizations could improve human interpretation of dogs' emotional and physical needs.
