AI Learns to Recognize Animal Emotions by Facial Expressions

Scientists have developed AI systems that can detect pain, stress, and illness in animals by analyzing photographs of their faces. The UK-developed Intellipig system can spot discomfort in pigs, while researchers at Israel's University of Haifa (UH) have trained algorithms to recognize stress in dogs. In an experiment at the University of São Paulo (USP), AI identified pain in horses with up to 88% accuracy. These technologies could transform veterinary diagnostics and significantly improve animal welfare.

Image source: Virginia Marinova / Unsplash

The Intellipig system, developed by English scientists from the University of the West of England in Bristol (UWE Bristol) together with Scottish researchers from Scotland's Rural College (SRUC), is designed to monitor the condition of pigs on farms. The AI analyzes photographs of the animals' faces, looking for three key markers: pain, malaise, and emotional distress. Farmers receive automatic notifications, allowing them to respond promptly when an animal's condition deteriorates and to improve the efficiency of their operations.
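
As an illustration only, not the Intellipig team's published pipeline, the notification step could look roughly like the sketch below; the per-marker scores, threshold, and alert wording are all assumptions:

```python
# Illustrative sketch (hypothetical, not Intellipig's actual code): turning
# per-face scores for the three markers into a farmer notification.
# The upstream scoring model, threshold, and alert channel are assumptions.
from dataclasses import dataclass

MARKERS = ("pain", "malaise", "emotional_distress")
THRESHOLD = 0.7  # assumed per-marker alert threshold

@dataclass
class FaceAssessment:
    pen_id: str
    scores: dict  # marker name -> model confidence in [0, 1]

def alerts_for(assessment: FaceAssessment) -> list[str]:
    """Return human-readable alerts for markers above the threshold."""
    return [
        f"Pen {assessment.pen_id}: possible {marker.replace('_', ' ')} "
        f"(score {assessment.scores[marker]:.2f})"
        for marker in MARKERS
        if assessment.scores.get(marker, 0.0) >= THRESHOLD
    ]

# Example: scores produced by an upstream face-analysis model
sample = FaceAssessment(pen_id="B12",
                        scores={"pain": 0.82, "malaise": 0.40,
                                "emotional_distress": 0.75})
for message in alerts_for(sample):
    print(message)  # in practice this would be pushed to the farmer's device
```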

Meanwhile, a UH research team is adapting machine learning technologies to work with dogs. Previously, the scientists developed AI algorithms used in facial recognition systems to find lost pets. Now, these algorithms are being used to analyze the animals’ facial expressions to detect signs of discomfort. It turns out that 38% of dogs’ facial movements are similar to those of humans, which opens up new possibilities for studying their emotional state.

Traditionally, such AI systems rely on humans to do the legwork of identifying what different animal behaviors mean, based on long-term observation of animals in a variety of situations. USP, however, recently ran an experiment in which the AI independently analyzed photographs of horses taken before and after surgery, as well as before and after the administration of painkillers. The AI examined the horses' eyes, ears, and mouths to determine whether they were in pain. According to the study, the AI identified signs of pain with 88% accuracy, which confirms the effectiveness of this approach and opens up prospects for further research.
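
A minimal sketch of how such a before/after experiment could be framed as a binary image-classification task is shown below; the folder names, model choice, and hyperparameters are assumptions for illustration, not details from the USP study:

```python
# Minimal sketch (assumed setup, not the USP team's code): fine-tuning a
# pretrained image model as a binary "pain / no pain" classifier on face photos.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical folder layout: horse_faces/pain and horse_faces/no_pain,
# e.g. photos taken before vs. after analgesia.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("horse_faces", transform=preprocess)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# Start from a generic pretrained backbone and replace the final layer
# with a single-logit head for the binary pain decision.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCEWithLogitsLoss()

model.train()
for epoch in range(5):  # assumed number of epochs
    for images, labels in loader:
        logits = model(images).squeeze(1)
        loss = loss_fn(logits, labels.float())
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Reported figures such as the 88% accuracy would then come from evaluating a model like this on photographs held out from training.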
