Is There a Better Way to Deal With Wild Cattle?
The company’s goals for their project were relatively straightforward: They wanted to use facial recognition to enable their feeders to provide individualized diets for specific animals. And by the end of the project, Baxter said, they had developed a neural network that could identify individual pigs with 97% accuracy using a basic webcam.
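Baxter's actual network architecture isn't described here. Purely as an illustration of the general idea — matching a face "embedding" produced by a neural network against known individuals — the following sketch uses a nearest-centroid lookup. The embeddings, pig IDs, and distance threshold are all invented for the example; a real system would derive the feature vectors from a trained model, not hard-code them.

```python
# Illustrative sketch only: identifying an individual animal from a
# face "embedding" by nearest centroid. The embeddings, IDs, and
# threshold are hypothetical; in practice a trained neural network
# produces the feature vectors from webcam frames.
import math

# Hypothetical per-pig average embeddings (real embeddings would have
# hundreds of dimensions; 3 are used here for readability).
KNOWN_PIGS = {
    "pig_07": (0.9, 0.1, 0.3),
    "pig_12": (0.2, 0.8, 0.5),
    "pig_31": (0.4, 0.4, 0.9),
}

def identify(embedding, known=KNOWN_PIGS, max_dist=0.5):
    """Return the closest known individual, or None if nothing is near."""
    best_id, best_dist = None, float("inf")
    for pig_id, centroid in known.items():
        dist = math.dist(embedding, centroid)
        if dist < best_dist:
            best_id, best_dist = pig_id, dist
    return best_id if best_dist <= max_dist else None

print(identify((0.85, 0.15, 0.25)))  # near pig_07's centroid
print(identify((0.0, 0.0, 0.0)))     # far from everyone -> None
```

The distance threshold is what lets such a system say "unknown animal" rather than forcing a match — one reason reported accuracy depends heavily on how the system is evaluated.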
Baxter, however, came to believe that the software could do more. If the network could identify pigs based on minute facial details, then it might also be able to tell what the pigs were feeling emotionally based on tiny changes in facial expression (technology that is also being explored for humans).
“We ourselves can see changes in their body posture and facial expressions, so we were quite confident that this would show up,” Baxter said.
Baxter started with a relatively simple experiment — measuring stress in first-time mother sows introduced into pens with larger, more experienced sows. Older pigs are often domineering and bully their younger neighbors, giving researchers an opportunity to observe distress without subjecting animals to unduly harsh conditions. Before long, his team had trained a computer to detect distress in pigs with 92% accuracy as judged against human assessment — a figure that has since improved with further experimentation.
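A figure like "92% accuracy as compared to human assessment" is an agreement rate: the fraction of observations where the model's label matches a human scorer's. As a minimal sketch of that computation — the label sequences below are invented, not Baxter's data:

```python
# Minimal sketch of how an agreement figure like "92% accuracy versus
# human assessment" is computed. The label sequences are invented.
def agreement(model_labels, human_labels):
    """Fraction of observations where the model matches the human scorer."""
    assert len(model_labels) == len(human_labels)
    matches = sum(m == h for m, h in zip(model_labels, human_labels))
    return matches / len(model_labels)

human = ["calm", "distressed", "calm", "calm", "distressed"]
model = ["calm", "distressed", "calm", "distressed", "distressed"]
print(f"{agreement(model, human):.0%}")  # 4 of 5 match -> 80%
```

Note that agreement with a human scorer measures consistency with human judgment, not ground truth — which is why researchers also validate against physiological measures.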
Around the world, a small but growing number of researchers and technologists are building similar technologies: machine learning-based systems that monitor farm animals and determine whether they are happy or distressed. Most, like Baxter’s system, employ cameras to look for subtle changes in expression or posture that may indicate how an animal feels. Others draw on additional sources of input: infrared cameras to monitor body temperature or microphones to collect vocalizations.
Many of these AI models have astounded the researchers developing them in terms of their ability to consistently and accurately identify nonverbal signs of emotional wellbeing. But the AI doesn’t always produce the results researchers expect — and many of the same researchers developing this technology have begun to develop misgivings about how it might be used.
Reading the Animal Mind
The study of animal emotions dates as far back as Darwin, but the field has seen a resurgence of interest since 1995, according to Dominique Blanche, associate professor of agriculture at the University of Western Australia. Since then, studies have demonstrated that animals have brain structures similar to ours and display neural activity that parallels many human emotions. Sheep — Blanche’s personal species of choice — have been shown to have measurable physiological stress responses to novel experiences and to the degree of control they have over their lives. Ducks appear to experience frustration, with a measurable change in body temperature, when prevented from doing something they want to do. Cows not only bond with other specific friends within their herds, but have a lower heart rate and show signs of feeling relaxed when they are near these preferred individuals.
Whether or not animals experience consciousness or feel and interpret emotions the same way humans do remains a matter of fierce debate, Blanche said. But there is no question, he continued, that farm animals like pigs, cattle, sheep, and even poultry have physiological experiences that resemble our own emotional responses.
There is also a growing consensus that more content animals are more productive animals. Like humans, animals that experience long-term stress seem to grow prone to chronically poor health, said Caroline Lee, principal research scientist on the animal behavior and welfare team at Australia’s national science agency, the Commonwealth Scientific and Industrial Research Organisation. Stressed and anxious animals invest more energy in managing those emotional states, which leaves them with less energy to grow. It also erodes their immune health, making them more prone to disease. And in general, Lee said, farmers report that calmer animals are easier to handle, which cuts down on labor.
All that seems to point toward an obvious use-case for AI systems trained to identify emotional states in animals: Make the animals happier to improve their productivity and decrease farm costs.
The potential has brought a variety of software developers and researchers to the field. But first they had to figure out how to interpret animal emotions themselves.
Fortunately, scientific research has already greatly expanded our understanding of animal emotions and expression, Lee said. Researchers have used pharmaceuticals to induce particular states — anxiety, depression, and even happiness — in animals, though Lee notes that happiness is difficult to manipulate reliably by pharmacological means. Once the emotional state is physiologically validated, scientists can then observe the animals to associate behaviors with specific emotional states.
Some of the results are what you might expect. Anxious animals tend to be more vigilant, holding themselves in an erect posture with their heads held up. Distressed animals will make higher-pitched vocalizations.
Researchers like Suresh Neethirajan, a professor at Dalhousie University in Canada, have taken this process one step further, inducing emotional states in animals and then training artificial intelligence to recognize them. Neethirajan and his team used stimuli such as loud noises or toys to prompt certain states in farm animals while using cameras and other sensors to record their responses. The software can now reliably identify 13 emotional states in cows and pigs, Neethirajan said.
Chickens, Neethirajan said, are a little bit harder, but they’re on the AI to-do list as well. Mammals like pigs and cows tend to show their emotions in ways that might be more familiar to humans — ear posture, changes around the eyes and lips. Chickens have fewer facial muscles, making them less expressive and harder to read. But they do behave and vocalize differently, depending on what is going on in their environment.
“The data clearly show that chickens do have emotions, and the emotions can be easily measured by changes in respiration rate — there’s a sudden drop of temperature in the beak region, and the model clearly shows that when they are under stress, the birds become quiet,” Neethirajan said.
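Neethirajan's actual system is a trained machine-learning model whose internals aren't described here. But the signals he names — a sudden drop in beak-region temperature together with the birds going quiet — can be illustrated with a simple rule-based sketch. The thresholds and sensor readings below are hypothetical:

```python
# Illustrative rule-based sketch of the stress signals Neethirajan
# describes in chickens: a sharp drop in beak-region temperature
# combined with reduced vocalization. Thresholds and readings are
# hypothetical; the real system is a learned model, not fixed rules.
def flag_stress(beak_temps_c, calls_per_min,
                temp_drop_threshold=1.5, quiet_threshold=5):
    """Flag stress when temperature falls sharply and the birds go quiet."""
    if len(beak_temps_c) < 2:
        return False
    temp_drop = beak_temps_c[0] - beak_temps_c[-1]
    return temp_drop >= temp_drop_threshold and calls_per_min <= quiet_threshold

# Beak temperature falls from 33.0 to 31.0 C while calls drop to 2/min.
print(flag_stress([33.0, 32.4, 31.0], calls_per_min=2))   # True
print(flag_stress([33.0, 33.1, 32.9], calls_per_min=40))  # False
```

Requiring both signals to agree is one simple way to cut false alarms from a single noisy sensor — a theme that recurs later in the article with the tail-biting system.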
All that may sound rather straightforward, but some of his research has come to rather disturbing conclusions as well. In one experiment, Neethirajan said, his team monitored the reaction of pigs who were being shipped from one farm to another, and another group of pigs being sent to the slaughterhouse. And even though they had no reason to know where they were going, the pigs en route to the slaughterhouse showed greater distress than those moving between farms.
“We still need a bit more concrete evidence,” Neethirajan said, “but I have a hunch based on the preliminary data that they have an understanding — that these animals can somehow sense they are going to be killed.”
A Moral Quandary
Any discussion of animal emotions can, for understandable reasons, become uncomfortable for some animal producers. Indeed, no full-time animal producers responded to requests for comment on this story.
On the whole, Lee said, she believes producers want to do the right thing by their animals and maintain high standards for animal welfare. But there is also a fear that society will impose regulations on them that would make farms difficult to manage profitably, she said.
Neethirajan, for his part, opted out of the system altogether: He went vegetarian after the findings from the slaughterhouse experiments. But he recognizes a need for animal products in the human diet. We don’t yet have the ability to feed the global population with plants alone, and he himself still eats eggs and dairy. And yet he remains conflicted about the use of the technology he’s helped to create. The obvious application, Neethirajan said, is to reduce labor costs and increase the productivity of farm animals. But should we really be focused on profit when we’re talking about living, thinking, feeling beings?
There’s also a disconnect, Baxter said, between what technologists believe AI can do, and the reality on the ground. Sure, you can get AI to read animal facial expressions via a camera. But farm animals don’t always look up at cameras, and they love to chew equipment.
Although AI has proven effective at monitoring signs of basic emotional states in animals, it has been harder to use computers to monitor more nuanced emotional expression. Older animals, Baxter said, are more difficult for the AI to read than younger animals. And while the rate of accuracy for using AI to detect negative emotions such as stress and fear is relatively high — approaching 99% accuracy compared to human and physiological measures in some trials — positive emotions, like happiness, have proven more difficult to parse.
Going beyond the emotional state to try to interpret specifics about farm conditions or animal behaviors is also tricky, according to Rick D’Eath, a colleague of Baxter’s at Scotland’s Rural College. Outside the college’s work on facial recognition, D’Eath collaborated on a system intended to predict a seemingly intractable issue on pig farms: tail biting. Unhappy, bored pigs will bite each other’s tails, leading to injury and infection. So D’Eath and his team set up a computerized camera system intended to monitor pigs for signs of a potential tail-biting outbreak.
The theory behind the system was relatively straightforward: When pigs start to worry about being bitten by other pigs, they tend to hold their tails closer to the body and back into corners to protect themselves. So the camera system was intended to detect these behaviors and alert the farmers when signs of tail biting emerged.
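The detection pipeline itself isn't described in detail, but the alerting logic can be sketched in a few lines. The observation format and threshold below are invented; the real system infers tail posture from overhead cameras with computer vision.

```python
# Hypothetical sketch of the alerting logic described above: flag a
# pen when a rising share of pigs hold their tails low. The input
# format and threshold are invented for illustration; the real system
# derives tail posture from camera footage.
def tail_biting_alert(tail_down_flags, alert_fraction=0.3):
    """Alert when the share of pigs with lowered tails crosses a threshold."""
    if not tail_down_flags:
        return False
    share = sum(tail_down_flags) / len(tail_down_flags)
    return share >= alert_fraction

# One boolean per pig in the pen: True means tail held low.
pen = [True, True, False, False, False, False, False, True, False, True]
print(tail_biting_alert(pen))  # 4 of 10 tails down -> alert
```

The weakness of any such threshold is exactly the one D'Eath's team ran into: the signal fires whenever tails go down, whatever the cause.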
At first the system worked perfectly, D’Eath said. But as time went on and they tested it at more farms, the system started to flag a growing number of false positives. Tail posture, it seems, wasn’t exclusively related to tail biting.
“We initially thought we could produce a system that we could say, this is an early warning system that will warn of tail biting outbreaks,” he said. “Now we have a system that can tell you there is something amiss in that pen and you should go check it, but we can’t be specific what it is.”
That, as you might imagine, is not the winning sales pitch that D’Eath started with. And it raises questions about the potential for increased mistakes and animal welfare concerns — particularly if AI is used to reduce the ratio of human farmers to animals on farms.
Monetizing Welfare — A Different Approach
As an advocate for animal welfare, Sarah Ison, head of research at the advocacy group Compassion in World Farming, has her own concerns about the use of artificial intelligence on farms — especially if it leads to more factory farming and weakens the connection between people, farmers, and the animals they raise for food.
But she also sees potential benefits. Many species of animals, especially farm animals, will naturally try to hide signs of pain and vulnerability. The inability to know with any certainty whether an animal is in pain has led to recommendations for veterinarians to give all animals with the same condition a standardized dose of pain meds, which could lead to over- or under-treatment, depending on how an individual animal responds. AI monitoring systems could give us the ability to better understand when and how to manage pain in animals, Ison said.
AI could also give farm animals a greater sense of autonomy, allowing them to signal when they want to go inside or outside, Ison said. Or the benefits could be as simple as recognizing animals by their faces alone, eliminating the need to mutilate animals with ear tags for identification, Baxter said.
And while no one denies the potential for AI to accelerate trends toward fully robotized and dehumanized farming, Neethirajan believes it could also increase our connection to farm animals. If we could get beyond the obvious applications, he said, he can envision a world where AI monitors and interprets how animals are feeling — and then uses that information to generate welfare reports for consumers.
Ison notes that farmers have already begun to parlay consumer concerns about animal welfare into improved prices for their products, charging more for cage-free eggs, free-range chickens, and grass-fed beef. But it’s not always clear what these labels mean or how they are verified.
AI, Neethirajan said, could bring true transparency to these labels by monitoring whether an animal truly led a happy life and generating a comprehensive — and reasonably independent — report on which consumers and retailers could base their purchasing decisions.
The current gold standard for animal welfare monitoring is someone who “goes and looks at a farm, and it’s a one-day visit,” D’Eath said. With AI, “we are talking about 24-7 monitoring. But we’re not there yet.”