When Erin Moreland considered becoming a zoologist, she imagined spending her days sitting on cliffs, drawing seals and other animals to document their lives, understand their behavior, and protect their habitat. The scientist from the United States National Oceanic and Atmospheric Administration (NOAA) was convinced there had to be a technological solution to help her team classify the millions of aerial images of ice it gathers each year. The answer was artificial intelligence (AI).
But her start as a zoologist, back in 2007, was rather different from what she had dreamed of: Moreland spent long hours in front of a computer screen, scanning thousands of aerial photographs for any sign of life between the ice and the waters of Alaska. She and her team took 90,000 snapshots and spent months reviewing them, finding only about 200 seals. The work was so slow and tedious that organizing the data took many months, and by the time the research was published the information was already out of date. Today, Erin and her colleagues at NOAA have a technology solution that lets them classify millions of aerial images in a fraction of that time. Researchers at the agency now use artificial intelligence to monitor endangered beluga whales, ice seals, polar bears and other wildlife.
Understanding animal behavior
Right now, interdisciplinary teams are training the AI tool to distinguish, for example, a seal from a rock and the sound of a whale from the whine of a dredge, while trying to understand the behavior of marine mammals and help them survive in an environment with ever more human activity. Specifically, Moreland’s project combines AI with photos taken by upgraded cameras aboard a NOAA turboprop aircraft over the Beaufort Sea in northern Alaska. The system will scan and classify the images and produce a population count of ice seals and polar bears, with results ready in hours instead of months. Manuel Castellote, a scientist affiliated with NOAA, will apply a similar algorithm to recordings collected by equipment moored at the bottom of Cook Inlet in Alaska, helping him quickly decipher how the dwindling beluga population spent the winter.
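To make the counting step more concrete, the sketch below shows, in Python, what such a pipeline could look like: run a detector over a folder of aerial survey images and tally detections per species. The species list and the classify_image function are hypothetical placeholders for a trained model, not NOAA’s actual code.

```python
# Minimal sketch of the counting step: run a detector over a folder of aerial
# survey images and tally detections per species. classify_image is a
# hypothetical stand-in for the trained detector.
from collections import Counter
from pathlib import Path

SPECIES = ["bearded_seal", "ringed_seal", "spotted_seal", "ribbon_seal", "polar_bear"]

def classify_image(path: Path) -> list[str]:
    """Placeholder for the trained detector: return the species found in one image."""
    return []  # a real model would return zero or more species labels per image

def count_population(image_dir: str) -> Counter:
    counts: Counter = Counter()
    for image_path in sorted(Path(image_dir).glob("*.jpg")):
        counts.update(classify_image(image_path))
    return counts

if __name__ == "__main__":
    # Hypothetical survey folder name, for illustration only.
    print(count_population("survey_flight_2021"))
```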
The data gleaned from the photographs and the recordings will be processed by scientists, analyzed by statisticians, and then reported to officials such as Jon Kurland, NOAA’s regional administrator for protected resources in Alaska. Kurland’s office in Juneau, Alaska, oversees marine mammal conservation and recovery programs throughout the region and helps guide every federal agency that issues permits or takes actions that could affect threatened or endangered species.
Technology helps threatened species
There are four types of ice seals in the Bering Sea: bearded, ringed, spotted, and ribbon. Bearded and ringed seals are classified as threatened, meaning they are likely to become endangered in the foreseeable future. The Cook Inlet beluga whale, by contrast, is already endangered: according to 2019 data, its population has fallen to 279 individuals, down from an estimated 1,000 just 30 years ago.
Spotted seals look similar to harbor seals, but use floating ice rather than land to give birth and molt. The situation is worrying, and mitigating the impact of human activities such as construction and shipping during breeding and feeding seasons, and in the places where they occur, is a priority. That requires up-to-date information on where threatened species tend to breed, for example. “We don’t currently have the basic information, so getting it will give us a much clearer picture of the kinds of responses that might be needed to protect these animal populations. For both whales and seals, these data analyses translate into cutting-edge science, filling gaps that we have no other way to fill,” says Kurland.
How AI is used in the Arctic
The challenge was complex because, although many image-recognition models exist for identifying people, no model had yet been developed that could tell seal species apart in aerial photographs. Fortunately, the hundreds of thousands of examples that NOAA scientists had classified in previous research helped technologists train AI prototypes to recognize which photographs and recordings contained mammals and which did not. “Part of the challenge was that there were 20 terabytes of image data, and working with that much data on a local computer was not practical at all. We had to make daily hard drive deliveries between Seattle and Redmond to do it. But the Microsoft Azure cloud made it possible to work with all that data and train the AI models,” says Morris.
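A generic sketch of that training step, using PyTorch and torchvision, might look like the following: fine-tune a small pretrained network to decide whether an aerial crop contains a mammal or not. The directory layout, model choice and hyperparameters are illustrative assumptions, not the actual NOAA and Microsoft setup.

```python
# Sketch: fine-tune a pretrained CNN as a binary "mammal / empty" classifier
# on previously labeled aerial image crops (folder names are hypothetical).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
# Expects subfolders such as labeled_crops/train/mammal and labeled_crops/train/empty.
train_set = datasets.ImageFolder("labeled_crops/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and retrain the last layer for 2 classes.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

In practice, the point of moving such a job to the cloud is that the 20 terabytes of imagery and the training runs live next to each other, instead of being shuttled around on hard drives.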
Ice seals generally lead solitary lives, which makes them harder to detect than animals that live in groups. Surveys are also tricky because the plane has to fly high enough not to startle the seals into diving, yet low enough to capture high-resolution photos that let scientists tell a ringed seal from a spotted seal, for example. The often rainy and cloudy Alaskan weather complicates matters further.
Months of recordings
Castellote and his team spend months each year classifying the sound recordings they have collected. The objective is to determine whether a noise is shifting ice, a passing ship, an airplane or the song of a whale, and, in the latter case, whether it is a beluga, a humpback or a killer whale. All that work left little time for the essential task: analyzing communication between whales. The whales navigate by sound, relying on echolocation to find their way, especially in Cook Inlet, where sediment from melting glaciers clouds the water. Noise, which is amplified underwater, can therefore disorient the mammals, leaving them unable to find the ocean floor or the surface, follow their path, catch prey, or detect a predator such as a killer whale. If calves cannot hear their mothers’ whistles, for example, they may become separated from the group and die.
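The automated labeling of those recordings can be pictured roughly as follows: convert each hydrophone clip into a spectrogram and hand it to a classifier. The label set and the classify_spectrogram function below are hypothetical placeholders; this is a sketch of the general approach, not Castellote’s actual algorithm.

```python
# Sketch: label a hydrophone recording by computing its spectrogram and
# passing it to a (placeholder) trained classifier.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

LABELS = ["ice", "ship", "airplane", "beluga", "humpback", "killer_whale"]

def classify_spectrogram(sxx: np.ndarray) -> str:
    """Placeholder for a trained model that maps a spectrogram to one label."""
    return "beluga"  # a real classifier would inspect the time-frequency pattern

def label_recording(path: str) -> str:
    rate, samples = wavfile.read(path)         # hydrophone audio from the mooring
    if samples.ndim > 1:                       # fold multi-channel audio to mono
        samples = samples.mean(axis=1)
    _, _, sxx = spectrogram(samples, fs=rate)  # time-frequency representation
    return classify_spectrogram(sxx)

# e.g. label_recording("cook_inlet_mooring_clip.wav") -> "beluga"  (file name is illustrative)
```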
In this context, it is essential for both Erin Moreland and Manuel Castellote to have data that can guide mitigation measures and make human activity less harmful to endangered animals. Thanks to artificial intelligence, they will be able to work toward limiting vessel traffic near the most important feeding areas or reducing construction noise in certain places during the breeding season, for example. Studying the AI-processed data on Arctic species will be a first step toward better understanding animal behavior and, at the same time, changing the behavior of humans.