Camera traps are often used to protect endangered wildlife, recording thousands of hours of footage of the natural world. But who has the time to watch and analyse all of it? An open data project is helping animal conservation efforts by scanning hours of video footage and automatically highlighting the things that count.
Conserving the planet’s natural biodiversity is vital for the functioning of our natural ecosystems. Every animal, every plant, every tiny fungus is part of a bigger system, and if parts of an ecosystem disappear, the whole system can become unstable – or collapse completely.
But effective conservation requires information: Which species live where? How many animals are there? Where are they most threatened? Exact estimates of population sizes and the density of endangered species in specific areas give decision-makers the basis for taking the right measures to protect the planet’s biodiversity. Especially when it comes to rare species, we need as much information as possible on their behaviour and habitat in order to create the ideal conditions for protected areas. And last but not least, it’s also important to know what else is happening in nature reserves besides the animal and plant world: Are there illegal human activities going on, such as poaching or illegal logging?
There are various ways that conservationists collect that data, from satellite and aerial photography to audio recordings made in the rainforest. In remote, hard-to-reach areas, camera traps are often used: remotely activated cameras equipped with motion or infrared sensors. These cameras are frequently deployed to monitor rare animals and observe their behaviour. Unlike research expeditions, where humans enter the animals’ natural habitats to collect information, camera traps capture images of wildlife with as little human interference as possible.
But collecting data with camera traps presents its own challenges. First, researchers have to comb through huge amounts of material: a single site survey can produce hundreds of thousands of videos across several different locations. And camera traps often end up recording footage that isn’t relevant – for example, when the sensors are triggered by falling branches. So which information is relevant and which is not? And which animals can be seen in the recordings at all? Processing and analysing all of this data requires an enormous amount of time – often much more than researchers have available. That’s why they often call on volunteer citizen scientists to help them sift through the information. But that kind of support isn’t always available.
One solution to this problem is the open data project Zamba, which uses artificial intelligence and computer vision to evaluate the video material and automatically recognise which species of animals have passed through the camera trap and which shots are irrelevant. This saves researchers huge amounts of time, and allows them to concentrate exclusively on the relevant videos and their actual work.
Citizen Science meets Data Science
Zamba was made possible by two key groups: thousands of citizen scientists and hundreds of data scientists. Researchers at the Max Planck Institute for Evolutionary Anthropology laid the foundations for the project when they reached the limits of what was humanly possible in viewing, analysing and labelling the hundreds of thousands of video clips they had collected. So in 2017 they organised a machine learning competition together with DrivenData (a spin-off from the Harvard Innovation Lab). In this Pri-matrix Factorization Challenge, data scientists from over 90 countries competed to develop a machine learning algorithm for identifying selected animal species in hundreds of thousands of video clips. These clips had previously been carefully categorised by hand and compiled into a comprehensive database by thousands of citizen scientists as part of the Chimp&See Zooniverse project.
The ultimate winner was an algorithm developed by Dmytro Poplovskiy, based on neural networks. According to DrivenData, the algorithm was able to detect the presence of wild animals with an accuracy of 96 percent and identify species with an average accuracy of 99 percent. It also achieved an average recall of 70 percent across all species, and up to 96 percent for the three most common labels (blank, human and elephant).
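To make the difference between these metrics concrete: accuracy is the share of clips labelled correctly overall, while recall measures, for each species, how many of the clips truly showing that species the model found. The sketch below illustrates this with a handful of invented clip labels (the species names and values are purely illustrative, not the challenge’s real data):

```python
from collections import defaultdict

# Invented ground-truth and predicted labels for six clips;
# the real challenge scored hundreds of thousands of clips.
y_true = ["blank", "elephant", "human", "elephant", "chimpanzee", "blank"]
y_pred = ["blank", "elephant", "human", "blank", "chimpanzee", "blank"]

# Overall accuracy: fraction of clips whose predicted label matches the truth.
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Per-species recall: of all clips truly showing a label, the fraction
# the model recovered. Rare species typically drag average recall down.
totals, hits = defaultdict(int), defaultdict(int)
for t, p in zip(y_true, y_pred):
    totals[t] += 1
    hits[t] += (t == p)
recall = {label: hits[label] / totals[label] for label in totals}

print(f"accuracy = {accuracy:.2f}")   # 5 of 6 clips correct
for label, r in sorted(recall.items()):
    print(f"recall[{label}] = {r:.2f}")
```

Here one elephant clip is mistaken for a blank, so accuracy drops to 5/6 while elephant recall falls to 0.5 – which is why a model can report high overall accuracy yet only moderate average recall across species.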
The code of the best competition entries has been published under an open source license and is available for any researchers and conservationists to use. Dmytro Poplovskiy’s machine learning model was also packaged into an open source software tool: the Python package Zamba (from the word for “forest” in the African lingua franca Lingála). Zamba is a command line tool that researchers around the world can freely install and use to identify species in camera trap recordings. It has also been integrated into a web application, so that conservation researchers can easily upload videos and run them through the Zamba algorithms.
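For a rough sense of the workflow, a typical invocation looks something like the sketch below. This is an illustrative usage fragment only – the exact package name on PyPI and the flags may differ between versions, so check the project’s official documentation before running it:

```shell
# Illustrative sketch, not a verified recipe: install the Zamba package,
# then point it at a directory of camera trap videos to classify.
pip install zamba
zamba predict --data-dir camera_trap_videos/
```

The tool then writes out its predicted species labels for each video, so researchers can filter out blank clips and concentrate on the relevant footage.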
Currently, Zamba is able to identify a total of 23 species, with the current focus on African wildlife. According to DrivenData in conversation with RESET, work is underway to add data for other species and other locations. Initial tests of Zamba Cloud for estimating species populations have also shown promising results, and DrivenData says it intends to expand these capabilities further. The startup is currently looking for funding to further improve the algorithms, expand the user base – and increase its impact.
This is a translation of an original article that first appeared on RESET’s German-language site.
This article is part of the RESET Special Feature “Artificial Intelligence – Can Computing Power Save Our Planet?”
The RESET Special Feature on AI is part of a project funded by the Deutsche Bundesstiftung Umwelt (German Federal Environmental Foundation, DBU). As part of this project, over a period of two years we will be developing four RESET Special Features on the topic of “Opportunities and Potentials of Digitalisation for Sustainable Development”.
You can find more information here.