With rapid advances in camera trap technology, researchers should “hurry up and wait”

In this post, Pen-Yuan Hsing discusses the recent paper by Cole Burton and colleagues, ‘Wildlife camera trapping: a review and recommendations for linking surveys to ecological processes’, and the exciting new advances in camera trap technology.

Camera traps have come a long way since first entering the ecologist’s toolbox more than a hundred years ago. Early iterations involved bulky film cameras powered by lead-acid batteries, and were mechanically triggered by trip-wires or pressure pads. Nowadays, one can buy off-the-shelf camera traps for less than $100. They are lightweight, waterproof, and capable of taking thousands of digital images over many months in harsh field conditions. Advances like these allow researchers to deploy more camera traps over a much wider area than previously possible.

Mockup of our camera trap photo classification page running on a mobile device.

In my research, we will use camera traps to monitor wild mammals – from deer to badgers – in and around County Durham in the UK. We hope to gain a basic understanding of their abundance, distribution, and even behaviour. Though modern camera traps are easy to use and need no user intervention once deployed in the field, it would be hard for me to set up and regularly check dozens (if not hundreds!) of them in a region as large as County Durham. To make this work, we are turning to citizen scientists for help. The idea is that a volunteer would set up a camera trap in their neighbourhood and upload its photos to an online platform where users collaboratively classify the animals that show up. Working with the Durham Wildlife Trust, we have started this process by recruiting an initial group of student volunteers to place camera traps in the woods around Durham University. There are now hundreds of camera trap images uploaded to an early version of the website, where users can tap on the name corresponding to the animal shown in a particular photo. I am quite excited by what we can learn from this camera trap monitoring network, which we hope will grow organically over the coming months, with citizen scientists from across County Durham contributing.

Wild mammals in County Durham, UK, caught by camera traps, including roe deer (Capreolus capreolus) and red fox (Vulpes vulpes).

While excitement and curiosity drive the pursuit of knowledge, we should not let them distract us from doing good science. In a new paper in Journal of Applied Ecology, Burton et al. (2015) remind us that while rapid advances in camera trap technology have led to an explosion of potential ecological applications, these applications need to be based on sound sampling design and statistical analyses. In this review paper, the authors thoroughly examined 266 recent camera trap studies from several perspectives: basic metrics such as focal taxa and geographical region; the design, implementation, and reporting of camera trap sampling; and, where appropriate, how the studies examined animal populations, whether through indices of relative abundance, density estimation, or occupancy modelling. I found the breadth of the reviewed camera trap studies astounding. They were carried out in 60 countries and covered nearly 200 species, with body masses from just 1.3 g to almost 5,000 kg and home ranges as large as 2,800 km².

Unfortunately, the widespread application of camera traps in these ecological studies was seldom coupled with rigorous methodology. I was surprised that many studies did not report seemingly basic information, such as the type of camera trap(s) used or their settings, and more than a fifth of the studies did not even describe their sampling design. Close to 60% of the reviewed studies placed their camera traps near some kind of animal attractant (e.g. bait, water sources, or roads), which may help confirm the presence of the target species, but can be problematic for other analyses.

Burton et al. (2015) paid particular attention to camera trap studies that tried to assess population parameters, including density estimation, occupancy modelling, and relative abundance. The common theme among these studies was that theoretical assumptions and constraints were often ignored or not explicitly addressed. For instance, studies that used occupancy modelling to draw inferences about abundance regularly neglected to define basic parameters such as the area considered occupied, or the period over which it was occupied. Many implicitly assumed that the detectability of their species would not vary over time or among the surveyed sites. Another observation was the variation in how a camera trap detection “event” was defined: events were usually delineated by seemingly arbitrary timespans, from 30 minutes to a day, between consecutive photos of the target species.
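
To make this concrete, here is a minimal sketch in Python of how such an event threshold could be applied to a series of photo timestamps. The function name, the 30-minute cut-off, and the example timestamps are my own illustrative choices, not anything prescribed by Burton et al. (2015); the review’s point is precisely that whichever threshold is chosen should be stated explicitly and justified, because the resulting event counts feed directly into relative abundance indices.

    # Illustrative sketch (my own, not from Burton et al. 2015): group photos of one
    # species at one camera into "independent" detection events, where a gap longer
    # than a chosen threshold between consecutive photos starts a new event.
    from datetime import datetime, timedelta

    def count_detection_events(timestamps, threshold=timedelta(minutes=30)):
        """Count detection events for a single species at a single camera trap."""
        if not timestamps:
            return 0
        ordered = sorted(timestamps)
        events = 1
        for previous, current in zip(ordered, ordered[1:]):
            if current - previous > threshold:
                events += 1  # gap exceeds the threshold, so a new event begins
        return events

    # Hypothetical example: four roe deer photos from one camera
    photos = [
        datetime(2015, 6, 1, 8, 0),
        datetime(2015, 6, 1, 8, 10),   # 10 minutes later: same event
        datetime(2015, 6, 1, 14, 30),  # more than 30 minutes later: new event
        datetime(2015, 6, 2, 9, 0),    # next day: new event
    ]
    print(count_detection_events(photos))  # -> 3

Changing the threshold from 30 minutes to a day in this sketch would collapse the example to two events, which is exactly why the choice needs to be reported rather than left implicit.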

These are just a few of the issues Burton et al. (2015) found in camera trap studies, but they illustrate a lack of thoroughness in the reporting of methodology and the “uncritical application of statistical models… without due consideration of associated model assumptions” (Burton et al. 2015). In a recent blog post, Paul Lukacs discussed “the power of combining data from multiple camera studies to gain broader insights than any single data set could have provided”. For this kind of synthesis to be useful, it is critical to employ consistent, transparent, and rigorous designs in camera trap studies. Without such rigour, “time wasting, stressful and diversionary debate” might ensue, distracting us from the scientific issue at hand, as described in another recent post on this blog by Matt Hayward.

I am lucky to have read this review article as a student who is just starting to use camera traps in ecological monitoring. There is an expression, “hurry up and wait”: rushing into something, only to realise it doesn’t work because many prerequisites haven’t been taken care of. In this case, I have been reminded that amid the excitement and the hurry to get my hands dirty, I should also wait, take a breath, and carefully consider exactly how I plan to conduct my research, justifying every step.
