Case Study

Automating Biodiversity Surveys at NEON Field Sites

March 10, 2022


Most biodiversity data at the NEON field sites is collected the old-fashioned way: by humans traipsing through the fields and woods, gathering specimens or noting observations. But modern instrumentation and machine learning methods are increasingly used in the ecology community to supplement human effort. Could some of these methods be applied at the NEON field sites? A recent paper in Ecosphere explores the possibilities.

Who's There? Just Listen!

Dr. Justin Kitzes, an assistant professor of Biological Sciences at the University of Pittsburgh, initially proposed the idea as a topic for a working group at the October 2020 NEON Science Summit. Kitzes, who focuses on quantitative ecology, sits on two NEON Technical Working Groups (TWGs): Breeding Birds and Data Standards.

In his own research, he uses autonomous acoustic recorders and machine learning algorithms to identify and classify birds and anurans (frogs and toads) from their calls. He and his students and colleagues place more than 1000 recording devices each year in field sites across North and South America. In Pennsylvania, he is monitoring large-scale forest restoration projects supported by the National Fish and Wildlife Foundation.

"We're looking for focal species in these areas鈥攇olden wing warblers, cerulean warblers, wood thrushes and other species鈥攖o see if they are using the habitat as hoped," he explains. "The acoustic recorders allow us to listen in without disturbing the wildlife to see what species are actually there. It's one way to monitor the success of the restoration project."

Acoustic recording devices are just one example of automated sensors that can be used for biodiversity surveying. The working group also explored several others, including wildlife cameras for mammals, hydroacoustic sensors and remote sensing for aquatic species, expanded remote and ground-based sensor measurements for plant biodiversity, and laboratory analysis of the NEON physical specimen collections.

The Rise of Automation in Ecology

Automated instruments have shown tremendous promise for collection of biodiversity data. A fully automated system typically includes two parts:

  • A sensor that collects visual, acoustic, or other data that provides evidence of the presence of a species in an area. These could include camera traps, acoustic sensors, or remote sensing technologies (e.g., imaging spectrometer). Some types of sensors can be automated so they only collect data when triggered (e.g., by motion or sound). This reduces collection of data that are not meaningful for ecologists. For example, a wildlife camera trap typically uses a motion sensor so that it only records when an animal is within range of the camera.
  • A machine learning algorithm that is capable of processing the data and providing preliminary identification of a species. The program may use visual or acoustic signatures for species identification. These programs must be trained using large datasets so that they learn to distinguish, for example, the call of a cerulean warbler from that of a golden-winged warbler, or the profile and markings of a bobcat vs. a common housecat. Once they are trained, they can quickly process vast amounts of recorded data and pull out the data of interest to ecologists, who can confirm the species identification. (A minimal code sketch of such a two-part pipeline follows this list.)
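
To make the two parts concrete, below is a minimal Python sketch of such a pipeline, assuming a long audio recording, a simple energy threshold standing in for the hardware trigger, and a placeholder classify() function where a trained model would sit. The clip length, threshold, and classify() stub are illustrative assumptions, not NEON or Kitzes-lab code; in practice the stub would be replaced by a classifier trained on labeled recordings.

    # A minimal sketch (not NEON code) of the two-part pipeline described above:
    # an energy-based "trigger" that keeps only clips worth analyzing, and a
    # placeholder classifier standing in for a trained model. The classify()
    # stub, thresholds, and synthetic test recording are all hypothetical.

    import numpy as np
    import librosa

    SR = 22050          # sample rate (Hz)
    CLIP_SECONDS = 5    # length of each analysis window

    def energy_trigger(clip, threshold_db=-35.0):
        """Part 1: cheap trigger. Keep a clip only if it contains enough sound
        energy, analogous to a camera trap's motion sensor."""
        rms = librosa.feature.rms(y=clip)[0]
        return librosa.amplitude_to_db(rms, ref=1.0).max() > threshold_db

    def classify(mel_spectrogram_db):
        """Part 2: placeholder for a trained model that maps a spectrogram to a
        species label and a confidence score."""
        return ("unknown species", 0.0)

    def process_recording(audio, sr=SR):
        """Split a long recording into clips, drop quiet ones, classify the rest."""
        hop = CLIP_SECONDS * sr
        detections = []
        for start in range(0, len(audio) - hop + 1, hop):
            clip = audio[start:start + hop]
            if not energy_trigger(clip):
                continue  # nothing audible in this clip; skip it
            mel = librosa.feature.melspectrogram(y=clip, sr=sr, n_mels=128)
            mel_db = librosa.power_to_db(mel, ref=np.max)
            label, score = classify(mel_db)
            detections.append((start / sr, label, score))
        return detections

    if __name__ == "__main__":
        # A synthetic 30-second recording stands in for an AudioMoth file:
        # quiet background noise with a loud 3 kHz tone from 10 s to 15 s.
        t = np.linspace(0, 30, 30 * SR, endpoint=False)
        audio = 0.01 * np.random.randn(len(t)).astype(np.float32)
        audio[10 * SR:15 * SR] += 0.5 * np.sin(
            2 * np.pi * 3000 * t[10 * SR:15 * SR]).astype(np.float32)
        print(process_recording(audio))

In a real deployment the detections would still be reviewed by an ecologist to confirm the species identifications, as described above.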

Automated sensing and identification systems enable collection of much more data than is possible by human field collection alone. Sensors can be left in the field 24/7/365 to record events that may happen diurnally or seasonally. And unlike human observers, they do not disrupt the wildlife that researchers are trying to record.

Pilot deployment at SCBI in Front Royal in 2019. An AudioMoth recorder is hung on a tree in a Ziploc bag. Photo credit: Justin Kitzes.

Kitzes says that automation is particularly helpful in detecting events that are very rare or hard to capture. It is also a better way to capture "firsts" in a season, such as the first birdsong that heralds the return of a species in the spring. "One of the things instruments are really good at is waiting around. They don't care if it's raining or snowing, and they don't have to sleep. So, they are particularly good at detecting events that are very rare, such as the call of a rare species, or recording the first time something happens in a year. When humans are in the field, they would have to be there at just the right time to capture these events. The instruments can just wait there until it happens."

Biodiversity, Automation, and the NEON Program

Researchers have used instrument data to explore biodiversity in a variety of ways. For example, phenocam imagery and the remote sensing data (hyperspectral and lidar) collected by the NEON Airborne Observation Platform (AOP) have been used to study plant community composition, including machine learning classification of tree species from the lidar and hyperspectral data. Researchers have also investigated the use of machine learning to classify beetles and other taxa from photos taken of pitfall trap catches. Additional and more sophisticated machine learning algorithms could be used to extract other kinds of biodiversity data from existing instrument systems or NEON Biorepository collections.
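
To give a sense of what that kind of classification involves, here is a small sketch that trains a random forest on synthetic data in which each row stands in for a tree crown described by hyperspectral reflectance bands and a lidar-derived canopy height. The species names, band count, and feature values are invented for the example; a real workflow would use features extracted from NEON AOP data products and field-verified crown labels.

    # A minimal sketch (synthetic data, not a NEON workflow) of supervised tree
    # species classification from remote sensing features. In practice the
    # feature matrix would hold per-crown hyperspectral reflectance bands and
    # lidar-derived canopy height metrics from NEON AOP flights, and the labels
    # would come from field-verified crowns.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    N_CROWNS, N_BANDS = 600, 50          # crowns and spectral bands (made up)
    SPECIES = ["oak", "maple", "pine"]   # placeholder class labels

    # Synthetic features: reflectance spectra plus one canopy height column,
    # shifted slightly per species so the classes are separable.
    labels = rng.integers(0, len(SPECIES), N_CROWNS)
    spectra = rng.normal(0.3, 0.05, (N_CROWNS, N_BANDS)) + 0.02 * labels[:, None]
    height = rng.normal(20.0, 5.0, (N_CROWNS, 1)) + 3.0 * labels[:, None]
    X = np.hstack([spectra, height])
    y = np.array(SPECIES)[labels]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    # A random forest is a common baseline for tabular spectral-plus-structural
    # features; deep learning approaches are also used in the literature.
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))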

As suggested by Kitzes and coauthors, adding new instrumentation, such as acoustic recorders, camera traps, and sonar and remote imaging technologies for aquatic sites, would further expand the types of biodiversity data that researchers could derive from NEON data. Camera traps, for example, would enable study of medium-to-large mammal populations that are not currently included in the NEON program design. Acoustic recorders would provide valuable insights into the presence of many different species of birds, bats, insects, and anurans: taxa that are either currently not covered at all by the NEON program or are likely to be missed by human observation or specimen collection methods such as pitfall traps. And additional aquatic instrumentation could supplement labor-intensive human surveys of fish species presence and abundance and enable better monitoring of rare, migratory, or invasive species.

These instruments are beyond the scope of the original NEON instrument system design. When the NEON program was scoped many years ago, many of these instruments and methods were not yet in widespread use. Adding them now would require additional funding support from the NSF for instrumentation and significant work to write new data collection protocols. However, new instruments can be added to NEON field sites through the NEON Assignable Assets program, under which a researcher can apply for permission to add instrumentation to existing NEON infrastructure at one or more field sites in a cost-recoverable arrangement.

Other suggestions made by Kitzes and coauthors would require fewer resources to implement. The physical specimens archived by the NEON Biorepository represent a treasure trove of data that could be used to support biodiversity studies. Machine learning is already being applied for taxonomic classification of carabids collected in the pitfall traps. Imaging of other types of organisms, including aquatic and benthic invertebrates, terrestrial invertebrates now classified as "bycatch" in the carabid pitfall traps, and other physical plant and animal specimens, would open new possibilities for machine learning classification and species identification using the specimens that already exist in the archive.
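
One common route to such a classifier is transfer learning from a pretrained image model. The sketch below shows a single illustrative training step using random tensors in place of real specimen photographs; the number of taxa, the batch contents, and the training setup are placeholders rather than an existing NEON or Biorepository pipeline.

    # A minimal transfer learning sketch (not an existing NEON pipeline) for
    # classifying specimen images such as photographs of pitfall trap catches.
    # Random tensors stand in for real, labeled specimen photos.

    import torch
    import torch.nn as nn
    from torchvision import models

    NUM_TAXA = 12  # placeholder number of taxonomic classes

    # Start from an ImageNet-pretrained backbone and swap the final layer so it
    # predicts taxon labels instead of ImageNet classes.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_TAXA)

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    # One illustrative training step on a fake batch of 224x224 RGB images.
    images = torch.randn(8, 3, 224, 224)
    labels = torch.randint(0, NUM_TAXA, (8,))

    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"one training step, loss = {loss.item():.3f}")

As with the acoustic pipeline, the model's outputs would be checked by taxonomists before being treated as confirmed identifications.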

Kitzes is enthusiastic about the possibilities of automated instruments and machine learning for biodiversity surveys. He says, "Our group is interested in studying biodiversity at large scales across both time and space. When you look at it that way, you start to realize how much of the planet has not been surveyed. Currently, we try to fill those holes statistically, but that's not the same as having actual observations鈥攅specially when you are looking for rare species and events. These technologies can be used to augment human efforts and vastly expand the scale of biodiversity data collection."
