User Guide

Chapter 1 Introduction

A finch, singing on a branch.

Let’s start from the very beginning – what is WildTrax and what are its objectives?

  • To manage all components of environmental sensor and biological data from field metadata to media to species observations
  • To store data safely and securely while still making it accessible to users
  • To process environmental sensor data to a high degree of data quality and integrity
  • To share environmental sensor and biological data with other WildTrax users, collaborators and the public
  • To discover data in your study area or across the WildTrax system

WildTrax is continuously improved based on user needs and stakeholder engagement processes. Sign up for the newsletter in User Settings or check out the News page to get the most up-to-date feature releases.

1.1 About WildTrax

The WildTrax platform was developed by the Alberta Biodiversity Monitoring Institute (ABMI), the Bioacoustic Unit (BU), and the Bayne Lab at the University of Alberta.

The ABMI is an incorporated, arm’s length, not-for-profit scientific organization, and has been providing scientifically credible tools, information products and services on Alberta’s biodiversity and human footprint to provincial government, industry, environmental decision-makers, and all Albertans since 2003. The ABMI has since become a global leader in the application and development of biodiversity monitoring.

The BU is a consortium of students, researchers and collaborators that lead best practices for using acoustic technology in Canada and participate in the application of wildlife acoustic data and technology to meet environmental management and research needs. The team is actively engaged in research to enhance methodologies and tools to better understand the natural environment through acoustics. Clients and collaborators regularly partner with the BU to assist with their wildlife monitoring needs; the BU’s involvement varies from client to client and spans the full range of services from simply providing information to conducting a full research project on their behalf. The BU has been continually improving acoustic data organization and transcription methodologies since 2012.

The Bayne Lab’s research centers on understanding the cumulative ecological impacts of human activities on biodiversity. They use a combination of behavioral, population, and community ecology in combination with cutting edge techniques in wildlife monitoring, survey design, geographic information systems, and habitat modeling. Their goal is to provide recommendations on how biodiversity reacts to various types of human and natural disturbance with the goal of achieving better conservation outcomes. This includes understanding interactions between native and invasive species, interactions between climate change and land-use, and economic-ecological trade-off assessment. While many in the lab work on birds, there is no particular taxonomic bias to their research. They work closely with government, industry, and conservation organizations to facilitate better conservation decision making.

Each sensor in WildTrax is supported by an outstanding team of researchers and collaborators that have paved the way for a multi-sensor experience in WildTrax.

You can view our full list of sponsors and partners in the About section of the site.

Note: The pronoun “you” throughout the guide refers to the reader. “We” refers to the WildTrax Team in general. WildTrax-specific tools, functions, jargon and important fields are bolded.

Environmental sensors

Environmental sensors (such as autonomous recording units [ARUs] or remote cameras) are an increasingly common monitoring method used to measure environmental and ecological attributes across broad geographic scales. These sensors allow for automated collection of data over an extended period and can generate large amounts of valuable biological data.

Biological data

Biological data, such as counts of animals, their behaviour, or other attributes, can be derived from environmental sensors. WildTrax seamlessly integrates such data across multiple sensors, with the additional capacity to incorporate data from point counts, a commonly used method for evaluating species’ relative abundance, especially birds.

Open data

Open data is data that can be accessed, re-used or redistributed by anyone and is freely available in a usable and convenient format. Open data benefits the scientific community and society. Data accessibility allows users (e.g., researchers, conservation practitioners and the public) to find, manipulate and analyze data, as well as link it to other types of information. Open data can lead directly to conservation knowledge and action. This requires data to be usable, compatible with other datasets, and reliable.

Getting Started

Administrators

If you’re an administrator wishing to create or manage an organization or projects within WildTrax:

  1. Create a WildTrax account using an email address.
  2. Create an organization(s). The WildTrax Team will verify your identity to finalize your organization setup.
  3. Create a project(s) within your organization.
  4. Upload data to your projects.
  5. Process your image or acoustic data within the online interfaces, or manage your data in the organization.
  6. Use species verification to ensure high quality data.
  7. Publish, share or download your data.

Observers

If you’re a user who will be assigned tasks, create a WildTrax account and provide your user name to the project administrator, who will add you to the appropriate projects.

Projects will then be visible on your dashboard. You’ll be assigned either as an observer or validator.

Partners and collaborators

If you are a partner or collaborator wishing to share or discover data:

  1. Browse the published projects available in the project dashboard, Data Discover or Data Downloads.
  2. Request Access to organizations or projects to alert administrators you would like access to their data.

WildTrax is a proponent of making data as open and accessible as possible and many organizations are shifting towards a more open, collaborative, co-produced framework. Recognizing the importance of data privacy, WildTrax also offers many options and features to control how data is shared with others.

Accounts and Member Types

1.2 Create an Account

WildTrax operates under a role-based access control system, meaning a user in the system represents an individual, not an organization or a group of people. This policy-neutral mechanism restricts access to authorized users through roles and privileges. Users can then collaborate to manage data or share data to answer broader scientific questions.

Click on LOGIN in the top ribbon.

wildtrax header screenshot

Follow the steps to create an account—after you verify your email and have logged in, your account will be activated and you can begin using WildTrax. When you are logged in, your first name will appear on the right side of the ribbon, as will the “My Data” button.

You must verify the email address associated with your WildTrax account before you begin using the system. You will not be able to access data or use certain features without first verifying your email.

wildtrax header screenshot logged in

The About section, Resources, this Guide and publicly available data on Data Downloads and Data Discover are accessible without a WildTrax account. Hovering over the “My Data” button in the top ribbon will link you to the main sections of WildTrax:

  • My Organizations
  • My Projects
  • Data Downloads

1.3 User Settings

The user settings dashboard can be accessed by hovering over your username in the top right corner of the ribbon when you’re logged into the system.

screenshot13

  • Name: your full name
  • Initials: an acronym or set of initials you can use to define an observer or user
  • Email: the email address associated with your account
  • Subscribe to Newsletter: a toggle that will opt you in for occasional WildTrax newsletters delivered to your email
  • Language: your default language. Currently available in English and French.
  • Affiliation (optional): the organization, institution or group of which you’re a member or user in the system.

Once you’ve made your desired changes, click the Update button.

Chapter 2 Organizations

black birds flying during daytime photo by hermant

This chapter will walk you through organizations.

2.1 Organization Set-up

Snow geese flying in the daytime

Organizations sit at the top of the WildTrax hierarchy and are the central entity to which environmental sensor data, biological data and metadata are associated. When in doubt, if you’re looking for any information in WildTrax, you can likely find it under the organization. Organizations represent groups of users who collect data, design and publish projects, manage equipment and survey locations. Organizations allow you to coordinate efforts with multiple WildTrax users to create a structured, standardized dataset. Examples of organizations include government branches, industry, research labs and non-profits.
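The hierarchy described above, with organizations at the top and projects, locations and their data beneath, can be sketched as a minimal data model. This is purely conceptual; the class and field names here are illustrative and are not WildTrax's actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Location:
    """A geographic place where data was collected (illustrative fields)."""
    name: str
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    visits: List[str] = field(default_factory=list)

@dataclass
class Project:
    """A published unit of work belonging to exactly one organization."""
    name: str
    sensor: str                       # e.g. "ARU", "camera", "point count"

@dataclass
class Organization:
    """Top of the hierarchy: owns locations, projects, equipment and users."""
    acronym: str                      # e.g. "ABMI"
    full_name: str
    locations: List[Location] = field(default_factory=list)
    projects: List[Project] = field(default_factory=list)
```

In this model, deleting a location would orphan its visits and media, which mirrors why WildTrax restricts location deletion when dependent data exist.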

Click on My Data in the top ribbon, followed by My Organizations. This will take you to the organization dashboard. Click Create an Organization. From here the organization settings form will appear.

Fill in the fields in the form and click Save. A WildTrax administrator will need to confirm your identity before approving your new organization request. If you’re having any technical difficulties creating an organization contact WildTrax Support.

These are the fields and settings controlled by the organization. Organization settings control the defaults and the general infrastructure of the data they manage. You can access these by clicking on the pencil icon beside the name on the organization dashboard, or on the organization name while you’re on the organization’s main page.

  • Organization acronym: a short-hand name for your organization (e.g., ABMI)
  • Organization full name: the full name of the organization
  • Institution / company / group: the institution the organization is a part of, such as a university or government branch
  • Storage location: where the data will be stored
  • Default location privacy settings:
    • Default location buffering: the organization’s default for whether the coordinates entered are exact (“True Locations”) or randomly offset (“Buffered Locations”) by a specified radius (i.e., the “default buffer radius [m]”).
      • True Locations: used if the coordinates entered pertain to exact location coordinates.
      • Buffered Locations: used if the coordinates entered are randomly offset by a specified radius (i.e., the “default buffer radius [m]”).
    • Default buffer radius (m): the organization’s default buffer radius (i.e., the radius of the buffer around the location [in metres] within which the coordinates have been randomized [0 m if coordinates have not been hidden])
    • Allow location reports: creates location summary reports when enabled
  • Default image privacy settings: 
    • Default visit image access: default privacy setting that will be applied to visit photos uploaded to the organization
    • Human blurring:  an organization-wide setting for the type of users for which the Auto-tagger will blur humans detected in images, if any (i.e., “Blur for anonymous users,” “Blur for non-admins,” or “Blur for everyone”). Images categorized as “Human” are blurred if the human and vehicle prediction thresholds sum to >95.
      • Blur for anonymous users: images of humans will be blurred for all non-read only/non-admin WildTrax users if project data is visible based on the project status.
      • Blur for everyone: images of humans will be blurred for all WildTrax users regardless of their organization or project membership.
      • Blur for non-admins: images of humans will be blurred for all read-only WildTrax users.
  • Organization description: a short description of the organization.

Default settings can be applied at the organization level if you do not want to manually perform operations such as location buffering or visit image access.

Once your organization is approved, the User Assignment tab will appear in the organization settings form. This will allow you to add any WildTrax user to your organization either as an administrator or read-only member.

assign org members

The principal investigators of the organization are users who respond to access requests related to the organization or its projects. Without a principal or secondary investigator, all organization and project access requests will default to organization then project administrators, in order.

Organization administrators collaboratively manage the media and metadata of the organization and have the ability to:

  • Enjoy administrator privileges by default on all organizational projects
  • Add WildTrax users to the organization or its projects
  • Read and write to organizational locations
  • Read and write to the visit, equipment, and media metadata

Organizational read-only members can:

  • Read the unbuffered locations, i.e., read-only members can see the true locations if they are buffered but cannot modify them
  • Read the visit, equipment, and media metadata
  • Enjoy read-only access to all organizational projects

The organization dashboard lists all organizations in WildTrax. The View Only My Organizations toggle in the top-right filters the list to only organizations you’re a part of.

If the organization is greyed out, you are not a member of that organization. Click the drop-down arrow beside the organization name and then click Request Access and fill in the Request Access form to request membership. Administrators of the organization will receive a notification and will either approve or deny your membership request.

Organization Properties

A project in WildTrax must belong to an organization, and only one organization can be assigned per project.

WildTrax currently lists two server locations where data can be uploaded (Oregon and Montreal) in the storage location field. The choices here simply come down to where the data is being stored (in the US vs Canada) and the distance between you and the server, which can sometimes affect upload and download speeds. Choose based on your organization’s terms of use for data management and consult WildTrax’s Terms and Conditions of Use and Data Access Policies.

WildTrax uses Amazon Web Services (AWS) to store the compressed acoustic media and raw camera images. The BU offers other storage solution options at the University of Alberta in Edmonton, Alberta, Canada—contact info@wildtrax.ca for more information.

An organization controls the default buffering of locations it manages.

“Allow location reports” creates links you can share with collaborators or landowners without giving them organizational privileges or unwieldy information.

The visit image access field controls global access to your visit images.

Human blurring is an organization-wide setting for the type of users for which the auto-tagger will blur humans detected in images, if any. The options include “Blur for anonymous users (if applicable),” “Blur for non-admins,” and “Blur for everyone”. See section 5.1.2.3 Human blurring for more information.

Organizations that opt-out of Human blurring understand that images they upload, and (potentially) publish publicly may contain humans and are doing so at their own risk. Once you make a selection, it cannot be undone; however, this will be adjustable in the near future.
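The blurring rules described above can be sketched as a small decision function. This is an illustrative reading of the guide's rules, not WildTrax's actual implementation; the 0–100 score scale and the simplified viewer roles are assumptions:

```python
def is_human_image(human_score: float, vehicle_score: float) -> bool:
    """An image is categorized as "Human" when the Auto-tagger's human and
    vehicle prediction scores sum to more than 95 (scores assumed 0-100)."""
    return human_score + vehicle_score > 95

def should_blur(setting: str, viewer_role: str) -> bool:
    """Decide whether a "Human" image is blurred for a given viewer.

    `setting` is one of the three organization-wide options; `viewer_role`
    ("anonymous", "read-only", "admin") is a simplification of WildTrax's
    actual membership model.
    """
    if setting == "Blur for everyone":
        return True                          # regardless of membership
    if setting == "Blur for non-admins":
        return viewer_role != "admin"        # read-only users see blur
    if setting == "Blur for anonymous users":
        return viewer_role == "anonymous"    # non-members only
    return False                             # blurring opted out
```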

Equipment is intimately linked with an organization—your organization ideally owns and manages all the equipment used to collect data during visits at locations. Equipment can be marked as loaned out if different organizations are using the same equipment.

Manage Organization Data

2.2 Locations

locations

Locations refer to the physical, geographic places at which environmental sensors were deployed and/or biological data was collected on the landscape. They are one of the most important components in WildTrax as media and metadata are linked by the location. Organizations control and manage access to locations.

You can control each location’s settings by clicking on the Pencil icon beside the location name. Location settings control who and what users can see across the system.

  • Location: the name of the location
  • Latitude: a numeric value with a maximum of 10 digits indicating the location’s latitude (in decimal degrees e.g., “54.67890918”)
  • Longitude: a numeric value with a maximum of 10 digits indicating the location’s longitude (in decimal degrees e.g., “-115.0109191”)
  • Elevation (m): the elevation (in metres) of a location above sea level
  • Location visibility: a location privacy option used to hide locations and/or data from users who are not part of the organization or project to which the location belongs (“Visible,” “Hidden – Location,” or “Hidden – Location+Data”)
  • Location buffering settings: Location buffering settings are used to mask sensitive locations such that WildTrax and other users will never know the exact coordinates; they consist of two fields that work in tandem, location buffering and buffer radius (m)
    • Location buffering: whether the coordinates entered are exact (“True Location”) or randomly offset (“Buffered Location”) by a specified radius (i.e., the buffer radius [m])
    • Buffer radius (m): the radius of the buffer around the location (in metres) within which the coordinates have been randomized (will be 0 m if the coordinates have not been hidden)
  • Location description: a short description of the location. More details about the location can be added in the visit metadata.

  1. Click the Locations tab in your organization dashboard. Clicking Create Location will open the location form, where you can add the spatial metadata to the location or configure the location’s settings. You only need to enter the location name, latitude and longitude in order to create a location.

location form

2. When you’ve filled in the form, click Save. The map tab will appear, allowing you to visualize the point on the landscape. The location will also be visible on maps across WildTrax.

Locations are created implicitly in the organization when media are uploaded to a project. If a location name already exists in the organization, WildTrax will append the media to it.

Locations UI

You can delete locations using the Delete location button, found using the drop-down arrow beside any location name. If any data (i.e., visits, media or tags) depend on a location, you will not be able to delete the location until those other data are removed first (the button will appear grey). The sections on projects and the three sensor types (ARU, camera, point count) discuss how to do this in more depth. Given that locations support all cascading data beneath them, WildTrax has implemented these security measures to ensure that it is difficult to lose data.

delete location

Merging Locations Screen 1

WildTrax allows you to merge multiple locations together. You may want to merge locations if the location has been visited multiple times but was named something different on each visit or if the locations are at the same place on the landscape.

To merge locations:

  1. Select a location and click the drop-down arrow
  2. Click Merge location
  3. The location merge form will appear; ensure you are selecting the correct source location (the location you want to change) and target location (the location you are merging into), which you can choose from the drop-down list. The media will merge together but only the metadata from the target location is preserved.
  4. Click Merge; you should now see data for only the target location chosen in the previous step.

 

merge locations

Only locations that have the same spatial coordinates can be merged. If the source or target location is missing spatial coordinates, WildTrax uses the existing entry during the merge. If neither the source nor the target location contains coordinates, the merged location will not have any either.
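The merge semantics described above can be sketched as follows. The dictionary fields are illustrative, not the actual WildTrax schema; the sketch simply shows that media are combined while only the target's metadata survives:

```python
def merge_locations(source: dict, target: dict) -> dict:
    """Sketch of a location merge: media from both locations are combined,
    but only the target location's metadata is preserved; missing
    coordinates fall back to whichever entry exists."""
    merged = dict(target)                      # target metadata wins
    merged["media"] = source.get("media", []) + target.get("media", [])
    # if the target has no coordinates, keep the source's (if any)
    for key in ("latitude", "longitude"):
        if merged.get(key) is None:
            merged[key] = source.get(key)      # may still be None
    return merged
```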

Sync Locations UI First Screen

WildTrax also allows you to batch upload and download locations and location metadata by clicking the Manage button. This allows you to manage and edit location information more flexibly outside of WildTrax and then sync it back with your modifications.

You need to be an organization administrator in order to upload data; as a read-only member, however, you can download data.

Organization Download CSV

To sync locations:

  1. Click the Download Location CSV button in order to get the current list of all locations and metadata in your organization. If you don’t have any metadata yet, a template CSV will be provided.
  2. Conduct the edits or changes to your CSV. You can modify entries that were downloaded or add new ones.
  3. Click on Upload Location CSV.
  4. This will take you to the Upload CSV form; choose your local CSV file and click Preview Changes – this will allow you to preview the changes you’ll be making to your location data. If there are no differences, a prompt indicating No changes detected in this file will appear.
  5. To accept the modifications, scroll down and click Apply Changes.

All batch upload processes in WildTrax work on add and update only. You cannot delete anything in batch, even if you are an administrator and accidentally upload an empty CSV, for example.
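The preview step in the sync workflow above can be sketched as an add/update-only comparison between the edited CSV and the current records. The column names here are illustrative, not the exact WildTrax CSV headers:

```python
import csv
from io import StringIO

def preview_changes(current_rows, edited_csv_text, key="location"):
    """Sketch of the Preview Changes step: rows present only in the edited
    file are adds; rows with differing values are updates; rows missing
    from the edited file are ignored, since batch upload never deletes."""
    current = {r[key]: r for r in current_rows}
    adds, updates = [], []
    for row in csv.DictReader(StringIO(edited_csv_text)):
        if row[key] not in current:
            adds.append(row)                   # new location
        elif row != current[row[key]]:
            updates.append(row)                # changed metadata
    return adds, updates
```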

Location privacy options

WildTrax provides multiple ways for organization administrators to control how WildTrax members can see locations and the data collected at them. These location privacy options keep data protected when needed.

The location visibility setting is used to hide locations and data from WildTrax users who are not part of the organization or project the location belongs to.

  • Use hidden – location if you want to hide only the location—only organization and project administrators will see the location in the maps across the system. Species data can still be downloaded by non-members but with no coordinates, non-members will not know where the data comes from.
  • Use hidden – location + data if you want to hide both the location and the species information. This setting will effectively hide everything from users who are not organization or project members.
  • Use visible if you want the location and data to be visible to everyone once the project is published

If you are using hidden – location + data, you can refer to location reporting to learn how to share data with users who are not members of the organization or project. This may include landowners, collaborators, or other users to whom you do not wish to grant privileges but with whom you still want to share your data.

Location buffering is another way to mask sensitive locations. You can use the location buffering toggle in two ways:

  • You can enter the coordinates for the true location in the form and then instruct WildTrax to provide a buffer (indicated in the buffer radius [m] field)
  • You can enter previously buffered locations (done outside of WildTrax) and indicate the buffer radius you provided. This way WildTrax and other users will never know the true location.
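If you buffer locations yourself before entering them (the second option above), the offset can be generated as a uniformly random point within the buffer radius. This is a sketch under stated assumptions: WildTrax's own buffering algorithm is not documented here, and the flat-earth degree conversion (~111,320 m per degree) only holds for small radii away from the poles:

```python
import math
import random

def buffer_location(lat: float, lon: float, radius_m: float):
    """Offset a coordinate by a uniformly random point within `radius_m`
    metres, as you might do before entering pre-buffered locations."""
    # sqrt on the radius makes points uniform over the disc's area,
    # not clustered at the centre
    r = radius_m * math.sqrt(random.random())
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / 111_320
    dlon = (r * math.sin(theta)) / (111_320 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

You would then enter the offset coordinates and the radius you used in the buffer radius (m) field, so other users know the precision of the point.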

If you’re an organization administrator, you can allow location reports independently for each location in the organization. This lets organization users share specific location reports with collaborators without having to give full project or organization access. This is particularly useful for organizations who need to provide reports to landowners, leaseholders, partners, and collaborators.

location reporting

To generate and access Location Reports:

  • Go to your organization’s settings and click on Allow under Allow Location Reports.
  • Next, within the Organization view, click on the Locations tab and then on the pencil icon next to any location.
  • At the bottom of the location settings, you should see a link under “Report Link”. Copy this link. Open this in your browser or share it with the desired recipient.
    Allow Location Reports

Report Link Screenshot

You can then easily access elements of the report by clicking on the different tabs depending on the sensors or data that were collected.

Report Screenshot UI

2.3 Visits

A visit is when an observer has gone to a location to collect environmental sensor or biological data. When equipment is placed at a location during a visit, WildTrax calls this a deployment. When you visit a location to pick up a sensor, or to collect its data, this is a retrieval. At each of these times, you can add sensor (i.e., equipment) metadata to support the visit. Conversely, when media are uploaded into WildTrax, there is the capacity to generate the visit automatically. The sections below describe more ways to manage your visits.

WildTrax supports a variety of visit metadata and uses an integrated approach based on feedback from major WildTrax users. The goal is to provide ways of standardizing field metadata collection so that it can be easily shared with other organizations or WildTrax users. You can request additional functions or features by contacting WildTrax Support.

Why are visits important?

  • Visit metadata provides quality control for environmental sensor media and metadata collected at the location
  • Visits relate field activities on the landscape to equipment, media and biological data
  • Visits record general information about the landscape where the environmental sensor was placed or the biological data collected

Do I need to collect visit data?

Visit metadata is not a requirement; however, you will not be able to upload visit images or equipment metadata without a visit.

What visit data is supported in WildTrax?

Two major activities can take place during a visit: using an environmental sensor or conducting a point count.

WildTrax currently supports the addition of general landscape information, visit images, and sensor and equipment inventory information.

Visit functionalities

Here are some steps to try if you want to begin using visits in WildTrax:

Basic visit functionality

To create a visit:

  1. Go to the Visits tab in the organization.
  2. Click Create Visit.
  3. Select or search for the location from the list in the table.
  4. Once you’ve done this, the visit summary form will appear: a comprehensive list of all visits and equipment for that location.

Please select a location screenshot

visit summary

5. Click on Create Visit.

6. Enter the visit date in the visit form. Click Save, after which the visit images tab will appear and the Add Sensor button will turn green, allowing you to begin adding other metadata to the visit.

create visit

You only need to enter a date in order to create a visit; the rest of the values are optional.

WildTrax has the ability to upload photos for every unique visit you make. Once you create a visit, the visit images tab will appear, allowing you to upload either a folder of images or a single file.

visit images tab

To upload visit images to WildTrax:

  1. Select your images and click Upload within a visit form. Your images will be appended in order down the page, with each image’s name taken from its file name.
  2. Click Edit to add more attribute information to an image, or Delete to remove it altogether.

Screenshot of Image Management

WildTrax also supports the following metadata for each visit image.
  • Direction: cardinal direction of the location broken down into sub-cardinal units
  • Vertical Angle: angle at which the image was taken relative to cardinal north
  • Access: the accessibility of the visit image; this is determined by organization administrators
  • Visit Image Comments: any useful information about the visit image

visit image info

Once there are images associated with a visit, and thereby a location, the photos will appear on the ARU and point count task pages so that users can view the landscape where they are tagging or collecting data.

Visit Photo Grid section

If you’re using an ARU or a camera to survey the location, you can add the sensor information either through the visit summary form (by first selecting a visit in the top part of the panel) or in an individual visit by clicking on the pencil icon. Click on Add Sensor to open the sensor form.

Visit Summary Screen Summaries

Within these forms you can select the sensor type (ARU or camera) and the visit during which the equipment was deployed or retrieved. You can do the same thing by clicking on the visit you’d like to add equipment to while you’re creating the visit.

1234 NE

You can also quickly register new equipment by clicking the Register New Equipment button when you select the sensor or equipment type in the form.

You won’t be able to add sensor and equipment information until the visit has been created or selected. In other words, you need to have visited the location first in order to deploy or retrieve equipment.

All media loaded into WildTrax contain spatial (location) and temporal (date and time) metadata. You can take advantage of this by automatically generating visit metadata with the autogenerate visits function, found under the Manage button.

You can choose settings for the autogenerator prior to creating and quality controlling the visit metadata. The fields you can modify are as follows:

  • Gap days: The threshold number of days between media (recordings or images) or point count surveys beyond which a new visit will be created.
  • Check equipment: Compares the serial numbers associated with the equipment that collected the media against those entered in the visit metadata.
  • Create new records: Suggests new visit records based on dates associated with the media
  • Alter incorrect dates: Compares dates associated with the media to those entered in the visit metadata. The date of the first image or recording is considered the first visit (e.g. a sensor deployment) and the date of the final image or recording is considered the next visit (e.g. a sensor retrieval)
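The gap-days rule above can be sketched as a simple grouping pass over the media dates. This is an illustrative reading of the setting, not WildTrax's actual implementation, and the default of 14 days is an assumption for the example:

```python
from datetime import date

def autogenerate_visits(media_dates, gap_days=14):
    """Sketch of the gap-days rule: sort the media dates and start a new
    visit whenever consecutive dates are more than `gap_days` apart."""
    visits = []
    for d in sorted(media_dates):
        if visits and (d - visits[-1][-1]).days <= gap_days:
            visits[-1].append(d)       # within threshold: same visit
        else:
            visits.append([d])         # gap exceeded (or first date): new visit
    return visits
```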

A few things to note before getting started:

  • The visit autogenerator won’t give accurate visit metadata for ARU data because only a subset of recordings is typically uploaded to WildTrax for processing and these recordings are not usually the first collected by the sensor (e.g. the voice note or a test recording). The advantage of using the autogenerator is when a large amount of media has been uploaded to WildTrax but no visit metadata has been created yet. This sets the stage for an organization administrator to manage the visit metadata, append visit photos, etc.
  • Image sets and surveys, on the other hand, work great!
    • When image sets are uploaded into the system, all images, including those triggered at the start and end of the set (such as a human setting up or taking down the camera), will be used to autogenerate the visit.
    • Surveys are simple as there is only one occurrence at a given time and location, so the visit history is unique.

To use the autogenerator:

  1. Click Manage in the Visits tab of the organization.
  2. Click Autogenerate Visits.
  3. Choose the settings you’d like to use (see above).
  4. Click Find Missing Values. Depending on how much media or visit metadata you are running the autogenerator on it may take a moment to process the results.
  5. The corresponding table will show the location and a comparison of the media and visit metadata dates. The Action field summarizes the information that the autogenerator creates.
  6. Select any of the locations to accept the results.
  7. Scroll to the bottom of the table to create visits, or override individual selections by creating all visits. The results will then be populated in the visit metadata form.

Just like with locations, you can batch upload and download visits and visit metadata by clicking the Manage button.

  1. Click Download Visits to get the current list of all visits and metadata in your organization. If you don’t have any metadata yet, the CSV will provide the column headers.
  2. Make your edits or changes in the CSV.
  3. Click Upload Visits.
  4. This will take you to the Upload CSV form; choose your local CSV file and click Preview Changes—this will allow you to preview the changes you’ll be making to your visit data. If there are no differences, a prompt indicating No changes detected in this file will appear.
  5. To accept the modifications, scroll down and click Apply Changes.

You only need to enter a date in order to create a visit; the rest of the values are optional.

Here’s a list of the CSV fields and their descriptions that can be included when syncing visits:

  • Location: the name of the location
  • Visit date: the date a location was visited (in the format “YYYY-MM-DD”)
  • Crew: the name(s) of the crew member(s) who visited the location
  • Bait/lure type: the type of wildlife attractant (e.g., scent, lure, bait, etc) placed in front of the camera (if applicable)
  • Walktest distance (m): the horizontal distance from the camera (in metres) at which the crew members perform the walktest (using the walktest mode, if applicable)
  • Walktest height (m): the vertical distance from the camera (in metres) at which the crew members perform the walktest (using the walktest mode, if applicable)
  • Facing human feature: flags used to identify whether a camera is facing a human feature or if an ARU is in the range of a human feature (depending on the sensor type)
  • Visit trigger mode(s): the camera setting that determines how the camera is set to activate (e.g., motion/heat [“Trigger”] and/or at set time-intervals [“Time-lapse”])
  • Motion image interval (seconds): the camera setting that provides the time (in seconds) between trigger events; that is, whether the camera was programmed to pause internally between firing once and firing again. If a quiet period was not set, enter “0” seconds. The quiet period differs from the motion image interval in that the delay occurs between multi-image sequences rather than between the images within a multi-image sequence (as in the motion image interval).
  • Visit comments: comments describing additional details about the visit
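Since only a date is required to create a visit, a small helper can build a visit-sync CSV from minimal field data. The column headers below are assumptions inferred from the field list above — take the authoritative headers from the CSV you get via Download Visits.

```python
import csv
import io

# Column headers assumed from the field descriptions above; confirm them
# against the template WildTrax provides via Download Visits.
VISIT_FIELDS = [
    "location", "visitDate", "crew", "baitLureType",
    "walktestDistance", "walktestHeight", "facingHumanFeature",
    "visitTriggerModes", "motionImageInterval", "visitComments",
]

def build_visit_rows(visits):
    """Build CSV rows for a visit sync; only location and date are required."""
    rows = []
    for v in visits:
        if not v.get("location") or not v.get("visitDate"):
            raise ValueError("location and visitDate are required to create a visit")
        rows.append({f: v.get(f, "") for f in VISIT_FIELDS})
    return rows

def write_visit_csv(visits):
    """Serialize visit dictionaries to CSV text ready for Upload Visits."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=VISIT_FIELDS)
    writer.writeheader()
    writer.writerows(build_visit_rows(visits))
    return buf.getvalue()
```

Optional fields are simply left blank, matching the guidance that everything beyond the visit date is optional.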

The visit summary form is accessible by clicking the pencil icon beside any location in Visits. This presents a summary of all visits, sensors and equipment that have been used at the specified location. This section of the visit form is handy when you are servicing sensors over multiple months or years, or you need to know detailed historic information about the visits that took place for data quality control.

visit summary form

 

2.4 Equipment

frosty cameras

Some environmental monitoring programs may require large inventories of equipment in order to meet survey objectives and sample size. The ABMI and BU alone manage thousands of cameras and ARUs for a multitude of purposes. With this comes quality assurance and control of the equipment itself, and not just the data it collects.

The ABMI and BU offer standards and protocols for some of the major types of ARUs and cameras on the marketplace and provide standard practices for environmental sensor repair and assessment that go far beyond the objectives of this book. The goal of the Equipment tab in WildTrax is to centralize your environmental sensor monitoring equipment and keep track of its operational status within the provided forms.

WildTrax does not make direct recommendations on how to repair or operate sensor equipment, since equipment usage extends well beyond the platform’s own functionality. However, we encourage sharing and collaboration to create standard protocols. Environmental sensors are rugged, but long periods of field activity gradually reduce their functionality, as with any equipment exposed to the elements. Researchers are encouraged to ensure their equipment is functioning properly before collecting data.

Just like with locations and visits, you can batch upload and download equipment and equipment metadata by clicking the Manage button. Here are the fields and descriptions for the CSV upload.

  • Equipment make: the make (manufacturer) of a particular sensor (e.g., “Wildlife Acoustics”, “Reconyx”)
  • Equipment model: the model number of a particular sensor (e.g., “SM2”, “PC900”)
  • Equipment serial number: the serial number of the particular sensor (e.g., H600HJ2269118)
  • Equipment purchase date: the date the equipment was purchased
  • Equipment type: the type of equipment (e.g., “SD card”, “ARU”, “camera”, “microphone”)
  • Equipment status: the status of the equipment (e.g. “Active”, “Broken”, “Loaned Out”)
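Before syncing, you can run a rough sanity check on each equipment row against the example vocabularies above. The field names and allowed values here are illustrative assumptions taken from the descriptions, not an official schema.

```python
# Example vocabularies taken from the field descriptions above; WildTrax
# may accept other values, so treat these sets as illustrative only.
EQUIPMENT_TYPES = {"SD card", "ARU", "camera", "microphone"}
EQUIPMENT_STATUSES = {"Active", "Broken", "Loaned Out"}

def check_equipment_row(row):
    """Return a list of problems found in one equipment CSV row (dict).

    Keys such as 'equipmentSerialNumber' are assumed column names —
    match them to the headers in your Download Equipment CSV.
    """
    problems = []
    if not row.get("equipmentSerialNumber"):
        problems.append("missing serial number")
    if row.get("equipmentType") not in EQUIPMENT_TYPES:
        problems.append(f"unrecognized type: {row.get('equipmentType')!r}")
    if row.get("equipmentStatus") not in EQUIPMENT_STATUSES:
        problems.append(f"unrecognized status: {row.get('equipmentStatus')!r}")
    return problems
```

Running such a check locally before Upload Equipment CSV helps catch typos that would otherwise surface during Preview Changes.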

To sync your equipment:

  1. Click the Download Equipment CSV button to get the current list of all equipment and metadata in your organization. If you don’t have any metadata yet, the CSV will provide the column headers.
  2. Conduct the edits or changes to your CSV.
  3. Click on Upload Equipment CSV.
  4. This will take you to the Upload CSV form; choose your local CSV file and click Preview Changes – this will allow you to preview the changes you’ll be making to your equipment data. If there are no differences, a prompt indicating No changes detected in this file will appear.
  5. To accept the modifications, scroll down and click Apply Changes.

Remember, equipment can be added when you add visit metadata as well.

2.5 Media Management

Projects are the major entry point for your media in WildTrax. But the information for each recording and image set is also saved and becomes accessible in the organization in the Recordings and Image Sets tabs. Since projects belong to organizations, administrators can benefit from:

  • Aggregating media to encapsulate the organization’s total survey effort
  • Storing and managing the media to summarize its use in the system
  • Re-utilizing media for other purposes 

An image set in WildTrax is defined as all the images taken at a location between two visits. A recording is a single audio file.

You’ll immediately notice that the summaries for each media tab in the organization are different. Why is this? A few reasons:

  • Cameras can be set to motion capture and/or time-lapse, i.e., record the environment at set times
  • ARUs are set to record at predetermined times or intervals, or can record continuously

A single image is ~5 MB. A single uncompressed audio recording (i.e., CD quality) around ten minutes in duration is usually ~100 MB — about 20 times the size of an image. Per media unit, however (one minute of audio vs. one image), audio is only about twice as large.

With cameras, even though a subset of images may be tagged by a user, all images are uploaded to WildTrax as an image set.

With ARUs, the amount of data collected at a single location typically exceeds what a human can process for a general community census. Therefore, a subset of recordings is typically chosen instead.

You can click on the drop-down arrow for any row of data in either the recordings or image sets tab to display overview information for each media type. Clicking on the project name will take you directly to the camera or ARU task.


Audio data management

Clicking on the Manage button in the recordings tab will allow you to select Download Recordings from the dropdown. The CSV file contains the following metadata:

  • Organization: the organization the recording belongs to
  • Location: the name of the location where the recording was collected
  • Latitude: a numeric value with a maximum of 10 digits indicating the location’s latitude (in decimal degrees e.g., 54.11910100)
  • Longitude: a numeric value with a maximum of 10 digits indicating the location’s longitude (in decimal degrees e.g., -118.1728210)
  • Recording date: date and time of the recording (YYYY-MM-DD HH:MM:SS)
  • Length: length of the recording in seconds
  • Landscape features: a comma-delimited list of land features surrounding the location where the recording was collected (e.g. beaver dam)
  • Location visit images: whether or not the location has any visit images, i.e. photos of the area around the site.
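One practical use of the downloaded recording metadata is summarizing survey effort. This sketch sums recording length per location from the CSV text; the lowercase column names ('location', 'length') are assumptions — match them to the headers in your actual download.

```python
import csv
import io
from collections import defaultdict

def recording_effort_by_location(csv_text):
    """Sum recording length (seconds) per location from a
    Download Recordings CSV. Column names are assumed, not official."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["location"]] += float(row["length"])
    return dict(totals)
```

A summary like this can help you decide which locations still need tasks generated.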

You can use the metadata of the recordings to begin generating processing tasks in the system. Uploading recordings through a project will also give you the option to easily generate tasks as you upload.

  1. Go to the Recordings tab and click Manage, then Download Recordings. This will generate a CSV list of all of your organization’s recordings, with the fields mentioned above.
  2. Create or go to an ARU project.
  3. Go to Manage and click Download Tasks or Download Task Template.
  4. Fill in the CSV with the following fields to generate tasks; at a minimum, all of these fields must be filled in. You can also generate tasks for the same recording at different lengths, for different observers, and for different method types. See the Acoustic Projects and Acoustic Tagging Methods sections for more information.
    • Location
    • Date
    • Method
    • Task Length
    • Observer (if no observer is assigned, WildTrax will use “Not Assigned” by default)
    • Status (if no status is assigned, WildTrax will use “New” by default)

  5. Return to the project, click Manage followed by Upload Task Template. WildTrax will quality check the CSV before final upload into the project.

Upload Tasks
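The steps above can be sketched as a small helper that turns recording metadata into task-template rows, applying the same defaults WildTrax uses when observer or status are left blank. The field names are assumptions inferred from the list in step 4.

```python
def build_task_rows(recordings, method, task_length):
    """Create task-template rows from recording metadata dictionaries.

    Field names ('location', 'recordingDate', etc.) are assumed — confirm
    them against the Download Task Template CSV. WildTrax defaults
    observer to 'Not Assigned' and status to 'New'; the same defaults are
    applied here.
    """
    rows = []
    for rec in recordings:
        rows.append({
            "location": rec["location"],
            "date": rec["recordingDate"],
            "method": method,
            "taskLength": task_length,
            "observer": rec.get("observer", "Not Assigned"),
            "status": rec.get("status", "New"),
        })
    return rows
```

Because a task is a unique combination of user, method and recording, calling this helper twice with different methods or observers yields distinct, valid tasks for the same recordings.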

Image data management

You can access image set metadata from the image sets tab. Note that this information differs from what is shown in the project’s image sets tab.

Here you will find summaries of the following information:

  • Location: the name of the location
  • Image set start date/time: the date and time of the first image collected in the image set (in the format YYYY-MM-DD HH:MM:SS)
  • End date: the date and time of the last image collected in the image set (in the format YYYY-MM-DD HH:MM:SS)
  • Total image count: the total number of images
  • Motion image count: the total number of images where the camera was triggered due to heat or motion (i.e., Trigger mode = “Motion Detection” or “M”) in an image set
  • Task count: the total number of tasks
  • Details drop-down: clicking on the drop-down arrow will show the projects the image set is associated with, the observer who tagged the task, the number of unique species detected, and the series gap used in the project.

Edit my Organizations screen

2.6 Map

The map shows all locations that have spatial metadata for your organization. Buffered locations have halos around them whereas true locations are fixed points.

map with buffers

You can zoom in to see individual locations. Clicking on a single location will display the location name and the size of the buffer.

The map buffer editing screen

Chapter 3 Projects

elbow sheep

This chapter will walk you through projects.

3.1 Project management

Projects are purposeful groupings of an organization’s media and metadata to answer specific questions or implement a study design. Projects use one of three sensor types: ARUs, cameras or point counts. For each sensor type, a collection of tasks (ARU), image sets (camera) or surveys (point count) is processed or uploaded in order to retrieve species data.

WildTrax requires at least one administrator per project. Project administrators manage the upload of media, user management and assignments, auditing and species verification, and project publication. Organization administrators create the project and assign a project administrator to manage it. These administrators also add read-only members or share location reports with other WildTrax users.

The general life cycle of a project involves:

  1. Creating a project 
  2. Adding users to the project 
  3. Uploading media or data depending on the sensor
  4. Creating or uploading tasks or surveys
  5. Creating (processing) or uploading tags or observations
  6. Verifying and quality controlling tags
  7. Publishing the project (All sensors)

Each processing interface and project dashboard in WildTrax is customized to the sensor type. Depending on the type of media or data you have, you would create a project in the appropriate sensor. Here are some basic steps on how to get started creating a project.

  • Create or join an organization. You must create an organization or become an administrator of an organization before creating a project. The Add Project buttons won’t appear on the Project dashboard until this is complete.
  • Go to the top ribbon and select My Data and then My Projects
  • Select which sensor you’d like to create a project for. You can jump to the corresponding sections to find out more on how to create and administer each type of project:

sensor types

You may need to use environmental sensors across a number of years or for specific questions related to study design. WildTrax caters to this need by letting you create an unlimited number of projects at any time. However, there are times when these multi-year or question-specific projects should be unified to collate the data and make it more cohesive. WildTrax gives you this ability with the project merging tool.
Clicking on the drop-down arrow beside the project name on the dashboard will also show the Merge Project button. This function allows you to merge a source project to a target project.
merge project
merge project 2

To delete a project, click the drop down arrow next to the editing pencil in the project list, and click the Delete Project button.

Only projects with no image sets or tasks can be deleted. If you already uploaded data to your project, you will need to delete them prior to deleting your project.

The User Agreement tab supports additional information for your project. If you need to append documentation to the project for data sharing agreements you can click on the Add an Agreement button and upload it to the project. A form will also appear detailing more information about the User Agreement with the following fields:

  • Agreement Name
  • Start and End Dates
  • Data constraints: Select from one of three spatial extent options
  • Signee information (full name, email and organization)
  • Signed Date

user agreement

The Map displays all the locations in the project that contain spatial metadata. The visibility or buffer applied to a location will also be reflected on the map, depending on your project or organization membership.

Map spatial data screen

When new media is uploaded into WildTrax, new locations will be generated in the organization. Organization administrators and project administrators should work in tandem to ensure:

  • Spatial coordinates (latitude and longitude) are entered for locations so that taggers and other users can see them on a map.
  • Visit photos are uploaded to visits so that project taggers can see the habitat characteristics of the location when they’re tagging, using the Location Photos tab.
  • The visit autogenerator is run to check visit metadata against the media.
  • Faulty equipment identified from malfunctioning media is repaired.

 

3.2 ARU projects

The goal of an ARU project is to upload media, process tasks, verify tags and publish the results. Projects belong to organizations and use organization media in order to generate and report on a certain set of results designed by the project administrators.

Clicking on the ARU sensor on the main project dashboard will show which projects you have access to. The list of projects you see is determined by your organization membership, project membership and the status of the project. You can filter and sort by some of the project attributes to find what you’re looking for:

  • Organization
  • Project name
  • Project status
  • Due date
  • Total number of tasks
  • Total number of completed tasks
ARU Project Dashboard screen

Clicking on the pencil brings you to the Project Settings window. You can also click on the down arrow to find out more details about the project or to Request Access. The View only my projects toggle filters the default view to only projects to which you are a member (either administrator or read-only).

3.2.1 Create an ARU project

Click on Create an ARU Project. This will open the Project Settings form allowing you to begin adding information to the project. The Create/Update ARU Project contains the following fields; those marked with an * are mandatory before being able to create the project:

Update ARU Project Screen
  • Project: A short name to identify the project.
  • Organization*: Drop-down list of all existing organizations of which you are an admin. This field allows you to group all of your projects together in the database. Once your project is saved, you will be unable to change the organization to which it belongs.
  • Data Source: A field used to specify the original source of the data; this field should be entered if you are importing data from an outside source (e.g., previously tagged data).
  • Purpose and methods: A short description of the methods used to create this project and goals it hopes to achieve.
  • Results summary: A short description of the results your project found.
  • ARU project options box:
    • Default minimum frequency (Hz) and default maximum frequency (Hz) of the spectrogram that will be displayed
    • Default X-Scale: Length displayed in seconds. Corresponds roughly to 0.25x (60 seconds) to 10x (2 seconds) in length displayed
    • Default Y-pixels: Height of the spectrogram in pixels 
    • Default monochrome: Default to monochrome colour palette toggle. Alternative is colour
    • Default light mode: Default to light mode toggle; the alternative is dark-mode spectrograms

Once your project settings are completed, you can click Update and move on to the next two steps of the project creation process: adding species and users.

The Assign Species tab contains a set of tools that let you choose which species can be tagged in your project. You may want to limit the species list in order to control the amount of tagging error (incorrect species identification).

To add species to your project, click on a species in the Not Included column (the row will turn blue once selected) and click the arrows to add it to the project’s Included list. You can also select or unselect all from either column. Once a species has been tagged in the project, the Assign Species tab will automatically lock its row, preventing you from removing a species that has already been tagged.

assign species tab

You can also add an entire group of species using the preset groups found in the Apply Preset button. Click Apply Preset and then search and select which groups of species you’d like to use. You can select one or many groups at a time. Once you’re done with your selection, click Submit and the species will be added to the project.

show presets

If you have a preset group or species you would like to be available, email info@wildtrax.ca with a request.

Once you’ve created and saved your project details, you’ll be able to manage the users who are part of the project. Select the tab to add and view current administrator and read-only project members.

Project administrators have both read and write access to the project details and settings. They can add users to the project and can assign taggers to tasks.

Members with project read access can view project details, settings, and all tasks within the project. However, they can only edit tasks assigned to them.

Once you have created your project and close the window, the new project will be visible in the ARU project list. Click on the project title to enter the project page or click on the pencil icon to edit project details, settings or users.

Users are granted their respective membership levels to an organization’s projects. You don’t need to add an organization member to a project unless you need to grant them administrator privileges and they only have organization read-only access.

A recording is the raw media or audio file that is uploaded to an ARU project in WildTrax.

WildTrax supports four audio file types: wac, wav, flac and mp3.

  • wac is a proprietary, lossless compressed file format developed by Wildlife Acoustics
  • wav is the standard, ubiquitous uncompressed audio file format
  • mp3 is a lossy compressed audio file format; it works by reducing the accuracy of certain sound components and eliminating others
  • flac is a lossless compressed audio file format

A recording uploaded to WildTrax must also contain two important pieces of information: a location prefix and a timestamp. This allows WildTrax to generate or append the media to the locations stored in the organization as well as tell you when the recording took place.
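As an illustration, a parser for one common ARU naming convention (location_YYYYMMDD_HHMMSS.extension) can extract both pieces of information and confirm the file type is supported. The naming pattern itself is an assumption for the sake of the example — confirm the convention your recorders actually produce.

```python
import re
from datetime import datetime

# The four audio formats supported by WildTrax.
SUPPORTED_EXTENSIONS = {".wac", ".wav", ".flac", ".mp3"}

# A common ARU naming convention: <location>_<YYYYMMDD>_<HHMMSS>.<ext>.
# This pattern is an assumption for illustration, not a WildTrax rule.
FILENAME_RE = re.compile(
    r"^(?P<location>.+)_(?P<date>\d{8})_(?P<time>\d{6})\.(?P<ext>\w+)$"
)

def parse_recording_name(filename):
    """Extract the location prefix and timestamp from a recording file name."""
    m = FILENAME_RE.match(filename)
    if not m:
        raise ValueError(f"cannot parse location/timestamp from {filename!r}")
    if "." + m["ext"].lower() not in SUPPORTED_EXTENSIONS:
        raise ValueError(f"unsupported audio format: .{m['ext']}")
    timestamp = datetime.strptime(m["date"] + m["time"], "%Y%m%d%H%M%S")
    return m["location"], timestamp
```

Checking file names before upload saves a failed batch later, since WildTrax relies on the prefix and timestamp to append media to the right location.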

When data are uploaded, WildTrax converts and stores the audio as flac and the spectrograms are generated using the raw audio file format.

Once you have the recordings selected that you wish to upload for processing, go to the desired project and click Manage and then Upload Recording(s) to Project. The recording upload window will open with settings before you upload your files.

Upload to ARU Project Screenshot

  • Including subdirectories to scan if your media is layered in a hierarchical structure.
  • Removing leading zeros after delimiters helps to clean up the location naming to a generic WildTrax standard. Using this function is not obligatory, but it is recommended
  • Recordings uploaded to WildTrax through a project automatically have a task generated for them. This allows a spectrogram to be generated and the task added to the project dashboard. You can upload recordings to the organization if you don’t want to generate tasks yet—see media management for more information.
  • The default length and method type will be applied to all the recordings you’re uploading—you can select a different method or length later on if needed
  • Marking recordings as triggered to differentiate them from schedule-based recordings
  • Optionally pre-scanning recordings to show sample rate and length during the upload process

Click Choose a Folder to Upload and select the directory that contains your recordings. WildTrax will identify how many files it has scanned which will then allow you to go to the next step.

Upload coordinates fields example

The next part of the process is entering spatial coordinates for new locations, or for existing locations that are missing them. This step in the upload process is optional and you can enter this information later using the Upload and Download Location tools. Note that:

  • If the location doesn’t exist yet in the organization, WildTrax will create it
  • If the recording is shorter than the method you indicated by more than 3 seconds (e.g., 56 seconds for a 60-second method), you’ll be prompted with an error, since a recording shorter than the desired method length should not be uploaded to WildTrax
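The length rule above can be expressed as a simple check. The 3-second tolerance is inferred from the example given (56 seconds failing a 60-second method) and may differ from WildTrax’s exact cutoff.

```python
def check_task_length(recording_seconds, method_seconds, tolerance=3):
    """Return True if a recording is long enough for the chosen method.

    A recording more than `tolerance` seconds shorter than the method
    length is rejected (tolerance inferred from the guide's example).
    """
    return recording_seconds >= method_seconds - tolerance
```

For example, a 58-second file passes a 60-second method, but a 56-second file does not.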

When you’ve completed reviewing the form, click on Upload Files in Queue to initiate the upload. It is strongly recommended to connect via ethernet if you want to stabilize and ensure internet connectivity while you upload media.

Confirm Files to Upload Screen

3.2.2 Location management

Administrators can benefit from managing only the locations from the media in the project. This differs from location management at the organization level, where locations from all projects can be managed.

  • Click the Manage button and go to Download Location CSV to obtain the current list of locations in the project. If you don’t have any locations yet, a template will be provided.
  • Make the changes or additions you’d like in the CSV and re-upload it with your modifications using Upload Location CSV

3.2.3 Task management

When recordings are uploaded to a project, new tasks are generated from the media and are added to the project dashboard under My Tasks.

A task is a unique combination of a user, a processing method and a recording. This flexibility allows you to generate tasks using the same recording but with multiple users or processing methods.

Go to the project dashboard and click My Tasks. The form will display the list of tasks currently in view depending on the View only my tasks toggle—switching the toggle to red allows you to filter the list to only your assigned tasks. The metadata for each one includes the location name, the date and time of the recording, the method it was assigned, the status of the task and the assigned user.

If you didn’t generate tasks when you uploaded recordings to the project, or, if you’re using an organization’s recordings to create tasks, you can also use the Upload Tasks tool under the Manage button to automatically generate the tasks you need. Use the Download Tasks or Download CSV Template button and fill in the following fields:

  • Location: The name of the location.
  • Recording date/time: The date and time that an ARU recording was collected (in the format “YYYY-MM-DD HH:MM:SS”).
  • Task method: The method used to process the recording (SPM = 1 tag per individual per species per minute, SPT = 1 tag per individual per species per task, NONE = other methods).
  • Task duration (seconds): The duration the recording was processed during the task or the duration of the point count.
  • Observer ID: The unique numeric primary key of the observer that processed the image or recording, or that conducted the point count.
  • ARU task status: The processing state of the ARU project task (recording) (i.e., “New,” “In progress,” “Transcribed,” “Bad weather,” or “Malfunction”).
  • Rain: The average rain on the task
  • Wind: The average wind on the task
  • Industry noise: The average chronic industrial noise on the task
  • Other noise: Other noise averages on the task
  • Audio quality: The audio quality of the task (recording).

If you have hundreds or thousands of tasks without an assigned user, you can use the Randomly Assign button and select users from the dropdown list on the left to assign tasks to. It will randomly, and equally, assign unassigned tasks to the selected user(s).
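A sketch of what “randomly, and equally” implies: shuffle the unassigned tasks, then deal them out round-robin to the selected users. This mirrors the described behaviour but is not WildTrax’s actual implementation.

```python
import random

def randomly_assign(task_ids, users, seed=None):
    """Randomly and evenly distribute unassigned tasks among users.

    A sketch of the Randomly Assign behaviour described above: shuffling
    gives randomness, round-robin dealing gives equal counts (within one).
    """
    rng = random.Random(seed)  # seed only for reproducible examples
    shuffled = list(task_ids)
    rng.shuffle(shuffled)
    assignment = {u: [] for u in users}
    for i, task in enumerate(shuffled):
        assignment[users[i % len(users)]].append(task)
    return assignment
```

With 10 tasks and 2 users, each user receives exactly 5 tasks, though which tasks each gets is random.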

If you’re a project administrator, you can also delete tasks from the project dashboard by clicking the drop-down arrow beside the desired task and clicking Delete Task. Follow the prompts and warnings accordingly as task deletion is permanent and irreversible.

3.2.4 Tag management

The “Manage” button also contains functions that allow you to upload and download tags to and from a project. This is useful if you’ve tagged acoustic data in another database and wish to sync it to the WildTrax standard, or if you’re looking for a downloadable record of the tags and their metadata. Below are the CSV fields required for upload; if your project doesn’t contain any tags yet, you can download a template first by clicking “Manage”.

Before you can upload tags into the system, you’ll need to first upload the recordings and create tasks in order to have the proper metadata needed to populate the tag CSVs. You cannot generate tags for tasks or recordings that don’t exist.

  • Location: The name of the location
  • Recording date/time: The date and time that an ARU recording was collected (in the format YYYY-MM-DD HH:MM:SS)
  • Task method: The method used to process the recording (SPM = 1 tag per individual per species per minute, SPT = 1 tag per individual per species per task, NONE = other methods).
  • Observer ID: The unique numeric primary key of the observer that processed the image or recording, or that conducted the point count.
  • Species common name: The common name of the species that was observed in the tag or point count.
  • Individual: This field corresponds to uniquely identified individuals of a species on the recording. In other words, if you hear two Ovenbirds (OVEN), the first individual would be tagged as “1” and the second OVEN as “2”. WildTrax has been designed to automatically populate the Individual field as new data is being entered. Therefore, if you forget to add in the individual number, the system will update it for you automatically in sequence.
  • Vocalization: The vocalization type of the tag (i.e., “Song,” “Call,” or “Non-vocal signal”).
  • Start time (seconds): The start time (in seconds) of the tag within the recording.
  • Tag duration (seconds): The duration of the tag (in seconds).
  • Minimum frequency (Hz): The minimum frequency (in Hertz) of the tag.
  • Maximum frequency (Hz): The maximum frequency (in Hertz) of the tag.
tag management
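The automatic Individual numbering described above can be mimicked locally when preparing a tag CSV: blank Individual values get the next free number per species. This is a sketch of the described behaviour, not WildTrax’s actual rule set, and the dictionary keys are hypothetical field names.

```python
from collections import defaultdict

def fill_individual_numbers(tags):
    """Fill blank 'individual' fields per species in sequence.

    Mimics the auto-population described above: numbers already entered
    by the user are respected, and blanks receive the next free number
    for that species.
    """
    next_id = defaultdict(int)
    seen = defaultdict(set)
    # First pass: record individual numbers the user already entered.
    for tag in tags:
        if tag.get("individual"):
            seen[tag["speciesCommonName"]].add(int(tag["individual"]))
    # Second pass: fill blanks with the next unused number per species.
    for tag in tags:
        species = tag["speciesCommonName"]
        if not tag.get("individual"):
            n = next_id[species] + 1
            while n in seen[species]:
                n += 1
            tag["individual"] = str(n)
            seen[species].add(n)
            next_id[species] = n
    return tags
```

So for three Ovenbird tags where only one carries “1”, the other two are filled in with the remaining numbers in sequence.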

Click on the Manage button and select Upload Tags. In the new pop-up window, select Choose a CSV file to upload to select your tag data. When the file is uploaded, the “QA Tag Data” button will turn green. Click it and WildTrax will ensure that the data you’re uploading fits the WildTrax standard. This may take a few minutes depending on how large your file is.

demo org

3.3 Camera projects

As a reminder, projects are purposeful groupings of an organization’s media and metadata to answer specific questions or implement a study design.

The goal of a camera project is to upload image sets (i.e., tasks), process tasks (i.e., tag species and individuals within images), verify tags, and publish the results. The main interface you will interact with when processing image sets is the camera project dashboard.

Camera project dashboard

You can access the camera project dashboard by first clicking on My Data, followed by My Projects.

my projects

Clicking on the camera sensor on the main project dashboard will bring you to a list of camera projects you have access to. The list of projects you see is determined by your organization membership, project membership, and the status of the project.

camera project

The following project metadata and functionalities can be viewed or accessed on the camera project dashboard:

  • Organization: the name of the organization.
  • Project: the name of the project.
  • Project status: the status of the project (i.e., “Active,” “Test Only,” “Published – Private,” “Published – Public,” “Published – Map + Report Only,” or “Published – Map Only”) (see section 3.5 – Publishing projects for more information).
  • Due date: the anticipated processing completion date. Taggers can use this as a guideline for when their image sets are due.
  • Task count: the total number of tasks.
  • Completed task count: the total number of tasks (recordings or image sets) completely tagged. Note that the task status for an image set can be “Tagging Complete” even if species verification is not complete.
  • The “View only my projects” toggle: filters the default view to only projects to which you are a member (either an administrator or read-only member).
  • Edit project settings: clicking on the pencil icon brings you to the camera project settings window.
  • Details drop-down: you can also click on the drop-down arrow to find out more details or to request access to the project data.

3.3.1 Create a camera project

There are three general steps required when creating a new camera project:

  1. Create a new camera project and select camera project settings
  2. Assign project species
  3. Assign project users

To create a new camera project:

Click the “Create a Camera Project” button in the project dashboard under the camera tab. The camera project settings window will appear. As a reminder, you must be an organization administrator to create a project.

camera settings

Camera project settings

The following information can be added to new projects:

  • Project: the full name used to identify the project as defined by the admin
  • Organization: the name of the organization. This is a drop-down list of all existing organizations of which you are an admin. This field allows you to group all of your projects together in the database. Once your project is saved, you will be unable to change the organization to which it belongs
  • Due date: the anticipated processing completion date. Taggers can use this as a guideline as to when their image sets are due.
  • Data source: a field used to specify the original source of the data; this field should be entered if you are importing data from an outside source (e.g. previously tagged data).
  • Purpose and methods: an outline of the purpose, goals, objectives, and methods of your project.
  • Results summary: a summary of the data and results found as the outcomes of the project (if applicable).
  • Tagging field options: settings that allow users to customize tagging forms to include the fields most relevant to their project. See section 5.2.3 – Tag types for more information.
camera attributes

Camera project pre-processing settings are settings that allow users to manage the pre-processing tools that will run on their image sets. By default, no selection is made. The following settings can be enabled:

  • Auto-tagger: settings defined at the project level to indicate the classifier categories for which the Auto-tagger (an AI tool that uses a combination of Microsoft’s MegaDetector,1 MegaClassifier v0.1,1* and a WildTrax AI) will be automatically applied when new image sets are uploaded to a project, and whether the selected categories should be automatically hidden from view in the image tagging page. Options currently include NONE (i.e., false-fires), Human, Vehicle, STAFF/SETUP, and MegaClassifier v0.1.1* Note that human image blurring is a setting found within the organization’s settings.
    • STAFF/SETUP tagger: an auto-tagger setting that automatically tags images of humans occurring within the first or last series as STAFF/SETUP (using a 5-minute series gap), unless there are <2 images in the 2-hour period, <2 images auto-tagged as human, or the STAFF/SETUP tag has already been applied.
    • MegaClassifier v0.1:1* an auto-tagger setting used to automatically apply the preliminary (testing) version of MegaClassifier1 to image sets.
  • Series gap: the default time (in seconds) between consecutive images that defines an independent detection event (or “series”). This setting affects how images are displayed in the tagging “series view.” For example, suppose the series gap is 120 seconds (the default). Once an animal or object triggers the camera, all images that occur within 120 seconds of the previous image are considered part of the same series (this continues until >120 seconds pass without a trigger event). Users can modify the series gap at any time.
  • Show time-lapse images: a setting used to indicate whether time-lapse images will be shown by default; this can be toggled on or off within image sets while tagging.
  • Default bounding box display threshold: the default minimum classifier confidence level for which bounding boxes will be displayed for the selected detection categories. 30-50% is recommended.
  • Project status: the status of the project (i.e., “Active,” “Test Only,” “Published – Private,” “Published – Public,” “Published – Map + Report Only,” or “Published – Map Only”). See more in project publication.
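The series-gap rule described above can be sketched in a few lines of Python (an illustrative reimplementation of the described behaviour, not WildTrax's actual code; the timestamps are examples only):

```python
from datetime import datetime, timedelta

def group_into_series(timestamps, gap_seconds=120):
    """Group image timestamps into independent detection events ("series").

    A new series starts whenever more than `gap_seconds` elapse since the
    previous image (illustrative sketch of the rule described above).
    """
    series = []
    for ts in sorted(timestamps):
        if series and (ts - series[-1][-1]) <= timedelta(seconds=gap_seconds):
            series[-1].append(ts)  # within the gap: same series
        else:
            series.append([ts])    # gap exceeded (or first image): new series
    return series

# Example: three triggers 60 s apart, then one more 10 minutes later
base = datetime(2023, 6, 1, 12, 0, 0)
stamps = [base, base + timedelta(seconds=60), base + timedelta(seconds=120),
          base + timedelta(minutes=12)]
print(len(group_into_series(stamps)))  # -> 2 (the first three images form one series)
```

With the default 120-second gap, shortening `gap_seconds` would split bursts of images into more, smaller series, which is why changing this setting changes how the tagging "series view" groups images.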

1 Beery S, Morris D. & Yang S (2019). Efficient pipeline for camera trap image review. arXiv:1907.06772. Retrieved from: http://github.com/ecologize/CameraTraps. https://doi.org/10.48550/arXiv.1907.06772

* Note: we are in the process of assessing the accuracy of MegaClassifier v0.1.

pre processing settings

Assign project users

Once the camera project settings have been saved, you will be able to manage project users. Select the “User Assignment” tab to view, add, or edit project administrators and read-only members.

image 22

Project administrators have read and write access to the project details and settings. They can add users to the project and assign taggers to image sets.

Project read-only members can view project details, settings, and all tasks within the project. However, they can only edit tasks assigned to them.

Once you have created your project and closed the window, the new project will be visible in the camera project list. Click on the project title to enter the project page, or click the pencil icon to edit project details, settings, or users.

Users are granted their respective membership levels to an organization’s projects. You do not need to add an organization member to a project unless they only have organization read-only access and you need to grant them project administrator privileges.

Assign project species

Once the project users have been saved, you will be able to add/manage project species. The Assign Species tab contains a set of tools that allow you to choose which species you want to tag in your project. You may want to do this to limit tagging errors (incorrect species identification) or to limit which species you want to view in the drop-down list. Note that once you’re in the tagging view, you will not see species that have not been assigned to the project.

assign project species

To add species to your project:

  1. Click on the “Assign species” tab to view, add, or edit project species (if not automatically prompted).
  2. Click on a species in the Not Included column (left side); the row will turn blue once selected.
  3. You can either:
    a) Click the right arrow to add it to the project’s included list. You can also select and de-select all species in either column, and remove a species from the included list using the left arrow. However, once a species has been tagged in the project, the Assign Species interface automatically locks its row, preventing you from removing that species from the project.
    b) Alternatively, add an entire preset group of species using the Apply Preset button. Click Apply Preset, then search for and select the group(s) of species you’d like to use. You can select one or many groups at a time.
  4. Once done with your selection, click “Confirm” and the species will be added to the project.
preset species

If you have a preset group of species you would like to be available, email info@wildtrax.ca with a request.

3.3.2 Upload image data

Image data can be uploaded as image sets or media-less datasets (tabular data). This section describes how to upload images as image sets; please refer to the section on synchronizing tags/uploading media-less data for information on how to upload tagged data without the associated images into WildTrax.

Within each project page, tasks can be created by uploading image sets collected from remote cameras as single image sets or in “batches” of multiple image sets (batch uploads).

Before you begin

Since web browsers are designed to protect users’ data and file systems, there are limits to the size of folders that can be uploaded. The number of images you can expect to upload without error is also variable and dependent on your computer specifications. Generally, if you wish to upload 1 million images, your computer must have 5GB of free memory. If your computer does not have enough memory, WildTrax may crash mid-way through the upload process. If you upload fewer than 100,000 images in one batch, you do not need to be concerned about available memory.

Folder and file names

Before uploading, image sets should be organized into folders on a computer or hard drive, each with a folder name that corresponds to a unique location where the image set was collected. The folder name will appear in the task list once uploading is complete. If multiple deployment periods exist for a location, the user may wish to retain this hierarchy by placing these multiple deployments (image sets) into an overarching folder. What matters for the upload process is that image sets are organized into high-level folders based on how you wish to see the data structured for processing. If uploading more than one image set (“batch uploading”), name the folders appropriately and select the correct “Level in upload folder structure” corresponding to your data structure.

  • Location folder names (i.e., location names) should not have spaces
  • Leading zeroes are not recommended
  • Special characters that are supported are: - , _ & # @
  • Special characters that are not supported are periods, spaces, and slashes (/)
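The naming rules above can be expressed as a small checker (an illustrative sketch based on the listed rules, not an official WildTrax validator):

```python
import re

# Letters, digits, and the supported special characters - , _ & # @ only;
# spaces, periods, and slashes are rejected (per the rules above).
ALLOWED = re.compile(r'^[A-Za-z0-9\-,_&#@]+$')

def check_location_name(name):
    """Return a list of problems with a proposed location folder name."""
    problems = []
    if not ALLOWED.match(name):
        problems.append("contains an unsupported character (space, period, slash, ...)")
    if re.match(r'^0\d', name):  # leading zeroes are discouraged, not forbidden
        problems.append("leading zeroes are not recommended")
    return problems

print(check_location_name("SITE-01_A"))  # -> [] (valid)
print(check_location_name("site 01"))    # flags the space
```

Running such a check over your folder names before uploading can catch naming problems while it is still cheap to rename folders on disk.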

Image file names do not need to be changed because they are sorted by date/time stamps in WildTrax and are attributed to the location, which is specified by the folder name you select in the hierarchy.

Image upload process

To upload image sets:

  1. Click on Manage and then Upload Images to Project.
  2. Select the appropriate number for Level in upload folder structure based on whether you are uploading a single image set or multiple image sets. Level in upload folder structure corresponds with the level of the location folders relative to the highest-level folder you would like to upload (i.e., the “batch folder”). Count from the top folder (the batch folder) to the bottom level (the location folders). Folder names at the bottom level will be used as the location names. Images in folders below this level will be split into separate image sets.
    • If you are uploading a single image set, leave the “Level in upload folder structure” set to 1, click “Choose a folder to upload”, and select the folder corresponding with the location name.
    • If you are uploading multiple image sets, the images are likely nested within location folders, which might in turn be nested in another folder, such as a year. Adjust the “Level in upload folder structure” to correspond with the number of levels that the location folders are nested below the batch folder. For example, if locations are nested within a year folder, change the level to 2, click “Choose a folder to upload”, and select the folder that is one level up from the folders containing your image sets.
  3. Click “Upload” and confirm that you would like to upload images to WildTrax. Before committing to the upload, ensure that the selected folder is correct by checking the locations that are listed to be uploaded and the number of images associated with each location.
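The level counting can be illustrated with a short sketch (a hypothetical helper operating on a local folder tree; `location_folders` is not part of WildTrax):

```python
import tempfile
from pathlib import Path

def location_folders(selected, level):
    """Folder names that would become location names for a given
    "Level in upload folder structure" (illustrative only).
    Level 1: the selected folder itself is the location;
    level N: location folders sit N-1 levels below the selected folder."""
    selected = Path(selected)
    if level == 1:
        return [selected.name]
    pattern = "/".join(["*"] * (level - 1))
    return sorted(p.name for p in selected.glob(pattern) if p.is_dir())

# Example: locations nested one level below a year folder -> level 2
root = Path(tempfile.mkdtemp())
for loc in ("LOC-A", "LOC-B"):
    (root / "2022" / loc).mkdir(parents=True)
print(location_folders(root / "2022", 2))  # -> ['LOC-A', 'LOC-B']
```

Listing the folder names your chosen level would produce, before uploading, is a quick way to confirm that the locations WildTrax will create match your expectations.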
batch upload 2

The Skip Existing Image Sets setting can be applied to speed up the overall upload time by skipping locations (folder names) that already exist. Use this setting if a number of locations from the upload folder have already been uploaded to the project and you are certain that those image sets were completely uploaded.

After clicking Upload, files will upload in the background. A green check mark will appear next to successfully uploaded folders. If any images failed to upload, you can re-upload them by clicking “Upload Failed Images.”

When uploading is complete, the newly uploaded image sets will appear in the task list under the “Location” column. At this stage, the status will either be “Ready for Tagging” or “Processing,” depending on your project settings (refer to “Camera task (image set) status” for more information on task statuses). The list of tasks can be sorted and filtered by location, image set start date, motion image count, tagged image count, task status, or the observer assigned.

If an image set does not move from “Preparing” to “Processing” within a reasonable timeframe, you can request that the image set be pushed through processing by using the “Tag image sets with AI” button, which can be accessed through the “Manage” button. Note that the “Tag image sets with AI” button can only be used for image sets in the “Preparing” status; image sets in either the “Processing” or “Uploading” status cannot be sent for AI processing using this button.

CAM manage tag image set AI 2023 11 04

3.3.3 Manage a camera project

Location management

Administrators can manage only the locations associated with the media in the project; this differs from location management at the organization level, where locations from all projects can be managed.

  1. Click Manage and go to “Download Location CSV” to obtain the current list of locations in the project. If you don’t have any locations yet, a template will be provided.
  2. Make the changes or additions you’d like in the CSV and re-upload the CSV with the modifications using “Upload Location CSV”.

Task management

A camera task is a unique combination of an image set and an observer, in this case a WildTrax project member.

When image sets are uploaded to a project, new “tasks” are generated from the image sets and added to the project dashboard.

The tasks listed on the project dashboard will depend on the “View only my tasks” toggle. Switching the toggle to red filters the default view to only tasks assigned to you.

The metadata for each task includes the location, image set start date, motion image count, tagged image count, task status, and the assigned observer (refer to the section on Image set metadata for more details).

Camera task (image set) status

The camera task (image set) status is the processing state of the camera project task (image set) (i.e., Uploading, Preparing, Processing, Ready for Tagging, or Tagging Complete). Statuses differ by sensor type.

  • Uploading: the image set is being uploaded to the server
  • Processing: if applicable, the auto-tagger is being applied to the image set, returning results of NONE (i.e., false-fires), Vehicles, Humans, animals, or species within MegaClassifier v0.1
  • Preparing: if applicable, the STAFF/SETUP portion of the auto-tagger and human blurring are being applied to the image set, and STAFF/SETUP-tagged deployment and retrieval images are being filtered out
  • Ready for Tagging: the image set is ready to be manually processed; this is the status of the image set after the “Processing” and “Preparing” phases if you selected the Auto-tagger options in your project settings
  • Tagging Complete: all images have been tagged and the camera Field of View review has been completed. Note that a task’s status may be “Tagging Complete” even if species verification has not occurred.

Camera task (image set) status follows the path: Uploading -> Ready for auto-tagger -> Processing -> Preparing -> Ready for Tagging -> Tagging Complete.
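As a sketch, this progression can be modelled as an ordered sequence (illustrative only; which stages actually run depends on the pre-processing options enabled for the project):

```python
# Ordered camera task statuses as described above (illustrative model,
# not WildTrax code; some stages are skipped if pre-processing is disabled).
STATUS_ORDER = [
    "Uploading",
    "Ready for auto-tagger",
    "Processing",
    "Preparing",
    "Ready for Tagging",
    "Tagging Complete",
]

def next_status(current):
    """Return the next status in the pipeline, or None if already complete."""
    i = STATUS_ORDER.index(current)
    return STATUS_ORDER[i + 1] if i + 1 < len(STATUS_ORDER) else None

print(next_status("Processing"))  # -> Preparing
```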

Assign taggers

Taggers can be assigned both randomly and manually by the project or organization administrator(s). If you choose to randomly assign, unassigned tasks are proportionally distributed among your selected taggers. When assigning taggers to tasks, the selection list will be comprised of users that have been added to the project (see Assign project users). The project administrator(s) can also assign themselves as taggers to tasks.

image 85

Delete tasks

You can easily delete tasks regardless of their status. To delete a camera task, click the drop-down arrow next to the tagger assignment pencil in the image set list, and click the “Delete task” button.

delete image

A window will appear with the task details and you will have to confirm the deletion.

Deleting a task will delete the entire image set from the cloud and any associated tags from the database. These actions cannot be undone.

Tag management

Sync tags

The Manage button also contains functions that allow you to upload and download tags to and from a project. This is useful if you have tagged image data in another database and wish to sync it to WildTrax, or if you are looking to download tags and associated metadata. You can also upload media-less data sets (e.g., CSVs with tagged information processed using another software or platform) into WildTrax to centralize your data. This is especially useful if you have concerns with uploading media (e.g., constrained by privacy concerns). To sync data, a template can be downloaded from the Manage button.

Here are the fields that are included in the CSV download when syncing tags or uploading media-less data; fields in bold are editable:

  • Location: the name of the location. 
  • Equipment make: the make (manufacturer) of a particular sensor (e.g., “Wildlife Acoustics”) 
  • Equipment model: the model number of a particular sensor (e.g., “SM2”)
  • Equipment serial number: the serial number of the sensor (e.g., “H600HJ12269118”) 
  • Source file name: the original name of the media file uploaded to WildTrax. 
  • Image date/time: the date and time an image was taken (in the format “YYYY-MM-DD HH:MM:SS”). 
  • Species common name: the common name of the species that was observed in the tag or point count. 
  • Individual count: the number of individuals of a certain species, age, sex, behaviour, etc. (i.e., applies to a specific tag where all other fields remain equal, rather than the image). 
  • Age class: the categorical age class (see codes; choose one) of individual(s) in a tag (when identifiable). 
  • Sex class: the categorical sex class (see codes; choose one) of individual(s) in a tag (when identifiable). 
  • Tag needs review: flags applied when species attributes are unclear and need to be checked. Each individual(s)-level tag in an image is associated with a review tag, so it is clear which tag needs to be reviewed (reported as “TRUE” or “FALSE” when syncing tags). 
  • Tag verification status: whether the tag requires verification (“FALSE”) or has been verified (“TRUE”). 
  • Tag comments: comments entered by the observer to describe any additional details about the tag (ARU or camera) or individual observation (point count). Note: camera projects tags also include “image comments”; ARU projects also include “individual species comments.” 
  • Image exists in WildTrax: whether an image file was uploaded to WildTrax (vs through sync of CSV files without media)
  • Tag ID: the unique numeric primary key of the tag entry
  • Image ID: the unique numeric primary key for the camera image
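A media-less row can be assembled with Python's standard csv module, using the field names listed above (a minimal sketch with made-up values; it covers only a subset of the fields, omitting system-generated ones such as Tag ID, and the authoritative template should always be downloaded from the Manage button):

```python
import csv
import io

# Header names taken from the field list above; treat the exact spelling
# as illustrative and defer to the downloaded WildTrax template.
FIELDS = ["Location", "Equipment make", "Equipment model",
          "Equipment serial number", "Source file name", "Image date/time",
          "Species common name", "Individual count", "Age class", "Sex class",
          "Tag needs review", "Tag verification status", "Tag comments"]

row = {
    "Location": "SITE-01",                       # example location name
    "Image date/time": "2023-06-01 12:00:00",    # YYYY-MM-DD HH:MM:SS
    "Species common name": "White-tailed Deer",
    "Individual count": "2",
    "Tag needs review": "FALSE",
    "Tag verification status": "FALSE",
}

buf = io.StringIO()  # stand-in for a real file opened with newline=""
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({f: row.get(f, "") for f in FIELDS})  # blank unused fields
print(buf.getvalue())
```

Using `DictWriter` with an explicit field list keeps the column order stable even when many optional fields are left blank.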

3.4 Point count projects

This section of the guide provides an overview to help you integrate your listening point data into WildTrax. The Boreal Avian Modeling Project offers services to help you adapt your data to this standard. Contact the Boreal Avian Modeling Project team for more information.

Create a point count project

To create a new project, click the Add Point Count Project button in the project dashboard under the point tab. You must be the administrator of at least one organization within WildTrax in order to create a project. The main settings panel will have general information fields and a description of the project. 

image
  • Project Title: The full name used to identify the project as defined by the admin.
  • Organization: The name of the organization.
  • Year: The year the data was collected.
  • Data Source: A field used to specify the original source of the data; this field should be entered if you are importing data from an outside source (e.g., previously tagged data).
  • Organization description: A short description of the organization.
  • Status: The status of the project (i.e., “Active,” “Test Only,” “Published – Private,” “Published – Public,” “Published – Map+Report Only,” or “Published – Map Only”).

Once you save your project details, you will be able to manage your users. Select the User Assignment tab to add and view current administrator and read-only project members.

image

Uploading point count data

You can upload your point count data directly to a project in WildTrax. Here are the required CSV fields:

  • Location: The name of the location.
  • Point count date/time: The date and time that the point count was conducted (in the format “YYYY-MM-DD HH:MM:SS”).
  • Task method: The method used to process the recording (SPM = 1 tag per individual per species per minute, SPT = 1 tag per individual per species per task, NONE = other methods).
  • Task distance: The distance method used during the point count.
  • Observer ID: The unique numeric primary key of the observer that processed the image or recording, or that conducted the point count.
  • Species common name: The common name of the species that was observed in the tag or point count.
  • Distance detection: The distance at which the individual/species was detected.
  • Detection time: The time(s) within a recording or point count that a tagged individual/species was detected (in the format “HH:MM:SS”).
  • Abundance: The number of individuals; 1-10, TMTT (too many to tag), or CI1, CI2, CI3 for amphibians.
  • Detection seen: Was the observation made visually?
  • Detection heard: Was the observation made acoustically?
  • Task comments: Comments entered by the observer about the task or point count.
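As a sketch of what one survey row might look like under the field list above (illustrative values only; download the template from the Manage button for the authoritative format):

```python
import csv
import io

# Field names taken from the list above; the exact header spelling in the
# official template may differ, so treat these as illustrative.
FIELDS = ["Location", "Point count date/time", "Task method", "Task distance",
          "Observer ID", "Species common name", "Distance detection",
          "Detection time", "Abundance", "Detection seen", "Detection heard",
          "Task comments"]

row = {
    "Location": "PC-STATION-04",                 # example location name
    "Point count date/time": "2022-06-15 05:30:00",
    "Task method": "SPT",        # 1 tag per individual per species per task
    "Task distance": "0-50m",    # illustrative distance band
    "Observer ID": "17",         # hypothetical observer key
    "Species common name": "Ovenbird",
    "Detection time": "00:01:20",
    "Abundance": "1",
    "Detection seen": "FALSE",
    "Detection heard": "TRUE",
}

buf = io.StringIO()  # stand-in for a real file opened with newline=""
w = csv.DictWriter(buf, fieldnames=FIELDS)
w.writeheader()
w.writerow({f: row.get(f, "") for f in FIELDS})
print(buf.getvalue())
```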

Click on the Manage button and select Upload Surveys. In the new pop-up window, select Choose a CSV file to upload to select your survey data. When the file is uploaded, the QA Project Data button will turn green. Click it and WildTrax will ensure that the data you’re uploading fits the WildTrax standard. This may take a few minutes depending on how large your file is.

image

Just like with ARU and camera media uploads, survey data will implicitly create locations that don’t yet exist in the organization. If a location already exists, the survey data will automatically be joined to its spatial metadata, provided the names match. You can merge locations later if you need to. Furthermore, since surveys typically contain only one visit, visits can easily be autogenerated in the organization as well.

3.5 Publishing projects

When the project is completed and all data processed, the status of the project can be changed to a published status. Project publication allows other WildTrax users, who are not project members, to access the media, metadata, or species detections from the project through Data Downloads or Data Discover. The publication status controls how data become visible across the system. Project publication locks species detections from further editing and is considered the final version of the data. You can change the status of a project at any time.

Location and project membership settings will also determine what you and others can see. Ensure you have these correctly set before publishing a project.

image

Project status

Here are some in-depth descriptions of each of the published statuses and what they mean.

Published

In Published – Private, the project data will only be available to project members. Users will need to request access to the project in order to view any details such as species or locations.

In Published – Map Only, the project data will be accessible through Data Discover but the media and report are not accessible to users who are not project members. If you’re not a project or organization member, the location buffering and visibility settings will apply.

In Published – Map + Report Only, the project data become available to all WildTrax users through Data Downloads and Data Discover, however, the media is not accessible. Use this setting if you want to make your data publicly available but there are privacy concerns with your media. If you’re not a project or organization member, the location buffering and visibility settings will apply.

The Published – Map + Report Only status does not exist for point count projects since the sensor does not collect media.

In Published – Public, all of the project data, as well as the details of the project, become available to any WildTrax user through Data Downloads and Data Discover. If you’re not a project or organization member, the location buffering and visibility settings will apply.

Unpublished

Active projects are currently being designed or processed, or are in the preliminary stages of data upload. Use this status for general use or when the project is actively being worked on. Active is the default status when a project is first created.

Just getting started with an environmental sensor program? Or have some media you want to upload to test WildTrax’s functionalities? Use the Test status in these cases. Project data will not be visible in Data Downloads or Data Discover except to project members.

Chapter 4 Acoustic Data

A finch, singing on a branch.

Autonomous recording units (ARUs) are sensors that allow for the automated collection of acoustic recordings over extended periods of time, resulting in the accumulation of large amounts of valuable data.

This chapter will describe how to process acoustic data. If you’re just getting started using WildTrax or you haven’t uploaded any acoustic data yet, it’s recommended that you review the chapters on organizations and projects beforehand. 

4.1 Audio data concepts

Acoustic recordings are made by sampling sound pressure waves from the environment. Sampling rate (the number of samples taken per second) and bit depth (the degree of quantization of pressure measurements) are two important values to consider when making sound recordings. Waveforms, the shape of the total acoustic signal over time, are difficult to interpret on their own.

These waveforms can be transformed into spectrograms using an algorithm called the Fourier transform, making acoustic data more intelligible to a human or to other software. Spectrograms contain patterns that a human can more easily use to recognize an acoustic signal and convert into biological data, such as animal vocalizations.

image
Waveform of an audio recording
image
Spectrogram of the same recording

Animals use acoustic signals to communicate messages for mate attraction, territorial defense, identification, or alarm. Acoustic signals are important metrics that biologists and scientists can use to study animal populations, community assemblages, or individual fitness. Collecting sound signals also provides a permanent, unbiased, analyzable, and reproducible dataset for researchers, in contrast to point count data.

Sampling design

An ARU can be deployed at a location for a long period of time. This provides a number of benefits to users processing acoustic data and project administrators to optimize their study design. If you’re just getting started using ARUs, WildTrax’s recommendations on how to best process acoustic data are based on the analyses, reports and publications that can be found in the Resources section. 

  • How long you leave an ARU out at a location and how much data you process depends on your objective. Thus, there is no one answer that optimizes results for all species or for all questions. The inclusion of groups such as amphibians, owls, or nocturnal species changes the optimal ways of sampling with ARUs.
  • For songbirds, there is strong evidence that shorter-duration surveys (e.g., 1 minute) will increase detection rates and allow a greater number of recordings from different days to be processed. This will result in more species found faster. However, trade-offs with other methodological approaches will occur. For example, 1-minute methods have higher detection error per visit but cumulatively have higher detection overall for equal effort (10 × 1-minute point counts vs. 1 × 10-minute point count). More work is needed to understand the implications of using occupancy estimates from short versus longer periods of sampling time per point count in terms of the stability of occupancy estimates.
  • There are no firm recommendations on whether the total time should be 3, 5, or 10 minutes, etc. per recording processed. However, we do strongly recommend that listening in 1-minute time blocks within any longer interval provides the greatest flexibility in methods and highest return on processing investment. The ability to estimate parameters such as song rate with USPM methods increases the utility of such data and has the potential to help better measure a greater array of state variables (e.g., singing rate, occurrence or density).
  • A single day of recording does not seem to be the best way to estimate occupancy or assess probability of occurrence. Leaving an ARU to record at a location for several days seems to provide a reasonable balance in getting detections from species that hold territories close to an ARU while also increasing the probability of getting rarer species with larger home ranges that only periodically are near an ARU. Less is gained by leaving them out for a month if it comes at the cost of visiting more locations.
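The visit-number trade-off above can be illustrated with a back-of-envelope calculation, assuming (purely for illustration) independent visits with a fixed per-visit detection probability:

```python
def cumulative_detection(p_per_visit, n_visits):
    """Probability of detecting a species at least once across independent
    visits (a simplified illustration; real visits are not independent)."""
    return 1 - (1 - p_per_visit) ** n_visits

# A species detected 30% of the time in a single 1-minute visit is detected
# at least once across ten such visits with probability:
print(round(cumulative_detection(0.30, 10), 3))  # -> 0.972
```

Even modest per-visit detection rates compound quickly across repeated short visits, which is the intuition behind favouring many 1-minute samples over a single long one.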

Tasks

In order to process data, a task must be generated and assigned in WildTrax. A task is a unique combination of an acoustic recording, a processing method, and an observer. The observer needs to be at least a read-only project member and assigned the recording in order to process it. WildTrax can process acoustic recordings with multiple methods and users; please contact WildTrax Support to arrange this.

image

Recommended equipment

The following equipment is recommended for processing acoustic data. Proper equipment will enhance your experience and data quality using WildTrax.

  • Computer with stable, high-speed internet connection and a minimum screen size of 15”
  • Headphones with the following specifications:
    • Stereo
    • Circumaural (fully enclosing the ear)
    • Minimum frequency response: 20 – 20000 Hz
    • No bass boost (flat frequency response)

Contact WildTrax Info if you are unsure whether your headphones meet these specifications.

4.2 Acoustic tagging page

This section is designed to provide a comprehensive overview of the tools and features available within the acoustic tagging page. Jump ahead to the section on methods if you’re looking for more details on tagging methods for species identification.

image

There are three main components to the acoustic tagging page:

  • Spectrogram Interface: Here is where tagging will take place through various methods described below. You can interact with the spectrogram via the hotkeys.
  • Logged Sounds: This is a running list of the detection history of species and individuals in the recording that you have tagged.
  • Header: Here you can navigate back to the project page using the breadcrumb, see the organization, location and date time of the task you’re working on, the task status and the weather conditions.

A spectrogram is a visual representation of the acoustic energy in a recording created using a short-time Fourier transform (STFT). The x-axis corresponds to time, the y-axis to frequency and the z-axis (or how dark the signal is) to amplitude. Spectrograms in WildTrax are generated with a command line software called SoX. Audio file types have been calibrated to create the “best looking” spectrograms.
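Each column of a spectrogram is the magnitude spectrum of one short windowed frame of audio. The idea can be sketched with a naive DFT in standard-library Python (illustrative only; WildTrax renders its spectrograms with SoX, which uses a fast Fourier transform):

```python
import cmath
import math

def frame_spectrum(frame, sample_rate):
    """Magnitude spectrum of one Hann-windowed frame (naive DFT sketch).
    Bin k corresponds to k * sample_rate / len(frame) Hz."""
    n = len(frame)
    windowed = [x * (0.5 - 0.5 * math.cos(2 * math.pi * i / n))
                for i, x in enumerate(frame)]
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                    for i, x in enumerate(windowed)))
            for k in range(n // 2 + 1)]

# A 1 kHz tone sampled at 8 kHz should peak in bin 1000 * 512 / 8000 = 64
sr, n = 8000, 512
frame = [math.sin(2 * math.pi * 1000 * i / sr) for i in range(n)]
mags = frame_spectrum(frame, sr)
peak_hz = mags.index(max(mags)) * sr / n
print(peak_hz)  # -> 1000.0
```

Sliding this window along the recording and stacking the resulting spectra side by side produces the time-frequency image shown in the tagging page.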

The page is also sorted into up to four tabs depending on what metadata is available for the recording.

4.2.1 Spectrogram controls and audio settings

image

Audio editing controls have been implemented in WildTrax using hotkeys allowing easy navigation through a task. Hovering over the panel above the spectrogram will remind you how to use these controls at any time if you forget.

  • Z: Jump back 10 seconds
  • M: Jump forward 10 seconds
  • B: Toggle display for boxes
  • T: Toggle display for species | individual text
  • L: Show / hide tags on the left channel
  • R: Show / hide tags on the right channel
  • Left Arrow: Previous marker
  • Right Arrow: Next marker
  • Up Arrow: Go to the first marker
  • Down Arrow: Go to the last marker
  • Spacebar: Pause / Play
  • Tab: Move between fields
  • 1: Create a marker
  • Enter: Select within a field

You can dynamically set the audio settings which control the spectrogram and audio parameters of the recording for the task. These settings are set by default at the project level by an administrator but you can change them at any time while processing a task. You can set the audio settings by either clicking on the gear icon below the spectrogram or going to Manage > Audio Settings above and to the right of the spectrogram. When you have your desired settings, click Apply or Reset to the project defaults.

  • Channel: Set the default channel that will display on the screen (left, right or both)
  • Amplify: Increase the gain or amplitude of the audio to hear faint sounds
  • Minimum and Maximum Frequency (Hz): The minimum and maximum bounds displayed on the spectrogram. The audio will be played for the entire spectrum unless the Generate Audio toggle is enabled.
  • X-Scale: Use the slider to change the number of seconds displayed on the spectrogram. A lower value displays more time, whereas a larger value displays less time.
  • Y-Height: The spectrogram height in pixels. A larger number will increase the vertical dimension of the spectrogram
  • Use Monochrome: When toggled on, displays the spectrogram in monochrome (black and white). Toggle off to display a colour spectrogram.
  • Generate Audio: When toggled on, only the audio between the selected frequency ranges will be played.
  • Noise Filter: Attempts to apply a noise filter to help you hear the signal more clearly.
  • Contrast: Contrast range in dB. The default is 120 dB. This sets the dynamic range of the spectrogram from the selected value (in dBFS) up to 0 dBFS, and may range from 20 to 180 dB. Decreasing the dynamic range effectively increases the ‘contrast’ of the spectrogram display, and vice versa.
  • Brightness: Sets the upper limit of the z-axis in dBFS. A negative number effectively increases the ‘brightness’ of the spectrogram display, and vice versa.
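As a rough sketch of how contrast and brightness settings like these are commonly applied (this is illustrative only, not WildTrax’s actual rendering code; the function name is hypothetical):

```python
def display_intensity(level_dbfs, contrast_db=120.0, brightness_dbfs=0.0):
    """Map an amplitude in dBFS to a 0..1 display intensity.

    brightness_dbfs sets the upper limit of the z-axis (levels at or above
    it render at full intensity); contrast_db is the dynamic range below
    that limit (levels at or below the floor render as 0). Shrinking the
    dynamic range stretches the remaining levels, which reads as higher
    contrast; a negative brightness value brightens the display.
    """
    floor = brightness_dbfs - contrast_db
    scaled = (level_dbfs - floor) / contrast_db
    return min(1.0, max(0.0, scaled))

# With the 120 dB default, a -60 dBFS signal sits at mid intensity.
assert display_intensity(-60.0) == 0.5
# Lowering brightness to -20 dBFS renders the same signal brighter.
assert display_intensity(-60.0, brightness_dbfs=-20.0) > 0.5
```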

4.2.2 Task Info

Clicking on the Show Info Panel icon at the top right corner of the spectrogram shows the Task Info panel. This table contains fields that describe average metadata for the task, such as abiotic noise levels and audio quality. These fields are optional, but a warning will appear if they are left blank when the task is set to Transcribed. Fill in the average noise assessments and audio quality after tagging is complete.

  • Rain: average rain on the task
  • Wind: average wind on the task
  • Industry: average chronic industrial noise on the task
  • Noise: average background noise on the task (intermittent and other abiotic noise)
  • Audio quality: overview of the audio quality of the task
  • Comments: any general comments or descriptions to make about the task

There are options for tagging abiotic noise sources. See Acoustic tagging methods for more information.

4.2.3 Tags

You can add tags (click and drag a box) or markers (Hotkey: 1) on spectrograms.

The boxes that comprise the tag are the fundamental way that WildTrax differs from many other audio tagging systems: they allow you to enclose a species detection in real-time while providing information about vocalization frequency, length and amplitude using the dimensions of the box. When a box is drawn on the spectrogram, WildTrax requires you to enter species metadata after it is created, which pauses the recording while the metadata is being added.

image

In contrast, you can use markers to indicate where a signal is detected without stopping the playback. You’ll still need to return later to each marker, draw a box, and enter the appropriate species metadata but this way you can rapidly annotate the vocalizations you want. WildTrax offers these two different ways to tag species to allow you to create your own workflow for data processing. 

When a signal is detected, click and drag on the spectrogram to create a box. Draw the box around the outside of the signal. Depending on your study design, you may want to include harmonic frequencies. Once you’ve drawn the box, the Add New Tag window will pop up on the right side of the spectrogram. 

There is a minimal set of rules that WildTrax uses to create tags. As the system develops, new rules may be added. Project administrators are encouraged to instruct their members to annotate species as needed for the objectives of the project.

  • Tags are a minimum of 0.1 seconds and a maximum of 20 seconds long as defined by the constraints of the maximum dimensions of the spectrogram currently employed in WildTrax
  • Tags will always have a minimum frequency of 0 Hz and a maximum frequency of 12,000 Hz as defined by the constraints of the maximum dimensions of the spectrogram
  • Regardless of spectral coverage, a tag must always have minimum dimensions of 0.1 seconds and 200 Hz otherwise the tag will automatically disappear (in reality the tag must have a minimum size of 5 x 5 pixels)
  • Depending on what method is applied to the task, the number of tags per species-individual you can make may be restricted

  • The tag should enclose the whole signal
  • The tag should be as “tight” as possible around the signal leaving as little space as possible
  • The tag should capture important vocalization patterns and repetitions
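The dimensional rules above can be summarized as a small check. The function and constant names here are hypothetical, shown only to make the limits concrete; WildTrax enforces these rules in the interface itself:

```python
# Tag dimension limits as described in the documentation (illustrative names).
MIN_LEN_S, MAX_LEN_S = 0.1, 20.0       # tag length bounds in seconds
MIN_FREQ_HZ, MAX_FREQ_HZ = 0.0, 12000.0  # spectrogram frequency bounds
MIN_BANDWIDTH_HZ = 200.0               # minimum vertical size of a tag

def tag_is_valid(start_s, end_s, low_hz, high_hz):
    """Check a tag box against the dimensional rules listed above."""
    length = end_s - start_s
    bandwidth = high_hz - low_hz
    return (MIN_LEN_S <= length <= MAX_LEN_S
            and MIN_FREQ_HZ <= low_hz
            and high_hz <= MAX_FREQ_HZ
            and bandwidth >= MIN_BANDWIDTH_HZ)

assert tag_is_valid(12.0, 14.5, 2000, 8000)        # typical songbird tag
assert not tag_is_valid(12.0, 12.05, 2000, 8000)   # shorter than 0.1 s
assert not tag_is_valid(12.0, 14.5, 2000, 2100)    # under 200 Hz bandwidth
```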

The fields associated with each tag are as follows:

  • Species: This dropdown field matches text against either a modified version of the American Ornithological Society (AOS) code system or the English common name of the species.
  • Individual: This field corresponds to uniquely identified individuals of a species on the recording. In other words, if you hear two Ovenbirds (OVEN), the first individual would be tagged as “1” and the second OVEN as “2”. WildTrax has been designed to automatically populate the Individual field as new data is being entered. Therefore, if you forget to add in the individual number, the system will update it for you automatically in sequence.
  • Needs Review: This checkbox flags tags for expedited verification or if you’re not confident in the identification of the species or the tag metadata.
  • Abundance: This field stores abundance information for each detection. It is broken into two main categories: calling intensity (CI) for amphibians, and 1-10 or TMTT for all other species.
  • Vocalization Type: This field describes the type of vocalization each detected species/individual made. Vocalization type can be changed for each individual in each minute interval as well. The current options are song, call and non-vocal. 
  • Comments: Use this field liberally to make notes about any of the songs or calls relating to the species in question, especially in the case of unknowns.

At any time during tagging you can edit the information stored for each tag either by clicking on the box, or visiting the logged sounds table. Once you click on the box, you can change the dimensions, position or the metadata for any of the fields when the tag panel is open. You can even change the species code of the tag in case it was, for example, accidentally misidentified or you wanted to change it to an unknown. You can also delete the tag altogether.

4.2.4 Logged Sound table

Every time a tag is made, the information is saved below the spectrogram in the logged sounds table. This is a running list of all of the detections that are currently in the task. The table lets you keep track of the species and individuals that are already tagged and provides another place to edit tags.

Components of the table include the species code, the minute intervals where the species was detected (where applicable depending on method), the abundance, the vocalization type of the species (shaded in different colours to allow ease of visualization) and the confidence of the identification. If you hover over a coloured cell, you can also see the second within the minute interval that the box was created, and by clicking on the cell, WildTrax will transport you directly to the part of the spectrogram where the species was detected and play the recording (if Auto play mode is selected).

This will also open the Tag panel. Clicking on the drop-down arrow beside the Confidence field will also let you navigate back to the species detection and make edits in the Tag panel. The switch just below the spectrogram allows you to toggle auto-playing the recording when you navigate to a tag. 

image

4.2.5 Task status

As discussed before, a task is a unique combination of user, processing method and recording. The task status indicates its processing state. 

image

If you’ve been assigned a task and you’re a read-only project member, your privileges for the task status field are slightly different: if you switch the task to Bad Weather, Malfunction, or Transcribed, you will not be able to switch it back to an unprocessed status.

Go to the header when you’ve completed tagging and you wish to move to another task. Here, you can change the task status to Transcribed and then click on the Next Task icon to process another task. There are other options to choose from to annotate the status of the task; for example, if you wish to leave the task but tags have not been completed, select In Progress.

If a tag is created in any task, the status will automatically change to In Progress as a reminder that you visited the task.

Here are some detailed descriptions of the task statuses:

Unprocessed or incomplete processing

The task was either recently assigned or no tags have been made yet.

In Progress is a general usage status for a task that is currently being processed, or that needs to be re-processed for any reason. Once a tag has been made, the task status will automatically switch to In Progress.

Use this status if you are a read-only member and you need to flag tasks for audit to your project administrator or if you are a project administrator and you want a tagger to review an assignment.

Processed

Sometimes you might upload media into your project that is unsuitable for tagging. You can flag these assigned tasks as Bad Weather (e.g., due to wind or rain) so they don’t get processed. Use the weather conditions panel alongside to help decide on weather suitability.

If you detect any malfunctions with your acoustic media, such as channel failures or poor audio quality that does not allow you to process the task, use this setting to indicate no processing took place due to the errors.

Note: You can indicate the details of the malfunction in the audio quality field in the task info panel and in the task comments.

Switch the task to Transcribed to indicate you’ve completed the task!

4.2.6 Weather Conditions

image

Conditions recorded at the nearest weather station (at the nearest time to when the ARU recording took place) are located in the panel just above the spectrogram. Clicking on the panel opens the detailed weather information panel describing both hourly and daily conditions recorded at weather stations within a maximum radius of 155 km. You can toggle through the list of stations with the Previous and Next buttons to get the conditions recorded at various stations. 

Note: Getting weather conditions requires that locations have spatial metadata.
image

4.2.7 Map and location photos

image

If the location of the task you’re working on contains spatial metadata and/or photos were taken during the visit, the Map and Location Photos tabs will automatically appear next to the Audit tab. This lets you see where the task took place and what the habitat looked like, to help with your species identifications. The Location Photos tab sorts the images by visit date, so if you took photos in the winter and summer you can see on the ground how your location changes over the seasons!

image

4.3 Acoustic tagging methods

The acoustic processing methods in WildTrax are designed to emulate avian point counts and are an amalgamation of recommendations provided by the Bioacoustic Unit, the Boreal Avian Modelling project, and Environment and Climate Change Canada based on standard methodologies. 

The processing method is chosen either during the upload stage of the media or by uploading tasks to a project and is based on two values: the processing length of the recording and the “count-removal” method type.

The first component is the processing length of time in seconds. The second, count-removal, is easiest to think of as tagging an individual of a species and then “removing it” from the rest of tagging; WildTrax automatically enforces this limit depending on the method you choose. The limit is defined at the individual level (e.g., OVEN 1 and OVEN 2). There are currently options for a tag-per-minute limit (1SPM), a tag-per-task limit (1SPT), or an unlimited number of tags (None).
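The count-removal limits can be sketched as a simple check. The method strings (1SPM, 1SPT, None) come from the documentation above; the function name and tag representation are hypothetical, since WildTrax applies these limits automatically during tagging:

```python
def tag_allowed(method, existing_tags, species, individual, minute):
    """Decide whether another tag may be made for a species-individual.

    existing_tags is a list of (species, individual, minute) tuples.
    1SPM: one tag per species-individual per minute interval.
    1SPT: one tag per species-individual per task.
    None: an unlimited number of tags.
    """
    if method == "1SPM":
        return (species, individual, minute) not in existing_tags
    if method == "1SPT":
        return all(s != species or i != individual
                   for s, i, _ in existing_tags)
    return True  # method "None": no limit

tags = [("OVEN", 1, 0)]                              # OVEN 1 tagged in minute 0
assert tag_allowed("1SPM", tags, "OVEN", 1, 1)       # new minute: allowed
assert not tag_allowed("1SPT", tags, "OVEN", 1, 1)   # already tagged in task
assert tag_allowed("1SPT", tags, "OVEN", 2, 0)       # second individual: allowed
```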

Contact WildTrax Support if you would like an additional method added.

4.3.1 Individuals and abundance

Each tag in WildTrax records the individual number and abundance. These fields seem similar but are quite different, and how you use them to tag your data is dependent on your research objectives. Check with your project administrator on how to tag species on your assigned tasks. 

An individual is the unique identifier for distinct individuals of a species. Here’s an example of a Tennessee Warbler (TEWA) that receives an individual number of 1 (since it is the first) and an abundance of 1 (since it is one individual).

If required by your project, it’s important to ensure individuals are tracked consistently through each interval of the processing method. This way all individuals are annotated correctly.

The number of individuals you can distinguish and track depends on multiple factors that can either facilitate or hinder your ability. Accuracy in individual tagging increases when:

  1. individuals are territorial (separated spatially)
  2. individuals are sedentary (not moving or not appearing to move via spectrogram attributes such as amplitude or channel)
  3. individuals have unique song attributes or singing behaviour
  4. the species richness on the recording is low (there are no masking signals)
  5. abiotic signals on the recording are low (there is no noise interference)

It becomes difficult to tag distinct individuals when certain species have life history traits or behaviours that make them abundant. Alternatively, you can tag multiple individuals of a species at once and change the abundance value to reflect the count of individuals of that species in the tag. You can also use the TMTT (too many to tag) option to indicate an uncountable number of individuals of the species within the bounds of the tag.

Increase the abundance value only when you’re certain you can’t tell individuals apart. If you’re unsure, always use the more conservative estimate. Practice also helps!

image
Busy recording in a prairie grassland (southern Alberta, Canada)

4.3.2 Amphibian abundance

During the breeding season, amphibians congregate, making individual counts difficult. WildTrax therefore uses a calling intensity (CI) rank as a common measure of amphibian abundance, rather than distinguishing individuals as with birds or mammals. Calling intensity is used as a metric of relative abundance and is adapted from the North American Amphibian Monitoring Program (NAAMP) Amphibian Calling Index (ACI) (2005).

  • CI 1: Individuals can be counted, there is space between the calls
  • CI 2: Calls of individuals can be distinguished but there is overlap
  • CI 3: Full chorus, calls are constant, continuous and overlapping

4.3.3 Confidence and classifying unknowns

There are degrees of uncertainty in identifying acoustic signals on a recording. Since vocalizing species make a variety of different sounds, the ability of an observer to identify many hundreds of species and all their vocalization types is learned over time. Checking the Needs Review box helps to filter tags for verification. 

When you come across a signal you don’t recognize, you can classify it as Needs Review and/or as an unknown. It’s important that these unknowns are incorporated into the dataset in a way that indicates whether:

  • The signal is clear but it cannot be identified
  • The signal is too degraded, faint or masked
  • The signal is a non-descript short call note or alarm call

As unknowns are detected they should be placed into sub-categories as much as possible to refine how another observer or project administrator will identify them later on. For example, a triller should be identified as UNTL instead of UNPA (unknown passerine) or UNKN (unknown). Be conservative in the categorization process: if you’re not sure, always revert to the more basal category, like UNKN. Some signals are so faint, degraded, or masked that they’re likely not identifiable. The threshold of signal identification depends on many factors, including amplitude, signal complexity, observer skill, and how much the signal is masked or overlapped by others in the recording. Certain species complexes cross over families and genera of species; in these cases the more conservative unknown code should be used.

Faint, generic, and indistinct call notes tend to be overlooked when conducting community census tagging as they do not provide much information to the occupancy or detectability of that individual over time; the BU uses standardized replication at each station to account for these tendencies.

Once a tag is labeled as Needs Review, the panel in species verification will also change to yellow. See species verification for more details.

4.3.4 Vocalization type

Animals can produce many different vocalizations. In the current version of WildTrax, vocalizations are categorized as song, call, or non-vocal, plus call+feeding buzz for ultrasonic species.

Song is loosely defined as the territorial, mate-attracting vocalization produced by male passerines (songbirds) and the males of some non-passerine species. In many bird studies, passerine song is used to estimate the density and population size of a species and is a reliable metric, because a male passerine generally sings in the territory it’s occupying.

  • Ovenbird (Seiurus aurocapillus) song
  • Black-throated Green Warbler (Setophaga virens) multiple song types

Calls are any vocalization made by individuals where the sex cannot be distinguished and/or the vocalization is not for mate attraction. Some examples of this include the simpler and less melodious vocalizations typical of some non-passerines as well as vocalizations like alarm calls, begging calls, etc.

Exceptions can and do exist in the natural world. For example, despite being passerines, all corvid vocalizations are calls, as both males and females can produce them. Conversely, the Yellow Rail (YERA) is a non-passerine but the primary vocalization (“tik-tik tik-tik-tik”) is always entered as a song since it is for territorial delineation and mate attraction, and the female does not produce the vocalization.

Flight calls are other unique vocalizations given by birds. WildTrax differentiates them because they come from trigger-based recordings and study designs separate from schedule-based recordings. With certain families, such as the thrushes (Turdidae) and blackbirds (Icteridae), it’s possible to identify these call signals to species.

  • Night flight call notes
  • White-throated Sparrow (Zonotrichia albicollis) call notes
  • Swainson’s Thrush (Catharus ustulatus) call notes

Non-vocal sounds are mechanical sounds made by a species, such as winnows, bleats, drums or booms, that are not produced by the vocal tract.

  • Wilson’s Snipe (Gallinago delicata) winnowing
  • Yellow-bellied Sapsucker (Sphyrapicus varius) drumming

Call+Feeding Buzz is the rapid series of calls produced by echolocating species to capture prey.

  • Little Brown Bat (Myotis lucifugus) feeding buzz

Depending on the taxa you’re studying, you may want to use these categories differently than WildTrax suggests. These recommendations, while imperfect, reflect common practices and help to standardize vocalization metadata. Nevertheless, WildTrax is agnostic to vocalization type standards and aims to keep the system as flexible as possible. If you have a recommendation or suggestion on how to help develop this standard, email WildTrax Info.

4.3.5 Abiotic signals and bad weather

ARUs don’t discriminate between noise sources, which include geophonic (e.g., wind and rain) and anthropogenic (human-produced) sounds. Assessing, tagging, and recording these noise sources is important, as they can affect species detection depending on the frequency range and amplitude of the abiotic signal.

For example, certain recorders can be placed very close to industrial features such as compressor stations where noise is almost constant throughout the entire recording due to motors, fans, engines, etc. Others can be located near busy roads where cars may pass by. All these sounds have distinct acoustic signatures that can be identified and tagged on a spectrogram. Noise sources can also be biotic such as bugs flying past the microphones. 

Keep in mind that noise can be assessed by both frequency and amplitude. Even faint broad-spectrum sounds will affect species detection, and a loud low-frequency sound can likewise trick the ear into missing biotic signals.

WildTrax defines two different noise source types: intermittent and chronic. They describe whether a noise source is occurring consistently throughout a recording or if the noise source is temporary. Intermittent noise can be traffic from a highway, a plane flying overhead, or deterrents from an industrial site. Chronic noise sources are classified into four broad categories:

  • Rain
  • Wind
  • Industry: chronic industrial noise such as processing facilities, mines, oil and gas features, etc.
  • Background: chronic noise that does not fall into the other categories such as flowing water, thunder, or insect swarms.

Assessing the noise on a recording involves two steps:

  1. tagging the noise on the spectrogram 
  2. estimating the average noise level in the task info panel

Abiotic noises are typically tagged per detection interval on a recording, using the standard one-minute interval just as for birds and amphibians.

Abiotic tags should be a minimum of 5 seconds if the source is continuous, or span the full duration of the event (e.g., a car driving past). Tags should also be drawn over the frequency range of the event (e.g., heavy wind may span the full height of the spectrogram on the y-axis). WildTrax uses four-letter codes, similar to AOS designations, to record rain, wind, chronic industrial, intermittent anthropogenic, and other background noise.

All ARUs produce a certain amount of internal static; you can record the static to indicate a possible equipment failure. Older ARU models have a lower signal-to-noise ratio, which can produce increased static.

As wind, rain, and industrial noise increase in intensity and frequency, biotic sounds are harder to detect and accuracy with species identification decreases. You can avoid processing certain tasks where wind, rain, background, or industrial noise is above a certain threshold defined by the project.

You can quickly check whether the task will be appropriate for analysis based on the project’s noise threshold. Sometimes this is very easy (e.g., the recording has obvious heavy rain); other times you may need to scroll quickly through the recording to check. You can either proceed with the transcription or change the status of the task to Bad Weather and move on to the next task.

The weather panel also provides a way of making assessments of the weather conditions to determine whether or not they are appropriate for processing. WildTrax will develop recommendations for this in the near future. 

4.4 Species verification

image

The species verification tab houses a set of tools to assist in the quality control of acoustic tags. In species verification, administrators assign users as validators to verify all tags grouped from a species-vocalization type in a project. Species-vocalization type is used for the grouping to allow the user to focus on one signal type at a time, e.g., Wilson’s Snipe (WISN) calling vs. winnowing. Species verification is important because it allows you to efficiently check all the tags produced in a project before publishing it, thereby dramatically increasing data quality output. 

ARU species verification in WildTrax is a two-stage approach:

1) Create or import tags in a project

2) Verify all the tags of a species-vocalization type

This allows a WildTrax user to have the first pass at processing the acoustic data and then use the verification tools to target and verify tags. Verification tags are populated once the task is completed (or switched to Transcribed). Clicking on the Species tab in the project page displays a list of species-vocalization types, summaries of the total number of verified tags, and tools to assign users for verification, in order to help you manage the verification process. Click on any of the species to enter the verification page.

This is the standard practice currently used in WildTrax. Use this route if you have trained taggers who will be generating tags from assigned tasks.

  • Upload recordings to a project and generate tasks by checking Create a new task
  • Assign users as taggers to the tasks
  • Tag all of the tasks in a project
  • Assign validators under the Species tab
  • Verify all of the tags for each species-vocalization type where desired; return to the tasks to change or delete tags where needed
  • Filter by verified tags and mark high quality ones as Nice, or mark as Nice as you verify
  • Proceed to publishing your verified project

If you have outputs from a recognizer and wish to verify the hits and share the results in WildTrax:

  • Upload recordings to a project and create tasks
  • Upload tags from the automated classifier
  • Assign validators under the Species tab
  • Verify all of the tags for each species-vocalization type where desired; return to the tasks to change or delete tags where needed
  • Rate the tags to help determine recognizer performance
  • Proceed to publishing your verified project

 

image

Check with your project administrator for how you should proceed verifying acoustic data for your project.

4.4.1 Verification page

The verification page header designates the species-vocalization type you’re verifying and the filters are metadata you can use to filter the list of tags you see in the single tag panels. The single tag panels allow you to individually access and manipulate the audio and spectrogram properties, and take actions in order to verify the tag.

image

You can listen to the audio in any single tag panel by clicking the Play button in the top-left corner. The icons below the spectrogram indicate the following:

  • Verify: when checked green, the tag has been verified. The background of the single tag panel will also change to green.
  • Rating: when checked yellow, the tag has been rated
  • Task link: opens a new tab to the task where the tag was created; the tag will be coloured black in the task
  • BirdNET probability: maximum probability (0-1) returned from BirdNET. The number is the maximum value found across all of the 3-second windows where BirdNET also positively detected the species and which intersected the tag.
  • Amplitude: peak amplitude (in dBFS) of the tag
  • Abundance: abundance of the species recorded in the tag; TMTT is shown with its own icon

Rating tags helps to create a library of high-quality acoustic resources and verified datasets. Below are descriptions of the 5-star rating system (adapted from eBird). You can rate a tag anywhere you see a rating star while in species verification.

1 Star: Very weak target sound that is barely audible due to high background or handling noise or many overlapping sounds.

2 Stars: Weak target sound with significant background or handling noise or many overlapping sounds.

3 Stars: Good target sound with moderate background noise or some overlap with other sounds.

4 Stars: Strong target sound with limited background noise or some overlap with other sounds.

5 Stars: Very strong target sound with little or no background noise or overlap with other sounds.

image
Tag needs review

image
Tag has been verified
image
Tag is selected

WildTrax uses the BirdNET API and returns the maximum probability from all the 3-second windows that intersected the tag. Note that results for other species are not returned.

image
WildTrax returns the maximum value in all of the 3-second windows (in dashed blue lines here) generated by BirdNET that intersect the tag. The value indicates the probability of BirdNET having detected the species in that interval. If BirdNET doesn’t detect the same species, the probability will be 0.
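The max-over-intersecting-windows logic can be sketched as follows. The `detections` structure (a list of window start times and probabilities for the tagged species) is a simplified assumption for illustration, not the actual BirdNET API response format:

```python
def birdnet_tag_probability(tag_start, tag_end, detections, window_s=3.0):
    """Return the maximum probability among fixed-length BirdNET windows
    that intersect the tag, or 0.0 when the species was never detected.

    detections: hypothetical list of (window_start_s, probability) pairs
    for the tagged species only.
    """
    best = 0.0
    for window_start, prob in detections:
        window_end = window_start + window_s
        # Two intervals overlap iff each starts before the other ends.
        if window_start < tag_end and window_end > tag_start:
            best = max(best, prob)
    return best

# Windows starting at 0 s and 3 s intersect a tag spanning 2.0-4.5 s;
# the window at 9 s does not, so its higher probability is ignored.
hits = [(0.0, 0.62), (3.0, 0.91), (9.0, 0.99)]
assert birdnet_tag_probability(2.0, 4.5, hits) == 0.91
assert birdnet_tag_probability(20.0, 22.0, hits) == 0.0
```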

You can also click on the Help button, which opens the Help menu describing everything from the legend to keyboard shortcuts and tag selection methods, so you can customize your verification workflow the way you want.

image

4.4.2 Detailed verification window

Clicking on the top-right corner of the single tag panel opens the detailed verification window. This window allows you to manipulate the audio and spectrogram parameters in order to verify the tag. The main sections of the detailed verification window include:

  • Task link (in the header)
  • Action buttons
  • Tag details
  • Spectrogram
  • Filters

The task link will open a new tab and jump back to the task to view where the tag took place, highlighting the tag in black. This is useful if you need more context beyond what is available in the detailed verification window.

image

Actions are where you can quickly validate the tag after you’ve done any filtering or audio manipulation.

  • Verify: the icon will turn green when the tag is verified.
  • Rate: rate the tag following the eBird guidelines.
  • Delete: removes the tag from the system. The change will also be tracked in the audit table of the task.

Tag details in the left column summarize other useful information about the media and tag in order to help make a decision on the verification of the tag.

  • Minimum frequency: minimum frequency of the tag
  • Maximum frequency: maximum frequency of the tag
  • Length: length of the recording (seconds)
  • Default channel: indicates the channel used for default verification. If one of the channels was malfunctioning, the better channel will appear by default.
  • BirdNET probability: maximum value returned from BirdNET
  • Peak dBFS: maximum amplitude of the tag
image

You can manipulate the audio and spectrogram using the different filters and editors located below the tag. These include the following settings that you can combine in any way you’d like to generate the best-looking and -sounding spectrogram to verify the tag. 

  • Amplify: increases the gain, or amplitude, of the media.
  • Noise filter: runs a noise profile on the tag and attempts to eliminate noise.
  • Channel filter: Select the channel you want to display and listen to (Left or Right). By default, the left channel is visible while both channels are audible.
  • Z Contrast: Contrast range in dB. The default is 120 dB. This sets the dynamic range of the spectrogram from the selected value (in dBFS) up to 0 dBFS, and may range from 20 to 180 dB. Decreasing the dynamic range effectively increases the ‘contrast’ of the spectrogram display, and vice versa.
  • Z Brightness: Sets the upper limit of the Z-axis in dBFS. A negative number effectively increases the ‘brightness’ of the spectrogram display, and vice versa. 
  • Y Scale: Expand the number of pixels displayed on the Y axis. Default is 1x. A larger number stretches the spectrogram vertically.
  • Frequency filter toggle: when turned on, limits the audio file to playing only the frequency bounds of the tag. Helpful for eliminating other bandwidths to more clearly hear the signal. 
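The effect of the Z Contrast and Z Brightness settings can be pictured as a simple mapping from a spectrogram bin's power (in dBFS) to a display intensity. The sketch below is an illustrative approximation only; the function name, parameters, and linear mapping are assumptions, not WildTrax's actual rendering code:

```python
def spectrogram_intensity(power_dbfs, contrast_db=120.0, brightness_dbfs=0.0):
    """Map a spectrogram bin's power (dBFS, normally <= 0) to a 0-1 intensity.

    contrast_db: dynamic range of the display (the Z Contrast value, 20-180 dB).
    brightness_dbfs: upper limit of the Z axis (Z Brightness); a negative
    value lowers the ceiling, effectively brightening the display.
    """
    floor = brightness_dbfs - contrast_db      # quietest value still displayed
    intensity = (power_dbfs - floor) / contrast_db  # linear map onto 0-1
    return max(0.0, min(1.0, intensity))       # clip values outside the range

# With the defaults, a -60 dBFS bin lands in the middle of the range:
print(spectrogram_intensity(-60.0))                      # 0.5
# Narrowing the dynamic range changes where the same bin falls:
print(spectrogram_intensity(-60.0, contrast_db=80.0))    # 0.25
```

This illustrates why shrinking the dynamic range increases apparent contrast: the same span of dBFS values is stretched over the full intensity scale.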

Chapter 5 Image Data


Environmental sensors such as remote cameras are rapidly replacing conventional human survey techniques. Remote cameras are used to capture images of mammals and other taxa. Such sensors allow for continuous or near continuous data collection over extended periods of time, resulting in the accumulation of large amounts of data—a key benefit of their use. This chapter will show you how to tag images from remote cameras in WildTrax.

5.1 Concepts

5.1.1 Sampling Design

A remote camera can be deployed at a location for a short or long period of time. One or several units can be deployed at a single location for years, swapping out the batteries and SD cards every 6 months to a year, or the unit(s) can be moved to a new location after a predefined sampling period has elapsed, depending on the research question and intended data use. This flexibility provides great benefit to a user or researcher, as the only limitations to data collection are storage space and battery life. Depending on the length of time these units will be in the field prior to being serviced, camera settings can be changed to optimize battery life. When developing a remote camera sampling design for questions related to density estimation, relative abundance, occupancy modeling, etc., careful consideration should be given to the length of time in the field, the number of units to install and the distance between units. Resources on camera deployment methods, sampling protocols and analytical approaches can be found in the Resources section.


5.1.2 Camera Tasks

A camera task in WildTrax is a unique combination of an image set and an observer, in this case, a WildTrax project member. WildTrax currently supports a double-observer processing method: tasks are first tagged by image set, but each task itself is not assessed a second time; instead, a second validation of tags occurs by species during species verification.

By default, the task list is filtered to show only the tasks assigned to you; you can toggle this filter to view all tasks in a project.


5.1.3 Auto-tag with Artificial Intelligence (AI)

The use of remote cameras can lead to the capture of hundreds, thousands or even hundreds of thousands of images in a single image set. The large data sets collected are a benefit to users; however, image processing is usually a bottleneck to producing meaningful data in a timely manner. Manually reviewing each image to categorize animals or humans is time-consuming and inefficient. WildTrax’s auto-tagger features allow you to reduce the time required to review remote camera images by applying tags to images (i.e., “auto-tagging”) before you begin manually tagging.

WildTrax project administrators can choose to implement the following methods in their organization and project settings to minimize the number of images that require processing:

Auto-tagger

The auto-tagger is an AI tool built on multiple machine learning approaches, which uses a combination of MegaDetector v5 (to detect NONE [i.e., false-fires], humans, vehicles, and animals), MegaClassifier v0.1, and a WildTrax AI (to enhance the detection of humans and NONE and to tag STAFF/SETUP photos).

Auto-tagger settings are defined at the project level to indicate the classifier categories for which the auto-tagger will be applied when new image sets are uploaded to the project, and whether the selected categories should be automatically hidden from view in the image tagging page. Options currently include NONE (i.e., false-fires), Human, Vehicle, STAFF/SETUP and MegaClassifier v0.1. Once applied, images of humans are blurred, and images in the selected classifier categories are automatically tagged and hidden on the image tagging page (filtered out of view). Pre-filtered images can still be viewed by un-filtering through the filter panel (see section 5.2.3 Tagging page controls). Note that human image blurring settings can be found in organization settings.

The goal of MegaDetector is to detect animals, people, and vehicles in camera trap images. It does not identify animals; it just finds them. Notably, images classified as animals through MegaDetector v5 alone are not identified to species, so no species tag is applied; these images are automatically available for viewing and subsequent tagging.

The MegaClassifier v0.1 setting applies the preliminary (testing) version of MegaClassifier automatically to image sets.


The STAFF/SETUP tagger is a setting used to select the application of the STAFF/SETUP tag automatically to image sets. The auto-tagger will automatically tag images of humans that occur within the first or last series as “STAFF/SETUP” (using a 5-minute series gap), unless there are <2 images auto-tagged as human, or the STAFF/SETUP tag has already been applied.
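The STAFF/SETUP rule above can be sketched in Python. Everything here (the image-dictionary structure, field names, and helper function) is hypothetical and only illustrates the documented logic of tagging human images that fall in the first or last series, skipping image sets with fewer than two human-tagged images:

```python
from datetime import timedelta

SERIES_GAP = timedelta(minutes=5)  # the documented 5-minute series gap

def staff_setup_candidates(images):
    """Return indices of human-tagged images in the first or last series.

    `images` is a chronologically sorted list of dicts with 'time'
    (a datetime) and 'tag' keys -- an assumed structure for illustration.
    """
    humans = [i for i, img in enumerate(images) if img["tag"] == "Human"]
    if len(humans) < 2:
        return []  # too few human images to infer a setup/takedown series

    def series_from(start, step):
        # Walk consecutive images while the gap stays under 5 minutes.
        members, i = [start], start
        while 0 <= i + step < len(images) and \
                abs(images[i + step]["time"] - images[i]["time"]) < SERIES_GAP:
            i += step
            members.append(i)
        return set(members)

    first_series = series_from(0, +1)
    last_series = series_from(len(images) - 1, -1)
    return [i for i in humans if i in first_series or i in last_series]
```

Human images in the middle of the deployment are deliberately left alone: only the first and last series are assumed to be staff activity.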

Once you’ve applied auto-tagger settings, any image data uploaded into the project will be run through the auto-tagger based on your selection of classifier categories before they become available for tagging.

This will be reflected in the task status, following the path: Uploading -> Ready for auto-tagger -> Processing -> Preparing -> Ready for tagging -> Tagging complete

5.1.4 Recommended equipment

Proper equipment will enhance your experience and data quality using WildTrax. The following equipment is recommended for processing camera image data.

  • Computer with a minimum screen size of 15″
  • A stable, high speed internet connection

Contact WildTrax Info (info@wildtrax.ca) if you have any questions.

5.2 Camera project page and tagging controls


5.2.1 Image sets tab

Image set metadata 

You can access image set metadata from the project dashboard by selecting the image sets tab. Note that this information differs in the organization image sets tab. Here you will find summaries of the following information. 

  • Location: the name of the location. 
  • Image set end date: the date of the last image collected in the image set (in the format “YYYY-MM-DD”). 
  • Motion image count: the total number of images where the camera was triggered due to heat and motion (i.e., Trigger mode = “Motion Detection” or “M”) in the image set. 
  • Tagged image count: the total number of auto-tagged and manually-tagged images in the image set. 
  • Camera task status: the processing state of the camera project task (image set) (i.e., “Uploading,” “Preparing,” “Processing,” “Ready for Tagging,” or “Tagging Complete”). 
  • Observer assigned: the WildTrax user assigned to the task (if no observer is assigned, WildTrax will use “Not Assigned” by default). 
  • Task count: the total number of tasks (recordings or image sets). 
  • Total image count [task]: the total number of images in a task. 
  • Equipment make: the make (manufacturer) of a particular sensor (e.g., “Wildlife Acoustics” or “Reconyx”). 
  • Equipment model: the model number of a particular sensor (e.g., “SM2” or “PC900”). 
  • Equipment serial number: the serial number of the sensor (e.g., “H600HJ12269118”) 
  • Image set start date/time: the date and time of the first image collected in the image set (in the format “YYYY-MM-DD HH:MM:SS”). 
  • Image set end date/time: the date and time of the last image collected in the image set (in the format “YYYY-MM-DD HH:MM:SS”). 
  • Classifier status: the status of an image set being processed through the Auto-tagger (if applicable). 
  • Tag types: the species tags that occur within the task. 

5.2.2 Verify species tab

After tagging is complete, a second check of species identification is completed to catch and correct tagging errors. It is easy to select incorrect tags from the drop-down lists, so this step is important both for double-checking species identifications and for correcting data entry mistakes. In the “Tagging complete” tab within the project page, the project administrator can assign species tags to individual taggers: 

For each species listed, select a user’s name from the user drop-down menu to assign that species to a tagger for verification, or select the users to whom you want species randomly assigned.

See section 5.3.3 – Species verification for additional information.


5.2.3 Tagging page controls and settings

From within a project, click on a camera task to open the tagging screen. Although you can see all tasks within a project, you can only tag images in tasks that are assigned to you.


The following information can be adjusted/accessed from the image tagging page:

Image tagging page views (a): You have the option to tag images in two tagging page views.

  • Full Tagging View: an image tagging view that displays all images in a chronological order, except for images that have been pre-filtered based on your processing settings (will exclude time-lapse and may exclude auto-tagged images).
  • Series View: an image tagging view that contains images within a single series, ordered chronologically. Images will appear in a series if the time between consecutive motion-triggered images is less than the defined series gap. You can switch between views while tagging; the series gap can be adjusted in the tagging interface or via project settings.
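The series-grouping rule (a new series starts whenever the gap between consecutive motion-triggered images reaches the series gap) can be sketched as follows. The function name and default gap value are illustrative assumptions, not WildTrax internals:

```python
from datetime import timedelta

def group_into_series(timestamps, gap_minutes=2):
    """Group chronologically sorted motion-trigger timestamps into series.

    An image joins the current series if it arrives less than the series
    gap after the previous image; otherwise it starts a new series.
    """
    gap = timedelta(minutes=gap_minutes)
    series = []
    for t in timestamps:
        if series and t - series[-1][-1] < gap:
            series[-1].append(t)   # continue the current series
        else:
            series.append([t])     # start a new series
    return series
```

With a 2-minute gap, images taken 30 seconds apart share a series, while an image arriving several minutes later opens a new one.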

Filters panel (b): The filters panel can be used to search for tagged images or filter for certain parameters within an image set. You can filter for tagged images containing any of the tagging field options (see section 5.2.3 – Tag types for more information), such as tags of a specific species, sex, or age, as well as a variety of descriptive tags, date ranges and field of view. From this window, you can view all hidden images (time-lapsed and auto-tagged) as well as select a date and time of images you want to view. 


Image panel (c): where the images uploaded to the image set from a specific location and camera will appear.

Icon panel (d): a panel of icons that directly correspond to the images being displayed on the page to increase the ease with which users can identify errors or batch tag image sets. Different coloured flags or boxes will appear based on the tags applied. You can access the legend through the help menu.  


Number of images per page (e): where you can define how many images are displayed per page.

Bounding box display threshold (f): the minimum classifier confidence level for which bounding boxes will be displayed for the selected detection categories. The default is managed in the camera project settings. 


Page numbers (g): depending on the number of images you display per page, your corresponding number of pages will increase or decrease.


Image information icon (h): The icon at the bottom right-hand corner of an image can be used to see the image metadata (e.g., Equipment make, Equipment model, Flash mode, etc.). The icon will also display the image URL, which can be used to download a specific image.


Help menu (i): The help menu can be found to the right of the location name; it contains information about tagging tabs, keyboard shortcuts, image selection options, a legend, and icon descriptions.


Keyboard shortcuts: keyboard shortcuts allow for easy navigation through a task. The help menu will remind you how to use these controls.


Image zoom, Toggle Detection, and quick-flag “Nice” images:

  • Use the image zoom button to view the image full-screen.
  • Once zoomed in, you can use the Toggle Detection button to turn classifier bounding boxes on or off.
  • You can also click the star icon to flag your favourite images as “Nice”.

5.2.4 Tag types

Two types of tags exist for image data, individual(s)-level tags and image-level tags. Most camera project tagging fields are optional and selected in the camera project settings.

The individual(s)-level tags refer to a tag applied to one or more individuals with the same combination of characteristics (i.e., all are adult males displaying the same behaviour). Note that what constitutes the “same combination” will depend upon the tagging field options selected within camera project settings. Individual(s)-level tags appear in the upper gray portion of the tagging page.


Image-level tags are applied based on information that can only occur once per image (e.g., snow or fire is either present in an image or it is not). Image-level tags are displayed in the larger grey box at the bottom of the tagging page. Note that this box will shift down as new tags are added.


Individual(s)-level tags

The following fields can be optionally selected in the camera project settings to classify characteristics of one or more individuals (if they have the same characteristics) in a tag.

Some of the tagging field options are “one-to-many fields,” meaning you can select multiple options that apply. In the tag report, one-to-many fields will occur as a comma-separated list.


Since more than one individual-level tag may occur for a single image, it’s important to note that each tag corresponds with a unique row in the tag report (see section 7.1 Data Downloads).

Species

The species menu is divided into Mammals, Birds, and Human tags. Common and frequently used species appear at the top of the drop-down menu to facilitate quick tagging.

Special species tags are also used:

  • Unidentified: a species tag used if the individual in the image cannot be identified based on visible features. This is often used when the only images of an individual are blurs, blotches of fur, etc.
  • NONE: a species tag used for motion-activated images with no individual(s) present.
  • STAFF/SETUP: a species tag used for the series of photos taken while staff are setting up or taking down the camera (humans that occur within the first or last series, using a 5-minute series gap). This tag is applied automatically if the STAFF/SETUP auto-tagger is selected in the project settings.
Count (camera)

Count is the number of individuals of a particular species, age, sex, behaviour, etc. (i.e., it applies to a specific tag, where all other fields are the same, rather than to an image). For example, if adults and juveniles occur together in an image, do not combine them into a single count; instead, apply separate tags for adults and juveniles, each with its own count.

The default count for all wild animals is ‘1.’ The number can be changed if the count is greater than one. The default count for domestic animals, birds, vehicles, and humans is ‘VNA.’ Users can also input VNA if they do not want to collect information in this field.
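One way to picture how counts relate to tags: each unique combination of tag fields in an image becomes its own tag with its own count. A minimal sketch, with hypothetical field names and a tuple-based structure chosen purely for illustration:

```python
from collections import Counter

def tags_for_image(individuals):
    """Collapse the individuals in one image into individual(s)-level tags.

    `individuals` is a list of (species, age, sex) tuples. Individuals
    sharing the same combination share one tag whose count is the number
    of such individuals; any differing combination yields a separate tag.
    """
    counts = Counter(individuals)
    return [
        {"species": sp, "age": age, "sex": sex, "count": n}
        for (sp, age, sex), n in counts.items()
    ]

# A cow moose with a calf produces two tags, one per age class,
# rather than a single tag with a count of 2:
print(tags_for_image([("Moose", "Adult", "Female"), ("Moose", "Juv", "Unknown")]))
```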

Age class

Age class is the categorical age class (choose one) of individual(s) in a tag (when identifiable).

Age class options:

  • Adult (Adult; default for mammals): an animal that is old enough to breed.
  • Juv (Juvenile): an animal during its first summer [mammals older than neonates but still requiring parental care]. The juvenile tag is only used for an animal’s first summer when they have apparent juvenile features, such as spots.
  • UNKN (Unknown): the age class of the individual is unclear.
  • VNA (Variable not applicable; default for domestic animals, birds, and humans): the tag does not apply, or the user is not interested in collecting information for this field.
Sex class

Sex class is the categorical sex class (choose one) of individual(s) in a tag (when identifiable). For example, ungulate species such as deer, elk, and moose can be identified by sex class based on the presence of antlers, but antlers are not present year-round. Therefore, it is recommended that antler-less ungulates are only tagged as female between May 15 and October 1. Outside of these dates, if antlers are not present, the default of UNKN should be used. Some species, such as bears, are often photographed with their young. When an adult mammal is with a juvenile, it can be assumed to be female and tagged as such.

Sex class options include: Male, Female, Unknown (default), and VNA (variable not applicable). Users can select VNA if they are not interested in collecting this information.

Behaviour

Behaviour is a one-to-many field (choose all that apply) used to classify behaviour(s) of mammals (reported as a comma-separated list when syncing tags).

Behaviour options include: Travelling, Standing, Running, Feeding/Foraging, Drinking, Bedding, Inspecting, Vigilant, Territorial Display, Rutting/Mating, and Unknown.

Health/Disease

Health/disease is a one-to-many field (choose all that apply) used to classify descriptors of health and/or disease status (reported as a comma-separated list when syncing tags).

Health/Disease options include: Poor Condition, Discolouration, Hair loss, Lumps, Scarring, Injured, Malformed (environmental and/or genetic), Diseased, Ticks, Mange, Dead, and Other.

Direction of travel

Direction of travel is a categorical field (choose one) used to classify the direction of travel of moving individual(s). The 12 categories represent the 12 positions of a clock. Assuming the camera always faces the 12 o’clock position, the option entered for a moving individual should represent the clock position that the animal moves towards in relation to the direction the camera faces. For example, if the animal travels from left to right, and the movement is perpendicular to where the camera faces, the direction of travel would be “3 -o- Clock.”

Direction of travel options include: 1 -o- Clock, 2 -o- Clock, 3 -o- Clock, 4 -o- Clock, 5 -o- Clock, 6 -o- Clock, 7 -o- Clock, 8 -o- Clock, 9 -o- Clock, 10 -o- Clock, 11 -o- Clock, and 12 -o- Clock.
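If the movement direction were recorded as a bearing relative to the camera (0° = directly away from the camera, i.e., the 12 o'clock position, increasing clockwise), converting it to the nearest clock category could look like this hypothetical helper:

```python
def clock_direction(bearing_deg):
    """Convert a movement bearing (degrees, relative to the camera's facing
    direction, clockwise) into one of the 12 clock-position categories.
    Illustrative helper only; WildTrax users simply pick the category.
    """
    hour = round((bearing_deg % 360) / 30) % 12  # each clock hour spans 30 degrees
    return f"{12 if hour == 0 else hour} -o- Clock"

# Left-to-right travel, perpendicular to the camera's facing direction:
print(clock_direction(90))   # "3 -o- Clock"
```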

Coat colour

Coat colour is a one-to-many field (choose all that apply) used to classify the coat colour(s) of mammals (reported as a comma-separated list when syncing tags).

Coat colour options include: Beige, Cream, Brown, Chocolate Brown, Dark Brown, Black, Blonde, Cinnamon, Grey, Red, Orange, Yellow, White, Melanistic, and Other.

Coat attributes

Coat attributes is a one-to-many field (choose all that apply) used to classify attributes of mammals’ coats (reported as a comma-separated list when syncing tags).

Coat attributes tag options include: Spots, Speckled, Salt-pepper, Stripes, Cross-Phase, Chest Blaze, and Other.

Antler tine attributes

Antler tine attributes is a combined field (choose one combination) used to document information on antler tine attributes, including antler position (the side of the rack being categorized), tine count (the number of antler tines present) and tine count precision (the precision of the tine count). Users will only be able to apply this tag to mammal species with antlers (e.g., Moose, White-tailed deer, etc.).

Antler tine attributes:

  • Antler position: the side of the rack being categorized (options include: Left, Right, and Symmetrical)
  • Tine count: the number of antler tines present.
  • Tine count precision: the precision of the tine count (options include: Exact, Approximate, and At least)

Note that the three antler tine attributes are concatenated into one value in the resulting tag report.

Collar

Collar flags are used to identify individuals affixed with a collar (e.g., a GPS collar; may be interpreted as “off-leash” for Domestic Dogs) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Ear tag

Ear tag flags are used to identify individuals with ear tags (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Interacting with human feature (IHF)

Interacting with human feature (IHF) flags are used to indicate when individual mammals use or interact with human features (e.g., an animal walking in the adjacent forest vs. along the fence, or digging in a compost pile) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Tag needs review

Tag needs review flags are applied when species attributes are unclear and need to be checked. Each individual-level tag in an image is associated with a review tag, so it is clear which tag needs to be reviewed (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Tag comments

Tag comments are any comments entered by the observer to describe any additional details about the individual(s)-level tag. Note there is also a field to document image comments.

Image-level tags

Image Field of View (FOV)

Field of View (FOV) is the extent of a scene that is visible in an image (Wearn & Glover-Kapfer, 2017). For remote cameras, the camera’s FOV is influenced by how the camera is set up (i.e., camera height and angle, slope, etc.), and it often remains largely unchanged throughout a deployment. However, a camera’s FOV can change during deployment, for example because snow blocks the lens or because the camera is nudged by an animal moving past it (e.g., so that it now faces another tree rather than an open area). When the camera’s FOV has changed compared to the setup view, users should use the image FOV tags to document images that shouldn’t be included as part of the observation period (and thus the sampling effort). It is important to use the image FOV tags correctly since they allow for the correct estimation of each camera’s sampling effort.

Field of View tags are only used when the camera’s FOV has changed significantly (for 3+ hours) compared to the FOV at setup. When this occurs, the images are considered to be ‘Out of range,’ and the observation period ends for the camera. The image(s) may be motion or time-lapse-triggered images.

FOV ‘Out of Range’ criteria:

  • The camera’s FOV changed for 3+ hours
  • The change in FOV was “significant”, which may occur due to:
    • Loss of visibility – e.g., the lens is more than 50% covered (by snow, vegetation, fallen trees, etc.). Discretion is used where (e.g.) cattle are leaning on a post and making the camera go in and out of position repeatedly. In such cases, the camera is said to be not working properly the whole time this is happening.
    • Major changes in the roll, pitch, and yaw of the camera:
      • Roll – the tilt is more than 30 degrees from level. Note: The lines in the image below show the angle to which the horizon would need to rotate to be considered out of range.
      • Pitch – the camera’s angle shifted upwards or downwards such that the pole (if used) is now beyond the bottom of the image or above the center of the image.
      • Yaw – the bottom of the pole (if used) is out of view beyond the right or left side of the image.
Application of Image FOV tags

There are four potential FOV tags, “Within,” “Out of Range,” “END – Last good image in FOV,” and “START – First good image in FOV”. Importantly, the Out of Range tag is automatically applied to the images between the END and START tags (if applied) after a FOV review has been completed.


If the FOV remains unchanged (or is altered for < 3 hours) compared to the setup view, the camera’s FOV is assumed to be “Within” the normal range (the default), and the observation period includes the entire deployment. Suppose, however, that you are tagging an image set with ~2,500 images, and partway through the deployment snow covered the lens for more than 3 hours (e.g., images 1001-1551), so those images should be considered “Out of Range.” Users do not need to apply the Out of Range tag manually; WildTrax populates it automatically after the Field of View review if the END and/or START tags are applied. For the first 999 images, the user can leave the image FOV as the default (“WITHIN”), since the FOV remained the same as the setup view and the observation period began with the first image. The user should apply the END tag to image 1000, the last image with the correct FOV, to signify the end of the observation period. Since the snow melted after image 1551, the user should apply the START tag to image 1552 to signify that the observation period has recommenced.

To summarize, follow these instructions to use the FOV tags correctly:

  • WITHIN (default): applied to images where the camera is assumed to have the camera setup FOV (i.e., is “WITHIN” the normal range). The Field of View field defaults to ‘WITHIN’ as images are assumed to be within range.
  • Out of Range: applied to images where the camera’s FOV has changed significantly (for 3+ hours) compared to the setup FOV. This tag does not need to be applied manually; WildTrax will complete the process automatically during the Field of View review. Images may be motion or time-lapse images.
  • END – Last good image in FOV: if the FOV has changed significantly (for > 3 hours), apply to the last image with the correct FOV (the camera setup FOV) before the FOV changed (e.g., the last image before snow covers the lens or a cow leans against the camera) to signify the end of the observational period.
    • Note that the ‘END’ and ‘START’ tags are only used if the view changes compared to the set-up images.
  • START – First good image in FOV: applied if a) the END tag was applied, and b) if the camera’s FOV returns to the correct FOV (the camera setup FOV); applied to the first image captured with the corrected FOV (e.g., snow melts or the cow stops leaning against the camera) to signify that the observation period has recommenced.
    • Don’t apply a ‘START’ tag at the beginning of an image set to indicate that the camera has been successfully set up.
    • Note that the ‘END’ and ‘START’ tags are only used if the view changes compared to the set-up images.
  • After the Field of View review is complete (see section 5.3.6 Field of View review), the ‘Out of Range’ tag will be automatically applied to the images between the END tag (end of the observation period) and the START tag (beginning of a new observation period).
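The automatic Out of Range fill between the END and START bookmarks can be sketched as follows. This is an illustrative reconstruction of the documented behaviour, not WildTrax's actual implementation; the per-image tag list is an assumed stand-in for the image FOV fields:

```python
def fill_out_of_range(fov_tags):
    """Auto-apply 'Out of Range' between END and START bookmarks.

    `fov_tags` is a chronological list of per-image FOV values
    ('WITHIN', 'END', or 'START'). Images after an END tag and before
    the next START tag (or the end of the set, if the FOV never
    recovers) become 'Out of Range'.
    """
    result, out = [], False
    for tag in fov_tags:
        if tag == "END":
            result.append(tag)
            out = True           # observation period ends after this image
        elif tag == "START":
            out = False          # observation period recommences here
            result.append(tag)
        else:
            result.append("Out of Range" if out else tag)
    return result

# Snow covers the lens between the END and START bookmarks:
print(fill_out_of_range(["WITHIN", "END", "WITHIN", "WITHIN", "START", "WITHIN"]))
# ['WITHIN', 'END', 'Out of Range', 'Out of Range', 'START', 'WITHIN']
```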
Snow presence

Snow presence flags are applied to images where snow is present on the ground (reported as ‘TRUE’ or ‘FALSE’ when syncing tags). If snow is in the air (i.e., it is snowing) but not on the ground, snow presence should not be flagged.

Image snow depth (m)

Image snow depth (m) is the depth of snow (in metres) at the distance at which the camera detects motion at the ground/snow surface level.

Image water depth (m)

Image water depth (m) is the depth of water (in metres) at the distance at which the camera detects motion at the water’s surface.

Fire presence

Fire presence flags are applied to images where the camera was clearly triggered by fire. Note: there may be animals present in these images or not (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Malfunction

Malfunction flags are applied to images when it appears that the camera is not working properly (e.g., images are completely black or pixelated) (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Nice

Nice flags are used to highlight high-quality images, so they are easier to find at a later date (reported as ‘TRUE’ or ‘FALSE’ when syncing tags).

Image comments

Image comments are comments entered by the observer to describe any additional details about the image. Note: there is also a field to document individual(s)-level comments.

Tagging field codes

You can find the field option codes by clicking the Manage button on the project page followed by Download Codes.


5.3 Image processing


There are four primary functions for camera image processing within WildTrax:

  • Image tagging — Classify species and their characteristics (individual[s]-level tags), as well as characteristics of the location (image-level tags) captured by the camera.
    • This step includes determining the sampling effort of the camera by applying image FOV tags where the images collected do not accurately capture the sampling area (e.g., the lens was covered by snow) and thus should not be considered in the sampling effort.
  • Species verification — Verify the correctness of the tags applied during manual tagging.
  • Field of View review — Confirm corrections made to sampling effort by reviewing the image FOV tags, if applied.
  • Validate location information — Confirm that the location information associated with the media on WildTrax is correct (i.e., location names match the camera prefix).

All camera project tasks with the status ‘Ready for Tagging’ can be tagged. Unless you are the organization or project admin, you can only tag the tasks that are assigned to you.

5.3.1 Image tagging

Tagging images in WildTrax is relatively flexible in that you can be as general or as detailed as your question requires. Tagging entails the application of one or more tags, composed of a species, sex, age and number of individuals (plus any optional fields enabled in the camera project settings, such as coat colour or snow depth), to each image.
From the tagging screen, select one or more images for tagging in multiple ways:

  • Click on the image to select it.
  • Click and drag your cursor over groups of images.
  • Using Shift, you can click on the first image; then, holding Shift, click on the last image to select all images in between.
  • Using Ctrl, you can select multiple images that are not in consecutive order. This includes being able to drag boxes around multiple subsets of images.
  • Select images in the panel on the left-hand side of the screen. Note: Selected images will be highlighted in teal in the number panel. Press ESC at any time to deselect highlighted images.
  • Apply tag(s) to selected image(s) using the tagging window. With image(s) selected, click on “Tag Selected X,” where X represents the number of images you have highlighted for tagging, and fill in applicable information.

Tagging form

The tagging form differs when tagging a single image (the single image tagging form) compared to when tagging multiple images (the batch image tagging form).

There are four main tagging scenarios you may encounter:

1) ≥ 1 images / 1 individual / 1 species: if you select one or more images with a single individual of a species, then a single tag is applied. Once the tag fields are completed, click “Save All and Close” to apply the tag.
2) 1 image / >1 individuals / 1 species: if you select a single image that contains > 1 individual of a single species, but age and/or sex differ among these individuals, then multiple tags are applied. Once you have completed the tag fields for the first individual, click the new-tag button to create a new tag for the next individual. Example: an image with a female moose and a calf, or a male and female deer together. In this case, you would create a unique tag for each individual.

3) 1 image / ≥ 2 species: if you select a single image that contains > 1 species, then multiple tags are applied. Once you have completed the tag fields for the first species/individual, click the new-tag button to create a new tag for the next species/individual. Example: an image where a deer triggered the camera and a coyote was also captured in the background. In this case, you would tag the deer and then create a second tag for the coyote (or vice versa).

4) >1 images / >1 individuals / 1 species / tags differ: if you select multiple images that contain > 1 individual of a single species, and age, sex, or any other tags (e.g., behaviour) differ among these individuals, then multiple tags are applied. Once you have completed the tag fields for the first individual, click the new-tag button to create a new tag for the next individual.

Picture2

Update all untagged

Images with abundant species that will all receive the same tag, such as Domestic Cows, may be left untagged until the end. Once all other images in the image set (including NONE) have been tagged, you can select the image 94 button and enter the Domestic Cow tag in the tagging window. Doing so will tag all remaining untagged images (on all pages) as Domestic Cow. image 94 can be used for any species whose tag attributes (e.g., Individual count, Sex class, or Age class) default to ‘VNA.’ Thus, image 94 can be used for all domestic animals, humans, or NONE. image 94 cannot be used if the tag attributes vary.
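The bulk-tagging rule above can be sketched in a few lines of Python. The function and data structures are hypothetical illustrations, not WildTrax code: one tag with all attributes set to ‘VNA’ is applied to every image that is still untagged.

```python
# Hypothetical sketch of the "Update All Untagged" rule; field names are
# illustrative, not the actual WildTrax schema.

def update_all_untagged(images, species):
    """Apply a species tag with 'VNA' attributes to every untagged image."""
    for image in images:
        if not image["tags"]:  # only images with no tags yet
            image["tags"].append(
                {"species": species, "count": "VNA", "sex": "VNA", "age": "VNA"}
            )
    return images

image_set = [
    {"id": 1, "tags": [{"species": "Moose", "count": "1", "sex": "Female", "age": "Adult"}]},
    {"id": 2, "tags": []},  # untagged: will receive the bulk tag
    {"id": 3, "tags": []},  # untagged: will receive the bulk tag
]
update_all_untagged(image_set, "Domestic Cow")
```

Because every remaining image gets an identical tag, the action is only valid when the attributes do not vary, which is why WildTrax restricts it to species whose attributes default to ‘VNA.’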

5.3.4 Validate location information

Double-checking the location information is an easy but important task. It should be completed for all applicable tasks where a reference sign was used during field set-up and/or pick-up activities. If this does not apply, please continue to section 5.3.1 – Tagging images.

If applicable, you can review the initial STAFF/SETUP photos and compare the location name on the reference sign against the photo labels in WildTrax. If the STAFF/SETUP portion of the auto-tagger and ‘Hide Auto-’ were selected in the camera project settings, STAFF/SETUP photos will be automatically hidden from view on the image tagging page. If these settings apply, the filter will need to be adjusted to complete location validation.

1) Click on a task to be taken to the tagging page.

2) On the filter panel, uncheck image 119.

Notify the project admin if there are any mismatches.

5.3.2 Field of View review

Field of View (FOV) review is completed only if an ‘Out of Range,’ ‘END,’ and/or ‘START’ tag was applied during the tagging process, and only after all images in an image set have been tagged.

field of view

It is easiest to think of the image FOV tags as bookmarks applied during the manual tagging. These bookmark tags are then used to confirm exactly when the camera’s Field of View changed by reviewing the images again, but this time, in reverse chronological order (i.e., last photo taken to first photo taken) and with any auto-hidden images included (i.e., all images are included; motion-triggered, time-lapsed, auto-tagged).

image 106

To complete a Field of View review:

  1. Locate the image FOV tag(s) applied during the tagging process; image FOV tags will have a yellow triangle across them, such as image 98. If you can’t locate them on the first page, click the “…” between the page numbers.
  2. Click on the page with the first yellow triangle and locate the last image taken within the desired Field of View, which will occur to the right of the image marked with the “END” tag as images are in reverse chronological order.
  3. Select the last good image, open the tagging form, and confirm that the “End – Last Good Image in FOV” tag is selected from the Field of View drop-down menu.
image 96

  4. Once you’ve done so, click on the image 104 button in the top right-hand corner.

fov review
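The reverse-chronological review described above can be sketched in Python. The image records and field names here are hypothetical: images are revisited newest-first, and the ‘END’ bookmark marks the last good image within the camera’s field of view.

```python
from datetime import datetime

# Illustrative sketch of FOV review ordering; the records below are
# invented examples, not WildTrax data.
images = [
    {"taken": datetime(2023, 6, 1, 8, 0), "fov_tag": None},
    {"taken": datetime(2023, 6, 3, 14, 30), "fov_tag": None},
    {"taken": datetime(2023, 6, 2, 9, 15), "fov_tag": "END"},
]

# Reverse chronological order: the last photo taken is reviewed first.
review_order = sorted(images, key=lambda img: img["taken"], reverse=True)

# Locate the bookmarked 'END' image within the review sequence; the last
# good image in the FOV appears to its right (i.e., earlier in the list).
end_index = next(i for i, img in enumerate(review_order) if img["fov_tag"] == "END")
```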

5.3.3 Species verification

image

Species verification is completed as part of a quality control step within WildTrax to ensure accurate application of image tags. This step is only carried out when all image sets within a project have a status of “Tagging Complete”. In general, all wild mammals are double-checked. Domestic and bird species are verified at the discretion of the project admin, who assigns species to taggers.

The main objectives for species verification are:

  • Ensure manually applied species tags are correct
  • Ensure context-tagged species tags are correct
  • Ensure auto-tagged species tags are correct (if applicable)
  • Conduct additional analyses on verified images

To complete species verification:

Access your assigned species through the Verify Species tab of the project page.

species verification

Checks to complete during species verification are: Correct species ID tag(s) and use of attribute tags.

If errors are found, you can edit the tag either by (a) clicking on the image and selecting the  button in the top right-hand corner of the screen to open the tagging form or (b) clicking on the tag below the image. In both cases, edit the tag in the tagging form and click “Save All and Close”.

Once all images on a page have been checked, click on the “Verify Species” button at the bottom of the screen. The next page of images to verify will automatically appear. When all images on a page are verified, the images and page navigation boxes will be filled in with green to indicate they are complete.

canada lynx

Notes on species verification:

  • If an image was given a “Tag Needs Review” tag during the first round of tagging, that image will have an orange border around it in the side panel to emphasize that this image’s species tag needs to be reviewed with extra care.
white tailed deer
  • ‘Unidentified’ and ‘Tag Needs Review’ tags: Images labeled ‘Unidentified’ or ‘Tag Needs Review’ are, respectively, unidentifiable species or shapes that triggered the camera, or identifications that taggers are not 100% sure of. These tags should be double-checked to verify that they have been applied correctly. Tags flagged ‘Tag Needs Review’ that cannot be confirmed but still possess identifying features should remain as “Tag Needs Review”. However, if the species in the image cannot be identified and there are no identifiable features, re-tag the image as “Unidentified” and remove the “Tag Needs Review” flag.

Chapter 6 Point Count Data

ErikHedlin Seiurus aurocapilla PeaceRiver

The last sensor chapter will walk you through point count data.

6.1 Concepts

A point count is a methodology used to survey animals. It involves an observer standing at a pre-determined location for a specific period of time, counting the individuals they detect. For birds specifically, detections can be either aural or visual.

To account for detection error (a species may not sing during the survey, or the observer may miss or misidentify it), distance bands and duration intervals are used. Together, these attributes define a survey in WildTrax. When the observer detects a species, the detection is assigned a distance band, a duration interval, a species, and an abundance (count of individuals). Each detection becomes an observation.
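These attributes can be sketched as a small data model. The field names and the band and interval labels below are illustrative assumptions, not WildTrax’s actual schema:

```python
from dataclasses import dataclass

# Illustrative model of a point count observation; labels are hypothetical.
@dataclass
class Observation:
    species: str
    distance_band: str      # e.g. "0-50m"
    duration_interval: str  # e.g. "0-3min"
    abundance: int          # count of individuals detected

# One survey = the set of observations collected with the same methods.
survey = [
    Observation("Ovenbird", "0-50m", "0-3min", 1),
    Observation("Ovenbird", "50-100m", "3-5min", 2),
    Observation("White-throated Sparrow", ">100m", "0-3min", 1),
]

total_individuals = sum(obs.abundance for obs in survey)
```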

Surveys and visits are different things in WildTrax. Visits refer to a human going to a location whereas surveys are the unique combination of methods used to conduct a point count. 

6.2 Point count survey page

The point count survey page outlines the observations and many other attributes of the survey. The page is broken up into two sections: the header and the observations panel. The header contains information on the duration method, distance method, location name and date and time of the survey. It also contains the weather panel similar to the ARU and camera tagging pages. 

image

Within the observation panel, you can also click through to the Map tab and the Location photos tab to view the spatial information and any photos taken during the visit.

The fields in each row correspond to the number of individuals observed in each distance band and duration interval. If multiple individuals were heard, a new row is created.

image

Chapter 7 Data Download and Data Discover

Black oystercatcher Haematopus bachmani along Pacific Coast of North America


The final chapter in this guide will demonstrate how you can download, extract and discover data across WildTrax through Data Downloads and Data Discover.

7.1 Data Downloads

Data Downloads is the section of the WildTrax website where you can export and download your processed and tagged data from projects. The format is a zip file containing multiple CSVs (up to a maximum of 10 projects at a time) with columns customized to each sensor type.

Your access to the projects within Data Downloads depends on your organization and project membership and on the status of the project. Administrators can download data at any time to perform preliminary analyses. You can also authenticate into WildTrax using the wildRtrax package if you’re an R user.

To download data from a project, log in to WildTrax, go to the My Data button, and click Data Downloads. Choose the sensor type, and then choose the projects whose data you’d like to download.

image

You can use the checkbox on the side of each project to select it for download. You can also search or filter by organization or project name.

image

Click Next once you have your projects selected to download the zip file of raw data.

image

WildTrax reports are standardized in content, terminology, and field codes across data collection methods (ARU, camera, point count) to make navigation and integration across methods easy and intuitive and to meet the diverse needs of users.

WildTrax reports will be downloaded separately for each data collection method and each project.

The ZIP file downloaded for a project contains a combination of the following CSVs (depending on project type):

  • Main report (all projects): a report containing a variety of fields from various levels of the database; this should be the only report most users need. Each row is a single tag (ARU, camera) or detection (point count).
  • Project report (all projects): a report containing information on the project description fields (all project types).
  • Location report (all projects): a report containing details about all of the locations within the project. Note that what is included in this report will depend on the organization’s location privacy settings.
  • image 37 Image set report: a report containing details about each image set within the project. Note that there may be multiple image sets per deployment.
  • Recording / Image / Point count report: 
    • image 38 Recording: a report containing details about individual recordings.
    • image 37 Image: a report containing details about individual images.
    • Point count: a report containing details about individual point counts.
  • image 38  image 37 Tag report: an all-purpose, long-format list of all of the tags in the project. For ARU projects, this includes a detailed summary of the tags in the projects with various audio statistic results. For camera projects, recall that there may be more than one tag per image, and thus there may be multiple rows for a single image.
  • Classifier reports
    • image 38 BirdNET report (ARU): a report detailing the results BirdNET found for each recording in the project. Recordings that do not produce any results are labelled ‘Not Generated’.
    • image 37 MegaDetector: an output related to automated classification via MegaDetector v5 for each image in the project.
    • image 37 MegaClassifier: an output related to automated classification via MegaClassifier v.01 for each image in the project.
  • English column definitions
  • French column definitions

The structure of the reports generated by WildTrax was developed through a multi-stage user engagement process. If you’re looking for another report format or would like to know more about how to use the data included in the reports, contact info@wildtrax.ca.

WildTrax reports are also available via the wildRtrax R package.

The ZIP file downloaded for acoustic projects contains the following CSVs:

  • abstract.csv
  • location_report.csv
  • main_report.csv
  • project_report.csv
  • recording_birdnet.csv
  • recording_report.csv
  • tag_report.csv
  • english_column_definitions.csv
  • french_column_definitions.csv

The ZIP file downloaded for camera projects contains the following CSVs:

  • abstract.csv
  • main_report.csv
  • project_report.csv
  • location_report.csv
  • image_set_report.csv
  • image_report.csv
  • tag_report.csv
  • megadetector_report.csv
  • megaclassifier_report.csv
  • english_column_definitions.csv
  • french_column_definitions.csv

The ZIP file downloaded for point count projects contains the following CSVs:

  • project_report.csv
  • point_count_report.csv
  • main_report.csv
  • location_report.csv
  • abstract.csv
  • english_column_definitions.csv
  • french_column_definitions.csv
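If you work with these downloads programmatically, the report CSVs can be read straight from the ZIP. Below is a minimal Python sketch using only the standard library; the archive built here is a tiny stand-in (with invented column values) so the example runs without a real download, whereas a real archive contains the full set of reports listed above.

```python
import csv
import io
import zipfile

def load_reports(zip_bytes):
    """Return each CSV member of the archive as a list of row dicts."""
    reports = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if name.endswith(".csv"):
                text = zf.read(name).decode("utf-8")
                reports[name] = list(csv.DictReader(io.StringIO(text)))
    return reports

# Stand-in archive for demonstration; member names follow the lists above,
# but the columns and values are invented for illustration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("main_report.csv", "location,species_code\nEX-LOC-01,OVEN\n")
    zf.writestr("project_report.csv", "project,status\nExample,Published\n")

reports = load_reports(buf.getvalue())
```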

7.2 Data Discover

data discover landing page

Data Discover is the central hub for exploring environmental sensor data in WildTrax. In Data Discover, you can search for data from ARUs, cameras, and point counts, using a variety of attribute filters, and create summary statistics within a dynamic mapping interface. Here, you can gain a comprehensive understanding of environmental sensor data in an area that interests you. Data Discover allows you to see which organizations have published data on WildTrax and which species were detected, and explore media elements such as images and sounds captured in the environment.

Explore Data Discover and leverage the wealth of environmental sensor data available for scientific research. You can easily search for published data by project or organization; once you’ve found data that interests you, head over to the organization or project dashboard, Data Downloads, or wildRtrax for data download options.

Filter Panel

Use attribute filters or select a specific sensor to search available data. On the left side of the interface, the Filter Panel houses various filters for refining your search. Results will be displayed on the map and in a table below. Ensure locations with spatial coordinates are visible on the map, and toggle between different base maps (light or satellite) in the top right corner.

filter panel

You can search by:

  • Taxonomy: Classify data based on class, order, family, and genus.
  • Species: Search for individual species or add multiple species to your selection.
  • Organization
  • Project
  • Dates and times (also months and hours) within set intervals or with start and end dates

You can delete the selected options using Delete Layer at the bottom left of the filter panel.

Layers

layers

Explore data in depth with up to five customizable layers. Create a new layer by clicking on the colored number in the filter panel. Alternatively, duplicate an existing layer to preserve its results and further refine your exploration. Each layer can be duplicated, deleted, or shown and hidden.

The map below is an example overlay of two layers: the blue layer represents wolf detections on ARUs and the green layer represents wolf detections on wildlife cameras.

example data discover map

Searching an area of interest

Refine your selection using the polygon tool in the top-right corner. Create a polygon on the map for a more targeted filter. Each layer supports only one polygon, specific to that layer.  Access the entire layer’s summary by clicking the Layer Summary icon in the bottom right corner. This will also bring up the Summary Window. To remove a polygon, select it on the map and click the Garbage can icon.

area of interest

Summary Window

Summary Statistics offer a visual representation of the organizations, projects, species, and tag counts in your layer.

Within the Summary Window, insights are available in the Summary and Media tabs:

Summary Tab: View pie charts detailing the number of organizations, projects, and species for your selected area. Scroll down for bar charts representing tag counts across months and hours.

summary tab

Media Tab: Tiles correspond to species tags. Play audio clips or view images. Observe the minimum and maximum frequency of an audio clip in its ARU spectrogram. Note that point counts do not include any media.

media tab

Downloading data of interest

After using Data Discover, you may have found an organization or project that you’re interested in – so what’s next? Head over to either the Organization or Project dashboard to find out more about the data owners or the project or to request access to the data. Once you are granted access by the project administrators, proceed to Data Downloads to acquire the data.

Alternatively, you can use the R package wildRtrax to download data.

Sharing your data on Data Discover

Interested in contributing to Data Discover? Want your data discoverable? Complete the following checklist before publishing your data:

  • All tasks are completed
  • Species have been verified
  • Location or data privacy settings are set
  • Project status is changed to Published – Map Only, Published – Map+Report, or Published – Public so that others can find your data