Background Collecting information for conservation is certainly not a new practice. The study of our natural world has
sparked the minds and interest of people for millennia. Field observations have provided some of the most
significant findings in behaviour, ecological process and the interconnectivity of the world’s biodiversity.
However, the use of information (data) in fields such as protected area management and law enforcement
monitoring for conservation has only really developed in the past decade or so. With the advancement of
technologies which allow for the rapid collection, analysis and dissemination of this information, it is more
important now than ever that the data used for decision-making is accurate and reliable.
We’re proud of how SMART has become one of the world’s leading data management systems for
conservation. The SMART Partnership is dedicated to improving and developing the software, the ‘SMART
Approach’ and the fast-growing community of users through upgrades, additional plugin applications and
the development of materials to support these advancements.
Purpose of this handbook
SMART 6 includes the ‘Quality Assurance Module’, a tool which allows users to quickly identify, edit or
remove any potentially erroneous data which could negatively impact their desired outputs, planning or
decision-making processes. It is aimed at individuals working at the data management or analysis level
within their conservation areas, with the hope that it will assist in the quality control of all existing and
incoming data. This manual aims to support SMART users by outlining some data collection
considerations and the QA Module’s functions, and by demonstrating how the tool is set up and effectively used.
Acknowledgements
The training handbook was prepared by the SMART Training Taskforce, a group of dedicated SMART
users who work across geographic regions, sites and situations where SMART is implemented, in
terrestrial and marine environments. The Training Taskforce is one working group under the SMART
Partnership, which currently comprises the following organizations: Frankfurt Zoological Society, Global
Wildlife Conservation, North Carolina Zoo, Panthera, Peace Parks Foundation, Wildlife Protection
Wildlife Conservation, North Carolina Zoo, Panthera, Peace Parks Foundation, Wildlife Protection
Solutions, Wildlife Conservation Society, World Wide Fund for Nature, and the Zoological Society of
London.
Further thanks to the Community of SMART users; your input and feedback are invaluable.
And finally, special thanks to those men and women working in the field of conservation, who tirelessly
strive to safeguard our world’s protected areas and biodiversity.
Editor: James Slade | Global Wildlife Conservation | 2019
Photo Credits:
Cover Image: Tamaraw rangers collecting field data, Mindoro, Philippines © James Slade/Global Wildlife Conservation
Page 5: BHAPU Rangers review patrol data, Zimbabwe © James Slade/Bumi Hills Anti-Poaching Unit
Back cover: © Rich Bergl/North Carolina Zoo
Table of Contents
Background
Purpose of this handbook
Acknowledgements
Module 1 OVERVIEW
Installing SMART 6 and above
Upgrading SMART 6 and above
Accessing the Quality Assurance module
Module 2 DATA COLLECTION CONSIDERATIONS
Data Collection Principles
Data & Decision Making
Data Storage
Module 3 QUALITY ASSURANCE
Track & Waypoint Errors
Configuring QA Routines
Step 1 – Configuring Routines
Step 2 – Selecting Routine Types
Step 3 – Manual vs Automatic Data Validations
Step 4 – Defining the Routine Parameters
Module 4 – VALIDATING DATA
Manually Running a QA Routine
Automated Data Validation Errors
Module 1 OVERVIEW
SMART 6 is a major release that builds upon SMART 5 with a suite of exciting new functionalities, including the introduction of the SMART Profiles platform and the beta release of SMART Mobile powered by CyberTracker. Additional key features include the SMART Field sensors plugin, integration of the R statistical platform and Global Forest Watch alerts, a new Quality Assurance module, enhancements to SMART Connect functionality and user interface, and strengthened security features. For more details, see the SMART 6 Release Page.
Installing SMART 6 and above
1. Download the applicable file from https://smartconservationtools.org/download/, depending on your operating system.
2. Copy the zipped file to a local folder and unzip it (extract the files).
3. Run the executable SMART.exe (or just SMART on macOS and Linux).
When first installing SMART, use the following credentials to log in to the sample conservation area:
User Name = smart
Password = smart
Upgrading SMART 6 and above
You can upgrade to SMART 6 from SMART 1.1.2 or later. For versions prior to 1.1.2, see the prior version upgrade instructions. To upgrade to SMART 6:
1. Back up your existing SMART database using the ‘File -> Backup System’ menu option.
2. Install SMART 6. If you are going to install it in the same location as your previous version, you must delete the previous version first. Do not merge directories.
3. On the login screen, select the ‘Advanced’ option, then the ‘Restore a Backup’ option. Follow the wizard, and when asked to select a backup file to restore, pick the file created in step 1 above.
4. A prompt will appear, warning that the backup file is not compatible with the current software version. You will be asked if you want to upgrade (and restore) – select ‘Yes’. At this point the backup will be upgraded and restored.
NOTE: Upgrading a large database may take a while (10-15 minutes for a database with ~800,000 waypoints).
Accessing the Quality Assurance module
Unlike many other plugins found in the various versions of SMART, the Quality Assurance module comes
pre-built into SMART Version 6 and above, so no installation is needed for the module itself. Installing the
other plugins, however, is highly recommended and is necessary to use the QA module together with other
SMART features, such as CyberTracker/SMART Mobile.
To install SMART plugins:
Navigate to the ‘Available Software’ window from the menu bar: File -> Install New Plugins…
o Select ‘All Available Sites’ from the “Work with” drop-down menu
o Select the SMART plugins you wish to install; leave all the other options on the screen as they are.
o Keep clicking ‘Next’, accept the agreement, and then click ‘Finish’. If you get an error message, select ‘Install Anyway’ to continue the process.
o You will need to log out and log back in again once the process is complete.
To access the Quality Assurance module:
Ensure your existing SMART database has been upgraded to SMART Version 6 or above.
Open your SMART Conservation Area and navigate to the ‘Field Data’ tab in the menu bar
You will see the Quality Assurance module, preceded by its icon.
The Quality Assurance module can be found by selecting ‘Field Data’ on the menu bar.
Once you have upgraded to SMART 6 and the necessary accompanying plugins have been installed, you
are ready to use the Quality Assurance module. The Quality Assurance tab in the drop-down menu has
three functions:
Manual Data Validation
Automated Data Validation Errors
Configure QA Routines
Each of these functions will be explained in this manual.
The Quality Assurance tab and functions found in the drop-down menu.
Module 2 DATA COLLECTION CONSIDERATIONS
The Quality Assurance (QA) module is a very useful tool for cleaning up any ‘bad’ data, such as incorrect
tracks or waypoints. The tool is configurable to users’ needs and can help to save a lot of time identifying
data which could negatively affect patrol planning, reports and more. It can be used to either ‘search’ for
errors through manual validations or to automatically identify any new errors when data are imported.
However, the QA module cannot identify or fix the many errors that can arise from poor data collection
techniques, data model issues, or bad data analysis and management.
Certain considerations should be taken throughout the entire data collection process, from the field
through to analysis, reporting and decision making. Therefore, before use of the QA module is explained,
a few techniques to improve data management and storage will be covered.
Data Collection Principles
When field data are collected, following certain principles can save users from importing bad, unnecessary
or false data into SMART before any quality control and data management is done. Data should be
collected in ways which are:
Standardized. This includes:
o Data collection method(s) (i.e. data sheets, configurable data models, mobile devices, etc.)
o Recording of events and/or information
o Monitoring techniques
o Administration and data management
Rapid. Data should be collected quickly, and time-sensitive information transferred without delay
Clear. Accurately capturing all the necessary facts of an observation/incident
Flexible. Data is important, but should not distract entirely from the main objective (i.e. LEM patrol)
Specific. Data should reflect mandates, needs and threats of the area and can be used for action
True. Accurate data is vital for informed decision making, false data can harm operations
Data & Decision Making
Users should be aware that data analysed, used in reports and for decision making will only be as good as
that which is accurately collected and managed. Users should therefore ensure that before SMART is
implemented at a site, that good data collection (following the principles above),
data model design, data management and storage practices are followed. Once
these systems are in place, quality control of incoming data should be much
simpler.
The general rule is: ‘Accurate Data -> Good Decisions, Garbage (Bad) Data -> Bad Decisions’.
Data Storage
Keeping data well-organized is useful to ensure that any SMART users, with the
correct permissions, can find and access data quickly. Having a standardized
system for categorizing attributes within your data model, as well as folders and
naming criteria for queries and reports, can save users time and effort when
managing data collected from the field.
Good data management is key to reliable and trustworthy data.
Queries & reports can be stored in folders and in a structured manner.
Module 3 QUALITY ASSURANCE
Track & Waypoint Errors
The Quality Assurance module allows users to view ‘bad data’ in the form of tracks or waypoints and to
edit these data as required. It is important to check for errors in data as part of good quality control
practices, and this should be incorporated into any data manager’s responsibilities. Errors can occur with
either of these data for a number of reasons.
Tracks
Example of ‘bad’ track data. The track ‘jumps’ from one point to a completely different location. This can affect track distances, speeds, locations
and other data which may be queried in SMART.
Track ‘jumps’ can occur when:
A handheld device (GPS, mobile phone) has incorrect position format or map datum settings
The track accuracy or track timer settings (Cybertracker -> Properties -> GPS…) are too high
The speed of travel is quicker than the track timer settings will correctly record
A GPS device’s tracklog has not been reset from use in a previous location
Always check device settings before deployment to avoid errors.
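The check behind a maximum-speed routine can be sketched in a few lines. The Python below is an illustrative sketch, not SMART's implementation: it computes the implied speed between consecutive track fixes with the haversine formula and flags any segment above a chosen ceiling, which is exactly how a track ‘jump’ reveals itself in the data.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_track_jumps(track, max_kmh):
    """Return indices of track points whose implied speed from the previous
    fix exceeds max_kmh. `track` is a time-ordered list of
    (lat, lon, unix_seconds) tuples."""
    flagged = []
    for i in range(1, len(track)):
        lat1, lon1, t1 = track[i - 1]
        lat2, lon2, t2 = track[i]
        hours = (t2 - t1) / 3600.0
        if hours <= 0:
            continue  # skip duplicate or out-of-order timestamps
        if haversine_km(lat1, lon1, lat2, lon2) / hours > max_kmh:
            flagged.append(i)
    return flagged

# A foot patrol checked against a 15 km/h ceiling; the third fix
# "jumps" roughly 11 km in ten minutes (about 67 km/h).
patrol = [(-17.900, 27.100, 0), (-17.901, 27.101, 600), (-17.800, 27.100, 1200)]
print(flag_track_jumps(patrol, 15))  # → [2]
```

A real routine would also need to handle missing fixes and the patrol-type filtering described later in this manual; the speed comparison itself is this simple.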
Waypoints
‘Bad’ waypoints are incorrectly recorded observations which appear outside of the conservation area or are offset from the intended position.
Bad waypoints can occur when:
A handheld device (GPS, mobile phone) has incorrect position format or map datum settings
An incorrect GPS location has been input.
Sighting accuracy settings (Cybertracker -> Properties -> GPS…) are too high
Sighting fix count settings are set too low
An observation is recorded with poor satellite accuracy/connection
These are just a few issues that can cause incorrect tracks or waypoints to appear. It is important to always
check the settings of any GPS or mobile device before it is deployed for data collection and to ensure that
CyberTracker/SMART Mobile properties and settings are correct.
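Detecting a waypoint that falls outside the conservation area boils down to a point-in-polygon test. As a hedged illustration (plain Python, not SMART's code), a simple ray-casting test against a hypothetical boundary looks like this:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: True if (lat, lon) lies inside `polygon`,
    given as a list of (lat, lon) vertices (closure is implicit)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        # Count edges that straddle a ray cast eastward from the point.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical square boundary standing in for a conservation area.
boundary = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
print(point_in_polygon(0.5, 0.5, boundary))  # → True  (inside the area)
print(point_in_polygon(2.0, 0.5, boundary))  # → False (flag for review)
```

A flagged point is not automatically wrong, of course; as later sections show, the module presents it for a data manager to review, edit or delete.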
Configuring QA Routines
Before the Quality Assurance module can be used to check data, routines must be configured: users define
how and where the system should search for errors and what it should look for.
There are currently two types of routines in the QA module:
Patrol Maximum Speed Routine
This checks the Track Log Speed and can be set to look for those ‘jumps’ in tracks or tracks which have
been moving at irregular speeds.
Location Routine
This can identify tracks or waypoints which occur outside of a specific Area.
Either of these routines will need to be configured to the users’ specifications using the following steps.
Step 1 – Configuring Routines
Select ‘Configure QA Routines…’ from the Quality Assurance tab under ‘Field Data’ in the menu bar.
This will open the QA Routines window.
To create a new routine, select ‘Add’ to open a new routine configuration window.
Step 2 – Selecting Routine Types
Select either a new Patrol Maximum Speed Routine or Location Routine from the drop-down menu.
Click ‘Next’ to define the routine parameters.
Step 3 – Manual vs Automatic Data Validations
Once the type of routine has been selected, users must first define the new routine with a Name. There is
also an option to write a brief Description for each routine and a ‘check box’ labelled Auto Execute.
The ‘Auto Execute’ function allows users to decide if the routine(s) will be run through a process of Manual
or Automated Data Validation checks.
Manual Validations – the QA Routine is run manually
(see: Module 4: Validating Data)
Automated Validations – the QA Routine is run
automatically when any new patrols are downloaded from a
device or manually entered into SMART.
NOTE:
• Automated Data Validations will not bring up an error message when viewing patrols or when a patrol is
downloaded – only when the ‘Automated Data Validation Errors’ section in SMART is viewed. It is
therefore best to establish a monthly or other regular schedule for reviewing this section.
• Automated Data Validations do not check historical data either – only new data entered after the QA
routines have been created for the Conservation Area and set to ‘Auto Execute’.
• Historical data must be manually validated.
Step 4 – Defining the Routine Parameters
Each routine type has different parameters which need to be defined before error checks can be run.
1. Patrol Maximum Speed Routines
Set the Maximum Speed (km/h)
Identify Patrol tracks and waypoints which are recorded above a maximum value (e.g. 100 km/h).
Select the Type(s)
This identifies the Patrol Types and which methods of Transportation will be checked for errors.
For example: A QA Routine could be designed to check for any Foot Patrols which exceed 15 km/h.
Complete the process by selecting ‘Finish’. At present, this only works with Patrol Data.
2. Location Routines
Set the Boundaries
Users can either select a Custom Polygon (.shp file), a Conservation Area defined boundary layer or input
the WKT (Well-Known Text) vertices for a polygon.
For example: All waypoints and track positions which are found outside of the Conservation Area Boundary
will be identified when the routine is run.
Complete the process by selecting ‘Finish’.
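The WKT option expects polygon vertices in Well-Known Text syntax. As a hedged sketch of what that text looks like and how it maps to vertices (SMART and GIS libraries handle full WKT, including multi-ring polygons, which this minimal parser does not):

```python
import re

def parse_wkt_polygon(wkt):
    """Parse a simple single-ring WKT POLYGON into (x, y) vertex tuples."""
    m = re.match(r"\s*POLYGON\s*\(\((.+?)\)\)\s*$", wkt, re.IGNORECASE)
    if not m:
        raise ValueError("not a single-ring WKT POLYGON")
    vertices = []
    for pair in m.group(1).split(","):
        x, y = pair.split()
        vertices.append((float(x), float(y)))
    return vertices

# Hypothetical boundary; note WKT closes the ring by repeating the first vertex.
wkt = "POLYGON ((30.1 -15.2, 30.4 -15.2, 30.4 -15.6, 30.1 -15.6, 30.1 -15.2))"
print(parse_wkt_polygon(wkt)[0])  # → (30.1, -15.2)
```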
Once the QA Routines have been defined, users can view, edit or delete routines in the QA Routines
window.
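The routine settings covered in Steps 1–4 can be thought of as plain records. The field names below are illustrative assumptions, not SMART's internal schema; they simply mirror the Name, Description, ‘Auto Execute’ flag and per-type parameters described above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QARoutine:
    # Common settings from Step 3 (names are illustrative, not SMART's schema)
    name: str
    description: str = ""
    auto_execute: bool = False  # True = run automatically when data is imported

@dataclass
class PatrolMaxSpeedRoutine(QARoutine):
    # Step 4.1 parameters: speed ceiling and the patrol types it applies to
    max_speed_kmh: float = 100.0
    patrol_types: List[str] = field(default_factory=list)

@dataclass
class LocationRoutine(QARoutine):
    # Step 4.2 parameters: a boundary given as WKT (could also be a
    # shapefile or Conservation Area boundary layer in SMART itself)
    boundary_wkt: Optional[str] = None

foot_check = PatrolMaxSpeedRoutine(
    name="Foot patrols over 15 km/h",
    description="Flags implausible foot-patrol speeds",
    auto_execute=True,
    max_speed_kmh=15.0,
    patrol_types=["Foot"],
)
print(foot_check.name, foot_check.max_speed_kmh)
```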
Module 4 – VALIDATING DATA
Manually Running a QA Routine
Now that QA routines have been configured, a data validation check can be performed.
Using an example Conservation Area, the steps to run a manual QA routine will be demonstrated below.
1. Manual Data Validation is selected from the Quality Assurance tab under ‘Field Data’ in the menu bar.
2. Next, the Location Routine is used to check for Patrol Waypoints that occur outside of the
Conservation Area. Once the dates and filters* have been selected, run the routine by clicking the
‘Validate Data...’ button (see red arrow).
*IMPORTANT: Users can choose which filter(s) to run the routine through and for which dates. Filters
include Patrol Waypoints, Patrol Tracks, Mission (Ecological Records) Waypoints and Tracks, and
Independent Incidents. At the moment, Location data checks can be done on all filters, but Patrol
Maximum Speed checks apply only to Patrol Data. For this example, only ‘Patrol Waypoint’ is selected.
Annotations on the results screen: a distance column indicates how far the selected waypoint is from the
Validation Area; the map displays where the potentially erroneous waypoints are located; and the results
window lists any waypoints found outside of the ‘Validation Area’ as defined by the QA Routine
parameters.
3. After the routine has been run, the ‘Results’ tab will indicate any potential Patrol Waypoint errors.
4. By right clicking on the selected line, users can select how they want to manage the associated data.
‘Zoom to’ the selected waypoint, clear the selection from the list or edit the data as shown below.
This deletes the waypoint from the database.
Note: It is probably best not to do any deleting here as the
waypoint might have important data associated with it.
The ‘Edit Waypoint…’ option will open a new window, in which the coordinates of a waypoint or track lines
can be edited manually.
These are the basic functions of the Quality Assurance module when running Manual Data Validations.
‘Go to Source’ – Displays the associated patrol or independent
incident that is linked to the potentially erroneous waypoint.
Edit Waypoint – Allows editing the coordinates of a potentially
erroneous waypoint.
Automated Data Validation Errors
‘Automated data checking’ is based on QA routines which have been configured to ‘Auto Execute’.
This allows for data downloaded from a device or entered manually to automatically be checked for any
potentially erroneous waypoints or tracks, depending on the parameters set in the users’ QA routines.
Data will appear in the ‘Automated Data Validation Results’ window, which must be checked manually by
selecting ‘Automated Data Validation Errors’ from the Quality Assurance tab.
Any potentially incorrect data can then be managed in the same way that Manual Data Validation results
are reviewed and corrected.
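Conceptually, auto-execute amounts to running every flagged routine over each batch of newly imported records and storing the hits for later review. The sketch below is hypothetical Python, not SMART's API; routine names, dictionary keys and the `check` callback are all illustrative.

```python
def validate_on_import(records, routines):
    """Run every auto-execute routine over newly imported records and
    collect flagged results for later review. Names and structures here
    are illustrative, not SMART's internal API."""
    results = []
    for routine in routines:
        if not routine["auto_execute"]:
            continue  # manual-only routines are skipped on import
        for record in records:
            if routine["check"](record):
                results.append((routine["name"], record))
    return results  # surfaced later under 'Automated Data Validation Errors'

# Hypothetical routine: flag waypoints north of a boundary latitude.
routines = [{"name": "Outside CA", "auto_execute": True,
             "check": lambda wp: wp["lat"] > -15.0}]
imported = [{"id": 1, "lat": -15.4}, {"id": 2, "lat": -14.2}]
print(validate_on_import(imported, routines))  # flags waypoint 2 only
```

Note the key behaviour this mirrors: nothing is reported at import time; flagged records simply accumulate until the results window is opened.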
REMINDER:
• Automated Data Validations will not bring up an error message when viewing patrols or when a patrol is
downloaded – only when the ‘Automated Data Validation Errors’ section in SMART is viewed. It is
therefore best to establish a monthly or other regular schedule for reviewing this section.
• Automated Data Validations do not check historical data either – only new data entered after the QA
routines have been created for the Conservation Area and set to ‘Auto Execute’.
• Historical data must be manually validated.