Low Cost Solutions for LiDAR and GIS Analysis

Post on 13-Jun-2015

Category:

Technology

DESCRIPTION

Discusses different low-cost and free software packages that can be used for LiDAR and GIS analysis, along with the pitfalls and successes.

TRANSCRIPT

Low Cost Solutions for LiDAR Point Cloud Analysis

-Utilizing Open Source and Low Cost software-

Lidar Technologies 2012

Cairns, 7th June, 2012

Rahman Schionning and George Corea

Atherton Tablelands Geographic Information Services

Overview

Where we were a year ago

The search for new software

SAGA Demo – LAS Capabilities

Python integration

Dealing with outputs and derivative products

The 32-bit Windows shortfall

More tools we are exploring

A future option

So...

You've gone and gotten yourself a point cloud!

Now What?

What software have we already got?

• ArcGIS and MapInfo could handle derivative products like the 1m ASCII DEM, 12.5cm Imagery and the 25cm Contours (in small portions)

• Erdas Imagine could import .las for rasterization and subsequent analysis

• Global Mapper worked well but we only had one license at the time.

• Mars Freeview

Mars Freeview

We wanted MORE!

One taste of freedom!

SAGA
Quantum GIS
OSGEO Live
Geoserver
PostGIS
Python
LasTools
Many more...

SAGA GIS Demo

Customized Tools

• SAGA / Python-based extraction of “.las” files

Dealing with outputs of LiDAR: ArcGIS-centred approaches

Even 10 km² blocks of 25 cm data (usually) crashed in vector processing

Solution is to “chunk” the datasets and then merge after analysis.
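The "chunk, process, merge" pattern described above can be sketched in plain Python (the helper names here are hypothetical; in the real workflow the per-chunk step was an ArcGIS/SAGA analysis tool, not Python code):

```python
# Minimal sketch of the "chunk the datasets and then merge after analysis"
# pattern. process_chunk is a stand-in for the real GIS analysis step.

def chunk(points, size):
    """Split a list of points into fixed-size chunks."""
    return [points[i:i + size] for i in range(0, len(points), size)]

def process_chunk(pts):
    # Placeholder for the expensive per-chunk analysis (hypothetical).
    return [(x, y, z * 2) for (x, y, z) in pts]

def chunk_process_merge(points, size):
    """Run the analysis chunk by chunk, then merge the results."""
    merged = []
    for part in chunk(points, size):
        merged.extend(process_chunk(part))
    return merged

points = [(i, i, float(i)) for i in range(10)]
result = chunk_process_merge(points, size=4)
assert len(result) == len(points)
```

The merge step is trivial here (list concatenation); with real vector outputs it would be a spatial merge/append tool, which is why chunk order and naming matter in practice.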

Issue is that for the area of TRC this means… hundreds of files =

High operator time for running processes =

Increased risk of operator error (-:

Complex models

Simplified to require minimum inputs…with Python Scripting

Property level analysis

- ArcGIS couldn’t process datasets of ~1 million points at 30 m spacing (the same was true of MapInfo and QGIS)

Utilized the UNIX “sed” command to reformat the txt dataset to a geocodable format,

then Python to “chunk” the datasets into segments of 500k points,

finally models to rejoin all the processed data.
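The reformat-then-chunk steps above can be approximated in Python alone (the original workflow used sed for the rewrite; the whitespace-to-comma field layout shown here is an assumption):

```python
import re

CHUNK_SIZE = 500_000  # segment size used in the workflow above

def reformat_line(line):
    """sed-style rewrite: collapse whitespace-separated 'x y z' text
    into a comma-separated, geocodable 'x,y,z' record (layout assumed)."""
    return re.sub(r"\s+", ",", line.strip())

def chunk_records(records, size=CHUNK_SIZE):
    """Yield successive segments of at most `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

raw = ["145.77  -17.26  402.1", "145.78  -17.27  398.5"]
rows = [reformat_line(line) for line in raw]
segments = list(chunk_records(rows, size=1))  # tiny size for illustration
```

The sed equivalent of `reformat_line` would be something like `sed -E 's/[[:space:]]+/,/g'` streamed over the whole file, which avoids loading it into memory.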

Other complex “Point Clouds”

•Some hard lessons…

Dataset size: - Spatially splitting data significantly reduces processing time even when multiple parts have to be merged at the completion of the process. Models including python components have helped significantly.

“[If] there were originally 65,000 * 1000^2 = 6.5 E10 cells in the DEM. To represent each of these requires at least four ordered pairs of either 4-byte integer or 8-byte floating coordinates, or 32-64 bytes. That's a 1.3 E12 - 2.6 E12 byte (1.3 - 2.5 TB) requirements. We haven't even begun to account for file overhead (a feature is stored as more than just its coordinates), indexes, or the attribute values, which themselves could need 0.6 TB (if stored in double precision) or more (if stored as text), plus storage for identifiers. Oh, yes--ArcGIS likes to keep two copies of each intersection around, thereby doubling everything. You might need 7-8 TB just to store the output. Even if you had the storage needed, (a) you might use twice this (or more) if ArcGIS is caching intermediate files and (b) it's doubtful that the operation would complete in any reasonable time, anyway.”

A response to my query (http://gis.stackexchange.com/questions/16110/issues-with-large-datasets) which succinctly describes the issues with large datasets.

•Some hard lessons…

- For example: a dataset which took 7 hours to process (and sometimes crashed)

processed in about 100 minutes when split into 6 parts, and then took 10 minutes to merge.
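The timing figures above work out to roughly a 4x improvement, merge included (a simple arithmetic check of the slide's own numbers):

```python
# Figures from the slide: 7 h monolithic run vs. 6-way split.
monolithic_min = 7 * 60   # 420 minutes (when it didn't crash)
split_min = 100 + 10      # ~100 min processing + 10 min merge
speedup = monolithic_min / split_min
print(round(speedup, 1))  # roughly 3.8x faster, plus far fewer crashes
```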

- Need to investigate multi-core, multi-threaded operations, but the currently available modules are too cumbersome to integrate easily into our models.

• Added “intelligence” to models so that unnecessary processes aren’t run “brute force” in ArcGIS model builder

• Some steps took ~12h to run and so this (~10min) quick step saved ~10h per model that took 30+ hours to complete.
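One way to implement that kind of "intelligence" is a cheap guard step that skips the expensive process when its output already exists or its input has nothing to process (a minimal Python sketch; the slide's actual models were built in ArcGIS ModelBuilder, and the function names here are hypothetical):

```python
import os

def run_step(input_path, output_path, process):
    """Run `process` only when needed: skip if the output is already
    there, or if the input is missing/empty (nothing to do)."""
    if os.path.exists(output_path):
        return "skipped: output exists"
    if not os.path.exists(input_path) or os.path.getsize(input_path) == 0:
        return "skipped: no input"
    process(input_path, output_path)  # the expensive (~12 h) step
    return "ran"
```

The guard itself costs seconds; when the expensive step takes ~12 h, skipping even one unnecessary run recovers most of a working day, which matches the ~10 h saving quoted above.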


More Tools

• Global Mapper ~$350

•great for conversion and clipping.

• 3D viewing exists but is not as good as Mars FreeView. Flood and viewshed analysis available.

•Meshlab ~$0

•Only useful for visualization of small areas and objects in high resolution.

•Google Sketchup ~$0-$500

•similar to Meshlab in usefulness for GIS.

•Commercial > $2500

o Makai Voyager – resamples las files to proprietary format. Highly functional for detailed analysis but commercial version is not yet launched

o Point Tools – resamples las files to proprietary format. Highly functional for detailed analysis

o LASTools -

o Mars Pro -

http://dataserver.dielmo.com/australia/TRC3D.html

Questions
