Customer Stories

Bucking Tradition


Coastal wetlands are notoriously devilish to survey, both from the ground and from the air. Louisiana environmental authorities have routinely used 1-meter aerial imagery and ground surveys to map and monitor the state's coastal marshland, but that traditional approach is time consuming and vulnerable to inaccuracies. One scientist at a Louisiana environmental and geotechnical services company set out to improve on those survey methodologies by combining unmanned aerial system (UAS) data with eCognition object-based image analysis (OBIA) technology. The integrated approach has not only proven its viability for vegetation mapping, it has also yielded new revenue streams for the company.

In August 2016, Jennings, Louisiana-based JESCO, Inc., was contracted to fly its UX5 UAS over a restoration site in Terrebonne Parish, a dense marshland region near the Gulf of Mexico. Whitney Broussard, a senior scientist at JESCO, used that opportunity to test his UAS-OBIA proof-of-concept application.

"I thought the UAS hyperspatial imagery would be a natural fit for developing an OBIA-based technique," said Broussard. "Unlike traditional image-processing methodologies, OBIA software can handle the high spectral variance and subtleties of the hyperspatial data. I wanted to test the feasibility of combining the two technologies to produce meaningful coastal vegetation maps that could supplement the state's traditional monitoring programs."

To ensure the reliability and accuracy of the UAS data over the one-square-kilometer site, Broussard and his team set out five ground control points (GCPs) for each flight block. Carrying a handheld GPS, they navigated to each predefined location, laid down an elevated target and used a Trimble Geo 7X GNSS handheld unit to record each GCP's position via RTK corrections from a VRS network.

Because collecting imagery over patches of open water is a photogrammetry challenge, Broussard outfitted the UAS with a Sony Alpha 5100 sensor for natural color (RGB) imagery and a Sony NEX5 with a near-infrared (NIR) sensor. Launching the UX5 from their boat about 4 km from the landing site, the team flew three flights with the RGB sensor and one with the NIR sensor. In three hours of total flight time, the UX5 collected 4,106 images over the entire area of interest (AOI) at a ground sample distance of 2.5 centimeters.

Taking the overlapping RGB and NIR flights (1,984 images in total) as his test-case imagery sources, Broussard used Trimble's Inpho UASMaster software to generate two digital surface models (DSMs), one from the RGB data and one from the NIR data, and then used the DSMs to produce an orthomosaic from each dataset. The orthomosaics and DSMs had horizontal and vertical accuracies of 2.4 cm. All of those products served as source data for eCognition.

Broussard developed a two-tiered eCognition rule set with three separate processes to delineate and map the land/water interface and to classify the vegetation into three types. Relying predominantly on the NIR information, eCognition first identified and classified all water objects and defined the land/water boundary. It then classified the land cover: combining height thresholds, spectral algorithms and normalized difference vegetation indices, the software separated the vegetation into three classes, Spartina patens (grass), Phragmites australis (reed) and Other.
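To make the rule-set logic easier to picture, the sketch below shows a heavily simplified, pixel-based version of the same two-tier idea in Python using rasterio and NumPy. eCognition works on segmented image objects rather than individual pixels, and the file names, band order and threshold values here (the 0.05 NIR cutoff, the 0.3 NDVI cutoff and the 1-meter height break) are illustrative assumptions, not the parameters JESCO actually used.

# Simplified, pixel-based sketch of the two-tier classification described above.
# Assumes the RGB orthomosaic, NIR orthomosaic and DSM share the same grid.
import numpy as np
import rasterio

with rasterio.open("ortho_rgb.tif") as rgb_src, \
     rasterio.open("ortho_nir.tif") as nir_src, \
     rasterio.open("dsm.tif") as dsm_src:
    red = rgb_src.read(1).astype("float32")     # band 1 assumed to be red
    nir = nir_src.read(1).astype("float32")     # single NIR band assumed
    elev = dsm_src.read(1).astype("float32")    # DSM surface elevation
    profile = dsm_src.profile                   # reuse georeferencing for the output

# Tier 1: delineate the land/water interface from the NIR response.
# Open water absorbs near-infrared light, so low NIR values are treated as water.
# The cutoff depends on the radiometric scaling of the orthomosaic (DN vs. reflectance).
water = nir < 0.05

# Tier 2: classify the remaining land cover with NDVI and height thresholds.
ndvi = (nir - red) / np.maximum(nir + red, 1e-6)
water_level = float(np.median(elev[water])) if water.any() else 0.0
canopy = elev - water_level                     # rough height above the water surface

classes = np.zeros(red.shape, dtype="uint8")    # 0 = Other
classes[water] = 1                              # 1 = Water
vegetated = ~water & (ndvi > 0.3)               # vegetated land pixels
classes[vegetated & (canopy >= 1.0)] = 2        # 2 = Phragmites australis (tall reed)
classes[vegetated & (canopy < 1.0)] = 3         # 3 = Spartina patens (low marsh grass)

# Write the classification raster alongside the photogrammetric products.
profile.update(dtype="uint8", count=1)
with rasterio.open("classification.tif", "w", **profile) as dst:
    dst.write(classes, 1)

In the production workflow, segmentation and object-level statistics would replace these per-pixel thresholds, but the ordering is the same: water is separated first, and the remaining land cover is then split by spectral index and height.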
