High-resolution orthophotos are not just pretty pictures. They contain an abundance of valuable information, if you know how to extract it. Because they are too complex for visual interpretation alone, orthophotos are most effectively studied using automated image analysis functions.
Vegetation classification using orthophotos is one example of geospatial analysis that plays an important role in urban planning, forestry, agriculture, environmental monitoring, and many other types of research. Grouping vegetation with similar characteristics into classes allows analysts to quantify area, calculate change over time, and assess the health and growth of plant life. Unfortunately, manual classification is quite time-consuming and prone to human error.
Vegetation over large geographic areas is quickly classified using object-based image analysis.
The development of automated vegetation classification is a vast improvement. By applying pattern recognition and correlation tools to orthophotos, analysts can classify objects faster and more accurately than ever before. Classification of vegetation can seem daunting because there are multiple modeling methods to be considered, as well as an official vegetation hierarchy created to maintain consistency in statistics reported by federal agencies and their partners. However, the classification process can be broken down into a few simple steps with the aid of advanced image analysis software.
Automated Vegetation Classification Method:
- Select orthophotos covering the area of interest
- Import orthophotos into a software package that performs knowledge-based automated classification
- Create a simple rule set that groups similar spectral signatures
- Apply knowledge-based algorithms to group these objects into classes
- Run a classification, then edit and revise the results
- Conduct multiple iterations of the analysis to improve the results
- Export results into a GIS for use in planning, modeling, forecasting, etc.
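The core of the steps above is a rule that maps each pixel's spectral values to a class. As a minimal sketch of the idea, the following uses a simple NDVI threshold computed from red and near-infrared bands; the thresholds, band values, and class names here are illustrative assumptions, not eCognition's actual rule language.

```python
# Toy rule-based classifier: assign a land-cover class from red and
# near-infrared (NIR) reflectance. Thresholds are illustrative only.

def classify_pixel(red, nir):
    """Classify one pixel using a simple NDVI rule."""
    ndvi = (nir - red) / (nir + red) if (nir + red) else 0.0
    if ndvi > 0.3:
        return "vegetation"   # healthy plants reflect NIR strongly
    if ndvi < 0.0:
        return "water"        # water absorbs NIR
    return "bare_earth"

# Tiny 2x2 "orthophoto" of (red, nir) reflectance pairs.
image = [[(0.10, 0.60), (0.45, 0.50)],
         [(0.20, 0.05), (0.25, 0.60)]]

classified = [[classify_pixel(r, n) for (r, n) in row] for row in image]
print(classified)
```

A real workflow would read the bands from georeferenced rasters and write a classified raster back out, but the decision logic is of this per-pixel (or, in object-based analysis, per-segment) form.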
First, keeping in mind what information is required as output to meet the project goal, the user selects the appropriate orthophotos for the area of interest and imports the orthophotos into a software package capable of performing knowledge-based automated classification. A guided workflow facilitates the process from start to finish, particularly for non-technical users with a need for customized analysis.
Next, the user creates a simple rule set with segmentation algorithms that group similar spectral signatures, such as water, bare earth or vegetation. By applying machine learning, algorithms recognize the unique spectral signatures of objects and group these objects into classes. Classes should be defined according to what the customer needs, and classifications range from simple to complex: if forest vs. non-forest information is inadequate, an increased level of detail can identify deciduous vs. conifer, tree health, grass, shrubs, cultivated fields, etc.
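The segmentation step described above groups adjacent pixels with similar signatures into objects before any class logic is applied. The sketch below stands in for that idea with a flood-fill that merges touching pixels sharing the same coarse label; the 4-neighbour rule and the label grid are assumptions for illustration, not eCognition's multiresolution segmentation.

```python
# Toy segmentation: connected pixels with the same coarse spectral
# label become one "object", identified by an integer id.

def segment(labels):
    """Return a grid of object ids for 4-connected same-label regions."""
    rows, cols = len(labels), len(labels[0])
    objects = [[-1] * cols for _ in range(rows)]
    next_id = 0
    for r in range(rows):
        for c in range(cols):
            if objects[r][c] != -1:
                continue                     # pixel already assigned
            stack, objects[r][c] = [(r, c)], next_id
            while stack:                     # flood fill this region
                y, x = stack.pop()
                for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and objects[ny][nx] == -1
                            and labels[ny][nx] == labels[r][c]):
                        objects[ny][nx] = next_id
                        stack.append((ny, nx))
            next_id += 1
    return objects

grid = [["veg",  "veg", "water"],
        ["bare", "veg", "water"]]
print(segment(grid))
```

Once pixels are grouped into objects like this, classification rules can use object-level properties (size, shape, neighbouring classes) rather than pixels alone, which is the essence of the object-based approach.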
Forestry classification distinguishes trees from shrubs, fields, roads and structures.
After running a classification, the user edits and revises the results. The software allows the user to select different bands in multispectral imagery and adjust thresholds on spectral signatures for each class. Since repetitive tasks are automated during classification, multiple iterations of the analysis can be run to obtain the best results. Finally, the results are exported into a GIS for use in planning, modeling, forecasting, etc.
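The edit-and-iterate step amounts to tuning rule thresholds against analyst-verified samples and rerunning. A minimal sketch of that loop, assuming hypothetical NDVI values, reference labels, and a plain hit-rate metric:

```python
# Sweep candidate NDVI thresholds and keep the one that best matches
# a handful of analyst-verified samples. All values are illustrative.

samples = [  # (ndvi, verified class)
    (0.55, "vegetation"), (0.42, "vegetation"),
    (0.25, "other"), (0.10, "other"), (0.35, "vegetation"),
]

def accuracy(threshold):
    """Fraction of samples the threshold classifies correctly."""
    hits = sum((ndvi > threshold) == (truth == "vegetation")
               for ndvi, truth in samples)
    return hits / len(samples)

best = max((0.2, 0.3, 0.4, 0.5), key=accuracy)
print(best, accuracy(best))
```

In practice the exported results (e.g., as shapefiles or rasters) would be reviewed in a GIS between iterations, with misclassified areas feeding back into the rule set as new samples or adjusted thresholds.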
Vegetation classification is a highly useful form of geospatial analysis for many applications. The automated classification process delivers more consistent and accurate information than manual efforts, and the segmentation and classification routines continue to improve as additional knowledge is introduced and stored. eCognition takes an object-based approach, using context information to effectively quantify and classify diverse environments.
eCognition is an out-of-the-box object-based image analysis software package ideal for forestry, agriculture, environment, disaster response and land-cover mapping.