Sunday, October 19, 2008

ERDAS Automated Feature Extraction Solution: IMAGINE Objective

ERDAS announces the release of IMAGINE Objective, a new tool providing object-based multi-scale image classification and feature extraction capabilities for building and maintaining accurate geospatial content.

Since the open beta earlier this year, the IMAGINE Objective workflow has been streamlined based on customer feedback and interaction. IMAGINE Objective includes a set of tools for feature extraction, updating and change detection, enabling geospatial data layers to be created and maintained using remotely sensed imagery. With IMAGINE Objective, imagery and geospatial data of all kinds can be analyzed to produce GIS-ready maps.

“IMAGINE Objective transforms data from remotely sensed imagery into relevant information for a broad range of industries, including insurance companies, utility providers, local government institutions, forest management companies and tax assessors,” said Mladen Stojic, Senior Vice President, Product Management & Marketing, ERDAS. “The output generated reflects the image content, with smooth roads and squared-up buildings, and can be directly merged into a GIS, minimizing the need for additional post-processing.”

IMAGINE Objective bridges traditional image processing and computer vision through the use of both pixel-level and true object processing, ultimately emulating the way the human visual system interprets imagery.

Catering to experts and novices alike, IMAGINE Objective contains a wide variety of powerful tools. For remote sensing and domain experts, IMAGINE Objective includes a desktop authoring system for building and executing feature-specific (e.g., buildings, roads) and landcover (e.g., vegetation type) processing methodologies. In addition, more entry-level users may modify and apply existing examples of such methodologies to their own data.

For more information about IMAGINE Objective or other ERDAS solutions, please call +1 770 776 3400, toll free +1 877 GO ERDAS, or visit www.erdas.com.
