Abstract
In this paper, a land use/cover classification methodology for the rural/urban fringe is presented, based on the application of a neural network, multiresolution image segmentation, the construction of complex objects through object-oriented analysis, and the integration of non-spectral (ancillary) information. The study area is the municipality of Almada, located on the south bank of the Tagus River and corresponding to one of the core regions of the Lisbon Metropolitan Area (Portugal).
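As a rough illustration of the segmentation idea, the sketch below runs a graph-based segmentation at several scale parameters to produce coarser and finer object levels. This is a stand-in, not the paper's method: the workflow relies on object-oriented multiresolution segmentation (typically performed in dedicated software), and the image, algorithm, and parameters here are hypothetical.

```python
# Stand-in sketch of multiscale segmentation using Felzenszwalb graph
# segmentation from scikit-image; the paper's actual multiresolution
# segmentation algorithm is not specified in this abstract.
import numpy as np
from skimage.segmentation import felzenszwalb

rng = np.random.default_rng(0)
image = rng.random((200, 200, 3))  # stand-in for the fused 2.5 m SPOT bands

# Larger `scale` -> larger, fewer objects; each level approximates one
# scale in the multiresolution object hierarchy.
levels = {s: felzenszwalb(image, scale=s, sigma=0.8, min_size=50)
          for s in (50, 200, 800)}
for s, labels in levels.items():
    print(f"scale={s}: {labels.max() + 1} objects")
```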
The developed procedure is based on four phases: (i) a multiresolution image segmentation strategy to construct objects at different scales that closely match the shape of the final land use/cover objects (polygons); (ii) acquisition of object attributes, namely context, texture, spectral information, and shape, among others; (iii) acquisition of auxiliary statistical data from the Geographic Information Referencing Base (BGRI in Portuguese); (iv) integration of the different data types in a neural network for classification and subsequent discriminant analysis of the land use/cover spatial units (see the sketch below).
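A minimal sketch of phase (iv) follows: a multilayer perceptron classifying per-object feature vectors that combine spectral, textural, and shape attributes with the BGRI ancillary variables. The file names, feature layout, and network architecture are assumptions; the abstract does not specify the network used in the paper.

```python
# Sketch of phase (iv): neural-network classification of segmented objects.
# Feature files and array shapes are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One row per object: spectral means, texture, shape metrics, plus
# ancillary BGRI statistics joined by object location (all assumed).
X = np.load("object_features.npy")  # shape: (n_objects, n_attributes)
y = np.load("object_labels.npy")    # land use/cover class per object

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Standardize attributes so spectral, texture, shape, and census variables
# contribute on comparable scales before entering the network.
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
)
model.fit(X_train, y_train)
print("overall accuracy:", model.score(X_test, y_test))
```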
The data used in this methodological experiment was a 2004 SPOT HRVIR image, fusing the panchromatic band (Supermode, 2.5 meters) with the multispectral bands (10 meters) through a transformation between the RGB and IHS color spaces (RGB-IHS-RGB), which yielded a final spatial resolution of 2.5 meters for all bands. This resolution was preserved in the images derived from the alphanumeric database associated with the BGRI.
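The fusion step can be sketched with the fast IHS substitution: compute intensity from three multispectral bands resampled to the panchromatic grid, replace it with the pan band, and transform back. This additive form is algebraically equivalent to the RGB-IHS-RGB substitution for a linear intensity definition; the band choice and data here are hypothetical.

```python
# Minimal sketch of RGB-IHS-RGB pan-sharpening (fast IHS substitution).
import numpy as np

def ihs_fusion(r, g, b, pan):
    """Fuse three multispectral bands with a panchromatic band.

    All inputs are float arrays already resampled to the pan resolution
    (2.5 m here). Returns the sharpened (r, g, b) bands.
    """
    intensity = (r + g + b) / 3.0
    # Substituting pan for intensity and inverting the IHS transform is
    # equivalent to adding the detail difference (pan - I) to each band.
    delta = pan - intensity
    return r + delta, g + delta, b + delta

# Example with synthetic data standing in for the SPOT HRVIR bands.
rng = np.random.default_rng(0)
shape = (400, 400)  # 2.5 m grid
r, g, b = (rng.random(shape) for _ in range(3))
pan = rng.random(shape)
r_s, g_s, b_s = ihs_fusion(r, g, b, pan)
```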
Keywords
Land use/cover; Neural networks; Object-oriented multiresolution segmentation
Citation
Rocha, J., Tenedório, J. A., Encarnação, S., & Estanqueiro, R. (2007). Land use/cover classification using orbital and ancillary data, neural networks and multiresolution segmentation. In Z. Bochenek (Ed.), New Developments and Challenges in Remote Sensing: Proceedings of the 26th EARSeL Symposium (pp. 241-250). Millpress. ISBN 978-90-5966-053-3.
