Land use and land cover (LULC) information is essential for observing and evaluating environmental change. LULC
classification using remotely sensed data is widely employed at both global and local scales, particularly in urban
areas, which contain diverse land cover types that are essential components of the urban terrain and ecosystem. At
present, object-based image analysis (OBIA) is becoming widely popular for land cover classification from
high-resolution imagery. In this study, COSMO-SkyMed SAR data were fused with optical data from THAICHOTE (also known
as THEOS, the Thailand Earth Observation Satellite) for object-based land cover classification. This paper presents a
comparison between object-based and pixel-based approaches to image fusion. For the per-pixel approach, a support
vector machine (SVM) was applied to the image fused by principal component analysis (PCA). For the object-based
approach, a nearest-neighbor (NN) classifier was applied to the fused images to separate land cover classes. Finally,
an accuracy assessment was performed by comparing the land cover maps generated from the fused image datasets with
those from the THAICHOTE image alone. Object-based classification of the fused COSMO-SkyMed and THAICHOTE images
achieved the best classification accuracies, well over 85%. These results indicate that object-based data fusion
provides higher land cover classification accuracy than per-pixel data fusion.
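The per-pixel pipeline described above (PCA fusion of a SAR band with optical bands, followed by SVM classification) can be sketched as follows. This is a minimal illustration on synthetic arrays, not the study's actual processing chain: the array sizes, the random stand-in data, and the toy training labels are assumptions for demonstration, whereas real inputs would be co-registered COSMO-SkyMed and THAICHOTE scenes with field-verified training samples.

```python
# Hypothetical sketch of PCA-based image fusion followed by per-pixel SVM
# classification. Synthetic data stands in for real co-registered imagery.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-ins: a 4-band optical image and one SAR band, both 32x32 pixels.
h, w, bands = 32, 32, 4
optical = rng.random((h, w, bands))
sar = rng.random((h, w))

# PCA fusion: transform the optical bands, replace the first principal
# component with the statistics-matched SAR band, then invert the transform.
pixels = optical.reshape(-1, bands)
pca = PCA(n_components=bands)
pcs = pca.fit_transform(pixels)

sar_flat = sar.reshape(-1)
# Match SAR mean/std to PC1 before substitution so the inverse transform
# stays in a comparable radiometric range.
sar_matched = (sar_flat - sar_flat.mean()) / sar_flat.std()
sar_matched = sar_matched * pcs[:, 0].std() + pcs[:, 0].mean()
pcs[:, 0] = sar_matched
fused = pca.inverse_transform(pcs).reshape(h, w, bands)

# Per-pixel SVM on the fused bands, trained on a subset of labeled pixels
# (toy labels here; real work would use ground-truth training polygons).
labels = (optical[..., 0] > 0.5).astype(int)
train_idx = rng.choice(h * w, size=200, replace=False)
X = fused.reshape(-1, bands)
svm = SVC(kernel="rbf").fit(X[train_idx], labels.reshape(-1)[train_idx])
pred = svm.predict(X).reshape(h, w)
```

The object-based alternative would first segment the fused image into objects and classify per-object features with a nearest-neighbor rule, which is what the study found to be more accurate.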