Enhancing Land Cover Mapping through Integration of Pixel-Based and Object-Based Classifications from Remotely Sensed Imagery

Yuehong Chen, Ya'nan Zhou, Yong Ge, Ru An, Yu Chen
Remote Sensing, 2018
Pixel-based and object-based classifications are two commonly used approaches to extracting land cover information from remote sensing images. However, each has its own inherent merits and limitations. This study therefore proposes a new classification method through the integration of pixel-based and object-based classifications (IPOC). First, it employs pixel-based soft classification to obtain the class proportions of pixels, characterizing land cover details from pixel-scale properties. Second, it adopts area-to-point kriging to explore, for each pixel, the class spatial dependence between objects from object-based soft classification results. Third, the class proportions of pixels and the class spatial dependence of pixels are fused as the class occurrence of pixels. Last, a linear optimization model on objects is built to determine the optimal class label of pixels within each object. Two remote sensing images are used to evaluate the effectiveness of IPOC. The experimental results demonstrate that IPOC performs better than the traditional pixel-based and object-based hard classification methods. Specifically, the overall accuracy of IPOC is 7.64% higher than that of pixel-based hard classification and 4.64% higher than that of object-based hard classification in the first experiment, while the corresponding improvements in the second experiment are 3.59% and 3.42%, respectively. Meanwhile, IPOC produces less of a salt-and-pepper effect than the pixel-based hard classification method and generates more accurate land cover details and small patches than the object-based hard classification method.

Pixel-based classification comprises hard and soft variants. Pixel-based hard classification assigns each pixel to one of a set of mutually exclusive land cover classes in terms of its spectral properties. By contrast, pixel-based soft classification (PSC) produces the proportions (i.e., possibilities of class occurrence) of land cover classes within each pixel, because mixed pixels that contain more than one class are inevitable in various remote sensing images [8]. Usually, PSC results can be converted into pixel-based hard classification results by assigning to each pixel the class label with the maximum proportion. Pixel-based classification has long been the mainstay technique for classifying remote sensing images [9,10], especially low/medium spatial resolution remote sensing images (e.g., MODIS images and Landsat images).
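The conversion from soft to hard pixel-based results described above amounts to a per-pixel argmax over class proportions. The following sketch illustrates this on a small hypothetical proportion cube; the array shapes and values are illustrative, not from the paper.

```python
import numpy as np

# Hypothetical PSC output: class proportions for a 3x3 image with three
# classes; each pixel's proportions sum to 1 (shape: rows x cols x classes).
proportions = np.array([
    [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]],
    [[0.5, 0.4, 0.1], [0.2, 0.2, 0.6], [0.9, 0.05, 0.05]],
    [[0.1, 0.1, 0.8], [0.4, 0.5, 0.1], [0.6, 0.3, 0.1]],
])

# Pixel-based hard classification: assign each pixel the class label with
# the maximum proportion.
hard_labels = np.argmax(proportions, axis=-1)
print(hard_labels)
```

Note that this step discards the sub-pixel information carried by the proportions, which is precisely what IPOC tries to retain by fusing proportions with object-level spatial dependence before labeling.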
In recent years, with the advent of high and very-high spatial resolution remote sensing images, object-based classification has been developed as a more advanced alternative [6]. Object-based classification differs from pixel-based classification in two ways. First, object-based classification operates on objects derived from image segmentation, whereas pixel-based classification operates directly on image pixels. Second, pixel-based classification mainly uses the pixels' spectral properties, while object-based classification employs not only the spectral properties of objects but also their spatial, textural, and shape properties [6]. Despite these differences, both pixel-based and object-based classifications have achieved relatively satisfactory performance in extracting land cover information from different remote sensing images [6,7]; each has its own inherent merits and limitations. Pixel-based classification does not change the spectral properties of the pixels and may preserve land cover details; however, it struggles to use complementary properties (e.g., spatial structures) [11,12], which may lead to the salt-and-pepper effect and poorly maintained land cover patch structure in classified maps [6,13]. Although object-based classification can use both spectral and complementary properties of objects, the spectral properties of objects are smoothed by image segmentation [14], and segmentation errors caused by under-segmentation and over-segmentation can reduce the accuracy of object-based classification results [15]. The smoothed spectral properties of objects may be suitable for heterogeneous land areas; however, they are inappropriate for homogeneous land regions, because the spectral separability between different classes is smoothed and reduced in homogeneous areas [6,14], especially for medium spatial resolution remote sensing images.
Generating accurate image segmentation results is considered the crucial step in object-based classification; however, image segmentation errors are often inevitable [6,13,15–17]. For instance, image segmentation usually produces mis-segmented object boundaries (e.g., the object marked by the red polygon in Figure 1a). Meanwhile, some important small land cover patches cannot be successfully segmented and are merged into adjacent objects (e.g., the object marked by the red polygon in Figure 1b). Both marked objects are mixed objects that include more than one land cover class [15,18]. The mixed object in Figure 1a could be regarded as the classic high-resolution type defined by Woodcock and Strahler [19], where the segmented objects are smaller than the objects of interest; such objects often occur in intersection regions between different large land cover patches. Land cover information in the intersection region of this type of mixed object is often spatially related to neighboring objects. By contrast, the mixed object in Figure 1b could be viewed as the classic low-resolution type [19,20], where the segmented objects are larger than the objects of interest and usually involve isolated small land cover patches. In traditional object-based classification, mixed objects have mixed properties of different land cover classes, and all pixels within each mixed object have to be assigned to the same class (i.e., a hard classification process on objects), thus reducing the accuracy of classified maps [15].
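Given a reference class map, mixed objects of the kind discussed above can be detected by checking whether a segment covers more than one class. The sketch below is a hypothetical illustration of that check, not part of the IPOC method itself.

```python
import numpy as np

# Hypothetical segmentation map and reference land cover map.
segments = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
])
reference = np.array([
    [2, 2, 3, 3],
    [2, 2, 3, 4],   # object 1 covers classes 3 and 4, so it is mixed
])

def mixed_objects(segments, reference):
    """Return the ids of objects containing more than one land cover class."""
    return {s for s in np.unique(segments)
            if np.unique(reference[segments == s]).size > 1}

mixed = mixed_objects(segments, reference)
print(mixed)
```

Under hard classification on objects, every pixel of such a mixed object receives a single label, which is the accuracy loss that IPOC's per-pixel optimization within objects is designed to avoid.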
doi:10.3390/rs10010077