Worawattanamateekul J., Canisius X. J. F., Samarakoon L.,
Asian Center for Research on Remote Sensing
Asian Institute of Technology
P.O.Box4, Klong Luang, Pathumthani 12120 Thailand
E-Mail: [email protected] , [email protected]
Abstract
Food security has become a key global issue, and it is of particular concern to the Asian region owing to its rapidly expanding population. In many countries of the region, accurate evaluation and estimation of food production is not possible for lack of information. Insufficient or obsolete information also hampers timely responses when production decreases, and delays the introduction of appropriate measures to increase productivity. One basic piece of information unavailable in most Asian countries is the cultivated area, which would keep planners informed of the expected harvest and prepared for food crises in advance. This paper examines the potential of satellite remote sensing for estimating irrigated paddy cultivated area at a test site in Indonesia. Because of frequent cloud cover in this area, sole reliance on optical sensor data limits the use of satellite data for mapping. To overcome this limitation, SAR data acquired during the growing period were integrated with the optical data. Optical and SAR data from JERS-1 were used for the study. After applying various fusion methods, it was found that the combination of a vegetation index, the average SAR intensity, and a principal component of the optical data gives the optimal solution for the test area. The results show that data from different sources, acquired at various stages, can be fused satisfactorily to estimate the irrigated paddy area under cultivation.
Introduction
Data fusion covers a very wide domain, and it is difficult to provide a precise definition; several have been proposed. Pohl and Van Genderen (1998) defined image fusion as "the combination of two or more different images to form a new image by using a certain algorithm", a definition restricted to images. Hall and Llinas (1997) stated that "data fusion techniques combine data from multiple sensors, and related information from associated databases, to achieve improved accuracy and more specific inferences than could be achieved by the use of a single sensor alone"; this definition focuses on information quality and fusion methods. Taken together, these definitions imply that the purpose of data fusion is to obtain information that should at least improve image visualization and interpretation.
There are several fusion approaches. In general, fusion can be divided into three main categories according to the stage at which it is performed: pixel based, feature based, and decision based. Of these, only the pixel-based method was considered in this study. In pixel-based fusion, the data are merged on a pixel-by-pixel basis. In feature-based fusion, the different data sources are merged at an intermediate level: each image is segmented, and the segmented images are fused together. In decision-based fusion, the outputs of each single-source interpretation are combined to create a new interpretation.
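As a hedged illustration of the pixel-based category, the sketch below merges two coregistered 8-bit images by weighted averaging; the 0.6/0.4 weights and the toy arrays are assumptions for illustration, not values from this study.

```python
import numpy as np

def pixel_fuse(optical_band, sar_band, w_optical=0.6, w_sar=0.4):
    """Pixel-based fusion: merge two coregistered images pixel by pixel.

    The weights are hypothetical; any per-pixel combination rule
    (average, ratio, max) falls in the same category.
    """
    optical = optical_band.astype(np.float64)
    sar = sar_band.astype(np.float64)
    fused = w_optical * optical + w_sar * sar
    # Clip back to the 8-bit range shared by both inputs
    return np.clip(fused, 0, 255).astype(np.uint8)

# Toy 2x2 "images" standing in for coregistered optical and SAR scenes
opt = np.array([[100, 200], [50, 150]], dtype=np.uint8)
sar = np.array([[80, 120], [60, 140]], dtype=np.uint8)
print(pixel_fuse(opt, sar))
```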
Although many data sources can be fused, this study was dedicated solely to remote sensing data fusion and its visualization, considering multitemporal and multisensor combinations. Several remote sensing data sets were acquired, possible fusion techniques were applied to them, and the fused results were examined and interpreted in terms of their usefulness for identifying irrigated rice fields.
Test Area and Data Used
The selected test area is Semarang, located on the island of Java, Indonesia. Both optical and SAR data were used in this study. Figure 1 shows a false color composite of the JERS-1 OPS data of the test area, and Table 1 describes the satellite data used in the study.
Figure 1: Map of Test Area
Table 1: Remote Sensing Data Descriptions
Methodology
All remotely sensed data, both optical and SAR, require systematic corrections, which are normally applied by the data distributor. The next step for the SAR data is speckle reduction with a speckle-specific filter, which suppresses noise while retaining information. The 16-bit SAR data were then converted to 8 bits so that they could be compared with the 8-bit optical data. On the optical side, the data had to go through an atmospheric correction step. Both the optical and radar data were then coregistered so that they could be fused together. Finally, the visualization step presents the results of the fusion.
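The paper does not name the speckle filter used; as a sketch, the Lee filter is one common speckle-specific choice. The example below applies a minimal Lee filter and then rescales 16-bit SAR data to 8 bits with a percentile stretch. The 7x7 window, the global noise estimate, and the 2%/98% stretch limits are all assumptions, not parameters from the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Adaptive Lee filter: smooths homogeneous areas, preserves edges.

    Where local variance is low (homogeneous area) the output tends to
    the local mean; where it is high (edge) the original pixel is kept.
    """
    img = img.astype(np.float64)
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    var = sq_mean - mean ** 2
    noise_var = var.mean()                      # crude global noise estimate
    weight = var / (var + noise_var + 1e-12)
    return mean + weight * (img - mean)

def to_8bit(img, low_pct=2, high_pct=98):
    """Linear stretch from the 16-bit range to 0-255."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    scaled = (img - lo) / max(hi - lo, 1e-12) * 255.0
    return np.clip(scaled, 0, 255).astype(np.uint8)

# Synthetic 16-bit scene standing in for a JERS-1 SAR image
sar16 = np.random.default_rng(0).integers(0, 65535, (64, 64)).astype(np.uint16)
sar8 = to_8bit(lee_filter(sar16))
print(sar8.dtype, sar8.min(), sar8.max())
```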
Figure 2: Data Fusion Process
Figure 2 shows the general steps in the data fusion process. It should be kept in mind that each step in the flow is crucial to the succeeding steps and therefore to the overall accuracy of the image map. The selection of data for fusion is particularly important, as fusing inappropriate data can degrade the information content.
At the fusion step, several fusion techniques were applied and the fused results compared: overlay, color transformation (RGB to IHS) with substitution, principal component analysis (PCA) with substitution, and thematic combination.
- Overlay

Multitemporal data are overlaid and displayed in different color channels of RGB. This technique is suitable for single-frequency or single-polarization data such as SAR. It not only renders colors for the interpreter but also reveals changes that occurred between the acquisition dates of the multidate data.

- Color Transformation and Substitution

The multispectral image is transformed from RGB to IHS color space, the intensity is substituted with other higher-resolution data such as SPOT PAN or SAR, and the result is converted back to RGB. This is the standard approach for improving the resolution of a low-resolution multispectral image. The color characteristics of the original multispectral image are maintained (hue and saturation are unchanged), while the image can be observed in greater detail because the intensity is replaced with the high-resolution image. The result shown in Figure 4 is the fusion of Landsat TM with SAR.

- Principal Component Analysis and Substitution

The purpose of principal component analysis (PCA) is to reduce the dimensionality of the input data to a smaller number of output channels. It is most suitable for multispectral data, where there are more bands to combine, and it can reduce dimensionality because the spectral bands are highly correlated, a fact well documented for all optical sensors presently available. In PCA, most of the information in the input is transformed into the first component, and the information content decreases as the component number increases.

- Thematic Combinations

Information is derived from the optical and SAR data sources separately and then integrated to form the fused image. The combination of the NDVI image, the first principal component (PC1) of the JERS-1 OPS image, and the average of the multitemporal SAR data was investigated for the purpose of identifying irrigated rice areas.
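The color transformation and substitution technique can be sketched as below. HSV is used here as a computational stand-in for IHS (both separate chromaticity from intensity), and all arrays are synthetic stand-ins, not the actual TM or SAR scenes.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def ihs_substitute(rgb, intensity):
    """Replace the intensity of a color composite with another image.

    rgb: float array in [0, 1], shape (H, W, 3)
    intensity: float array in [0, 1], shape (H, W), e.g. coregistered SAR
    Hue and saturation are preserved; only the value channel changes.
    """
    hsv = rgb_to_hsv(rgb)
    hsv[..., 2] = intensity
    return hsv_to_rgb(hsv)

rng = np.random.default_rng(0)
tm_rgb = rng.random((32, 32, 3))   # stand-in for a Landsat TM composite
sar = rng.random((32, 32))         # stand-in for the coregistered SAR image
fused = ihs_substitute(tm_rgb, sar)
```

Because only the value channel is replaced, the fused image keeps the spectral coloring of the multispectral composite while adopting the spatial detail of the substituted image.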
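The thematic combination can likewise be sketched as an RGB stack of three separately derived products. The band ordering of the optical stack and all arrays here are assumptions; the structure (R = NDVI, G = average SAR, B = PC1) follows the combination described above.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red + 1e-12)

def first_pc(bands):
    """First principal component of a (n_bands, H, W) stack."""
    n, h, w = bands.shape
    flat = bands.reshape(n, -1).astype(np.float64)
    flat -= flat.mean(axis=1, keepdims=True)
    cov = flat @ flat.T / flat.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues ascending
    pc1 = eigvecs[:, -1] @ flat                 # project onto largest one
    return pc1.reshape(h, w)

def stretch01(x):
    """Rescale to [0, 1] for display."""
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

rng = np.random.default_rng(0)
optical = rng.random((3, 64, 64))    # assumed band order: green, red, NIR
sar_dates = rng.random((3, 64, 64))  # three SAR acquisitions

composite = np.dstack([
    stretch01(ndvi(optical[2], optical[1])),   # R: vegetation index
    stretch01(sar_dates.mean(axis=0)),         # G: average SAR intensity
    stretch01(first_pc(optical)),              # B: PC1 of optical data
])
print(composite.shape)
```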
Results and Discussions
Figure 3: Data Fusion: Overlay
Figure 4: Data Fusion: Color Transformation and Substitution (left); Thematic Combinations (right)
Figure 3 shows the multitemporal SAR data of the Semarang area. White and black areas can be interpreted as areas of no change over the whole period (September 1996 to March 1997). The primary colors blue, green, and red indicate areas that changed in September, November, and March, respectively. The color transformation technique, in which SAR data replace the intensity channel, improves the spatial detail of Landsat TM from 30 m to the 18 m of JERS-1 SAR while retaining the spectral properties of the TM data. The right image in Figure 4 shows the result of the thematic combination: the red channel represents NDVI derived from JERS-1 OPS, green the average backscatter of the multidate SAR, and blue the first principal component of the JERS-1 OPS data. More colors are generated by this fusion, implying that more information is obtained. Magenta indicates vegetated areas, which were paddy in August; blue indicates vegetated areas; green indicates non-vegetated areas with some structures, which give high backscatter but low NDVI; and yellow represents vegetation with high structure, meaning forest.
Conclusion
Combining data from different sources by thematic combination appears to be the most appropriate technique, because more information can be derived. However, the information obtained in this study still requires field verification to confirm the applicability of the fusion techniques presented here. Furthermore, fusion across multiple sensors and systems may require additional parameters, such as satellite geometry and spectral band width, which complicates the interpretation.
References
Hall, D. L., and Llinas, J., 1997, An Introduction to Multisensor Data Fusion, Proceedings of the IEEE, Vol. 85, No. 1, pp. 6-23.
Pohl, C., and Van Genderen, J. L., 1998, Multisensor Image Fusion in Remote Sensing: Concepts, Methods and Applications, International Journal of Remote Sensing, Vol. 19, No. 5, pp. 823-854.
Solberg, A. H., 1994, Multisource Classification of Remotely Sensed Data: Fusion of Landsat TM and SAR Images, IEEE Transactions on Geoscience and Remote Sensing, Vol. 32, No. 4, July 1994, pp. 768-776.
Wald, L., 1999, Some Terms of Reference in Data Fusion, IEEE Transactions on Geoscience and Remote Sensing, Vol. 37, No. 3, May 1999, pp. 1190-1193.