Publication:
Soil classification with multi-temporal hyperspectral imagery using spectral unmixing and fusion

dc.contributor.affiliationMiddle East Technical University; Devlet Su Isleri (DSI); Ministry of Forestry & Water Affairs - Turkey; Turkish Aeronautical Association; Turk Hava Kurumu University
dc.contributor.authorKaba, Eylem; Leloglu, Ugur Murat
dc.date.accessioned2024-06-25T11:45:37Z
dc.date.available2024-06-25T11:45:37Z
dc.date.issued2023
dc.description.abstractSoil maps are essential sources for a diverse range of agricultural and environmental studies; hence, the detection of soil properties using remote sensing is a topic of active research. Satellites carrying hyperspectral sensors offer possibilities for estimating soil properties. However, the main obstacle to soil classification with remote sensing methods is vegetation, whose spectral signature mixes with that of the soil. The objective of this study is to detect soil texture properties after eliminating the effects of vegetation from hyperspectral imagery and reducing noise by fusion. First, the endmembers common to all images and their abundances are determined. The endmembers are then classified as stable (soil, rock, etc.) or unstable (green vegetation, dry vegetation, etc.). The method eliminates vegetation from the images with orthogonal subspace projection (OSP) and fuses multiple images with a weighted mean for a better signal-to-noise ratio. Finally, the fused image is classified to obtain the soil maps. The method is tested on synthetic images and on Hyperion hyperspectral images of an area in Texas, United States. With three synthetic images, the individual classification accuracies are 89.14%, 89.81%, and 93.79%. After OSP, these rates increase to 92.23%, 93.13%, and 95.38%, respectively, and fusion raises the accuracy to 96.97%. With real images from 22/06/2013, 25/09/2013, and 24/10/2013, the classification accuracies increase from 70.51%, 68.87%, and 63.18% to 71.96%, 71.78%, and 64.17%, respectively; fusion provides a further improvement, with 75.27% accuracy. Analysis of the real images from 2016 yields similar improvements: the classification accuracies increase from 57.07%, 62.81%, and 63.80% to 58.99%, 63.93%, and 66.33%, respectively, and fusion again provides the best classification accuracy for this experiment, 69.02%.
The results show that the method can improve classification accuracy by eliminating vegetation and fusing multiple images. The approach is promising and can be applied to various other classification tasks.
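The two core operations named in the abstract, orthogonal subspace projection to suppress the unstable (vegetation) endmembers and a weighted-mean fusion of multiple images, can be sketched as below. This is a minimal illustration, not the authors' implementation: the endmember matrix, pixel layout, and inverse-noise-variance weights are assumptions for the example.

```python
import numpy as np

def osp_project(pixels, unwanted):
    """Project pixel spectra onto the subspace orthogonal to unwanted endmembers.

    pixels:   (n_pixels, n_bands) array of spectra.
    unwanted: (n_bands, k) matrix whose columns are the unstable endmember
              signatures (e.g. green and dry vegetation) to be suppressed.
    """
    U = np.asarray(unwanted, dtype=float)
    # OSP operator: P = I - U (U^T U)^{-1} U^T, computed via the pseudoinverse.
    P = np.eye(U.shape[0]) - U @ np.linalg.pinv(U)
    return pixels @ P.T  # P is symmetric, so P.T == P

def fuse_weighted_mean(images, noise_vars):
    """Fuse co-registered images with weights inversely proportional to
    each image's noise variance (an assumed weighting scheme), improving SNR.
    """
    w = 1.0 / np.asarray(noise_vars, dtype=float)
    w /= w.sum()
    return sum(wi * img for wi, img in zip(w, images))
```

After projection, every pixel spectrum is exactly orthogonal to the suppressed endmember columns, so their contribution to the subsequent classification is removed; the fused image is then a convex combination of the input images.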
dc.description.doi10.1117/1.JRS.17.044513
dc.description.issue4
dc.description.pages27
dc.description.researchareasEnvironmental Sciences & Ecology; Remote Sensing; Imaging Science & Photographic Technology
dc.description.urihttp://dx.doi.org/10.1117/1.JRS.17.044513
dc.description.volume17
dc.description.woscategoryEnvironmental Sciences; Remote Sensing; Imaging Science & Photographic Technology
dc.identifier.urihttps://acikarsiv.thk.edu.tr/handle/123456789/1313
dc.language.isoEnglish
dc.publisherSPIE-SOC PHOTO-OPTICAL INSTRUMENTATION ENGINEERS
dc.relation.journalJOURNAL OF APPLIED REMOTE SENSING
dc.subjectsoil classification; hyperspectral; random forest; unmixing; image fusion
dc.titleSoil classification with multi-temporal hyperspectral imagery using spectral unmixing and fusion
dc.typeArticle
dspace.entity.typePublication