 
Remote Sensing for Natural Resources, 2023, Vol. 35, Issue (2): 89-96    DOI: 10.6046/zrzyyg.2022095
An improved spatio-temporal fusion model for remote sensing images based on singular spectrum analysis
AN Na1, ZHAO Yingying2, SUN Yaqin1, ZHANG Aizhu3, FU Hang3, YAO Yanjuan4, SUN Genyun3,5
1. China Aero Geophysical Survey and Remote Sensing Center for Natural Resources, Beijing 100083, China
2. Changsha Planning and Design Survey Research Institute, Changsha 410007, China
3. College of Oceanography and Space Informatics, China University of Petroleum (East China), Qingdao 266580, China
4. Satellite Environment Center, Ministry of Environmental Protection of China, Beijing 100094, China
5. Laboratory for Marine Resources, Qingdao National Laboratory for Marine Science and Technology, Qingdao 266237, China
Abstract  

Spatio-temporal fusion can generate image sequences with both high temporal and high spatial resolution. However, most current studies pursue prediction accuracy by using as much spatio-temporal data as possible and increasingly complex non-linear models, while few make full use of the intrinsic features of the images themselves, such as trends and textures. This study proposes a spatio-temporal fusion model based on two-dimensional singular spectrum analysis (2DSSA), termed 2DSSA-STFM. The model decomposes the available images into trend and detail components so that the major spatial trends and details of the image at the target moment can be predicted. First, a linear relationship between the trend components of the high- and low-spatial-resolution data was built to calculate the trend component of the image at the target moment. Then, a linear relationship between the low- and high-resolution detail components at the two time phases was established to determine the detail component of the image at the target moment. Finally, the calculated trend and detail components were combined to form the predicted image. The 2DSSA-STFM was applied to two sets of medium-resolution Landsat 7 ETM+ and MODIS images and yielded smaller errors than conventional spatio-temporal fusion models.
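
The abstract describes the fusion steps only at a high level; the Python sketch below illustrates one plausible reading of them, not the authors' released implementation. Here the 2D-SSA decomposition is realized by stacking all overlapping windows of a band into a trajectory matrix, truncating its SVD, and averaging the reconstructed windows back to image space, while the trend and detail mappings are plain per-band least-squares fits. The window size (5 x 5), rank 1, and the helper names ssa2d_decompose and fuse_band are illustrative assumptions.

# Minimal sketch of the 2DSSA-STFM idea described above; window size,
# rank, and the per-band linear mappings are illustrative assumptions.
import numpy as np

def ssa2d_decompose(img, win=(5, 5), rank=1):
    """Split one band into trend and detail with a basic 2D-SSA:
    stack all overlapping windows into a trajectory matrix, keep the
    leading singular components, and average the reconstructed windows
    back to the image grid (the 2D analogue of diagonal averaging)."""
    h, w = img.shape
    ph, pw = win
    rows, cols = h - ph + 1, w - pw + 1
    # Trajectory matrix: one vectorized window per column.
    traj = np.stack([img[i:i + ph, j:j + pw].ravel()
                     for i in range(rows) for j in range(cols)], axis=1)
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    low = (u[:, :rank] * s[:rank]) @ vt[:rank]          # rank-limited windows
    trend = np.zeros(img.shape, dtype=float)
    count = np.zeros(img.shape, dtype=float)
    k = 0
    for i in range(rows):
        for j in range(cols):
            trend[i:i + ph, j:j + pw] += low[:, k].reshape(ph, pw)
            count[i:i + ph, j:j + pw] += 1
            k += 1
    trend /= count                                       # average the overlaps
    return trend, img - trend                            # (trend, detail)

def fuse_band(fine_t1, coarse_t1, coarse_t2, win=(5, 5), rank=1):
    """Predict the fine-resolution band at the target date t2; the coarse
    images are assumed to be resampled to the fine grid beforehand."""
    f_trend, f_detail = ssa2d_decompose(fine_t1, win, rank)
    c_trend1, c_detail1 = ssa2d_decompose(coarse_t1, win, rank)
    c_trend2, c_detail2 = ssa2d_decompose(coarse_t2, win, rank)
    # Trend: fit fine trend ~ a * coarse trend + b at t1, apply at t2.
    a, b = np.polyfit(c_trend1.ravel(), f_trend.ravel(), 1)
    trend_t2 = a * c_trend2 + b
    # Detail: fit fine detail ~ c * coarse detail + d at t1, apply at t2.
    c, d = np.polyfit(c_detail1.ravel(), f_detail.ravel(), 1)
    detail_t2 = c * c_detail2 + d
    return trend_t2 + detail_t2                          # predicted fine band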

Keywords: spatio-temporal fusion; singular spectrum analysis; trend component; detail component; decomposition
CLC number (ZTFLH): TP753
Issue Date: 07 July 2023
Cite this article:   
Na AN, Yingying ZHAO, Yaqin SUN, et al. An improved spatio-temporal fusion model for remote sensing images based on singular spectrum analysis[J]. Remote Sensing for Natural Resources, 2023, 35(2): 89-96.
URL:  
https://www.gtzyyg.com/EN/10.6046/zrzyyg.2022095     OR     https://www.gtzyyg.com/EN/Y2023/V35/I2/89
Fig.1  Algorithm flowchart
Fig.2  MODIS and Landsat 7 ETM+ images of data set 1
Fig.3  MODIS and Landsat 7 ETM+ images of data set 2
Fig.4  Trend and detail components predicted from data set 1
Fig.5  Predicted image for data set 1
Method       Band    AAD      RMSE     ERGAS   SSIM
STARFM       Green   117.99   160.36   0.848   0.536
             Red     174.08   239.12           0.432
             NIR     304.41   448.40           0.438
2DSSA-STFM   Green   115.12   157.79   0.828   0.552
             Red     170.55   234.55           0.442
             NIR     291.29   418.66           0.518
Tab.1  Quantitative evaluation results of data set 1 (ERGAS is a whole-image score, so one value is reported per method)
Fig.6  Predicted image for data set 2
Method       Band    AAD      RMSE     ERGAS   SSIM
STARFM       Green   315.86   415.92   1.287   0.120
             Red     89.01    131.83           0.522
             NIR     67.69    93.68            0.624
ESTARFM      Green   16.01    21.15    1.154   0.947
             Red     18.89    25.13            0.909
             NIR     25.26    32.86            0.875
2DSSA-STFM   Green   28.29    35.45    1.233   0.894
             Red     28.02    37.30            0.834
             NIR     24.79    32.34            0.882
Tab.2  Quantitative evaluation results of data set 2 (ERGAS is a whole-image score, so one value is reported per method)
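
For reference, the per-band scores in Tab.1 and Tab.2 can be reproduced with the standard metric definitions: AAD as the mean absolute difference, RMSE as the root-mean-square error, ERGAS as a single whole-image score per method, and SSIM as computed by scikit-image. The sketch below is not tied to the authors' evaluation code; the evaluate function, the (bands, H, W) array layout, and the 30/500 Landsat/MODIS pixel-size ratio are assumptions for illustration.

# Hedged sketch of the evaluation metrics; the 30/500 resolution ratio and
# the array layout are assumptions for illustration.
import numpy as np
from skimage.metrics import structural_similarity as ssim

def evaluate(pred, ref, ratio=30.0 / 500.0):
    """pred, ref: arrays of shape (bands, H, W); returns per-band
    (AAD, RMSE, SSIM) plus one ERGAS value for the whole prediction."""
    per_band = []
    for p, r in zip(pred, ref):
        aad = np.mean(np.abs(p - r))                     # average absolute difference
        rmse = np.sqrt(np.mean((p - r) ** 2))            # root-mean-square error
        s = ssim(r, p, data_range=float(r.max() - r.min()))
        per_band.append((aad, rmse, s))
    # ERGAS aggregates band-wise relative RMSEs into one score per method,
    # which is why the tables list it only once per model.
    ergas = 100.0 * ratio * np.sqrt(np.mean(
        [(rmse / np.mean(r)) ** 2 for (_, rmse, _), r in zip(per_band, ref)]))
    return per_band, ergas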