Classification of tea gardens based on multi-source high-resolution satellite images using a multi-dimensional convolutional neural network
LIAO Kuo1,2, NIE Lei3, YANG Zeyu4, ZHANG Hongyan3, WANG Yanjie3, PENG Jida1,2, DANG Haofei1,2, LENG Wei3
1. Fujian Institute of Meteorological Sciences, Fuzhou 350008, China
2. Wuyi Mountain National Climate Observatory, Wuyishan 354300, China
3. Wuhan Jiahe Technology Co., Ltd., Wuhan 430200, China
4. Zhejiang Wanwei Spatial Information Technology Co., Ltd., Nanjing 210012, China
Wuyishan City has complex terrain and tea plantation structure, and its frequent cloudy and rainy weather makes cloud-free satellite images difficult to acquire, so tea gardens are hard to extract from a single image source. To address this problem, we combined the spectral information of Sentinel-2 images with the texture features of Google images of Xintian Town, Wuyishan City, and established a tea garden classification method based on multi-source high-resolution satellite images and multi-dimensional convolutional neural networks (MM-CNN). In this method, tea gardens and suspected tea gardens were extracted, respectively, by two models built on images of different spatial resolutions, one based on a one-dimensional CNN and the other on a two-dimensional CNN. The outputs of the two CNN models were then combined to generate a high-accuracy map of tea garden distribution in the study area in a relatively economical and efficient way. The results showed that the spatial distribution accuracy of the tea gardens identified by MM-CNN was better than that obtained from a single image source. The MM-CNN method is highly universal and robust, and it provides a reference for efficiently monitoring the distribution of tea gardens over large hilly areas of South China.
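To make the workflow described above concrete, the sketch below shows one possible realization in PyTorch: a one-dimensional CNN that classifies each pixel from its Sentinel-2 spectral vector, a two-dimensional CNN that classifies a high-resolution Google-image patch by its texture, and a simple rule that keeps only areas flagged by both branches. The layer widths, kernel sizes, band count, and fusion rule here are illustrative assumptions; the abstract does not specify the actual MM-CNN configuration.

```python
# Hypothetical two-branch sketch of the MM-CNN idea; architecture details are
# assumptions for illustration, not the configuration reported in the paper.
import torch
import torch.nn as nn

class SpectralCNN1D(nn.Module):
    """1D CNN over the Sentinel-2 band vector of a pixel (spectral branch)."""
    def __init__(self, n_bands: int = 10, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):  # x: (batch, 1, n_bands)
        return self.classifier(self.features(x).squeeze(-1))

class TextureCNN2D(nn.Module):
    """2D CNN over a high-resolution image patch (texture branch)."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):  # x: (batch, 3, H, W)
        return self.classifier(self.features(x).flatten(1))

def combine(spectral_logits, texture_logits):
    """Keep an area as 'tea garden' only if both branches agree
    (one simple fusion rule; the paper's rule may differ)."""
    spectral_pred = spectral_logits.argmax(dim=1)
    texture_pred = texture_logits.argmax(dim=1)
    return (spectral_pred == 1) & (texture_pred == 1)

# Example shapes: 10 Sentinel-2 bands per pixel, 64x64 Google-image patches.
s2 = torch.randn(8, 1, 10)
patch = torch.randn(8, 3, 64, 64)
tea_mask = combine(SpectralCNN1D()(s2), TextureCNN2D()(patch))
```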