1. College of Computer and Information Science, Fujian Agriculture and Forestry University, Fuzhou 350002, China 2. College of Forestry, Fujian Agriculture and Forestry University, Fuzhou 350002, China 3. Forestry Post-Doctoral Station of Fujian Agriculture and Forestry University, Fuzhou 350002, China 4. Key Laboratory for Ecology and Resource Statistics of Fujian Province, Fuzhou 350002, China
Image recognition based on low-altitude remote sensing imagery provides a new technological opportunity for forest survey and monitoring. In this study, the authors took the permanent gully in the Benggang area of Anxi County, Fujian Province, as a case study and constructed an FC-DenseNet model to identify tree species from low-altitude UAV optical imagery. First, the dense modules of the FC-DenseNet extract features from the spectral images and enrich the information carried by the deep network, while the transition-down blocks reduce the spatial dimensions of the feature maps and highlight texture and spectral features; then, the transition-up blocks restore the prediction to the scale of the original image and fuse it with information from the shallow dense modules; finally, a Softmax classifier performs pixel-level classification to complete the tree species recognition. The results are as follows: ① The FC-DenseNet model based on low-altitude aerial images could not only distinguish vegetation from non-vegetation but also map their spatial distribution; the pixel-level accuracy of the FC-DenseNet-103 model for vegetation and non-vegetation is 92.1%, and the 103-layer network performed best among the depths tested. ② When the tree species are subdivided into 13 categories, the accuracy of the FC-DenseNet-103 model for the dominant species reaches 79%. The following conclusions were reached: the FC-DenseNet model based on low-altitude aerial optical imagery achieves high tree species classification accuracy, and because such imagery has low acquisition costs and short acquisition cycles, it can facilitate forest resource surveys and tree species detection. The results provide a new deep-learning-based method for tree species recognition.
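To make the pipeline described above more concrete, the sketch below shows a minimal FC-DenseNet-style segmentation network in PyTorch: dense blocks, a transition-down block, a transition-up block with shallow-feature fusion, and a per-pixel Softmax classifier. The layer counts, growth rate, stem width, and the 13-class output used here are illustrative assumptions only, not the exact configuration of the FC-DenseNet-103 model used in this study.

```python
# Minimal FC-DenseNet-style sketch (illustrative assumptions, not the paper's exact setup).
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 conv; the new features are concatenated onto the input."""
    def __init__(self, in_ch, growth):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_ch)
        self.conv = nn.Conv2d(in_ch, growth, kernel_size=3, padding=1)
    def forward(self, x):
        return torch.cat([x, self.conv(torch.relu(self.bn(x)))], dim=1)

class DenseBlock(nn.Module):
    """Stack of dense layers; channel count grows by `growth` per layer."""
    def __init__(self, in_ch, growth, n_layers):
        super().__init__()
        layers, ch = [], in_ch
        for _ in range(n_layers):
            layers.append(DenseLayer(ch, growth))
            ch += growth
        self.block = nn.Sequential(*layers)
        self.out_ch = ch
    def forward(self, x):
        return self.block(x)

class TransitionDown(nn.Module):
    """1x1 conv + 2x2 max-pooling: halves the spatial resolution."""
    def __init__(self, ch):
        super().__init__()
        self.conv = nn.Conv2d(ch, ch, kernel_size=1)
        self.pool = nn.MaxPool2d(2)
    def forward(self, x):
        return self.pool(self.conv(x))

class TransitionUp(nn.Module):
    """Transposed conv: doubles the spatial resolution before skip fusion."""
    def __init__(self, ch):
        super().__init__()
        self.up = nn.ConvTranspose2d(ch, ch, kernel_size=2, stride=2)
    def forward(self, x):
        return self.up(x)

class TinyFCDenseNet(nn.Module):
    """One down path, a bottleneck, one up path, and a per-pixel classifier head."""
    def __init__(self, in_ch=3, n_classes=13, growth=12):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, 48, kernel_size=3, padding=1)
        self.db1 = DenseBlock(48, growth, 4)                # shallow dense block (skip source)
        self.td1 = TransitionDown(self.db1.out_ch)
        self.bottleneck = DenseBlock(self.db1.out_ch, growth, 4)
        self.tu1 = TransitionUp(self.bottleneck.out_ch)
        fused = self.bottleneck.out_ch + self.db1.out_ch    # channels after skip fusion
        self.db2 = DenseBlock(fused, growth, 4)
        self.head = nn.Conv2d(self.db2.out_ch, n_classes, kernel_size=1)
    def forward(self, x):
        skip = self.db1(self.stem(x))
        x = self.bottleneck(self.td1(skip))
        x = torch.cat([self.tu1(x), skip], dim=1)           # fuse shallow dense-module features
        return self.head(self.db2(x))                       # per-pixel class logits

# Usage: per-pixel class probabilities for a 3-band UAV image tile.
logits = TinyFCDenseNet()(torch.randn(1, 3, 128, 128))
probs = torch.softmax(logits, dim=1)                        # shape (1, 13, 128, 128)
```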