Discriminate dominant tree species in natural secondary forests from UAV hyperspectral images using a hybrid 3D-2D convolutional neural network

LI Hao, QUAN Ying, LIU Jianyang, BIAN Shaojie, WANG Bin, LI Mingze

Journal of Nanjing Forestry University (Natural Sciences Edition) ›› 2026, Vol. 50 ›› Issue (2) : 9-18. DOI: 10.12302/j.issn.1000-2006.202411024
Abstract

【Objective】To improve the classification accuracy of dominant tree species in typical natural secondary forests in northeast China, this study proposed a convolutional neural network (CNN)-based framework for tree species classification using UAV hyperspectral images.【Method】Four dominant tree species—Fraxinus mandshurica, Juglans mandshurica, Ulmus sp., and Betula platyphylla—from the Maoershan Experimental Forest Farm of Northeast Forestry University were studied. Hyperspectral images of seven regions were acquired using a novel UAV-mounted hyperspectral imager. A single-tree dataset with varying crown sizes was constructed from ground-measured data and divided into training and test sets at a 7∶3 ratio. A hybrid 3D-2D-CNN model integrating 3D and 2D convolutional layers was developed: the 3D convolutional layers extracted coupled spectral-spatial features, while the 2D layers captured detailed spatial features, enhancing the model's holistic learning capability. The model was compared with 2D-CNN, 3D-CNN, and feature-selection-based machine learning models (random forest (RF), support vector machine (SVM) and gradient boosting machine (GBM)). Additionally, band importance was analyzed using a progressive band removal method, and the sensitivity of spectral features was investigated.【Result】The proposed 3D-2D-CNN model achieved a classification accuracy of 87% and an F1 score of 0.86 for the four tree species, outperforming the other algorithms with an overall accuracy improvement of 5%-6%. Band importance analysis highlighted the significant contribution of the near-infrared bands to classification.【Conclusion】By effectively integrating spectral and spatial information, the 3D-2D-CNN model significantly enhanced the classification performance for natural secondary forest tree species compared with traditional methods. This approach provides technical support for forest resource management and ecosystem monitoring via remote sensing.
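The hybrid architecture described above reduces to a simple data flow: a 3D convolution slides across both the spectral and spatial axes of a hyperspectral patch, after which the spectral axis is collapsed and a 2D convolution refines the spatial features. A minimal NumPy sketch of that flow follows; the patch size, kernel sizes, random weights, and the max-pool collapse of the spectral axis are all illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def conv3d(x, k):
    """Naive valid 3D convolution over a (bands, H, W) cube with one kernel."""
    B, H, W = x.shape
    kb, kh, kw = k.shape
    out = np.zeros((B - kb + 1, H - kh + 1, W - kw + 1))
    for b in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[b, i, j] = np.sum(x[b:b+kb, i:i+kh, j:j+kw] * k)
    return out

def conv2d(x, k):
    """Naive valid 2D convolution over an (H, W) map with one kernel."""
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * k)
    return out

rng = np.random.default_rng(0)
patch = rng.random((30, 11, 11))            # 30 bands, 11x11 crown patch (illustrative)
f3 = conv3d(patch, rng.random((7, 3, 3)))   # 3D stage: coupled spectral-spatial features
flat = f3.max(axis=0)                       # collapse spectral axis (simplified here as a max)
f2 = conv2d(flat, rng.random((3, 3)))       # 2D stage: spatial refinement
print(f3.shape, f2.shape)                   # (24, 9, 9) (7, 7)
```

In a real hybrid 3D-2D CNN the 3D feature maps are typically reshaped into a stack of 2D channels (as in HybridSN) rather than max-pooled, and the kernels are learned rather than random; the sketch only shows how the tensor shapes move from the spectral-spatial stage to the spatial stage.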

Key words

natural secondary forest / tree species classification / hyperspectral image / 3D-2D-CNN model / convolutional neural network / machine learning / band importance analysis

Cite this article

LI Hao , QUAN Ying , LIU Jianyang , et al . Discriminate dominant tree species in natural secondary forests from UAV hyperspectral images using a hybrid 3D-2D convolutional neural network[J]. Journal of Nanjing Forestry University (Natural Sciences Edition). 2026, 50(2): 9-18 https://doi.org/10.12302/j.issn.1000-2006.202411024
