
A remote sensing vegetation detection method based on deep convolutional neural networks
【Objective】 Vegetation detection is an important tool in urban ecological research. However, shadowed areas, occlusions, and color distortion of vegetation in remote sensing images keep current vegetation detection accuracy low. This study aims to detect urban vegetation areas quickly and effectively from remote sensing satellite images using deep learning, providing a basis for related work such as vegetation resource statistics. 【Method】 A deep convolutional neural network (CNN) model was used to detect vegetation areas in high-resolution remote sensing images. Different optimizers were analyzed and compared, and accuracy was evaluated under different convolution kernel sizes. Finally, the number of network layers was studied to determine an appropriate depth, and the constructed deep CNN was applied to vegetation detection on the experimental data. 【Result】 When a CNN processes two-dimensional images, no manual feature extraction is required: after simple, limited preprocessing, the image is fed directly into the CNN for training, which performs recognition and classification. This lowers the preprocessing burden, while local receptive fields and weight sharing greatly reduce the number of parameters and speed up computation; subsampling further provides invariance to translation, rotation, scaling, and stretching. These properties overcome the heavy computation, large sample requirements, complex structure, and long runtimes of traditional methods. Experimental data were high-resolution remote sensing images of the Zijin Mountain area. The vegetation resources in this area were analyzed with the designed multilayer CNN model, and different optimizers, convolution kernels, and layer counts were compared; the vegetation detection accuracy reached 95.4%, significantly higher than that of many current vegetation detection algorithms. 【Conclusion】 In deep learning, target detection accuracy depends on the network architecture. By setting the optimizer, convolution kernel, and number of network layers appropriately, the efficiency and accuracy of target detection can be significantly improved.
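The parameter savings from local receptive fields and weight sharing described in the Result section can be illustrated with a minimal sketch (illustrative only; the patch size, the 5×5 kernel, and the pooling step are assumptions for the example, not the paper's actual network):

```python
import random

def conv2d_valid(image, kernel):
    """Naive 'valid' 2D convolution: one small set of kernel weights is
    reused at every image location (weight sharing / local perception)."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            s = 0.0
            for u in range(kh):
                for v in range(kw):
                    s += image[i + u][j + v] * kernel[u][v]
            out[i][j] = s
    return out

def max_pool2(x):
    """2x2 max pooling (the subsampling step), halving each spatial dim."""
    h, w = len(x) // 2 * 2, len(x[0]) // 2 * 2
    return [[max(x[i][j], x[i][j + 1], x[i + 1][j], x[i + 1][j + 1])
             for j in range(0, w, 2)]
            for i in range(0, h, 2)]

random.seed(0)
patch = [[random.random() for _ in range(32)] for _ in range(32)]   # toy 32x32 patch
kernel = [[random.random() for _ in range(5)] for _ in range(5)]    # one 5x5 kernel

feat = max_pool2(conv2d_valid(patch, kernel))   # 32x32 -> 28x28 -> 14x14
print(len(feat), len(feat[0]))

# Parameter count for one 28x28 output map: 25 shared conv weights
# versus a dense layer mapping 32*32 inputs to 28*28 outputs.
conv_params = 5 * 5
dense_params = (32 * 32) * (28 * 28)
print(conv_params, dense_params)
```

The same 25 weights produce the entire feature map, where a fully connected mapping of the same shape would need over 800,000 weights, which is the parameter reduction the abstract refers to.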
Keywords: vegetation detection / deep learning / convolutional neural network / image classification
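The Method section compares different optimizers. A toy sketch of why the choice matters contrasts plain gradient descent with a momentum update on an ill-conditioned quadratic (the loss, learning rate, and momentum value are assumptions for illustration; the paper's actual optimizers and hyperparameters are not specified here):

```python
def loss(w):
    # toy ill-conditioned quadratic: L(w) = 0.5 * (w0^2 + 25 * w1^2)
    return 0.5 * (w[0] ** 2 + 25.0 * w[1] ** 2)

def grad(w):
    # analytic gradient of the toy loss
    return [w[0], 25.0 * w[1]]

def train(momentum, lr=0.03, steps=500):
    """Run the momentum update v <- m*v - lr*g, w <- w + v.
    momentum=0.0 reduces this to plain gradient descent."""
    w, v = [5.0, 5.0], [0.0, 0.0]
    for _ in range(steps):
        g = grad(w)
        v = [momentum * vi - lr * gi for vi, gi in zip(v, g)]
        w = [wi + vi for wi, vi in zip(w, v)]
    return loss(w)

plain = train(momentum=0.0)   # vanilla gradient descent
heavy = train(momentum=0.9)   # heavy-ball momentum
print(plain, heavy)
```

With the same learning rate and step budget, the momentum variant reaches a much lower loss, illustrating how the optimizer setting alone changes convergence behavior and, in training a CNN, the attainable accuracy.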