PRECISE CLASSIFICATION OF CROPS IN COMPLEX PLANTING STRUCTURE AREA BASED ON DEEP LEARNING MODEL
Cite this article: Tian Tian, Wang Di, Wang Zhen, Li Huibin. Precise classification of crops in complex planting structure area based on deep learning model[J]. Journal of China Agricultural Resources and Regional Planning, 2022, 43(12): 147-158.
Authors: Tian Tian  Wang Di  Wang Zhen  Li Huibin
Affiliations: 1. Institute of Agricultural Resources and Regional Planning, Chinese Academy of Agricultural Sciences, Beijing 100081, China; 2. Key Laboratory of Agricultural Remote Sensing, Ministry of Agriculture and Rural Affairs, Beijing 100081, China; 3. School of Chemistry and Life Science, Anshan Normal University, Anshan 114007, Liaoning, China
Funding: National Natural Science Foundation of China (Key Program) project "Research on 'Trinity' Spatial Sampling Theory and the Construction of Its Duplex Lookup Table" (41531179)
Abstract: [Objective] Satellite remote sensing offers wide coverage, short revisit periods, and low survey costs, and is therefore widely used for crop classification over large areas. In regions with complex planting structures (such as urban-rural fringe areas), however, fields are fragmented and many crop types grow in the same period with scattered distributions, so traditional statistical or machine learning classifiers still deliver limited accuracy. This study aimed to improve crop classification accuracy in such regions. [Method] Guangyang District of Langfang City, Hebei Province was selected as the study area, GF-1 PMS pan-sharpened (panchromatic-multispectral fusion) imagery served as the data source, and three deep learning models (U-Net, PSPNet, and DeepLabv3+) were applied to crop classification. The influence of model parameters on classification accuracy was analyzed, the accuracy of the three models was evaluated, and the best-performing method for fine crop classification was selected. [Results] (1) The learning rate was positively correlated with the classification accuracy of all three models; at the larger learning rates (0.01, 0.001) the models converged faster and classified more accurately. The batch size was related to classification stability, which was best for all three models at a batch size of 100. (2) The U-Net model outperformed the PSPNet and DeepLabv3+ models, with an overall classification accuracy of 89.32%. (3) GF-1 PMS imagery combined with the U-Net model effectively improved crop classification accuracy in the complex planting structure area: the staple crops spring corn and summer corn were classified with accuracies above 80%, and the minor crops peanut, sweet potato, and vegetables with accuracies above 60%. [Conclusion] This study provides a reference for accurately obtaining crop type, area, and spatial distribution information in regions with complex planting structures.
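The abstract attributes the models' behavior largely to two training hyperparameters: learning rate (0.01 or 0.001 favored) and batch size (100 most stable). The following is a minimal sketch, not the authors' code, of how such a three-model comparison could be set up, assuming PyTorch with the segmentation_models_pytorch package, a ResNet-34 encoder, 4 input bands (blue, green, red, near-infrared from the fused GF-1 PMS image), and an illustrative count of 7 classes; none of these implementation details are stated in the paper.

```python
# Minimal sketch (assumptions noted above) of the model comparison
# described in the abstract.
import torch
import segmentation_models_pytorch as smp

IN_BANDS = 4          # GF-1 PMS fused image: blue, green, red, near-infrared
NUM_CLASSES = 7       # hypothetical: background + six crop/land-cover classes
LEARNING_RATE = 0.001 # one of the "larger" rates the abstract favors
BATCH_SIZE = 100      # the batch size reported as most stable

models = {
    "U-Net": smp.Unet(encoder_name="resnet34", in_channels=IN_BANDS, classes=NUM_CLASSES),
    "PSPNet": smp.PSPNet(encoder_name="resnet34", in_channels=IN_BANDS, classes=NUM_CLASSES),
    "DeepLabv3+": smp.DeepLabV3Plus(encoder_name="resnet34", in_channels=IN_BANDS, classes=NUM_CLASSES),
}

def train(model, dataset, epochs=50, device="cuda"):
    """Train one model on (image, mask) pairs: images are float tensors
    [IN_BANDS, H, W], masks are long tensors [H, W] of class indices."""
    loader = torch.utils.data.DataLoader(dataset, batch_size=BATCH_SIZE, shuffle=True)
    model.to(device).train()
    optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
    loss_fn = torch.nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, masks in loader:
            optimizer.zero_grad()
            logits = model(images.to(device))   # [B, NUM_CLASSES, H, W]
            loss = loss_fn(logits, masks.to(device))
            loss.backward()
            optimizer.step()
    return model
```

Under this setup, the parameter study the abstract describes amounts to re-training each of the three models while sweeping LEARNING_RATE and BATCH_SIZE over a small grid and comparing the resulting accuracies.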

Keywords: deep learning  crop classification  remote sensing  convolutional neural network  GF-1 satellite
Received: 2022-01-17
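The reported figures (89.32% overall accuracy for U-Net; above 80% for spring and summer corn, above 60% for peanut, sweet potato, and vegetables) are standard confusion-matrix metrics. Below is a small sketch of how they can be computed; reading "per-class accuracy" as producer's accuracy (recall) is an assumption, since the abstract does not define the metric, and the example matrix is invented for illustration.

```python
# Sketch of the accuracy metrics quoted in the abstract, computed from a
# pixel-level confusion matrix (example values are invented).
import numpy as np

def overall_accuracy(cm: np.ndarray) -> float:
    """Correctly classified pixels / all pixels, where cm[i, j] counts
    pixels of true class i predicted as class j."""
    return np.trace(cm) / cm.sum()

def producers_accuracy(cm: np.ndarray) -> np.ndarray:
    """Per-class (producer's) accuracy: diagonal over row totals."""
    return np.diag(cm) / cm.sum(axis=1)

# Hypothetical 3-class example: spring corn, summer corn, peanut.
cm = np.array([
    [850,  90,  60],   # true spring corn
    [ 70, 880,  50],   # true summer corn
    [110, 120, 650],   # true peanut
])
print(f"OA = {overall_accuracy(cm):.2%}")   # 80.95% for this matrix
print(producers_accuracy(cm).round(3))      # [0.85  0.88  0.739]
```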
