Matlab Modeling Experiments for Refined Forecasting of the Masson Pine Caterpillar (Dendrolimus punctatus) Based on Neural Networks
Zhang Guoqing
(Forestry Bureau of Qianshan County, Anhui Province)
1. Data Sources
The occurrence-amount and occurrence-period data for the Masson pine caterpillar come from the monitoring records of Qianshan County; the meteorological data come from the National Climate Center.
2. Data Preprocessing
To preserve the temporal integrity of each Masson pine caterpillar generation, the overwintering-generation data were merged with the previous year's second-generation data during preprocessing. This keeps each generation complete in time, which makes modeling and prediction more convenient.
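A minimal sketch of this merging step in Python. The record layout, the keys, and the merge rule (here simply summing infested area) are all illustrative assumptions; the paper does not specify the exact format of the county records.

```python
# Combine each year's overwintering-generation record with the previous
# year's second-generation record, so one entry spans a full generation.
# Values and keys are made up for illustration.
second_gen = {1990: 800.0, 1991: 650.0}      # infested area by year, generation 2
overwintering = {1991: 450.0, 1992: 500.0}   # infested area by year, overwintering

complete = {}
for year, area in overwintering.items():
    prev = second_gen.get(year - 1, 0.0)
    complete[year - 1] = prev + area  # keyed to the year the generation began

print(complete)  # {1990: 1250.0, 1991: 1150.0}
```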
(1) Meteorological data processing
Based on Integrated Pine Caterpillar Management (《松毛虫综合管理》), The Pine Caterpillars of China (《中国松毛虫》), and recent papers on Masson pine caterpillar monitoring and forecasting, 16 meteorological factors with some correlation to occurrence amount and occurrence period were preliminarily selected: egg-stage extreme minimum temperature, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and egg-stage rainfall; 1st–2nd instar extreme minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall; larval-stage extreme minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall; and whole-generation extreme minimum temperature, mean temperature, accumulated temperature (degree-days), and rainfall.
The raw meteorological data from the National Climate Center were converted, by year and by generation, into series of these 16 variables.
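As a hypothetical illustration of how such stage-level variables can be derived from daily weather records (the paper does not give its exact formulas), accumulated temperature over a stage is commonly computed as the sum of daily mean temperatures above a development threshold. The threshold and field names below are assumptions, not taken from the paper.

```python
# Sketch: summarize one developmental stage (e.g. the egg stage) of one
# generation from daily weather series. base_temp is an assumed lower
# development threshold in degrees C, not a value from the paper.
def stage_variables(daily_mean_temps, daily_rainfall, base_temp=10.0):
    """Return (minimum temp, mean temp, degree-days, total rainfall)."""
    t_min = min(daily_mean_temps)
    t_mean = sum(daily_mean_temps) / len(daily_mean_temps)
    degree_days = sum(max(0.0, t - base_temp) for t in daily_mean_temps)
    rainfall = sum(daily_rainfall)
    return t_min, t_mean, degree_days, rainfall

temps = [12.0, 15.0, 9.0, 18.0]
rain = [0.0, 3.5, 1.2, 0.0]
print(stage_variables(temps, rain))
```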
(2) Occurrence-amount data processing
To allow outbreak intensity to be analyzed during modeling, the raw monitoring data of Qianshan County for 1983–2014 were classified into three intensity levels ("light", "moderate", "severe") and aggregated by generation, year by year.
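A minimal sketch of this kind of aggregation; the field names and values are hypothetical, since the actual layout of the county's monitoring records is not given in the paper.

```python
# Aggregate raw monitoring records into per-year, per-generation totals
# broken down by intensity class ("light", "moderate", "severe").
from collections import defaultdict

records = [
    # (year, generation, intensity, infested area) - illustrative values
    (1983, 1, "light", 1200.0),
    (1983, 1, "severe", 300.0),
    (1983, 2, "moderate", 800.0),
    (1984, 1, "light", 950.0),
]

totals = defaultdict(float)
for year, gen, level, area in records:
    totals[(year, gen, level)] += area

print(totals[(1983, 1, "light")])  # 1200.0
```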
(3) Occurrence-period data processing
The raw occurrence-period monitoring data of Qianshan County for 1983–2014 were first aggregated by generation, year by year; the dates were then converted to day-of-year values, making them numeric and thus usable in modeling and analysis.
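Converting a calendar date to its day-of-year number can be done directly with the standard library; a sketch (the exact conversion routine used in the paper is not specified):

```python
# Convert a calendar date to its day-of-year number so phenological dates
# (e.g. the larval peak) become ordinary numeric variables.
from datetime import date

def day_of_year(d: date) -> int:
    return d.timetuple().tm_yday

print(day_of_year(date(2014, 1, 1)))   # 1
print(day_of_year(date(2014, 6, 15)))  # 166
```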
3. Factor Variable Selection
Based on correlation analysis and comparative modeling trials, the factor variables were selected as follows. For first-generation occurrence amount: 1st–2nd instar extreme minimum temperature, egg-stage extreme minimum temperature, previous-generation control efficacy, and previous-generation treated area. For second-generation occurrence amount: the same four factors plus 1st–2nd instar rainfall and egg-stage rainfall. For the first-generation larval peak: 1st–2nd instar mean temperature, 1st–2nd instar accumulated temperature (degree-days), 1st–2nd instar extreme minimum temperature, and egg-stage extreme minimum temperature. For the second-generation larval peak: adult first-appearance date, egg-stage mean temperature, egg-stage accumulated temperature (degree-days), and 1st–2nd instar extreme minimum temperature.
The first-generation occurrence-amount target variable was named s1y and its factor variables s1x; the second-generation occurrence-amount target was named s2y and its factors s2x; the first-generation larval-peak target was named t1y and its factors t1x; and the second-generation larval-peak target was named t2y and its factors t2x.
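These arrays are stored with one sample per row, which is why the MATLAB scripts below transpose them (x = s1x'): the Neural Network Toolbox expects one sample per column. A hypothetical NumPy sketch of the same convention, with made-up values:

```python
import numpy as np

# Hypothetical example: 3 generation-year samples, 4 predictor variables
# (instar min temp, egg min temp, control efficacy, treated area).
s1x = np.array([
    [-2.1, -4.0, 0.85, 1500.0],
    [-1.3, -3.2, 0.90, 1200.0],
    [-5.0, -6.5, 0.70, 2000.0],
])
s1y = np.array([[120.0], [95.0], [310.0]])  # occurrence totals (made up)

# MATLAB's train() expects variables-by-samples, hence the transposes.
x = s1x.T
t = s1y.T
print(x.shape, t.shape)  # (4, 3) (1, 3)
```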
4. Modeling Experiment for First-Generation Occurrence Amount
4.1 Program Code
The program code (Simple Script) is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:28:48 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.

x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);

% Set up Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net, tr] = train(net, x, t);

% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The program code (Advanced Script) is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 19:29:03 CST 2015
%
% This script assumes these variables are defined:
%
%   s1x - input data.
%   s1y - target data.

x = s1x';
t = s1y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};

% Set up Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};

% Train the Network
[net, tr] = train(net, x, t);

% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
4.2 Network Training Process
The network training process was as follows:
4.3 Training Results
The training results were:
The R values for the training, validation, and test samples were 0.875337, -1, and 1, respectively.
The error histogram:
Regression plots for the training, validation, and test samples and for all data:
The validation and test R values were both 1.
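These extreme R values are best read as an artifact of the data split rather than as evidence of model quality: with monitoring data covering 1983–2014 (roughly 32 generation-year samples), a 90/5/5 random split leaves only one or two samples each for validation and testing, and a regression through one or two points fits them exactly (R = ±1). A rough sketch of the split sizes, assuming 32 samples (MATLAB's dividerand may round slightly differently):

```python
# Approximate sizes of a 90/5/5 random split of n samples.
def split_sizes(n, val=0.05, test=0.05):
    n_val = round(n * val)
    n_test = round(n * test)
    n_train = n - n_val - n_test
    return n_train, n_val, n_test

print(split_sizes(32))  # (28, 2, 2)
```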
5. Modeling Experiment for Second-Generation Occurrence Amount
5.1 Program Code
The program code (Simple Script) is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:18 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.

x = s2x';
t = s2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);

% Set up Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Train the Network
[net, tr] = train(net, x, t);

% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)
The program code (Advanced Script) is:
% Solve an Input-Output Fitting problem with a Neural Network
% Script generated by Neural Fitting app
% Created Wed Oct 28 20:04:31 CST 2015
%
% This script assumes these variables are defined:
%
%   s2x - input data.
%   s2y - target data.

x = s2x';
t = s2y';

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. NFTOOL falls back to this in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt

% Create a Fitting Network
hiddenLayerSize = 10;
net = fitnet(hiddenLayerSize, trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows', 'mapminmax'};
net.output.processFcns = {'removeconstantrows', 'mapminmax'};

% Set up Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'sample';     % Divide up every sample
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 5/100;
net.divideParam.testRatio = 5/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean squared error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform', 'plottrainstate', 'ploterrhist', ...
    'plotregression', 'plotfit'};

% Train the Network
[net, tr] = train(net, x, t);

% Test the Network
y = net(x);
e = gsubtract(t, y);
performance = perform(net, t, y)

% Recalculate Training, Validation and Test Performance
trainTargets = t .* tr.trainMask{1};
valTargets = t .* tr.valMask{1};
testTargets = t .* tr.testMask{1};
trainPerformance = perform(net, trainTargets, y)
valPerformance = perform(net, valTargets, y)
testPerformance = perform(net, testTargets, y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
% figure, plotperform(tr)
% figure, plottrainstate(tr)
% figure, plotfit(net, x, t)
% figure, plotregression(t, y)
% figure, ploterrhist(e)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
if (false)
    % Generate MATLAB function for neural network for application deployment
    % in MATLAB scripts or with MATLAB Compiler and Builder tools, or simply
    % to examine the calculations your trained neural network performs.
    genFunction(net, 'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net, 'myNeuralNetworkFunction', 'MatrixOnly', 'yes');
    y = myNeuralNetworkFunction(x);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
5.2 Network Training Process
The network training process was as follows:
5.3 Training Results
The training results were:
The R values for the training, validation, and test samples were 0.942388, 0.999999, and 1, respectively.
The error histogram:
Regression plots for the training, validation, and test samples and for all data:
The validation and test R values were both 1; the training-sample R = 0.94239, and the all-data R =