ARTIFICIAL NEURAL NETWORK FOR LOAD FORECASTING IN SMART GRID

HAO-TIAN ZHANG, FANG-YUAN XU, LONG ZHOU

Energy System Group, City University London, Northampton Square, London, UK

E-MAIL: , long.zhou.
Abstract:

Developing the smart grid is an irresistible trend in the improvement of electric power systems. The smart grid applies a large number of new technologies in power generation, transmission, distribution and utilization to optimize the power configuration and save energy. As one of the key links in making a grid smarter, load forecasting plays a significant role in the planning and operation of a power system. Many approaches, such as Expert Systems, Grey System Theory and the Artificial Neural Network (ANN), are employed in load forecasting simulation. This paper illustrates the application of the ANN to load forecasting, based on the practical situation in Ontario Province, Canada.
Keywords:

Load forecasting; Artificial Neural Network; back propagation training; Matlab
1. Introduction

Load forecasting is vitally beneficial to the power system industry in many aspects. As an essential part of the smart grid, high-accuracy load forecasting is required to give exact information about power purchasing and generation in the electricity market, to prevent energy from being wasted or abused, and to keep the electricity price within a reasonable range. Factors such as seasonal differences, climate changes, weekends and holidays, disasters and political events, operation scenarios of the power plants and faults occurring on the network all lead to changes in load demand and generation.

Since 1990, the artificial neural network (ANN) has been researched for application to load forecasting. "ANNs are massively parallel networks of simple processing elements designed to emulate the functions and structure of the brain to solve very complex problems." Owing to these transcendent characteristics, the ANN is one of the most competent methods for practical tasks such as load forecasting. This paper concerns the behavior of artificial neural networks in load forecasting. An analysis of the factors affecting load demand in Ontario, Canada is made to give an effective way to forecast the load in Ontario.
2. Back Propagation Network

2.1. Background

Because of its outstanding statistical and modeling capabilities, the ANN can deal with non-linear and complex problems of classification or forecasting. As the problem is defined here, the relationship between the input and the target is non-linear and very complicated, so the ANN is an appropriate method to apply to forecasting the load situation. To apply an ANN to load forecasting, a network type must be selected, such as Feed-forward Back Propagation, Layer Recurrent or Feed-forward Time-delay. To date, back propagation is widely used in neural networks; it is a feed-forward network with continuously valued functions and supervised learning. It can match the input data and the corresponding output in an appropriate way, so as to approximate a function which is then used to achieve an expected goal with new data of the same form as the input.
2.2. Architecture of the back propagation algorithm

Figure 1 shows a single neuron model of the back propagation algorithm. Generally, the output is a function of the sum of the bias and the weights multiplied by the inputs. The activation function could be any kind of function; however, the generated output differs accordingly.
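The single-neuron computation just described (a weighted sum of inputs plus a bias, passed through an activation function) can be sketched as follows. This is a minimal illustration in Python rather than the paper's Matlab environment, and the weight, bias and input values are invented for the example:

```python
import numpy as np

def neuron_output(x, w, b, activation):
    """Single neuron: activation(w . x + b)."""
    return activation(np.dot(w, x) + b)

# Hypothetical example values (not from the paper)
x = np.array([0.5, -0.2, 0.1])   # inputs
w = np.array([0.4, 0.3, -0.6])   # weights
b = 0.05                          # bias

out = neuron_output(x, w, b, np.tanh)  # tan-sigmoid activation
```

Swapping `np.tanh` for another activation changes only the shape and range of the output, which is the point made above.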
Owing to the feed-forward structure, in general at least one hidden layer before the output layer is needed. A three-layer network is selected as the architecture, because this kind of architecture can approximate any function with a finite number of discontinuities. The architecture with three layers is shown in Figure 2 below.

Figure 1. Neuron model of the back propagation algorithm

Figure 2. Architecture of the three-layer feed-forward network

Basically, there are three activation functions applied in the back propagation algorithm, namely Log-Sigmoid, Tan-Sigmoid and the Linear Transfer Function. The output range of each function is illustrated in Figure 3 below.

Figure 3. Activation functions applied in back propagation: (a) Log-sigmoid (b) Tan-sigmoid (c) linear function
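The three activation functions named above have simple closed forms: log-sigmoid maps any input into (0, 1), tan-sigmoid into (-1, +1), and the linear function passes its input through unchanged. A minimal sketch, using the Matlab toolbox's function names for familiarity:

```python
import numpy as np

def logsig(n):
    """Log-sigmoid: 1 / (1 + e^-n), output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-n))

def tansig(n):
    """Tan-sigmoid (hyperbolic tangent), output in (-1, +1)."""
    return np.tanh(n)

def purelin(n):
    """Linear transfer function: output equals net input."""
    return n
```

The choice among these determines the output range of each layer, which is why the paper later pairs a tan-sigmoid hidden layer with a linear output layer.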
2.3. Training function selection

The training functions employed are based on the back propagation approach and are integrated in the Matlab Neural Network toolbox.

TABLE I. TRAINING FUNCTIONS IN MATLAB'S NN TOOLBOX
3. Training Procedures

3.1. Background analysis

The neural network training is based on the load demand and weather conditions in Ontario Province, which is located in the south of Canada. The region of Ontario can be divided into three parts according to the weather conditions: southwest, central and east, and north. The population is gathered around the southeastern part of the province, which includes two of the largest cities of Canada, Toronto and Ottawa.
3.2. Data Acquisition

The required training data can be divided into two parts: input vectors and output targets. For load forecasting, the input vectors for training include all the information on factors affecting the change in load demand, such as weather information, holidays or working days, faults occurring in the network and so on. The output targets are the real-time load scenarios, which means the demand presented at the same time as the input vectors change.

Owing to conditional restrictions, this study only considers the weather information and a logical flag for weekdays and weekends as the factors affecting the load status. In this paper, the factors affecting the load change are listed below:
(1) Temperature (℃)
(2) Dew Point Temperature (℃)
(3) Relative Humidity (%)
(4) Wind Speed (km/h)
(5) Wind Direction (10)
(6) Visibility (km)
(7) Atmospheric Pressure (kPa)
(8) Logical flag for weekday or weekend
According to the information gathered above, the weather information in Toronto is chosen, in place of that for the whole of Ontario Province, to provide the data acquisition. The data was gathered hourly according to the historical weather conditions kept at the weather stations. The load demand data also needs to be gathered hourly and correspondingly. In this paper, two years of weather data and load data are collected to train and test the created network.
3.3. Data Normalization

To prevent the simulated neurons from being driven too far into saturation, all of the gathered data needs to be normalized after acquisition. As in a per-unit system, each input and target value is divided by the maximum absolute value of the corresponding factor. Each normalized value then lies within the range between -1 and +1, so that the ANN can handle the data easily. Besides, weekdays are represented as 1 and weekends are represented as 0.
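The per-unit-style scaling described above (each factor divided by its maximum absolute value, plus a 1/0 flag for weekday versus weekend) can be sketched as follows; the sample figures are invented for illustration, not taken from the Ontario dataset:

```python
import numpy as np

def normalize_columns(data):
    """Divide each column by its maximum absolute value,
    mapping every entry into [-1, +1] (per-unit style)."""
    max_abs = np.max(np.abs(data), axis=0)
    return data / max_abs

# Hypothetical hourly samples: temperature (degC), humidity (%), load (MW)
raw = np.array([
    [-5.0, 80.0, 15000.0],
    [10.0, 60.0, 18000.0],
    [25.0, 40.0, 21000.0],
])
scaled = normalize_columns(raw)

# Weekday/weekend logical flag as described in the text
is_weekday = np.array([1, 1, 0])  # 1 = weekday, 0 = weekend
```

Note that this scaling keeps the sign of each value, so a below-zero temperature stays negative after normalization.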
3.4.Neuralnetworkcreating
ToolboxinMatlabisusedfortrainingandsimulatingtheneuronnetwork.Thelayoutoftheneuralnetworkconsistsofnumberofneuronsandlayers,connectivityoflayers,activationfunctions,anderrorgoalandsoon.Itdependsonthepracticalsituationtosettheframeworkandparametersofthenetwork.ThearchitectureoftheANNcouldbeselectedtoachievetheoptimizedresult.Matlabisoneofthebestsimulation
toolstoprovidevisiblewindows.Three-layerarchitecturehasbeenchosentogivethesimulationasshowninFigure2above.Itisadequatetoapproximatearbitraryfunction,ifthenodesofthehiddenlayeraresufficient.
Because the practical input values range from -1 to +1, the transfer function of the first layer is set to tansig, the hyperbolic tangent sigmoid transfer function. The transfer function of the output layer is set to the linear function, which calculates a layer's output directly from its net input. There is one advantage to the linear output transfer function: because linear output neurons let the output take on any value, there is no difficulty in finding the differences between output and target.
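A forward pass through the three-layer architecture just described (tan-sigmoid hidden layer, linear output layer) can be sketched as below. The layer sizes and the random initialization are illustrative assumptions, not the paper's trained network:

```python
import numpy as np

rng = np.random.default_rng(0)

# 8 input factors as listed in Section 3.2; hidden/output sizes are assumptions
n_inputs, n_hidden, n_outputs = 8, 8, 1

# Randomly initialized weights and biases (illustrative only)
W1 = rng.normal(scale=0.5, size=(n_hidden, n_inputs))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_outputs, n_hidden))
b2 = np.zeros(n_outputs)

def forward(x):
    """Hidden layer: tan-sigmoid; output layer: linear."""
    hidden = np.tanh(W1 @ x + b1)
    return W2 @ hidden + b2

x = rng.uniform(-1.0, 1.0, size=n_inputs)  # a normalized input vector
y = forward(x)
```

The tan-sigmoid hidden layer keeps its activations inside (-1, +1), while the linear output layer leaves the predicted load unbounded, matching the advantage noted above.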
The next step is the selection of neurons and training functions. Generally, trainbr and trainlm are the best choices among all of the training functions in the Matlab toolbox.

Trainlm (Levenberg-Marquardt algorithm) is the fastest training algorithm for networks of moderate size. However, a big problem is that it needs storage for some matrices, which can be large for certain problems. When the training set is large, the trainlm algorithm reduces the memory use and always computes the approximate Hessian matrix with n×n dimensions. Another drawback of trainlm is that over-fitting will occur when the number of neurons is too large; basically, the number of neurons should be kept small when the trainlm algorithm is employed in the network. Trainbr (Bayesian regularization) is a modified version of the Levenberg-Marquardt training method which creates networks that generalize well, so that the optimal network architecture can be determined easily. The impact of the effectively used weights and biases of the network can be seen clearly with this algorithm, and the number of effective weights and biases does not change much as the dimension of the network grows. The trainbr algorithm performs best after the network inputs and outputs have been normalized into the range from -1 to +1. An important point when using trainbr is that the algorithm should not be stopped until the effective number of parameters has converged. More details are available in the Matlab neural network toolbox.

The number of neurons in the first layer can also be selected to optimize the network so that an expected result can be obtained. Generally speaking, the more complicated the architecture of the network is, the more accurate the output result will be; however, the higher the chance that an algorithm such as trainlm will over-fit. In this paper, the number of neurons is 8 for the trainlm algorithm and 30 for the trainbr algorithm.
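The toolbox's trainlm and trainbr routines are not reproduced here; as a simplified stand-in, a plain gradient-descent training loop for the same three-layer shape (tan-sigmoid hidden layer, linear output) is sketched below. The toy dataset, learning rate and epoch count are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: 8 normalized input factors -> 1 target value
# (invented for illustration, not the Ontario load data)
X = rng.uniform(-1.0, 1.0, size=(200, 8))
y = X.mean(axis=1, keepdims=True)

n_hidden = 8                               # neuron count used with trainlm in the paper
W1 = rng.normal(scale=0.3, size=(8, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.3, size=(n_hidden, 1))
b2 = np.zeros(1)

def predict(X):
    """tansig hidden layer, linear output layer."""
    return np.tanh(X @ W1 + b1) @ W2 + b2

mse0 = float(np.mean((predict(X) - y) ** 2))   # error before training

lr = 0.05
for epoch in range(500):
    h = np.tanh(X @ W1 + b1)               # forward: hidden activations
    err = (h @ W2 + b2) - y                # forward: linear output minus target
    gW2 = h.T @ err / len(X)               # backward: output-layer gradients
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)     # backward: tanh derivative
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1         # gradient-descent update
    W2 -= lr * gW2; b2 -= lr * gb2

mse_final = float(np.mean((predict(X) - y) ** 2))
```

Levenberg-Marquardt and Bayesian regularization converge far faster than this plain loop, which is why the paper relies on the toolbox implementations; the sketch only shows the forward/backward structure that those algorithms share.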
3.5. Neural network training