Shanghai Jiao Tong University: Neural Network Theory and Applications, Homework Assignment 1
Neural Network Theory and Applications
Homework Assignment 1
oxstar, SJTU
January 19, 2012

Problem one

One variation of the perceptron learning rule is
$$W^{new} = W^{old} + \alpha e p^T, \qquad b^{new} = b^{old} + \alpha e,$$
where $\alpha$ is called the learning rate. Prove convergence of this algorithm. Does the proof require a limit on the learning rate? Explain.

Proof. We combine the weight matrix and the bias into a single vector, and append a 1 to each input vector:
$$x = \begin{bmatrix} W^T \\ b \end{bmatrix}, \qquad z_q = \begin{bmatrix} p_q \\ 1 \end{bmatrix}. \tag{1}$$
So the net input and the perceptron learning rule can be written as
$$W p + b = x^T z, \tag{2}$$
$$x^{new} = x^{old} + \alpha e z. \tag{3}$$
We take into account only those iterations on which the weight vector is changed, so the learning rule becomes (WLOG, assume that $x(0) = 0$)
$$x(k) = x(k-1) + \alpha\, dz(k-1) \tag{4}$$
$$\phantom{x(k)} = \alpha\, dz(0) + \alpha\, dz(1) + \dots + \alpha\, dz(k-1), \tag{5}$$
where $dz \in \{z_1, \dots, z_Q, -z_1, \dots, -z_Q\}$. Assume that a correct weight vector $x^*$ exists; then there is a $\delta$ such that
$$x^{*T} dz \ge \delta > 0. \tag{6}$$
From Equations 5 and 6 we can show that
$$x^{*T} x(k) = \alpha\, x^{*T} dz(0) + \alpha\, x^{*T} dz(1) + \dots + \alpha\, x^{*T} dz(k-1) \tag{7}$$
$$\phantom{x^{*T} x(k)} \ge k \alpha \delta. \tag{8}$$
From the Cauchy-Schwarz inequality we have
$$\|x^*\|^2 \|x(k)\|^2 \ge \left(x^{*T} x(k)\right)^2 \ge (k \alpha \delta)^2. \tag{9}$$
The weights are updated only if the previous weights were incorrect, so we have $x^T(k-1)\, dz(k-1) \le 0$ and
$$\|x(k)\|^2 = x^T(k)\, x(k) \tag{10}$$
$$= \left(x(k-1) + \alpha\, dz(k-1)\right)^T \left(x(k-1) + \alpha\, dz(k-1)\right) \tag{11}$$
$$= \|x(k-1)\|^2 + 2 \alpha\, x^T(k-1)\, dz(k-1) + \alpha^2 \|dz(k-1)\|^2 \tag{12}$$
$$\le \|x(k-1)\|^2 + \alpha^2 \|dz(k-1)\|^2 \tag{13}$$
$$\le \alpha^2 \left(\|dz(0)\|^2 + \dots + \|dz(k-1)\|^2\right) \tag{14}$$
$$\le \alpha^2 k \max \|dz\|^2. \tag{15}$$
From Equations 9 and 15 we have
$$(k \alpha \delta)^2 \le \|x^*\|^2 \|x(k)\|^2 \le \|x^*\|^2 \alpha^2 k \max \|dz\|^2, \tag{16}$$
$$k \le \frac{\|x^*\|^2 \max \|dz\|^2}{\delta^2}. \tag{17}$$
The number of updates $k$ is therefore finite, and the proof does not require a limit on the learning rate, because the bound on $k$ does not depend on $\alpha$. Intuitively, running the algorithm with learning rate $\alpha$ is equivalent to multiplying all inputs by the learning rate ($\alpha e z = e(\alpha z)$); the cost of finding a correct boundary changes little for proportionally scaled data.

Problem two

We have a classification problem with three classes of input vectors. The three classes are
class 1:
$$p_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad p_2 = \begin{bmatrix} 0 \\ 2 \end{bmatrix}, \quad p_3 = \begin{bmatrix} 3 \\ 1 \end{bmatrix}$$
class 2:
$$p_4 = \begin{bmatrix} 2 \\ -1 \end{bmatrix}, \quad p_5 = \begin{bmatrix} 2 \\ 0 \end{bmatrix}, \quad p_6 = \begin{bmatrix} 1 \\ -2 \end{bmatrix}$$
class 3:
$$p_7 = \begin{bmatrix} -1 \\ 2 \end{bmatrix}, \quad p_8 = \begin{bmatrix} -2 \\ 1 \end{bmatrix}, \quad p_9 = \begin{bmatrix} -1 \\ -1 \end{bmatrix}$$
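A plain-Python sketch of the network the assignment asks for, applying the learning rule of Problem one ($W^{new} = W^{old} + \alpha e p^T$, $b^{new} = b^{old} + \alpha e$). Several choices here are my assumptions, not fixed by the text: a hardlim activation with 0/1 targets, the two classification phases split as class 1 vs. the rest and then class 2 vs. class 3, and the signs of the input vectors as reconstructed above.

```python
def hardlim(n):
    """Hard-limit activation: 1 if the net input is nonnegative, else 0."""
    return 1 if n >= 0 else 0

def train_perceptron(samples, alpha=1.0, max_epochs=5000):
    """Problem-one rule: W_new = W_old + alpha*e*p, b_new = b_old + alpha*e.

    samples: list of ((p1, p2), target) pairs with targets in {0, 1}.
    Stops as soon as a full pass over the data makes no update.
    """
    w, b = [0.0, 0.0], 0.0
    for _ in range(max_epochs):
        errors = 0
        for p, t in samples:
            e = t - hardlim(w[0] * p[0] + w[1] * p[1] + b)
            if e != 0:
                w = [w[0] + alpha * e * p[0], w[1] + alpha * e * p[1]]
                b += alpha * e
                errors += 1
        if errors == 0:
            break
    return w, b

# Input vectors as reconstructed above (signs are my reconstruction).
class1 = [(1, 1), (0, 2), (3, 1)]
class2 = [(2, -1), (2, 0), (1, -2)]
class3 = [(-1, 2), (-2, 1), (-1, -1)]

# Phase 1: class 1 (target 1) against classes 2 and 3 (target 0).
phase1 = [(p, 1) for p in class1] + [(p, 0) for p in class2 + class3]
w1, b1 = train_perceptron(phase1)

# Phase 2: class 2 (target 1) against class 3 (target 0).
phase2 = [(p, 1) for p in class2] + [(p, 0) for p in class3]
w2, b2 = train_perceptron(phase2)

def classify(p):
    """Chain the two learned boundaries to get a three-way decision."""
    if hardlim(w1[0] * p[0] + w1[1] * p[1] + b1):
        return 1
    return 2 if hardlim(w2[0] * p[0] + w2[1] * p[1] + b2) else 3
```

Both phases converge because each two-class split is linearly separable, so by Problem one the loop terminates after a bounded number of updates regardless of $\alpha$.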
Implement the perceptron network based on the learning rule of Problem one to solve this problem. Run your program at different learning rates ($\alpha$ = 1, 0.8, 0.5, 0.3, 0.1), then compare and discuss the results.

Ans. In my experiment, the learning rates are set from 0.1 to 1 with a step of 0.05. Figure 1 shows the number of iterations at each learning rate. Note that this perceptron algorithm can only handle two-class problems, so I use two phases of classification to solve the three-class problem. I can hardly find any relationship between the learning rate and the number of iterations. Just as proved above, the cost of finding a correct boundary is not much influenced by the learning rate. I also provide the classification results at learning rate $\alpha = 1$ (Figure 2).

Problem three

For the XOR problem as follows,
class 1:
$$p_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad p_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
class 2:
$$p_3 = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \quad p_4 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$
[Figure 1: Times of iteration at different learning rates (classification phases 1 and 2)]
[Figure 2: Result for classification (Problem 2)]
[Figure 3: Result for classification (XOR problem)]
[Figure 4: Result for classification (two-spiral problem)]

and the two-spiral problem delivered as the course material, could the perceptron algorithm correctly classify these two problems?
If not, explain why.

Ans. As the results (Figure 3 and Figure 4) show, the perceptron algorithm cannot correctly classify these two problems. A single-layer perceptron can only classify linearly separable vectors, and the vectors in these two problems are not linearly separable.
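The XOR failure can be reproduced with a minimal sketch (assuming, as in Problem two, a hardlim perceptron with 0/1 targets). No matter how many epochs run, every full pass over the data still makes at least one update, because an error-free pass would mean the current weights define a separating line, which cannot exist for XOR.

```python
def hardlim(n):
    """Hard-limit activation: 1 if the net input is nonnegative, else 0."""
    return 1 if n >= 0 else 0

def run_epoch(samples, w, b, alpha=1.0):
    """One pass of the Problem-one rule; returns updated (w, b, error count)."""
    errors = 0
    for p, t in samples:
        e = t - hardlim(w[0] * p[0] + w[1] * p[1] + b)
        if e != 0:
            w = [w[0] + alpha * e * p[0], w[1] + alpha * e * p[1]]
            b += alpha * e
            errors += 1
    return w, b, errors

# XOR: class 1 = {(1,0), (0,1)}, class 2 = {(0,0), (1,1)}.
xor = [((1, 0), 1), ((0, 1), 1), ((0, 0), 0), ((1, 1), 0)]

w, b = [0.0, 0.0], 0.0
for _ in range(1000):
    w, b, errors = run_epoch(xor, w, b)
# `errors` is still nonzero after 1000 epochs: the rule never converges on XOR.
```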