Hong Kong University of Science and Technology
COMP327/527: Pattern Recognition
Fall 1996
Final Examination
19 December 1996, 8:30–11:30 am

Student Name:
Student Number:
Instructions
1. This is an open-book, open-notes examination.
2. Check that you have all 14 pages (excluding this cover page).
3. Write your name and student number on this page.
4. Answer all questions in the space provided. Rough work should be done on the back pages.
5. Make your answers as concise as possible.
Question 1 (16%):
Question 2 (13%):
Question 3 (10%):
Question 4 (15%):
Question 5 (10%):
Question 6 (8%):
Question 7 (8%):
Question 8 (8%):
Question 9 (12%):
TOTAL (100%):

1. (16%) Suppose the probability density functions of a univariate feature x for two classes are as follows:
p(x|ω1) = 0.5a   if 3 ≤ x ≤ 4
          0.3a   if 4 < x ≤ 6
          0.4a   if 6 < x ≤ 7
          0      otherwise

p(x|ω2) = b      if 1 ≤ x ≤ 10
          0      otherwise
Assume that Pr(ω1) = Pr(ω2) = 1/2.
(a) [2%] What should be the values of a and b?
(b) [4%] What are the decision regions of the two classes for the domain 1 ≤ x ≤ 10 when the optimal (Bayesian) decision is made?
(c) [3%] What is the Bayes error?
(d) [7%] Repeat parts (b) and (c) if the prior probabilities are changed to 1/3 and 2/3 for Pr(ω1) and Pr(ω2), respectively.
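For reference, the setup of this question can be checked numerically. The sketch below assumes the piecewise densities as reconstructed above (heights 0.5a, 0.3a, 0.4a); normalisation then forces a = 2/3 and b = 1/9.

```python
import numpy as np

# Class-conditional densities of Question 1, assuming the piecewise
# reconstruction above.  Integrating to 1 gives a = 2/3
# (0.5a + 2*0.3a + 0.4a = 1.5a = 1) and b = 1/9 (9b = 1).
a, b = 2.0 / 3.0, 1.0 / 9.0

def p1(x):                      # p(x | w1)
    if 3 <= x <= 4: return 0.5 * a
    if 4 < x <= 6:  return 0.3 * a
    if 6 < x <= 7:  return 0.4 * a
    return 0.0

def p2(x):                      # p(x | w2): uniform on [1, 10]
    return b if 1 <= x <= 10 else 0.0

xs = np.linspace(0.0, 11.0, 110001)   # fine grid over the domain
dx = xs[1] - xs[0]

# With equal priors, the Bayes error is the mass of the smaller of the
# two scaled densities, integrated over the whole domain.
bayes_err = sum(min(0.5 * p1(x), 0.5 * p2(x)) for x in xs) * dx
print(round(bayes_err, 4))            # close to 2/9
```

Since 0.5·p(x|ω1) exceeds 0.5·b everywhere on [3, 7], the minimum is b/2 on that interval and 0 elsewhere, which is why the numerical value lands near 2/9.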
2. (13%) Consider the following four training examples:

x1  x2  x3  x4   t
 1   1  -2  -4   1
 2  -1   2   3  -1
 5   2  -5   0  -1
 0   2   1   2   1
Since these four examples are linearly separable, a simple perceptron with four input units is trained to perform the classification task.
(a) [8%] Suppose all the weights of the perceptron are initialized to 0. Show the execution trace of the perceptron learning algorithm by listing the sequence of weight changes until a solution is found.
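The trace asked for in part (a) can be generated mechanically. The sketch below uses the fixed-increment perceptron rule with an augmented constant input so the threshold is learned as a fifth weight; this is an assumption, since the exam's "simple perceptron" may treat the threshold differently.

```python
import numpy as np

# The four training examples of Question 2, each augmented with a
# constant 1 input so the threshold is a learnable fifth weight
# (an assumption -- not specified in the exam).
X = np.array([[1,  1, -2, -4, 1],
              [2, -1,  2,  3, 1],
              [5,  2, -5,  0, 1],
              [0,  2,  1,  2, 1]], dtype=float)
t = np.array([1, -1, -1, 1], dtype=float)

w = np.zeros(5)                        # all weights start at 0
changed = True
while changed:                         # one pass = one epoch
    changed = False
    for x, target in zip(X, t):
        if target * (w @ x) <= 0:      # misclassified (or on boundary)
            w += target * x            # fixed-increment update
            changed = True
            print("update ->", w)

print("final weights:", w)             # classifies all four correctly
```

With this convention the algorithm makes five updates over two epochs and then passes through the data once more without error, confirming separability.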
(b) [5%] Suppose features x1 and x3 are ignored so that only x2 and x4 are used to describe the training examples:

x2  x4   t
 1  -4   1
-1   3  -1
 2   0  -1
 2   2   1

Show that the four examples become linearly non-separable.
3. (10%)

[Figure: the (x1, x2) plane; the shaded region is the annulus between two concentric circles.]

The shaded area is formed by two concentric circles with center (a, b) and radii r1 and r2 (r1 < r2). Suppose we define two additional features x3 and x4 as:
x3 = x1^2    x4 = x2^2
Show that there exists a network composed of two layers of simple perceptrons with four inputs (x1, x2, x3, x4) and one output that can represent the decision regions above corresponding to the two classes. Assuming that all the simple perceptrons have output values in {0, 1}, give the values of all the weights and biases in the network. (Hint: One solution has two hidden units and one output unit.)
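The hinted construction can be sanity-checked numerically: since x3 = x1^2 and x4 = x2^2 make the squared distance (x1 - a)^2 + (x2 - b)^2 linear in the four inputs, each circle is a single perceptron and the annulus is the AND of the two. The values of a, b, r1, r2 below are illustrative assumptions; the exam leaves them symbolic.

```python
# Concrete instance of the Question 3 construction; a, b, r1, r2 are
# chosen here only for illustration.
a, b, r1, r2 = 1.0, 2.0, 1.0, 3.0

def step(v):                      # perceptron output in {0, 1}
    return 1.0 if v >= 0 else 0.0

def network(x1, x2):
    x3, x4 = x1**2, x2**2
    # h1 = 1  iff  (x1-a)^2 + (x2-b)^2 >= r1^2   (outside inner circle)
    h1 = step(-2*a*x1 - 2*b*x2 + x3 + x4 + (a*a + b*b - r1*r1))
    # h2 = 1  iff  (x1-a)^2 + (x2-b)^2 <= r2^2   (inside outer circle)
    h2 = step(2*a*x1 + 2*b*x2 - x3 - x4 + (r2*r2 - a*a - b*b))
    # output unit computes h1 AND h2 with weights 1, 1 and bias -1.5
    return step(h1 + h2 - 1.5)

print(network(1.0, 2.0))   # centre: inside inner circle -> 0
print(network(1.0, 4.0))   # distance 2: in the annulus  -> 1
print(network(1.0, 9.0))   # distance 7: outside         -> 0
```

Reading the weights off the code gives the symbolic answer: hidden weights (-2a, -2b, 1, 1) with bias a^2 + b^2 - r1^2, hidden weights (2a, 2b, -1, -1) with bias r2^2 - a^2 - b^2, and output weights (1, 1) with bias -1.5.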
4. (15%) Consider an extended version of adaline, called q-adaline here for convenience. A q-adaline with two inputs (x1 and x2) and one output (y) has weights w12, w1, w2, and θ that relate the inputs to the output by the following equation:

y = w12·x1·x2 + w1·x1 + w2·x2 + θ
(a) [8%] Using the gradient-descent approach, derive the weight updating rules for w12, w1, w2, and θ. The total quadratic error over the entire training set of n examples should be used as the error function.
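The updates asked for in part (a) follow from the chain rule: with E = Σ_k (y_k − t_k)^2, we get ∂E/∂w12 = Σ_k 2(y_k − t_k)·x1_k·x2_k, and analogously for w1, w2, and θ (with factors x1_k, x2_k, and 1). A sketch of batch gradient descent under these rules; the toy data and target weights are illustrative assumptions, not from the exam:

```python
import numpy as np

# q-adaline: y = w12*x1*x2 + w1*x1 + w2*x2 + theta.
# Since x1*x2 is just a fixed extra feature, y is linear in the
# weights and the updates mirror plain adaline's.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 20)
x2 = rng.uniform(-1, 1, 20)
T = 2*x1*x2 - x1 + 0.5*x2 + 0.3          # realisable toy targets

w12 = w1 = w2 = theta = 0.0
eta = 0.005                               # small rate: E is total, not mean
for _ in range(5000):
    err = (w12*x1*x2 + w1*x1 + w2*x2 + theta) - T   # y_k - t_k
    # each weight moves opposite its gradient of E = sum(err^2):
    w12   -= eta * np.sum(2 * err * x1 * x2)
    w1    -= eta * np.sum(2 * err * x1)
    w2    -= eta * np.sum(2 * err * x2)
    theta -= eta * np.sum(2 * err)

print(w12, w1, w2, theta)   # approaches the generating (2, -1, 0.5, 0.3)
```

Because y is linear in all four weights, E is a convex quadratic, which is also the key observation for part (c).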
(b) [4%] What is the advantage of q-adaline over adaline? What is its disadvantage?
(c) [3%] Does the error function have local minima? Explain your answer.
5. (10%)

(a) [6%] Suppose we begin with all weights having exactly the same value and run the back-propagation learning algorithm on some training data. Which statements below will al