Generative Adversarial Networks
INF5860 — Machine Learning for Image Analysis
Ole-Johan Skrede
02.05.2018
University of Oslo

Outline
∙ Repetition
∙ Generative Adversarial Networks
∙ Other adversarial methods

Repetition

Autoencoders
∙ An autoencoder f consists of an encoder g and a decoder h
∙ The encoder maps the input x to some representation z: g(x) = z
∙ We often call this representation z the code or the latent vector
∙ The decoder maps this representation z to some output x̂: h(z) = x̂
∙ We want to train the encoder and decoder such that f(x) = h(g(x)) = x̂ ≈ x
∙ Commonly used for compression, feature extraction and de-noising

Compression autoencoder — MNIST example
[Figure: (a) Original, (b) Reconstructed]

De-noising autoencoder — MNIST example
[Figure: (a) Original, (b) Reconstructed]

Variational autoencoders
∙ A variational autoencoder is designed to have a continuous latent space
∙ This makes it ideal for random sampling and interpolation
∙ It achieves this by forcing the encoder g to generate Gaussian representations, z ∼ N(μ, σ²)
∙ More precisely, for one input, the encoder generates a mean μ and a variance σ²
∙ We then sample a zero-mean, unit-variance Gaussian z̃ ∼ N(0, 1)
∙ Construct the input z to the decoder from this: z = μ + σ z̃
∙ With this, z is sampled from q = N(μ, σ²)

Intuition
∙ This is a stochastic sampling
∙ That is, we can sample different z from the same set of (μ, σ²)
∙ The intuition is that the decoder "learns" that for a given input x:
  ∙ the point z is important for reconstruction
  ∙ but also a neighbourhood of z
∙ In this way, we have smoothed the latent space, at least locally
∙ In the previous lecture, we learnt ways to achieve this

VAE example: reconstruction
[Figure: (a) Original, (b) Reconstructed]

VAE example: generation of new signals
∙ Sample a random latent vector z from N(0, 1)
∙ Decode z
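The autoencoder slides can be illustrated with a minimal sketch. For a linear autoencoder with mean-squared reconstruction loss, the optimal encoder/decoder pair is known to be given by PCA, so we can build g and h directly from an SVD rather than training a network. The toy data, dimensions, and variable names below are illustrative assumptions, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in R^8 that lie on a 2-D subspace,
# so a 2-D code z is enough for exact reconstruction.
basis = rng.normal(size=(2, 8))
x = rng.normal(size=(200, 2)) @ basis

# Optimal linear autoencoder: encode with the top-k right singular
# vectors of the data, decode with their transpose.
k = 2
_, _, vt = np.linalg.svd(x, full_matrices=False)
W = vt[:k].T                     # columns: top-k principal directions

def g(x):                        # encoder: input x -> code z
    return x @ W

def h(z):                        # decoder: code z -> reconstruction x_hat
    return z @ W.T

x_hat = h(g(x))                  # f(x) = h(g(x)) ≈ x
print(np.max(np.abs(x_hat - x)))  # near zero: data lies on the subspace
```

Because the data sits exactly on a 2-D subspace, the 2-D code loses nothing here; with real images (as in the MNIST slides) the code is lossy and the reconstruction is only approximate.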
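The VAE sampling step on the slides (sample z̃ ∼ N(0, 1), then form z = μ + σ z̃) is the reparameterisation trick: the randomness is isolated in z̃, so gradients can flow through μ and σ during training. A small numerical check, with made-up values for μ and σ standing in for an encoder's output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend the encoder produced these for one input x
# (illustrative values, not from the lecture).
mu = np.array([0.5, -1.0])
sigma = np.array([0.1, 0.3])     # standard deviation

# Reparameterisation: z_tilde ~ N(0, 1), then z = mu + sigma * z_tilde.
# Many samples let us verify that z is indeed distributed as N(mu, sigma^2).
z_tilde = rng.standard_normal(size=(10000, 2))
z = mu + sigma * z_tilde

print(z.mean(axis=0))   # close to mu
print(z.std(axis=0))    # close to sigma
```

Generating new signals (last slide) is then just drawing z directly from N(0, 1) and passing it through the decoder, with no encoder involved.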