
Unifying Random Fourier Features and Leverage Scores for Kernel Matrix Approximation (PDF)

131 Pages · 2017 · 5.32 MB · English

Nonlinear Dimensionality Reduction for Faster Kernel Methods in Machine Learning
Christopher Musco, Massachusetts Institute of Technology
February 27, 2018

Relevant paper (ICML 2017): "Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees." Joint work with Haim Avron (TAU), Michael Kapralov (EPFL), Cameron Musco (MIT), Ameya Velingker (EPFL), and Amir Zandieh (EPFL).

Outline
Main idea: study Fourier kernel approximation methods from a matrix sampling point of view.
Specifics:
• Analyze the Random Fourier Features method (Rahimi, Recht NIPS '07) using techniques based on leverage scores.
• Develop an improved Random Fourier Features method based on this analysis (better in theory and experiments).
Lots of open questions and directions for future work: opportunities to combine techniques from randomized linear algebra and Fourier methods.

Quick refresher on kernel methods
Kernel methods in machine learning adapt standard linear learning methods (least squares regression, support vector machines, PCA, k-means clustering) to learn nonlinear relationships. (Theoretically well understood, multipurpose, widely used.)

"Lift" data points to a higher-dimensional feature space. E.g., map x to all pairwise products of its coordinates:

x = [x_1, x_2, …, x_d]^T  ⟹  ϕ(x) = [x_1x_1, x_1x_2, …, x_dx_d]^T
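The pairwise-product lift above can be sketched in a few lines of NumPy (a minimal illustration, not from the slides; the function name `phi` is mine). It also shows the kernel trick that motivates kernel methods: the d²-dimensional inner product ⟨ϕ(x), ϕ(y)⟩ collapses to (x·y)², so it can be evaluated in O(d) time without ever forming ϕ(x).

```python
import numpy as np

def phi(x):
    # Degree-2 polynomial feature lift: all pairwise products x_i * x_j,
    # so phi(x) lives in R^(d^2).
    return np.outer(x, x).ravel()

x = np.array([1.0, 2.0, 3.0])
y = np.array([0.5, -1.0, 2.0])

# Kernel trick: <phi(x), phi(y)> = sum_{i,j} x_i x_j y_i y_j = (x . y)^2,
# so the lifted inner product never requires materializing phi.
lifted = np.dot(phi(x), phi(y))
kernel = np.dot(x, y) ** 2
print(abs(lifted - kernel) < 1e-12)  # -> True
```

This is exactly why explicit lifts become expensive (d² features here, worse for higher degrees) and why the lecture turns to kernel approximation methods that avoid the full lift.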

Description:
Slides on kernel methods in machine learning, including a quick refresher. Key technical points:
• tr(K(K + λI)^{-1}) is the statistical dimension.
• Since A ⪯ B ⇒ B^{-1} ⪯ A^{-1}, a spectral approximation ZZ* + λI of K + λI gives (ZZ* + λI)^{-1} ≈ (K + λI)^{-1}.
• Spectral approximation also gives approximation bounds for kernel PCA, k-means, CCA, etc.