
Showing posts from December, 2012

Kernel Approximations for Efficient SVMs (and other feature extraction methods) [update]

Recently we added another method for kernel approximation, the Nyström method, to scikit-learn, which will be featured in the upcoming 0.13 release. Kernel approximations were my first somewhat bigger contribution to scikit-learn, and I have been thinking about them for a while. To dive into kernel approximations, first recall the kernel trick.
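Here is a minimal sketch of what using it looks like: the Nystroem transformer in front of a plain linear classifier in a pipeline. Take the dataset and the gamma/n_components values with a grain of salt; they are illustrative picks for the example, not tuned settings.

    from sklearn.datasets import load_digits
    from sklearn.kernel_approximation import Nystroem
    from sklearn.linear_model import SGDClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # A small built-in dataset, purely for illustration.
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Approximate an RBF kernel with a low-rank Nystroem feature map,
    # then train a fast linear classifier on the transformed features.
    # gamma and n_components are illustrative values, not tuned.
    clf = make_pipeline(
        Nystroem(kernel="rbf", gamma=0.002, n_components=300, random_state=0),
        SGDClassifier(random_state=0),
    )
    clf.fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))

The appeal is that the explicit feature map scales with n_components rather than with the number of training samples, so you get RBF-kernel-like behaviour at roughly linear-classifier cost.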

Another look at MNIST

I'm a bit obsessed with MNIST, mainly because I think it should not be used in any papers any more: it is weird for a lot of reasons. When preparing the workshop we held yesterday, I noticed one that I wasn't aware of yet: most of the 1-vs-1 subproblems are really easy! Basically all pairs of numbers can be separated perfectly using a linear classifier! And even if you just do a PCA to two dimensions, they can pretty much still be linearly separated! It doesn't get much easier than that. This makes me even more sceptical about "feature learning" results on this dataset. To illustrate my point, here are all pairwise PCA projections. The image is pretty huge; otherwise you wouldn't be able to make out individual data points. You can generate it using this very simple gist. There are some classes that are not obviously separated: 3 vs 5, 4 vs 9, 5 vs 8 and 7 vs 9. But keep in mind, this is just a PCA to two dimensions. It doesn't mean that…
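If you want to try a single pair yourself without running the full gist, here is a rough sketch of the experiment. Loading MNIST via fetch_openml is an assumption of this sketch (the gist may load the data differently), and the digit pair and classifier are just examples.

    from sklearn.datasets import fetch_openml
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    # Download MNIST; labels come back as strings ('0' ... '9').
    X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)

    # Pick one 1-vs-1 subproblem, e.g. 0 vs 1 (try 3 vs 5 for a harder pair).
    mask = (y == "0") | (y == "1")
    X_pair, y_pair = X[mask], y[mask]

    # Project just this pair down to two principal components.
    X_2d = PCA(n_components=2).fit_transform(X_pair)

    # Training accuracy of a linear classifier on the 2d projection;
    # values close to 1.0 mean the pair is (almost) linearly separable
    # even after throwing away all but two dimensions.
    clf = LogisticRegression(max_iter=1000).fit(X_2d, y_pair)
    print("training accuracy:", clf.score(X_2d, y_pair))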

Workshop on Python, Machine Learning and Scikit-Learn

Today there was a workshop at my uni, organized by my professor Sven Behnke together with my colleagues Hannes Schulz, Nenard Birešev and me. The target group was a local graduate school with a general scientific background, but not much CS or machine learning. The workshop consisted of us explaining the methods and the students then playing around with them and answering some questions using IPython notebooks that we provided (if you still don't know about IPython notebooks, watch this talk now). Using the notebooks worked out great! There is only so much you can teach in a five-hour workshop, but I think we got across some basic concepts of machine learning and of working with data in Python. We got some positive feedback and the students really went exploring. We covered PCA, k-means, linear regression, logistic regression and nearest neighbors, including some real-world examples. You can find all resources, including the TeX and notebooks for generating figures etc., on GitHub.