Showing posts from June, 2012

Update for structured SVM in Python

I just pushed an update to my structured SVM in Python. It contains a bugfix in the dual formulation and a subgradient descent version of the structured SVM.
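To give an idea of what the subgradient version does, here is a minimal sketch (not the code from the repository) of subgradient descent for a structured SVM, using multiclass classification as the simplest structured problem. The joint feature map `psi`, the learning rate, and the regularization constant are my own illustrative choices; real implementations use problem-specific feature maps and loss-augmented inference.

```python
import numpy as np

def psi(x, y, n_classes):
    """Toy joint feature map: copy the input features into the
    block belonging to label y (multiclass as structured output)."""
    out = np.zeros(n_classes * x.shape[0])
    out[y * x.shape[0]:(y + 1) * x.shape[0]] = x
    return out

def subgradient_ssvm(X, Y, n_classes, C=1.0, lr=0.01, n_iter=100):
    """Subgradient descent on the (primal) structured SVM objective
    ||w||^2 / 2 + C * sum_i max_y [loss(y_i, y) + w.psi(x_i, y)] - w.psi(x_i, y_i)."""
    w = np.zeros(X.shape[1] * n_classes)
    for _ in range(n_iter):
        grad = w.copy()  # gradient of the regularizer ||w||^2 / 2
        for x, y in zip(X, Y):
            # loss-augmented inference: 0-1 loss plus model score
            scores = [(y != y_) + w @ psi(x, y_, n_classes)
                      for y_ in range(n_classes)]
            y_hat = int(np.argmax(scores))
            if y_hat != y:
                # subgradient of the hinge term for this example
                grad += C * (psi(x, y_hat, n_classes) - psi(x, y, n_classes))
        w -= lr * grad
    return w

def predict(w, x, n_classes):
    return int(np.argmax([w @ psi(x, y, n_classes) for y in range(n_classes)]))
```

The only structured ingredient here is the loss-augmented argmax; for real CRF-style outputs that argmax becomes an inference problem (e.g. Viterbi or graph cuts) instead of a loop over labels.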

Structured SVM and Structured Perceptron for CRF learning in Python

[EDIT: If you are reading this now, have a look at . The project has matured quite a bit in the meantime.]  Today I pushed some of the code I use for experimenting with CRF learning to github. This goes along the lines of my recent posts on graph cuts, and I hope to post a full CRF learning framework for semantic image segmentation soon. This is a pretty standard setup in computer vision, but I really haven't found much code online. Actually, I haven't found any code for learning loopy CRFs, so I hope my simple implementation can help to give a better understanding of these methods. It certainly helped me ;)
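To illustrate the structured perceptron part, here is a small self-contained sketch (again, not the repository code) of perceptron learning for a chain-structured model with Viterbi decoding. Feature and weight shapes are my own simplifying assumptions; a loopy CRF would replace the exact Viterbi argmax with approximate inference.

```python
import numpy as np

def viterbi(unary, trans):
    """Exact MAP inference on a chain.
    unary: (T, K) per-position label scores; trans: (K, K) transition scores."""
    T, K = unary.shape
    delta = unary[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + trans + unary[t][None, :]
        back[t] = scores.argmax(axis=0)   # best predecessor per state
        delta = scores.max(axis=0)
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

def perceptron_train(X, Y, n_states, n_feat, n_iter=20):
    """Structured perceptron: predict with the current weights and,
    on a mistake, move weights toward the true labeling and away
    from the predicted one."""
    w_unary = np.zeros((n_states, n_feat))
    w_trans = np.zeros((n_states, n_states))
    for _ in range(n_iter):
        for x, y in zip(X, Y):            # x: (T, n_feat), y: list of tags
            y_hat = viterbi(x @ w_unary.T, w_trans)
            if y_hat != list(y):
                for t, (yt, yh) in enumerate(zip(y, y_hat)):
                    w_unary[yt] += x[t]
                    w_unary[yh] -= x[t]
                for t in range(1, len(y)):
                    w_trans[y[t - 1], y[t]] += 1
                    w_trans[y_hat[t - 1], y_hat[t]] -= 1
    return w_unary, w_trans
```

Note that the perceptron only ever calls the inference routine as a black box, which is exactly why the same learning loop carries over to loopy graphs once you plug in an approximate MAP solver.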

Basics on structured learning and prediction

I just pushed some of my structured learning code to github and hope that some people might find it useful. Before describing my code here, I wanted to give a basic intro to structured prediction. I hope I can at least convey some intuition for this vast research area. So here goes...

What is structured learning and prediction? Structured prediction is a generalization of the standard paradigms of supervised learning, classification and regression. All of these can be thought of as finding a function that minimizes some loss over a training set. The differences lie in the kinds of functions that are used and in the losses.

In classification, the target domain consists of discrete class labels, and the loss is usually the 0-1 loss, i.e. counting the misclassifications. In regression, the target domain is the real numbers, and the loss is usually mean squared error. In structured prediction, both the target domain and the loss are more or less arbitrary. This means the goal is not to predict
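The losses mentioned above are easy to write down concretely. Here is a quick sketch of the three of them, with Hamming loss over a label sequence standing in as one common choice of structured loss (the structured loss is, as said above, essentially arbitrary and problem-dependent):

```python
import numpy as np

def zero_one_loss(y_true, y_pred):
    """Classification: 1 for a misclassification, 0 otherwise."""
    return int(y_true != y_pred)

def squared_error(y_true, y_pred):
    """Regression: squared difference of real-valued outputs."""
    return (y_true - y_pred) ** 2

def hamming_loss(y_true, y_pred):
    """One possible structured loss: fraction of sequence positions
    where the predicted labeling disagrees with the ground truth."""
    return float(np.mean(np.asarray(y_true) != np.asarray(y_pred)))
```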