International Workshop on Advances in Regularization, Optimization, Kernel Methods and Support Vector Machines (ROKS): theory and applications, Leuven 2013
One area of high impact in both theory and applications is kernel methods and support vector machines, in which optimization problems, learning, and model representations are key ingredients. At the same time, considerable progress has been made on the regularization of parametric models, including methods for compressed sensing and sparsity, where convex optimization plays a prominent role. The aim of ROKS 2013 is to provide a multidisciplinary forum where researchers from these different communities can meet and find new synergies across these areas, both in theory and in applications.
The scope includes but is not limited to:
- Regularization: L2, L1, Lp, lasso, group lasso, elastic net, spectral regularization, nuclear norm, and others
- Support vector machines, least squares support vector machines, kernel methods, Gaussian processes, and graphical models
- Lagrange duality, Fenchel duality, estimation in Hilbert spaces, reproducing kernel Hilbert spaces, Banach spaces, operator splitting
- Optimization formulations, optimization algorithms
- Supervised, unsupervised, semi-supervised learning, inductive and transductive learning
- Multi-task learning, multiple kernel learning, choice of kernel functions, manifold learning
- Prior knowledge incorporation
- Approximation theory, learning theory, statistics
- Matrix and tensor completion, learning with tensors
- Feature selection, structure detection, regularization paths, model selection
- Sparsity and interpretability
- On-line learning and optimization
- Applications in machine learning, computational intelligence, pattern analysis, system identification, signal processing, networks, data mining, and others
- Software
For more information, visit the workshop's website.
Program:
- Opening
- Invited talks
- Oral session 1: Feature selection and sparsity
- Oral session 2: Optimization algorithms
- Oral session 3: Kernel methods and support vector machines
- Oral session 4: Structured low-rank approximation
- Oral session 5: Robustness
- Closing