Guy Lever
Homepage: http://www.cs.ucl.ac.uk/people/G.Lever.html

Description

My research interests are in computational and statistical learning theory. In particular, I'm interested in geometrical aspects of learning theory, graph-theoretic methods in machine learning and the PAC-Bayes analysis. I'm also interested in neural networks.

The current focus of my research is the geometrical aspects of machine learning; in particular, understanding the role played in the learning process by the intrinsic geometry of the data or of the data-generating distribution. The working assumption is that the intrinsic structure of data (e.g. its existence on a manifold, its cluster structure, or its existence on a graph) plays a key role in the learning process, so a good analysis of learning should relate to the relevant data structure. A goal is to develop sharper explanations of the learning process by a "correct" handling of the geometry involved, and then to use this analysis to develop learning methodologies which exploit the intrinsic geometry of data and obtain performance guarantees relative to the (observed) intrinsic data structure.

Because of their geometrical aspects, I'm interested in transduction and semi-supervised learning.

In Relating Function Class Complexity and Cluster Structure in the Function Domain with Applications to Transduction, I derive results relating the Rademacher complexity of a function class to the intrinsic structure of the input data. These results make precise the intuitive notion that when the data is nicely structured (here, well-clustered by k-means in a natural intrinsic metric) we can learn well with fewer examples, and they explicitly quantify this improvement, which can be large, in terms of observed structural quantities.
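For concreteness, here is the standard definition of the quantity in question, the empirical Rademacher complexity of a function class $\mathcal{F}$ on a sample $x_1, \dots, x_m$ (textbook notation, not necessarily the paper's):

\[
\hat{\mathcal{R}}_m(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{f \in \mathcal{F}} \frac{1}{m}\sum_{i=1}^{m} \sigma_i f(x_i)\right],
\]

where the $\sigma_i$ are i.i.d. uniform $\pm 1$ (Rademacher) random variables. Standard generalisation bounds scale with this quantity, so bounding it in terms of observed cluster structure converts structure directly into sample-complexity savings.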

A graph built on random samples plays a pivotal role in our current means of understanding the intrinsic geometry of data, and an increasing amount of large real-world data naturally lives on graphs and networks. I have therefore worked on some foundational topics: quantifying the geometry of a graph and its relevance to machine learning, understanding the complexity of learning functions defined over the vertices of a graph, and analysing the performance of algorithms which exploit the structure of a graph, in terms of that structure. For example, Predicting the labelling of a graph via minimum p-seminorm interpolation develops the duality between the foundational mincut, or minimum "smoothness functional", framework for learning over a graph and the resistance metric on the graph. A specific result is that, in the context of online graph label prediction, the dependence of regret bounds on the radius of a graph can be reduced from linear to logarithmic. Online prediction on large diameter graphs presents fast methods for learning over large graphs with almost state-of-the-art performance guarantees.
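To illustrate the p = 2 case of this framework: the label assignment minimising the quadratic smoothness functional $\sum_{i,j} W_{ij}(u_i - u_j)^2$, subject to agreeing with the observed labels, is harmonic on the unlabelled vertices and is found by solving a linear system in the graph Laplacian. The following is a minimal sketch under that assumption; the function name and toy graph are illustrative, not taken from the paper.

    # Minimum 2-seminorm label interpolation on a graph (illustrative sketch).
    import numpy as np

    def interpolate_labels(W, labelled, y):
        """Minimise sum_{i,j} W[i,j] * (u[i] - u[j])**2 subject to
        u = y on the labelled vertices. The minimiser is harmonic on the
        unlabelled vertices, i.e. it solves L_UU u_U = -L_UL y_L, where
        L = D - W is the combinatorial graph Laplacian."""
        n = W.shape[0]
        L = np.diag(W.sum(axis=1)) - W
        unlabelled = [i for i in range(n) if i not in set(labelled)]
        L_UU = L[np.ix_(unlabelled, unlabelled)]
        L_UL = L[np.ix_(unlabelled, labelled)]
        u = np.zeros(n)
        u[labelled] = y
        u[unlabelled] = np.linalg.solve(L_UU, -L_UL @ np.asarray(y, float))
        return np.sign(u)

    # Toy usage: a 4-vertex path with endpoints labelled +1 and -1
    # recovers the intuitive cut in the middle: [ 1.  1. -1. -1.]
    W = np.array([[0., 1., 0., 0.],
                  [1., 0., 1., 0.],
                  [0., 1., 0., 1.],
                  [0., 0., 1., 0.]])
    print(interpolate_labels(W, labelled=[0, 3], y=[1, -1]))

For general p >= 1 the objective $\sum_{i,j} W_{ij}|u_i - u_j|^p$ remains convex and can be minimised numerically; varying p trades off how sharply the interpolant respects cluster boundaries.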

More recently I have taken an interest in sharper explanations of the learning process, such as the PAC-Bayes approach, and have been working on incorporating geometrical aspects of learning into the PAC-Bayes analysis.
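To fix notation, a standard form of the PAC-Bayes theorem (the Seeger-Maurer version, stated here only for context): for any prior $P$ over hypotheses fixed before seeing the data, with probability at least $1 - \delta$ over an i.i.d. sample of size $m$, simultaneously for all posteriors $Q$,

\[
\mathrm{kl}\!\left(\hat{L}(Q)\,\middle\|\,L(Q)\right) \;\le\; \frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{2\sqrt{m}}{\delta}}{m},
\]

where $\hat{L}(Q)$ and $L(Q)$ are the empirical and true risks of the Gibbs classifier drawn from $Q$, and $\mathrm{kl}$ is the KL divergence between Bernoulli distributions. The bound is controlled by $\mathrm{KL}(Q\,\|\,P)$, so a prior which depends on the data-generating distribution (though not on the sample) can sit closer to good posteriors and tighten the bound; this is the idea behind the distribution-dependent priors discussed in the lecture listed below.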

See my publications page for more details.

I am affiliated with the Centre for Computational Statistics and Machine Learning (CSML).


Lectures:

- Distribution-Dependent PAC-Bayes Priors, as author at PASCAL Foundations and New Trends of PAC Bayesian Learning, London 2010
- Function class complexity and cluster structure with applications to transduction, as author at the 13th International Conference on Artificial Intelligence and Statistics (AISTATS), Sardinia 2010
- Online Prediction on Large Diameter Graphs, as author at the NIPS Workshop on New Challenges in Theoretical Machine Learning: Learning with Data-dependent Concept Spaces, Whistler 2008
- Composite learning systems using machine learning primitives, as author at the Cognitive Systems Workshop & Thematic Programmes and Pump Priming Workshops, Cumberland Lodge 2012