Characterise graph classes (for example, trees, planar graphs, regular graphs, cactus graphs, …) satisfying conditions on convexity-related parameters, such as bounds on the Radon number, the geodesic iteration number, or the separation axioms of the induced convexity space. The goal is to find general graph classes where these conditions hold, that is, to give necessary and sufficient conditions.
First, understand the basic notions of graph convexity spaces.
Second, choose any subtopic (Radon numbers, geodesic iteration numbers, separation,…), understand the basic results and then try to generalise from there.
Suggested possible starting points are:
This is quite a long list of useful books and papers. We recommend starting with Pelayo’s book and reading the other mentioned material only if needed.
Primary resource (book defining all needed concepts):
Additional, more general book on convexity spaces:
Classical papers discussing (graph) convexity spaces:
Graph convexity spaces and computational complexity:
Recent papers employing convexity aspects of graphs in machine learning:
The mentioned parameters and notions determine the learnability of vertex-classification tasks on graphs. Hence, we are interested in general graph classes where such bounds and conditions hold.
Classical notions like convex hulls, halfspaces, and (linear) separability can be generalised from Euclidean space to arbitrary convexity spaces. We are particularly interested in convex sets in graphs defined by shortest paths: a set X of vertices is convex if every shortest path with both endpoints in X stays entirely within X.
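To make the definition concrete, here is a minimal Python sketch (the function names and adjacency-list representation are our own choice, not from the suggested reading). It uses the standard fact that, in an unweighted graph, a vertex w lies on some shortest u–v path exactly when d(u,w) + d(w,v) = d(u,v):

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    """Single-source shortest-path distances via BFS (unweighted graph)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def is_geodesically_convex(adj, X):
    """X is convex iff every vertex on a shortest path between two
    vertices of X lies in X (i.e., the interval I[X] is contained in X)."""
    X = set(X)
    dist = {v: bfs_dist(adj, v) for v in X}
    for u, v in combinations(X, 2):
        d = dist[u].get(v)
        if d is None:
            continue  # u and v in different components: no shortest path
        for w in adj:
            du, dv = dist[u].get(w), dist[v].get(w)
            # w is on a shortest u-v path iff d(u,w) + d(w,v) = d(u,v)
            if du is not None and dv is not None and du + dv == d and w not in X:
                return False
    return True

# On the 4-cycle 0-1-2-3-0: {0, 2} is not convex (both shortest
# 0-2 paths leave the set), while the edge {0, 1} is convex.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(is_geodesically_convex(c4, {0, 2}))  # False
print(is_geodesically_convex(c4, {0, 1}))  # True
```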
Even though convex sets in graphs have mostly been studied from a theoretical computer science perspective (e.g., algorithms, bounds, hardness), only recently have researchers started to use this notion in machine learning settings. In particular, we are interested in vertex classification tasks: a graph with some vertex labels is given, and the goal is to predict the remaining ones. Convexity-based assumptions are used to achieve provable bounds on the performance of the learning algorithms. For example, we might assume that the classes are convex in the graph.
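The following sketch illustrates one way such an assumption can be exploited; it is a hypothetical one-round inference rule of our own, not an algorithm from the cited literature. If each class is assumed to be geodesically convex, then any unlabelled vertex lying on a shortest path between two vertices of the same class is forced to carry that class's label:

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    """Single-source shortest-path distances via BFS (unweighted graph)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def propagate_convex_labels(adj, labels):
    """One round of convexity-based inference: if w lies on a shortest
    path between two vertices sharing a label, w inherits that label.
    (A sketch: overlapping class intervals would violate the convexity
    assumption, so we do not resolve such conflicts here.)"""
    labels = dict(labels)
    by_class = {}
    for v, c in labels.items():
        by_class.setdefault(c, []).append(v)
    for c, verts in by_class.items():
        dist = {v: bfs_dist(adj, v) for v in verts}
        for u, v in combinations(verts, 2):
            d = dist[u].get(v)
            if d is None:
                continue
            for w in adj:
                du, dv = dist[u].get(w), dist[v].get(w)
                if du is not None and dv is not None and du + dv == d:
                    labels[w] = c  # forced, assuming class c is convex
    return labels

# Path 0-1-2-3-4 with labels a, ?, a, ?, b: vertex 1 lies between the
# two a-labelled vertices, so it must be labelled a; vertex 3 stays open.
p5 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
out = propagate_convex_labels(p5, {0: "a", 2: "a", 4: "b"})
print(out)
```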
Various tools and notions have been developed to characterise properties of convexity spaces. These include generalisations of classical parameters like the Radon, Helly, and Carathéodory numbers, the separation capabilities of the space (i.e., can a halfspace in the space separate two classes?), and the geodesic iteration number.
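For instance, the geodesic convex hull of a vertex set can be computed by iterating the interval operator I[S] (all vertices on shortest paths between members of S) until a fixpoint is reached; the number of iterations needed is closely related to the geodesic iteration number. A minimal sketch, with function names of our own choosing:

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    """Single-source shortest-path distances via BFS (unweighted graph)."""
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def interval(adj, S):
    """I[S]: S together with all vertices on some shortest path
    between two vertices of S."""
    S = set(S)
    dist = {v: bfs_dist(adj, v) for v in S}
    out = set(S)
    for u, v in combinations(S, 2):
        d = dist[u].get(v)
        if d is None:
            continue
        for w in adj:
            du, dv = dist[u].get(w), dist[v].get(w)
            if du is not None and dv is not None and du + dv == d:
                out.add(w)
    return out

def convex_hull(adj, S):
    """Smallest convex superset of S, plus the number of interval
    iterations needed to reach it."""
    hull, steps = set(S), 0
    while True:
        nxt = interval(adj, hull)
        if nxt == hull:
            return hull, steps
        hull, steps = nxt, steps + 1

# On the 4-cycle 0-1-2-3-0, the hull of {0, 2} is the whole cycle,
# reached after one application of the interval operator.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(convex_hull(c4, {0, 2}))  # ({0, 1, 2, 3}, 1)
```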
We have found that the discussed parameters determine to a large extent whether learning is possible. Therefore, we are interested in graph classes satisfying these assumptions. In the long run, this might enable applications in domains like biology or social networks, where we can expect the graphs to belong to relatively restricted graph classes (for example, planar graphs, partial cubes, bipartite graphs, etc.).