Deep Metric Learning on Graphs
Co-Supervised by: Anthony Gillioz and Linlin Jia
If you are interested in this topic or have further questions, do not hesitate to contact anthony.gillioz@unibe.ch or linlin.jia@unibe.ch.
Context
Similarity measurement forms the foundation of a multitude of machine learning algorithms. With the advent of deep learning, innovative methods have been proposed to address the challenge of similarity measurement, ushering in the era of Deep Metric Learning. This approach leverages the representational capabilities of artificial neural networks to extract deep feature embeddings. These embeddings are typically obtained with so-called siamese networks, trained with either contrastive or triplet loss functions.
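For reference, the two classical losses mentioned above can be sketched in a few lines (a minimal NumPy illustration on raw vectors; in the actual project these would be computed on learned embeddings, e.g. in PyTorch):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet loss: require the positive to be closer to the anchor
    than the negative, by at least `margin` (squared Euclidean)."""
    d_pos = np.sum((anchor - positive) ** 2)
    d_neg = np.sum((anchor - negative) ** 2)
    return max(d_pos - d_neg + margin, 0.0)

def contrastive_loss(x1, x2, same_class, margin=1.0):
    """Contrastive loss: pull same-class pairs together, push
    different-class pairs at least `margin` apart."""
    d = np.linalg.norm(x1 - x2)
    if same_class:
        return d ** 2
    return max(margin - d, 0.0) ** 2
```

Both losses only ever see two or three samples at a time, which is precisely the limitation the group loss targets.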
This procedure has been adapted to accommodate graph structures, a powerful tool for representing intricate real-world data and relational dynamics. Graph Neural Networks (GNNs) have emerged as suitable deep learning models for this purpose. They utilize similarity or dissimilarity measures, such as graph kernels and graph edit distances, to facilitate metric learning on these complex structures.
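As a toy illustration of a graph similarity measure, one of the simplest graph kernels compares node-label histograms (the function name and inputs are our own; real experiments would use richer kernels such as Weisfeiler-Lehman, or graph edit distances):

```python
from collections import Counter

def vertex_histogram_kernel(labels_g1, labels_g2):
    """Vertex-label histogram kernel (sketch): the dot product of the
    two graphs' node-label count vectors. Graphs are represented here
    only by their lists of node labels."""
    h1, h2 = Counter(labels_g1), Counter(labels_g2)
    return sum(h1[l] * h2[l] for l in h1.keys() & h2.keys())
```

For example, two small molecules with node labels `['C', 'C', 'O']` and `['C', 'O', 'O']` share two carbon matches (2 × 1) and two oxygen matches (1 × 2), giving a kernel value of 4.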
In this project, we aim to investigate and harness the potential of a recently proposed loss function known as “group loss”, tailoring it specifically for deep metric learning on graph structures. Rather than comparing isolated pairs or triplets, the group loss considers the similarities among all sample embeddings within a given group (e.g., a mini-batch), addressing a key limitation of contrastive and triplet loss functions. By overcoming this shortcoming, the group loss has the potential to achieve superior performance across a range of tasks, marking an important advancement in deep metric learning on graphs.
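The core refinement step of the group loss can be sketched as a replicator-dynamics iteration over a batch. The following is a simplified NumPy sketch based on our reading of Elezi et al.; the published method additionally anchors some samples to their ground-truth labels and initializes the assignments from a classifier's softmax output:

```python
import numpy as np

def refine_assignments(X, W, n_iters=5):
    """Replicator-dynamics refinement at the heart of the group loss
    (sketch). X is an (n_samples x n_classes) matrix of soft class
    assignments; W is a non-negative (n x n) pairwise similarity
    matrix. Each sample's assignment is iteratively reinforced by
    the assignments of similar samples."""
    for _ in range(n_iters):
        support = W @ X                        # class support from similar samples
        X = X * support                        # replicator update
        X = X / X.sum(axis=1, keepdims=True)   # renormalize rows to probabilities
    return X
```

After refinement, a standard cross-entropy loss on the refined assignments propagates gradients back through both the similarities and the classifier, so the whole group of embeddings shapes the loss at once.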
Goal(s)
- Adapt group loss to graph datasets and graph neural networks.
- Evaluate and compare the performance of the group loss against state-of-the-art (SOTA) loss functions.
- Propose possible improvements to the group loss.
- (optional) Adapt the group loss to address regression and classification problems.
- (optional) Draft and submit a conference paper detailing the findings of this research.
Approach
- Adapting the group loss to graph structures can be straightforward by taking advantage of off-the-shelf implementations of graph neural networks and open benchmark graph datasets. During this process, certain modifications might be necessary to accommodate the unique characteristics of graph data.
- Extensions to classification problems can be achieved through at least three possible approaches: fine-tuning a network for prediction, applying a metric-based classifier such as an SVM, or training a network end-to-end for prediction by combining losses on metrics and targets.
- While classification labels are crucial for computing loss, regression tasks might require a different approach, possibly involving segmentation or sampling techniques.
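The third classification route above, combining metric and target losses, could take the form of a simple weighted sum; the function names and the fixed weight `lam` below are illustrative assumptions, and in practice the weight would be a tuned hyperparameter:

```python
import numpy as np

def cross_entropy(probs, label):
    """Classification loss on predicted class probabilities."""
    return -np.log(probs[label])

def combined_loss(metric_loss, probs, label, lam=0.5):
    """End-to-end objective (sketch): weighted sum of a metric-learning
    loss (e.g. the group loss on embeddings) and a classification loss
    on the predicted targets."""
    return metric_loss + lam * cross_entropy(probs, label)
```

Training end-to-end on such a combined objective lets the embedding space and the classifier adapt to each other, instead of freezing one while fitting the other.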
Required Skills
- Basic statistics and some knowledge of graph algorithms; familiarity with graph neural networks is appreciated.
- Good programming skills, with a preference for Python and PyTorch.
- Communication in English (or Chinese 😉).
Further Reading(s)
- Elezi, I., Vascon, S., Torcinovich, A., Pelillo, M., & Leal-Taixé, L. (2020). The group loss for deep metric learning. In Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23–28, 2020, Proceedings, Part VII 16 (pp. 277-294). Springer International Publishing.
- Elezi, I., Seidenschwarz, J., Wagner, L., Vascon, S., Torcinovich, A., Pelillo, M., & Leal-Taixé, L. (2022). The group loss++: A deeper look into group loss for deep metric learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(2), 2505-2518.