Abstract

Dimensionality reduction techniques map data from a high-dimensional space onto a lower-dimensional one with varying degrees of information loss. Graph dimensionality reduction techniques follow the same principle, producing latent representations of graph structure, with adaptations to both the input data and the output representations. Several state-of-the-art techniques efficiently generate embeddings from graph data and project them onto low-dimensional latent spaces. Because these techniques differ in their operating principles, no single technique is advantageous for every scenario or dataset. As a result, some techniques excel at representing relationships between nodes in low dimensions, while others are better at encapsulating the entire graph structure in a low-dimensional space. We present this survey to outline the benefits as well as the problems associated with existing graph dimensionality reduction techniques, and we also attempt to connect the dots regarding potential improvements to some of them. This survey could be helpful for researchers interested in exploring graph representation learning to effectively produce low-dimensional graph embeddings with varying degrees of granularity.

Authors: Just me :)

Papers: https://arxiv.org/abs/2211.05594

Code: Survey Paper, so no code :)
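
To make the idea of mapping nodes to a low-dimensional latent space concrete, here is a minimal sketch of one classical node-level technique, Laplacian eigenmaps. It is not taken from the survey itself; the graph, target dimensionality, and variable names are illustrative assumptions.

```python
# Minimal Laplacian-eigenmaps-style sketch: embed each node of a graph into a
# low-dimensional space using eigenvectors of the normalized graph Laplacian.
# Illustrative only; not code from the survey paper.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()                       # small example graph (assumption)
L = nx.normalized_laplacian_matrix(G).toarray()  # normalized graph Laplacian

# Eigendecompose the Laplacian; eigenvectors for the smallest non-trivial
# eigenvalues give low-dimensional node coordinates.
eigvals, eigvecs = np.linalg.eigh(L)
dim = 2                                          # target latent dimensionality (assumption)
node_embeddings = eigvecs[:, 1:dim + 1]          # skip the trivial first eigenvector

print(node_embeddings.shape)                     # (num_nodes, dim) -> (34, 2)
```

Node-level methods like this capture relationships between nodes; whole-graph methods discussed in the survey instead summarize the entire structure in a single low-dimensional vector.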