(For a full list see below or go to Google Scholar.)

Local Hyper-flow Diffusion

A plethora of real-world problems require hypergraphs and diffusion algorithms: recommendation systems, node ranking in food networks, and community detection in social networks, to name a few. Given the size and complexity of real hypergraphs, local and accurate diffusion algorithms that handle the most general hypergraphs are needed. We propose the first local diffusion method that works on higher-order relations under only a submodularity assumption. Our method is based on a primal-dual optimization formulation in which the primal problem has a natural network-flow interpretation and the dual problem has a cut-based interpretation using an l2-norm penalty for general submodular cut-costs. We prove that the proposed formulation achieves quadratic approximation error for the problem of local hypergraph clustering. We demonstrate that the new technique significantly outperforms state-of-the-art methods on a range of real datasets for local hypergraph clustering and node ranking.

*K. Fountoulakis, P. Li, S. Yang*
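To give intuition for flow diffusion in general (this is a toy, graph-only analogue, not the paper's primal-dual hypergraph algorithm), the sketch below seeds mass at one node, gives every node a unit sink capacity, and repeatedly pushes excess mass to neighbors. The graph, capacities, and push rule are illustrative assumptions; the point is that when the seed mass is small, the diffusion stays local to the seed's well-connected cluster.

```python
import numpy as np

# Toy graph: two 4-cliques (nodes 0-3 and 4-7) joined by one bridge edge.
adj = {i: [] for i in range(8)}
def add_edge(u, v):
    adj[u].append(v); adj[v].append(u)
for group in ([0, 1, 2, 3], [4, 5, 6, 7]):
    for i in range(4):
        for j in range(i + 1, 4):
            add_edge(group[i], group[j])
add_edge(3, 4)  # bridge between the two cliques

mass = np.zeros(8)
mass[0] = 3.5          # seed mass placed on node 0
capacity = np.ones(8)  # each node can absorb 1 unit ("sink capacity")

for _ in range(50):    # push excess mass until every node is under capacity
    excess = np.maximum(mass - capacity, 0.0)
    if excess.sum() < 1e-9:
        break
    new_mass = np.minimum(mass, capacity)
    for u in range(8):
        if excess[u] > 0:  # split the excess evenly among neighbors
            share = excess[u] / len(adj[u])
            for v in adj[u]:
                new_mass[v] += share
    mass = new_mass

support = [u for u in range(8) if mass[u] > 1e-6]
print(support)  # → [0, 1, 2, 3]: mass never leaves the seed's clique
```

Because the total seed mass (3.5) is below the first clique's total capacity (4), the diffusion settles entirely inside the seed's cluster; the nodes that receive mass form the local cluster estimate.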


Graph Convolution for Semi-Supervised Classification: Improved Linear Separability and Out-of-Distribution Generalization

Recently there has been increased interest in semi-supervised classification in the presence of graphical information. A new class of learning models has emerged that relies, at its most basic level, on classifying the data after first applying a graph convolution. To understand the merits of this approach, we study the classification of a mixture of Gaussians, where the data corresponds to the node attributes of a stochastic block model. We show that graph convolution extends the regime in which the data is linearly separable by a factor of roughly 1/sqrt(D), where D is the expected degree of a node, compared to the mixture model data on its own. Furthermore, we find that the linear classifier obtained by minimizing the cross-entropy loss after the graph convolution generalizes to out-of-distribution data, where the unseen data can have different intra- and inter-class edge probabilities from the training data.

*A. Baranwal, K. Fountoulakis, A. Jagannath*
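The core effect is easy to see numerically: a graph convolution here is a degree-normalized neighborhood average, and averaging over roughly D neighbors shrinks the noise standard deviation by about 1/sqrt(D), so class means that were buried in noise become separable. A minimal sketch, with illustrative block sizes, edge probabilities, and class means chosen for the demo (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic block model: two blocks of 200 nodes, intra-edge
# probability p, inter-edge probability q (illustrative values).
n, p, q = 200, 0.5, 0.1
labels = np.array([0] * n + [1] * n)
probs = np.where(labels[:, None] == labels[None, :], p, q)
A = (rng.random((2 * n, 2 * n)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency, no self-loops

# Node attributes: 1-D Gaussian mixture with means -mu/+mu, unit variance.
mu = 0.3
X = rng.normal(loc=np.where(labels == 0, -mu, mu), scale=1.0)

# Graph convolution = degree-normalized neighbor average, D^{-1} A X.
deg = A.sum(1)
X_conv = (A @ X) / deg

def class_gap_over_std(x):
    """Distance between class means, in units of within-class std."""
    m0, m1 = x[labels == 0].mean(), x[labels == 1].mean()
    s = np.concatenate([x[labels == 0] - m0, x[labels == 1] - m1]).std()
    return abs(m1 - m0) / s

# The gap-to-noise ratio grows sharply after convolution, since the
# within-class noise is averaged over ~(p+q)*n neighbors.
print(class_gap_over_std(X), class_gap_over_std(X_conv))
```

The raw attributes have a gap of about 2*mu against unit noise; after convolution the gap shrinks a bit (neighbor averaging mixes the classes in proportion to q/(p+q)) but the noise shrinks far more, which is the separability gain the abstract quantifies.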



Local Hyper-flow Diffusion

*K. Fountoulakis, P. Li, S. Yang*

arXiv:2102.07945

Graph Convolution for Semi-Supervised Classification: Improved Linear Separability and Out-of-Distribution Generalization

*A. Baranwal, K. Fountoulakis, A. Jagannath*

arXiv:2102.06966

Targeted Pandemic Containment Through Identifying Local Contact Network Bottlenecks

*S. Yang, P. Senapati, D. Wang, C. T. Bauch, K. Fountoulakis*

arXiv:2006.06939

p-Norm Flow Diffusion for Local Graph Clustering

*K. Fountoulakis, D. Wang, S. Yang*

International Conference on Machine Learning (ICML) 2020

Flow-based Algorithms for Improving Clusters: A Unifying Framework, Software, and Performance

*K. Fountoulakis, M. Liu, D. F. Gleich, M. W. Mahoney*

arXiv:2004.09608

Statistical Guarantees for Local Graph Clustering

*W. Ha, K. Fountoulakis, M. W. Mahoney*

International Conference on Artificial Intelligence and Statistics (AISTATS) 2020

Parallel and Communication Avoiding Least Angle Regression

*S. Das, J. Demmel, K. Fountoulakis, L. Grigori, M. W. Mahoney, S. Yang*

SIAM Journal on Scientific Computing

Locality And Structure Aware Graph Node Embedding

*E. Faerman, F. Borutta, K. Fountoulakis, M. W. Mahoney*

International Conference on Web Intelligence 2018 (Best student paper award)

A Flexible Coordinate Descent Method

*K. Fountoulakis, R. Tappenden*

Computational Optimization and Applications

Avoiding Synchronization in First-Order Methods for Sparse Convex Optimization

*A. Devarakonda, K. Fountoulakis, J. Demmel, M. Mahoney*

International Parallel and Distributed Processing Symposium (IPDPS) 2018

Variational Perspective on Local Graph Clustering

*K. Fountoulakis, F. Roosta-Khorasani, J. Shun, X. Cheng, M. Mahoney*

Mathematical Programming

Capacity Releasing Diffusion for Speed and Locality

*D. Wang, K. Fountoulakis, M. Henzinger, M. Mahoney, S. Rao*

International Conference on Machine Learning (ICML) 2017

Avoiding communication in primal and dual block coordinate descent methods

*A. Devarakonda, K. Fountoulakis, J. Demmel, M. Mahoney*

SIAM Journal on Scientific Computing (SISC)

A Randomized Rounding Algorithm for Sparse PCA

*K. Fountoulakis, A. Kundu, E. M. Kontopoulou, P. Drineas*

ACM Transactions on Knowledge Discovery from Data

An Optimization Approach to Locally-Biased Graph Algorithms

*K. Fountoulakis, D. Gleich, M. Mahoney*

Proceedings of the IEEE

Performance of First- and Second-Order Methods for L1-Regularized Least Squares Problems

*K. Fountoulakis, J. Gondzio*

Computational Optimization and Applications

A Second-Order Method for Strongly-Convex L1-Regularization Problems

*K. Fountoulakis, J. Gondzio*

Mathematical Programming

Parallel Local Graph Clustering

*J. Shun, F. Roosta-Khorasani, K. Fountoulakis, M. Mahoney*

Proceedings of the VLDB Endowment (VLDB) 2016

A Preconditioner for a Primal-dual Newton Conjugate Gradients Method for Compressed Sensing Problems

*I. Dassios, K. Fountoulakis, J. Gondzio*

SIAM Journal on Scientific Computing (SISC)

Matrix-free interior point method for compressed sensing problems

*K. Fountoulakis, J. Gondzio, P. Zhlobich*

Mathematical Programming Computation

Higher-Order Methods for Large-Scale Optimization

*K. Fountoulakis*

Kimon Fountoulakis' PhD Thesis