
Graph Refinement via Simultaneously Low-rank and Sparse Approximation
Speaker:
Zhenyue Zhang, Professor, Zhejiang University
Inviter:
Aihui Zhou, Professor
Subject:
Graph Refinement via Simultaneously Low-rank and Sparse Approximation
Time and place:
10:00-11:00, March 16 (Thursday), N733
Abstract:

Graphs play an important role in many fields of machine learning, such as clustering. Many graph-based machine learning approaches assume that the graphs have hidden group structures; in applications, however, these group structures are generally unclear or noisy. Graph refinement aims to clarify the underlying group structures. In this work, a novel approach, called Simultaneously Low-rank and Sparse Approximation (SLSA), is proposed for graph refinement; it imposes a strong cluster structure by enforcing strict sparse and low-rank assumptions simultaneously. The approach minimizes a nonconvex objective function. Fortunately, the optimization problem can be solved efficiently via an alternating iteration method, and this iterative method converges globally under a weak condition.
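The abstract does not state the exact SLSA objective or its update rules, so the following Python sketch is only an illustrative stand-in: it alternates a rank-k truncated-SVD step with a per-row hard-thresholding (sparsification) step on an affinity matrix, which is one generic way to impose low-rank and sparse structure at the same time. The function name, the parameters `rank` and `keep`, and the synthetic two-block example are assumptions made for illustration, not the authors' method.

```python
import numpy as np

def lowrank_sparse_refine(W, rank=3, keep=10, n_iter=50):
    """Generic alternating sketch (not the actual SLSA algorithm):
    project onto rank-`rank` matrices via truncated SVD, then keep only
    the `keep` largest-magnitude entries in each row."""
    A = W.astype(float).copy()
    for _ in range(n_iter):
        # Low-rank step: best rank-`rank` approximation in Frobenius norm.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        A = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        # Sparse step: zero all but the `keep` largest-magnitude entries per row.
        drop = np.argsort(np.abs(A), axis=1)[:, :-keep]
        np.put_along_axis(A, drop, 0.0, axis=1)
        # Keep the refined graph symmetric and nonnegative.
        A = np.clip((A + A.T) / 2, 0.0, None)
    return A

# Toy usage: a noisy two-block affinity matrix whose block (cluster)
# structure the refinement should sharpen.
rng = np.random.default_rng(0)
W_noisy = np.kron(np.eye(2), np.ones((20, 20))) + 0.3 * rng.random((40, 40))
W_refined = lowrank_sparse_refine(W_noisy, rank=2, keep=20)
```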

A fast iterative algorithm is also given for large-scale sparse graphs, which costs $O(n)$ per iteration. Compared with two other related methods for graph refinement, SLSA performs better on both synthetic and real-world data sets.
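The abstract does not describe the actual $O(n)$-per-iteration update, so the sketch below only illustrates why sparse storage makes refinement feasible at that scale: keeping the graph in CSR form and computing low-rank factors with a sparse SVD touches only the nonzero entries, so the cost grows with the number of edges and the rank rather than with $n^2$. The helper name and its parameters are hypothetical.

```python
from scipy import sparse
from scipy.sparse.linalg import svds

def lowrank_step_sparse(W_csr, rank=5):
    """Low-rank step for a sparse graph kept in CSR form.  svds operates on
    the nonzero entries only, so one call costs roughly O(nnz(W) * rank)
    rather than O(n^2).  This is a feasibility illustration, not the paper's
    O(n)-per-iteration algorithm."""
    U, s, Vt = svds(W_csr.astype(float), k=rank)
    return U * s, Vt  # rank-`rank` factors, never densified

# Toy usage on a random symmetric sparse graph with roughly 10^5 nonzeros.
W = sparse.random(10_000, 10_000, density=1e-3, format="csr", random_state=0)
W = ((W + W.T) / 2).tocsr()
left, right = lowrank_step_sparse(W, rank=5)
```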

Applications of the SLSA refinement method to several machine learning algorithms are discussed in detail. Numerical experiments show that these algorithms improve significantly under the SLSA modifications, and that the improvements exceed those obtained from the refinements of other approaches.