Abstract Details

Activity Number: 440 - Contributed Poster Presentations: Section on Statistics in Defense and National Security
Type: Contributed
Date/Time: Wednesday, August 10, 2022, 10:30 AM to 12:20 PM (EDT)
Sponsor: Section on Statistics in Defense and National Security
Abstract #323703
Title: HOT-Nets: Higher-Order Topological Neural Networks on Power Distribution System
Author(s): Roshni Anna Jacob* and Yuzhou Chen and Yulia R. Gel and Jie Zhang and H. Vincent Poor
Companies: The University of Texas at Dallas and Princeton University and The University of Texas at Dallas and The University of Texas at Dallas and Princeton University
Keywords: persistent homology; simplicial neural networks; graph learning; power distribution system; resilience
Abstract:

With the increasing prevalence of cyber-physical threats, the resilience of power systems has emerged as a problem of utmost societal importance. We propose a novel approach for resilience quantification of power distribution networks, based on the notions of persistent homology and simplicial neural networks, one of the newest directions in graph learning. Tools of persistent homology allow us to capture the most essential topological descriptors of the distribution network. In turn, extending the convolution operation to simplicial complexes on the distribution network, using Hodge-Laplacian analytics, enables us to describe complex interactions among multi-node, higher-order graph substructures. Such higher-order graph substructures are of particular importance in distribution networks, since a change in power at a bus produces a corresponding perturbation in nodal variables (such as bus voltages) and edge variables (such as branch currents). We validate our new Higher-Order Topological Neural Networks (HOT-Nets) model on resilience classification of standard power distribution networks. Our results indicate that HOT-Nets substantially outperforms 6 state-of-the-art methods, yiel
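To make the Hodge-Laplacian convolution described above concrete, the following is a minimal sketch, not the authors' implementation, of a simplicial convolution layer acting on edge (1-simplex) features of a distribution network. It builds the 1-Hodge Laplacian L1 = B1^T B1 + B2 B2^T from the node-edge and edge-triangle boundary matrices and applies it to branch-level features; all names, shapes, and the toy 3-bus loop are illustrative assumptions.

```python
import torch
import torch.nn as nn


class HodgeSimplicialConv(nn.Module):
    """One Hodge-Laplacian convolution over edge (1-simplex) features."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(in_channels, out_channels))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, x: torch.Tensor, L1: torch.Tensor) -> torch.Tensor:
        # x:  (num_edges, in_channels) edge features, e.g. branch currents
        # L1: (num_edges, num_edges) 1-Hodge Laplacian of the network
        return torch.relu(L1 @ x @ self.weight)


def hodge_laplacian_1(B1: torch.Tensor, B2: torch.Tensor) -> torch.Tensor:
    """L1 = B1^T B1 + B2 B2^T.

    B1: (num_nodes, num_edges) signed node-edge incidence matrix
    B2: (num_edges, num_triangles) signed edge-triangle incidence matrix
    """
    return B1.T @ B1 + B2 @ B2.T


if __name__ == "__main__":
    # Hypothetical 3-bus loop: nodes {0,1,2}, edges (0,1), (1,2), (0,2),
    # and one filled triangle (0,1,2).
    B1 = torch.tensor([[-1.0,  0.0, -1.0],
                       [ 1.0, -1.0,  0.0],
                       [ 0.0,  1.0,  1.0]])
    B2 = torch.tensor([[ 1.0],
                       [ 1.0],
                       [-1.0]])
    L1 = hodge_laplacian_1(B1, B2)

    x = torch.randn(3, 4)             # 4 illustrative features per branch
    layer = HodgeSimplicialConv(4, 8)
    print(layer(x, L1).shape)         # torch.Size([3, 8])
```

In this sketch the layer operates on branch-level variables rather than bus-level ones, which is where the higher-order structure mentioned in the abstract enters: the B2 B2^T term couples edges that share a triangle, so a perturbation on one branch can influence features of neighboring branches beyond ordinary node-based graph convolutions.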


Authors who are presenting talks have a * after their name.
