HOOVER: Distributed, Flexible, and Scalable Streaming Graph Processing on OpenSHMEM

Seminar | 336 | 11:00

Max Grossman



Many problems can benefit from being framed as graph processing or graph analytics problems: infectious disease modeling, insider threat detection, fraud prevention, social network analysis, and more. These problems all share a common property: the relationships between entities in these systems are crucial to understanding the overall behavior of the systems themselves. However, those relationships are rarely, if ever, static. As our ability to collect information on them improves (e.g., on financial transactions in fraud prevention), the value added by large-scale, high-performance, dynamic/streaming (rather than static) graph analysis becomes significant.

This talk introduces HOOVER, a distributed software framework for large-scale, dynamic graph modeling and analysis. HOOVER sits on top of OpenSHMEM, a PGAS (partitioned global address space) programming system, and enables users to plug in application-specific logic while it handles all runtime coordination of computation and communication. HOOVER has been demonstrated scaling out to 24,576 cores, and is flexible enough to support a wide range of graph-based applications, including infectious disease modeling and anomaly detection.


Dr. Max Grossman is a research scientist at Rice University with joint positions in the Children’s Environmental Health Initiative and the Habanero Extreme Scale Software Research Lab. Max’s research focuses on scalable and programmable approaches to problems in big data and big compute, particularly for novel large-scale scientific, engineering, or statistical workloads. Max is also a Principal and Co-Founder of 7pod Technologies, where he consults on the research and development of large-scale scientific and machine learning applications, primarily in the energy industry. Max completed a PhD in Computer Science at Rice University in 2017.
