Commit
Reimplementation of Graph Neural Networks with Scatter/Gather layers (LLNL#1933)

* Added LBANN data generator
* Removed extraneous data generator file
* Updated GNN implementations to use 2D scatter/gather
  - Added new ChannelwiseGRU implementation (illustrated in the second sketch after this list)
* Added new implementation of the PROTEINS dataset for sparse scatter/gather-based GNNs
  - Added new implementation of the sparse graph trainer for the new GNN modules with 2D scatter/gather-based message passing (see the first sketch after this list)
  - PROTEINS dataset ready for unit testing of sparse GNNs (GCN, GIN, GatedGraph, and Graph conv modules)
* Updated graph trainer with the new graph neural network calls
  - Added sample graph data slice function
* Added 2D scatter/gather-based implementation of the graph kernel
  - Added distconv-enabled graph kernel for the channelwise fully connected layer based kernel
* Fixed typos and added documentation for the graph modules
  - Updated integration test
  - Integration tests in applications/graph/GNN/test passing
  - Minor change in Reshape.hpp to include the layer name when throwing an error
  - Updated trainer code according to the new sparse data format
* Removed MNIST_Superpixel dataset due to its PyTorch dependency
  - Updated integration tests in applications/graph/GNN/test/ for stable testing of GNNs
  - Added complete distconv support to NNConv
* Updated README with test directions
  - Added ChannelwiseGRU class documentation
* Updated NNConvModel to enable distconv through the command-line interface
* Fixed Channelwise GRU implementation with the correct Slice setup
  - Added layer info to the error message in Reshape
  - Fixed typos in the Python documentation of the implementation

  Co-authored-by: Tim Moon <[email protected]>

* Added Channelwise GRUCell unit test
  - Fixed link to point to the correct PyTorch documentation for GRUCell
* Moved the lbann import in the Bamboo test inside the test function

Co-authored-by: Tim Moon <[email protected]>
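For context, here is a minimal sketch of the scatter/gather message-passing pattern the new GNN kernels are built around: gather source-node features along the edge list, transform them, then scatter-add the resulting messages onto the destination nodes. This is plain NumPy for illustration only, not LBANN's Python front end or the code in this commit; the names (`message_passing_step`, `src`, `dst`) are hypothetical.

```python
# Conceptual sketch (not LBANN code): one round of scatter/gather-based
# message passing over a graph stored as a COO edge list.
import numpy as np

def message_passing_step(node_feats, src, dst, weight):
    """node_feats: (num_nodes, in_channels); src/dst: (num_edges,) edge
    endpoint indices; weight: (in_channels, out_channels)."""
    # Gather: pick out the feature row of each edge's source node and
    # transform it into a per-edge message.
    messages = node_feats[src] @ weight                  # (num_edges, out_channels)
    # Scatter: sum the messages into each destination node's row.
    out = np.zeros((node_feats.shape[0], weight.shape[1]))
    np.add.at(out, dst, messages)                        # scatter-add over edges
    return out

# Tiny example: 3 nodes, 2 directed edges (0 -> 1 and 2 -> 1).
feats = np.arange(6, dtype=float).reshape(3, 2)
src, dst = np.array([0, 2]), np.array([1, 1])
updated = message_passing_step(feats, src, dst, np.eye(2))
```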
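Similarly, a rough sketch of what a channel-wise GRU update does: a single GRUCell update applied to every row (channel/node) of a 2D feature matrix at once, which is the behavior GatedGraph-style kernels need. Again NumPy-only with assumed names and the standard GRUCell formulation, not LBANN's ChannelwiseGRU module.

```python
# Conceptual sketch (not the LBANN ChannelwiseGRU): one GRU step applied
# row-wise to a (num_channels, hidden) feature matrix.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def channelwise_gru_step(x, h, Wi, Wh, bi, bh):
    """x, h: (num_channels, hidden); Wi, Wh: (3*hidden, hidden) with the
    reset/update/new gate weights stacked; bi, bh: (3*hidden,) biases."""
    hid = h.shape[1]
    gi = x @ Wi.T + bi                                    # input contributions
    gh = h @ Wh.T + bh                                    # hidden-state contributions
    r = sigmoid(gi[:, :hid] + gh[:, :hid])                # reset gate
    z = sigmoid(gi[:, hid:2*hid] + gh[:, hid:2*hid])      # update gate
    n = np.tanh(gi[:, 2*hid:] + r * gh[:, 2*hid:])        # candidate state
    return (1.0 - z) * n + z * h                          # updated hidden state
```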