Commit

[Doc] Minor fix on the distributed training doc. (dmlc#2968)
Co-authored-by: Zheng <[email protected]>
zheng-da and Zheng authored Jun 2, 2021
1 parent f8d6bf8 commit 7a816f4
Showing 1 changed file with 0 additions and 6 deletions.
6 changes: 0 additions & 6 deletions python/dgl/distributed/dist_graph.py
```diff
@@ -422,12 +422,6 @@ class DistGraph:
     ...     labels = g.ndata['labels'][block.dstdata[dgl.NID]]
     ...     pred = model(block, feat)
-    Note
-    ----
-    ``DistGraph`` currently only supports graphs with only one node type and one edge type.
-    For heterogeneous graphs, users need to convert them into DGL graphs with one node type and
-    one edge type and store the actual node types and edge types as node data and edge data.
-
     Note
     ----
     DGL's distributed training by default runs server processes and trainer processes on the same
```
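The removed note describes a workaround for heterogeneous graphs: flatten all node and edge types into one ID space and keep the original type of each node and edge as data arrays. DGL provides `dgl.to_homogeneous` for this; the sketch below is a hypothetical, pure-Python illustration of the idea (the helper name and its dict-based inputs are made up here), not DGL's implementation.

```python
def to_homogeneous(num_nodes_per_type, edges_per_type):
    """Flatten a heterogeneous graph into a single node/edge ID space.

    num_nodes_per_type: dict mapping node-type name -> node count
    edges_per_type: dict mapping (src_type, edge_type, dst_type)
                    -> list of (src_id, dst_id) pairs
    Returns (edges, node_type_of, edge_type_of); the two type arrays
    play the role of the node/edge data the removed note refers to.
    """
    # Give each node type a contiguous block of global IDs
    # (sorted by type name so the layout is deterministic).
    offsets, node_type_of, start = {}, [], 0
    for type_id, (ntype, count) in enumerate(sorted(num_nodes_per_type.items())):
        offsets[ntype] = start
        node_type_of.extend([type_id] * count)
        start += count

    # Remap every typed edge into the global ID space and record
    # which edge type it came from.
    edges, edge_type_of = [], []
    for type_id, (key, pairs) in enumerate(sorted(edges_per_type.items())):
        src_type, _etype, dst_type = key
        for u, v in pairs:
            edges.append((offsets[src_type] + u, offsets[dst_type] + v))
            edge_type_of.append(type_id)
    return edges, node_type_of, edge_type_of


# Example: 2 "user" nodes and 3 "item" nodes with typed "buys" edges.
# Sorted by name, "item" gets global IDs 0..2 and "user" gets 3..4.
edges, ntypes, etypes = to_homogeneous(
    {"user": 2, "item": 3},
    {("user", "buys", "item"): [(0, 1), (1, 2)]},
)
```

In DGL itself, `dgl.to_homogeneous(g)` performs the equivalent conversion and stores the original types in `ndata[dgl.NTYPE]` and `edata[dgl.ETYPE]`.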
