
Commit: fix s3 link (dmlc#1310)
VoVAllen authored Mar 4, 2020
1 parent 349a48b commit c23a61b
Showing 24 changed files with 58 additions and 58 deletions.
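Nearly every hunk below is the same mechanical substitution: links under `https://s3.us-east-2.amazonaws.com/dgl.ai/` (plus a couple of stragglers on other S3 hosts, such as the `ml-1m.tar.gz` and `dgl-data` links) are repointed at `https://data.dgl.ai/`. A change of this shape can be applied across a checkout with a short script; the sketch below is illustrative only, not part of the commit, and the file-extension filter is just the set of extensions touched here.

```python
# Illustrative sketch, not part of this commit: repoint the most common
# old dgl.ai S3 prefix at data.dgl.ai across a repository checkout.
# Links on other hosts (e.g. dgl-data on us-west-2) would need their own rules.
import pathlib

OLD = "https://s3.us-east-2.amazonaws.com/dgl.ai/"
NEW = "https://data.dgl.ai/"

def rewrite_links(repo_root="."):
    for path in pathlib.Path(repo_root).rglob("*"):
        # The files touched in this commit are .py, .md and .sh.
        if not path.is_file() or path.suffix not in {".py", ".md", ".sh"}:
            continue
        text = path.read_text(encoding="utf-8")
        if OLD in text:
            path.write_text(text.replace(OLD, NEW), encoding="utf-8")
            print("updated", path)

if __name__ == "__main__":
    rewrite_links()
```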
10 changes: 5 additions & 5 deletions apps/kg/README.md
@@ -48,11 +48,11 @@ DGL-KE provides five knowledge graphs:

| Dataset | #nodes | #edges | #relations |
|---------|--------|--------|------------|
| [FB15k](https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/FB15k.zip) | 14951 | 592213 | 1345 |
| [FB15k-237](https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/FB15k-237.zip) | 14541 | 310116 | 237 |
| [wn18](https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/wn18.zip) | 40943 | 151442 | 18 |
| [wn18rr](https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/wn18rr.zip) | 40943 | 93003 | 11 |
| [Freebase](https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/Freebase.zip) | 86054151 | 338586276 | 14824 |
| [FB15k](https://data.dgl.ai/dataset/FB15k.zip) | 14951 | 592213 | 1345 |
| [FB15k-237](https://data.dgl.ai/dataset/FB15k-237.zip) | 14541 | 310116 | 237 |
| [wn18](https://data.dgl.ai/dataset/wn18.zip) | 40943 | 151442 | 18 |
| [wn18rr](https://data.dgl.ai/dataset/wn18rr.zip) | 40943 | 93003 | 11 |
| [Freebase](https://data.dgl.ai/dataset/Freebase.zip) | 86054151 | 338586276 | 14824 |

Users can specify one of the datasets with `--dataset` in `train.py` and `eval.py`.

4 changes: 2 additions & 2 deletions apps/kg/dataloader/KGDataset.py
@@ -38,7 +38,7 @@ class KGDataset1:
The triples are stored as 'head_name\trelation_name\ttail_name'.
'''
def __init__(self, path, name, read_triple=True, only_train=False):
url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/{}.zip'.format(name)
url = 'https://data.dgl.ai/dataset/{}.zip'.format(name)

if not os.path.exists(os.path.join(path, name)):
print('File not found. Downloading from', url)
@@ -105,7 +105,7 @@ class KGDataset2:
The triples are stored as 'head_nid\trelation_id\ttail_nid'.
'''
def __init__(self, path, name, read_triple=True, only_train=False):
url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/{}.zip'.format(name)
url = 'https://data.dgl.ai/dataset/{}.zip'.format(name)

if not os.path.exists(os.path.join(path, name)):
print('File not found. Downloading from', url)
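Both `KGDataset1` and `KGDataset2` wrap the rewritten URL in the same download-if-missing check; what happens after the check is elided from these hunks. A stdlib-only sketch of that pattern, assuming a plain zip download and extraction (the real classes may do this differently):

```python
# Minimal, stdlib-only sketch of the download-if-missing pattern shown above.
# The URL format string is taken from the diff; the download/extract steps
# are an assumption, since the hunks cut off before them.
import os
import urllib.request
import zipfile

def ensure_dataset(path, name):
    dest = os.path.join(path, name)
    if os.path.exists(dest):
        return dest
    url = 'https://data.dgl.ai/dataset/{}.zip'.format(name)
    print('File not found. Downloading from', url)
    os.makedirs(path, exist_ok=True)
    zip_path = os.path.join(path, name + '.zip')
    urllib.request.urlretrieve(url, zip_path)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(path)
    return dest
```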
4 changes: 2 additions & 2 deletions docker/README.md
@@ -17,12 +17,12 @@ docker build -t dgl-lint -f Dockerfile.ci_lint .

### CPU image for kg
```bash
wget https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/FB15k.zip -P install/
wget https://data.dgl.ai/dataset/FB15k.zip -P install/
docker build -t dgl-cpu:torch-1.2.0 -f Dockerfile.ci_cpu_torch_1.2.0 .
```

### GPU image for kg
```bash
wget https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/FB15k.zip -P install/
wget https://data.dgl.ai/dataset/FB15k.zip -P install/
docker build -t dgl-gpu:torch-1.2.0 -f Dockerfile.ci_gpu_torch_1.2.0 .
```
4 changes: 2 additions & 2 deletions examples/pytorch/graphwriter/README.md
@@ -39,6 +39,6 @@ We repeat the experiment five times.
### Examples

We also provide the output of our implementation on test set together with the reference text.
- [GraphWriter's output](https://s3.us-east-2.amazonaws.com/dgl.ai/models/graphwriter/tmp_pred.txt)
- [Reference text](https://s3.us-east-2.amazonaws.com/dgl.ai/models/graphwriter/tmp_gold.txt)
- [GraphWriter's output](https://data.dgl.ai/models/graphwriter/tmp_pred.txt)
- [Reference text](https://data.dgl.ai/models/graphwriter/tmp_gold.txt)

2 changes: 1 addition & 1 deletion examples/pytorch/graphwriter/prepare_data.sh
@@ -1,3 +1,3 @@
wget https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/AGENDA.tar.gz
wget https://data.dgl.ai/dataset/AGENDA.tar.gz
mkdir data
tar -C data/ -xvzf AGENDA.tar.gz
2 changes: 1 addition & 1 deletion examples/pytorch/metapath2vec/download.py
@@ -24,7 +24,7 @@ class AminerDataset(object):
"""
def __init__(self, path):

self.url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/aminer.zip'
self.url = 'https://data.dgl.ai/dataset/aminer.zip'

if not os.path.exists(os.path.join(path, 'aminer.txt')):
print('File not found. Downloading from', self.url)
@@ -124,11 +124,11 @@ directory, with three statistics logged in `generation_stats.txt` under `eval_re

We also provide a jupyter notebook where you can visualize the generated molecules

![](https://s3.us-east-2.amazonaws.com/dgl.ai/model_zoo/drug_discovery/dgmg/DGMG_ZINC_canonical_vis.png)
![](https://data.dgl.ai/model_zoo/drug_discovery/dgmg/DGMG_ZINC_canonical_vis.png)

and compare their property distributions against the training molecule property distributions

![](https://s3.us-east-2.amazonaws.com/dgl.ai/model_zoo/drug_discovery/dgmg/DGMG_ZINC_canonical_dist.png)
![](https://data.dgl.ai/model_zoo/drug_discovery/dgmg/DGMG_ZINC_canonical_dist.png)

You can download the notebook with `wget https://data.dgl.ai/dgllife/dgmg/eval_jupyter.ipynb`.

4 changes: 2 additions & 2 deletions examples/pytorch/model_zoo/chem/property_prediction/README.md
@@ -111,9 +111,9 @@ on the training and validation set for reference.

[8] visualizes the weights of atoms in readout for possible interpretations like the figure below.
We provide a jupyter notebook for performing the visualization and you can download it with
`wget https://s3.us-east-2.amazonaws.com/dgl.ai/model_zoo/drug_discovery/AttentiveFP/atom_weight_visualization.ipynb`.
`wget https://data.dgl.ai/model_zoo/drug_discovery/AttentiveFP/atom_weight_visualization.ipynb`.

![](https://s3.us-west-2.amazonaws.com/dgl-data/dgllife/attentive_fp_vis_example.png)
![](https://data.dgl.ai/dgllife/attentive_fp_vis_example.png)

## Dataset Customization

2 changes: 1 addition & 1 deletion examples/pytorch/pointcloud/main.py
@@ -28,7 +28,7 @@
local_path = args.dataset_path or os.path.join(get_download_dir(), data_filename)

if not os.path.exists(local_path):
download('https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/modelnet40-sampled-2048.h5', local_path)
download('https://data.dgl.ai/dataset/modelnet40-sampled-2048.h5', local_path)

CustomDataLoader = partial(
DataLoader,
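The hunk above is cut off mid-call, but the `partial(DataLoader, ...)` idiom it starts pre-binds shared loader settings so the script can build several loaders from one configuration. A standalone illustration of the idiom follows; the bound arguments and dataset here are hypothetical, not taken from `main.py`.

```python
# Hypothetical illustration of the functools.partial idiom started above:
# bind shared DataLoader settings once, then reuse them per dataset split.
from functools import partial

import torch
from torch.utils.data import DataLoader, TensorDataset

CustomDataLoader = partial(DataLoader, batch_size=32, shuffle=True, num_workers=0)

train_set = TensorDataset(torch.randn(256, 3), torch.randint(0, 2, (256,)))
train_loader = CustomDataLoader(train_set)
test_loader = CustomDataLoader(train_set, shuffle=False)  # keyword overrides still work
```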
2 changes: 1 addition & 1 deletion examples/pytorch/recommendation/README.md
@@ -4,7 +4,7 @@ NOTE: this version is not using NodeFlow yet.

This example only work with Python 3.6+

First, download and extract from https://dgl.ai.s3.us-east-2.amazonaws.com/dataset/ml-1m.tar.gz
First, download and extract from https://data.dgl.ai/dataset/ml-1m.tar.gz

One can then run the following to train PinSage on MovieLens-1M:

2 changes: 1 addition & 1 deletion examples/pytorch/rrn/sudoku_data.py
@@ -56,7 +56,7 @@ def __len__(self):

def _get_sudoku_dataset(segment='train'):
assert segment in ['train', 'valid', 'test']
url = "https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/sudoku-hard.zip"
url = "https://data.dgl.ai/dataset/sudoku-hard.zip"
zip_fname = "/tmp/sudoku-hard.zip"
dest_dir = '/tmp/sudoku-hard/'

2 changes: 1 addition & 1 deletion examples/pytorch/rrn/sudoku_solver.py
@@ -20,7 +20,7 @@ def solve_sudoku(puzzle):
model_filename = os.path.join(model_path, 'rrn-sudoku.pkl')
if not os.path.exists(model_filename):
print('Downloading model...')
url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/models/rrn-sudoku.pkl'
url = 'https://data.dgl.ai/models/rrn-sudoku.pkl'
urllib.request.urlretrieve(url, model_filename)

model = torch.load(model_filename, map_location='cpu')
4 changes: 2 additions & 2 deletions examples/pytorch/transformer/dataset/utils.py
@@ -4,8 +4,8 @@
from dgl.data.utils import *

_urls = {
'wmt': 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/wmt14bpe_de_en.zip',
'scripts': 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/transformer_scripts.zip',
'wmt': 'https://data.dgl.ai/dataset/wmt14bpe_de_en.zip',
'scripts': 'https://data.dgl.ai/dataset/transformer_scripts.zip',
}

def prepare_dataset(dataset_name):
2 changes: 1 addition & 1 deletion python/dgl/contrib/sampling/sampler.py
@@ -224,7 +224,7 @@ class NeighborSampler(NodeFlowSampler):
layer :math:`i+1` are in layer :math:`i`. All the edges are from nodes
in layer :math:`i` to layer :math:`i+1`.
.. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/NodeFlow.png
.. image:: https://data.dgl.ai/tutorial/sampling/NodeFlow.png
As an analogy to mini-batch training, the ``batch_size`` here is equal to the number
of the initial seed nodes (number of nodes in the last layer).
2 changes: 1 addition & 1 deletion python/dgl/nodeflow.py
@@ -82,7 +82,7 @@ class NodeFlow(DGLBaseGraph):
We store extra information, such as the node and edge mapping from
the NodeFlow graph to the parent graph.
.. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/api/sampling.nodeflow.png
.. image:: https://data.dgl.ai/api/sampling.nodeflow.png
DO NOT create NodeFlow object directly. Use sampling method to
generate NodeFlow instead.
8 changes: 4 additions & 4 deletions tutorials/basics/1_first.py
@@ -32,7 +32,7 @@
# 33). The network is visualized as follows with the color indicating the
# community:
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/img/karate-club.png
# .. image:: https://data.dgl.ai/tutorial/img/karate-club.png
# :align: center
#
# The task is to predict which side (0 or 33) each member tends to join given
@@ -135,7 +135,7 @@ def build_karate_club_graph():
# node will update its own feature with information sent from neighboring
# nodes. A graphical demonstration is displayed below.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/1_first/mailbox.png
# .. image:: https://data.dgl.ai/tutorial/1_first/mailbox.png
# :alt: mailbox
# :align: center
#
@@ -266,7 +266,7 @@ def draw(i):
plt.close()

###############################################################################
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/1_first/karate0.png
# .. image:: https://data.dgl.ai/tutorial/1_first/karate0.png
# :height: 300px
# :width: 400px
# :align: center
@@ -278,7 +278,7 @@ def draw(i):
ani = animation.FuncAnimation(fig, draw, frames=len(all_logits), interval=200)

###############################################################################
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/1_first/karate.gif
# .. image:: https://data.dgl.ai/tutorial/1_first/karate.gif
# :height: 300px
# :width: 400px
# :align: center
12 changes: 6 additions & 6 deletions tutorials/basics/4_batch.py
@@ -30,7 +30,7 @@
# In this tutorial, you learn how to perform batched graph classification
# with DGL. The example task objective is to classify eight types of topologies shown here.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/dataset_overview.png
# .. image:: https://data.dgl.ai/tutorial/batch/dataset_overview.png
# :align: center
#
# Implement a synthetic dataset :class:`data.MiniGCDataset` in DGL. The dataset has eight
@@ -64,7 +64,7 @@
# a batch of graphs can be viewed as a large graph that has many disjointed
# connected components. Below is a visualization that gives the general idea.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/batch.png
# .. image:: https://data.dgl.ai/tutorial/batch/batch.png
# :width: 400pt
# :align: center
#
@@ -91,7 +91,7 @@ def collate(samples):
# ----------------
# Graph classification proceeds as follows.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/graph_classifier.png
# .. image:: https://data.dgl.ai/tutorial/batch/graph_classifier.png
#
# From a batch of graphs, perform message passing and graph convolution
# for nodes to communicate with others. After message passing, compute a
@@ -254,16 +254,16 @@ def forward(self, g):
###############################################################################
# The animation here plots the probability that a trained model predicts the correct graph type.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/test_eval4.gif
# .. image:: https://data.dgl.ai/tutorial/batch/test_eval4.gif
#
# To understand the node and graph representations that a trained model learned,
# we use `t-SNE, <https://lvdmaaten.github.io/tsne/>`_ for dimensionality reduction
# and visualization.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/tsne_node2.png
# .. image:: https://data.dgl.ai/tutorial/batch/tsne_node2.png
# :align: center
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/batch/tsne_graph2.png
# .. image:: https://data.dgl.ai/tutorial/batch/tsne_graph2.png
# :align: center
#
# The two small figures on the top separately visualize node representations after one and two
8 changes: 4 additions & 4 deletions tutorials/basics/5_hetero.py
@@ -46,7 +46,7 @@
# The following diagram shows several entities in the ACM dataset and the relationships among them
# (taken from `Shi et al., 2015 <https://arxiv.org/pdf/1511.04854.pdf>`_).
#
# .. figure:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/hetero/acm-example.png#
# .. figure:: https://data.dgl.ai/tutorial/hetero/acm-example.png#
#
# This graph has three types of entities that correspond to papers, authors, and publication venues.
# It also contains three types of edges that connect the following:
@@ -70,7 +70,7 @@
# marked with a rating, then each rating value could correspond to a different edge type.
# The following diagram shows an example of user-item interactions as a heterograph.
#
# .. figure:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/hetero/recsys-example.png
# .. figure:: https://data.dgl.ai/tutorial/hetero/recsys-example.png
#
#
# Knowledge graph
@@ -81,7 +81,7 @@
# occupation (item P106) is politician (item Q82955). The relationships are shown in the following.
# diagram.
#
# .. figure:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/hetero/kg-example.png
# .. figure:: https://data.dgl.ai/tutorial/hetero/kg-example.png
#

###############################################################################
@@ -144,7 +144,7 @@
import scipy.io
import urllib.request

data_url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/ACM.mat'
data_url = 'https://data.dgl.ai/dataset/ACM.mat'
data_file_path = '/tmp/ACM.mat'

urllib.request.urlretrieve(data_url, data_file_path)
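The hunk ends right after the download; the step that naturally follows in the tutorial is loading the `.mat` file with `scipy.io`, which is already imported above. A short sketch; the key listing is illustrative, not taken from this diff.

```python
# Sketch: load the downloaded ACM matrix file and list the matrices it holds.
import scipy.io

data_file_path = '/tmp/ACM.mat'  # same path as in the hunk above
data = scipy.io.loadmat(data_file_path)
print(sorted(k for k in data if not k.startswith('__')))
```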
6 changes: 3 additions & 3 deletions tutorials/models/1_gnn/8_sse_mx.py
@@ -567,6 +567,6 @@ def test(g, test_nodes, predictor):
#
# For full examples, see `Benchmark SSE on multi-GPUs <https://github.com/dmlc/dgl/tree/master/examples/mxnet/sse>`_ on Github.
#
# .. |image0| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/img/floodfill-paths.gif
# .. |image1| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/img/neighbor-sampling.gif
# .. |image2| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/img/sse.gif
# .. |image0| image:: https://data.dgl.ai/tutorial/img/floodfill-paths.gif
# .. |image1| image:: https://data.dgl.ai/tutorial/img/neighbor-sampling.gif
# .. |image2| image:: https://data.dgl.ai/tutorial/img/sse.gif
18 changes: 9 additions & 9 deletions tutorials/models/1_gnn/9_gat.py
@@ -55,7 +55,7 @@
# embedding :math:`h_i^{(l+1)}` of layer :math:`l+1` from the embeddings of
# layer :math:`l`.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/gat.png
# .. image:: https://data.dgl.ai/tutorial/gat/gat.png
# :width: 450px
# :align: center
#
@@ -355,7 +355,7 @@ def load_cora_data():
# to their labels, whereas the edges are colored according to the magnitude of
# the attention weights, which can be referred with the colorbar on the right.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/cora-attention.png
# .. image:: https://data.dgl.ai/tutorial/gat/cora-attention.png
# :width: 600px
# :align: center
#
@@ -383,7 +383,7 @@ def load_cora_data():
#
# As a reference, here is the histogram if all the nodes have uniform attention weight distribution.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/cora-attention-uniform-hist.png
# .. image:: https://data.dgl.ai/tutorial/gat/cora-attention-uniform-hist.png
# :width: 250px
# :align: center
#
@@ -453,7 +453,7 @@ def load_cora_data():
# learning curves of GAT and GCN are presented below; what is evident is the
# dramatic performance adavantage of GAT over GCN.
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/ppi-curve.png
# .. image:: https://data.dgl.ai/tutorial/gat/ppi-curve.png
# :width: 300px
# :align: center
#
@@ -475,7 +475,7 @@ def load_cora_data():
#
# Again, comparing with uniform distribution:
#
# .. image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/ppi-uniform-hist.png
# .. image:: https://data.dgl.ai/tutorial/gat/ppi-uniform-hist.png
# :width: 250px
# :align: center
#
@@ -502,7 +502,7 @@ def load_cora_data():
# * See the optimized `full example <https://github.com/dmlc/dgl/blob/master/examples/pytorch/gat/gat.py>`_.
# * The next tutorial describes how to speedup GAT models by parallelizing multiple attention heads and SPMV optimization.
#
# .. |image2| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/cora-attention-hist.png
# .. |image5| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/ppi-first-layer-hist.png
# .. |image6| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/ppi-second-layer-hist.png
# .. |image7| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/gat/ppi-final-layer-hist.png
# .. |image2| image:: https://data.dgl.ai/tutorial/gat/cora-attention-hist.png
# .. |image5| image:: https://data.dgl.ai/tutorial/gat/ppi-first-layer-hist.png
# .. |image6| image:: https://data.dgl.ai/tutorial/gat/ppi-second-layer-hist.png
# .. |image7| image:: https://data.dgl.ai/tutorial/gat/ppi-final-layer-hist.png
2 changes: 1 addition & 1 deletion tutorials/models/3_generative_model/5_dgmg.py
@@ -715,7 +715,7 @@ def get_log_prob(self):
import torch.utils.model_zoo as model_zoo

# Download a pre-trained model state dict for generating cycles with 10-20 nodes.
state_dict = model_zoo.load_url('https://s3.us-east-2.amazonaws.com/dgl.ai/model/dgmg_cycles-5a0c40be.pth')
state_dict = model_zoo.load_url('https://data.dgl.ai/model/dgmg_cycles-5a0c40be.pth')
model = DGMG(v_max=20, node_hidden_size=16, num_prop_rounds=2)
model.load_state_dict(state_dict)
model.eval()
2 changes: 1 addition & 1 deletion tutorials/models/4_old_wines/7_transformer.py
@@ -875,6 +875,6 @@
#
# .. note::
# The notebook itself is not executable due to many dependencies.
# Download `7_transformer.py <https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/7_transformer.py>`__,
# Download `7_transformer.py <https://data.dgl.ai/tutorial/7_transformer.py>`__,
# and copy the python script to directory ``examples/pytorch/transformer``
# then run ``python 7_transformer.py`` to see how it works.
4 changes: 2 additions & 2 deletions tutorials/models/5_giant_graph/1_sampling_mx.py
@@ -419,6 +419,6 @@ def forward(self, nf):
# advantages of this API are 1) simplicity, 2) allowing more system-level
# optimization in the future.
#
# .. |image0| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/NodeFlow.png
# .. |image1| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/sampling_result.png
# .. |image0| image:: https://data.dgl.ai/tutorial/sampling/NodeFlow.png
# .. |image1| image:: https://data.dgl.ai/tutorial/sampling/sampling_result.png
#
6 changes: 3 additions & 3 deletions tutorials/models/5_giant_graph/2_giant.py
@@ -356,7 +356,7 @@
# We can see that DGL can scale to graphs with up to 500M nodes and 25B
# edges.
#
# .. |image0| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/arch.png
# .. |image1| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/NUMA_speedup.png
# .. |image2| image:: https://s3.us-east-2.amazonaws.com/dgl.ai/tutorial/sampling/whole_speedup.png
# .. |image0| image:: https://data.dgl.ai/tutorial/sampling/arch.png
# .. |image1| image:: https://data.dgl.ai/tutorial/sampling/NUMA_speedup.png
# .. |image2| image:: https://data.dgl.ai/tutorial/sampling/whole_speedup.png
#
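Every rewritten link above points at the `data.dgl.ai` host. A quick way to sanity-check a migration like this is to issue HEAD requests against a sample of the new URLs; the sketch below is illustrative only and not part of the commit.

```python
# Illustrative sketch, not part of this commit: spot-check that a few of the
# rewritten data.dgl.ai links respond. Some servers reject HEAD requests, so
# treat an error as something to inspect, not as proof the link is gone.
import urllib.error
import urllib.request

def check(url, timeout=10):
    req = urllib.request.Request(url, method='HEAD')
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.URLError as err:
        return err

for url in [
    'https://data.dgl.ai/dataset/FB15k.zip',
    'https://data.dgl.ai/dataset/ACM.mat',
    'https://data.dgl.ai/tutorial/gat/gat.png',
]:
    print(check(url), url)
```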
