Use dim_t instead of size_t (pytorch#3913)
Summary: Fix build
Pull Request resolved: pytorch#3913


Differential Revision: D19085746

Pulled By: jackm321

fbshipit-source-id: d8a464963100af40d51957928f858820f917c10d
jackm321 authored and facebook-github-bot committed Dec 16, 2019
1 parent 19988e7 commit bdb74bd
Showing 1 changed file with 1 addition and 2 deletions: torch_glow/src/PyTorchModelLoader.cpp
@@ -2016,8 +2016,7 @@ Error PyTorchModelLoader::loadReshape(const torch::jit::Node *ptNode) {
   }
 
   return addValueMapping(
-      outputs[0],
-      F_.createReshape("reshape", input, castVector<size_t>(shape)));
+      outputs[0], F_.createReshape("reshape", input, castVector<dim_t>(shape)));
 }
 
 Error PyTorchModelLoader::loadRelu(const torch::jit::Node *ptNode) {
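For context, the snippet below is a minimal, self-contained sketch of the conversion pattern this one-line fix relies on. It is not the actual Glow source: the dim_t alias, the castVector helper, and the sample shape values are assumptions made for illustration. The diff suggests that the reshape-creation call expects dimensions as dim_t rather than size_t, which is presumably why passing castVector<size_t>(shape) broke the build.

// Illustrative sketch only; not the real Glow definitions.
#include <cstdint>
#include <vector>

// Assumption: Glow's dim_t is an unsigned integer type used for tensor
// dimensions (the real typedef lives in Glow's headers and its width can
// depend on build configuration).
using dim_t = uint64_t;

// Assumption: castVector static_casts each element into a vector of the
// requested type, e.g. from the int64_t shape values produced by the
// PyTorch JIT. The real helper in the torch_glow sources may differ.
template <typename To, typename From>
std::vector<To> castVector(const std::vector<From> &in) {
  std::vector<To> out;
  out.reserve(in.size());
  for (const auto &v : in) {
    out.push_back(static_cast<To>(v));
  }
  return out;
}

int main() {
  // Shape values coming out of torch::jit are typically int64_t.
  const std::vector<int64_t> shape = {1, 3, 224, 224};

  // Converting to dim_t (instead of size_t) matches the dimension type the
  // reshape call in this commit expects, which is what fixes the build when
  // size_t and dim_t are not the same type.
  const std::vector<dim_t> glowShape = castVector<dim_t>(shape);

  return glowShape.size() == 4 ? 0 : 1;
}

On configurations where size_t and dim_t happen to have the same width, the old code may still compile, which is why a mismatch like this can surface only on some builds.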
