
Cannot convert TF-Text Tokenizer to TensorRT #486

Open
dshahrokhian opened this issue Jan 7, 2021 · 4 comments

Comments

@dshahrokhian

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 16.04
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device:
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 2.4.0
  • Python version: 3.6
  • Bazel version (if compiling from source):
  • GCC/Compiler version (if compiling from source):
  • CUDA/cuDNN version: 11.0/7.6
  • GPU model and memory: RTX 2080

Describe the current behavior
Whenever I try to convert a model that contains the tokenizer as a subgraph, the conversion fails with an error.

Describe the expected behavior
The model should convert successfully.

Standalone code to reproduce the issue
https://colab.research.google.com/drive/1_S37VihkTZ1B0HgjW8D7DMZcI8nwz2Bk
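Based on the traceback below, the failing call is presumably a standard TF-TRT conversion along these lines (a sketch, not the original Colab code; the saved-model directory is a placeholder):

```python
# Sketch of the TF-TRT conversion that triggers the error, assuming a
# SavedModel whose serving signature includes a tensorflow_text tokenizer.
from tensorflow.python.compiler.tensorrt import trt_convert as trt

def convert_saved_model(saved_model_dir):
    """Attempt TF-TRT conversion of a SavedModel.

    Fails with InvalidArgumentError when the graph holds DT_RESOURCE
    tensors (e.g. the tokenizer's lookup table), because
    convert_variables_to_constants_v2 cannot freeze resource handles.
    """
    converter = trt.TrtGraphConverterV2(input_saved_model_dir=saved_model_dir)
    converter.convert()  # raises InvalidArgumentError for resource tensors
    return converter
```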

Other info / logs

---------------------------------------------------------------------------
InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-7-e780e9ed6737> in <module>
      4 )
      5 
----> 6 converter.convert()

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/compiler/tensorrt/trt_convert.py in convert(self, calibration_input_fn)
   1094                                   self._input_saved_model_tags)
   1095     func = self._saved_model.signatures[self._input_saved_model_signature_key]
-> 1096     frozen_func = convert_to_constants.convert_variables_to_constants_v2(func)
   1097     grappler_meta_graph_def = saver.export_meta_graph(
   1098         graph_def=frozen_func.graph.as_graph_def(), graph=frozen_func.graph)

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/framework/convert_to_constants.py in convert_variables_to_constants_v2(func, lower_control_flow, aggressive_inlining)
   1069       func=func,
   1070       lower_control_flow=lower_control_flow,
-> 1071       aggressive_inlining=aggressive_inlining)
   1072 
   1073   output_graph_def, converted_input_indices = _replace_variables_by_constants(

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/framework/convert_to_constants.py in __init__(self, func, lower_control_flow, aggressive_inlining, variable_names_allowlist, variable_names_denylist)
    804         variable_names_allowlist=variable_names_allowlist,
    805         variable_names_denylist=variable_names_denylist)
--> 806     self._build_tensor_data()
    807 
    808   def _build_tensor_data(self):

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/framework/convert_to_constants.py in _build_tensor_data(self)
    823         data = map_index_to_variable[idx].numpy()
    824       else:
--> 825         data = val_tensor.numpy()
    826       self._tensor_data[tensor_name] = _TensorData(
    827           numpy=data,

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in numpy(self)
   1069     """
   1070     # TODO(slebedev): Consider avoiding a copy for non-CPU or remote tensors.
-> 1071     maybe_arr = self._numpy()  # pylint: disable=protected-access
   1072     return maybe_arr.copy() if isinstance(maybe_arr, np.ndarray) else maybe_arr
   1073 

~/anaconda3/envs/ds/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in _numpy(self)
   1037       return self._numpy_internal()
   1038     except core._NotOkStatusException as e:  # pylint: disable=protected-access
-> 1039       six.raise_from(core._status_to_exception(e.code, e.message), None)  # pylint: disable=protected-access
   1040 
   1041   @property

~/anaconda3/envs/ds/lib/python3.6/site-packages/six.py in raise_from(value, from_value)

InvalidArgumentError: Cannot convert a Tensor of dtype resource to a NumPy array.
@markomernick
Member

Hi @dshahrokhian - thanks for this report.

This seems like a limitation of TensorRT when it comes to converting lookup tables (or other objects that require DT_RESOURCE tensors). Do you know if TensorRT has a way to support these types of objects?
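The DT_RESOURCE limitation can be reproduced without TensorRT at all: in TF 2.x eager mode, the handle backing a `tf.lookup` table is a resource-dtype tensor, and materializing it as a NumPy array raises the same error seen in the traceback (a minimal sketch, not from the original Colab):

```python
import tensorflow as tf

# Minimal sketch (TF 2.x, eager mode): a lookup table is backed by a
# DT_RESOURCE handle, which cannot be materialized as a NumPy array.
table = tf.lookup.StaticHashTable(
    tf.lookup.KeyValueTensorInitializer(["a", "b"], [0, 1]),
    default_value=-1)

print(table.resource_handle.dtype)  # the handle's dtype is 'resource'

try:
    table.resource_handle.numpy()
except tf.errors.InvalidArgumentError as e:
    # Expected to raise the same error class as in the traceback above:
    # "Cannot convert a Tensor of dtype resource to a NumPy array."
    print(type(e).__name__)
```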

@dshahrokhian
Author

Hey @markomernick, thanks for the response. I am not sure, but given the traceback I assumed this had more to do with TF-TRT than with TensorRT itself.

Will create an issue on their page too and reference it.

@thuang513
Member

@sanjoy can you help take a look at this issue?

@sanjoy

sanjoy commented Feb 12, 2021

CC @bixia1
