Description
Describe the bug
Can't convert a TensorFlow MNIST SavedModel (saved_model.pb) to an ONNX model.
Urgency
none.
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 18.04): Ubuntu 22.04
- TensorFlow Version: 2.18.0
- Python version: 3.10.12
- ONNX version (if applicable, e.g. 1.11): 1.17.0
- ONNXRuntime version (if applicable, e.g. 1.11): 1.20.0
To Reproduce
pip3 install git+https://github.com/onnx/tensorflow-onnx
python3 -m tf2onnx.convert --saved-model ./ --output model.onnx
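The original report does not say how the SavedModel was produced, so the following is only a sketch of one way to end up with an MNIST saved_model.pb in the current directory (assumption: a small Keras classifier saved with optimizer state under an older TF 2.x release, e.g. 2.12, which matches the add_slot failure seen when loading it with TF 2.18):

```python
# Hypothetical export script (assumption, not part of the original report).
# Saving a compiled/trained Keras model to a directory under older TF 2.x
# writes a SavedModel (saved_model.pb + variables/) that includes optimizer
# slot variables, which newer TF versions may fail to restore on load.
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x_train, y_train, epochs=1, batch_size=128)

# Directory path (not .h5/.keras) => SavedModel format, optimizer state included.
model.save("./")
```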
Additional context
python3 -m tf2onnx.convert --saved-model ./ --output model.onnx
2024-11-09 15:41:08.659796: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:477] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1731138068.674427 22280 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1731138068.678814 22280 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-11-09 15:41:08.694085: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/usr/lib/python3.10/runpy.py:126: RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
W0000 00:00:1731138071.688361 22280 gpu_device.cc:2344] Cannot dlopen some GPU libraries. Please make sure the missing libraries mentioned above are installed properly if you would like to use GPU. Follow the guide at https://www.tensorflow.org/install/gpu for how to download and setup the required libraries for your platform.
Skipping registering GPU devices...
2024-11-09 15:41:11,690 - WARNING - '--tag' not specified for saved_model. Using --tag serve
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/snowuyl/samba/workspace_AI/tensorflow-onnx/tf2onnx/convert.py", line 714, in <module>
    main()
  File "/home/snowuyl/samba/workspace_AI/tensorflow-onnx/tf2onnx/convert.py", line 242, in main
    graph_def, inputs, outputs, initialized_tables, tensors_to_rename = tf_loader.from_saved_model(
  File "/home/snowuyl/samba/workspace_AI/tensorflow-onnx/tf2onnx/tf_loader.py", line 636, in from_saved_model
    _from_saved_model_v2(model_path, input_names, output_names,
  File "/home/snowuyl/samba/workspace_AI/tensorflow-onnx/tf2onnx/tf_loader.py", line 570, in _from_saved_model_v2
    imported = tf.saved_model.load(model_path, tags=tag)  # pylint: disable=no-value-for-parameter
  File "/home/snowuyl/.local/lib/python3.10/site-packages/tensorflow/python/saved_model/load.py", line 912, in load
    result = load_partial(export_dir, None, tags, options)["root"]
  File "/home/snowuyl/.local/lib/python3.10/site-packages/tensorflow/python/saved_model/load.py", line 1042, in load_partial
    loader = Loader(object_graph_proto, saved_model_proto, export_dir,
  File "/home/snowuyl/.local/lib/python3.10/site-packages/tensorflow/python/saved_model/load.py", line 223, in __init__
    self._load_all()
  File "/home/snowuyl/.local/lib/python3.10/site-packages/tensorflow/python/saved_model/load.py", line 320, in _load_all
    self._load_nodes()
  File "/home/snowuyl/.local/lib/python3.10/site-packages/tensorflow/python/saved_model/load.py", line 529, in _load_nodes
    slot_variable = optimizer_object.add_slot(
AttributeError: '_UserObject' object has no attribute 'add_slot'
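Note that the last tf2onnx frame in the traceback (tf_loader.py line 570) only calls tf.saved_model.load, and the error is raised inside TensorFlow's own SavedModel loader while restoring optimizer slot variables. A minimal sketch to check whether the failure reproduces outside tf2onnx, assuming the same working directory as the convert command:

```python
# Sketch: load the SavedModel directly with TF 2.18, bypassing tf2onnx.
# Based on the traceback above, this is expected to raise the same
# AttributeError: '_UserObject' object has no attribute 'add_slot'.
import tensorflow as tf

imported = tf.saved_model.load("./", tags=["serve"])
```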