Hi! I'm having the following issue after loading the model.
I save it as follows:
...train loop...
import numpy as np

def dummy_graph():
    x = np.ones((11, 6), dtype=np.float32)    # node features
    a = np.ones((10, 10), dtype=np.float32)   # adjacency matrix
    e = np.ones((100, 1), dtype=np.float32)   # edge features
    i = np.zeros((10,), dtype=np.int64)       # batch index
    return [x, a, e, i]
model(dummy_graph())
model.save("...dir...")
I then load the model as follows:
model = tf.saved_model.load("...dir...")
If I then try to predict with the same dummy_graph(), it works fine:
model(dummy_graph())
But when I try to predict using a graph of a different size, for example a with shape (11, 11), I get the following error:
def dummy_graph_2():
    x = np.ones((11, 6), dtype=np.float32)
    a = np.ones((11, 11), dtype=np.float32)
    e = np.ones((100, 1), dtype=np.float32)
    i = np.zeros((10,), dtype=np.int64)
    return [x, a, e, i]
model(dummy_graph_2())
ERROR:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
/tmp/ipykernel_1600/2299801205.py in <module>
----> 1 model(dummy_graph_2())
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/saved_model/load.py in _call_attribute(instance, *args, **kwargs)
666
667 def _call_attribute(instance, *args, **kwargs):
--> 668 return instance.__call__(*args, **kwargs)
669
670
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py in __call__(self, *args, **kwds)
826 tracing_count = self.experimental_get_tracing_count()
827 with trace.Trace(self._name) as tm:
--> 828 result = self._call(*args, **kwds)
829 compiler = "xla" if self._experimental_compile else "nonXla"
830 new_tracing_count = self.experimental_get_tracing_count()
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py in _call(self, *args, **kwds)
860 # In this case we have not created variables on the first call. So we can
861 # run the first trace but we should fail if variables are created.
--> 862 results = self._stateful_fn(*args, **kwds)
863 if self._created_variables:
864 raise ValueError("Creating variables on a non-first call to a function"
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py in __call__(self, *args, **kwargs)
2939 with self._lock:
2940 (graph_function,
-> 2941 filtered_flat_args) = self._maybe_define_function(args, kwargs)
2942 return graph_function._call_flat(
2943 filtered_flat_args, captured_inputs=graph_function.captured_inputs) # pylint: disable=protected-access
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py in _maybe_define_function(self, args, kwargs)
3359
3360 self._function_cache.missed.add(call_context_key)
-> 3361 graph_function = self._create_graph_function(args, kwargs)
3362 self._function_cache.primary[cache_key] = graph_function
3363
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
3204 arg_names=arg_names,
3205 override_flat_arg_shapes=override_flat_arg_shapes,
-> 3206 capture_by_value=self._capture_by_value),
3207 self._function_attributes,
3208 function_spec=self.function_spec,
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
988 _, original_func = tf_decorator.unwrap(python_func)
989
--> 990 func_outputs = python_func(*func_args, **func_kwargs)
991
992 # invariant: `func_outputs` contains only Tensors, CompositeTensors,
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/eager/def_function.py in wrapped_fn(*args, **kwds)
632 xla_context.Exit()
633 else:
--> 634 out = weak_wrapped_fn().__wrapped__(*args, **kwds)
635 return out
636
~/.cache/pypoetry/virtualenvs/app-XPlGwO1Q-py3.7/lib/python3.7/site-packages/tensorflow/python/saved_model/function_deserialization.py in restored_function_body(*args, **kwargs)
271 .format(_pretty_format_positional(args), kwargs,
272 len(saved_function.concrete_functions),
--> 273 "\n\n".join(signature_descriptions)))
274
275 concrete_function_objects = []
ValueError: Could not find matching function to call loaded from the SavedModel. Got:
Positional arguments (2 total):
* [<tf.Tensor 'inputs:0' shape=(11, 6) dtype=float32>, <tf.Tensor 'inputs_1:0' shape=(11, 11) dtype=float32>, <tf.Tensor 'inputs_2:0' shape=(100, 1) dtype=float32>, <tf.Tensor 'inputs_3:0' shape=(10,) dtype=int64>]
* False
Keyword arguments: {}
Expected these arguments to match one of the following 4 option(s):
Option 1:
Positional arguments (2 total):
* [TensorSpec(shape=(None, 6), dtype=tf.float32, name='input_1'), TensorSpec(shape=(None, 10), dtype=tf.float32, name='input_2'), TensorSpec(shape=(None, 1), dtype=tf.float32, name='input_3'), TensorSpec(shape=(None,), dtype=tf.int64, name='input_4')]
* False
Keyword arguments: {}
Option 2:
Positional arguments (2 total):
* [TensorSpec(shape=(None, 6), dtype=tf.float32, name='input_1'), TensorSpec(shape=(None, 10), dtype=tf.float32, name='input_2'), TensorSpec(shape=(None, 1), dtype=tf.float32, name='input_3'), TensorSpec(shape=(None,), dtype=tf.int64, name='input_4')]
* True
Keyword arguments: {}
Option 3:
Positional arguments (2 total):
* [TensorSpec(shape=(None, 6), dtype=tf.float32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.float32, name='inputs/1'), TensorSpec(shape=(None, 1), dtype=tf.float32, name='inputs/2'), TensorSpec(shape=(None,), dtype=tf.int64, name='inputs/3')]
* False
Keyword arguments: {}
Option 4:
Positional arguments (2 total):
* [TensorSpec(shape=(None, 6), dtype=tf.float32, name='inputs/0'), TensorSpec(shape=(None, 10), dtype=tf.float32, name='inputs/1'), TensorSpec(shape=(None, 1), dtype=tf.float32, name='inputs/2'), TensorSpec(shape=(None,), dtype=tf.int64, name='inputs/3')]
* True
Keyword arguments: {}
What I interpret from this message is that the model expects a to have shape (None, 10), but I want this model to be able to predict on graphs of all sizes.
Can you help me find the cause of this problem? Thanks!
The second dummy graph is not correct: it still has only 10 nodes in the batch index i and only 100 edges, even though the adjacency matrix now describes 11 nodes. With a dense 11 x 11 adjacency matrix, the edge features must have 11 * 11 = 121 rows and the batch index must have 11 entries. It should be:
def dummy_graph_2():
    x = np.ones((11, 6), dtype=np.float32)
    a = np.ones((11, 11), dtype=np.float32)
    e = np.ones((121, 1), dtype=np.float32)
    i = np.zeros((11,), dtype=np.int64)
    return [x, a, e, i]
Anyway, saving the entire model is a bit broken at the moment; the recommended way is to save only the model's weights and re-create the model from scratch using its constructor (a sketch of that workflow is below).
Sorry about that!
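For reference, here is a minimal sketch of that weights-only workflow. It assumes the model is a tf.keras model built by some constructor call (MyGNN(...) is a hypothetical name standing in for whatever class was used above), and that one forward pass is needed to build the variables before the weights can be loaded:

# After training: save only the weights (path is illustrative).
model.save_weights("...dir.../weights")

# Later, in a fresh session: re-create the architecture from scratch.
model = MyGNN(...)            # hypothetical constructor with the same hyperparameters
model(dummy_graph())          # one forward pass to build the variables
model.load_weights("...dir.../weights")

# The rebuilt model is not restricted to the input signatures traced into a
# SavedModel, so it can be called on graphs of other sizes (with consistent shapes).
model(dummy_graph_2())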