Issue while running GraphModel predict #8508

Open
@sivarajakani

Description

I'm encountering an undefined-variable error while running inference with a GraphModel in TensorFlow.js. The failure occurs in tfjs-converter/src/operations/executors/utils.ts#L138.

System Information:

  • OS: Ubuntu 24
  • TensorFlow.js Version: v4.22.0 / v4.5.0 (installed via npm)
  • Browser: Firefox 134.0.2
  • TensorFlow.js Converter Version: v4.5.0
  • TensorFlow Version: v2.1.0

Current Behavior:
Inference fails due to an undefined variable, leading to a TypeError.

Expected Behavior:
The GraphModel should execute inference without errors.

Minimal Reproducible Code (in HTML):

<!DOCTYPE html>
<html lang="en">
<head>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs/dist/tf.min.js"></script>
</head>
<body>
    <script>
        async function run() {
            // Load the converted graph model (model.json plus weight shards).
            const model = await tf.loadGraphModel('./model.json');

            // Dummy input matching the expected [1, 320, 320, 3] shape.
            const tensor = tf.zeros([1, 320, 320, 3]);

            // Run inference, passing the tensor as a named-input map.
            const result = model.predict({"input": tensor});
        }

        run();
    </script>
</body>
</html>
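
One thing that may be relevant is whether the key "input" actually matches the input name the graph declares. The sketch below (reusing the same model.json; the logged names depend on the attached model) prints what GraphModel reports via its inputNodes/inputs properties, and also calls predict with a single positional tensor instead of a named map:

async function inspectInputs() {
    const model = await tf.loadGraphModel('./model.json');

    // Names and shapes the graph declares for its inputs.
    console.log(model.inputNodes);
    console.log(model.inputs.map(t => ({name: t.name, shape: t.shape})));

    // Passing a single tensor positionally avoids the named-map key entirely.
    const result = model.predict(tf.zeros([1, 320, 320, 3]));
    console.log(result);
}

If the declared name differs from "input", that mismatch could be where the undefined node name comes from.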

I have attached the model needed to reproduce the issue:

tfjs_graph.zip
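
The log below was captured under Node. A rough sketch of an equivalent Node setup (assuming @tensorflow/tfjs-node is installed and the attached model.json is loaded through the file:// handler):

// Node-side equivalent of the HTML repro above; package and path are assumptions.
const tf = require('@tensorflow/tfjs-node');

async function run() {
    const model = await tf.loadGraphModel('file://./model.json');
    const tensor = tf.zeros([1, 320, 320, 3]);
    const result = model.predict({input: tensor});
    console.log(result);
}

run().catch(console.error);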

Error Log (from Node):

TypeError: Cannot read properties of undefined (reading 'split')

      at parseNodeName (tfjs-converter/src/operations/executors/utils.ts:138:22)
      at getTensor (tfjs-converter/src/operations/executors/utils.ts:74:29)
      at getParamValue (tfjs-converter/src/operations/executors/utils.ts:53:20)
...
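
Reading the trace: getParamValue asks getTensor for an op input, getTensor hands the node name to parseNodeName, and parseNodeName reads .split off a value that is apparently undefined at that point, which is what throws. The TypeError itself is just this (an illustration, not tfjs source):

const name = undefined;
name.split(':');   // TypeError: Cannot read properties of undefined (reading 'split')

So the question is why the node name resolved for that op input ends up undefined for this model.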

Any insights on resolving this issue?

Labels: type:bug (Something isn't working)
