internal: Failed to initialize session: %s INTERNAL: CalculatorGraph::Run() failed #5724
Labels
os:linux-non-arm
Issues on linux distributions which run on x86-64 architecture. DOES NOT include ARM devices.
task:LLM inference
Issues related to MediaPipe LLM Inference Gen AI setup
type:bug
Bug in the Source Code of MediaPipe Solution
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)
No
OS Platform and Distribution
Linux Ubuntu 16.04
Mobile device if the issue happens on mobile device
Pixel 7a
Browser and version if the issue happens on browser
No response
Programming Language and version
Python
MediaPipe version
No response
Bazel version
No response
Solution
LLM Inference
Android Studio, NDK, SDK versions (if issue is related to building in Android environment)
No response
Xcode & Tulsi version (if issue is related to building for iOS)
No response
Describe the actual behavior
I created a .tflite file for the Llama 3.2 1B model using ai-edge-torch and built the Task Bundle as instructed in the documentation. After pushing the .task file to the device, I modified the MediaPipe example (which was written for Gemma) to use Llama. When I run it, I get the error in the title.
Describe the expected behaviour
Inference should run without any errors.
Standalone code/steps you may have used to try to get what you need
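A minimal sketch of the bundling step described above, following the `bundler` API from the MediaPipe LLM Inference documentation. The file paths, the Llama 3 chat start/stop tokens, and the `enable_bytes_to_unicode_mapping` setting are assumptions here; substitute your own artifacts and tokenizer settings.

```python
import os

# Hypothetical paths; replace with the artifacts produced by ai-edge-torch
# and the tokenizer shipped with the Llama 3.2 1B checkpoint.
TFLITE_MODEL = "llama3_2_1b.tflite"
TOKENIZER = "tokenizer.model"
OUTPUT_TASK = "llama3_2_1b.task"

def build_bundle_config():
    # Keyword arguments for bundler.BundleConfig, per the MediaPipe docs.
    # The start/stop tokens below are the Llama 3.x chat tokens and the
    # bytes-to-unicode flag is an assumption; verify both against your
    # tokenizer, since a mismatch here can fail at session initialization.
    return dict(
        tflite_model=TFLITE_MODEL,
        tokenizer_model=TOKENIZER,
        start_token="<|begin_of_text|>",
        stop_tokens=["<|eot_id|>", "<|end_of_text|>"],
        output_filename=OUTPUT_TASK,
        enable_bytes_to_unicode_mapping=True,
    )

# Only attempt the bundle when the input artifacts are actually present.
if os.path.exists(TFLITE_MODEL) and os.path.exists(TOKENIZER):
    from mediapipe.tasks.python.genai import bundler
    bundler.create_bundle(bundler.BundleConfig(**build_bundle_config()))
```

The resulting .task file is then pushed to the device (e.g. with `adb push llama3_2_1b.task /data/local/tmp/llm/`) and referenced from the example app.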
Other info / Complete Logs