Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null #5310 #5401
Labels
platform:android
Issues with Android as Platform
task:LLM inference
Issues related to MediaPipe LLM Inference Gen AI setup
type:bug
Bug in the Source Code of MediaPipe Solution
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)?
None
OS Platform and Distribution
Android 14
Mobile device if the issue happens on mobile device
Tecno Camon 30
Browser and version if the issue happens on browser
No response
Programming Language and version
Java/C++
MediaPipe version
latest
Bazel version
No response
Solution
no
Android Studio, NDK, SDK versions (if issue is related to building in Android environment)
No response
Xcode & Tulsi version (if issue is related to building for iOS)
No response
Describe the actual behavior
Failed to get LLM params: INVALID_ARGUMENT: LLM model file is null
Describe the expected behaviour
Gemma should load and run reliably with MediaPipe LLM Inference, regardless of device or environment.
Standalone code/steps you may have used to try to get what you need
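This error usually means the inference engine was handed a path where no readable model file exists on the device. A minimal sketch (plain Java, no Android dependencies) of a pre-flight check that fails early with a clear message before the native layer reports "LLM model file is null"; the path `/data/local/tmp/llm/gemma.bin` and the helper name `requireModelFile` are illustrative assumptions, not part of the MediaPipe API:

```java
import java.io.File;

public class ModelPathCheck {
    /** Throws early with a clear message if the model file is absent or empty,
     *  instead of letting the native layer fail with "LLM model file is null". */
    static String requireModelFile(String path) {
        File f = new File(path);
        if (!f.isFile() || f.length() == 0) {
            throw new IllegalArgumentException(
                "LLM model file missing or empty at: " + path);
        }
        return path;
    }

    public static void main(String[] args) {
        // Illustrative on-device location; push the model there first, e.g.:
        //   adb push <model>.bin /data/local/tmp/llm/gemma.bin
        String modelPath = "/data/local/tmp/llm/gemma.bin";
        try {
            requireModelFile(modelPath);
            System.out.println("Model file OK: " + modelPath);
            // Only then pass the verified path to the LLM Inference options,
            // e.g. LlmInferenceOptions.builder().setModelPath(modelPath) ...
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If the check throws, the model has not been pushed to (or is not readable at) the configured path, which would reproduce the reported error.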
Other info / Complete Logs