Description
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)
Yes
OS Platform and Distribution
iOS 15.4.1, Chrome 136.0.7103.93 (Official Build) (arm64)
MediaPipe Tasks SDK version
"@mediapipe/tasks-genai": "^0.10.22",
Task name (e.g. Image classification, Gesture recognition etc.)
Gen AI
Programming Language and version (e.g. C++, Python, Java)
JavaScript
Describe the actual behavior
When trying to use a fine-tuned Gemma 3 model (created with the Gemma fine-tuning Google Colab notebook), the model reports a successful load, but text generation fails with an error.
Describe the expected behaviour
Generate text
Standalone code/steps you may have used to try to get what you need
https://gist.github.com/escottgoodwin/5c5e0ceb91392f3c2f4c6800585bba15
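A minimal sketch of the repro flow, assuming the standard `@mediapipe/tasks-genai` web API; the model path and WASM CDN URL are placeholders, not values from the gist:

```javascript
import { FilesetResolver, LlmInference } from '@mediapipe/tasks-genai';

async function run() {
  // Resolve the WASM fileset for the GenAI tasks (CDN URL is an assumption).
  const genaiFileset = await FilesetResolver.forGenAiTasks(
    'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai@0.10.22/wasm'
  );

  // Loading succeeds for both the stock and the fine-tuned .task file.
  // The model path below is a hypothetical placeholder.
  const llm = await LlmInference.createFromOptions(genaiFileset, {
    baseOptions: { modelAssetPath: '/assets/gemma3-1b-finetuned.task' },
    maxTokens: 512,
  });

  // With the fine-tuned model, this call throws the DetokenizerCalculator
  // RET_CHECK error shown below; with gemma3-1b-it-int4.task it returns text.
  const response = await llm.generateResponse('Write a short greeting.');
  console.log(response);
}

run();
```

This is browser-only code (it needs the WASM runtime and a served `.task` file), so it is a sketch of the failing call sequence rather than a runnable test.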
Other info / Complete Logs
It works with the stock gemma3-1b-it-int4.task model.
I created my model with this notebook:
https://colab.research.google.com/github/google-ai-edge/mediapipe-samples/blob/main/codelabs/litert_inference/Gemma3_1b_fine_tune.ipynb
Loading reports success, but generation then fails.
Error in webapp:
Error: Failed to generate response: Failed to predict asynchronously: INTERNAL: CalculatorGraph::Run() failed:
Calculator::Process() for node "DetokenizerCalculator" failed: RET_CHECK failure (third_party/odml/infra/genai/inference/calculators/detokenizer_calculator.cc:311) id >= 0 (-1 vs. 0) [type.googleapis.com/mediapipe.StatusList='status { code: 13 message: "Calculator::Process() for node \"DetokenizerCalculator\" failed: RET_CHECK failure (third_party/odml/infra/genai/inference/calculators/detokenizer_calculator.cc:311) id >= 0 (-1 vs. 0) " }']
=== Source Location Trace: ===
third_party/odml/infra/genai/inference/calculators/detokenizer_calculator.cc:311
third_party/odml/infra/genai/inference/calculators/detokenizer_calculator.cc:253
third_party/mediapipe/framework/calculator_node.cc:1013
third_party/odml/infra/genai/inference/llm_engine.cc:1535