
Unable to convert the finetuned model to the Tflite format #5916


Description

@TryCAEarvr

from mediapipe.tasks.python.genai import converter

# Configure conversion parameters.
config = converter.ConversionConfig(
    input_ckpt="C:/Users/Z8/fine_tuned_science_gemma2b-it",
    ckpt_format="safetensors",
    model_type="GEMMA_2B",
    backend='gpu',
    output_dir="C:/Users/Z8/fine_tuned_science_gemma2b-it/out",
    combine_file_only=False,
    vocab_model_file='C:/Users/Z8/fine_tuned_science_gemma2b-it/tokenizer.json',
    output_tflite_file=f'C:/Users/Z8/fine_tuned_science_gemma2b-it/scigemma.bin',
)

# Start model conversion.
converter.convert_checkpoint(config)

print("Model converted successfully.")

Error:

AttributeError Traceback (most recent call last)
Cell In[28], line 14
2 config = converter.ConversionConfig(
3 input_ckpt="C:/Users/Z8/fine_tuned_science_gemma2b-it",
4 ckpt_format="safetensors",
(...) 10 output_tflite_file=f'C:/Users/Z8/fine_tuned_science_gemma2b-it/scigemma.bin',
11 )
13 # Start model conversion.
---> 14 converter.convert_checkpoint(config)
16 print("Model converted successfully.")

AttributeError: module 'mediapipe.python._framework_bindings.model_ckpt_util' has no attribute 'GenerateGpuTfLite'

I am getting this AttributeError when running the conversion. Kindly provide your support in this regard.
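For reference, one quick way to check whether the installed MediaPipe wheel exposes the GPU conversion binding at all is to inspect the model_ckpt_util module named in the traceback (a minimal diagnostic sketch; the attribute name GenerateGpuTfLite is taken from the error above):

import mediapipe as mp
from mediapipe.python._framework_bindings import model_ckpt_util

# Show the installed MediaPipe version; the GPU conversion binding only
# ships with sufficiently recent builds.
print(mp.__version__)

# List the TFLite-generation entry points actually compiled into this wheel.
# If 'GenerateGpuTfLite' is not in this list, only the CPU path is available.
print([name for name in dir(model_ckpt_util) if 'TfLite' in name])

If the GPU binding turns out to be missing, upgrading mediapipe to the latest release or retrying the conversion with backend='cpu' in ConversionConfig may be worth a try.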


Labels

platform:python (MediaPipe Python issues), task:LLM inference (Issues related to MediaPipe LLM Inference / Gen AI setup)
