
Conversation

@veblush (Collaborator) commented Jan 13, 2026

Brainstorming.

BUG=N/a

@veblush (Collaborator, Author) commented Jan 13, 2026

cc @rameshkunasi

Comment on lines +320 to +339
TfLiteStatus MicroInterpreterGraph::ResetVariableTensor(int tensor_index,
                                                        int subgraph_index) {
  const SubGraph* subgraph = (*subgraphs_)[subgraph_index];
  auto* tensor = subgraph->tensors()->Get(tensor_index);
  if (tensor->is_variable()) {
    size_t buffer_size;
    TF_LITE_ENSURE_STATUS(TfLiteEvalTensorByteLength(
        &subgraph_allocations_[subgraph_index].tensors[tensor_index],
        &buffer_size));

    int value = 0;
    if (tensor->type() == tflite::TensorType_INT8) {
      value = tensor->quantization()->zero_point()->Get(0);
    }
    memset(subgraph_allocations_[subgraph_index].tensors[tensor_index].data.raw,
           value, buffer_size);
  }
  return kTfLiteOk;
}

Member

Since this only works for TensorFlow 1.0 variable-type tensors, the default return value should be failure. Also, TensorType_INT8 should not be used for the quantization check; instead, the existence of the quantization field in the schema should be checked. A multi-channel quantization check is also needed here, or alternatively this method should fail for multi-channel quantization.
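
A minimal sketch of how these suggestions could be folded into the quoted method (assuming the same class members as the snippet above and kTfLiteError as the failure status; this is not the actual patch):

TfLiteStatus MicroInterpreterGraph::ResetVariableTensor(int tensor_index,
                                                        int subgraph_index) {
  const SubGraph* subgraph = (*subgraphs_)[subgraph_index];
  auto* tensor = subgraph->tensors()->Get(tensor_index);
  // Only TensorFlow 1.x variable tensors are handled; fail by default.
  if (!tensor->is_variable()) {
    return kTfLiteError;
  }

  size_t buffer_size;
  TF_LITE_ENSURE_STATUS(TfLiteEvalTensorByteLength(
      &subgraph_allocations_[subgraph_index].tensors[tensor_index],
      &buffer_size));

  int value = 0;
  // Key off the presence of quantization parameters in the schema rather
  // than the tensor type.
  const auto* quantization = tensor->quantization();
  if (quantization != nullptr && quantization->zero_point() != nullptr) {
    // Per-channel quantization carries one zero point per channel; a single
    // memset value cannot represent that, so fail in that case.
    if (quantization->zero_point()->size() > 1) {
      return kTfLiteError;
    }
    value = quantization->zero_point()->Get(0);
  }
  memset(subgraph_allocations_[subgraph_index].tensors[tensor_index].data.raw,
         value, buffer_size);
  return kTfLiteOk;
}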

@ddavis-2015 (Member) commented Jan 14, 2026

@veblush This will not work for RESOURCE variable tensors (TensorFlow 2.0)
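
For context, a hedged illustration of the gap: in the TFLite schema, TF 2.x resource variables are tensors of type RESOURCE referenced by VAR_HANDLE / ASSIGN_VARIABLE / READ_VARIABLE ops rather than tensors with is_variable() set, so the is_variable() branch above never touches them. The snippet below is hypothetical and not part of the PR:

  if (tensor->type() == tflite::TensorType_RESOURCE) {
    // A resource tensor is only a handle; its storage lives in the
    // interpreter's resource-variable bookkeeping (e.g. MicroResourceVariables
    // in TFLite Micro), so a memset of this tensor buffer would not reset it.
    return kTfLiteError;
  }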
