
[BUG] Fails to transmit printer status (on Janus datachannel) when size > 1500 due to large file_metadata #229

Closed
@puterboy

Description


In __init__.py there is a function, status_update_to_client_loop, which posts _print_job_tracker.status to the client every 0.75 or 2 seconds. It does this by calling post_printer_status_to_client, which gets the status payload as a JSON message from the status routine in print_job_tracker.py. The status includes a range of printer status data, including file_metadata.

The status message is then transmitted over the Janus datachannel by calling send_msg_to_client in client_conn.py, which encodes and compresses the message before handing it to the send routine, where the size of the message is limited to MAX_PAYLOAD_SIZE=1500 (since this is UDP). If the size is too large, the send FAILS COMPLETELY and an error message is posted to octoprint.log.
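For reference, here is a minimal sketch of that send path as I understand it. This is NOT the plugin's actual code; the names follow the description above, and the zlib-over-JSON compression step is an assumption:

```python
import json
import logging
import zlib

_logger = logging.getLogger('octoprint.plugins.obico')

MAX_PAYLOAD_SIZE = 1500  # UDP-friendly cap on the Janus datachannel


def send_msg_to_client(data):
    # Serialize and compress the status payload before it goes out on the datachannel
    payload = zlib.compress(json.dumps(data, default=str).encode('utf-8'))
    send(payload)


def send(payload):
    # The whole message is dropped when it exceeds the limit; nothing is
    # chunked or retried, which is the failure mode described in this issue
    if len(payload) > MAX_PAYLOAD_SIZE:
        _logger.error('datachannel payload too big ({})'.format(len(payload)))
        return
    # ... hand the payload to the Janus datachannel here ...
```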

Now, the status includes file_metadata, which in turn contains all the additions made to the file metadata by various OctoPrint plugins.

Some of this metadata -- such as that provided by PrintTimeGenius -- is quite detailed, causing the total compressed message to ALWAYS exceed 1500 bytes.

As a result, the send routine ALWAYS fails, meaning that no status data is EVER successfully sent.
This results in an error message every 2 seconds in octoprint.log of the form:
2024-01-28 17:46:07,625 - octoprint.plugins.obico - ERROR - datachannel payload too big (2029)
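It is easy to confirm locally that a status payload with plugin-enriched file_metadata blows past the limit. The snippet below is only a rough check; the payload shape and key names are made up for illustration and are not the plugin's real schema:

```python
import json
import zlib

# A made-up status payload; the real one carries whatever analysis data
# PrintTimeGenius and other plugins have attached to the file's metadata.
status = {
    '_ts': 1706485567,
    'octoprint_data': {
        'state': {'text': 'Printing'},
        'progress': {'completion': 42.0},
        'file_metadata': {
            'analysis': {
                'estimatedPrintTime': 12345.6,
                'filament': {'tool0': {'length': 5321.0, 'volume': 12.8}},
            },
        },
    },
}

compressed = zlib.compress(json.dumps(status).encode('utf-8'))
print('{} bytes after compression (limit is 1500)'.format(len(compressed)))
```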

This may explain some of the other bugs I am encountering where filament usage is not properly transmitted.

Note that PrintTimeGenius is a pretty popular plugin, so I suspect that others are also encountering this problem, but since the failure is silent, others have not detected it yet.

I can brute-force prevent the overflow by commenting out the lines in print_job_tracker.py that add self.get_file_metadata(plugin, data) or self._file_metadata_cache to data['status'], but I understand that some of the metadata may actually be useful.
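A slightly less drastic variant of that workaround, sketched below with hypothetical function names and an assumed payload layout, would be to drop file_metadata only when the compressed message would exceed the limit:

```python
import copy
import json
import zlib

MAX_PAYLOAD_SIZE = 1500


def shrink_status_if_needed(status):
    # Keep the full status when it fits under the limit
    payload = zlib.compress(json.dumps(status, default=str).encode('utf-8'))
    if len(payload) <= MAX_PAYLOAD_SIZE:
        return status
    # Oversized: send the same status without the bulky file_metadata
    # ('octoprint_data' / 'file_metadata' keys are assumptions here)
    slim = copy.deepcopy(status)
    slim.get('octoprint_data', {}).pop('file_metadata', None)
    return slim
```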

If so, the most robust solution would presumably be to chop the message up into chunks of < 1500 bytes and reassemble them on the receiving end.
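Something along these lines, assuming both the sending and receiving sides of the datachannel can be changed. The framing format is invented purely for illustration, and handling of lost or reordered chunks is omitted:

```python
import json
import zlib

MAX_PAYLOAD_SIZE = 1500
HEADER_SIZE = 64                              # rough allowance for the chunk header
CHUNK_DATA_SIZE = MAX_PAYLOAD_SIZE - HEADER_SIZE


def split_into_chunks(msg_id, data):
    # Compress once, then slice the compressed payload into datachannel-sized pieces.
    # Each chunk carries the message id, its index, and the total count.
    payload = zlib.compress(json.dumps(data, default=str).encode('utf-8'))
    pieces = [payload[i:i + CHUNK_DATA_SIZE]
              for i in range(0, len(payload), CHUNK_DATA_SIZE)]
    for index, piece in enumerate(pieces):
        header = json.dumps({'id': msg_id, 'seq': index, 'total': len(pieces)})
        yield header.encode('utf-8') + b'\n' + piece


# Receiving side: buffer chunks per message id and reassemble once all have arrived.
_buffers = {}


def on_chunk_received(chunk):
    header_line, piece = chunk.split(b'\n', 1)
    header = json.loads(header_line)
    parts = _buffers.setdefault(header['id'], {})
    parts[header['seq']] = piece
    if len(parts) == header['total']:
        payload = b''.join(parts[i] for i in range(header['total']))
        del _buffers[header['id']]
        return json.loads(zlib.decompress(payload).decode('utf-8'))
    return None  # message not complete yet
```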


Labels: bug, help wanted
