
Conversation

@cwschilly (Contributor) commented Oct 21, 2025

Fixes #617

Explanation: Since writing JSON is a fairly small task, Python multiprocessing was introducing unneeded overhead. It's much faster to write everything serially: for the problem in question, the write() call went from taking 5 minutes down to 4 seconds.
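
For context, a minimal sketch of the idea (names are illustrative, not the exact code in this PR): each per-phase JSON dump is cheap, so the spawn-and-pickle cost of a worker pool dominates, and a plain serial loop is faster.

```python
import json
from multiprocessing import Pool

def write_phase(args):
    """Serialize one phase's data to its own JSON file."""
    path, data = args
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f)

# Before (illustrative): a worker pool whose startup and pickling
# overhead dwarfs each small json.dump call.
def write_all_parallel(items):
    with Pool() as pool:
        pool.map(write_phase, items)

# After (illustrative): the same writes done in a simple serial loop.
def write_all_serial(items):
    for item in items:
        write_phase(item)
```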

@cwschilly cwschilly linked an issue Oct 21, 2025 that may be closed by this pull request
@cwschilly (Contributor, Author) commented:
I wrote a simple timer decorator for profiling that may be useful down the line, so I separated it out into its own file and kept the decorators in VT Data Writer. The timings are only written when the logging level is debug.
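
A minimal sketch of such a decorator (the actual helper in this PR lives in its own module; the names here are assumptions for illustration): it measures the wall-clock time of the wrapped call and emits it through the logger, so the message only shows up when the logging level is DEBUG.

```python
import functools
import logging
import time

logger = logging.getLogger(__name__)

def timed(func):
    """Log how long the wrapped function took; visible only at DEBUG level."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - start
        logger.debug("%s took %.3f s", func.__name__, elapsed)
        return result
    return wrapper

# Illustrative usage: decorate the writer's entry point to profile it.
@timed
def write(data, path):
    ...
```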

@cwschilly cwschilly marked this pull request as ready for review October 23, 2025 13:25
try:
self.__extension = parameters["json_output_suffix"]
self.__compress = parameters["compressed"]
self.__add_communications = parameters.get("communications", True)
@cwschilly (Contributor, Author) commented on this line:
Does it make sense to default to True here? Or should users have to opt in to writing comms?



Development

Successfully merging this pull request may close these issues.

Writing out json is very slow with large files
