
Verify example results for auto-testing #132

Description

@trevorhardy

Right now the GitHub Actions workflow runs the examples and fails when any example fails to complete. This is good and will catch changes in Python, PyHELICS, or the HELICS API that are fundamentally breaking. What we probably also need, though, is some way of verifying that the data actually coming out of the co-simulations is correct.
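One way to do this (a minimal sketch, not a committed design) would be to compare each example's output against stored reference ("golden") results as a pytest step in the workflow. The example paths, CSV layout, and tolerances below are hypothetical placeholders and would need to match whatever the examples actually write:

```python
# Sketch: golden-file verification of example outputs, assuming each
# example writes its results to a CSV file. Paths and tolerances here
# are hypothetical and would need to match the real examples.
from pathlib import Path

import numpy as np
import pandas as pd
import pytest

EXAMPLES = [
    # (output written by the example, stored reference output)
    ("fundamental/fundamental_default/results.csv",
     "tests/expected/fundamental_default.csv"),
]

@pytest.mark.parametrize("actual_path, expected_path", EXAMPLES)
def test_example_results_match_golden(actual_path, expected_path):
    actual = pd.read_csv(Path(actual_path))
    expected = pd.read_csv(Path(expected_path))

    # Structure should match exactly: same columns, same row count.
    assert list(actual.columns) == list(expected.columns)
    assert len(actual) == len(expected)

    # Numeric values should match within a tolerance, since co-simulation
    # results can vary slightly with floating-point and timing effects.
    for col in expected.select_dtypes(include=np.number).columns:
        np.testing.assert_allclose(
            actual[col], expected[col], rtol=1e-6, atol=1e-9
        )
```

The workflow would then run `pytest` after the examples complete, so the job fails whenever a result drifts from its reference. Tolerances would likely need tuning per example, since some results are only reproducible up to numerical noise.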


Labels: enhancement (New feature or request)
