Add simple start of a test framework #120
tangledhelix wants to merge 1 commit into wf49670:develop
Conversation
There are more possible artifact-based tests that could be done here too, if desired. For example, I didn't include images to test
How to test this:
At least, if your change would create a diff. Reminder: my test projects don't exercise every ppgen feature, so not every change in output behavior will currently be caught by this test framework. It's just a start.
@tangledhelix - I find it very convenient to be able to run the GG2 tests locally, though some may prefer to use Github Actions. GG2 can use pytest to run tests like the ones you are describing: loading an input file, doing something, then checking the output against the expected output. ETA: I should have made clear that I don't use ppgen, so this isn't a feature request from me - just mentioning what I find helpful in the GG2 setup.
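The GG2-style pytest pattern described here (load input, process it, compare against expected output) can be sketched roughly as follows. The `transform` function is a hypothetical stand-in for the real processing step, not ppgen's or GG2's actual API:

```python
# Minimal sketch of a pytest-style "input -> process -> compare" test.
# transform() is a placeholder for the real tool being tested.


def transform(text: str) -> str:
    # Stand-in for the actual processing step (e.g. running ppgen on a file).
    return text.upper()


def test_output_matches_expected() -> None:
    # In a real test these would be read from files such as
    # tests/input.txt and tests/expected.txt (hypothetical paths).
    input_text = "hello world\n"
    expected = "HELLO WORLD\n"
    assert transform(input_text) == expected
```

With pytest installed, a file of such functions is discovered and run automatically with `pytest`, locally or in CI.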
@tangledhelix -- Thanks, Dan. I'll take a look at that. I do something similar (or have done in the past, not recently) on a more
So, my tests run locally, as @windymilla described for his GG2 testing, but the concept is the same. And I can see that having something less manual, and more centralized where other developers could run it, could be useful.
I do the same (for GG2) using a script I made from the Github Actions file. But the point of Actions is that they happen consistently, not just when the developer thinks to do it, and they require no setup. In the case of ppgen, there's no real setup (it only needs Python), but a possible next step in this direction might be to use the
So my aim here is to make something
But there's nothing stopping anyone from running all the same commands locally, if they like.
For those not familiar with the
This PR adds the beginnings of an automated test framework for ppgen. This is by no means a complete set of useful tests; in fact it's not looking at the ppgen code at all. There are no unit tests, static analysis, linting, etc. here. This is a purely functional test examining output artifacts, for now.
But we can use the same automation framework to add all the other stuff.
For now, this is what I have:
- `.bin` and Guiguts2 `.json` files

It's a simple test: if the newly built artifact differs from the pre-built artifact, then the test fails. The idea is to detect unintended differences in output. As the code is changed, some changes in output will be expected, in which case the expected output should be updated so that the test will again pass.
The test projects I've included are not designed to cover all of ppgen's features, so other projects should be added (or invented) to broaden coverage.
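The "fail if the newly built artifact differs from the pre-built one" check could look something like this sketch. The paths and function name are illustrative, not the PR's actual layout; using `difflib` means a failure also shows *what* changed, which makes deciding whether a diff is intended much easier:

```python
# Compare a freshly built artifact against the committed "expected" copy,
# printing a unified diff when they differ. Paths are illustrative.
import difflib
from pathlib import Path


def artifacts_match(built: Path, expected: Path) -> bool:
    built_lines = built.read_text(encoding="utf-8").splitlines(keepends=True)
    expected_lines = expected.read_text(encoding="utf-8").splitlines(keepends=True)
    if built_lines == expected_lines:
        return True
    # Show exactly which lines changed, so an intended change is easy to
    # confirm before the expected artifact is regenerated.
    diff = difflib.unified_diff(
        expected_lines, built_lines,
        fromfile=str(expected), tofile=str(built),
    )
    print("".join(diff))
    return False
```

A test runner (pytest, or a plain script in an Actions job) would call this once per artifact and fail the build when it returns `False`.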
Additionally, it makes sense to do some other things that are more akin to the "usual" sort of automated testing projects might do. I'll point to some of the things we've done in Guiguts 2 as inspiration:
https://github.com/DistributedProofreaders/guiguts-py/blob/master/.github/workflows/python-app.yml