Update flexynesis#1679
Conversation
Still work in progress. Inference mode still needs to be added. Please don't review :)
</assert_contents>
</element>
</output_collection>
<output name="model" ftype="safetensors" file="model_1.safetensors" compare="sim_size"/>
Is there any other way to test this apart from sim_size?
maybe some asserts? (Easiest: size?)
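A sketch of what such an assert could look like, replacing `compare="sim_size"` with a size assertion (the numbers here are placeholders, and note that Galaxy's `has_size` assertion takes a `value` attribute with an optional `delta`):

```xml
<output name="model" ftype="safetensors">
    <assert_contents>
        <!-- placeholder numbers: assert the model file is roughly the expected size -->
        <has_size value="107104" delta="5000"/>
    </assert_contents>
</output>
```

With a `delta`, the test tolerates small run-to-run variation without shipping a reference safetensors file in the repo.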
<collection name="results" type="list" label="${tool.name} on ${on_string}: results">
<discover_datasets pattern="(?P&lt;name&gt;.+)\.tabular$" format="tabular" directory="output"/>
</collection>
<data name="model" format="safetensors" from_work_dir="output/job.final_model.safetensors" label="${tool.name} on ${on_string}: trained_model" />
Is this always needed? Should we make this optional?
Yes, we can make it optional. But I think it should be enabled (True) by default.
+1 for creating a boolean parameter and making this output optional
> No we can make it optional. But I think the optional should be set to True as default.
This would mean your safetensors model will always be an output dataset unless the user unselects it. Do we want that?
> This would mean your safetensor model will always be a output dataset unless the user unselects it. Do we want that?
Yes, it will reduce unnecessary job repetition. Imagine you are training on a big dataset that takes hours to finish, and then you realize the trained model was not saved.
Especially in the next updates, where it will be possible to use the model to integrate new data.
For me, the model is not a byproduct.
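A sketch of what the optional-but-default-on output could look like (the parameter name `save_model` is an assumption, not from the PR):

```xml
<inputs>
    <!-- hypothetical parameter name; checked="true" makes the model an output by default -->
    <param name="save_model" type="boolean" checked="true"
           label="Save the trained model"
           help="Deselect to skip the safetensors model output"/>
</inputs>
<outputs>
    <data name="model" format="safetensors"
          from_work_dir="output/job.final_model.safetensors"
          label="${tool.name} on ${on_string}: trained_model">
        <filter>save_model</filter>
    </data>
</outputs>
```

The `filter` tag skips the dataset when the checkbox is deselected, so the model is produced by default but stays optional.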
</assert_contents>
</element>
</output_collection>
<output name="model" ftype="safetensors" file="model_18.safetensors" compare="sim_size"/>
You don't need to test this output here; I guess one of those outputs is enough? sim_size is not a very reliable test.
I don't know of another test for this data.
Their sizes are different, though.
<token name="@VERSION_SUFFIX@">3</token>
<token name="@TOOL_VERSION@">1.1.1</token>
<token name="@VERSION_SUFFIX@">0</token>
<token name="@PROFILE@">24.1</token>
25.0 ... if your datatype gets merged?
$use_cv
$evaluate_baseline_performance
--feature_importance_method $feature_importance_method
\${GALAXY_FLEXYNESIS_EXTRA_ARGUMENTS}
Why did you remove this? Without it we can no longer set --device to GPU.
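For context, keeping that line matters because admins can inject extra CLI flags per job destination via the environment; a sketch of a job_conf destination (the destination id, runner, and flag value are assumptions):

```xml
<destination id="gpu" runner="slurm">
    <!-- injected into the job environment; the tool's command block expands it,
         so flexynesis receives whatever flags the admin sets here
         (e.g. a device flag; the exact value flexynesis expects is a guess) -->
    <env id="GALAXY_FLEXYNESIS_EXTRA_ARGUMENTS">--device cuda</env>
</destination>
```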
</assert_contents>
</element>
</output_collection>
<output name="model" ftype="safetensors" file="model_6.safetensors" compare="sim_size"/>
Don't use sim_size; maybe assert a size instead, that way we don't need to ship the safetensors files in the git repo here.
Co-authored-by: Saim Momin <64724322+SaimMomin12@users.noreply.github.com>
<output name="model" ftype="safetensors">
<assert_contents>
<has_size size="107104"/>
<has_size size="20"/>
Probably sim_size was a better option.
Why is this passing?
The test passes with both size="107104" and size="20"
I don't know for sure, but I assume because of https://github.com/bgruening/galaxytools/actions/runs/18556838587/job/52896358516?pr=1679#step:7:494
Basically, with the surrounding ftype check not being executable, the inner check won't run?
I tried this locally and it passed:
<element name="job.stats">
<assert_contents>
<has_size size="123456789"/>
<has_text_matching expression="DirectPred\tErlotinib\tnumerical\tmse\t"/>
<has_text_matching expression="DirectPred\tErlotinib\tnumerical\tr2\t"/>
<has_text_matching expression="DirectPred\tErlotinib\tnumerical\tpearson_corr\t"/>
</assert_contents>
</element>
Interesting. If you think it's a bug, please look at one of those Galaxy internal tests:
Try to replicate it and create a test that clearly shows it's not considered. If people have a test to work against, they can fix it.
But now I see it ... it's not size ... it's value :)
Force-pushed from 8b72a66 to 767f04d
Deployment step was skipped in CI
This PR updates flexynesis to the newest version.
Features: