2.13.10
Features
Argo Workflows for Incident.io alerts
This release introduces support for Incident.io alerts with Argo Workflows. Enabling these alerts requires some additional configuration compared to other notification implementations.
As an example:

```sh
python alerting_flow.py argo-workflows create \
    --notify-on-error \
    --notify-on-success \
    --notify-incident-io-api-key API-KEY \
    --incident-io-error-severity-id ERROR-ID \
    --incident-io-success-severity-id SUCCESS-ID
```
The API key used should have permissions to create incidents.
The severity IDs are required by incident.io, as this is how alerts are categorized. Severity IDs are account-specific and users can create new ones as they please, which is why they must be set as part of the flow deployment.
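For automated deployments, the command above can be assembled in a small script so the severity IDs and API key come from configuration rather than being typed by hand. This is only a sketch: the helper function and the `INCIDENT_IO_API_KEY` environment variable name are illustrative, not part of Metaflow itself.

```python
import os

def build_deploy_command(flow_file, error_severity_id, success_severity_id):
    """Assemble the argo-workflows deploy command with incident.io flags.

    Hypothetical helper: the flag names mirror the release notes; the
    env var INCIDENT_IO_API_KEY is an assumption for this sketch.
    """
    api_key = os.environ.get("INCIDENT_IO_API_KEY", "API-KEY")
    return [
        "python", flow_file, "argo-workflows", "create",
        "--notify-on-error",
        "--notify-on-success",
        "--notify-incident-io-api-key", api_key,
        "--incident-io-error-severity-id", error_severity_id,
        "--incident-io-success-severity-id", success_severity_id,
    ]

# Build (but do not run) the command for a hypothetical flow file.
cmd = build_deploy_command("alerting_flow.py", "ERROR-ID", "SUCCESS-ID")
print(" ".join(cmd))
```

The list form can be handed to `subprocess.run` directly, which avoids shell-quoting issues with the severity IDs.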
Improvements
Default to Pickle protocol 4 for artifacts
This release changes the default artifact serialization to use pickle protocol 4. The change should yield storage savings for small (&lt;2 GB) artifacts, along with faster serialization, since protocol 2 is no longer attempted first.
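The effect can be reproduced with the standard library alone. A minimal sketch, pickling an artifact-like object with an explicitly pinned protocol (the object and its field names are illustrative, not Metaflow internals):

```python
import pickle

# An artifact-like object; the contents are illustrative.
artifact = {"step": "train", "scores": [0.91, 0.87, 0.93]}

# Pin the protocol explicitly rather than relying on the interpreter
# default; protocol 4 is available on all Python 3 versions.
blob = pickle.dumps(artifact, protocol=4)

# Round-trip the artifact to verify nothing is lost.
restored = pickle.loads(blob)
print(restored == artifact)
```

A protocol 4 pickle begins with the opcode bytes `\x80\x04`, which makes the chosen protocol easy to verify on stored blobs.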
What's Changed
- Update cli.py to use `echo_always()` for methods `output_raw` and `out…` by @xujiboy in #2244
- feature: basic support for incident.io in Argo Workflows by @saikonen in #2245
- Fix issues with configs and Runner by @romain-intel in #2234
- Update project doc with new possible options by @romain-intel in #2220
- serialize artifacts with pickle protocol 4 if possible by @amerberg in #2243
- change default micromamba for s3 datastore by @savingoyal in #2254
- Revert "change default micromamba for s3 datastore" by @savingoyal in #2255
- skip boto3 compilation for code download by @savingoyal in #2257
- release: 2.13.10 by @saikonen in #2260
Full Changelog: 2.13.9...2.13.10