Replies: 1 comment
Struggling to get this working with GitHub as opposed to CodeCommit, since we can no longer create new repos.
Serverless Data Lake Framework 2.0.0 is now available.
The workshop has been updated.
For users of SDLF 1.x, version 1 is still available on the master branch. Development of newer SDLF versions (2.x) happens on the main branch. The workshop still contains sections for version 1 as well.
GPG signature files are included for the archives in this release's assets. You can use our public key to verify that a downloaded archive is original and unmodified.
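Verification follows the standard detached-signature workflow: import the public key once, then check each archive against its `.asc` file. A minimal sketch, assuming hypothetical filenames (the actual asset and key names come from the release page, not from these notes):

```shell
# Hypothetical helper to verify a downloaded release archive against its
# detached GPG signature. Filenames below are placeholders, not the real
# asset names from the release.
verify_release() {
  archive="$1"     # e.g. sdlf-2.0.0.tar.gz
  signature="$2"   # e.g. sdlf-2.0.0.tar.gz.asc
  # The project's public key must be imported first:
  #   gpg --import sdlf-public-key.asc
  gpg --verify "$signature" "$archive"
}

# Usage (after downloading both files from the release assets):
# verify_release sdlf-2.0.0.tar.gz sdlf-2.0.0.tar.gz.asc
```

`gpg --verify` exits non-zero if the signature does not match, so the helper can be used directly in a scripted download step.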
We welcome your feedback!
What’s New
SDLF 2.0 is very much in the same spirit as SDLF 1.0 - the constructs are the same and CloudFormation is still used as the language for provisioning the infrastructure. 2.0 intends to fix long-standing issues with SDLF, extend its usage to more data architecture patterns and bring commonly-asked features to the framework.
- `deploy.sh` takes care of deploying the CICD infrastructure used to build these modules, and registers them in the private CloudFormation registry of each account. Modules are updated whenever there is a deployment that requires them.
- … `pDomain` (which defaults to `datalake`) can be provided when deploying foundations.
- … (`dev`, `test`, `prod`).
- … `sdlf-main`.
- … `datadomain-{domain}-{env}.yaml`.
- … `main`, `test` and `dev` branches are expected.
- … `parameters-{env}.json`.
- … `sdlf-main-{domain}-{team}`.
- … `pipelines.yaml` and datasets in `datasets.yaml`.
- `sdlf-main-{domain}-{team}` works the same way everything works in SDLF: `main`, `test` and `dev` branches are expected.
- … `parameters-{env}.json`.
- … `sdlf-datalakeLibrary`. They are no longer needed and have been removed.
- … `pPipelineDetails` parameter when defining a dataset in `sdlf-dataset`. This parameter goes even further and can be used to store more information that stages can use. These details are stored in the Datasets DynamoDB table (as was already the case in SDLF 1.x).
- … `pEventPattern` in the example), and then process these events on a schedule (`pSchedule`).
- … `sdlf-pipLibrary` is now part of an optional feature called Lambda Layer Deployer. Files related to this feature are part of a team main repository (`sdlf-main-{domain}-{team}`) under the `layers` folder.
- … `sdlf-utils`. Files related to this feature are part of a team main repository (`sdlf-main-{domain}-{team}`) under the `transforms` folder.
- … `sdlf-monitoring`, with CloudTrail, ELK forwarding and SNS.
- … `sdlf-monitoring` is not deployed.
- `sdlf-stage-dataquality` will soon be available as an example of how to add a third stage making use of Glue Data Quality.
- … `deploy.sh`; there are no more shell scripts.

Full Changelog: 1.5.2...2.0.0
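The team repository layout described above implies that pipelines and datasets are declared in YAML files checked into `sdlf-main-{domain}-{team}`. As a purely illustrative sketch (the key names and schema below are assumptions, not taken from these release notes — consult the workshop for the actual format):

```yaml
# Hypothetical sketch of a team's datasets.yaml and pipelines.yaml.
# All keys and values are placeholders illustrating the declarative style.

# pipelines.yaml
pipelines:
  - name: main        # pipeline identifier used by datasets below

# datasets.yaml
datasets:
  - name: legislators # dataset identifier
    pipeline: main    # which pipeline processes this dataset
```

Environment-specific values would then live alongside these files in `parameters-{env}.json`, following the same `dev`/`test`/`prod` branch convention as the rest of SDLF.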
New Contributors
Thanks
We thank all the contributors/users for their work on this release!
This discussion was created from the release Serverless Data Lake Framework 2.0.0.