diff --git a/getting-started/tutorials/season-2-scaling-out-and-up/episode05.md b/getting-started/tutorials/season-2-scaling-out-and-up/episode05.md
index b355ad62..f9c1849e 100644
--- a/getting-started/tutorials/season-2-scaling-out-and-up/episode05.md
+++ b/getting-started/tutorials/season-2-scaling-out-and-up/episode05.md
@@ -53,8 +53,8 @@ metaflow("HelloAWSFlow") %>%
        r_function = start,
        next_step = "hello") %>%
   step(step = "hello",
-       decorator("retry", times=2),
-       decorator("batch", cpu=2, memory=2048),
+       retry(times=2),
+       batch(cpu=2, memory=2048),
        r_function = hello,
        next_step = "end") %>%
   step(step = "end",
diff --git a/metaflow-on-aws/metaflow-sandbox.md b/metaflow-on-aws/metaflow-sandbox.md
index 7adb8841..7accab32 100644
--- a/metaflow-on-aws/metaflow-sandbox.md
+++ b/metaflow-on-aws/metaflow-sandbox.md
@@ -11,7 +11,7 @@ Only a limited number of sandboxes are available. When you sign up, you are adde
 Here are some ideas that you can try with the sandbox:
 * [The season 2 of tutorials](../getting-started/tutorials/season-2-scaling-out-and-up/) focuses on scaling out. This is a good way to get started. Note that the Season 1 tutorials work with the Sandbox too, when executed using [the `batch` decorator](../metaflow/scaling.md).
-* You have up to 64 CPU cores at your disposal using [the `batch` decorator](../metaflow/scaling.md). Test some number crunching! You can run everything in the cloud simply by or you can mix local and remote steps by adding `decorator("batch",...)` to select steps.
+* You have up to 64 CPU cores at your disposal using [the `batch` decorator](../metaflow/scaling.md). Test some number crunching! You can run everything in the cloud, or you can mix local and remote steps by adding `batch(...)` to select steps.
 * Test your favorite ML libraries in the cloud using [`batch`](../metaflow/scaling.md) decorator.
 For instance, try a basic hyper-parameter search using [a custom parameter grid and foreach](../metaflow/basics.md#foreach).
 * Evaluate Metaflow's [experiment tracking and versioning](../metaflow/tagging.md) using local runs and the [Client API](../metaflow/client.md) in a local notebook. In contrast to the local mode, all runs are registered globally in the Metaflow Service regardless of the directory where you run them.
 * Test how you can [`resume` tasks locally](../metaflow/debugging.md#how-to-use-the-resume-command) which were originally run remotely using [the `batch` decorator](../metaflow/scaling.md).
diff --git a/metaflow/failures.md b/metaflow/failures.md
index cf041a03..fe6c1cdb 100644
--- a/metaflow/failures.md
+++ b/metaflow/failures.md
@@ -33,7 +33,7 @@ end <- function(self){

 metaflow("RetryFlow") %>%
     step(step="start",
-         decorator("retry"),
+         retry(),
          r_function=start,
          next_step="end") %>%
     step(step="end",
@@ -115,7 +115,7 @@ you may end up withdrawing up to $4000 instead of the intended $1000. To make su
 metaflow("MoneyFlow") %>%
     ...
     step(step="withdraw",
-         decorator("retry", times=0),
+         retry(times=0),
          r_function=withdraw_money_from_account,
          next_step="end") %>%
     ...
@@ -195,7 +195,7 @@ metaflow("CatchFlow") %>%
        next_step = "sanity_check",
        foreach = "params") %>%
   step(step = "sanity_check",
-       decorator("catch", var="compute_failed", print_exception=FALSE),
+       catch(var="compute_failed", print_exception=FALSE),
        r_function = sanity_check,
        next_step = "join") %>%
   step(step = "join",
@@ -234,8 +234,8 @@ end <- function(self) {

 metaflow("SuicidalFlowR") %>%
   step(
-    decorator("catch", var = "start_failed"),
-    decorator("retry"),
+    catch(var = "start_failed"),
+    retry(),
     step = "start",
     r_function = start,
     next_step = "end"
diff --git a/metaflow/scaling.md b/metaflow/scaling.md
index 2d105ed9..33cdc690 100644
--- a/metaflow/scaling.md
+++ b/metaflow/scaling.md
@@ -39,7 +39,7 @@ end <- function(self) {

 metaflow("BigSumFlowR") %>%
   step(
-    decorator("resources", memory=60000, cpu=1),
+    resources(memory=60000, cpu=1),
     step = "start",
     r_function = start,
     next_step = "end"
@@ -57,7 +57,7 @@ This example creates a huge 80000x80000 random matrix, `big_matrix`. The matrix
 If you attempt to run this on your local machine, it is likely that the following will happen:

 ```bash
-Evaluation error: vector memory exhausted (limit reached?). 
+Evaluation error: vector memory exhausted (limit reached?).
 ```

 This fails quickly due to a `MemoryError` on most laptops as we are unable to allocate 48GB of memory.
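
The change throughout this patch is the same: decorator calls move from the string-based `decorator("name", ...)` form to direct helper functions (`retry()`, `batch()`, `resources()`, `catch()`) passed into `step()`. As a whole-flow illustration of the new syntax, here is a minimal sketch assuming the `metaflow` R package; the flow name, step functions, and the trailing `run()` call are illustrative, not taken from the patch:

```r
# Illustrative sketch only: flow and function names are hypothetical,
# and this assumes the metaflow R package is installed and configured.
library(metaflow)

start <- function(self) {
  self$x <- 21
}

double_it <- function(self) {
  # Runs remotely on AWS Batch, with up to 2 retries on transient failure.
  self$x <- self$x * 2
}

end <- function(self) {
  print(self$x)
}

metaflow("HelloHelpersFlow") %>%
  step(step = "start",
       r_function = start,
       next_step = "double_it") %>%
  step(step = "double_it",
       retry(times = 2),            # was: decorator("retry", times=2)
       batch(cpu = 2, memory = 2048), # was: decorator("batch", cpu=2, memory=2048)
       r_function = double_it,
       next_step = "end") %>%
  step(step = "end",
       r_function = end) %>%
  run()
```

The helper functions compose naturally inside the `%>%` pipeline, which is why the docs drop the quoted-string indirection.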