Commit 1970b2c

fix: linked-stacks docs (#36426) (#36439)
1 parent 35fb60a commit 1970b2c

2 files changed: +37 -37 lines changed

Lines changed: 23 additions & 23 deletions
@@ -1,6 +1,6 @@
 ---
-page_title: Pass data from one Stack to another
-description: Learn how to pass data from one Stack to another using `publish_output` blocks to output data from one Stack, and `upstream_input` blocks to input that data into another Stack.
+page_title: Pass data from one Stack to another
+description: Learn how to pass data from one Stack to another using `publish_output` blocks to output data from one Stack, and `upstream_input` blocks to input that data into another Stack.
 ---
 
 # Pass data from one Stack to another
@@ -10,7 +10,7 @@ If you have multiple Stacks that do not share a provisioning lifecycle, you can
 
 ## Background
 
-You may need to pass data between different Stacks in your project. For example, one Stack in your organization may manage shared services, such as networking infrastructure, and another Stack may manage application components. Using separate Stacks lets you manage the infrastructure independently, but you may still need to share data from your networking Stack to your application Stack.
+You may need to pass data between different Stacks in your project. For example, one Stack in your organization may manage shared services, such as networking infrastructure, and another Stack may manage application components. Using separate Stacks lets you manage the infrastructure independently, but you may still need to share data from your networking Stack to your application Stack.
 
 To output information from a Stack, declare a `publish_output` block in the deployment configuration of the Stack exporting data. We refer to the Stack that declares a `publish_output` block as the upstream Stack.
 
@@ -24,7 +24,7 @@ Downstream Stacks must also reside in the same project as their upstream Stacks.
 
 ## Declare outputs
 
-You must declare a `publish_output` block in your deployment configuration for each value you want to output from your current Stack.
+You must declare a `publish_output` block in your deployment configuration for each value you want to output from your current Stack.
 
 For example, you can add a `publish_output` block for the `vpc_id` in your upstream Stack’s deployment configuration. You can directly reference a deployment's values with the `deployment.deployment_name` syntax.
 
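The `publish_output` example that the prose above refers to falls between this hunk and the next, so the diff does not show it. A minimal sketch of what such a block could look like, assuming an upstream deployment named `network`; the filename and values are illustrative:

```hcl
# network.tfdeploy.hcl -- upstream Stack deployment configuration (illustrative)

deployment "network" {
  inputs = {
    aws_region = "us-west-2"
  }
}

# Export the network deployment's VPC ID so other Stacks in the project can consume it.
# The value references the deployment directly using the deployment.deployment_name syntax.
publish_output "vpc_id" {
  description = "The ID of the VPC created by the network deployment."
  value       = deployment.network.vpc_id
}
```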
@@ -41,25 +41,25 @@ publish_output "vpc_id" {
 
 </CodeBlockConfig>
 
-After applying your configuration, any Stack in the same project can now reference your network deployment's `vpc_id` output by declaring an `upstream_input` block.
+After applying your configuration, any Stack in the same project can now reference your network deployment's `vpc_id` output by declaring an `upstream_input` block.
 
-Once you apply a Stack configuration version that includes your `publish_output` block, HCP Terraform publishes a snapshot of those values, which allows HCP Terraform to resolve them. Meaning, you must apply your Stack’s deployment configuration before any downstream Stacks can reference your Stack's outputs.
+Once you apply a Stack configuration version that includes your `publish_output` block, HCP Terraform publishes a snapshot of those values, which allows HCP Terraform to resolve them. Meaning, you must apply your Stack’s deployment configuration before any downstream Stacks can reference your Stack's outputs.
 
 Learn more about the [`publish_output` block](/terraform/language/stacks/reference/tfdeploy#publish_output-block-configuration).
 
 ## Consume the output from an upstream Stack
 
 Declare an `upstream_input` block in your Stack’s deployment configuration to read values from another Stack's `publish_output` block. Adding an `upstream_input` block creates a dependency on the upstream Stack.
 
-For example, if you want to use the output `vpc_id` from an upstream Stack in the same project, declare an `upstream_input` block in your deployment configuration.
+For example, if you want to use the output `vpc_id` from an upstream Stack in the same project, declare an `upstream_input` block in your deployment configuration.
 
 <CodeBlockConfig filename="application.tfdeploy.hcl">
 
 ```hcl
 # Application Stack deployment configuration
 
-upstream_input "networking_stack" {
-  type = "Stack"
+upstream_input "network_stack" {
+  type = "stack"
   source = "app.terraform.io/hashicorp/Default Project/networking-stack"
 }
 
@@ -72,42 +72,42 @@ deployment "application" {
 
 </CodeBlockConfig>
 
-After pushing your Stack's configuration into HCP Terraform, HCP Terraform searches for the most recently published snapshot of the upstream Stack your configuration references. If no snapshot exists, the downstream Stack's run fails.
+After pushing your Stack's configuration into HCP Terraform, HCP Terraform searches for the most recently published snapshot of the upstream Stack your configuration references. If no snapshot exists, the downstream Stack's run fails.
 
 If HCP Terraform finds a published snapshot for your referenced upstream Stack, then all of that Stack's outputs are available to this downstream Stack. Add `upstream_input` blocks for every upstream Stack you want to reference. Learn more about the [`upstream_input` block](/terraform/language/stacks/reference/tfdeploy#upstream_input-block-configuration).
 
 
 ## Trigger runs when output values change
 
-If an upstream Stack's published output values change, HCP Terraform automatically triggers runs for any downstream Stacks that rely on those outputs.
+If an upstream Stack's published output values change, HCP Terraform automatically triggers runs for any downstream Stacks that rely on those outputs.
 
-In the following example, the `application` deployment depends on the upstream networking Stack.
+In the following example, the `application` deployment depends on the upstream networking Stack.
 
 <CodeBlockConfig filename="application.tfdeploy.hcl">
 
-```hcl
+```hcl
 # Application Stack deployment configuration
 
-upstream_input "network_stack" {
-  type = "Stack"
-  source = "app.terraform.io/hashicorp/Default Project/networking-stack"
+upstream_input "network_stack" {
+  type = "stack"
+  source = "app.terraform.io/hashicorp/Default Project/networking-stack"
 }
 
-deployment "application" {
-  inputs = {
-    vpc_id = upstream_input.network_stack.vpc_id
-  }
+deployment "application" {
+  inputs = {
+    vpc_id = upstream_input.network_stack.vpc_id
+  }
 }
 ```
 
 </CodeBlockConfig>
 
-The application Stack depends on the networking Stack’s output, so if the `vpc_id` changes then HCP Terraform triggers a new run for the application Stack. This approach allows you to decouple Stacks that have separate life cycles and ensures that updates in an upstream Stack propagate to downstream Stacks.
+The application Stack depends on the networking Stack’s output, so if the `vpc_id` changes then HCP Terraform triggers a new run for the application Stack. This approach allows you to decouple Stacks that have separate life cycles and ensures that updates in an upstream Stack propagate to downstream Stacks.
 
 ## Remove upstream Stack dependencies
 
 To stop depending on an upstream Stack’s outputs, do the following in your downstream Stack's deployment configuration:
 
-1. Remove the upstream Stack's `upstream_input` block
+1. Remove the upstream Stack's `upstream_input` block
 1. Remove any references to the upstream Stack's outputs
-1. Push your configuration changes to HCP Terraform and apply the new configuration
+1. Push your configuration changes to HCP Terraform and apply the new configuration
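To make the removal steps in the hunk above concrete, a downstream deployment configuration that no longer consumes the upstream output might end up looking like the following sketch. The literal VPC ID is purely illustrative:

```hcl
# application.tfdeploy.hcl -- after removing the dependency (illustrative)

# The upstream_input "network_stack" block is deleted, and the reference to
# upstream_input.network_stack.vpc_id is replaced with a locally supplied value.
deployment "application" {
  inputs = {
    vpc_id = "vpc-0abc123"
  }
}
```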

website/docs/language/stacks/reference/tfdeploy.mdx

Lines changed: 14 additions & 14 deletions
@@ -7,11 +7,11 @@ description: Stacks help you provision and coordinate your infrastructure lifecy
 
 A deployment configuration file defines how to deploy your Stack's infrastructure. Each Stack deployment runs in its agent, wholly isolated from other Stack deployments.
 
-Every Stack needs a deployment configuration file, `tfdeploy.hcl`, and this page describes all of the blocks you can use within a deployment configuration file. Note that none of the blocks in the deployment configuration file support [meta-arguments](/terraform/language/resources/syntax#meta-arguments).
+Every Stack needs a deployment configuration file, `tfdeploy.hcl`, and this page describes all of the blocks you can use within a deployment configuration file. Note that none of the blocks in the deployment configuration file support [meta-arguments](/terraform/language/resources/syntax#meta-arguments).
 
 ## `deployment` block configuration
 
-The `deployment` block is where you define how many times you want to deploy your Stack's infrastructure. Each Stack requires at least one `deployment` block, and you can add a new `deployment` block every time you want to deploy your Stack’s infrastructure again.
+The `deployment` block is where you define how many times you want to deploy your Stack's infrastructure. Each Stack requires at least one `deployment` block, and you can add a new `deployment` block every time you want to deploy your Stack’s infrastructure again.
 
 -> **Note**: HCP Terraform supports up to a maximum of 20 deployments.
 
@@ -43,7 +43,7 @@ Each Stack must have at least one `deployment` block, and the label of the `depl
 
 ### Reference
 
-For example, the following `deployment` block accepts inputs for variables named `aws_region` and `instance_count` and creates a new deployment in HCP Terraform named “production”.
+For example, the following `deployment` block accepts inputs for variables named `aws_region` and `instance_count` and creates a new deployment in HCP Terraform named “production”.
 
 ```hcl
 deployment "production" {
@@ -76,16 +76,16 @@ orchestrate "auto_approve" "name_of_check" {
 }
 ```
 
-The `orchestrate` block label includes the rule type and the rule name, which together must be unique within the Stack.
+The `orchestrate` block label includes the rule type and the rule name, which together must be unique within the Stack.
 
 There are two orchestration rules to choose from:
 
-* The `auto_approve` rule executes after a Stack creates a plan and automatically approves a plan if all checks pass.
+* The `auto_approve` rule executes after a Stack creates a plan and automatically approves a plan if all checks pass.
 * The `replan` rule executes after a Stack applies a plan, automatically triggering a replan if all the checks pass.
 
 HCP Terraform evaluates the `check` blocks within your `orchestrate` block to determine if it should approve a plan. If all of the checks pass, then HCP Terraform approves the plan for you. If one or more `conditions` do not pass, then HCP Terraform shows the `reason` why, and you must manually approve that plan.
 
-By default, each Stack has an `auto_approve` rule named `empty_plan`, which automatically approves a plan if it contains no changes.
+By default, each Stack has an `auto_approve` rule named `empty_plan`, which automatically approves a plan if it contains no changes.
 
 ### Specification
 
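For orientation, a `check` inside an `auto_approve` rule might look like the following sketch; the rule name, condition, and reason text are illustrative and not part of this diff:

```hcl
# Sketch: approve plans automatically unless they destroy anything (illustrative)
orchestrate "auto_approve" "no_destroys" {
  check {
    # The condition must evaluate to true for HCP Terraform to approve the plan.
    condition = context.plan.changes.remove == 0
    reason    = "Plans that destroy resources require manual review."
  }
}
```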
@@ -105,7 +105,7 @@ The `check` block contains the following configurable fields.
 
 ### Orchestration Context
 
-A `check` block’s `condition` field has access to a `context` variable, which includes information about the context of the current deployment plan. The `context` variable contains the following fields.
+A `check` block’s `condition` field has access to a `context` variable, which includes information about the context of the current deployment plan. The `context` variable contains the following fields.
 
 | Field | Description | Type |
 | :---- | :---- | :---- |
@@ -115,7 +115,7 @@ A `check` block’s `condition` field has access to a `context` variable, which
 | `errors` | A set of diagnostic error message objects. | set of objects |
 | `warnings` | A set of diagnostic warning message objects. | set of objects |
 
-The diagnostic message objects that the `context.errors` and `context.warnings` fields return includes the following information.
+The diagnostic message objects that the `context.errors` and `context.warnings` fields return includes the following information.
 
 | Field | Description | Type |
 | :---- | :---- | :---- |
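As a sketch of how the diagnostic fields above might feed a condition, assuming the `warnings` set can be measured with Terraform's standard `length` function; the rule name and reason text are illustrative:

```hcl
# Sketch: only auto-approve plans that produced no warnings (illustrative)
orchestrate "auto_approve" "no_warnings" {
  check {
    condition = length(context.warnings) == 0
    reason    = "Plans that raise warnings need a manual review before approval."
  }
}
```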
@@ -166,7 +166,7 @@ If nothing changes in the `component.pet` component, HCP Terraform automatically
 
 ## `identity_token` block configuration
 
-The `identity_token` block defines a JSON Web Token (JWT) that Terraform generates for a given deployment if that `deployment` block references an `identity_token` in its `inputs`.
+The `identity_token` block defines a JSON Web Token (JWT) that Terraform generates for a given deployment if that `deployment` block references an `identity_token` in its `inputs`.
 
 You can directly pass the token generated by the `identity_token` block to a provider's configuration for OIDC authentication. For more information on authenticating a Stack using OIDC, refer to [Authenticate a Stack](/terraform/language/stacks/deploy/authenticate).
 
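Since the prose describes referencing an `identity_token` from a deployment's `inputs`, here is a brief sketch of that pattern; the audience, role ARN, and input names are illustrative placeholders:

```hcl
# Sketch: generate a JWT and pass it to a deployment for OIDC authentication (illustrative)
identity_token "aws" {
  audience = ["aws.workload.identity"]
}

deployment "production" {
  inputs = {
    aws_region         = "us-west-1"
    aws_role           = "arn:aws:iam::123456789012:role/stacks-oidc-role"
    aws_identity_token = identity_token.aws.jwt
  }
}
```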
@@ -324,7 +324,7 @@ A local value assigns a name to an expression, so you can use the name multiple
 
 The `publish_output` block requires at least Terraform version `terraform_1.10.0-alpha20241009` or higher. Download [latest version of Terraform](https://releases.hashicorp.com/terraform/) to use the most up-to-date functionality.
 
-Specifies a value to export from your current Stack, which other Stacks in the same project can consume. Declare one `publish_output` block for each value to export your Stack.
+The `publish_output` block specifies a value to export from your current Stack, which other Stacks in the same project can consume. Declare one `publish_output` block for each value to export your Stack.
 
 ### Complete configuration
 
@@ -369,13 +369,13 @@ publish_output "vpc_id" {
 
 To learn more about passing information between Stacks, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/pass-data).
 
-## `upstream_input` block configuration
+## `upstream_input` block configuration
 
 The `upstream_input` block requires at least Terraform version `terraform_1.10.0-alpha20241009` or higher. Download [latest version of Terraform](https://releases.hashicorp.com/terraform/) to use the most up-to-date functionality.
 
-The `upstream_input` block specifies another Stack in the same project to consume outputs from. Declare an `upstream_input` block for each Stack you want to reference. If an output from a upstream Stack changes, HCP Terraform automatically triggers runs for any Stacks that depend on those outputs.
+The `upstream_input` block specifies another Stack in the same project to consume outputs from. Declare an `upstream_input` block for each Stack you want to reference. If an output from a upstream Stack changes, HCP Terraform automatically triggers runs for any Stacks that depend on those outputs.
 
-To learn more about passing information between Stacks, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).
+To learn more about passing information between Stacks, refer to [Pass data from one Stack to another](/terraform/language/stacks/deploy/link-stacks).
 
 ### Complete configuration
 
@@ -424,4 +424,4 @@ deployment "application" {
 
 </CodeBlockConfig>
 
-Your application Stack can now securely consume and use outputs from your network Stack. To learn more about passing information between Stacks, reference [Pass data from one Stack to another](/terraform/language/stacks/deploy/pass-data).
+Your application Stack can now securely consume and use outputs from your network Stack. To learn more about passing information between Stacks, reference [Pass data from one Stack to another](/terraform/language/stacks/deploy/pass-data).
