Commit 6b743b2

Merge branch 'main' into docs/spring-ai-integration
2 parents 658eb2f + cbea160 commit 6b743b2

138 files changed

Lines changed: 5765 additions & 1114 deletions


.github/workflows/docs-preview-links.yml

Lines changed: 9 additions & 19 deletions
```diff
@@ -26,26 +26,16 @@ jobs:
         with:
           node-version: '20'

-      - name: Compute preview base URL
+      - name: Get preview URL from Vercel comment
         id: preview-url
-        env:
-          REPO_PREVIEW_BASE_URL: ${{ vars.DOCS_PREVIEW_BASE_URL }}
-          REPO_PREVIEW_TEMPLATE: ${{ vars.DOCS_PREVIEW_BASE_URL_TEMPLATE }}
-        run: |
-          BRANCH_NAME="${GITHUB_HEAD_REF:-${GITHUB_REF_NAME}}"
-          BRANCH_SLUG=$(echo "$BRANCH_NAME" | tr '[:upper:]' '[:lower:]' | tr -d '_' | sed 's/[^a-z0-9]/-/g' | sed 's/-\+/-/g' | sed 's/^-//' | sed 's/-$//')
-
-          if [ -n "$REPO_PREVIEW_BASE_URL" ]; then
-            BASE_URL="$REPO_PREVIEW_BASE_URL"
-          elif [ -n "$REPO_PREVIEW_TEMPLATE" ]; then
-            BASE_URL="${REPO_PREVIEW_TEMPLATE//\{branch\}/$BRANCH_SLUG}"
-          else
-            BASE_URL="https://temporal-documentation-git-${BRANCH_SLUG}.preview.thundergun.io"
-          fi
-
-          echo "DOCS_PREVIEW_BASE_URL=$BASE_URL" >> "$GITHUB_ENV"
-          echo "BRANCH_SLUG=$BRANCH_SLUG" >> "$GITHUB_ENV"
-          echo "base_url=$BASE_URL" >> "$GITHUB_OUTPUT"
+        uses: actions/github-script@v7
+        with:
+          github-token: ${{ secrets.GITHUB_TOKEN }}
+          script: |
+            const { resolvePreviewUrl } = require('./bin/preview-url-from-vercel.js');
+            const baseUrl = await resolvePreviewUrl({ github, context, core });
+            core.exportVariable('DOCS_PREVIEW_BASE_URL', baseUrl);
+            core.setOutput('base_url', baseUrl);

       - name: Generate docs preview list
         env:
```

COMPONENTS.md

Lines changed: 15 additions & 0 deletions
````diff
@@ -137,6 +137,21 @@ Usage:

 Images are normally stored in the '/static' folder in `img` or `diagrams`.

+### Dark mode images
+
+To provide a separate image for dark mode, use the `srcDark` prop:
+
+```
+<CaptionedImage
+  src="/diagrams/my-diagram.svg"
+  srcDark="/diagrams/my-diagram-dark.svg"
+  title="My diagram"
+  alt="Description of the diagram"
+/>
+```
+
+When `srcDark` is provided, both images are rendered in the DOM and the browser loads both upfront. CSS toggles visibility based on the active theme, so switching between light and dark mode is instant with no loading delay. When `srcDark` is omitted, the component renders a single image as usual.
+
 ### Zooming images

 When images are complex and may not render in a readable fashion on normal monitors, you can enable a minimal form of zooming by setting the `zoom` prop to true:
````

bin/preview-url-from-vercel.js

Lines changed: 74 additions & 0 deletions
New file:

```javascript
#!/usr/bin/env node

const MAX_SUBDOMAIN_LENGTH = 63;
const MAX_ATTEMPTS = 18;
const INTERVAL_MS = 10_000;
const PREVIEW_DOMAIN = 'preview.thundergun.io';

function branchToSlug(branchName) {
  return branchName
    .toLowerCase()
    .replace(/_/g, '')
    .replace(/[^a-z0-9]/g, '-')
    .replace(/-+/g, '-')
    .replace(/^-/, '')
    .replace(/-$/, '');
}

function buildSubdomain(branchSlug) {
  return `temporal-documentation-git-${branchSlug}`;
}

function extractPreviewUrlFromComment(body) {
  const match = body.match(/\[vc\]:\s*#[^:]*:(eyJ[A-Za-z0-9+/=]+)/);
  if (!match) return null;

  const payload = JSON.parse(Buffer.from(match[1], 'base64').toString('utf8'));
  return payload.projects?.[0]?.previewUrl || null;
}

async function resolvePreviewUrl({ github, context, core }) {
  const branchName = process.env.GITHUB_HEAD_REF || process.env.GITHUB_REF_NAME;
  const branchSlug = branchToSlug(branchName);
  const subdomain = buildSubdomain(branchSlug);

  if (subdomain.length <= MAX_SUBDOMAIN_LENGTH) {
    const baseUrl = `https://${subdomain}.${PREVIEW_DOMAIN}`;
    core.info(`Branch name is short enough (${subdomain.length} chars), using constructed URL: ${baseUrl}`);
    return baseUrl;
  }

  core.info(`Subdomain would be ${subdomain.length} chars (exceeds ${MAX_SUBDOMAIN_LENGTH}), polling for Vercel comment...`);

  for (let attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
    core.info(`Polling for Vercel comment (attempt ${attempt}/${MAX_ATTEMPTS})...`);
    const { data: comments } = await github.rest.issues.listComments({
      owner: context.repo.owner,
      repo: context.repo.repo,
      issue_number: context.issue.number,
      per_page: 100,
    });

    const vercelComment = comments.find(
      (c) => c.user?.login === 'vercel[bot]' && c.body?.includes('[vc]:'),
    );

    if (vercelComment) {
      const previewUrl = extractPreviewUrlFromComment(vercelComment.body);
      if (previewUrl) {
        const baseUrl = `https://${previewUrl}`;
        core.info(`Found Vercel preview URL: ${baseUrl}`);
        return baseUrl;
      }
    }

    if (attempt < MAX_ATTEMPTS) {
      await new Promise((resolve) => setTimeout(resolve, INTERVAL_MS));
    }
  }

  core.warning('Vercel comment not found after polling, using constructed URL as fallback (links may be broken for this long branch name)');
  return `https://${subdomain}.${PREVIEW_DOMAIN}`;
}

module.exports = { resolvePreviewUrl, branchToSlug, buildSubdomain, extractPreviewUrlFromComment };
```
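The slug and comment-parsing helpers above can be exercised without GitHub or Vercel in the loop. A minimal sketch (the two functions are inlined from the module so the snippet is self-contained; the branch name and preview URL are made-up examples):

```javascript
// Mirrors branchToSlug from bin/preview-url-from-vercel.js: lowercase,
// drop underscores, collapse every other non-alphanumeric run to a hyphen.
function branchToSlug(branchName) {
  return branchName
    .toLowerCase()
    .replace(/_/g, '')
    .replace(/[^a-z0-9]/g, '-')
    .replace(/-+/g, '-')
    .replace(/^-/, '')
    .replace(/-$/, '');
}

// Mirrors extractPreviewUrlFromComment: the Vercel bot comment embeds a
// base64-encoded JSON payload after a "[vc]: #<id>:" marker ("eyJ" is how
// base64 of a JSON object beginning with '{"' always starts).
function extractPreviewUrlFromComment(body) {
  const match = body.match(/\[vc\]:\s*#[^:]*:(eyJ[A-Za-z0-9+/=]+)/);
  if (!match) return null;
  const payload = JSON.parse(Buffer.from(match[1], 'base64').toString('utf8'));
  return payload.projects?.[0]?.previewUrl || null;
}

const slug = branchToSlug('docs/Spring_AI-Integration');
console.log(slug); // "docs-springai-integration"

// Simulate a Vercel bot comment body and pull the preview URL back out.
const fake = `[vc]: #id:${Buffer.from(
  JSON.stringify({ projects: [{ previewUrl: 'example-preview.vercel.app' }] }),
).toString('base64')}`;
console.log(extractPreviewUrlFromComment(fake)); // "example-preview.vercel.app"
```

Note the round-trip in the second check: any change to the Vercel comment format (the `[vc]:` marker or the payload shape) would surface here before it broke the workflow.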

docs/best-practices/cost-optimization.mdx

Lines changed: 5 additions & 5 deletions
```diff
@@ -62,7 +62,7 @@ See [Spooky Stories: Chilling Temporal Anti-Patterns](https://temporal.io/blog/s
 ### Large payloads in Workflow History

 Passing multi-megabyte payloads through Workflows when external storage (S3, blob storage) is more appropriate.
-Use [compression](/troubleshooting/blob-size-limit-error#why-does-this-error-occur) or the [claim check pattern](https://dataengineering.wiki/Concepts/Software+Engineering/Claim+Check+Pattern) for large data.
+Use [compression](/troubleshooting/blob-size-limit-error#payload-size-limit) or the [claim check pattern](https://dataengineering.wiki/Concepts/Software+Engineering/Claim+Check+Pattern) for large data.

 ### Over-optimization at the expense of observability

@@ -137,7 +137,7 @@ For detailed discussion of this tradeoff, see [How many Activities should I use
 ### Child Workflows vs Activities

-[Child Workflows cost 2 Actions](/cloud/actions#child-workflows) compared to an Activity's 1 Action.
+[Child Workflows cost 2 Actions](/cloud/actions#workflow) compared to an Activity's 1 Action.
 See [Child Workflows documentation](/child-workflows) for detailed comparison of capabilities and use cases.

 ### Retry Policies
@@ -165,7 +165,7 @@ Refer to this blog post on [Mastering Workflow retry logic for resilient applica
 ### Local Activities

 A [Local Activity](/local-activity#local-activity) is an Activity Execution that executes in the same process as the Workflow Execution that spawns it.
-Therefore, multiple Local Activities that run back-to-back only [count as a single billable action](/cloud/actions#activities), whereas each regular Activity counts as a billable action.
+Therefore, multiple Local Activities that run back-to-back only [count as a single billable action](/cloud/actions#activity), whereas each regular Activity counts as a billable action.
 However, there are tradeoffs to converting regular Activities to Local Activities.
 For example, if a specific Local Activity fails, *all* of them will be retried together.
 Review [the docs](/local-activity) or reach out to your account team to learn more.
@@ -189,7 +189,7 @@ Use Regular Activities instead of Local Activities if you require any of the fol
 2. For Search Attributes that must be updated during Workflow Execution, each `UpsertSearchAttributes` call counts as 1 Action regardless of how many attributes are updated.
    Batch multiple related attribute updates into single operations to reduce Actions consumed.

-See the [Temporal Cloud Action Documentation](/cloud/actions#workflows) for details.
+See the [Temporal Cloud Action Documentation](/cloud/actions#workflow) for details.

 #### Signal handling

@@ -252,7 +252,7 @@ Alternatively, if you are looking to do analysis on closed Workflow Executions,
 ### Validation approach

 1. **Test in non-production**: Validate functional correctness before production deployment
-2. **Monitor comprehensively**: Leverage the [Usage dashboard](/cloud/actions#usage) in the Cloud UI to track the impact on Actions and Storage after optimizations are made
+2. **Monitor comprehensively**: Leverage the [Usage dashboard](/cloud/actions-usage#usage) in the Cloud UI to track the impact on Actions and Storage after optimizations are made
 3. **Progressive rollout**: Deploy to a small percentage, validate, then expand. Review the [Worker Versioning documentation](/production-deployment/worker-deployments/worker-versioning) to learn about rolling out changes to Workflows
 4. **Continuous review**: Re-evaluate optimization effectiveness quarterly as system evolves
```