Releases: aws-powertools/powertools-lambda-typescript
v1.16.0
Summary
In this minor release we are adding support for two new environment variables to configure the log level in Logger.
You can now configure the log level for Logger using two new environment variables: AWS_LAMBDA_LOG_LEVEL and POWERTOOLS_LOG_LEVEL. The new environment variables work alongside the existing LOG_LEVEL variable, which is now considered legacy and will be removed in the future.
Setting the log level now follows this order of precedence:
1. AWS_LAMBDA_LOG_LEVEL environment variable
2. Setting the log level in code using the logLevel constructor option, or by calling the logger.setLogLevel() method
3. POWERTOOLS_LOG_LEVEL environment variable
We have also added a new section to the docs to highlight the new behavior.
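For example, here is a minimal sketch of setting the log level in code (the service name and levels are illustrative); if AWS_LAMBDA_LOG_LEVEL is set on the function, it still takes precedence:

import { Logger } from '@aws-lambda-powertools/logger';

// Log level set in code via the logLevel constructor option
const logger = new Logger({ serviceName: 'serverlessAirline', logLevel: 'WARN' });

export const handler = async (): Promise<void> => {
  logger.debug('not printed, the level is WARN');
  logger.warn('printed');

  // The level can also be changed at runtime
  logger.setLogLevel('DEBUG');
  logger.debug('now printed');
};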
Changes
🌟New features and non-breaking changes
- feat(logger): add support for AWS_LAMBDA_LOG_LEVEL and POWERTOOLS_LOG_LEVEL (#1795) by @dreamorosi
📜 Documentation updates
- feat(logger): add support for AWS_LAMBDA_LOG_LEVEL and POWERTOOLS_LOG_LEVEL (#1795) by @dreamorosi
This release was made possible by the following contributors:
v1.15.0
Summary
This release brings support for the new Node.js 20 AWS Lambda managed runtime, as well as a change to how the Metrics utility emits logs under the hood.
Node.js 20 support
With this release we are excited to announce that Powertools for AWS Lambda (TypeScript) is compatible with the nodejs20.x AWS Lambda managed runtime 🎉.
The toolkit and our public Lambda Layers are both compatible with the new runtime and no code change should be required on your part.
Metrics
The Metrics utility emits logs using the Embedded Metric Format (EMF). Prior to this release, the logs were emitted using the global console object. This meant that, in addition to the payload of the log, AWS Lambda added the request ID and timestamp to each log line.
For most customers, and especially those who consume the metrics in Amazon CloudWatch, this is fine as CloudWatch is able to parse the EMF content and create custom metrics. For customers who instead want to send the metrics to third-party observability providers, the presence of these strings means an extra parsing step.
To support these use cases, and to align with the behavior of the Logger utility, the Metrics utility now uses a dedicated instance of the Console object, which allows it to emit only the content of the EMF metric. Just like for the Logger, this behavior can be reverted for development environments by setting the POWERTOOLS_DEV environment variable to a truthy value (i.e. true, yes, 1, on, etc.).
When POWERTOOLS_DEV is enabled, the Metrics utility reverts to using the global console object. This allows customers to place mocks and spies and optionally override the implementation for testing purposes.
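As a minimal sketch (the namespace, service name, and metric are illustrative, and the Jest spy is only one way to do this), this is how the two modes differ in practice:

import { Metrics, MetricUnits } from '@aws-lambda-powertools/metrics';

const metrics = new Metrics({ namespace: 'serverlessAirline', serviceName: 'orders' });

export const handler = async (): Promise<void> => {
  metrics.addMetric('successfulBooking', MetricUnits.Count, 1);
  // With POWERTOOLS_DEV unset, the EMF blob is written to stdout via the dedicated Console instance;
  // with POWERTOOLS_DEV=true it goes through the global console, so a test can spy on it, e.g.:
  //   jest.spyOn(console, 'log').mockImplementation();
  metrics.publishStoredMetrics();
};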
Changes
🌟New features and non-breaking changes
- feat(maintenance): add support for nodejs20.x runtime (#1790) by @dreamorosi
- feat(metrics): log directly to stdout (#1786) by @dreamorosi
🔧 Maintenance
- feat(maintenance): add support for nodejs20.x runtime (#1790) by @dreamorosi
This release was made possible by the following contributors:
v1.14.2
Summary
In this patch release we are fixing a bug that affected the Metrics utility.
When using the utility you can set default dimensions that will be added to every metric emitted by your application.
Before this release, when setting a dimension using an existing key, the emitted EMF blob would contain duplicate keys. This release fixes the bug and makes sure that keys are deduplicated correctly.
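For example, a minimal sketch (dimension names and values are illustrative) of setting a default dimension twice with the same key, which now results in a single, deduplicated key in the emitted EMF blob:

import { Metrics, MetricUnits } from '@aws-lambda-powertools/metrics';

const metrics = new Metrics({ namespace: 'serverlessAirline', serviceName: 'orders' });

// Set a default dimension, then set it again using the same key
metrics.setDefaultDimensions({ environment: 'dev' });
metrics.setDefaultDimensions({ environment: 'prod' });

export const handler = async (): Promise<void> => {
  metrics.addMetric('successfulBooking', MetricUnits.Count, 1);
  // The EMF output now contains a single "environment" dimension key
  metrics.publishStoredMetrics();
};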
Additionally, we have improved our Gitpod configuration, which should make it easier for contributors to get up and running.
Changes
📜 Documentation updates
- docs(maintenance): add clarification about async decorators (#1778) by @dreamorosi
🐛 Bug and hot fixes
🔧 Maintenance
- chore(maintenance): improve Gitpod config (#1782) by @dreamorosi
This release was made possible by the following contributors:
v1.14.1
Summary
In this patch release we have improved the logic around creating AWS SDK clients in the Parameters and Idempotency utilities, and improved our documentation with sections dedicated to how to contribute and how we manage the project.
Idempotency & Parameters
Both the Idempotency and Parameters utilities allow you to bring your own AWS SDK client to interact with AWS APIs. This is useful when there's a need to customize the SDK client or share an existing one already used in other parts of the function. Prior to this release, both utilities were instantiating a new AWS SDK client by default, only to then replace it with the customer-provided one. In these cases, we were needlessly instantiating a client, leading to wasted cycles.
Starting from this version, both utilities include refactored logic that instantiates a new SDK client only when a valid one is not provided by the customer. This way, customers bringing their own client don't pay the performance hit of instantiating multiple clients.
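For example, a minimal sketch (assuming the awsSdkV3Client option documented for both utilities; the table and region are illustrative) of bringing your own client to the Idempotency persistence layer:

import { DynamoDBClient } from '@aws-sdk/client-dynamodb';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';

// Reuse a client already configured elsewhere in your function
const ddbClient = new DynamoDBClient({ region: 'eu-west-1' });

// With this release, no default client is instantiated when a valid one is provided
const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTable',
  awsSdkV3Client: ddbClient,
});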
Documentation
As part of this release we have also added a new section to our documentation dedicated to explaining our processes. The section includes our roadmap, the maintainers' handbook, and a few sections dedicated to contributing. These sections are designed to be a companion to the contributing guidelines, which we also refreshed to make them more focused, and provide a deeper look at specific areas like setting up your development environment, finding your first contribution, the project's conventions, and testing.
Changes
- chore(idempotency): refactor aws sdk init logic (#1768) by @dreamorosi
- chore(commons): update Powertools UA middleware detection (#1762) by @dreamorosi
- chore(parameters): refactor provider constructor to init client as needed (#1757) by @dreamorosi
- chore(ci): add workflow to publish v2 docs on merge (#1756) by @dreamorosi
- chore(docs): add invisible unicode char to decorator docstrings (#1755) by @dreamorosi
- chore(maintenance): set removeComments to false in tsconfig.json (#1754) by @dreamorosi
- chore(tracer): update warning to better format segment name (#1750) by @dreamorosi
- chore(ci): update v2 release workflow (#1745) by @dreamorosi
📜 Documentation updates
- docs(maintenance): add processes tab (#1747) by @dreamorosi
- chore(docs): update docs url in comments & readme files (#1728) by @dreamorosi
🔧 Maintenance
- build(maintenance): bump @aws-sdk/* dev dependencies (#1771) by @dreamorosi
- build(tracer): bump aws-xray-sdk-core to latest (#1769) by @dreamorosi
- chore(internal): remove outdated notice files (#1752) by @dreamorosi
- chore(maintenance): bump @babel/traverse from 7.22.19 to 7.23.2 (#1748) by @dependabot
- docs(maintenance): add processes tab (#1747) by @dreamorosi
- chore(docs): update docs url in comments & readme files (#1728) by @dreamorosi
This release was made possible by the following contributors:
v1.14.0
Summary
This release brings all the generally available utilities to the Lambda Layers, improves the Idempotency utility with the addition of a new @idempotent class method decorator, and makes Tracer more reliable.
Lambda Layers
Starting from version 21, which corresponds to this release, our Lambda Layer includes the Idempotency, Parameters, and Batch Processing utilities. The layer comes with its own reduced copy of the AWS SDK for JavaScript v3 clients so you can easily attach it to Lambda functions running on Node.js 16 without having to bundle the SDK.
The layers are available in most commercial AWS Regions; go here to learn more about how to use them and find the ARN for your region.
Idempotency
If you use decorators, you can now make your class methods idempotent thanks to the new @idempotent decorator.
You can use the decorator on your Lambda handler, as shown in the sketch below, or on any method that returns a response. This is useful when you want to make a specific part of your code idempotent, for example when your Lambda handler performs multiple side effects and you only want to make part of it safe to retry.
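A minimal sketch of the decorator applied to a handler method (the table name and return value are illustrative):

import { idempotent } from '@aws-lambda-powertools/idempotency';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import type { Context } from 'aws-lambda';

const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTable',
});

class Lambda {
  @idempotent({ persistenceStore })
  public async handler(event: Record<string, unknown>, _context: Context): Promise<string> {
    // ... perform side effects here; retries with the same payload return the stored result
    return 'success';
  }
}

const lambda = new Lambda();
export const handler = lambda.handler.bind(lambda);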
Tracer
When segments generated by your code are about to be sent to the AWS X-Ray daemon, the AWS X-Ray SDK for Node.js serializes them into a JSON string. If the segment contains exotic objects like BigInt, Set, or Map in the metadata, the serialization can throw an error because the native JSON.stringify() function doesn't know how to handle these objects.
To guard against this type of runtime error, we have wrapped the branches of code in Tracer where this issue could happen in try/catch logic. Now, when an error is thrown during the serialization of a segment within Tracer, we catch it and log a warning instead.
We are also working with the X-Ray team to add a replacer function to the serialization logic directly in the X-Ray SDK so that the issue can be better mitigated.
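For instance, a minimal sketch (the metadata key and value are illustrative, and the middleware import follows the v1.x package layout) of metadata that previously could make the serialization throw; Tracer now catches the error and logs a warning instead:

import middy from '@middy/core';
import { Tracer, captureLambdaHandler } from '@aws-lambda-powertools/tracer';

const tracer = new Tracer({ serviceName: 'serverlessAirline' });

export const handler = middy(async (): Promise<void> => {
  // JSON.stringify() cannot serialize BigInt values natively, so this metadata
  // could previously cause an error when the segment was serialized
  tracer.putMetadata('bookingReference', { id: BigInt(9007199254740991) });
}).use(captureLambdaHandler(tracer));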
Acknowledgements
Congratulations to @HaaLeo and @KhurramJalil for having your first contribution to the project, thank you for helping make Powertools better for everyone 🎉
Note
We have officially started working on the next major version of Powertools for AWS Lambda (TypeScript) 🔥 We have published a Request For Comment (RFC) that details most of the changes that we have planned, and in the coming weeks we'll work on an upgrade guide. We would love to hear what you think about our plan and any concerns you might have.
Changes
- chore(ci): create v2 alpha release workflow (#1719) by @dreamorosi
- chore(layers): add Idempotency, Batch, and Parameters to layer (#1712) by @am29d
🌟New features and non-breaking changes
- feat(idempotency): add idempotency decorator (#1723) by @am29d
- feat(tracer): add try/catch logic to decorator and middleware close (#1716) by @dreamorosi
- feat(layers): add arm64 to integration test matrix (#1720) by @dreamorosi
🌟 Minor Changes
- improv(commons): add number key type to JSONObject type (#1715) by @KhurramJalil
- refactor(commons): aws sdk util + identity check (#1708) by @dreamorosi
📜 Documentation updates
- docs(idempotency): address comments (#1724) by @am29d
- feat(idempotency): add idempotency decorator (#1723) by @am29d
- docs: typo for serverless framework (#1701) by @HaaLeo
🔧 Maintenance
- docs(idempotency): address comments (#1724) by @am29d
- feat(idempotency): add idempotency decorator (#1723) by @am29d
- chore(layers): bundle assets from source when testing (#1710) by @dreamorosi
- refactor(commons): aws sdk util + identity check (#1708) by @dreamorosi
This release was made possible by the following contributors:
v1.13.1
Summary
In this patch release we have removed the upper bound for @middy/core 4.x in our peerDependencies. We only support Middy 3.x, and in the last release we added a peerDependencies section to make that explicit. However, some Powertools users already had Middy 4.x in their dependencies and the recent change broke their builds. We have now removed the upper bound to give you the freedom to use @middy/core 4.x, though we do not support it yet. Given the recent requests, we will address this soon to bring Middy 4.x support earlier than we anticipated.
Changes
- chore: bump version in commons for user agent (#1698) by @am29d
- chore(tracer): mark createTracer helper function as deprecated (#1692) by @dreamorosi
📜 Documentation updates
- docs(tracer): update annotation & metadata docs to include full code (#1704) by @dreamorosi
- docs: remove beta warning from batch and idempotency readme (#1696) by @am29d
- docs: upgrade mkdocs to fix dark mode custom highlight style (#1695) by @am29d
🐛 Bug and hot fixes
- fix(maintenance): remove upper peer dependency Middy (#1705) by @dreamorosi
🔧 Maintenance
- fix(maintenance): remove upper peer dependency Middy (#1705) by @dreamorosi
- docs: remove beta warning from batch and idempotency readme (#1696) by @am29d
This release was made possible by the following contributors:
@am29d, @dreamorosi, Alexander Melnyk, Alexander Schueren and Release bot[bot]
v1.13.0
Summary
In this release we are excited to announce the General Availability of two utilities: Idempotency and Batch.
Batch
Warning
Breaking Change
We have introduced a breaking change in the Batch utility. We initially followed the Python implementation for this utility, and after multiple reviews we realised that the choice between async and sync processors is different in the Node.js ecosystem compared to Python. Async functions are often preferred over synchronous ones, so we renamed AsyncBatchProcessor to BatchProcessor, making it also the default choice. When you need to process a batch synchronously (i.e. SQS FIFO), use the explicit BatchProcessorSync processor, as shown in the sketch below. We have added a dedicated section in the documentation to clarify the implications and when to pick the right processor.
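As a minimal sketch (the record handling logic is illustrative, and the matching processPartialResponseSync helper is assumed), processing an SQS FIFO batch synchronously now looks like this:

import {
  BatchProcessorSync,
  EventType,
  processPartialResponseSync,
} from '@aws-lambda-powertools/batch';
import type { SQSBatchResponse, SQSEvent, SQSRecord } from 'aws-lambda';

// Use the synchronous processor when records must be handled one at a time, e.g. SQS FIFO
const processor = new BatchProcessorSync(EventType.SQS);

const recordHandler = (record: SQSRecord): void => {
  // ... process your record; throwing marks it as a batch item failure
  console.log(record.body);
};

export const handler = async (event: SQSEvent): Promise<SQSBatchResponse> =>
  processPartialResponseSync(event, recordHandler, processor);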
Idempotency
We have improved the docs based on the customer feedback we have received over the months. You now have more details on how to implement your own persistence store. In addition, Batch and Idempotency are often used together, so we added a section on how to integrate Batch with Idempotency; here is an example:
import {
BatchProcessor,
EventType,
processPartialResponse,
} from '@aws-lambda-powertools/batch';
import type {
Context,
SQSBatchResponse,
SQSEvent,
SQSRecord,
} from 'aws-lambda';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import {
IdempotencyConfig,
makeIdempotent,
} from '@aws-lambda-powertools/idempotency';
const processor = new BatchProcessor(EventType.SQS);
const dynamoDBPersistence = new DynamoDBPersistenceLayer({
tableName: 'idempotencyTable',
});
const idempotencyConfig = new IdempotencyConfig({
eventKeyJmesPath: 'messageId',
});
const processIdempotently = makeIdempotent(
async (_record: SQSRecord) => {
// process your event
},
{
persistenceStore: dynamoDBPersistence,
config: idempotencyConfig,
}
);
export const handler = async (
event: SQSEvent,
context: Context
): Promise<SQSBatchResponse> => {
idempotencyConfig.registerLambdaContext(context);
return processPartialResponse(event, processIdempotently, processor, {
context,
});
};
Layers
Welcome Tel Aviv! We have added il-central-1 to our regions and the layer is now available in this region. Of course, we bumped the version to the same number, so you only have to change the region in your ARN:
arn:aws:lambda:il-central-1:094274105915:layer:AWSLambdaPowertoolsTypeScript:19
Acknowledgements
A big thanks again to @erikayao93 for the work on the Batch Processing utility that goes GA in this release, we appreciate your work 🎉
Changes
- chore(ci): pin npm 9.x in workflows (#1666) by @dreamorosi
- chore(ci): bump aws-actions/configure-aws-credentials action (#1663) by @dreamorosi
- chore(logger): extract logger creation into protected method (#1646) by @dreamorosi
- chore(ci): fix post release workflow (#1638) by @dreamorosi
🌟New features and breaking changes
🌟 Minor Changes
- improv(idempotency): expose record status & expiry config + make DynamoDB Client optional (#1679) by @dreamorosi
- improv(batch): improve errors (#1648) by @dreamorosi
- improv(idempotency): consolidate internal implementation (#1642) by @dreamorosi
- improv(tests): adopt aws cdk cli lib (#1633) by @dreamorosi
📜 Documentation updates
- feat(batch): rename AsyncBatchProcessor to default BatchProcessor (#1683) by @am29d
- chore(maintenance): bump dependencies + apply peerDependencies (#1685) by @dreamorosi
- docs(idempotency): bring your own persistent store (#1681) by @dreamorosi
- improv(idempotency): expose record status & expiry config + make DynamoDB Client optional (#1679) by @dreamorosi
- docs(idempotency): add batch integration to idempotency docs (#1676) by @am29d
- docs(batch): add section on how to trace batch processing (#1673) by @dreamorosi
- chore(maintenance): adopt TypeScript 5.x (#1672) by @dreamorosi
- chore(internal): revisit tsconfig files (#1667) by @dreamorosi
- chore(docs): improve idempotency documentation (#1655) by @am29d
- chore(ci): add il-central-1 to layer deployment pipeline (#1656) by @am29d
- chore(docs): added tecRacer GmbH & Co. KG to project supporters (#1653) by @dreamorosi
- docs(batch): added flow charts (#1640) by @dreamorosi
- chore(docs): update roadmap, labels, and issue templates (#1632) by @dreamorosi
🐛 Bug and hot fixes
- fix(parameters): return type when options without transform is used (#1671) by @dreamorosi
- fix(batch): Update processor to pass only context to handler (#1637) by @erikayao93
🔧 Maintenance
- feat(batch): rename AsyncBatchProcessor to default BatchProcessor (#1683) by @am29d
- chore(maintenance): bump dependencies + apply peerDependencies (#1685) by @dreamorosi
- docs(idempotency): bring your own persistent store (#1681) by @dreamorosi
- improv(idempotency): expose record status & expiry config + make DynamoDB Client optional (#1679) by @dreamorosi
- chore(maintenance): adopt TypeScript 5.x (#1672) by @dreamorosi
- chore(internal): revisit tsconfig files (#1667) by @dreamorosi
- chore(tests): address integration test flakiness (#1669) by @dreamorosi
- chore(ci): move e2e utils into testing (#1661) by @dreamorosi
- chore(docs): added tecRacer GmbH & Co. KG to project supporters (#1653) by @dreamorosi
- chore(ci): add project config to issue templates (#1652) by @dreamorosi
- improv(tests): adopt aws cdk cli lib (#1633) by @dreamorosi
- chore(docs): update roadmap, labels, and issue templates (#1632) by @dreamorosi
This release was made possible by the following contributors:
@am29d, @dreamorosi, @erikayao93, @sthulb and Release bot[bot]
v1.12.1
Summary
This release brings another new utility to Powertools for AWS Lambda (TypeScript): introducing the Batch Processing utility ✨ The release also improves the Logger utility, which can now include the cause field in error logs.
Batch Processing Beta
Warning
This utility is currently released as a beta developer preview and is intended strictly for feedback and testing purposes and not for production workloads. This version and all future versions tagged with the -beta suffix should be treated as not stable. Up until the General Availability release we might introduce significant breaking changes and improvements in response to customers' feedback.
The batch processing utility handles partial failures when processing batches from Amazon SQS, Amazon Kinesis Data Streams, and Amazon DynamoDB Streams.
Key Features
- Reports batch item failures to reduce number of retries for a record upon errors
- Simple interface to process each batch record
- Build your own batch processor by extending primitives
Problem Statement
When using SQS, Kinesis Data Streams, or DynamoDB Streams as a Lambda event source, your Lambda functions are triggered with a batch of messages.
If your function fails to process any message from the batch, the entire batch returns to your queue or stream. This same batch is then retried until one of the following conditions happens first: a) your Lambda function returns a successful response, b) the record reaches the maximum retry attempts, or c) the records expire.
With this utility, batch records are processed individually – only messages that failed to be processed return to the queue or stream for a further retry.
Getting Started
To get started, install the utility by running:
npm install @aws-lambda-powertools/batch
Then, define a record handler function:
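A minimal sketch of a record handler, assuming an SQS event source (the processing logic is illustrative):

import type { SQSRecord } from 'aws-lambda';

const recordHandler = (record: SQSRecord): void => {
  const payload = record.body;
  if (payload) {
    // ... process the item; throwing here reports the record as a batch item failure
    console.log(JSON.parse(payload));
  }
};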
This function will be called by the Batch Processing utility for each record in the batch. If the function throws an error, the record will be marked as failed and reported once the main handler returns.
Record handlers can be both synchronous and asynchronous; in the latter case the utility will process all the records of your batch concurrently. To learn more about when it's safe to use async handlers, check the dedicated section in our docs.
SQS Processor
When using SQS as a Lambda event source, you can specify EventType.SQS to process the records. The response will be an SQSBatchResponse which contains a list of items that failed to be processed.
To learn more about this mode, as well as how to process SQS FIFO queues, check the docs.
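Putting it together, a minimal sketch of the SQS mode (the record handler is the same illustrative one defined above):

import {
  BatchProcessor,
  EventType,
  processPartialResponse,
} from '@aws-lambda-powertools/batch';
import type { Context, SQSBatchResponse, SQSEvent, SQSRecord } from 'aws-lambda';

const processor = new BatchProcessor(EventType.SQS);

const recordHandler = (record: SQSRecord): void => {
  // ... process the record
  console.log(record.body);
};

export const handler = async (
  event: SQSEvent,
  context: Context
): Promise<SQSBatchResponse> => {
  // Only the records that failed are reported back, so only those are retried
  return processPartialResponse(event, recordHandler, processor, { context });
};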
Kinesis Processor
When using Kinesis Data Streams as a Lambda event source, you can specify EventType.KinesisDataStreams to process the records. The response will be a KinesisStreamBatchResponse which contains a list of items that failed to be processed.
Learn more on the docs.
DynamoDB Stream Processor
When using DynamoDB Streams as a Lambda event source, you can use the BatchProcessor with EventType.DynamoDBStreams to process the records. The response will be a DynamoDBBatchResponse which contains a list of items that failed to be processed.
Check the docs to learn more about this processor.
Logger
Starting from this release, when logging an error with the logger.error() method, the Logger utility will include the cause field as part of the JSON-formatted log entry:
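A minimal sketch (the error messages are illustrative) of wrapping an error with a cause and logging it:

import { Logger } from '@aws-lambda-powertools/logger';

const logger = new Logger({ serviceName: 'serverlessAirline' });

export const handler = async (): Promise<void> => {
  try {
    throw new Error('Connection reset');
  } catch (error) {
    // Preserve the original error via the cause option (Node.js >= 16.9.0)
    const wrappedError = new Error('Unable to complete the booking', { cause: error });
    // The JSON-formatted log entry now includes the cause field alongside message and stack
    logger.error('Error while processing the request', wrappedError);
  }
};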
The cause field is available in the Error class starting from Node.js v16.9.0 and allows you to specify the error that caused the one being thrown. This is useful when you are catching an error and throwing your own, but still want to preserve the original cause of the error.
Acknowledgements
Congratulations and a big thank you to @erikayao93 for the work on the new Batch Processing utility 🎉
Changes
- chore(ci): add batch to build (#1630) by @dreamorosi
- chore(commons): bump version prior to release (#1628) by @dreamorosi
- chore(ci): fix canary deploy in ci with correct workspace name (#1601) by @am29d
🌟New features and non-breaking changes
- feat(batch): add batch processing utility (#1625) by @dreamorosi
- feat(logger): add cause field to formatted error (#1617) by @dreamorosi
- feat(batch): Implementation of base batch processing classes (#1588) by @erikayao93
📜 Documentation updates
- docs(parameters): add parameters examples cdk and sam (#1622) by @am29d
- feat(batch): add batch processing utility (#1625) by @dreamorosi
- feat(batch): Implementation of base batch processing classes (#1588) by @erikayao93
🔧 Maintenance
- docs(parameters): add parameters examples cdk and sam (#1622) by @am29d
- chore(maintenance): bump word-wrap from 1.2.3 to 1.2.4 (#1618) by @dependabot
- chore(ci): restore dependencies & fix e2e tests (#1615) by @dreamorosi
- chore(maintenance): remove vm2 from package-lock.json (#1613) by @am29d
- chore(maintenance): remove proxy-agent from dependencies (#1611) by @am29d
- feat(batch): Implementation of base batch processing classes (#1588) by @erikayao93
This release was made possible by the following contributors:
v1.11.1
Summary
In this release we are excited to announce the developer beta for the new Idempotency utility 🎉. This new utility allows you to make your functions idempotent so that multiple invocations with the same payload return the same result without re-executing side effects.
Idempotency Beta
Warning
This utility is currently released as a beta developer preview and is intended strictly for feedback and testing purposes and not for production workloads. This version and all future versions tagged with the -beta suffix should be treated as not stable. Up until the General Availability release we might introduce significant breaking changes and improvements in response to customers' feedback.
Key features
- Prevent Lambda handler from executing more than once on the same event payload during a time window
- Ensure Lambda handler returns the same result when called with the same payload
- Select a subset of the event as the idempotency key using JMESPath expressions
- Set a time window in which records with the same payload should be considered duplicates
- Expire in-progress executions if the Lambda function times out halfway through
Persistence store
Before getting started, you need to create a persistent storage layer where the Idempotency utility can store its state - your Lambda functions will need read and write access to it. As of now, Amazon DynamoDB is the only supported persistent storage layer, so you'll need to create a table first. Check the documentation for more information on the persistence layer. Then you configure the persistence layer with your table:
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
const persistenceStore = new DynamoDBPersistenceLayer({
tableName: 'idempotencyTableName', // <-- create this table before
});
MakeIdempotent function wrapper
You can quickly start by initializing the DynamoDBPersistenceLayer class and using it with the makeIdempotent function wrapper on your Lambda handler.
import { randomUUID } from 'node:crypto';
import { makeIdempotent } from '@aws-lambda-powertools/idempotency';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import type { Context } from 'aws-lambda';
import type { Request, Response, SubscriptionResult } from './types';
const persistenceStore = new DynamoDBPersistenceLayer({
tableName: 'idempotencyTableName',
});
const createSubscriptionPayment = async (
event: Request
): Promise<SubscriptionResult> => {
// ... create payment
return {
id: randomUUID(),
productId: event.productId,
};
};
export const handler = makeIdempotent(
async (event: Request, _context: Context): Promise<Response> => {
try {
const payment = await createSubscriptionPayment(event);
return {
paymentId: payment.id,
message: 'success',
statusCode: 200,
};
} catch (error) {
throw new Error('Error creating payment');
}
},
{
persistenceStore,
}
);
Middy middleware
If you are using Middy as your middleware engine, you can use the makeHandlerIdempotent middleware to make your Lambda handler idempotent.
import { randomUUID } from 'node:crypto';
import { makeHandlerIdempotent } from '@aws-lambda-powertools/idempotency/middleware';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';
import middy from '@middy/core';
import type { Context } from 'aws-lambda';
import type { Request, Response, SubscriptionResult } from './types';
const persistenceStore = new DynamoDBPersistenceLayer({
tableName: 'idempotencyTableName',
});
const createSubscriptionPayment = async (
event: Request
): Promise<SubscriptionResult> => {
// ... create payment
return {
id: randomUUID(),
productId: event.productId,
};
};
export const handler = middy(
async (event: Request, _context: Context): Promise<Response> => {
try {
const payment = await createSubscriptionPayment(event);
return {
paymentId: payment.id,
message: 'success',
statusCode: 200,
};
} catch (error) {
throw new Error('Error creating payment');
}
}
).use(
makeHandlerIdempotent({
persistenceStore,
})
);
Customization
Similar to the Powertools for AWS Lambda (Python) implementation, we have created many options for you to customize the persistence store and the idempotency behavior, such as the idempotency key, record expiration, hash function, table attributes, local cache, payload validation, and more. You can also bring your own JavaScript SDK v3 client or pass only client options for the utility to use. See our documentation for more information.
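As a minimal sketch (the option values are illustrative), customizing the idempotency key, record expiration, and local cache looks like this:

import { IdempotencyConfig, makeIdempotent } from '@aws-lambda-powertools/idempotency';
import { DynamoDBPersistenceLayer } from '@aws-lambda-powertools/idempotency/dynamodb';

const persistenceStore = new DynamoDBPersistenceLayer({
  tableName: 'idempotencyTableName',
});

const config = new IdempotencyConfig({
  eventKeyJmesPath: 'body.orderId', // use a subset of the payload as the idempotency key
  expiresAfterSeconds: 300, // consider records duplicates for 5 minutes
  useLocalCache: true, // cache results in memory to avoid extra DynamoDB reads
});

export const handler = makeIdempotent(
  async (event: { body: { orderId: string } }) => {
    // ... process the order
    return { statusCode: 200 };
  },
  { persistenceStore, config }
);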
Changes
- chore(ci): fix canary deploy in ci with correct workspace name (#1601) by @am29d
- chore(maintenance): remove parameters utility from layer bundling and layers e2e tests (#1599) by @am29d
- chore(ci): add canary to layer deployment (#1593) by @am29d
- chore(maintenance): avoid attaching two middlewares to ua (#1583) by @am29d
- chore(parameters): apply UA to AWS SDK clients used by Parameters (#1577) by @am29d
- chore(tracer): apply UA to AWS SDK clients used by Tracer (#1575) by @am29d
📜 Documentation updates
- docs(internal): update AWS SDK links to new docs (#1597) by @dreamorosi
- docs(idempotency): write utility docs (#1592) by @dreamorosi
- docs(tracer): add callout about ADOT (#1581) by @dreamorosi
🐛 Bug and hot fixes
- fix(idempotency): types, docs, and makeIdempotent function wrapper (#1579) by @dreamorosi
- fix(docs): fix alias in versions.json (#1576) by @sthulb
- fix(ci): remove old release tag version for layer (#1570) by @am29d
🔧 Maintenance
- chore(idempotency): mark the utility ready public beta (#1595) by @dreamorosi
- build(internal): bump semver from 5.7.1 to 5.7.2 (#1594) by @dependabot
- docs(idempotency): write utility docs (#1592) by @dreamorosi
- test(idempotency): improve integration tests for utility (#1591) by @dreamorosi
- fix(idempotency): types, docs, and makeIdempotent function wrapper (#1579) by @dreamorosi
- chore(maintenance): add powertools to user-agent in SDK clients (#1567) by @am29d
This release was made possible by the following contributors:
@am29d, @dependabot, @dependabot[bot], @dreamorosi, @github-actions[bot], @sthulb and Release bot[bot]
v1.11.0
Summary
In this release we are excited to announce the General Availability of the Parameters utility 🎉 After almost three months in beta, we consider the utility ready for production workloads and its API stable.
Parameters
The Parameters utility provides high-level functions to retrieve one or multiple parameter values from AWS Systems Manager Parameter Store, AWS Secrets Manager, AWS AppConfig, Amazon DynamoDB, or your own parameter store.
Key features
- Retrieve one or multiple parameters from the underlying provider
- Cache parameter values for a given amount of time (defaults to 5 seconds)
- Transform parameter values from JSON or base64 encoded strings
- Bring Your Own Parameter Store Provider
Fetching parameters from AWS SSM Parameter Store
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-ssm
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can retrieve a single parameter using the getParameter() high-level function.
import { getParameter } from '@aws-lambda-powertools/parameters/ssm';
export const handler = async (): Promise<void> => {
// Retrieve a single parameter
const parameter = await getParameter('/my/parameter');
console.log(parameter);
};
For multiple parameters, you can use getParameters() to recursively fetch all parameters under a path:
import { getParameters } from '@aws-lambda-powertools/parameters/ssm';
export const handler = async (): Promise<void> => {
/**
* Retrieve multiple parameters from a path prefix recursively.
* This returns an object with the parameter name as key
*/
const parameters = await getParameters('/my/path/prefix');
for (const [key, value] of Object.entries(parameters || {})) {
console.log(`${key}: ${value}`);
}
};
Alternatively, you can also fetch multiple parameters by their full names using the getParametersByName() function.
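A minimal sketch (the parameter names and the per-parameter transform are illustrative):

import { getParametersByName } from '@aws-lambda-powertools/parameters/ssm';

export const handler = async (): Promise<void> => {
  // Retrieve multiple parameters by their full names, with per-parameter options
  const parameters = await getParametersByName({
    '/my/parameter/config': { transform: 'json' },
    '/my/parameter/flag': {},
  });
  console.log(parameters['/my/parameter/config']);
};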
Getting secrets from Amazon Secrets Manager
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-secrets-manager
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can fetch secrets stored in Secrets Manager using the getSecret() function:
import { getSecret } from '@aws-lambda-powertools/parameters/secrets';
export const handler = async (): Promise<void> => {
// Retrieve a single secret
const secret = await getSecret('my-secret');
console.log(secret);
};
Fetching configs from AWS AppConfig
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-appconfigdata
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can fetch application configurations in AWS AppConfig using the getAppConfig() function:
import { getAppConfig } from '@aws-lambda-powertools/parameters/appconfig';
export const handler = async (): Promise<void> => {
// Retrieve a configuration, latest version
const config = await getAppConfig('my-configuration', {
environment: 'my-env',
application: 'my-app',
});
console.log(config);
};
Retrieving values from Amazon DynamoDB
To get started, install the library and the corresponding AWS SDK for JavaScript v3:
npm install @aws-lambda-powertools/parameters @aws-sdk/client-dynamodb @aws-sdk/util-dynamodb
Next, review the IAM permissions attached to your AWS Lambda function and make sure you allow the actions detailed in the documentation of the utility.
You can retrieve a single parameter from DynamoDB using the DynamoDBProvider.get() method:
import { DynamoDBProvider } from '@aws-lambda-powertools/parameters/dynamodb';
const dynamoDBProvider = new DynamoDBProvider({ tableName: 'my-table' });
export const handler = async (): Promise<void> => {
// Retrieve a value from DynamoDB
const value = await dynamoDBProvider.get('my-parameter');
console.log(value);
};
For retrieving multiple parameters, you can use the DynamoDBProvider.getMultiple() method instead.
Learn More
If you want to learn more, check the post we have just published on the AWS Compute Blog: Retrieving parameters and secrets with Powertools for AWS Lambda (TypeScript)
Acknowledgements
A big thank you to all the people who contributed to this utility with PRs, questions, feedback, and bug reports.
Changes
- tests(tracer): update docs url for e2e tests (#1559) by @dreamorosi
🌟New features and non-breaking changes
- feat(idempotency): preserve original error when wrapping into IdempotencyPersistenceLayerError (#1552) by @am29d
📜 Documentation updates
- docs(parameters): release tasks + add docs on bring your own provider (#1565) by @dreamorosi
- chore(docs): add certible + logo (#1563) by @am29d
- chore(tests): upgrade @aws-sdk/* packages (#1561) by @dreamorosi
- chore(docs): update docs base origin url (#1551) by @hjgraca
🔧 Maintenance
- docs(parameters): release tasks + add docs on bring your own provider (#1565) by @dreamorosi
- chore(idempotency): remove decorators (#1554) by @dreamorosi
- chore(docs): add certible + logo (#1563) by @am29d
- chore(tests): upgrade @aws-sdk/* packages (#1561) by @dreamorosi
This release was made possible by the following contributors:
@am29d, @dreamorosi, @hjgraca and Release bot[bot]