
Postman-Devrel/composite-MCP-Pattern


composite-MCP-Pattern

The composite MCP pattern provides an implementation pattern for developers who work with multiple related APIs and want to use them via MCP in AI builder tools such as Claude Code, Cursor, and Gemini. The pattern demonstrates how multiple APIs can be exposed behind a single MCP tool call. The result is a greatly simplified end-developer experience.

composite MCP pattern

Example implementation

AWS offers a very popular set of infrastructure-related APIs for provisioning S3 buckets, databases, servers, and so on. These are often used by internal engineering teams and require centralized governance (AWS credentials, naming conventions, etc.). Rather than exposing specific API endpoints or individual tools as MCP services, the composite MCP pattern can be used to expose a single MCP service and centralize the governance requirements, thereby ensuring both developer productivity and ongoing compliance.

1. Connect APIs into Postman

Using the AWS collection from the Postman Public API Network, we can see four categories of infrastructure:

  • storage (S3)
  • network (VPC)
  • servers (EC2)
  • database (RDS)

Once these are added to your workspace, add your AWS access and secret keys to a vault. You can also specify collection defaults. In the example below, region is set as an environment configuration.

AWS access keys screenshot

2. Create composite service using Flows

Postman Flows provides the tools to create a visual workflow with branching logic for the specific endpoints. It is this flow that will eventually be exposed as a single MCP service. Make sure you are using a Flow action (look for the lightning bolt icon). The specifics of your flow will differ from the AWS example, but the general pattern is the same:

Flow structure

Under Settings, make sure you add any configuration variables. For our example, these are the AWS keys. Adding them at the flow level ensures that no confidential information has to be passed publicly, and keeps downstream implementations simple: developers do not need access to keys.

Add AWS keys

Define a tool definition

To expose your flow as an MCP service, you must create a Scenario called "toolDefinition". Scenarios allow you to create execution scenarios and pass in request values. Because the composite MCP pattern will be exposed as an MCP tool, the toolDefinition defines the structure Postman needs to generate the MCP server.

Add a new scenario, toolDefinition, and give it a JSON body similar to the one below. Make sure you define the input values required. For the AWS composite MCP we require region and infrastructureType.

{
	"tools": [
		{
			"name": "createAWSInfrastructure",
			"description": "create any type of aws infra",
			"inputSchema": {
				"type": "object",
				"properties": {
					"infrastructureType": {
						"type": "string",
						"description": "database, server, storage, or network"
					},
					"region": {
						"type": "string",
						"description": "AWS region you want this resource created in. eg: us-east-1"
					}
				}
			}
		}
	]
}

Map request arguments to variables

Now that your toolDefinition scenario is defined, you must map the body arguments to flow variables. The path must be arguments.propertyName. For example, using the tool definition JSON above, you would use arguments.infrastructureType and arguments.region. Make sure you are mapping from the Body and Params.

mapping toolDefinition properties
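To make the mapping concrete, here is an illustrative sketch (in Python, not Postman internals) of how a dotted path like arguments.region resolves against the body of an incoming tool call. The sample request body is an assumption about what an MCP client would send for the tool defined above:

```python
import json

# Example request body an MCP client might send for createAWSInfrastructure.
request_body = json.loads("""
{
  "name": "createAWSInfrastructure",
  "arguments": {
    "infrastructureType": "storage",
    "region": "us-east-1"
  }
}
""")

def resolve(body: dict, path: str):
    """Walk a dotted path such as 'arguments.region' through the body."""
    value = body
    for key in path.split("."):
        value = value[key]
    return value

infrastructure_type = resolve(request_body, "arguments.infrastructureType")
region = resolve(request_body, "arguments.region")
print(infrastructure_type, region)  # storage us-east-1
```

This is the same resolution Postman performs when you enter arguments.infrastructureType as the mapping path.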

Add branching logic

Depending on your composite logic, you will need to implement branching. This branch is where you determine which underlying API is called. For the AWS example, we branch on infrastructureType using the variable just created.

Determine which API to call
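Expressed outside of Flows, the branch is a simple dispatch on infrastructureType. The endpoint-name mapping below is an assumption for illustration (only Create S3 Bucket is confirmed by the example); unknown types should fail loudly rather than silently:

```python
# Illustrative dispatch table: infrastructureType value -> collection endpoint.
ENDPOINTS = {
    "storage": "Create S3 Bucket",
    "server": "Launch EC2 Instance",   # assumed name
    "database": "Create RDS Instance",  # assumed name
    "network": "Create VPC",            # assumed name
}

def select_endpoint(infrastructure_type: str) -> str:
    """Return the endpoint to call, or raise for unsupported types."""
    try:
        return ENDPOINTS[infrastructure_type]
    except KeyError:
        raise ValueError(f"unsupported infrastructureType: {infrastructure_type}")

print(select_endpoint("storage"))  # Create S3 Bucket
```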

Handle API implementation

Suppose the infrastructureType is "storage"; we then want to implement the API call for Create S3 Bucket from the CloudInfra Collection.

Implementing API logic

Notice that the Create S3 Bucket API requires two parameters: region, which we passed in via the tool definition, and bucketName. We could have added an additional tool definition property, but since this endpoint is the only one that requires this value, we can use Flows' ability to create with AI to generate a bucketName. In the example above, we provide a prompt that generates a name based on history and conforms to AWS bucket naming conventions:

"generate a 12 character name based on something interesting that happened on this day in history. The name can include a hyphen, but only lowercase letters"
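As a sanity check, the constraints in that prompt can be expressed as a small validator. This is illustrative only: the helper name is made up, and the checks beyond the prompt's wording (no leading, trailing, or consecutive hyphens) are drawn from AWS bucket naming rules rather than the example flow:

```python
import re

def is_valid_generated_name(name: str) -> bool:
    """Exactly 12 characters, lowercase letters plus optional hyphens,
    starting and ending with a letter, no consecutive hyphens."""
    if len(name) != 12:
        return False
    if not re.fullmatch(r"[a-z][a-z-]*[a-z]", name):
        return False
    return "--" not in name

print(is_valid_generated_name("moon-landing"))  # True
```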

Handle response

All that is left is to handle the response. In our example, we return a confirmation message, and the resource name if present.

Success response
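The confirmation payload can be sketched as below. The shape follows MCP's tool-result convention (a "content" list of text items); the function name and message wording are illustrative, not taken from the example flow:

```python
from typing import Optional

def build_success_response(resource_name: Optional[str]) -> dict:
    """Return an MCP-style tool result with a confirmation message."""
    text = "Infrastructure created successfully."
    if resource_name:
        text += f" Resource name: {resource_name}"
    return {"content": [{"type": "text", "text": text}]}

print(build_success_response("moon-landing")["content"][0]["text"])
```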

Deploy MCP

With your composite service completed, tap Deploy. This will provide you with a Postman URL for your remote MCP endpoint. Tapping the small icon on the right will take you to the MCP endpoint.

MCP endpoint

Once you have the MCP endpoint, you can test it by tapping Connect. You should see your available tools. Enter some values and test it out.

MCP testing in Postman.

At this point your composite MCP service is complete. And since you built it on Postman, not only is working with a collection as a single MCP tool a great developer experience, you can also easily add unit tests, documentation, monitoring, and more via Agent mode to ensure your service continues to run.

3. Integrate with AI tool

Now that you have a composite MCP running as a remote server, you can add it to your favorite AI tool. For example, if you are using Google Gemini CLI you would execute the following command:

gemini mcp add --scope user --transport http yourServiceName https://the-postman-mcp-url-from-deploy

gemini integration
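If you want to sanity-check the deployed endpoint outside an AI tool, the HTTP transport speaks JSON-RPC 2.0. The sketch below only constructs a tools/list request (a full MCP client first performs the initialize handshake before listing tools); the URL is a placeholder for your own Deploy URL:

```python
import json
import urllib.request

def list_tools_request(url: str) -> urllib.request.Request:
    """Build (but do not send) a JSON-RPC 2.0 tools/list POST."""
    payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
        method="POST",
    )

# Placeholder URL -- substitute the URL from the Deploy step.
req = list_tools_request("https://example.invalid/your-mcp-url")
print(req.get_method())  # POST
```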
