This guide walks you through running and understanding the Plugin-Workflows example, which demonstrates sophisticated multi-step workflow implementations using the Beyond MCP Server library.
By following this guide, you'll understand:
- Multi-Step Workflow Architecture: How workflows differ from simple tools
- State Management: How data flows and transforms between workflow steps
- Error Handling: Advanced error recovery and continuation patterns
- Resource Tracking: Performance monitoring and resource usage tracking
- Parameter Validation: Comprehensive input validation with Zod schemas
```bash
cd examples/2-plugin-workflows
deno run --allow-all main.ts
```

You should see:
```
[INFO] Starting MCP server: plugin-workflows-mcp-server v1.0.0
[INFO] Plugin discovery found: WorkflowPlugin v1.0.0
[INFO] Registered 3 workflows: data_processing_pipeline, file_management_lifecycle, content_generation_pipeline
[INFO] Registered 2 tools: current_datetime, validate_json
[INFO] Server ready on stdio transport
```
The server is now ready to accept MCP requests. You can test the workflows using an MCP client or the provided test suite:
```bash
deno test --allow-all tests/
```

Workflow: `data_processing_pipeline`

Purpose: Processes arrays of data objects through validation, transformation, analysis, and export steps.
Steps:
- Validate: Ensures data is properly formatted
- Transform: Applies operations like normalize, filter, sort, deduplicate
- Analyze: Generates summaries, statistics, or detailed analysis
- Export: Outputs results in JSON or CSV format
Example Parameters:
```json
{
  "userId": "demo-user",
  "data": [
    { "name": "Alice", "score": 95, "department": "Engineering" },
    { "name": "Bob", "score": 87, "department": "Marketing" },
    { "name": "Charlie", "score": 92, "department": "Engineering" }
  ],
  "transformations": ["normalize", "sort"],
  "outputFormat": "json",
  "analysisType": "statistical"
}
```

Key Learning Points:
- State management: `processedData` is modified by each transformation step
- Error recovery: Failed transformations don't stop the pipeline
- Performance tracking: Each step records timing and resource usage
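The state-management and error-recovery points above can be sketched as follows. The data and transformation names mirror the example parameters, but the transformation functions themselves are illustrative, not the library's actual implementation:

```typescript
// Illustrative sketch of the pipeline's state flow (not the library's code).
type Row = { name: string; score: number; department: string };

const transformations: Array<(rows: Row[]) => Row[]> = [
  // "normalize": scale scores so the highest becomes 1.0 (assumed semantics)
  (rows) => {
    const max = Math.max(...rows.map((r) => r.score));
    return rows.map((r) => ({ ...r, score: r.score / max }));
  },
  // "sort": order by score, descending
  (rows) => [...rows].sort((a, b) => b.score - a.score),
];

let processedData: Row[] = [
  { name: "Alice", score: 95, department: "Engineering" },
  { name: "Bob", score: 87, department: "Marketing" },
  { name: "Charlie", score: 92, department: "Engineering" },
];

const failedSteps: string[] = [];
for (const [i, transform] of transformations.entries()) {
  try {
    // Each step reads and rewrites the shared pipeline state.
    processedData = transform(processedData);
  } catch (err) {
    // A failed transformation is recorded, but the loop continues.
    failedSteps.push(`transform_${i}: ${err}`);
  }
}
```

After both steps, `processedData` is normalized and sorted by score, and `failedSteps` records any transformations that threw without aborting the pipeline.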
Workflow: `file_management_lifecycle`

Purpose: Manages the complete lifecycle of a file from creation to archival.
Steps:
- Create: Sets up file metadata and initial content
- Validate: Applies configurable validation rules
- Process: Formats, sanitizes, and adds metadata
- Archive: Stores the processed file with audit information
Example Parameters:
```json
{
  "userId": "demo-user",
  "fileName": "config.json",
  "content": "{\"debug\": true, \"port\": 3000}",
  "validationRules": ["not_empty", "valid_json", "max_size"],
  "processingOptions": {
    "format": "pretty",
    "addMetadata": true,
    "sanitize": false
  }
}
```

Key Learning Points:
- File type detection and handling
- Configurable validation rules
- Content transformation and metadata injection
- Audit trail creation
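The configurable validation rules above could be implemented along these lines; the rule names match the example parameters, but the validator bodies and the 1 MiB size cap are assumptions for illustration:

```typescript
// Hypothetical validator map mirroring the rule names in the example parameters.
const validators: Record<string, (content: string) => boolean> = {
  not_empty: (c) => c.trim().length > 0,
  valid_json: (c) => {
    try {
      JSON.parse(c);
      return true;
    } catch {
      return false;
    }
  },
  // Assumed cap of 1 MiB, measured in UTF-8 bytes.
  max_size: (c) => new TextEncoder().encode(c).length <= 1024 * 1024,
};

// Returns the names of the rules that failed (empty array = all passed).
function validate(content: string, rules: string[]): string[] {
  return rules.filter((rule) => !(validators[rule]?.(content) ?? false));
}
```

With the example file content, `validate('{"debug": true, "port": 3000}', ["not_empty", "valid_json", "max_size"])` returns an empty array, while malformed JSON would return `["valid_json"]`.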
Workflow: `content_generation_pipeline`

Purpose: AI-powered content creation with planning, generation, review, and publishing.
Steps:
- Plan: Creates content outline and structure
- Generate: Produces content based on requirements
- Review: Evaluates quality and applies improvements
- Publish: Finalizes content with metadata and preview
Example Parameters:
```json
{
  "userId": "demo-user",
  "contentType": "blog",
  "topic": "Microservices Architecture Best Practices",
  "requirements": {
    "wordCount": 800,
    "tone": "professional",
    "audience": "software developers",
    "includeReferences": true
  }
}
```

Key Learning Points:
- Multi-phase content creation workflow
- Quality metrics and automated improvements
- Content type-specific processing (blog vs documentation vs report)
- Publication workflow with metadata
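A quality metric for the review step might look like the sketch below. The `wordCount` field comes from the example requirements; the scoring formula and the 0.9 expansion threshold are illustrative assumptions, not the library's actual review logic:

```typescript
// Illustrative quality check for the review step.
interface Requirements {
  wordCount: number;
  tone: string;
}

function reviewContent(content: string, req: Requirements) {
  const words = content.trim().split(/\s+/).filter(Boolean).length;
  // 1.0 means the word-count target is met or exceeded.
  const lengthScore = Math.min(words / req.wordCount, 1);
  return { words, lengthScore, needsExpansion: lengthScore < 0.9 };
}
```

A review step could use `needsExpansion` to decide whether to loop back to the generation phase before publishing.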
All workflows extend WorkflowBase which provides:
```typescript
abstract class WorkflowBase {
  // Required metadata
  abstract readonly name: string;
  abstract readonly version: string;
  abstract readonly description: string;
  abstract readonly category: PluginCategory;
  abstract readonly tags: string[];
  abstract readonly parameterSchema: ZodSchema<any>;

  // Main execution method with validation and error handling
  async executeWithValidation(params: unknown, context: WorkflowContext): Promise<WorkflowResult>;

  // Safe execution wrapper for individual steps
  protected async safeExecute<T>(
    operationName: string,
    operation: () => Promise<T>,
    resourceType?: WorkflowResource['type'],
  ): Promise<{ success: boolean; data?: T; error?: FailedStep }>;

  // Step result creation helpers
  protected createStepResult(operation: string, success: boolean, data?: unknown): WorkflowStep;
}
```

The WorkflowPlugin follows the correct pattern:
```typescript
const WorkflowPlugin: AppPlugin = {
  name: 'workflow-plugin',
  version: '1.0.0',
  description: 'Multi-step workflow demonstrations',

  // ✅ CORRECT: Populate arrays directly
  workflows: [
    new DataProcessingWorkflow(),
    new FileManagementWorkflow(),
    new ContentGenerationWorkflow(),
  ],

  tools: [
    // Basic utility tools
  ],
};
```

Why This Works:
- PluginManager automatically discovers and registers workflows
- No manual registration code needed in the plugin
- Clean separation between plugin definition and workflow implementation
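Inside a workflow, individual steps are typically wrapped so failures become data rather than thrown exceptions. The sketch below uses a simplified stand-in for `WorkflowBase.safeExecute` (the real method also tracks resource types and returns a structured `FailedStep`):

```typescript
// Simplified stand-in for WorkflowBase.safeExecute to show the pattern.
async function safeExecute<T>(
  operationName: string,
  operation: () => Promise<T>,
): Promise<{ success: boolean; data?: T; error?: string }> {
  try {
    return { success: true, data: await operation() };
  } catch (err) {
    // The failure is captured and labeled, not rethrown,
    // so the workflow can record it and continue.
    return { success: false, error: `${operationName}: ${err}` };
  }
}

// A step that succeeds returns its data; a step that throws is reported.
const ok = await safeExecute("validate_data", async () => [1, 2, 3]);
const bad = await safeExecute("transform_data", async () => {
  throw new Error("unsupported transformation");
});
```

This is what enables the "failed transformations don't stop the pipeline" behavior: the caller inspects `success` and appends to `completed_steps` or `failed_steps` accordingly.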
```bash
# Run all workflow tests
deno test --allow-all tests/workflows/

# Run specific workflow test
deno test --allow-all tests/workflows/DataProcessingWorkflow.test.ts

# Run integration tests
deno test --allow-all tests/integration/
```

If you have an MCP client, you can test workflows directly:
1. List Available Workflows:

```json
{ "jsonrpc": "2.0", "id": 1, "method": "workflows/list" }
```

2. Execute Data Processing Workflow:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "workflows/execute",
  "params": {
    "name": "data_processing_pipeline",
    "arguments": {
      "userId": "test-user",
      "data": [{ "name": "test", "value": 100 }],
      "transformations": ["normalize"],
      "outputFormat": "json",
      "analysisType": "summary"
    }
  }
}
```
All workflows support dry run mode for testing:
```json
{
  "userId": "test-user",
  "dryRun": true,
  "fileName": "test.json",
  "content": "{\"test\": true}",
  "validationRules": ["valid_json"],
  "processingOptions": {
    "format": "pretty"
  }
}
```

Dry run mode validates parameters and simulates execution without making changes.
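Inside a step, honoring the flag might look like the following sketch; the field names follow the example parameters above, but the function and return shape are illustrative, not the library's API:

```typescript
// Hypothetical archive step showing the dry-run branch.
interface FileParams {
  userId: string;
  dryRun?: boolean;
  fileName: string;
  content: string;
}

function archiveFile(params: FileParams): { archived: boolean; simulated: boolean } {
  if (params.dryRun) {
    // Validate-only path: report what would happen, with no side effects.
    return { archived: false, simulated: true };
  }
  // ...the real storage write would happen here...
  return { archived: true, simulated: false };
}
```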
```bash
LOG_LEVEL=debug deno run --allow-all main.ts
```

This shows detailed information about:
- Plugin discovery and registration
- Workflow execution steps
- Parameter validation
- Error handling and recovery
```
Error: Workflow 'data_processing_pipeline' not registered
```
Solution: Check that:
- Plugin is exported as default: `export default WorkflowPlugin`
- Workflow is included in the plugin's `workflows` array
- Workflow name matches exactly (case-sensitive)
```
Error: Parameter validation failed: data.required
```
Solution:
- Review the workflow's `parameterSchema`
- Ensure all required fields are provided
- Check data types match schema expectations
- Use dry run mode to test parameters
```
Error in step 'validate_data': Data must be a non-empty array
```
Solution:
- Check input data format and structure
- Review step-specific requirements in workflow code
- Use smaller test datasets to isolate issues
Workflow results include detailed execution information:
```typescript
interface WorkflowResult {
  success: boolean;
  completed_steps: WorkflowStep[];    // Successful steps
  failed_steps: FailedStep[];         // Failed steps with error details
  data?: unknown;                     // Final workflow output
  metadata: Record<string, unknown>;  // Workflow-specific metadata
  duration?: number;                  // Total execution time
  resources?: WorkflowResource[];     // Resource usage tracking
}
```

Workflows automatically track:
- Execution Time: Duration of each step and total workflow
- Resource Usage: API calls, storage operations, file access
- Memory Usage: Through Deno's built-in memory monitoring
- Success Rates: Step completion and failure statistics
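For example, a step success rate can be derived directly from the `WorkflowResult` shape above; the helper below is a sketch, not part of the library:

```typescript
// Minimal shape needed from WorkflowResult for this calculation.
interface StepLike {
  operation: string;
}

// Fraction of steps that completed successfully (1.0 when no steps ran).
function successRate(result: { completed_steps: StepLike[]; failed_steps: StepLike[] }): number {
  const total = result.completed_steps.length + result.failed_steps.length;
  return total === 0 ? 1 : result.completed_steps.length / total;
}
```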
- Parallel Step Execution: For independent steps, consider parallelization
- Data Size Management: Use streaming for large datasets
- Error Recovery: Design workflows to continue after non-critical failures
- Caching: Cache expensive operations between workflow runs
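The parallelization point above can be sketched with `Promise.all`; this helper is illustrative (the library's workflows run steps sequentially unless you structure them otherwise):

```typescript
// Run independent steps concurrently, capturing failures per step
// instead of letting one rejection abort the whole batch.
async function runParallelSteps<T>(
  steps: Array<() => Promise<T>>,
): Promise<Array<{ success: boolean; data?: T }>> {
  return Promise.all(
    steps.map(async (step) => {
      try {
        return { success: true, data: await step() };
      } catch {
        return { success: false };
      }
    }),
  );
}
```

Because each step's rejection is caught inside its own mapped promise, `Promise.all` never short-circuits and the result array lines up one-to-one with the input steps.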
| Aspect | Simple Tools (1-simple) | Workflows (2-plugin-workflows) |
|---|---|---|
| Complexity | Single operation | Multi-step processes |
| State | Stateless | Stateful with data flow |
| Error Handling | Basic try/catch | Advanced recovery patterns |
| Monitoring | Minimal | Comprehensive tracking |
| Use Cases | Utility functions | Business processes |
| Testing | Simple unit tests | Multi-step integration tests |
- Add a new transformation type to `DataProcessingWorkflow`
- Implement the transformation logic
- Test with sample data
- Observe the step-by-step execution
- Add a new validation rule to `FileManagementWorkflow`
- Implement the validation logic
- Test with files that pass and fail validation
- Check error handling and recovery
- Add support for a new content type (e.g., 'technical-spec')
- Implement content type-specific outline generation
- Add quality metrics specific to technical content
- Test the complete pipeline
Once you're comfortable with workflows:
- Advanced Features: Explore workflow scheduling and dependencies
- Custom Workflows: Build workflows for your specific use cases
- API Integration: Move to `3-plugin-api-auth` for OAuth and external APIs
- Production Deployment: Learn advanced configuration in `4-manual-deps`
- WorkflowBase Documentation: Deep dive into the base class methods
- Plugin Architecture Guide: Understanding plugin discovery and registration
- Error Handling Patterns: Best practices for workflow error management
- Performance Monitoring: Advanced resource tracking and optimization
This example provides a solid foundation for building production-ready workflows that can handle complex, multi-step business processes with proper error handling, state management, and monitoring.