
Triggers

Automate workflow execution with manual, scheduled, webhook, and event-based triggers.

Overview

Triggers define when and how workflows execute. M3 Forge supports multiple trigger types:

| Trigger Type | Use Case | Example |
|---|---|---|
| Manual | On-demand execution | Click "Run" in workflow editor |
| Scheduled | Recurring execution | Daily invoice processing at 2am |
| Webhook | External system integration | New document uploaded to S3 |
| Event | Internal system events | Job completed, approval granted |

All triggers are managed via the OPERATE → Automation section and support conditional execution, retry policies, and detailed logging.

*Figure: Trigger configuration panel showing trigger type selection, schedule settings, and webhook URL.*

Triggers are part of the broader Automation system. For complex multi-step automations, see the Automation documentation.

Manual Triggers

Execute workflows on-demand from the UI or API.

Via Workflow Editor

  1. Open workflow in editor
  2. Click Run in toolbar
  3. Provide input data in dialog
  4. Click Start Execution

Useful for testing and ad-hoc execution.

Via API

Call the tRPC endpoint directly:

```typescript
const run = await trpc.workflows.execute.mutate({
  workflow_id: 'invoice_processing',
  input: {
    document: 'base64_encoded_pdf',
    options: { quality: 'high' },
  },
});

console.log('Run ID:', run.id);
```

Returns a run ID immediately. Track execution progress via the Runs view.
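Because the endpoint returns before the workflow finishes, a caller that needs the final result has to poll. A minimal polling sketch is below; the `fetchStatus` callback is a stand-in for whatever status query your client exposes (for example a tRPC runs query), and the status names are assumptions based on this page, not a confirmed API:

```typescript
// Poll a run until it reaches a terminal state.
// `fetchStatus` is a placeholder for your actual status query.
type RunStatus = 'created' | 'running' | 'succeeded' | 'failed';

async function waitForRun(
  fetchStatus: () => Promise<RunStatus>,
  intervalMs = 1000,
  timeoutMs = 60_000,
): Promise<RunStatus> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await fetchStatus();
    // Stop as soon as the run is done, either way.
    if (status === 'succeeded' || status === 'failed') return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for run to complete');
}
```

A fixed interval keeps the sketch simple; for long-running workflows you would typically back off between polls.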

Via CLI

Use the M3 Forge CLI for scripting:

```shell
m3 workflow run invoice_processing \
  --input document=@file.pdf \
  --input options.quality=high \
  --wait
```

The --wait flag blocks until execution completes.

Scheduled Triggers

Execute workflows on a recurring schedule using cron expressions.

Creating a Schedule

  1. Navigate to OPERATE → Automation
  2. Click New Trigger
  3. Select Scheduled type
  4. Configure schedule:
| Field | Description | Example |
|---|---|---|
| Name | Descriptive identifier | "Daily Invoice Processing" |
| Workflow | Target workflow | `invoice_processing` |
| Cron expression | Schedule pattern | `0 2 * * *` (daily at 2am) |
| Timezone | Execution timezone | `America/New_York` |
| Input | Static workflow input | `{ "source": "s3://bucket" }` |

Cron Expression Syntax

Standard cron format with 5 fields:

```
┌───────────── minute (0-59)
│ ┌───────────── hour (0-23)
│ │ ┌───────────── day of month (1-31)
│ │ │ ┌───────────── month (1-12)
│ │ │ │ ┌───────────── day of week (0-6, 0=Sunday)
│ │ │ │ │
* * * * *
```

Common patterns:

  • 0 2 * * * - Daily at 2am
  • 0 */4 * * * - Every 4 hours
  • 0 9 * * 1-5 - Weekdays at 9am
  • */15 * * * * - Every 15 minutes
  • 0 0 1 * * - First day of each month at midnight
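To make the five fields concrete, here is a toy matcher that checks an expression against a timestamp. It is illustrative only: it handles `*`, `*/step`, single values, and `a-b` ranges, not the full cron grammar (no lists, names, or combined forms):

```typescript
// Toy 5-field cron matcher: minute hour day-of-month month day-of-week.
// Supports '*', '*/step', single numbers, and 'a-b' ranges only.
function fieldMatches(field: string, value: number): boolean {
  if (field === '*') return true;
  if (field.startsWith('*/')) return value % Number(field.slice(2)) === 0;
  if (field.includes('-')) {
    const [lo, hi] = field.split('-').map(Number);
    return value >= lo && value <= hi;
  }
  return Number(field) === value;
}

function cronMatches(expr: string, date: Date): boolean {
  const [min, hour, dom, month, dow] = expr.split(/\s+/);
  return (
    fieldMatches(min, date.getMinutes()) &&
    fieldMatches(hour, date.getHours()) &&
    fieldMatches(dom, date.getDate()) &&
    fieldMatches(month, date.getMonth() + 1) && // JS months are 0-based
    fieldMatches(dow, date.getDay())            // 0 = Sunday, as in cron
  );
}
```

For example, `cronMatches('0 9 * * 1-5', someDate)` is true only when `someDate` falls on a weekday at exactly 9:00.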

Dynamic Input

Use template variables in input:

```json
{
  "date": "{{now | date: 'YYYY-MM-DD'}}",
  "source": "s3://bucket/{{now | date: 'YYYY/MM/DD'}}"
}
```

Variables are evaluated at trigger time.
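As a rough sketch of what trigger-time evaluation looks like, the function below substitutes `{{now | date: '...'}}` placeholders in a string. The actual template engine is not specified on this page; this sketch handles only the `now | date` pattern with `YYYY`, `MM`, and `DD` tokens:

```typescript
// Render "{{now | date: '<pattern>'}}" placeholders at trigger time.
// Only YYYY, MM, DD tokens are handled; a real engine does far more.
function renderTemplate(template: string, now: Date): string {
  return template.replace(
    /\{\{\s*now\s*\|\s*date:\s*'([^']+)'\s*\}\}/g,
    (_match, pattern: string) =>
      pattern
        .replace('YYYY', String(now.getFullYear()))
        .replace('MM', String(now.getMonth() + 1).padStart(2, '0'))
        .replace('DD', String(now.getDate()).padStart(2, '0')),
  );
}
```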

Pause and Resume

Scheduled triggers can be paused without deletion:

  1. Navigate to OPERATE → Automation
  2. Find the trigger in the list
  3. Toggle Enabled switch to pause
  4. Toggle again to resume

Execution history is preserved during pause.

Webhook Triggers

Execute workflows in response to HTTP requests.

Creating a Webhook

  1. Navigate to OPERATE → Automation
  2. Click New Trigger
  3. Select Webhook type
  4. Configure webhook:
| Field | Description | Example |
|---|---|---|
| Name | Descriptive identifier | "S3 Upload Trigger" |
| Workflow | Target workflow | `invoice_processing` |
| Method | HTTP method | POST |
| Path | URL path | `/webhooks/s3-upload` |
| Auth | Authentication type | Bearer token, HMAC signature |

A unique webhook URL is generated: https://your-instance.com/webhooks/s3-upload

Request Format

POST JSON payload to the webhook URL:

```shell
curl -X POST https://your-instance.com/webhooks/s3-upload \
  -H "Authorization: Bearer your_webhook_token" \
  -H "Content-Type: application/json" \
  -d '{
    "bucket": "invoices",
    "key": "2026/03/invoice-123.pdf",
    "size": 245678
  }'
```

The JSON body becomes the workflow input.

Response

Webhook returns immediately with run metadata:

```json
{
  "run_id": "run_abc123",
  "workflow_id": "invoice_processing",
  "status": "created",
  "created_at": "2026-03-19T10:23:45Z"
}
```

Use the run_id to track execution progress.

Authentication

Webhook authentication options:

Bearer Token

Include token in Authorization header:

Authorization: Bearer your_webhook_token

Tokens are generated automatically and can be rotated via the UI.
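The webhook configuration table also lists HMAC signatures as an auth option. The exact header name and scheme are not documented on this page, so the sketch below assumes a hex-encoded HMAC-SHA256 of the raw request body; treat the details as illustrative, not as M3 Forge's actual wire format:

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Illustrative HMAC check: assumes the signature is a hex HMAC-SHA256
// of the raw request body. Header name and scheme are assumptions.
function verifySignature(rawBody: string, signature: string, secret: string): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex');
  const a = Buffer.from(expected, 'hex');
  const b = Buffer.from(signature, 'hex');
  // Constant-time comparison avoids leaking the signature via timing.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Whatever the scheme, always compare signatures in constant time and compute the HMAC over the raw body bytes, not a re-serialized JSON object.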

Retries

If webhook invocation fails (network error, rate limit), M3 Forge retries with exponential backoff:

  • Retry 1: 1 second delay
  • Retry 2: 4 seconds delay
  • Retry 3: 16 seconds delay
  • Retry 4: 64 seconds delay
  • Retry 5: 256 seconds delay (final attempt)

After 5 failures, the webhook invocation is marked as failed and logged.
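The delays above follow a power-of-four progression, i.e. `4^(n-1)` seconds before retry `n`. As a one-line sketch:

```typescript
// Delay before retry n (1-based), matching the 1s/4s/16s/64s/256s schedule:
// delay = 4^(n-1) seconds.
function retryDelaySeconds(attempt: number): number {
  return 4 ** (attempt - 1);
}
```

In practice, retry implementations often add random jitter to such a schedule to avoid thundering-herd retries; whether M3 Forge does so is not stated here.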

Event Triggers

Execute workflows in response to internal M3 Forge events.

Event Types

M3 Forge emits events for system actions:

| Event Type | Description | Payload |
|---|---|---|
| `workflow.completed` | Workflow run completed | `{ run_id, workflow_id, status }` |
| `workflow.failed` | Workflow run failed | `{ run_id, workflow_id, error }` |
| `approval.granted` | HITL approval granted | `{ run_id, node_id, reviewer }` |
| `approval.rejected` | HITL approval rejected | `{ run_id, node_id, reviewer, reason }` |
| `document.uploaded` | New document in system | `{ document_id, type, size }` |
| `user.created` | New user account | `{ user_id, email }` |

Creating an Event Trigger

  1. Navigate to OPERATE → Automation
  2. Click New Trigger
  3. Select Event type
  4. Configure event trigger:
| Field | Description | Example |
|---|---|---|
| Name | Descriptive identifier | "Process Failed Workflows" |
| Event type | Event to listen for | `workflow.failed` |
| Workflow | Target workflow | `alert_on_failure` |
| Conditions | Optional filters | `workflow_id == 'invoice_processing'` |
| Input mapping | Map event payload to workflow input | `{ error: $.error, run_id: $.run_id }` |

Conditional Execution

Use conditions to filter which events trigger execution:

```javascript
// Only trigger for specific workflows
workflow_id == 'invoice_processing'

// Only trigger for errors containing specific text
error.message.contains('timeout')

// Only trigger during business hours
now().hour >= 9 && now().hour < 17

// Combine conditions
workflow_id == 'invoice_processing' && error.message.contains('timeout')
```

Conditions use JavaScript expression syntax; event payload fields can be referenced directly by name.
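One way such conditions can be evaluated is to compile the expression with the payload's top-level fields in scope. The sketch below uses the `Function` constructor for brevity; it is not how M3 Forge necessarily does it, and `new Function` is not a safe sandbox for untrusted expressions:

```typescript
// Evaluate a condition string against an event payload by exposing the
// payload's top-level fields as variables. Sketch only: `new Function`
// is NOT a safe sandbox for untrusted input.
function evaluateCondition(expr: string, payload: Record<string, unknown>): boolean {
  const names = Object.keys(payload);
  const fn = new Function(...names, `return (${expr});`);
  return Boolean(fn(...names.map((name) => payload[name])));
}
```

Helpers like `contains(...)` or `now()` from the examples above would need to be injected into the expression's scope the same way.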

Input Mapping

Map event payload fields to workflow input:

```json
{
  "run_id": "$.run_id",
  "error_message": "$.error.message",
  "timestamp": "$.created_at",
  "metadata": {
    "workflow": "$.workflow_id",
    "user": "$.user_id"
  }
}
```

Supports JSONPath expressions and template functions.
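The core of such a mapper can be sketched in a few lines. This version handles only simple dotted `$.a.b` paths and nested mapping objects, not full JSONPath (no filters, wildcards, or array slices) and none of the template functions:

```typescript
// Resolve simple "$.a.b.c" paths against an event payload and build the
// workflow input from a mapping object (nested mappings supported).
type Mapping = { [key: string]: string | Mapping };

function resolvePath(path: string, payload: any): unknown {
  return path
    .replace(/^\$\./, '')   // strip the "$." root marker
    .split('.')
    .reduce((obj, key) => (obj == null ? undefined : obj[key]), payload);
}

function mapInput(mapping: Mapping, payload: any): any {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(mapping)) {
    out[key] =
      typeof value === 'string'
        ? resolvePath(value, payload)   // leaf: a "$.x.y" path string
        : mapInput(value, payload);     // branch: recurse into nested mapping
  }
  return out;
}
```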

Chaining Workflows

Common pattern: trigger workflow B when workflow A completes:

  1. Create event trigger for workflow.completed
  2. Add condition: workflow_id == 'workflow_a'
  3. Set target workflow: workflow_b
  4. Map output from A to input for B:
```json
{ "input_for_b": "$.nodes.final_node.output.result" }
```

This creates a workflow chain where B processes A’s output.

Trigger Management

Listing Triggers

View all triggers in OPERATE → Automation:

  • Active - Currently enabled triggers
  • Paused - Disabled but preserved triggers
  • Filter - By type, workflow, or status

Editing Triggers

  1. Click trigger name in list
  2. Modify configuration
  3. Click Save

Changes take effect immediately for future invocations; in-flight runs continue with the old configuration.

Deleting Triggers

  1. Click trigger name in list
  2. Click Delete button
  3. Confirm in dialog

Execution history is preserved, but the trigger no longer fires.

Testing Triggers

Test triggers without waiting for schedule or event:

  1. Click trigger name in list
  2. Click Test button
  3. Provide sample input (or use defaults)
  4. Click Run Test

Creates a manual run using the trigger configuration.

Execution Policies

Configure retry and error handling per trigger:

Retry Policy

| Policy | Description | Example |
|---|---|---|
| No retry | Execute once, accept failure | Low-priority scheduled jobs |
| Fixed retry | Retry N times with fixed delay | 3 retries, 60s delay |
| Exponential backoff | Increasing delay between retries | 5 retries, 2^n seconds |
| Retry until success | Keep retrying indefinitely | Critical data sync |

Error Handling

| Action | Description | Example |
|---|---|---|
| Log only | Record failure, no notification | Non-critical workflows |
| Email notification | Send alert to admins | Production failures |
| Trigger workflow | Execute error handling workflow | Automated remediation |
| Webhook | POST to external endpoint | Integration with PagerDuty, Slack |

Configure in trigger settings under Advanced → Error Handling.

Monitoring

Trigger Metrics

View trigger performance in OPERATE → Automation:

  • Total invocations - Times trigger has fired
  • Success rate - Percentage of successful executions
  • Average duration - Mean execution time
  • Last invocation - Timestamp of most recent run

Click a metric to view detailed history.

Audit Log

All trigger invocations are logged in the event_tracking table:

  • Trigger name - Which trigger fired
  • Event type - Manual, scheduled, webhook, event
  • Input - Workflow input data
  • Result - Success or failure with details
  • Duration - Time from trigger to completion

Access via OPERATE → Events.

Best Practices

  1. Use descriptive names - Include workflow and trigger type: “Daily Invoice Processing (Scheduled)”
  2. Set realistic schedules - Avoid overlapping executions; use queueing if needed
  3. Validate webhook payloads - Add schema validation nodes at workflow start
  4. Test before enabling - Use test function to verify configuration
  5. Monitor failure rates - Set up alerts for triggers with >5% failure rate
  6. Rotate webhook tokens - Change tokens quarterly or after suspected compromise
  7. Document trigger purpose - Add descriptions explaining business logic

For complex multi-trigger automations, use the Automation builder which provides higher-level orchestration across multiple workflows and services.
