External Tool Integrations

With MCP and the Files API you can stitch TestBase into any ecosystem—document stores, ticketing systems, CI pipelines, or custom services.

Pattern 1: Documentation updates via Notion

  1. Create a hosted Notion MCP server that exposes search and page editing.
  2. Attach it to a worker agent and run tasks that pull context from Notion before editing code.
  3. Store generated summaries back into Notion through the same MCP server (sketched below).
await cloud.addMcpServer(agent.containerId, {
  type: 'hosted',
  name: 'notion',
  url: process.env.NOTION_MCP_URL!,
  bearerToken: process.env.NOTION_TOKEN,
  allowedTools: ['search', 'pages.retrieve', 'pages.create'],
});

await run(agent, 'Read our Notion spec for Project Atlas and update the repo accordingly.');
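
Step 3 can be a follow-up task on the same agent. A minimal sketch: the prompt wording and page title are illustrative, and it relies on the pages.create tool allowed above.

// Follow-up task: the agent publishes its summary through the mounted
// 'notion' MCP server (via the pages.create tool allowed above).
// The prompt and page title are illustrative.
await run(
  agent,
  'Summarise the changes you just made and create a Notion page titled "Project Atlas: repo sync" with that summary.',
);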

Pattern 2: Secure filesystem bridging

  • Mount the MCP filesystem server pointing at /workspace to let planners and reviewers inspect or manipulate files without shell commands.
  • Combine with the Files API to snapshot important artefacts (e.g., compiled binaries) to an external bucket (sketched below).
await cloud.addMcpServer(agent.containerId, {
  type: 'local',
  name: 'workspace-fs',
  command: 'npx',
  args: ['@modelcontextprotocol/server-filesystem', '/workspace'],
});

const { files } = await cloud.listFiles(agent.containerId, '/workspace/dist');
console.log('Build artefacts:', files);
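
To snapshot artefacts to an external bucket, pair the listing above with a download-then-upload step. A minimal sketch with two assumptions: that the Files API exposes a downloadFile counterpart to uploadFile, and that listFiles returns plain file names; the destination here is an S3 bucket via the AWS SDK.

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3 = new S3Client({ region: 'us-east-1' });

for (const file of files) {
  // Assumption: downloadFile mirrors uploadFile and returns the file contents,
  // and each entry in `files` is a plain file name.
  const contents = await cloud.downloadFile(agent.containerId, `/workspace/dist/${file}`);
  await s3.send(new PutObjectCommand({
    Bucket: 'build-artefacts', // illustrative bucket name
    Key: `snapshots/${file}`,
    Body: contents,
  }));
}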

Pattern 3: Incident response helper

Combine search, issue tracking, and code modification capabilities into one orchestrated flow:

  1. Planner queries a hosted PagerDuty MCP server for active incidents.
  2. Worker updates the codebase in a container using filesystem + database MCP servers to reproduce and fix issues.
  3. Reviewer confirms the fix and posts a summary back to Slack via MCP.
const orchestrator = new Agent({
  name: 'Incident Orchestrator',
  agentType: 'orchestrator',
  workspace: './repo',
  codexMcpServers: [pagerDutyConfig, slackConfig],
});

await run(orchestrator, 'Resolve the highest-priority incident and report findings.');
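
The pagerDutyConfig and slackConfig values above are not defined in the snippet. A minimal sketch of what they might look like, mirroring the hosted-server shape from Pattern 1; the URLs, env vars, and tool names are illustrative:

// Hypothetical hosted MCP server configs; swap in your real endpoints,
// tokens, and the tool names your servers actually expose.
const pagerDutyConfig = {
  type: 'hosted',
  name: 'pagerduty',
  url: process.env.PAGERDUTY_MCP_URL!,
  bearerToken: process.env.PAGERDUTY_TOKEN,
  allowedTools: ['incidents.list', 'incidents.get'],
};

const slackConfig = {
  type: 'hosted',
  name: 'slack',
  url: process.env.SLACK_MCP_URL!,
  bearerToken: process.env.SLACK_TOKEN,
  allowedTools: ['chat.postMessage'],
};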

Pattern 4: Data ingestion pipelines

  • Upload CSVs or JSON payloads into /workspace/data using the Files API.
  • Trigger a worker to transform data and generate reports.
  • Download outputs and push them to downstream systems (sketched after the snippet below).
await cloud.uploadFile(agent.containerId, '/workspace/data/input.csv', csvContent);
await run(agent, 'Process data/input.csv and write summary charts to data/output/');
const outputs = await cloud.listFiles(agent.containerId, '/workspace/data/output');
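
For the final step, a minimal sketch of downloading outputs and pushing them downstream. It reuses the downloadFile assumption from Pattern 2, and the ingest endpoint is illustrative:

for (const output of outputs.files) {
  // Assumption: downloadFile mirrors uploadFile and returns the file contents.
  const report = await cloud.downloadFile(agent.containerId, `/workspace/data/output/${output}`);
  // Illustrative downstream endpoint; replace with your own system.
  await fetch('https://reports.example.com/ingest', {
    method: 'POST',
    headers: { 'Content-Type': 'application/octet-stream' },
    body: report,
  });
}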

Tips

  • Encapsulate credentials: wrap MCP configs in helper functions so each tool’s secrets stay in one place (sketched after this list).
  • Validate tool availability: inspect cloud.listMcpServers before executing high-stakes runs to ensure all dependencies are mounted (sketched after this list).
  • Combine with orchestrators: orchestrator agents can choose which worker runs where (local vs cloud) based on tool availability.
  • Log tool usage: session artefacts include tool invocation history—parse them to monitor adherence to compliance policies.
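
To illustrate the first tip, a minimal sketch of a helper that keeps all Notion secrets in one place (the function name is illustrative):

// All Notion credentials live here; callers never touch the env vars directly.
function notionServerConfig() {
  return {
    type: 'hosted' as const,
    name: 'notion',
    url: process.env.NOTION_MCP_URL!,
    bearerToken: process.env.NOTION_TOKEN,
    allowedTools: ['search', 'pages.retrieve', 'pages.create'],
  };
}

await cloud.addMcpServer(agent.containerId, notionServerConfig());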
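
And for the second tip, a sketch of a pre-flight check; it assumes listMcpServers takes a container ID and returns entries with a name field matching what was passed to addMcpServer:

const servers = await cloud.listMcpServers(agent.containerId);
const required = ['notion', 'workspace-fs'];
// Assumption: each entry exposes a `name` matching the one used in addMcpServer.
const missing = required.filter((name) => !servers.some((s) => s.name === name));
if (missing.length > 0) {
  throw new Error(`Missing MCP servers: ${missing.join(', ')}`);
}
await run(agent, 'Proceed with the release checklist.');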

These integration recipes ensure TestBase agents operate with full context from the systems your team already depends on.
