The Death of Context Switching: Building an AI Controller with MCP Servers

Dataemia
Author(s): Ritesh Meena

Originally published on Towards AI.

Part 2: Skip the Theory. Here’s What We Actually Built.
A technical deep-dive into the architecture that made our development workflow 10x faster

We connected an AI coding agent to 5 MCP servers that gave it real-time access to:

Our GitHub repositories (read files, create PRs, monitor CI)
Our PostgreSQL database (query schemas, validate data)
Our JIRA/Atlassian workspace (fetch stories, read acceptance criteria)
Our filesystem (read/write local files)
Web fetching (documentation, API references)

The result: An AI that doesn’t guess. It queries, verifies, and acts with full context.
The MCP Architecture

MCP Server Configuration
Here’s our actual mcp.json configuration:
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_PAT}" }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": { "POSTGRES_CONNECTION_STRING": "postgresql://user:pass@host:5432/db" }
    },
    "atlassian": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-atlassian"],
      "env": {
        "ATLASSIAN_SITE_URL": "https://your-org.atlassian.net",
        "ATLASSIAN_USER_EMAIL": "${ATLASSIAN_EMAIL}",
        "ATLASSIAN_API_TOKEN": "${ATLASSIAN_TOKEN}"
      }
    },
    "fetch": {
      "command": "npx",
      "args": ["-y", "@anthropic/mcp-fetch"]
    }
  }
}
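The `${GITHUB_PAT}`-style placeholders are resolved from the environment before the server process is launched, so tokens never have to live in the config file itself. Exactly how expansion happens is client-dependent; here is a minimal sketch (the `expand_env` helper is hypothetical, not part of any MCP SDK):

```python
import json
import os
import re

def expand_env(value):
    """Recursively expand ${VAR} placeholders from the environment.

    Unknown variables are left untouched rather than replaced with an
    empty string, so a missing token is easy to spot."""
    if isinstance(value, dict):
        return {k: expand_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [expand_env(v) for v in value]
    if isinstance(value, str):
        return re.sub(r"\$\{(\w+)\}",
                      lambda m: os.environ.get(m.group(1), m.group(0)),
                      value)
    return value

# A fragment of the mcp.json above, parsed as JSON.
config = json.loads("""
{"mcpServers": {"github": {"command": "npx",
  "args": ["-y", "@modelcontextprotocol/server-github"],
  "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "${GITHUB_PAT}"}}}}
""")

os.environ["GITHUB_PAT"] = "ghp_example"
resolved = expand_env(config)
print(resolved["mcpServers"]["github"]["env"]["GITHUB_PERSONAL_ACCESS_TOKEN"])
# → ghp_example
```

Keeping secrets in the environment also means the same mcp.json can be committed and shared across the team.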
What Each MCP Server Provides
1. GitHub MCP Server
Tools Available:
mcp_github_get_file_contents → Read any file from any repo
mcp_github_search_code → Search across repositories
mcp_github_create_pull_request → Create PRs programmatically
mcp_github_list_commits → Get commit history
mcp_github_push_files → Push changes to branches
mcp_github_create_branch → Create feature branches
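Under the hood, each of these tools is invoked over JSON-RPC 2.0 with the MCP `tools/call` method. A sketch of the message the client sends for a file read (the tool and argument names follow this article's examples; the exact schema the GitHub server expects may differ):

```python
import json

# MCP tool invocations are JSON-RPC 2.0 requests with method "tools/call".
# "name" is the tool, "arguments" its parameters.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_file_contents",
        "arguments": {
            "owner": "your-org",
            "repo": "backend-service",
            "path": "src/dao/QueryBuilder.java",
        },
    },
}
print(json.dumps(request, indent=2))
```

The agent never sees this wire format directly; the MCP client marshals tool calls for it.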
Real Usage — Fetching a File, Creating a PR
Agent: "I need to see the existing query builder pattern"

Tool Call: mcp_github_get_file_contents
  owner: "your-org"
  repo: "backend-service"
  path: "src/dao/QueryBuilder.java"
Returns: Full file contents (not a summary, actual code)

Tool Call: mcp_github_create_pull_request
  owner: "your-org"
  repo: "backend-service"
  title: "JIRA-1234 Add export feature"
  body: "## Changes\n- Added ExportService\n- Added unit tests"
  head: "feature/JIRA-1234-export"
  base: "master"
Returns: PR URL, PR number
2. PostgreSQL MCP Server
Tools Available:
mcp_postgres_query → Execute read-only SQL queries
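Because the server only exposes read access, the SQL the agent submits is limited to queries. A minimal sketch of the kind of statement guard involved (assumption: the real server's checks are more thorough than a first-keyword test):

```python
def is_read_only(sql: str) -> bool:
    """Crude read-only check: accept only statements that begin with a
    query keyword. A sketch, not the actual server's validation."""
    first = sql.lstrip().split(None, 1)[0].upper()
    return first in {"SELECT", "WITH", "EXPLAIN", "SHOW"}

# The kind of schema-introspection query the agent actually runs:
schema_query = """
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_name = 'orders'
ORDER BY ordinal_position
"""

assert is_read_only(schema_query)
assert not is_read_only("DELETE FROM orders")
print("guard ok")
```

Read-only access is the point: the agent can verify schemas and sample data without any risk of mutating production state.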
3. Atlassian MCP Server
Tools Available:
mcp_atlassian_getJiraIssue → Fetch full JIRA ticket
mcp_atlassian_searchJiraIssuesUsingJql → Search with JQL
mcp_atlassian_getConfluencePage → Fetch documentation
mcp_atlassian_addCommentToJiraIssue → Update tickets
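For the JQL search, the agent composes the query string itself. A hypothetical sketch of how that string might be built for finding related work (the helper is illustrative, not part of the Atlassian server):

```python
def related_issues_jql(project: str, keyword: str) -> str:
    """Build a JQL query for issues in a project mentioning a keyword.

    Escapes double quotes so the keyword cannot break out of the
    quoted JQL string."""
    safe = keyword.replace('"', '\\"')
    return f'project = {project} AND text ~ "{safe}" ORDER BY updated DESC'

jql = related_issues_jql("PROJ", "export feature")
print(jql)
# → project = PROJ AND text ~ "export feature" ORDER BY updated DESC
```

The resulting string is passed as the argument to mcp_atlassian_searchJiraIssuesUsingJql.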
The Complete Workflow (With Actual Tool Calls)
Step 1: Fetch Requirements
User: "Implement PROJ-1234"

Agent executes:
├── mcp_atlassian_getJiraIssue("PROJ-1234")
│   └── Returns: Full ticket with acceptance criteria
│
├── mcp_github_search_code("similar feature keyword")
│   └── Returns: 3 matching files in repo
│
└── mcp_github_get_file_contents(matching_file_path)
    └── Returns: Actual implementation to follow as pattern
Step 2: Understand Current Schema
Agent executes:
├── mcp_postgres_query("SELECT * FROM information_schema.tables WHERE…")
│   └── Returns: All relevant tables
│
└── mcp_postgres_query("SELECT column_name, data_type FROM…")
    └── Returns: Exact column definitions (no guessing)
Step 3: Implement
Agent executes:
├── read_file(local pattern file)         # Native filesystem tool
├── create_file(new_service.java)         # Native filesystem tool
├── create_file(new_service_test.java)    # Native filesystem tool
└── run_in_terminal("mvn test")           # Verify tests pass
Step 4: Create PR & Monitor CI
Agent executes:
├── run_in_terminal("git add -A && git commit -m 'PROJ-1234 Add feature'")
├── run_in_terminal("git push origin feature/PROJ-1234")
│
├── mcp_github_create_pull_request(…)
│   └── Returns: PR #507 created
│
└── Loop until CI passes:
    ├── run_in_terminal("Invoke-RestMethod …/check-runs")
    │   └── Returns: { status: "completed", conclusion: "failure" }
    │
    ├── Agent reads failure logs
    ├── Agent fixes code
    ├── run_in_terminal("git push")
    └── Repeat check
CI Monitoring Script (The Agent Runs This)
# Agent polls GitHub API for check status
$headers = @{
    "Authorization" = "Bearer $env:GITHUB_PAT"
    "Accept"        = "application/vnd.github+json"
}
$response = Invoke-RestMethod `
    -Uri "https://api.github.com/repos/org/repo/commits/$SHA/check-runs" `
    -Headers $headers
$response.check_runs | Select-Object name, status, conclusion

# Output:
# name           status     conclusion
# ----           ------     ----------
# SonarQube      completed  failure
# Unit Tests     completed  success
# Security Scan  completed  success
When SonarQube fails, the agent:

Fetches the job logs
Parses the violations
Fixes them in code
Pushes again
Re-checks

This loop is fully automated.
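The loop's control flow can be sketched in a few lines. This is a simulation, not the actual agent code: `fetch_check_runs`, `fix_violations`, and `push` are stand-ins for the tool calls described above.

```python
def ci_loop(fetch_check_runs, fix_violations, push, max_attempts=5):
    """Poll CI, fix failures, push, repeat until green or out of attempts.

    Returns the number of polls it took to reach a green build."""
    for attempt in range(1, max_attempts + 1):
        runs = fetch_check_runs()
        failures = [r for r in runs if r["conclusion"] == "failure"]
        if not failures:
            return attempt  # all checks green
        fix_violations(failures)  # agent edits the code
        push()                    # triggers a fresh CI run
    raise RuntimeError("CI still failing after max_attempts")

# Simulated run: first poll fails SonarQube, second poll is green.
polls = iter([
    [{"name": "SonarQube", "conclusion": "failure"}],
    [{"name": "SonarQube", "conclusion": "success"}],
])
attempts = ci_loop(lambda: next(polls), lambda f: None, lambda: None)
print(attempts)
# → 2
```

The `max_attempts` cap matters in practice: without it, a failure the agent cannot fix would loop forever.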
Real Results With Real Numbers

TL;DR

MCP servers give AI agents real-time access to your systems
GitHub MCP → Read code, create PRs, monitor CI
Postgres MCP → Query schemas, validate data
Atlassian MCP → Fetch JIRA tickets, read Confluence
Agent knowledge file → Patterns, guardrails, context
Result: AI that doesn’t guess — it queries, verifies, acts

The context switching died because the AI now has the context.

Full MCP documentation: https://modelcontextprotocol.io
GitHub MCP Server: https://github.com/modelcontextprotocol/servers
About the Author
Building AI-powered development workflows. Currently connecting everything to MCP servers ❤️
