
Step-by-Step: Integrating MCP Servers into Antigravity

If you’ve been following the Antigravity Mastery series, you’ve already seen how Antigravity’s AI orchestration capabilities can transform your development workflow. Today we’re diving deep into one of its most powerful features: Model Context Protocol (MCP) server integration.

MCP servers are the bridge between Antigravity’s AI agents and your existing data sources—GitHub repositories, filesystems, databases, APIs, and proprietary systems. They’re how you give your AI context about your actual codebase and operations, moving from generic advice to project-specific intelligence.

By the end of this guide, you’ll have a working MCP integration, whether you’re connecting to standard services or building custom servers for your proprietary data. Let’s get practical.

Prerequisites: Getting Your Environment Ready

Before we start, make sure you have:

1. Antigravity CLI v2.4+ installed and configured
2. Node.js v18+ (for developing custom servers)
3. A running Antigravity project with at least one agent definition
4. API credentials for any external services you plan to connect to
5. Basic familiarity with TypeScript/JavaScript for custom server development

Verify your setup:

# Check Antigravity CLI
antigravity --version

# Check Node.js
node --version

# Verify project structure
ls -la .antigravity/

If you’re missing anything, run antigravity setup to install dependencies and configure your workspace.

Connecting to Existing MCP Servers

Antigravity makes connecting to pre-built MCP servers trivial through the MCP Connector Wizard. Let’s walk through the most common integrations.

GitHub MCP Server

The GitHub server gives your agents read/write access to repositories, issues, PRs, and more.

Step 1: Launch the wizard

antigravity mcp wizard github

Step 2: Enter your credentials

The wizard will prompt for:

  • GitHub personal access token (with repo scope at minimum)
  • Repository owner/name or organization
  • Default branch (usually main)

Step 3: Review the configuration

The wizard generates a server configuration in .antigravity/mcp-servers/github.json:

{
  "id": "github-main",
  "name": "GitHub Main Repository",
  "type": "github",
  "enabled": true,
  "config": {
    "token": "ghp_your_token_here",
    "repository": "your-org/your-repo",
    "defaultBranch": "main",
    "webhooksEnabled": false
  }
}

Step 4: Test the connection

antigravity mcp test github-main

You should see a list of recent commits, open issues, or PRs.

Alternative: Manual Configuration

If you prefer manual setup or need multiple GitHub connections:

antigravity mcp add github --name "GitHub Secondary" --repo "different-org/different-repo" --token "your_token"

Filesystem MCP Server

The filesystem server exposes your local project files to agents for analysis, modification, and documentation generation.

Step 1: Launch wizard

antigravity mcp wizard filesystem

Step 2: Choose scope

The wizard asks what directories to expose:

  • Project root (recommended)
  • Specific subdirectories only
  • Custom paths (comma-separated)

Step 3: Set permissions

Choose read-only or read-write. Read-write is powerful but dangerous—only enable if your agents need to modify files and you trust their outputs.

Step 4: Exclusion patterns

Add patterns for files/directories to exclude:

node_modules/
dist/
.build/
.env
*.log
.git/

The generated config:

{
  "id": "fs-project",
  "name": "Project Filesystem",
  "type": "filesystem",
  "enabled": true,
  "config": {
    "rootPath": "/path/to/your/project",
    "readOnly": false,
    "excludePatterns": [
      "node_modules",
      "dist",
      ".git"
    ],
    "maxFileSizeKB": 1024
  }
}

Important: The filesystem server respects .gitignore patterns automatically.

Database MCP Servers

Antigravity supports PostgreSQL, MySQL, SQLite, and MongoDB through separate MCP servers.

PostgreSQL Example

antigravity mcp add postgres \
  --name "Production DB" \
  --host "db.example.com" \
  --port 5432 \
  --database "app_production" \
  --username "readonly_user" \
  --password env:DB_PASSWORD

The env: prefix tells Antigravity to read the value from an environment variable—never hardcode passwords in config files.

Generated config:

{
  "id": "pg-prod",
  "name": "Production PostgreSQL",
  "type": "postgres",
  "enabled": true,
  "config": {
    "host": "db.example.com",
    "port": 5432,
    "database": "app_production",
    "username": "readonly_user",
    "passwordSource": "environment",
    "passwordEnvVar": "DB_PASSWORD",
    "ssl": true,
    "maxConnections": 5
  }
}
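If you're curious how the env: indirection behaves at runtime, here's a rough sketch of the resolution logic. The resolveSecret helper is hypothetical and only illustrates the idea; it is not part of the Antigravity SDK:

```typescript
// Illustrative only: how an "env:"-prefixed value might be resolved at load time.
// resolveSecret is a hypothetical helper, not an Antigravity API.
function resolveSecret(value: string): string {
  if (value.startsWith('env:')) {
    const varName = value.slice('env:'.length);
    const secret = process.env[varName];
    if (secret === undefined) {
      throw new Error(`Environment variable ${varName} is not set`);
    }
    return secret;
  }
  // Plain values pass through unchanged (discouraged for passwords)
  return value;
}
```

The key point: the config file only ever stores the variable name, so the secret itself never touches version control.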

Database Query Operations

Once connected, your agents can:

  • Read schema information
  • Execute SELECT queries (with read-only users)
  • Analyze query performance
  • Generate migration scripts

Example prompts your agents can now handle:

  • “What tables contain user data?”
  • “Show me the last 10 orders with customer names”
  • “Explain the relationship between orders and payments tables”

Building Custom MCP Servers for Proprietary Data

When off-the-shelf servers don’t cover your internal systems, you’ll need a custom MCP server. Antigravity makes this straightforward with the Server SDK.

Project Setup

# Create a new MCP server project
mkdir custom-mcp-server
cd custom-mcp-server

# Initialize a TypeScript project
npm init -y
npm install @antigravity/mcp-sdk
npm install --save-dev typescript
npx tsc --init

Basic Server Structure

src/server.ts:

import { MCPServer, Resource, Tool } from '@antigravity/mcp-sdk';

class CustomEnterpriseServer extends MCPServer {
  constructor() {
    super({
      name: 'Enterprise CRM',
      version: '1.0.0',
      description: 'Access to internal customer relationship data'
    });

    // Register resources
    this.registerResource(new Resource({
      uri: 'crm://customers',
      name: 'Customer List',
      description: 'All customers in the CRM',
      mimeType: 'application/json'
    }));

    // Register tools
    this.registerTool(new Tool({
      name: 'get_customer',
      description: 'Retrieve customer by ID',
      inputSchema: {
        type: 'object',
        properties: {
          customerId: { type: 'string' }
        },
        required: ['customerId']
      }
    }, this.handleGetCustomer.bind(this)));
  }

  async handleGetCustomer(params: any): Promise<any> {
    // Your proprietary data access logic here
    const customerId = params.customerId;

    // Example: Call internal API
    const response = await fetch(
      `https://internal-api.company.com/customers/${customerId}`,
      {
        headers: {
          'Authorization': `Bearer ${process.env.INTERNAL_API_TOKEN}`
        }
      }
    );

    if (!response.ok) {
      throw new Error(`Failed to fetch customer: ${response.statusText}`);
    }

    return await response.json();
  }

  async readResource(uri: string): Promise<{ contents: string }> {
    switch (uri) {
      case 'crm://customers': {
        // Fetch and return customer list
        const customers = await this.fetchCustomers();
        return { contents: JSON.stringify(customers, null, 2) };
      }
      default:
        throw new Error(`Unknown resource: ${uri}`);
    }
  }
}

const server = new CustomEnterpriseServer();
server.start();

Building and Deployment

# Build
npm run build

# Test locally
node dist/server.js

# Package for distribution
npm pack

Registering with Antigravity

# Register the custom server
antigravity mcp register ./custom-mcp-server-1.0.0.tgz \
  --name "Enterprise CRM" \
  --version 1.0.0

# Enable it for your project
antigravity mcp enable enterprise-crm

Advanced Custom Server Patterns

Streaming Data Source:

registerTool(new Tool({
  name: 'stream_transactions',
  description: 'Stream real-time transactions matching filter',
  inputSchema: {
    type: 'object',
    properties: {
      minAmount: { type: 'number', default: 0 },
      customerSegment: { type: 'string' }
    }
  }
}), async (params) => {
  // Return a stream URI that agents can subscribe to
  return {
    streamUri: `transactions://realtime?min=${params.minAmount}&segment=${params.customerSegment}`,
    metadata: {
      format: 'json-stream',
      chunkSize: 100
    }
  };
});

Bi-directional Updates:

registerTool(new Tool({
  name: 'update_customer_notes',
  description: 'Add notes to customer record',
  inputSchema: {
    type: 'object',
    properties: {
      customerId: { type: 'string' },
      notes: { type: 'string' }
    },
    required: ['customerId', 'notes']
  }
}), async (params) => {
  // Write to your internal system
  await internalAPI.updateCustomerNotes(params.customerId, params.notes);
  return { success: true, updatedAt: new Date().toISOString() };
});

The MCP Connector Wizard: Your New Best Friend

Antigravity’s MCP Connector Wizard (antigravity mcp wizard) is more than just a configuration helper—it’s an intelligent assistant that:

1. Detects existing services in your environment (local databases, running APIs, cloud credentials)
2. Generates secure configuration templates with proper secret handling
3. Validates connections before saving
4. Creates agent prompts tuned to the server’s capabilities
5. Documents the integration in your project README

Common wizard commands:

# Interactive wizard for any supported server type
antigravity mcp wizard [type]

# Quick setup with command-line args (no prompts)
antigravity mcp add postgres --host localhost --database mydb --auto-ssl

# Discover what wizards are available
antigravity mcp wizards list

The wizard also detects when you’re in a cloud environment (AWS, GCP, Azure) and offers cloud-native integrations like:

  • AWS Secrets Manager credential retrieval
  • Google Cloud SQL connection pooling
  • Azure Key Vault integration

Server Registry: Discovering and Managing Your MCP Ecosystem

The MCP Server Registry (antigravity mcp registry) is your control plane for all MCP integrations across projects and teams.

Viewing Registered Servers

antigravity mcp registry list

Output:


ID              Name                 Type        Status    Projects
github-main     GitHub Main          github      active    2
fs-project      Project Filesystem   filesystem  active    1
pg-prod         Production Postgres  postgres    active    3
enterprise-crm  Enterprise CRM       custom      inactive  0

Sharing Servers Across Projects

Instead of configuring the same GitHub connection in every project, share it from the registry:

# Share a server (makes it available to your team/organization)
antigravity mcp registry share github-main --team "frontend-team"

# Use a shared server in a project
antigravity mcp add --from-registry github-main

Server Versioning

Custom servers can be versioned and updated:

# List all versions of a server
antigravity mcp registry versions enterprise-crm

# Update a project to use a new version
antigravity mcp update enterprise-crm --version 1.2.0

Security Auditing

The registry provides security insights:

# Check for credential issues
antigravity mcp registry audit

Example output:


✓ All servers use certificate-based auth
⚠ 2 servers have passwords stored in config (use env vars)
✗ 1 server uses insecure connection (ssl: false)


Security Considerations: Don’t Skip This

MCP servers are powerful—they give AI agents access to your data. That power requires careful security controls.

Credential Management

NEVER commit credentials to version control. Antigravity provides several secure patterns:

{
  // Read from an environment variable
  "passwordSource": "environment",
  "passwordEnvVar": "DB_PASSWORD",

  // Read from a file
  "passwordSource": "file",
  "passwordFile": ".secrets/db.pass",

  // HashiCorp Vault
  "passwordSource": "vault",
  "vaultPath": "secret/data/database"
}

Principle of Least Privilege

For database servers:

  • Use read-only users whenever agents only need to query
  • Create dedicated service accounts with minimal permissions
  • Avoid superuser or root database accounts
  • Restrict to specific schemas/tables where possible

For filesystem servers:

  • Use readOnly: true if agents don’t need to write
  • Set maxFileSizeKB to prevent memory exhaustion
  • Exclude sensitive directories even if they’re in project root
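To see why the scoping and size limits matter, here's a rough sketch of the kind of guard a filesystem server runs before returning a file. This isFileAllowed helper is hypothetical, not the actual Antigravity implementation:

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Hypothetical guard, not the actual Antigravity implementation.
function isFileAllowed(
  rootPath: string,
  filePath: string,
  excludePatterns: string[],
  maxFileSizeKB: number
): boolean {
  const relative = path.relative(rootPath, filePath);
  // Refuse paths that escape the configured root
  if (relative.startsWith('..')) return false;
  // Refuse anything under an excluded directory
  const segments = relative.split(path.sep);
  if (segments.some(seg => excludePatterns.includes(seg))) return false;
  // Refuse oversized files to bound memory use
  const sizeKB = fs.statSync(filePath).size / 1024;
  return sizeKB <= maxFileSizeKB;
}
```

Note the order: path checks happen before touching the disk, so excluded or escaping paths are rejected without ever being read.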

For custom servers:

  • Authenticate every request with API keys or mTLS
  • Implement rate limiting to prevent abuse
  • Log all access for audit trails
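A minimal token-bucket limiter covers the rate-limiting point. This is a sketch to adapt to your traffic, not a production library:

```typescript
// Simple token-bucket limiter: allows `capacity` requests per `refillMs` window.
class RateLimiter {
  private tokens: number;
  private lastRefill: number;

  constructor(private capacity: number, private refillMs: number) {
    this.tokens = capacity;
    this.lastRefill = Date.now();
  }

  tryAcquire(): boolean {
    const now = Date.now();
    if (now - this.lastRefill >= this.refillMs) {
      this.tokens = this.capacity;  // refill the bucket each window
      this.lastRefill = now;
    }
    if (this.tokens > 0) {
      this.tokens--;
      return true;
    }
    return false;
  }
}
```

Call tryAcquire() at the top of each tool handler and return a rate-limit error when it comes back false.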

Network Security

{
  "ssl": true,
  "allowedHosts": ["internal-api.company.com"],
  "networkPolicy": "restrict-outbound",
  "timeoutSeconds": 30
}

Use private networking when possible—don’t expose internal APIs to the public internet just for MCP access.

Secret Rotation

Automate credential rotation:

#!/bin/bash
# Rotate the database password and update all dependent MCP configs
NEW_PASS=$(openssl rand -base64 32)
antigravity mcp secret set DB_PASSWORD "$NEW_PASS" --server pg-prod
ansible-playbook rotate-db-passwords.yml --extra-vars "new_pass=$NEW_PASS"

Agent-Specific Access Control

You can restrict which agents can use which MCP servers:

# .antigravity/agents/search-agent.yaml
name: search-agent
mcpServers:
  - github-main   # Can access GitHub
  # fs-project is NOT listed, so this agent can't access the filesystem

# .antigravity/agents/code-agent.yaml
name: code-agent
mcpServers:
  - fs-project    # Can modify files
  - github-main   # Can push changes

Troubleshooting: When Things Go Wrong

Even with careful setup, issues arise. Here’s how to diagnose them.

Connection Failures

Symptom: mcp test or agent execution fails with “Connection refused” or “Authentication failed”

Diagnosis:

# 1. Check server status
antigravity mcp status

# 2. Enable debug logging
antigravity mcp server github-main --debug

# 3. Test credentials manually (if applicable). For databases:
PGPASSWORD=your_password psql -h host -U user -d db -c "SELECT 1"

Common fixes:

  • Verify network connectivity (firewall rules, VPNs)
  • Check if credentials have expired (GitHub tokens, cloud IAM keys)
  • Ensure SSL certificates are valid (not self-signed unless configured)
  • Confirm the service is running
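For transient failures (a flaky VPN, a brief database restart), a custom server can retry with exponential backoff instead of failing the agent's whole request. A sketch, with illustrative defaults:

```typescript
// Retry an async operation with exponential backoff.
// Defaults (3 attempts, 100ms base delay) are illustrative, not prescriptive.
async function withRetry<T>(
  op: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Waits 100ms, then 200ms, then 400ms, ...
        await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Keep the attempt count small: an agent waiting behind five retries of a 30-second timeout is worse than a fast, clear error.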

Performance Issues

Symptom: Agent queries are slow or time out

Diagnosis:

# Check server health metrics
antigravity mcp metrics github-main

Look for:

  • High response times (>2s is concerning)
  • Database connection pool exhaustion
  • Large result sets (>10MB)


Solutions:

  • Add query limits: SELECT * FROM users LIMIT 100
  • Increase server timeout: "timeoutSeconds": 60
  • Enable caching for static data: "cacheTTLSeconds": 300
  • Scale up database resources or add read replicas
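The caching option maps to a simple time-to-live cache inside the server. A sketch of the idea (the class and its shape are illustrative; cacheTTLSeconds in the config would feed the constructor):

```typescript
// Minimal TTL cache: entries expire ttlMs milliseconds after being set.
class TTLCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key);  // evict stale entries lazily on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```

This works well for slow-changing data like database schemas; avoid caching anything the agent needs fresh, such as live order status.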

Data Quality Issues

Symptom: Agents return incorrect or incomplete data

Check:

  • Does the MCP server have access to all necessary tables/APIs?
  • Are row-level security policies blocking queries?
  • Do API rate limits truncate results?

Example: GitHub GraphQL API pagination

// Ensure your custom server handles pagination
let allIssues = [];
let cursor = null;
let hasNextPage = true;
while (hasNextPage) {
  const result = await graphqlQuery(`
    query($cursor: String) {
      repository(owner: "org", name: "repo") {
        issues(first: 100, after: $cursor) {
          nodes { title number state }
          pageInfo { hasNextPage endCursor }
        }
      }
    }
  `, { cursor });

  allIssues = allIssues.concat(result.repository.issues.nodes);
  cursor = result.repository.issues.pageInfo.endCursor;
  hasNextPage = result.repository.issues.pageInfo.hasNextPage;
}

Agent Prompt Engineering

Sometimes the MCP server is fine, but the agent doesn’t know how to use it effectively.

Debug with:

# See what resources are available
antigravity mcp resources list

# Test a tool call directly
antigravity mcp call github-main listIssues --repo "your-org/your-repo"

Improve agent prompts:

# .antigravity/agents/review-agent.yaml
prompt: |
  You are a code review assistant with access to:
  - GitHub issues and pull requests (tool: github-getPR)
  - Project source code (resource: fs://src/*)
  - Build logs (resource: logs://build)

  When reviewing PRs:
  1. Fetch the PR details with github-getPR
  2. Read changed files from the filesystem
  3. Check if recent builds passed
  4. Provide feedback on code quality, tests, and documentation

Restarting and Recovering

If a server becomes unresponsive:

# Restart a specific server
antigravity mcp restart github-main

# Restart all servers
antigravity mcp restart --all

# Clear server cache (use with caution)
antigravity mcp clear-cache --server pg-prod

Complete Example: Putting It All Together

Let’s build a realistic integration for a typical web application.

Scenario: A Next.js app with:

  • GitHub repository for code
  • PostgreSQL database for user data
  • Local filesystem for source code
  • Custom MCP server for Stripe billing data

Step 1: GitHub integration

antigravity mcp wizard github
# Follow the prompts; use a read-only token


Step 2: PostgreSQL

antigravity mcp add postgres \
  --name "App Database" \
  --host ${DB_HOST} \
  --database nextjs_app \
  --username readonly_app \
  --password env:DB_READONLY_PASS \
  --ssl

Step 3: Filesystem

antigravity mcp wizard filesystem
# Scope: project root only, read-only, exclude .env and node_modules


Step 4: Custom Stripe server

// stripe-mcp-server/src/index.ts
import { MCPServer, Tool } from '@antigravity/mcp-sdk';

class StripeMCPServer extends MCPServer {
  constructor() {
    super({ name: 'Stripe Billing', version: '1.0.0' });

    this.registerTool(new Tool({
      name: 'get_customer_subscriptions',
      description: 'Get active subscriptions for a customer',
      inputSchema: {
        type: 'object',
        properties: {
          customerEmail: { type: 'string' }
        },
        required: ['customerEmail']
      }
    }, this.getSubscriptions.bind(this)));
  }

  async getSubscriptions(params: any): Promise<any> {
    const stripe = require('stripe')(process.env.STRIPE_SECRET_KEY);
    const customers = await stripe.customers.list({ email: params.customerEmail });

    if (customers.data.length === 0) {
      return { subscriptions: [] };
    }

    const customerId = customers.data[0].id;
    const subscriptions = await stripe.subscriptions.list({
      customer: customerId,
      status: 'active'
    });

    return {
      customerId,
      subscriptions: subscriptions.data.map(sub => ({
        id: sub.id,
        plan: sub.plan.nickname,
        amount: sub.plan.amount,
        currentPeriodEnd: sub.current_period_end
      }))
    };
  }
}

new StripeMCPServer().start();

Step 5: Build and register

cd stripe-mcp-server
npm run build
antigravity mcp register ./dist/package.tgz --name "Stripe Billing"
antigravity mcp enable stripe-mcp-server

Step 6: Create an agent that uses all four

# .antigravity/agents/support-agent.yaml
name: support-agent
model: claude-3.5-sonnet
mcpServers:
  - github-main
  - fs-project
  - pg-prod
  - stripe-mcp-server
prompt: |
  You are a customer support specialist with full visibility into:
  - Customer billing data (Stripe)
  - User account details (PostgreSQL)
  - Code repository (GitHub)
  - Source code for debugging (Filesystem)

  When a user asks about their subscription or account:
  1. Look up their Stripe subscriptions by email
  2. Check their user record in the database for account status
  3. If there's a bug, examine relevant source code files
  4. If it's a known issue, check GitHub issues
  5. Provide clear, actionable responses with next steps

Step 7: Test the agent

antigravity agent run support-agent "Customer john@example.com says their subscription was charged but they can't access premium features"

The agent should:

  • Query Stripe for john@example.com’s subscription status
  • Check the database for his user account and premium access flag
  • Possibly examine auth middleware code to understand why access is blocked
  • Provide a diagnosis and next steps

Best Practices Checklist

Before you ship your MCP integration to production:

  • [ ] All credentials use environment variables or secret managers, not hardcoded
  • [ ] Database users have minimal required privileges (read-only where possible)
  • [ ] Filesystem access is scoped to specific directories
  • [ ] Custom servers validate all inputs and implement proper error handling
  • [ ] SSL/TLS is enforced for all network connections
  • [ ] Audit logging is enabled on all MCP servers
  • [ ] Agent permissions are restricted (agents can only access servers they need)
  • [ ] Connection timeouts are set (avoid hanging indefinitely)
  • [ ] Rate limiting is configured to prevent DoS
  • [ ] Documentation includes setup instructions and troubleshooting steps

External Resources

  • MCP Specification: https://spec.modelcontextprotocol.io/
  • Antigravity MCP Docs: https://docs.antigravity.dev/mcp
  • MCP SDK Reference: https://github.com/antigravity-dev/mcp-sdk
  • GitHub MCP Server: https://github.com/antigravity-dev/mcp-github
  • Security Best Practices: https://docs.antigravity.dev/security/mcp
  • Community MCP Servers: https://marketplace.antigravity.dev/mcp

Wrap-Up

MCP integration is where Antigravity transforms from a code assistant to a true development partner that understands your specific context. The combination of pre-built servers (GitHub, databases, filesystem) and the ability to build custom ones means you can connect Antigravity to any data source in your stack.

Remember: With great power comes great responsibility. Secure your credentials, follow the principle of least privilege, and audit access regularly.

In the next post, we’ll explore advanced agent orchestration patterns for complex multi-step workflows. Until then, happy integrating!