Article Extract: Building the Plumbing for AI to Use the Internet
Overview
This podcast episode features Alex Rattray, founder of Stainless, discussing the Model Context Protocol (MCP), a new standard enabling AI systems to interact with software tools and APIs.
Key Concepts
APIs and MCPs as Internet Infrastructure
Rattray compares APIs to dendrites in the brain: “APIs are at the heart and center of [modern software], just like dendrites are the center of the mesh of the brain.” MCPs represent the next evolution, creating native interfaces for language models, much as websites present interfaces for humans.
The Challenge of LLM-Friendly Design
The fundamental difficulty lies in exposing APIs ergonomically. As Rattray explains, designing intuitive interfaces for language models proved harder than building Python SDKs for developers, because how LLMs process information is far less well understood.
Design Principles for MCP Servers
Stainless has identified critical patterns:
- Limited tool sets - Too many options confuse models
- Clear naming and descriptions - Precise language guides appropriate usage
- Minimal input fields - Only essential parameters, reducing cognitive load
- Streamlined outputs - JSON filtering removes unnecessary data
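The patterns above can be sketched as a small tool definition and an output filter. This is a minimal illustration, not Stainless's implementation: the tool dict follows the general MCP tool shape (`name`, `description`, `inputSchema` as JSON Schema), and `filter_output` plus the `lookup_customer` tool name are hypothetical helpers invented for this example.

```python
# Hypothetical sketch of an LLM-friendly tool, following the MCP tool shape.
# A single, clearly named tool with one essential input field keeps the
# model's choices narrow and its usage unambiguous.
LOOKUP_CUSTOMER_TOOL = {
    "name": "lookup_customer",
    "description": "Fetch a single customer record by email address.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "email": {"type": "string", "description": "Customer email"},
        },
        "required": ["email"],
    },
}

def filter_output(record: dict, keep: list[str]) -> dict:
    """Streamline the tool's JSON output: keep only the fields the model
    needs, dropping noise that would waste tokens and attention."""
    return {k: v for k, v in record.items() if k in keep}

raw = {
    "id": "cus_123",
    "email": "a@example.com",
    "plan": "enterprise",
    "internal_billing_ref": "xyz",          # irrelevant to the model
    "created_at_epoch_ms": 1700000000000,   # likewise
}

print(filter_output(raw, keep=["id", "email", "plan"]))
# → {'id': 'cus_123', 'email': 'a@example.com', 'plan': 'enterprise'}
```

Each principle shows up here: one tool, a precise name and description, a single required input, and a filtered response.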
Practical Application at Stainless
The company uses MCP servers for its own business operations: querying customer databases, cross-referencing HubSpot data, and aggregating Notion notes. They have also built what Rattray calls a “shared company brain” using Claude Code, automatically archiving useful insights in GitHub for team access.