agentsweb.org works with any HTTP client. Here's how to integrate it with the tools you're already using for AI agent web access.
The fastest way to give Claude Code web search and fetch capabilities. Install the MCP server:
npx -y intercept-mcp
Add to your Claude Code MCP config (~/.claude/settings.json):
{ "mcpServers": { "intercept": { "command": "npx", "args": ["-y", "intercept-mcp"] } } }
Now Claude Code can search the web, fetch pages, and read cached markdown — all through agentsweb.org.
Cursor supports MCP servers. Add to your Cursor MCP config:
{ "mcpServers": { "intercept": { "command": "npx", "args": ["-y", "intercept-mcp"] } } }
Your Cursor agent now has full web access — search, fetch, and cache via agentsweb.
Windsurf's Cascade supports MCP. Add the same configuration:
{ "mcpServers": { "intercept": { "command": "npx", "args": ["-y", "intercept-mcp"] } } }
Cascade can now search and read the web through the agentsweb markdown API.
For OpenAI's Codex and custom GPT agents, use the HTTP API directly:
GET https://agentsweb.org/research?q=your+query
GET https://agentsweb.org/fetch?url=https://example.com
Add these as function definitions in your agent's tool configuration. The JSON response parses cleanly into any agent framework.
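As one possible shape for those function definitions, here is a sketch in the OpenAI function-calling tool schema; the descriptions and parameter docs are illustrative, not taken from agentsweb's own docs:

```python
# Tool definitions for the two agentsweb endpoints, in the schema
# OpenAI's chat/completions API expects in its `tools` array.
AGENTSWEB_TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "research",
            "description": "Search the web via agentsweb.org/research; "
                           "returns search results with fetched, cached pages.",
            "parameters": {
                "type": "object",
                "properties": {
                    "q": {"type": "string", "description": "Search query"},
                },
                "required": ["q"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "fetch",
            "description": "Fetch a web page as markdown via agentsweb.org/fetch.",
            "parameters": {
                "type": "object",
                "properties": {
                    "url": {"type": "string", "description": "Page URL to fetch"},
                },
                "required": ["url"],
            },
        },
    },
]
```

When the model emits a tool call, map the function name and arguments onto the matching GET request above and return the JSON body as the tool result.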
Use the Python SDK or plain requests:
pip install agentsweb
Or integrate directly as a custom tool:
import requests

def fetch_page(url: str) -> str:
    # Pass the target URL as a param so requests percent-encodes it,
    # keeping any query string inside the target intact.
    r = requests.get("https://agentsweb.org/fetch", params={"url": url})
    r.raise_for_status()
    return r.json()["markdown"]
Wrap it as a LangChain Tool and your agent has web access with caching, search, and prompt injection protection built in.
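LangChain's Tool wrapper wants a name, a description, and a callable. A stdlib-only sketch of that callable is below; the `markdown` response field comes from the snippet above, while the helper names and description text are illustrative:

```python
import json
import urllib.request
from urllib.parse import urlencode

AGENTSWEB = "https://agentsweb.org"

def build_fetch_url(target: str) -> str:
    # Percent-encode the target URL so its own query string
    # survives being nested inside ours.
    return f"{AGENTSWEB}/fetch?{urlencode({'url': target})}"

def fetch_page(target: str) -> str:
    # Fetch via agentsweb and return the page's markdown body.
    with urllib.request.urlopen(build_fetch_url(target)) as resp:
        return json.loads(resp.read())["markdown"]

# The three pieces LangChain's Tool(name=..., func=..., description=...)
# constructor takes (description text is hypothetical):
fetch_tool_spec = {
    "name": "fetch_page",
    "func": fetch_page,
    "description": "Fetch any web page as clean markdown.",
}
```

Pass the same three fields to your framework's tool constructor of choice; nothing here is LangChain-specific.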
No SDK needed. Just HTTP:
import requests
# Search the web
r = requests.get("https://agentsweb.org/web?q=python+async+tutorial")
results = r.json()["results"]
# Fetch a specific page as markdown
r = requests.get("https://agentsweb.org/fetch",
                 params={"url": "https://docs.python.org/3/library/asyncio.html"})
markdown = r.json()["markdown"]
# Research: search + fetch + cache in one call
r = requests.get("https://agentsweb.org/research?q=rust+error+handling&count=3")
The simplest possible integration. No libraries, no dependencies:
curl "https://agentsweb.org/fetch?url=https://example.com"
curl "https://agentsweb.org/web?q=your+search+query"
curl "https://agentsweb.org/research?q=deep+learning+transformers"
curl "https://agentsweb.org/raw?url=https://example.com"   # raw markdown, no JSON
agentsweb is a REST API. If your tool can make HTTP GET requests, it can use agentsweb. No auth headers. No API keys. No OAuth. No SDK.
GET https://agentsweb.org/fetch?url={any_url} — get any web page as markdown
GET https://agentsweb.org/web?q={query} — search the web
GET https://agentsweb.org/research?q={query} — search + fetch + cache
That's the entire API surface for web scraping for AI: three endpoints (plus /raw, a JSON-free variant of /fetch). Zero configuration. Works everywhere.
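All three endpoints share one URL shape, so a client reduces to a URL builder. A minimal stdlib sketch under that assumption (class and method names are mine; the `count` parameter appears in the research example above; dispatching the request is left to whatever HTTP library you already use):

```python
from urllib.parse import urlencode

class AgentswebClient:
    """Builds request URLs for the agentsweb endpoints."""
    BASE = "https://agentsweb.org"

    def _url(self, endpoint: str, **params) -> str:
        # urlencode percent-escapes values, so nested URLs survive intact.
        return f"{self.BASE}/{endpoint}?{urlencode(params)}"

    def fetch(self, url: str) -> str:
        return self._url("fetch", url=url)

    def web(self, q: str) -> str:
        return self._url("web", q=q)

    def research(self, q: str, **extra) -> str:
        return self._url("research", q=q, **extra)

client = AgentswebClient()
```

For example, `client.research("rust error handling", count=3)` yields the same request the curl example above makes by hand.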