
Browserbase


browserbase.com

Browserbase provides a platform for automating web interactions using AI-powered browser agents.

developer-tools · browser-automation · web-scraping · ai-integration · automation-framework
AI-Readiness score: 67 / 100
Level 4 · Agent-Ready

489/489 pages analyzed
3/15 questions answered

Content

AI-judged Q&A · 70% of score

65

3 of 15 buyer questions answered cleanly

Your site doesn't cover:

limits · technical · pricing
See all 11 gaps

Protocol

Technical hygiene · 30% of score

73

8 of 11 items installed

Missing:

WebMCP (+3) · Content-Signal (+3) · markdown accept (+3)
Install with Lattis · +9 pts

Protocol fix list

3 to ship · 8 passing

Start here

Install the WebMCP widget (and host the discovery manifest)

Paste the widget script: it implements the W3C WebMCP draft (navigator.modelContext) and is the actual integration. Then also host the discovery manifest so crawlers can find your tool surface without rendering JS. Do both; the script is the spec-level integration, the manifest is purely for discoverability.

+3 pts

1. Runtime — paste in <head>

<script async src="https://lattis.dev/widget.js"></script>

Calls navigator.modelContext.provideContext() — the W3C WebMCP draft. Agents on the page see your tools live, scoped to browserbase.com.

2. Discovery — host the manifest

Download webmcp.json

Host at browserbase.com/.well-known/webmcp.json. The WebMCP spec defines no discovery mechanism, so crawlers can't see runtime registrations — this manifest closes that gap.
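A sketch of what the downloaded manifest might look like. Since the WebMCP draft defines no discovery format, the actual schema is whatever Lattis's webmcp.json uses; every field name and the example tool below are illustrative assumptions, mirroring the tool shape MCP uses (name, description, JSON Schema input).

```json
{
  "name": "browserbase.com tools",
  "description": "Static mirror of tools registered at runtime via navigator.modelContext",
  "tools": [
    {
      "name": "search_docs",
      "description": "Search the site's documentation (illustrative example tool)",
      "inputSchema": {
        "type": "object",
        "properties": { "query": { "type": "string" } },
        "required": ["query"]
      }
    }
  ]
}
```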

Paste into robots.txt

Static snippets that tell agents your policy.

Content-Signal policy

Cloudflare-launched (Sep 2025) signal in robots.txt. Declare your stance on search, ai-input, and ai-train.

+3 pts
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /

Add to browserbase.com/robots.txt. Adjust values: search, ai-input, ai-train each accept yes or no.
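To sanity-check the directive before shipping it, the key=value list can be parsed mechanically. A minimal sketch (the helper name is ours, not part of any library), assuming the comma-separated yes/no format shown above:

```python
# Parse a Content-Signal line from robots.txt into a dict of booleans.
# Format assumed from the snippet above: "Content-Signal: k=yes, k=no, ...".
def parse_content_signal(line: str) -> dict[str, bool]:
    """Turn 'Content-Signal: search=yes, ai-train=no' into {'search': True, ...}."""
    _, _, value = line.partition(":")
    signals = {}
    for pair in value.split(","):
        key, _, flag = pair.strip().partition("=")
        if key:
            signals[key] = flag.strip().lower() == "yes"
    return signals

print(parse_content_signal("Content-Signal: search=yes, ai-input=yes, ai-train=no"))
# {'search': True, 'ai-input': True, 'ai-train': False}
```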

Platform-level

Requires config or code on your side. Docs linked where useful.

Markdown content negotiation

Respond to Accept: text/markdown with plain markdown — agents pay ~80% fewer tokens. If you're on Cloudflare, it's a zone toggle.

+3 pts

Zone-level toggle if you're on Cloudflare. Non-CF: implement server-side content negotiation on Accept: text/markdown.
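For the non-Cloudflare path, server-side negotiation reduces to inspecting the Accept header and picking a body. A framework-agnostic sketch; the function and the example bodies are illustrative, and a real handler would also set a Vary: Accept header so caches keep the two representations apart:

```python
# Content negotiation sketch: serve markdown when the client asks for it.
def negotiate(accept_header: str, html_body: str, md_body: str) -> tuple[str, str]:
    """Return (content_type, body) based on the request's Accept header."""
    # Drop quality params like ";q=0.1" and compare bare media types.
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    if "text/markdown" in accepted:
        return "text/markdown", md_body
    return "text/html", html_body

ctype, body = negotiate("text/markdown, */*;q=0.1", "<h1>Docs</h1>", "# Docs")
print(ctype)  # text/markdown
```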

Cloudflare: Markdown for Agents

Already passing

robots.txt · sitemap.xml · llms.txt · Allow major AI crawlers · Clean crawl rate · Server-rendered content · MCP server card · OpenAPI spec

Content gaps

11 gaps

Also covered by browserbase.com

technical · integration · operations · security

MCP Server

AI agents can query this site directly via MCP. Add this endpoint to Claude Code, Cursor, or any MCP client.

Endpoint

https://mcp.lattis.dev/s/browserbase-com/mcp

Claude Code

claude mcp add browserbase --transport http https://mcp.lattis.dev/s/browserbase-com/mcp

Cursor

{
  "mcpServers": {
    "browserbase": {
      "url": "https://mcp.lattis.dev/s/browserbase-com/mcp"
    }
  }
}
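Under the hood, both clients speak JSON-RPC 2.0 over that HTTP endpoint, opening with an `initialize` request. A sketch of the request body a client POSTs (helper name, client info, and the protocol version string are illustrative assumptions, not pinned by this page):

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 body an MCP client POSTs to start a session."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # illustrative version string
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    })

body = initialize_request()
print(json.loads(body)["method"])  # initialize
```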

WebMCP — runtime + discovery

1. Widget script — drop in <head>. WebMCP-capable browsers (Chrome 146+ Origin Trial) call navigator.modelContext.provideContext() via this script — that's the W3C draft and the actual integration agents care about.

<script async src="https://lattis.dev/widget.js"></script>

Renders a small "AI-indexed by Lattis" badge bottom-right. Hide with [data-lattis] { display: none !important; }.

2. Discovery manifest — host alongside the script. The WebMCP spec defines no discovery mechanism, so crawlers can't see the runtime registration. This static JSON closes the gap.

Download webmcp.json

Host at browserbase.com/.well-known/webmcp.json.