DN Journal
DN Journal is a media outlet that provides news and information on domain names and the domain industry.
Content
AI-judged Q&A · 70% of score
1 of 15 buyer questions answered cleanly
Your site doesn't cover:
Protocol
Technical hygiene · 30% of score
4 of 11 items installed
Missing:
Protocol fix list
7 to ship · 4 passing
Start here
Install llms.txt
Machine-readable index that routes AI agents to the pages you want them to read. We generate one for you — install at your root.
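A typical llms.txt follows the llmstxt.org convention: an H1 title, a blockquote summary, then H2 sections of annotated links. This is an illustrative sketch, not the file Lattis generates; the section names and URLs are placeholders:

```markdown
# DN Journal

> News and information on domain names and the domain industry.

## Key pages
- [Domain Sales](https://dnjournal.com/domainsales.htm): weekly sales charts (placeholder URL)
- [The Lowdown](https://dnjournal.com/lowdown.htm): industry news briefs (placeholder URL)
```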
One-click installs
We host the heavy lifting — you ship one file or one line.
MCP server card
One JSON file at /.well-known/mcp/server-card.json tells agents you have an MCP server. Ours points at mcp.lattis.dev so Lattis handles queries for you.
Host at dnjournal.com/.well-known/mcp/server-card.json — points agents at Lattis as your MCP server.
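The server card is a small JSON document. The field names below are an assumption, since the discovery schema is still settling; the endpoint URL matches the one listed under "MCP Server" later in this report:

```json
{
  "name": "dnjournal",
  "description": "DN Journal domain industry news",
  "url": "https://mcp.lattis.dev/s/dnjournal-com/mcp",
  "transport": "http"
}
```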
WebMCP
Paste the widget script: it implements the W3C WebMCP draft (navigator.modelContext) and is the actual integration. Then also host the discovery manifest so crawlers can find your tool surface without rendering JavaScript. Do both: the script is the spec-level integration; the manifest is just discoverability.
1. Runtime — paste in <head>
<script async src="https://lattis.dev/widget.js"></script>
Calls navigator.modelContext.provideContext() — the W3C WebMCP draft. Agents on the page see your tools live, scoped to dnjournal.com.
2. Discovery — host the manifest
Download webmcp.json
Host at dnjournal.com/.well-known/webmcp.json. The WebMCP spec defines no discovery mechanism, so crawlers can't see runtime registrations — this manifest closes that gap.
Paste into robots.txt
Static snippets that tell agents your policy.
Content-Signal policy
Cloudflare-launched (Sep 2025) signal in robots.txt. Declare your stance on search, ai-input, and ai-train.
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
Add to dnjournal.com/robots.txt. Adjust values: search, ai-input, ai-train each accept yes or no.
Platform-level
Requires config or code on your side. Docs linked where useful.
sitemap.xml
Not found at root or via robots.txt. Agents rely on it for URL discovery. Publish one covering your public pages.
Generate a sitemap covering your public pages. Reference it from robots.txt.
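Per the sitemaps.org protocol, a minimal sitemap needs only a urlset with one loc per URL; the entry and lastmod date below are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://dnjournal.com/</loc>
    <lastmod>2025-01-15</lastmod><!-- illustrative date -->
  </url>
</urlset>
```

Then reference it with a single line in robots.txt: `Sitemap: https://dnjournal.com/sitemap.xml`.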
sitemaps.org protocol ↗
Markdown content negotiation
Respond to Accept: text/markdown with plain markdown — agents pay ~80% fewer tokens. If you're on Cloudflare, it's a zone toggle.
Zone-level toggle if you're on Cloudflare. Non-CF: implement server-side content negotiation on Accept: text/markdown.
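Outside Cloudflare, the negotiation itself is little code. A sketch in Python: `prefers_markdown` is a hypothetical helper that does minimal q-value parsing of the `Accept` header (not a full RFC 9110 parser) and decides whether to serve the markdown variant instead of HTML:

```python
def prefers_markdown(accept: str) -> bool:
    """Return True if the Accept header ranks text/markdown at or above text/html.

    Minimal q-value parsing; a sketch, not a full RFC 9110 implementation.
    """
    weights: dict[str, float] = {}
    for part in accept.split(","):
        fields = part.strip().split(";")
        mtype = fields[0].strip()
        if not mtype:
            continue
        q = 1.0
        for param in fields[1:]:
            key, _, value = param.strip().partition("=")
            if key.strip() == "q":
                try:
                    q = float(value)
                except ValueError:
                    q = 0.0
        weights[mtype] = max(q, weights.get(mtype, 0.0))
    md = weights.get("text/markdown", 0.0)
    html = max(weights.get("text/html", 0.0), weights.get("*/*", 0.0))
    return md > 0 and md >= html
```

In a request handler, branch on this to return the markdown body with `Content-Type: text/markdown`, and send `Vary: Accept` so caches keep the two variants apart.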
Cloudflare: Markdown for Agents ↗
OpenAPI spec
No OpenAPI discoverable at standard paths (/openapi.json, /api-docs, etc.). If you have an API, publish the spec.
Publish at a standard path (/openapi.json) and link from your docs. Agents will discover it automatically.
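A valid OpenAPI 3.1 document needs only `openapi`, `info`, and `paths`. The `/sales` path below is a placeholder, not an actual DN Journal endpoint:

```json
{
  "openapi": "3.1.0",
  "info": { "title": "DN Journal API", "version": "1.0.0" },
  "paths": {
    "/sales": {
      "get": {
        "summary": "Placeholder endpoint, illustrative only",
        "responses": { "200": { "description": "OK" } }
      }
    }
  }
}
```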
OpenAPI Initiative ↗
Already passing
Content gaps
14 gaps
Also covered by dnjournal.com
MCP Server
AI agents can query this site directly via MCP. Add this endpoint to Claude Code, Cursor, or any MCP client.
Endpoint
https://mcp.lattis.dev/s/dnjournal-com/mcp
Claude Code
claude mcp add dnjournal --transport http https://mcp.lattis.dev/s/dnjournal-com/mcp
Cursor
{
"mcpServers": {
"dnjournal": {
"url": "https://mcp.lattis.dev/s/dnjournal-com/mcp"
}
}
}
WebMCP — runtime + discovery
1. Widget script — drop in <head>. WebMCP-capable browsers (Chrome 146+ Origin Trial) call navigator.modelContext.provideContext() via this script — that's the W3C draft and the actual integration agents care about.
<script async src="https://lattis.dev/widget.js"></script>
Renders a small "AI-indexed by Lattis" badge bottom-right. Hide with [data-lattis] { display: none !important; }.
2. Discovery manifest — host alongside the script. The WebMCP spec defines no discovery mechanism, so crawlers can't see the runtime registration. This static JSON closes the gap.
Download webmcp.json
Host at dnjournal.com/.well-known/webmcp.json.