robots.txt & llms.txt — Deltastream Action Plan & Best Practice Template
Both files must live at the domain root: https://www.deltastream.io/robots.txt and https://www.deltastream.io/llms.txt
1. robots.txt Assessment & Optimization
Current Content
```
# START YOAST BLOCK
User-agent: Googlebot
Disallow:
User-agent: bingbot
Disallow:
User-agent: *
Allow: /
Sitemap: https://www.deltastream.io/sitemap_index.xml
# END YOAST BLOCK
```
- File is present and crawl-friendly, and it references the sitemap.
- It does not explicitly address key AI/data crawlers.
Optimization Suggestions
- List key AI/search bots explicitly (for allow/deny).
- Retain broad access for all legitimate crawlers.
- Future-proof for enterprise, AI, and LLM-related bots.
Optimized robots.txt (recommended example)
```
# Optimized robots.txt for deltastream.io
# Note: robots.txt rules match crawler product tokens (e.g. GPTBot),
# not full browser user-agent strings. Google-Extended governs use of
# site content by Gemini and Vertex AI.

User-agent: *
Allow: /

User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

User-agent: Googlebot-News
Allow: /

User-agent: Google-CloudVertexBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: Claude-Web
Allow: /

# Uncomment and customize to block sensitive/unwanted folders:
# Disallow: /admin/
# Disallow: /cart/
# Disallow: /search/
# Disallow: /tmp/
# Disallow: /private/
# Disallow: /*?*

Sitemap: https://www.deltastream.io/sitemap_index.xml
```
Action: Replace your root robots.txt with the optimized version above for maximum clarity and future compatibility.
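Before deploying a new robots.txt, you can sanity-check it locally with Python's standard `urllib.robotparser`. A minimal sketch, using a fragment of the permissive rules above:

```python
from urllib.robotparser import RobotFileParser

# A fragment of the optimized rules above (no network access needed).
rules = """\
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

Sitemap: https://www.deltastream.io/sitemap_index.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every crawler should be allowed to fetch the homepage under these rules.
for agent in ("Googlebot", "GPTBot", "ClaudeBot"):
    allowed = parser.can_fetch(agent, "https://www.deltastream.io/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

The same check can be pointed at the live file with `parser.set_url(...)` and `parser.read()` once the update is deployed.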
2. llms.txt Assessment and Creation
- Currently missing: llms.txt returns a 404.
- Create it at the domain root, following an AI/LLM-friendly structure.
Optimized llms.txt (example template)
# Deltastream
> Deltastream is a modern, cloud-native streaming data platform built for real-time analytics and event-driven applications.
### Metadata
title: Deltastream | Real-Time Stream Processing Platform
description: Unified stream management and analytics for modern data-driven enterprises.
domain: www.deltastream.io
language: en
category: Stream Processing, Real-Time Data, Cloud Platform, Data Pipeline, Analytics, SaaS
keywords: Deltastream, Stream Processing, Real-Time, Data Analytics, Event-driven, Unified Data, API, Cloud Native
### Core Pages
- [Homepage](https://www.deltastream.io): Overview, customer benefits, and product features.
- [Platform](https://www.deltastream.io/platform): Technical capabilities and integrations.
- [Docs](https://www.deltastream.io/docs): Complete API and integration documentation.
- [Support](https://www.deltastream.io/support): Help, guides, FAQ.
- [Blog](https://www.deltastream.io/blog): Best practices, use-cases, and thought leadership.
### Accessibility
alt_text_present: true
structured_data: true
mobile_friendly: true
### SEO
robots_txt: /robots.txt
### API Example
- **Endpoint:** `/api/streams`
- **Method:** GET
- **Description:** List available streams.
- **Sample Response:**
```json
[
{ "name": "analytics-stream", "status": "active" }
]
```
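As an illustration only (the `/api/streams` endpoint is a template placeholder, not a confirmed API), the sample response above could be consumed like this:

```python
import json

# Sample response body from the hypothetical /api/streams endpoint.
body = '[{"name": "analytics-stream", "status": "active"}]'

streams = json.loads(body)
active = [s["name"] for s in streams if s["status"] == "active"]
print(active)  # → ['analytics-stream']
```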
### Contact
For further support, visit: [Support](https://www.deltastream.io/support) or email [email protected]
_last updated: 2024-06-06_
Action: Create llms.txt at the domain root (https://www.deltastream.io/llms.txt) using the template above. Keep it factual and structured, and update it regularly so AI systems can parse it reliably.
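One reason to keep llms.txt structured is that consumers can extract it mechanically. A minimal sketch of how an AI-facing tool might pull the metadata and core page links out of the template above (the content below is a shortened, hypothetical fragment):

```python
import re

# A fragment of the llms.txt template above.
llms_txt = """\
# Deltastream
> Deltastream is a modern, cloud-native streaming data platform.

### Metadata
title: Deltastream | Real-Time Stream Processing Platform
language: en

### Core Pages
- [Homepage](https://www.deltastream.io): Overview and product features.
- [Docs](https://www.deltastream.io/docs): API documentation.
"""

# key: value pairs from the Metadata section.
metadata = dict(re.findall(r"^(\w+): (.+)$", llms_txt, flags=re.M))

# Markdown links ([label](url)) from the Core Pages bullets.
pages = re.findall(r"^- \[([^\]]+)\]\(([^)]+)\)", llms_txt, flags=re.M)

print(metadata["language"])  # → en
print(pages[0])              # → ('Homepage', 'https://www.deltastream.io')
```

Unstructured or inconsistent formatting breaks exactly this kind of extraction, which is why the template keeps one `key: value` per line and one link per bullet.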
3. Summary Table
| File | Present? | Optimized? | Action | Location |
|---|---|---|---|---|
| robots.txt | Yes | Minor | Update for major AI/search bots, clarity, sitemap | Domain root |
| llms.txt | No | No | Create with structured, AI-friendly info | Domain root |
Key Points
- robots.txt and llms.txt must always live at the domain root (/robots.txt and /llms.txt).
- Optimize robots.txt for all major web and AI bots.
- Use llms.txt to clearly describe your site, APIs, and factual resources for LLMs and web search.
- Keep both files updated for search, AI, and compliance needs.
For customized crawl rules or deeper API documentation, provide additional site details.