Empathia.ai — robots.txt & llms.txt Analysis and Optimization Report

Comprehensive side-by-side evaluation: policy, AI/SEO readiness, recommendations, and actionable improvements for www.empathia.ai


1. robots.txt

A. Existing Content Overview

User-agent: *
Allow: /
User-agent: Googlebot
Allow: /
User-agent: Gemini
Allow: /
User-agent: Googlebot-News
Allow: /
User-agent: Google-CloudVertexBot
Allow: /
User-agent: Google-Extended
Allow: /
User-agent: OAI-SearchBot/1.0; +https://openai.com/searchbot
Allow: /
User-agent: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot
Allow: /
User-agent: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.1; +https://openai.com/gptbot
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: anthropic-ai
Allow: /
User-agent: Claude-Web
Allow: /
Sitemap: https://www.empathia.ai/sitemap.xml

Current Behavior:

  • All major search and LLM-related crawlers are allowed full access; nothing is disallowed.
  • Several entries (the two lines beginning Mozilla/5.0, and the OAI-SearchBot line with its info URL) use full browser-style user-agent strings. robots.txt groups match product tokens such as GPTBot, ChatGPT-User, or OAI-SearchBot, not full UA strings, so those groups never match; the bots fall through to the User-agent: * group instead. Net behavior is unchanged here because that group also allows everything, but the entries are dead weight.
  • There is no crawler token named Gemini; the Google token that governs AI/Gemini use of content is Google-Extended, which is already present.
  • The Sitemap directive is present and points to the correct URL.

B. Optimization Recommendations

  • Allow important crawlers by their product tokens: Googlebot, Bingbot, GPTBot, OAI-SearchBot, ChatGPT-User, Google-Extended, ClaudeBot, PerplexityBot. Use the token only, never a full user-agent string.
  • Disallow sensitive/internal areas (/admin/, /private/, /login/, /tmp/) if they exist.
  • Avoid repetitive per-bot Allow groups when a global Allow: / already covers them; keep named groups only where you want explicit control. Note that a bot matching a named group ignores the * group entirely, so any Disallow rules must be repeated inside each named group.
  • Keep the Sitemap directive at the bottom for readability (its position does not affect parsing).
  • Keep user-agent groups organized: most specific first, the * wildcard last.

Page/Section-Specific Recommendations:

  • Homepage, About, Contact, Blog, Product, Billing & Coding, Specialties, Support, Testimonials, Community:
    These should all remain crawlable and listed in the sitemap for best discoverability.
  • Admin/Login/Internal-only URLs (if they exist):
    Disallow: /admin/
    Disallow: /login/
    Disallow: /private/
    Disallow: /tmp/
    Disallow: /test/
    Disallow: /staging/
    
  • Parameter/duplicate pages (if present):
    Disallow: /*?sessionid=
    Disallow: /*?ref=
    (the * wildcard is defined in RFC 9309 and honored by Google, Bing, and other major crawlers, though some smaller bots ignore it)
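
Before shipping new Disallow rules, it helps to sanity-check them programmatically. The sketch below uses Python's standard urllib.robotparser against a hypothetical rule set mirroring the recommendations above; note that urllib applies rules in file order (first match wins) rather than Google's longest-match semantics, so the Disallow lines are listed before the catch-all Allow:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the recommendations above. urllib's parser
# applies rules first-match, so Disallow lines come before the catch-all Allow.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Public pages stay crawlable; internal paths are blocked.
print(rp.can_fetch("GPTBot", "https://www.empathia.ai/product"))  # True
print(rp.can_fetch("GPTBot", "https://www.empathia.ai/admin/"))   # False
```

The same check can be pointed at the live file with rp.set_url(...) and rp.read() once the rules are deployed.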
    

C. Optimized robots.txt Example

# Major search engines & LLM indexers — consecutive User-agent lines
# share one rule group, so the rules below apply to each listed bot.
User-agent: Googlebot
User-agent: Bingbot
User-agent: DuckDuckBot
User-agent: OAI-SearchBot
User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: Google-Extended
User-agent: ClaudeBot
User-agent: anthropic-ai
User-agent: PerplexityBot
# Disallow sensitive/admin areas (if these exist)
Disallow: /admin/
Disallow: /login/
Disallow: /private/
Disallow: /tmp/
Disallow: /staging/
Disallow: /test/
Allow: /

# All other crawlers: same policy. A bot matching a named group above
# ignores this group, which is why the Disallow rules are repeated.
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /private/
Disallow: /tmp/
Disallow: /staging/
Disallow: /test/
Allow: /

Sitemap: https://www.empathia.ai/sitemap.xml

2. llms.txt (or LLM Optimization Page)

Empathia does not appear to publish a public llms.txt file. Below is an optimized, AI-friendly draft in line with the emerging llms.txt convention (a proposal, not a formal standard: an H1 title, a one-line blockquote summary, then sectioned link lists):

A. Optimized llms.txt for www.empathia.ai

# Empathia AI
> Complete AI Scribe & Charting Assistant for Healthcare

Empathia is an all-in-one AI assistant for patient intake, clinical charting, and billing, designed for physicians and specialists.

### Metadata
title: Empathia AI | AI Scribe & Charting for Healthcare
description: Save time and improve patient care with Empathia’s AI-powered intake, real-time scribing, coding, and billing automation.
domain: www.empathia.ai
language: en
category: Healthcare, AI, Medical Automation, Scribing, Clinical Documentation, Practice Management
keywords: AI scribe, charting, medical automation, physician workflow, EMR integration, ICD, CPT, billing, specialty medicine
sitemap: https://www.empathia.ai/sitemap.xml

### Core Pages
- [Homepage](https://www.empathia.ai): Product overview, features, specialties, testimonials, quickstart steps.
- [Product Details](https://www.empathia.ai/product): Deep dive into core features and modules.
- [Billing & Coding](https://www.empathia.ai/billing-code): AI-powered coding and claims automation.
- [All Specialties](https://www.empathia.ai/#specialties): Specialty-specific workflows.
- [Onboarding Handbook](https://community.empathia.ai/onboarding-handbook): Setup, guides, FAQs.
- [Support](https://support.empathia.ai/support/solutions): Help desk, troubleshooting.
- [Testimonials](https://love.empathia.ai): Reviews and case studies.
- [Blog & Community](https://www.empathia.ai/blog): Updates, best practices.

### Accessibility
alt_text_present: true
structured_data: true
mobile_friendly: true
multi_language: true

### SEO
robots_txt: /robots.txt
sitemap_xml: /sitemap.xml
canonical: https://www.empathia.ai
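
One practical payoff of the Core Pages list is that it is trivially machine-readable. A minimal sketch extracting the page inventory with a regular expression (the sample body is abbreviated from the draft above):

```python
import re

# Abbreviated sample of the llms.txt body drafted above.
llms_txt = """\
### Core Pages
- [Homepage](https://www.empathia.ai): Product overview, features, specialties.
- [Product Details](https://www.empathia.ai/product): Core features and modules.
"""

# Markdown links have the shape [title](url).
links = re.findall(r"\[([^\]]+)\]\((https?://[^)\s]+)\)", llms_txt)
for title, url in links:
    print(f"{title}: {url}")
```

A crawler or audit script can diff this inventory against the sitemap to catch pages missing from either file.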

B. Per-URL LLM/SEO Optimization Suggestions

  • Homepage - Current: Largely clear, but may lack explicit labels or section markers for LLMs.
    Optimized: Add metadata at the top, explicit H1, sections for features/specialties/links, and a bulleted info structure.
  • Product - Current: Details may be paragraph-heavy.
    Optimized: Use structured lists, explicit feature labels, Meta Title/Description at top, “Return to Homepage” button.
  • Billing & Coding - Current: Info is possibly mixed with other features.
    Optimized: Standalone section, structured bullet points, clear feature breakdown, Meta header.
  • Specialty Pages - Current: Generic copy per specialty.
    Optimized: Meta title/desc, specialty-specific template/ICD/CPT examples, clear “section” and “feature” markers.
  • Resources/Support/Testimonials/Blog - Current: May be siloed or lightly described.
    Optimized: Explicit descriptions, standard headings, bulleted service/support instructions, sample testimonials/case study formatting.
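
The recurring suggestion above — beginning each page with an explicit title, description, and canonical URL — can be templated so every page emits the same head block. A minimal sketch (the page values are illustrative, not taken from the live site):

```python
from html import escape

def head_tags(title: str, description: str, canonical: str) -> str:
    """Render the suggested per-page metadata as HTML head tags."""
    return "\n".join([
        f"<title>{escape(title)}</title>",
        f'<meta name="description" content="{escape(description)}">',
        f'<link rel="canonical" href="{escape(canonical)}">',
    ])

print(head_tags(
    "Empathia AI | Billing & Coding",
    "AI-powered coding and claims automation.",
    "https://www.empathia.ai/billing-code",
))
```

Centralizing the template keeps titles and canonicals consistent across pages, which is exactly what both search bots and LLM extractors reward.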

3. Summary Table of Optimization Recommendations

| URL/Section | Existing (Status) | Optimization Suggestions |
| --- | --- | --- |
| /robots.txt | All allowed; Sitemap present | Disallow sensitive/internal areas only (if present). Organize rules by bot type. Avoid redundant Allows. |
| /llms.txt | Not present | Add a summary file with metadata, core page links, structured accessibility info, and standard SEO meta. |
| /, /product, /billing-code, /specialties/... | Pages crawlable; clear to search | Add explicit headings, meta tags, section markers, and feature lists; reduce paragraph blocks; add anchor links; use canonical tags; ensure inclusion in the sitemap. |
| /admin/, /private/, ... (internal paths) | Potentially not protected | Add per-path Disallow directives in robots.txt to block crawling. Note that Disallow alone does not de-index already-known URLs; use noindex or authentication for truly private areas. |
| Sitemap | Present | Keep up to date and ensure all important pages are listed for maximum indexation. |
| LLM/AI bot friendliness | General access allowed | Use explicit, structured meta and feature lists. Add llms.txt as a machine-readable summary. Highlight links and section labels for easier LLM extraction and indexing. |

Conclusion

  • Your robots.txt is already permissive and generally SEO/AI-friendly; the main improvements are Disallow rules for sensitive areas, correct product-token User-agent names, and some organization for clarity.
  • For LLM/AI optimization, create an llms.txt (or similar) that lists metadata, links, accessibility/SEO practices, and content structure.
  • Each major URL should begin with metadata and use clear sections, labels, headings, and bulleted feature lists for best AI/search-bot parsing.
If you want a page-by-page markdown template for Empathia's key pages, or a per-specialty version of the optimized llms.txt content, just ask!