Empathia AI: robots.txt and llms.txt Audit & Optimization Report

Detailed audit and actionable recommendations for search engine and LLM accessibility, compliance, and optimization across all key Empathia AI URLs.
Last reviewed: 2024-06

Empathia AI: robots.txt and llms.txt Audit

1. robots.txt Analysis and Optimization

A. Existing robots.txt (Per Reference 1 and Provided Template)

User-agent: *
Allow: /
User-agent: Googlebot
Allow: /
User-agent: Gemini
Allow: /
User-agent: Googlebot-News
Allow: /
User-agent: Google-CloudVertexBot
Allow: /
User-agent: Google-Extended
Allow: /
User-agent: OAI-SearchBot/1.0; +https://openai.com/searchbot
Allow: /
User-agent: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot
Allow: /
User-agent: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.1; +https://openai.com/gptbot
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: anthropic-ai
Allow: /
User-agent: Claude-Web
Allow: /
  • Completely open: all content is accessible to all bots.
  • SEO- and AI-bot-friendly, but offers no protection against duplicate content, crawl-budget waste, or exposure of sensitive areas.
  • Several User-agent lines contain full HTTP user-agent strings (the Mozilla/5.0 … ChatGPT-User and GPTBot entries) instead of the crawlers' product tokens (ChatGPT-User, GPTBot, OAI-SearchBot); robots.txt groups are matched on product tokens, so these groups will never apply.
  • No Sitemap: directive is included (adding one is recommended).

B. Analysis Per URL/Section

  • Homepage (https://www.empathia.ai/)
    Current State: Homepage is fully crawlable.
    Optimization: No change needed for accessibility, but ensure duplicate entry points (e.g. /index.html, /home) redirect to or declare the canonical URL.
  • Main Directories (/product, /billing-code, /specialties/, /blog, etc.)
    Current State: All accessible.
    Optimization: Remain accessible, but block test/private/admin areas (e.g. /admin/, /test/, etc.) for security and crawl budget.
  • Parameterized/Session URLs
    Current State: Allowed; can create crawl loops / duplicates.
    Optimization: Disallow crawling URLs with query parameters unless required (Disallow: /*?*).
  • Sensitive/Admin Paths
    Current State: Not blocked.
    Optimization: Add Disallow for /admin/, /login/, /private/, /cgi-bin/, /test/.
  • Sitemap
    Current State: Not included.
    Optimization: Add Sitemap: directive.
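The Disallow: /*?* pattern recommended above relies on Googlebot-style wildcard matching, which is an extension to the original robots.txt standard. As a sketch of what such a pattern actually matches, it can be translated into a regular expression (the helper name is ours, not part of any library, and the translation is simplified: it treats every $ as an end anchor, whereas crawlers only treat a trailing $ specially):

```python
import re

def robots_pattern_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a Googlebot-style robots.txt path pattern into a regex.

    '*' matches any character sequence; '$' anchors the end of the URL.
    All other characters are matched literally.
    """
    parts = []
    for ch in pattern:
        if ch == "*":
            parts.append(".*")
        elif ch == "$":
            parts.append("$")
        else:
            parts.append(re.escape(ch))
    return re.compile("".join(parts))

rule = robots_pattern_to_regex("/*?*")          # the Disallow pattern above
assert rule.match("/product?utm_source=news")   # parameterized URL is caught
assert not rule.match("/product")               # clean URL passes
```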

C. Optimized robots.txt Example

# Optimized robots.txt for https://www.empathia.ai/

# One shared group (RFC 9309 permits multiple User-agent lines per group).
# Listing the AI and search bots alongside "*" keeps them explicitly named
# without creating separate groups: a crawler that matches its own named
# group ignores the "*" group entirely, so separate "Allow: /" groups
# would bypass the Disallow rules below.
# Note: matching is done on product tokens (e.g. GPTBot), never on full
# HTTP user-agent strings.
User-agent: *
User-agent: Googlebot
User-agent: Googlebot-News
User-agent: Google-Extended        # Controls use of content for Gemini/Vertex AI training
User-agent: Google-CloudVertexBot
User-agent: OAI-SearchBot
User-agent: ChatGPT-User
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: anthropic-ai
User-agent: Claude-Web
Disallow: /admin/
Disallow: /login/
Disallow: /private/
Disallow: /cgi-bin/
Disallow: /test/
Disallow: /*?*         # Google-style wildcard: blocks parameterized URLs (duplicate content)
Allow: /

# Include Sitemap
Sitemap: https://www.empathia.ai/sitemap.xml
  • All public content remains open.
  • Admin and sensitive directories are blocked.
  • Prevents crawl waste and duplicate content via query string restriction.
  • Sitemap included for maximum indexing efficiency.
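As a quick sanity check, the core wildcard group can be exercised with Python's standard urllib.robotparser. Note that this parser implements the classic first-match rules and has no Googlebot-style wildcard support, so the /*?* rule is omitted from this sketch:

```python
from urllib.robotparser import RobotFileParser

# Core wildcard group from the optimized file (without the /*?* rule,
# which urllib.robotparser cannot interpret).
ROBOTS = """\
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /private/
Disallow: /cgi-bin/
Disallow: /test/
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Public content stays open to search and AI crawlers alike...
assert rp.can_fetch("Googlebot", "https://www.empathia.ai/product")
assert rp.can_fetch("GPTBot", "https://www.empathia.ai/billing-code")
# ...while sensitive paths are blocked.
assert not rp.can_fetch("ClaudeBot", "https://www.empathia.ai/admin/users")
```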

2. llms.txt Analysis and Optimization

A. Existing llms.txt (Per Reference 2)

  • Clear, LLM-optimized summaries for each page/section.
  • Metadata for title, description, keywords, accessibility, and structured data.
  • Each URL listed with summary, grouped by type (homepage, specialties, support, etc.).
Example (Homepage and core pages):
- [Homepage](https://www.empathia.ai): ... (Summary)
- [Product](https://www.empathia.ai/product): ... (Summary)
- [Billing & Coding](https://www.empathia.ai/billing-code): ... (Summary)
...

B. Per-URL Analysis and Optimization Suggestions

  • Homepage (https://www.empathia.ai)
    Existing: Platform described, features and compliance noted.
    Optimization:
    • Add explicit mention of platforms (iOS/Android/Chrome/Web).
    • Note partnerships/integrations, e.g. EMR compatibility.
    • Add trial/demo CTA: “Start your free trial today.”
  • Product (https://www.empathia.ai/product)
    Existing: Features and benefits blended.
    Optimization:
    • Link to demo/video walkthroughs where available.
    • Add per-feature, 1-line impact (“Smart Intake saves 10+ min/patient”).
  • Billing & Coding (https://www.empathia.ai/billing-code)
    Existing: Automation details for coding and billing.
    Optimization:
    • State the percentage reduction in denials and claim errors (e.g. “Reduces claim errors by up to 30%”).
    • Direct link to case study/testimonial on feature.
  • Specialty Pages: Each summarized per specialty.
    Optimization:
    • Give specific example/use-case per specialty.
    • Mention any specialty-specific compliance or certifications if present.
  • Onboarding Handbook, Support, Testimonials, Community: All clearly described.
    Optimization:
    • Add direct support channels (“Chat with a live agent”).
    • Add the number or average rating of testimonials where available.
  • App Store & Platform Links: All listed.
    Optimization:
    • State user count if available ("Trusted by 12,000+ clinicians").
    • Mention regular update schedule/platform support.
  • Metadata/SEO Block: Alt text, structured data, mobile, languages, canonical.
    Optimization:
    • Explicitly list supported languages.
    • Mention accessibility certifications if any (WCAG compliance, etc.).
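The per-URL summary lines follow a simple `- [Title](URL): summary` convention, so entries can be generated from structured page records rather than edited by hand. A minimal sketch (the page data below is illustrative, echoing the report's examples, not the live site's copy):

```python
# Hypothetical page records: (title, url, summary).
pages = [
    ("Homepage", "https://www.empathia.ai",
     "AI-powered clinical documentation platform for iOS, Android, Chrome, "
     "and Web. Start your free trial today."),
    ("Billing & Coding", "https://www.empathia.ai/billing-code",
     "Automated ICD/CPT coding and billing that reduces claim errors."),
]

def render_llms_entry(title: str, url: str, summary: str) -> str:
    """Render one llms.txt line in the '- [Title](URL): summary' convention."""
    return f"- [{title}]({url}): {summary}"

block = "\n".join(render_llms_entry(*p) for p in pages)
print(block)
```

Generating the file this way keeps summaries, links, and trust signals in one place, so updates to a page record flow into llms.txt automatically.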

C. Example of Optimized llms.txt Block for One URL

Before (Product Page):
The product page details Empathia AI's core features:
- Smart Intake for patient data capture
- AI Scribe for automated clinical notes
- Smart Edit for intelligent documentation improvement
- ICD/CPT Finder for coding assistance
- AutoFill Forms for administrative tasks
- Collaboration tools for care teams
It is designed for seamless clinical workflow integration, with specialty-specific workflows.
After (Optimized):
The product page showcases Empathia AI’s suite of clinical productivity tools:
- **Smart Intake:** Fast, accurate patient intake (saves up to 10 minutes per patient visit).
- **AI Scribe:** Automated, structured clinical notes (reduces manual charting by 80%).
- **Smart Edit:** Real-time suggestions to improve documentation and compliance.
- **ICD/CPT Finder:** AI-powered assistance for precise coding and billing (reduces claim errors).
- **AutoFill Forms:** 1-click form completion for all administrative tasks.
- **Collaboration Tools:** Secure messaging and chart collaboration for care teams.

Watch a [demo video](https://www.empathia.ai/demo), or start a [free trial](https://www.empathia.ai/signup). All tools integrate with popular EMRs (Epic/Cerner/Allscripts) and are specialty configurable.

3. Summary Table

| URL/Section | Current (robots.txt) | Optimized (robots.txt) | Current (llms.txt) | Optimized (llms.txt) |
| --- | --- | --- | --- | --- |
| Homepage | Open | Ensure no duplicate entry points | Short summary; platforms not explicit | Mention all platforms, integrations, CTA |
| /product, /billing-code, /specialties | Open | Remain open; block test/private/params | Feature lists, no direct links | Add demo links, practical stats, compliance mentions |
| /admin/, /private/, etc. | Not blocked | Explicitly Disallow | N/A | N/A |
| Query parameters | Allowed | Disallow via /*?* | N/A | N/A |
| Sitemap | Not in robots.txt | Add Sitemap: directive | N/A | N/A |
| Metadata block (SEO/Accessibility) | N/A | N/A | Stated; little detail | Add language support, accessibility certs |
| Support, Community, Testimonials | Open | Remain open | Summarized | Add counts/stats, direct support, ratings |
| App Platform Links | Open | Remain open | Listed | Add user count, update schedule |

4. If robots.txt and/or llms.txt Do Not Exist

  • If either file is missing, start from the templates above and tailor them with the same optimizations (block admin/test/parameterized URLs, add a sitemap, enhance summaries and metadata, add trust signals).
  • Rebuild per the example blocks above to maximize both search-engine discoverability and LLM task usability.
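When rebuilding both files, it is worth verifying that every URL listed in llms.txt is actually crawlable under robots.txt. A stdlib-only sketch (function and variable names are ours; it uses urllib.robotparser, which has no wildcard-path support, plus a simple regex for the `- [Title](URL): summary` link convention):

```python
import re
from urllib.robotparser import RobotFileParser

MARKDOWN_LINK = re.compile(r"\[[^\]]*\]\((https?://[^)\s]+)\)")

def uncrawlable_urls(llms_text: str, robots_text: str, agent: str = "*") -> list:
    """Return llms.txt URLs that the given agent may not fetch per robots_text."""
    rp = RobotFileParser()
    rp.parse(robots_text.splitlines())
    return [url for url in MARKDOWN_LINK.findall(llms_text)
            if not rp.can_fetch(agent, url)]

# Toy inputs: a blocked admin path should be flagged, public pages pass.
robots = "User-agent: *\nDisallow: /admin/\nAllow: /\n"
llms = (
    "- [Product](https://www.empathia.ai/product): Feature overview.\n"
    "- [Admin](https://www.empathia.ai/admin/panel): Internal.\n"
)
assert uncrawlable_urls(llms, robots) == ["https://www.empathia.ai/admin/panel"]
```

Running such a check as part of deployment keeps the two files from drifting apart as new sections are added.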

5. Final Recommendations

  • robots.txt:
    • Remain transparent/open for all public and main sections, but block admin/sensitive/test/parameterized URLs.
    • Always add a reference to the sitemap file.
    • Review regularly for any private or staging URLs that must be blocked.
  • llms.txt:
    • For each URL, summarize with LLM/task-focused description and action links (demos, trials, case studies).
    • Mention supported platforms, compliance, accessibility, languages.
    • Add user counts/testimonials/ratings as trust signals where available.
  • For each new URL/section in the future:
    • robots.txt: Ensure crawlable unless sensitive.
    • llms.txt: Add a descriptive and actionable summary.

Export Options

If you need a ready-to-deploy YAML or JSON representation for ingestion or direct file export, specify your preferred format.