fedfitness.com robots.txt & llms.txt Analysis and Optimization Report


Comprehensive technical report for enhanced SEO and AI/LLM visibility.

1. robots.txt Analysis & Optimization

1.1 Existing Content / Issues

  • Multiple, repetitive User-agent: * and Disallow blocks
  • Malformed and harmful line: Disallow:/products/fed-weighted-eco-friendly-dumbbells (missing space after Disallow:; worse, the rule blocks a real product page)
  • Potential slowdown for good bots via Crawl-delay
  • Complexity from overlapping wildcards and duplicate patterns
  • Risk of blocking main product/collection pages via overly broad or error-prone rules
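The malformed-directive issue above can be caught automatically before deployment. A minimal lint sketch (the regex and function name are illustrative, not part of any standard tool):

```python
import re

# Directives expected in a Shopify-style robots.txt; the value, when present,
# must be separated from the directive by a space ("Disallow:" alone is valid).
DIRECTIVE = re.compile(
    r"^(User-agent|Disallow|Allow|Crawl-delay|Sitemap):( \S.*)?$",
    re.IGNORECASE,
)

def lint_robots(text: str) -> list[str]:
    """Return a list of malformed, non-empty, non-comment lines."""
    problems = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are fine
        if not DIRECTIVE.match(stripped):
            problems.append(f"line {lineno}: {stripped}")
    return problems

faulty = "User-agent: *\nDisallow:/products/fed-weighted-eco-friendly-dumbbells\n"
print(lint_robots(faulty))  # flags the line with no space after 'Disallow:'
```

Running this over the current file would surface the faulty product rule immediately.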

1.2 Optimized robots.txt Content

# SEO-optimized robots.txt for fedfitness.com

User-agent: *
Disallow: /a/downloads/-/*
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*filter*&*filter*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /policies/
Disallow: /*?*oseid=*
Disallow: /*preview_theme_id*
Disallow: /*preview_script_id*
Disallow: /*?*ls=*&ls=*
Disallow: /search
Disallow: /cdn/wpm/*.js
Disallow: /recommendations/products
Disallow: /apple-app-site-association
Disallow: /.well-known/shopify/monorail

User-agent: AdsBot-Google
Disallow: /checkouts/
Disallow: /cart
Disallow: /orders

User-agent: Nutch
Disallow: /

User-agent: AhrefsBot
Crawl-delay: 10

User-agent: AhrefsSiteAudit
Crawl-delay: 10

User-agent: MJ12bot
Crawl-delay: 10

User-agent: Pinterest
Crawl-delay: 1

Sitemap: https://www.fedfitness.com/sitemap.xml
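Before deploying, the new rules can be sanity-checked locally with Python's standard-library `urllib.robotparser`. (Note: this parser does simple prefix matching only, so wildcard rules such as `/collections/*sort_by*` are not evaluated here; the excerpt below covers just the prefix-based rules.)

```python
from urllib.robotparser import RobotFileParser

# Excerpt of the optimized rules (prefix-based rules only).
ROBOTS = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /search

User-agent: AhrefsBot
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

base = "https://www.fedfitness.com"
# The product page accidentally blocked by the old file must be crawlable again.
print(rp.can_fetch("*", base + "/products/fed-weighted-eco-friendly-dumbbells"))  # True
# Internal search and checkout stay blocked.
print(rp.can_fetch("*", base + "/search"))    # False
print(rp.can_fetch("*", base + "/checkout"))  # False
print(rp.crawl_delay("AhrefsBot"))            # 10
```

A check like this can run in CI so future edits cannot silently re-block product URLs.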

1.3 Per-URL Optimization Suggestions

URL | Existing Status | Optimized Action & Reason
/ (Homepage) | Crawlable | ✅ Keep allowed
/products/fed-weighted-eco-friendly-dumbbells | ❌ Blocked (by the faulty rule) | Remove the faulty Disallow; allow crawling (product pages carry SEO value)
/collections/*sort_by* | Blocked | ✅ Keep blocked (prevents crawling of filtered/duplicate collection pages)
/search | Blocked | ✅ Keep blocked (no SEO value; prevents index bloat)
Summary (robots.txt):
  • Remove: lines accidentally blocking real product URLs
  • Simplify: deduplicate and clarify rules, use correct syntax (Disallow: /path)
  • Protect: admin, carts, checkouts, internal search, filtered/sorted collections
  • Allow: product, homepage, main collection pages to be crawled

2. llms.txt Analysis & Optimization

2.1 Existing Content / Issues

  • llms.txt does not exist (404 Not Found)
  • AI crawlers and LLM indexers have no instructions on what to crawl or avoid

2.2 Optimized llms.txt Proposal

# LLMS Content Guidelines for https://www.fedfitness.com

User-agent: *
Allow: /

Disallow: /cart/
Disallow: /checkout/
Disallow: /orders/
Disallow: /account/
Disallow: /private/
Disallow: /user-data/

Sitemap: https://www.fedfitness.com/sitemap.xml

# Content Structure (for LLMs):
# - Product, class, and wellness info: /products/ /classes/ /courses/ /wellness/
# - Articles/blog: /blog/
# - Info pages: /about/ /contact/ /terms-and-conditions/
# - Multimedia: /media/
# Quality Guidelines:
# - Clear, factual, regularly updated content
# - Semantic HTML structure (headings, lists)
# - Alt text for images, schema markup where possible

# AI Crawling Tips:
# - Prioritize factual/evergreen guides and class schedules
# - Exclude transactional/private pages
# - Use sitemap for crawling structure
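Once the file is drafted, a quick script can confirm it declares the sitemap and blocks every private path before it ships. A simple sketch (the function name is illustrative; the path list mirrors the proposal above):

```python
# Private paths the proposal says must be disallowed for AI crawlers.
REQUIRED_DISALLOWS = {"/cart/", "/checkout/", "/orders/", "/account/",
                      "/private/", "/user-data/"}

def check_llms_txt(text: str) -> list[str]:
    """Report directives missing from a draft llms.txt."""
    lines = [line.strip() for line in text.splitlines()]
    issues = []
    if not any(line.startswith("Sitemap:") for line in lines):
        issues.append("missing Sitemap: directive")
    declared = {line.split(":", 1)[1].strip()
                for line in lines if line.startswith("Disallow:")}
    for path in sorted(REQUIRED_DISALLOWS - declared):
        issues.append(f"private path not disallowed: {path}")
    return issues

draft = """\
User-agent: *
Allow: /
Disallow: /cart/
Disallow: /checkout/
Disallow: /orders/
Disallow: /account/
Disallow: /private/
Disallow: /user-data/
Sitemap: https://www.fedfitness.com/sitemap.xml
"""
print(check_llms_txt(draft))  # [] -> draft covers everything
```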

2.3 Per-URL Optimization Suggestions

URL | Existing Status | Optimized Action & Reason
/llms.txt | ❌ Missing (404) | Create for AI-friendliness, per the template above
/classes/, /courses/, /wellness/, /blog/ | Not specified | Allow in llms.txt; ensure content is well-structured for LLMs to parse
/cart/, /checkout/, /private/, /user-data/ | Not specified | Disallow in llms.txt to keep private/user-specific data out
/sitemap.xml | Present | Reference in both llms.txt and robots.txt for discovery

2.4 Additional AI-Friendliness Steps

  • Use semantic HTML, descriptive alt text, and schema.org markup on site pages
  • Regularly update evergreen and FAQ content
  • Ensure all key pages are internally well-linked
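As one concrete example of the schema markup suggested above, each product page could embed a schema.org Product JSON-LD block. A sketch of generating one in Python (the product name, URL, and description are placeholders drawn from this report, not live site data):

```python
import json

def product_jsonld(name: str, url: str, description: str) -> str:
    """Build a schema.org Product JSON-LD string for a <script type="application/ld+json"> tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "description": description,
    }
    return json.dumps(data, indent=2)

snippet = product_jsonld(
    "FED Weighted Eco-Friendly Dumbbells",
    "https://www.fedfitness.com/products/fed-weighted-eco-friendly-dumbbells",
    "Eco-friendly weighted dumbbells for home workouts.",
)
print(snippet)
```

Structured data like this helps both search engines and LLM crawlers extract product facts reliably.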

3. Summary Table

File | Current Issues | Optimized Actions
robots.txt | Redundant rules, accidental product block, syntax errors | Deduplicate, fix syntax, unblock products, protect admin/search/checkout
llms.txt | File missing (404) | Create per the AI/LLM guidance above; allow main content, block private/checkout

Example URL/Section | Crawlable? (after fix) | Action
https://www.fedfitness.com/ | Yes | Main page allowed
/products/fed-weighted-eco-friendly-dumbbells | Yes | Unblock product; allow indexing
/collections/*sort_by* | No | Keep blocked (duplicate/filtered content)
/search | No | Keep blocked
/llms.txt | Yes (after creation) | Create file; allow factual content, disallow private/checkout

4. Action Checklist

  • For robots.txt:
    • Replace with the optimized version above
    • Remove accidental blocks for key product pages
  • For llms.txt:
    • Create following the recommended template
    • Update as the site expands or adds new content types
Implement these optimizations to ensure best-in-class SEO and future AI visibility!
For a more granular review, a full per-URL or per-file analysis can be produced once the complete page list or file contents are available.