# ============================================================
# VedicDevs Website — robots.txt
# Allow all legitimate crawlers | Block private paths only
# ============================================================

# ── Primary crawlers (Google, Bing, Yahoo, DuckDuckGo) ──────
User-agent: Googlebot
Allow: /
Disallow: /cache/
Disallow: /includes/
Disallow: /admin/
Disallow: /review/

User-agent: Bingbot
Allow: /
Disallow: /cache/
Disallow: /includes/
Disallow: /admin/
Disallow: /review/

User-agent: Slurp
Allow: /
Disallow: /cache/
Disallow: /includes/
Disallow: /admin/
Disallow: /review/

User-agent: DuckDuckBot
Allow: /
Disallow: /cache/
Disallow: /includes/
Disallow: /admin/
Disallow: /review/

# ── Social crawlers (for rich previews) ─────────────────────
User-agent: facebookexternalhit
Allow: /

User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /

User-agent: WhatsApp
Allow: /

# ── SEO tools ───────────────────────────────────────────────
User-agent: AhrefsBot
Allow: /
Crawl-delay: 10

User-agent: SemrushBot
Allow: /
Crawl-delay: 10

User-agent: MJ12bot
Disallow: /

User-agent: DotBot
Disallow: /

# ── Block bad bots ──────────────────────────────────────────
User-agent: SiteRip
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: wget
Disallow: /

User-agent: curl
Disallow: /

# ── Default catch-all ───────────────────────────────────────
User-agent: *
Allow: /
Disallow: /cache/
Disallow: /includes/
Disallow: /admin/
Disallow: /review/
Crawl-delay: 2

# ── Sitemap ─────────────────────────────────────────────────
Sitemap: https://www.vedicdevs.com/sitemap.xml