Semrush Backlink Audit

Semrush's backlink audit crawler for link profiles.

What does Semrush Backlink Audit do?

SemrushBot-BA crawls your site to discover and collect backlink data for Semrush's backlink index. This data powers Semrush Backlink Audit, Backlink Analytics, and other SEO tools used by millions of marketers and site owners. When your site appears in Semrush's backlink reports, subscribers can click through to your pages directly from the Semrush interface, creating a referral traffic path.

Should I allow and optimize for Semrush Backlink Audit to drive organic growth?

Allowing SemrushBot-BA helps ensure your site's backlink profile is accurately represented in Semrush's tools. Semrush subscribers can click through to your pages from Backlink Analytics and Backlink Audit reports, which creates a referral traffic channel. While this traffic comes from SEO professionals rather than general consumers, accurate backlink data in Semrush also helps other site owners discover your content when researching link opportunities. Blocking this bot means your site may be underrepresented in Semrush's backlink index, reducing your visibility in one of the most widely used SEO platforms.

Here's how to optimize for Semrush Backlink Audit:

  • Allow SemrushBot-BA in your robots.txt to ensure accurate backlink indexing
  • Ensure your robots.txt returns HTTP 200 (a 4xx response causes Semrush to assume no restrictions)
  • Set a reasonable Crawl-delay value if needed (Semrush enforces a maximum of roughly 10 seconds)
  • Include a Sitemap directive in robots.txt to help the crawler discover important pages
  • Use clean, crawlable URLs for pages you want indexed in Semrush's backlink database
  • Keep server response times fast to avoid timeouts during crawls
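Taken together, an illustrative robots.txt applying these recommendations might look like the following (the Crawl-delay value and sitemap URL are placeholders, not Semrush-recommended settings):

```
User-agent: SemrushBot-BA
Allow: /
Crawl-delay: 5

Sitemap: https://example.com/sitemap.xml
```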

Data Usage & Training

Content crawled by SemrushBot-BA is used to build and maintain Semrush's backlink index and power its SEO product suite. Semrush's public documentation does not state that crawled content is used to train general-purpose AI models. The data feeds into analytical products like Backlink Analytics, Backlink Audit, and Link Building tools.

How Semrush Backlink Audit Accesses Content

Here's how Semrush Backlink Audit accesses your site and understands your content:

  • Fetches HTML via standard HTTP requests using the user-agent string Mozilla/5.0 (compatible; SemrushBot-BA; +http://www.semrush.com/bot.html)
  • Has partial JavaScript rendering capability (full rendering is not guaranteed)
  • Respects robots.txt Disallow, Allow, Crawl-delay, and Sitemap directives
  • Treats 4xx responses on robots.txt as if no robots.txt exists
  • Requires robots.txt to return HTTP 200 to be recognized

SemrushBot-BA crawls continuously using an adaptive schedule. It adjusts revisit frequency based on your site's robots.txt policies, server load, and Semrush's internal crawl frontier priorities.
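You can sanity-check how robots.txt directives apply to the SemrushBot-BA token with Python's standard-library parser. The robots.txt content below is a hypothetical example; note that directives are matched against the bot's token, not the full user-agent string:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: SemrushBot-BA may crawl everything except /private/
robots_txt = """\
User-agent: SemrushBot-BA
Disallow: /private/
Crawl-delay: 5

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Matching is done on the "SemrushBot-BA" token, case-insensitively
print(rp.can_fetch("SemrushBot-BA", "https://example.com/blog/post"))   # True
print(rp.can_fetch("SemrushBot-BA", "https://example.com/private/x"))   # False
print(rp.crawl_delay("SemrushBot-BA"))                                  # 5
```

This is also a quick way to verify, before deploying, that a new robots.txt does what you intend for each Semrush token.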

How to Block or Control Semrush Backlink Audit

To block SemrushBot-BA via robots.txt:

```
User-agent: SemrushBot-BA
Disallow: /
```

This only blocks the Backlink Audit crawler. Semrush operates other bots under separate tokens (SemrushBot, SemrushBot-SI, etc.), so block each one individually if needed. IP-based blocking is unreliable because Semrush uses numerous non-consecutive IP addresses and does not publish its IP ranges. If you experience issues, contact [email protected] directly. Make sure your robots.txt is accessible at your site root and returns HTTP 200; otherwise SemrushBot-BA will treat your site as having no restrictions.
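If you prefer to detect the crawler at the application layer (for logging, rate limiting, or serving a 403), a case-insensitive substring match on the bot token is usually sufficient. The helper below is a sketch, not an official Semrush-provided check:

```python
# Full user-agent string as documented above; the "SemrushBot-BA" token is what matters
EXAMPLE_UA = "Mozilla/5.0 (compatible; SemrushBot-BA; +http://www.semrush.com/bot.html)"

def is_semrush_ba(user_agent: str) -> bool:
    """Return True if the request's User-Agent belongs to SemrushBot-BA."""
    return "semrushbot-ba" in (user_agent or "").lower()

print(is_semrush_ba(EXAMPLE_UA))                        # True
print(is_semrush_ba("Mozilla/5.0 (Windows NT 10.0)"))   # False
```

Keep in mind that user-agent strings can be spoofed, so this identifies traffic claiming to be SemrushBot-BA rather than proving its origin.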

Common Issues & Troubleshooting

Watch out for these common problems when working with Semrush Backlink Audit:

  • If robots.txt returns a 4xx status or is missing after a site migration, Semrush assumes no restrictions and crawls freely
  • IP-based blocking is unreliable because Semrush does not publish consecutive IP blocks
  • Crawl-delay values above roughly 10 seconds may be truncated or ignored entirely
  • Blocking SemrushBot-BA does not block other Semrush crawlers (SemrushBot, SemrushBot-SI, etc.); each requires its own robots.txt rule
  • High crawl volume can occur on large sites since the bot runs continuously and adapts to server capacity
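The robots.txt status-code pitfall above can be made concrete with a small helper that mirrors the documented behavior (the function name and return strings are illustrative; 3xx/5xx handling is not specified in Semrush's documentation):

```python
def robots_txt_effect(status_code: int) -> str:
    """Describe how SemrushBot-BA reportedly interprets a robots.txt fetch result."""
    if status_code == 200:
        # Only a 200 response is recognized; the rules inside are applied
        return "rules applied"
    if 400 <= status_code < 500:
        # Documented behavior: a 4xx is treated as if no robots.txt exists
        return "no restrictions assumed"
    # Behavior for redirects and server errors is not documented here
    return "indeterminate"

print(robots_txt_effect(200))  # rules applied
print(robots_txt_effect(404))  # no restrictions assumed
```

After a site migration, checking that /robots.txt still returns 200 is therefore the first thing to verify.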

Quick Reference

  • Platform: Semrush
  • Official Documentation: semrush.com/bot/
  • User Agent String: semrushbot-ba
  • robots.txt Entry:

    User-agent: semrushbot-ba
    Disallow: /
