
What’s an example of a robots.txt file that blocks admin pages but allows Googlebot to crawl everything else?

Asked on Sep 09, 2025

Answer

To block admin pages while allowing Googlebot to crawl everything else, place a robots.txt file in the root directory of your website. One detail matters here: each crawler obeys only the single group whose user-agent line best matches it, so once Googlebot finds a group addressed to it by name, it ignores the "User-agent: *" group entirely. Googlebot therefore needs its own Disallow rule for the admin directory.

    # Googlebot: blocked from /admin/, explicitly allowed everywhere else
    User-agent: Googlebot
    Disallow: /admin/
    Allow: /

    # All other crawlers: blocked from /admin/ only
    User-agent: *
    Disallow: /admin/
    
Additional Comments:
  • The "User-agent: *" group applies to every crawler that has no more specific group of its own, instructing them not to access the "/admin/" directory.
  • Googlebot follows only the group that best matches its user-agent and skips the "*" group altogether. Without its own "Disallow: /admin/" line, Googlebot would be free to crawl the admin pages; the "Allow: /" line simply makes explicit that the rest of the site is open to it.
  • Ensure that the robots.txt file is placed in the root directory of your website (e.g., https://www.example.com/robots.txt); crawlers will not look for it anywhere else.
  • Always test your rules before relying on them. The robots.txt report in Google Search Console (which replaced the older robots.txt Tester) shows how Googlebot fetches and parses the file.
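If you want to sanity-check the rules programmatically, here is a minimal sketch using Python's standard-library urllib.robotparser to evaluate the file above against a few hypothetical URLs on example.com (the user-agents and paths are illustrative, not from the question). Note that Python's parser uses a simpler first-matching-group lookup than Google's longest-match rule, but the two agree on a file this simple.

    from urllib.robotparser import RobotFileParser

    # The robots.txt file from the answer above, embedded as a string.
    ROBOTS_TXT = """
    User-agent: Googlebot
    Disallow: /admin/
    Allow: /

    User-agent: *
    Disallow: /admin/
    """

    rp = RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())

    # Hypothetical URLs on an example domain, purely to exercise the rules.
    checks = [
        ("Googlebot", "https://www.example.com/admin/login"),  # expect False
        ("Googlebot", "https://www.example.com/blog/post"),    # expect True
        ("Bingbot",   "https://www.example.com/admin/login"),  # expect False
        ("Bingbot",   "https://www.example.com/blog/post"),    # expect True
    ]

    for agent, url in checks:
        print(f"{agent:10} {url:45} allowed={rp.can_fetch(agent, url)}")

For this file, the /admin/ URLs come back disallowed and the other URLs allowed for both user-agents, which matches the intent of the question.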
