Google Publisher Center
Crawler for Google Publisher Center submissions.
What does Google Publisher Center do?
GoogleProducer, the fetcher for Google Publisher Center, retrieves the RSS and Atom feeds that publishers have submitted through the service. It pulls this feed content to populate Google News listings and publisher-managed pages within Google's news surfaces. Because fetched content surfaces as linked listings in Google News, it can drive significant referral traffic back to your site.
Should I allow and optimize for Google Publisher Center to drive organic growth?
Yes, allow GoogleProducer if you use Google Publisher Center. This fetcher is the mechanism that gets your articles into Google News listings, which link directly back to your original content. Blocking it effectively removes your feed content from Google News surfaces. If you've submitted feeds through Publisher Center, you want this fetcher to reach them. Google News remains one of the highest-volume referral sources for news publishers.
Here's how to optimize for Google Publisher Center:
- Ensure your RSS/Atom feeds are valid, well-structured, and include full article metadata
- Add Google's IP ranges to your WAF allowlist to prevent false-positive blocks
- Verify GoogleProducer requests using reverse DNS to confirm they resolve to google.com domains
- Keep feed URLs consistent and avoid unnecessary redirects
- Include canonical URLs in your feed entries to ensure proper attribution
- Update your Publisher Center configuration promptly when feed URLs change
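The reverse-DNS verification step above can be sketched in Python. The hostname suffixes below follow Google's published guidance for verifying its crawlers, but treat the exact list as an assumption to check against Google's current documentation; the forward-confirmation step guards against spoofed rDNS records:

```python
import socket

# Hostname suffixes Google documents for its crawlers and fetchers.
# Assumption: confirm the current list against Google's verification docs.
GOOGLE_SUFFIXES = (".google.com", ".googlebot.com", ".googleusercontent.com")

def is_google_host(hostname: str) -> bool:
    """Check whether a reverse-DNS hostname belongs to a Google domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_google_ip(ip: str) -> bool:
    """Forward-confirmed reverse DNS: look up the hostname for the IP,
    check it is a Google domain, then resolve the hostname forward and
    confirm it maps back to the original IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
        if not is_google_host(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward lookup
        return ip in forward_ips
    except OSError:
        # DNS failure: treat as unverified rather than trusted
        return False
```

Run `verify_google_ip()` against the client IP from your access logs before allowlisting a request claiming to be GoogleProducer; the user-agent string alone is trivially spoofable.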
Data Usage & Training
Google has not publicly documented whether content fetched by GoogleProducer is used for AI model training. Based on available documentation, fetched feed content is used to populate Google News and Publisher Center surfaces. If you have concerns about training use, Google's broader AI training opt-out mechanisms (such as blocking Google-Extended) are separate from this fetcher.
How Google Publisher Center Accesses Content
Here's how Google Publisher Center accesses your site and understands your content:
- Fetches RSS and Atom feeds supplied through Google Publisher Center
- Does not render JavaScript
- Requests originate from Google IP ranges published at https://www.gstatic.com/ipranges/goog.json
- May ignore robots.txt for user-triggered feed fetches
- Operates as a feed fetcher, not a broad web crawler
GoogleProducer polls feeds on a schedule or when triggered by Publisher Center workflows. It does not perform continuous broad crawling.
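A membership check against those published IP ranges can be sketched with Python's standard `ipaddress` module. The sample data below only mirrors the shape of goog.json (a `prefixes` list with `ipv4Prefix`/`ipv6Prefix` entries); in practice you would fetch the live file from https://www.gstatic.com/ipranges/goog.json and the example prefixes here are illustrative, not a real copy of the list:

```python
import ipaddress

# Illustrative stand-in for the JSON served at
# https://www.gstatic.com/ipranges/goog.json -- same shape, sample values.
SAMPLE_GOOG_JSON = {
    "prefixes": [
        {"ipv4Prefix": "8.8.4.0/24"},
        {"ipv6Prefix": "2001:4860::/32"},
    ]
}

def load_networks(goog_json: dict) -> list:
    """Parse goog.json-style data into ip_network objects."""
    nets = []
    for entry in goog_json.get("prefixes", []):
        prefix = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if prefix:
            nets.append(ipaddress.ip_network(prefix))
    return nets

def ip_in_google_ranges(ip: str, networks: list) -> bool:
    """Return True if the address falls inside any published range."""
    addr = ipaddress.ip_address(ip)
    # Membership checks return False automatically on IPv4/IPv6 mismatch.
    return any(addr in net for net in networks)
```

Because goog.json covers Google services broadly, a match confirms the request came from Google infrastructure but not which Google service sent it; combine this with the reverse-DNS check for stronger attribution.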
How to Block or Control Google Publisher Center
Robots.txt may not reliably block GoogleProducer for user-triggered feed fetches. The most effective way to stop it is to remove or update your feeds directly in Google Publisher Center. You can also block by IP using Google's published ranges at https://www.gstatic.com/ipranges/goog.json, but this risks blocking other Google services. If you still want to try robots.txt:
User-agent: GoogleProducer
Disallow: /
Be aware this may have no effect on feed fetches that were explicitly submitted through Publisher Center.
Common Issues & Troubleshooting
Watch out for these common problems when working with Google Publisher Center:
- WAFs and security systems may block GoogleProducer requests because they appear as unusual traffic from Google IPs or google-proxy hosts
- User-agent-based allow rules are unreliable due to spoofing and UA string variants
- Robots.txt Disallow directives may not stop user-triggered feed fetches
- Blocking Google IP ranges too broadly can break other Google services like verification and indexing
- Feed URL changes in your CMS without corresponding Publisher Center updates will cause fetch failures
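To catch these problems early, it helps to scan your access logs for GoogleProducer hits and their status codes, since a run of 403s or 404s usually points at a WAF block or a stale feed URL. A minimal sketch, assuming combined-log-format lines with the user agent quoted at the end (adjust the pattern for your server's actual format):

```python
import re

# Assumed combined log format: IP first, quoted request, 3-digit status,
# user agent as the final quoted field. Field positions are assumptions.
LOG_RE = re.compile(r'^(?P<ip>\S+) .*?" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$')

def googleproducer_hits(lines):
    """Yield (ip, status) for log lines whose user agent mentions GoogleProducer."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "GoogleProducer" in m.group("ua"):
            yield m.group("ip"), int(m.group("status"))
```

Feeding this a day of logs and counting non-200 statuses gives a quick signal on whether fetches are failing before Publisher Center itself reports errors.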
Quick Reference
User-agent: GoogleProducer
Disallow: /