Why did I get a “page is not accessible” note for some of my pages?
This message appears when the On Page SEO Checker crawler was blocked or otherwise unable to reach your page. Check your website's robots.txt file to make sure it allows our user agents to crawl your pages.
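You can check locally whether a given robots.txt would block a crawler before changing anything on the server. The sketch below uses Python's standard `urllib.robotparser`; the robots.txt content and the example URLs are illustrative, not your site's actual rules:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A page outside the disallowed path is crawlable; one inside it is not.
print(rp.can_fetch("SiteAuditBot", "https://example.com/page"))
print(rp.can_fetch("SiteAuditBot", "https://example.com/private/report"))
```

If the second check returns `False` for a page you expect us to crawl, the robots.txt rules (rather than an IP block) are the likely cause.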
If our bots are not blocked in robots.txt, fix the issue by whitelisting the following IP addresses and user agent with your hosting provider and with any plugins or services you use to manage your site (e.g., Cloudflare, ModSecurity):
To specify the port, use one of the following options:
Port 80: HTTP
Port 443: HTTPS
You should also whitelist the Site Audit bot, which crawls pages from the following IP range:
18.104.22.168/25 (a subnet used by Site Audit only)
User-agent name: SiteAuditBot
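With the IP range whitelisted, a minimal robots.txt that allows the Site Audit bot to crawl the whole site might look like the sketch below (an empty Disallow permits everything; the rules shown are illustrative, so adjust the paths to your own site):

```
User-agent: SiteAuditBot
Disallow:
```

Place the file at the root of your domain (e.g., `https://example.com/robots.txt`) so crawlers can find it.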
If you receive the error message "SEMRushBot-Desktop couldn't crawl the page because it was blocked by robots.txt," the crawl-delay setting in your robots.txt is not compatible with On Page SEO Checker.
On Page SEO Checker crawlers only accept a crawl-delay of 1 second. Any higher value makes the crawler ignore the page, which can trigger the pop-up message and leave that landing page with only a few optimization ideas.
To fix this issue, change the crawl-delay in your robots.txt to 1 second.
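For example, a robots.txt entry that sets the accepted delay could look like this (the user-agent token shown is the Site Audit bot named above; apply the same directive to whichever of our crawlers your rules target):

```
User-agent: SiteAuditBot
Crawl-delay: 1
```

If your robots.txt has a catch-all `User-agent: *` block with a higher crawl-delay, a bot-specific block like this one takes precedence for that bot.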
- I can’t Change the Target Location. What do I do?
- Why can’t I Upload a File to On Page SEO Checker?
- On Page SEO Checker doesn’t see a video on my page
- Configuring On Page SEO Checker
- On Page SEO Checker Overview
- Reviewing Your Optimization Ideas
- Top 10 Benchmarking
- On Page SEO Checker Detailed Analysis
- Tracking Results in the Idea Tasks Report