
Why are only a few of my website’s pages being crawled?


If you’ve noticed that only 4-6 pages of your website are being crawled (your homepage, sitemap URLs, and robots.txt), it is most likely because our bot couldn’t find outgoing internal links on your homepage. Below you will find the possible reasons for this issue.

Our crawler may have been blocked on some pages by the website’s robots.txt file or by noindex/nofollow tags. You can check whether this is the case in your Crawled pages report.


You can inspect your robots.txt file for any Disallow directives that would prevent crawlers like ours from accessing your website.
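As a quick local check, Python’s standard `urllib.robotparser` can evaluate robots.txt rules against a given user agent. The rules and URLs below are illustrative, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; substitute your site's actual file.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a crawler (here, Semrush's SemrushBot user agent)
# is allowed to fetch specific URLs under these rules.
print(parser.can_fetch("SemrushBot", "https://example.com/private/page"))  # False
print(parser.can_fetch("SemrushBot", "https://example.com/blog/post"))     # True
```

If `can_fetch` returns `False` for your homepage or key internal pages, the crawl will stop at those URLs.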

If the main page of a website carries a robots meta tag with "noindex" or "nofollow", it tells us that we’re not allowed to index the page or follow its links, and our access is blocked. A page whose robots meta tag contains "nofollow" or "none" (which combines "noindex" and "nofollow") will lead to a crawling error.
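Such a blocking directive typically looks like the following robots meta tag in the page’s `<head>` (shown here as an illustrative example):

```html
<!-- Tells crawlers not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Removing or adjusting this tag on the homepage allows crawlers to follow its internal links again.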

You will find more information about these errors in our troubleshooting article.

Site Audit can currently parse homepages up to 4 MB in size.

The limit for other pages of your website is 2 MB. If a page’s HTML exceeds this limit, Site Audit will report an error for that page.
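These size limits can be checked locally with a short script. The 4 MB and 2 MB thresholds come from this article; the byte interpretation (1 MB = 1,048,576 bytes) and the example URL are assumptions for illustration:

```python
import urllib.request

# Limits from the article: 4 MB for the homepage, 2 MB for other pages.
# Assumption: 1 MB = 1,048,576 bytes.
HOMEPAGE_LIMIT = 4 * 1024 * 1024
PAGE_LIMIT = 2 * 1024 * 1024

def html_within_limit(html: bytes, is_homepage: bool = False) -> bool:
    """Return True if the page's HTML size is within the parse limit."""
    limit = HOMEPAGE_LIMIT if is_homepage else PAGE_LIMIT
    return len(html) <= limit

# Example usage with a fetched page (replace the URL with your own):
# html = urllib.request.urlopen("https://example.com/").read()
# print(html_within_limit(html, is_homepage=True))
```

Pages that fail this check are candidates for trimming inlined scripts, styles, or markup bloat before re-running the audit.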