coolchat asked
Member (0 upvotes)

How do I find/crawl robots.txt to see all the links listed inside the file?


I'm accessing a website where it seems like most of the relevant links are being blocked/hidden by the robots.txt file.

Is there a way to see the links that are being blocked by the robots.txt file?

KnowOneSpecial
Moderator (84 upvotes)

robots.txt does not block any links. It tells Google and other search engines not to crawl certain pages, but compliance is up to each crawler, and not all of them will honor the directives in the robots file.
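In fact, because robots.txt is just a public text file of directives, you can read the "blocked" paths straight out of it. Here is a minimal sketch (the sample robots.txt content is made up for illustration; on a real site you would fetch https://example.com/robots.txt first):

```python
# Minimal sketch: list the paths a robots.txt file asks crawlers not to visit.
# The sample robots.txt body below is hypothetical.

def parse_disallows(robots_txt):
    """Return the Disallow paths declared in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow rule means "allow everything"
                paths.append(path)
    return paths

sample = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
Disallow:            # empty rule
Allow: /public/
"""

print(parse_disallows(sample))  # ['/private/', '/tmp/']
```

Note that Disallow rules only list path prefixes, not full URLs, so this shows you which sections the site owner wants hidden from crawlers rather than every individual page.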

So in most cases...

All you need to do is examine the source code of a page and you will see all the links, even the hidden ones in most cases. Some are visible only to Google's known crawler IP addresses; you can use Google's "Mobile Friendly Test" to see those links.
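Pulling the links out of a page's source can be done with the standard library alone. A sketch, using a made-up HTML snippet (for a real page you would fetch the source with urllib.request first):

```python
# Minimal sketch: collect every href from a page's HTML source, including
# links that are present in the markup but hidden from view with CSS.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather href values from all <a> tags in the document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page_source = """\
<html><body>
<a href="/visible-page">Visible link</a>
<div style="display:none"><a href="/hidden-page">Hidden link</a></div>
</body></html>
"""

collector = LinkCollector()
collector.feed(page_source)
print(collector.links)  # ['/visible-page', '/hidden-page']
```

The hidden link still appears in the output because hiding via CSS only affects rendering, not the markup itself. Links that are injected by JavaScript or served only to Google's crawler IPs will not show up this way, which is where the Mobile Friendly Test helps.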

Copyright ©2024 SEOChat. All Rights Reserved. Owned and operated by Search Ventures Ltd and Chris Chedgzoy.