coolchat asked
Member (0 upvotes)

How to find/crawl robots.txt to find all the links inside the file?

Hi,


I'm accessing a website where it seems like most of the relevant links are being blocked/hidden by the robots.txt file.


Is there a way to see the links that are being blocked by the robots.txt file?

#robots #hiddenlinks #internallinks #externallinks #crawling
KnowOneSpecial
Moderator (84 upvotes)

Robots.txt does not block any links. It does, however, tell Google and other search engines not to crawl the page, but compliance is up to the crawler, and not all crawlers will honor the directives in the robots file.
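
If you just want to see which paths the site asks crawlers to skip, the robots.txt file itself is publicly readable. Here is a minimal sketch (Python, standard library only; "example.com" is a placeholder domain, not from the original post) that fetches robots.txt and prints its Disallow paths:

```python
# Minimal sketch: fetch a site's robots.txt and list the paths it asks
# crawlers not to visit. "example.com" is a placeholder domain.
from urllib.request import urlopen

robots_url = "https://example.com/robots.txt"  # hypothetical target site

with urlopen(robots_url) as response:
    lines = response.read().decode("utf-8", errors="replace").splitlines()

# Collect the path portion of every "Disallow:" directive.
disallowed = [
    line.split(":", 1)[1].strip()
    for line in lines
    if line.lower().startswith("disallow:")
]

print("Paths the site asks crawlers to skip:")
for path in disallowed:
    print(path)
```

Note that these are path patterns, not a list of actual URLs; they only tell you where on the site compliant crawlers should not go.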

So in most cases...

All you need to do is examine the source code of a page and you will see all the links, even the hidden ones in most cases. Some are served only to Google's known crawler IP addresses; you can use Google's "Mobile Friendly Test" to see those links.
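
To pull the links out of a page's source rather than reading the HTML by hand, something like this works as a rough sketch (again Python standard library only; "example.com" is a placeholder page):

```python
# Minimal sketch: fetch a page and print every <a href="..."> found in its
# HTML source. "example.com" is a placeholder, swap in the page you want.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects href values from anchor tags as the HTML is parsed."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


page_url = "https://example.com/"  # hypothetical page to inspect

with urlopen(page_url) as response:
    html = response.read().decode("utf-8", errors="replace")

collector = LinkCollector()
collector.feed(html)

for link in collector.links:
    print(link)
```

This only shows links present in the served HTML; links injected by JavaScript or served only to specific crawler IPs won't appear, which is where a rendering tool like Google's test comes in.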
