Thursday 3 September 2020

What is robots.txt and how to make a perfect robots.txt file

A web crawler first checks your site's robots.txt file before it crawls any pages.

The robots.txt file tells the crawler which pages it may crawl and which pages it should skip.
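For example, a simple robots.txt might look like this (the paths below are just placeholders for illustration):

    User-agent: *
    Disallow: /admin/
    Disallow: /private/
    Allow: /

Here "User-agent: *" means the rules apply to every crawler, each "Disallow" line names a folder the crawler should skip, and "Allow: /" permits everything else.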

This saves the crawler a lot of time, so it can crawl your website more efficiently.

It also helps optimize your SEO.

A well-crafted robots.txt helps search engines index your website faster, so your pages can show up in search results sooner.

You can add a custom robots.txt file to your website; it lives at the root of your domain.
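For example, a custom robots.txt for a blog might also point crawlers to your sitemap (the domain and the /search path below are just placeholders):

    User-agent: *
    Disallow: /search
    Allow: /
    Sitemap: https://www.example.com/sitemap.xml

The Sitemap line tells crawlers where to find a list of all your pages, which helps them discover new content faster.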

You can check your robots.txt file by typing your website's URL followed by /robots.txt, for example: https://www.example.com/robots.txt
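If you want to check it programmatically, here is a minimal Python sketch using the standard library's urllib.robotparser (the domain and page path are placeholders):

    from urllib.robotparser import RobotFileParser

    # Point the parser at your site's robots.txt file
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Check whether any crawler ("*") may fetch a given page
    print(rp.can_fetch("*", "https://www.example.com/private/page.html"))

This prints True or False depending on the rules in your robots.txt.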

Here is a detailed explanation by Neil Patel:

https://neilpatel.com/blog/robots-txt/

Image: neilpatel.com
