How to prevent SharePoint content from being indexed by Google

One of the problems of publishing SharePoint as a public website is that internal pages with sensitive content will get indexed. To prevent this we can set up an IIS redirection rule, but if we want to keep it simple, we can place a configuration file in the site root that tells compliant search robots not to crawl certain content. To do this, create a robots.txt file with the following:

User-agent: *
Crawl-delay: 10
Disallow: */_layouts/
Disallow: */_catalogs/
Disallow: */Lists/
Disallow: */Forms/
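
If you prefer the IIS redirection approach mentioned above, a rule along these lines could work in the site's web.config. This is only a minimal sketch, assuming the IIS URL Rewrite module is installed on the web front-end; the rule name and the user-agent pattern are illustrative, so adjust them to the crawlers you want to block:

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Sketch: return 403 to known crawlers on SharePoint system paths.
             Assumes the IIS URL Rewrite module is installed. -->
        <rule name="BlockSystemPathsFromCrawlers" stopProcessing="true">
          <match url=".*(_layouts|_catalogs|Lists|Forms)/.*" />
          <conditions>
            <!-- Illustrative user-agent pattern; extend as needed -->
            <add input="{HTTP_USER_AGENT}" pattern="Googlebot|bingbot" />
          </conditions>
          <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Not available to crawlers" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Note the difference between the two approaches: robots.txt only asks well-behaved crawlers to stay away, while the IIS rule actively blocks the matched user agents from those paths.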

You can get more info from a previous post: https://sharepointrescue.wordpress.com/2015/05/20/what-is-robots-txt/

Till next time!
