Using an HTTP status code and setting up robots.txt. You need to configure the server so that it sends a 403 (Forbidden) response code when someone attempts to access the prohibited page, and at the same time instruct search engines not to crawl this page via the robots.txt file.

HTTP code. The web server should be configured, for example via an Apache .htaccess file, to send an HTTP 403 code for the prohibited page (here forbidden-page.html, where forbidden-page.html is the path to the page to be blocked).

Robots.txt file. Add a Disallow line to your robots.txt file for the page you want to prevent from being crawled.
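A minimal sketch of such an .htaccess rule, assuming Apache with mod_alias and mod_core enabled; the file name forbidden-page.html and the custom error page forbidden.html are placeholder names, not paths from the original text:

```apache
# .htaccess — return 403 Forbidden for the blocked page
# (forbidden-page.html is an illustrative path)
Redirect 403 /forbidden-page.html

# Optional: serve a custom page to visitors who receive the 403
ErrorDocument 403 /forbidden.html
```

The Redirect directive with a non-3xx status simply sets that status code for the matching path, and ErrorDocument controls what body is sent with it.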

User-agent: *
Disallow: /forbidden-page.html

This entry tells search engines that forbidden-page.html should be ignored and not crawled. After completing these steps, when users try to access the forbidden page, the server will send them a 403 response code and may show them a custom forbidden.html page. Search engines will also skip this page because of the robots.txt rule, and it will not be indexed in their databases. Important: Google's Help documentation recommends not overusing status codes to control indexing.
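You can check that the robots.txt rule behaves as intended with Python's standard-library parser; the host example.com and the file name forbidden-page.html are illustrative:

```python
# Verify a robots.txt Disallow rule with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /forbidden-page.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Crawlers matching "*" are blocked from the forbidden page...
print(parser.can_fetch("Googlebot", "https://example.com/forbidden-page.html"))  # False
# ...but may still fetch the rest of the site.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

This is a quick sanity check before deploying the file; the real test is fetching https://yoursite/robots.txt and confirming the rule appears there.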