With this method we "physically" eliminate the page: the URL disappears from the SERPs and is no longer reachable in any way. What happens then? When someone tries to reach that URL, the server returns a 404 (Not Found) or 410 (Gone) error. After a few more crawls that return these status codes, Googlebot removes the page from its index.

Be extremely careful before carrying out such a drastic operation. Always ask yourself: is the URL I want to delete really useless? Here is how to recognize a useless URL with three simple questions:

1. Does the URL produce organic traffic? Open Search Console, filter by page and see whether it received any clicks.
2. Is it displayed to users while browsing? Look it up in Analytics and check the page views.
3. Does it receive incoming links? Run it through Ahrefs or Majestic to find out.

Once you have established that the URL is useless - the answer to all three questions must be a unanimous no - you can consider removing it permanently by deleting the page from the site. Once again, do all the necessary checks: if the URL receives links or still makes sense for the user experience, opt for a 301 redirect instead.

Don't use robots.txt to delete a URL

Are you wondering whether it is possible to remove a URL with a disallow rule in the robots.txt file? Categorically no! Google itself tells us why in its guidelines; here is the official definition of robots.txt:

"A robots.txt file lets search engine crawlers know what pages or files they can and cannot request from your site. It is mainly used to avoid overloading the site with requests; it is not a mechanism that allows you to exclude a web page from Google. To exclude a web page from Google, you must use noindex instructions or password protect the page."

The only valid alternative to noindex is therefore to block access with a password: certainly a more cumbersome and expensive solution, but undoubtedly effective.
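The noindex instruction Google mentions can be applied as a robots meta tag in the page's HTML, or as an X-Robots-Tag HTTP response header for non-HTML resources. A minimal illustration:

```html
<!-- In the page's <head>: tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The equivalent HTTP header, useful for PDFs and other non-HTML files, is `X-Robots-Tag: noindex`. Keep in mind that crawlers must be able to fetch the page in order to see the noindex, so the same URL must not also be blocked in robots.txt.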
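And if you do opt for physical deletion, it is worth verifying that the server really answers 404 or 410 once the page is gone. A quick sketch in Python (the helper names and the example URL are illustrative, not taken from any specific tool):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def is_gone(status: int) -> bool:
    """404 (Not Found) and 410 (Gone) both tell Googlebot to drop the URL."""
    return status in (404, 410)

def check_url(url: str) -> int:
    """Return the HTTP status code the server sends for this URL."""
    try:
        with urlopen(Request(url, method="HEAD")) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises on 4xx/5xx; the code is on the exception
        return err.code

# Example with a hypothetical URL:
# status = check_url("https://example.com/deleted-page")
# if is_gone(status):
#     print("Googlebot will eventually drop this URL from the index")
```

A 301 or 200 response here would mean the page was not actually removed, and Googlebot will keep it indexed.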

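For reference, a disallow rule in robots.txt looks like this (the path is illustrative):

```
User-agent: *
Disallow: /old-page/
```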
In practice, a disallow rule in robots.txt does not block the indexing of a URL; it merely discourages it. If you have placed a URL under disallow and some external site links to it, the URL will continue to appear in the SERPs, but with a warning in the description saying that no information is available for the page because of the robots.txt file.

Remove URLs from external sites

It may happen that you need to ask Google to remove an external URL or piece of content from its search results. A prime example: someone copies an article from you. What to do in case of plagiarism? Submit a request (a DMCA request, under American copyright law) through Google's Content Removal tool and cross your fingers.

Even if the request is successful, the URL will no longer appear in the SERPs, but the content will continue to exist on the web. To have it deleted permanently, you must contact the owner of that website.