Delete the page permanently

With this method we "physically" eliminate the page: the relevant URL disappears from the SERPs and is no longer reachable in any way. What happens? When someone tries to reach that URL, the server returns a 404 (Not Found) or 410 (Gone) error. After further crawls that return these status codes, Googlebot will remove the page from its index.

Be extremely careful before carrying out such a drastic operation. Always ask yourself: is the URL I want to delete really useless? Here is how to recognize a useless URL with three simple questions about the URL to delete:

Does it produce organic traffic? Open Search Console, filter by page and see if it received any clicks.
Is it displayed to users while browsing? Look it up in Analytics and check the page views.
Does it receive incoming links? Feed it to Ahrefs or Majestic and find out.

Once you have established the uselessness of the URL - the answer to the three previous questions must be a unanimous no - you can consider removing it permanently by deleting the page from the site. Once again, do all the necessary checks: if that URL receives links or makes sense for the user experience, opt for a 301 redirect instead.

Don't use Robots.txt to delete a URL

Are you wondering whether it is possible to remove a URL using the robots.txt Disallow directive? Categorically no! Google itself tells us why in its guidelines; here is the official definition of robots.txt:

A robots.txt file lets search engine crawlers know what pages or files they can and cannot request from your site. It is mainly used to avoid overloading the site with requests; it is not a mechanism that allows you to exclude a web page from Google. To exclude a web page from Google, you must use noindex instructions or password protect the page.

The only valid alternative to noindex is therefore to block access with a password. Certainly a more cumbersome and expensive solution, but undoubtedly effective.
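To make the server-side part concrete, here is a minimal sketch of the three responses discussed above: a 410 for a page you have deleted for good, a 301 for a URL that still receives links or matters to users, and a noindex header for a page that should stay online but out of Google. It assumes a small Flask app; the route paths and messages are hypothetical examples, not part of any real site.

from flask import Flask, Response, redirect

app = Flask(__name__)

@app.route("/deleted-page")
def deleted_page():
    # 410 Gone: the page was removed on purpose. Repeated crawls that see
    # this status (or a 404) lead Googlebot to drop the URL from the index.
    return Response("This page has been permanently removed.", status=410)

@app.route("/old-page")
def old_page():
    # 301 Moved Permanently: the better choice when the URL still receives
    # links or still makes sense for the user experience.
    return redirect("/new-page", code=301)

@app.route("/hidden-page")
def hidden_page():
    # noindex via the X-Robots-Tag header: the page stays reachable for
    # users, but search engines are told not to keep it in the index.
    resp = Response("Content that should not appear in search results.")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

Whichever response you pick, the effect only takes hold after Googlebot has recrawled the URL and seen the new status or header, so expect the change in the SERPs to lag behind the change on the server.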





In practice, the robots.txt Disallow does not block the indexing of a URL; it merely discourages it. If you have put a certain URL in the robots.txt Disallow and some external site then links to it, the URL will continue to appear in the SERP, but with a notice in the snippet that the content is not available because of the robots.txt file (the sketch at the end of this post shows the difference between blocking crawling and removing a URL from the index).

Remove URLs from external sites

It may happen that you need to ask Google to remove a URL or piece of content hosted on an external site from the search results. A prime example: someone copies an article from you. What to do in case of plagiarism? Submit a request (a DMCA request, under American copyright law) through Google's Content Removal tool and cross your fingers.

Even if the request is successful, that URL will no longer appear in the SERPs, but it will continue to exist on the web. To delete it permanently, you must contact the owner of that website.
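Returning to the robots.txt point above, here is a minimal sketch using Python's standard library robots.txt parser. The robots.txt content and the example.com URL are made up for illustration; the point is that a Disallow rule only answers "may this crawler fetch the URL?" and says nothing about removing the URL from the index.

from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that disallows a single path.
robots_lines = [
    "User-agent: *",
    "Disallow: /old-page/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# The rule only controls crawling: Googlebot may not fetch the page.
print(parser.can_fetch("Googlebot", "https://example.com/old-page/"))  # False

# It says nothing about indexing. If other sites link to /old-page/,
# Google can still list the URL in the SERP with a "blocked by robots.txt"
# notice. To actually remove it, serve a 410/404 or use noindex instead.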
