View Full Version : Google: Do Not Block GoogleBot From Crawling 404s



Maria Jonas
07-15-2020, 11:17 PM
John Mueller of Google said it would be "a really bad idea which will cause all sorts of problems" if you block Google or other search engines from crawling pages that return a 404 server status code. He said "billions of 404 pages are crawled every day" by Google and it is normal.

One webmaster wrote that his "website automatically blocks user agents that get more than 10 404 errors, including Googlebot, so that's a problem." John responded that this is a really bad idea: "That sounds like a really bad idea which will cause all sorts of problems.. You can't avoid that Googlebot & all other search engines will run into 404s. Crawling always includes URLs that were previously seen to be 404."
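The failure mode John describes can be illustrated with a small sketch. This is a hypothetical reconstruction of the webmaster's rule, not anyone's real code; the function names `should_block` and `should_block_fixed` and the `KNOWN_CRAWLERS` set are illustrative assumptions:

```python
# Hypothetical sketch of the webmaster's rule: block any user agent
# that has triggered more than 10 404 responses. All names here are
# illustrative, not a real API.

KNOWN_CRAWLERS = {"Googlebot", "bingbot"}

def should_block(user_agent: str, not_found_count: int) -> bool:
    """The problematic rule: treats search crawlers like abusive clients."""
    return not_found_count > 10

def should_block_fixed(user_agent: str, not_found_count: int) -> bool:
    """Safer variant: never block known search engine crawlers for 404s,
    since recrawling previously-404 URLs is normal crawler behavior."""
    if user_agent in KNOWN_CRAWLERS:
        return False
    return not_found_count > 10

# Googlebot routinely revisits old, now-missing URLs, so it easily
# exceeds 10 404s on a large site.
print(should_block("Googlebot", 25))        # True  -> Googlebot gets blocked
print(should_block_fixed("Googlebot", 25))  # False -> crawling continues
```

The point of the fix is that a 404 count is a poor abuse signal for crawlers: as John notes, crawling always includes URLs previously seen to be 404, so a high 404 count from Googlebot is expected, not malicious.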

dennis123
07-16-2020, 06:02 AM
Hi Friend,
Thanks for providing good information to the community.

davidweb09
07-17-2020, 12:03 PM
A better way is to redirect 404 errors to another working page instead of blocking crawling.

shimarhussain12
09-26-2020, 02:35 AM
As John said, "You can't avoid that Googlebot & all other search engines will run into 404s. Crawling always includes URLs that were previously seen to be 404." So blocking Google or other search engines from crawling pages that return a 404 server status code would be a bad idea for your website.