Post by tonima5 on Jan 18, 2024 9:19:36 GMT
An incorrect redirect can also cause an error, so after setting up a redirect you should always check that everything is working correctly.

Consequences of a 404 error

First, the user experience suffers: on seeing a 404 error, the user usually closes the page immediately and often never returns to the site. Second, it negatively affects SEO: keywords placed on a non-working page are lost. Third, visibility in search results deteriorates. Fourth, search robots are reluctant to crawl sites with many broken pages, which wastes the crawl budget.

How to find 404 error pages

The following options exist:

Screaming Frog. This is the main tool to start your 404 analysis with, although any other crawler will do, for example Sitebulb, OnCrawl, etc.
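The two checks described above (finding 404 pages and redirects that land on them) can be sketched in a few lines of Python. This is a minimal, hypothetical sketch: it assumes you already have crawl results, e.g. exported from a crawler, as (URL, status code, redirect target) tuples; the function name and data shape are illustrative, not any tool's actual API.

```python
# Minimal sketch: flag 404 pages and redirects whose target is a 404.
# Input shape (url, status, redirect_target_or_None) is an assumption,
# e.g. rows taken from a crawler's CSV export.

def find_broken(crawl):
    """crawl: list of (url, status, redirect_target_or_None) tuples."""
    # Look up the status each URL returned during the crawl.
    statuses = {url: status for url, status, _ in crawl}
    # Pages that returned 404 directly.
    not_found = [url for url, status, _ in crawl if status == 404]
    # Redirects (301/302) whose target was crawled and returned 404.
    bad_redirects = [
        (url, target)
        for url, status, target in crawl
        if status in (301, 302) and statuses.get(target) == 404
    ]
    return not_found, bad_redirects

if __name__ == "__main__":
    sample = [
        ("https://example.com/", 200, None),
        ("https://example.com/old", 301, "https://example.com/gone"),
        ("https://example.com/gone", 404, None),
    ]
    not_found, bad_redirects = find_broken(sample)
    print(not_found)       # pages returning 404
    print(bad_redirects)   # redirects pointing at a 404
```

Running it on the sample crawl reports /gone as a 404 and flags the /old redirect as broken, since its target is dead.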
These crawlers allow you to conduct an in-depth analysis of the site and show complete information about its status, including 404 responses. Taking Screaming Frog as an example, the required information can be found in the Response Codes tab. To see the exact location of each 404 error, click on the specific URL and then check the Inlinks tab. Screaming Frog can also find 404 errors coming from external sources.

Google Search Console. You can see broken pages in Google Search Console. Unfortunately, GSC does not display a complete list of problematic subpages, but it is still worth looking at the messages in the "Error" and "Excluded" sections. When analyzing 404 responses, focus on the subpages labeled: submitted URL not found (404); explicit 404 error; not found (404). Additionally, it is worth reviewing the URLs listed under "Crawl Error" and "Submitted URL Contains Crawl Errors."

Analysis of server logs. Logs show you how Google's robots navigate your site.
They are worth studying so that your crawl budget is not wasted. This way you can find pages that search robots visited but that returned a 404. One tool used for log analysis is Screaming Frog's Log File Analyser; it is intuitive, and it is best to import logs from the last 30 days. After loading the logs, you will see a graph with the distribution of server response codes. For a more detailed analysis, go to the URLs tab, which shows how often bots visit the site and which status codes they receive. During the analysis it is also worth looking closely at pages that return a 301 response, as they may redirect to pages with a 404 response or create redirect loops.

Sitemap.xml. It is also recommended to check the sitemap for possible 404 errors. This is especially important for sites whose sitemaps are not updated automatically, since they may contain many long-deleted URLs. Sitemap contents can easily be verified in GSC, although you must first submit the sitemap.xml URL there; the report will then show the submission and indexing status.
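The log analysis described above can be sketched without any dedicated tool. The sketch below assumes Apache "combined" log format and matches the Googlebot user-agent string by substring; real logs may use a different format, and a production check should also verify Googlebot by reverse DNS, which is omitted here to keep the example offline.

```python
# Minimal sketch: count, per URL path, how often Googlebot received a
# 404 according to Apache combined-format access log lines.
import re
from collections import Counter

# Field layout of the Apache combined log format:
# host ident user [timestamp] "METHOD path PROTO" status bytes "referer" "agent"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s(lines):
    """Return a Counter of paths that returned 404 to Googlebot."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent") and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits

if __name__ == "__main__":
    sample = [
        '66.249.66.1 - - [18/Jan/2024:09:19:36 +0000] "GET /old-page HTTP/1.1" '
        '404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)"',
        '66.249.66.1 - - [18/Jan/2024:09:20:01 +0000] "GET / HTTP/1.1" '
        '200 4096 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; '
        '+http://www.google.com/bot.html)"',
    ]
    print(googlebot_404s(sample))
```

Paths that Googlebot hits repeatedly and that keep returning 404 are exactly the ones draining crawl budget, so they are the first candidates for a redirect or a sitemap cleanup.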
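The sitemap check described above starts with extracting the URLs listed in sitemap.xml. A minimal sketch, using only the standard library: it parses a urlset-style sitemap and returns every URL, which you would then re-crawl to find entries that now return 404. The actual HTTP status check is deliberately left out so the sketch runs offline.

```python
# Minimal sketch: pull every <loc> URL out of a urlset sitemap so each
# can be re-crawled; entries that now return 404 should be removed.
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a urlset sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

if __name__ == "__main__":
    sample = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        '<url><loc>https://example.com/</loc></url>'
        '<url><loc>https://example.com/long-deleted-page</loc></url>'
        '</urlset>'
    )
    print(sitemap_urls(sample))
```

For sitemaps that are not regenerated automatically, running the extracted list back through a crawler quickly surfaces the long-deleted URLs the text warns about.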