SEO & log analysis: understanding the concept of crawl budget

In SEO, log analysis is a topic that comes up more and more often at SEO conferences. And for good reason: it is a powerful lever for improving a site's organic ranking on search engines, even if it initially requires a minimum of technical knowledge. The purpose of this article is to explain how it works so you can take better advantage of it.

This article is taken from the white paper 5 trends to explore to accelerate your SEO in 2018; feel free to download it if SEO interests you.

Googlebot, Bingbot, DuckDuckBot… the indexing robots that browse the web to enrich search engine indexes

In the introduction to the white paper, we covered the principle behind these robots. Also called "web crawlers" (or "indexing robots"), they continuously browse websites: they scan the content of each page, save it in a database so the engine can index it, then follow the links on the page to continue their journey. They also come back to check whether pages already stored in their databases have been updated, so that new or enriched content is taken into account.
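To make the mechanism concrete, here is a deliberately simplified Python sketch of what such a robot does: fetch a page, save its content, extract its links, and move on. It is purely illustrative; real crawlers like Googlebot are far more sophisticated, and the limits used here are assumptions for the example.

```python
# Purely illustrative sketch of an indexing robot: fetch a page, save its
# content, then follow the links it finds. Real crawlers (Googlebot, Bingbot,
# DuckDuckBot, ...) are of course far more sophisticated than this.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href" and value]

def crawl(start_url, max_pages=10):
    """Visit pages breadth-first and store their HTML in a toy 'index'."""
    index, queue, seen = {}, [start_url], set()
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen or not url.startswith("http"):
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="ignore")
        except OSError:
            continue                          # unreachable page: skip it
        index[url] = html                     # save the content for indexing
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:             # follow links to continue the journey
            queue.append(urljoin(url, href))
    return index
```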

These robots are therefore the main point of contact between a website and the search engines: it is vital to analyze their behavior on your site in order to understand which content they favor and, above all, which content they neglect. This is where the concept of "crawl budget" comes from: the idea that a site should be optimized so that robots spend their visits on high-value content rather than on low-value pages. To picture it, imagine that the robot has a limited number of credits (1 credit = 1 page viewed) and must be directed toward the most profitable pages.
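A toy illustration of this "credit" view of crawl budget follows; the page values, URLs, and budget figure are invented for the example, not measurements of any real site.

```python
# Toy model of the crawl budget idea: the robot has a fixed number of credits
# (1 credit = 1 page fetched), so pages should be prioritised so that the most
# valuable ones are seen first. Values and budget below are made up.
def spend_crawl_budget(pages, budget):
    """pages: list of (url, seo_value) pairs; returns the URLs actually crawled."""
    by_value = sorted(pages, key=lambda p: p[1], reverse=True)
    return [url for url, _ in by_value[:budget]]

site = [
    ("/blog/new-guide", 9),     # fresh, substantial content
    ("/category/shoes", 7),
    ("/contact-us", 1),         # rarely changes, thin content
    ("/legal-notice", 1),
]
print(spend_crawl_budget(site, budget=2))
# ['/blog/new-guide', '/category/shoes'] -- the low-value pages are skipped
```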

Take the example of a "Contact Us" page. This page changes very rarely, if ever. In addition, it contains little content and therefore has little interest from an SEO point of view. There is no need for a robot, when it visits your site, to check this page; it would be preferable for it to spend that credit on a page with substantial content, real indexing potential, and therefore the ability to generate traffic.

To find out how to assess the quality of your site’s content from an SEO point of view, don’t hesitate to download our SEO white paper.

How to analyze robot visits on your website?

Every action on a website, whether it comes from humans (users) or robots (web crawlers), leaves a trace in the server's log data. Logs are files that record all of a website's activity, identifying the IP addresses that connect to your website as well as the path they follow through it.
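As an illustration, here is what a single log line can look like in the widely used "combined" access-log format, together with a minimal Python sketch that extracts the fields useful for this kind of analysis (IP address, requested URL, user agent). The line itself is invented, and your server's format may differ.

```python
# A made-up access-log line in the common "combined" format, plus a minimal
# parser pulling out the fields that matter here: IP, time, URL, user agent.
import re

LOG_LINE = (
    '66.249.66.1 - - [10/Mar/2018:13:55:36 +0100] '
    '"GET /blog/new-guide HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
)

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

m = LOG_PATTERN.match(LOG_LINE)
print(m.group("ip"), m.group("url"), m.group("agent"))
# 66.249.66.1 /blog/new-guide Mozilla/5.0 (compatible; Googlebot/2.1; ...)
```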

Many tools can analyze these log files in order to isolate the activity of robots from the various search engines (Google, Bing, Yahoo, Yandex, Baidu, etc.). This allows you to identify the frequency of their visits and the number of pages viewed per visit, and to act accordingly.
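As a sketch of the kind of analysis such tools automate (assuming the same "combined" log format as above; the bot list and the file name are assumptions for the example), the following counts how often each URL is hit by well-known search-engine robots:

```python
# Minimal sketch: isolate hits from well-known search-engine robots by user
# agent, then count how many times each URL was crawled. Real log analysers
# are more robust (e.g. they verify bot IPs, since user agents can be spoofed).
from collections import Counter

BOT_MARKERS = ("Googlebot", "Bingbot", "DuckDuckBot", "YandexBot", "Baiduspider")

def crawl_frequency(log_lines):
    """Return a Counter mapping URL -> number of robot hits."""
    hits = Counter()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 6:
            continue
        request, user_agent = parts[1], parts[5]     # 'GET /url HTTP/1.1', agent
        if any(bot in user_agent for bot in BOT_MARKERS):
            fields = request.split()
            if len(fields) >= 2:
                hits[fields[1]] += 1                 # fields[1] is the URL path
    return hits

# Hypothetical usage on a raw access-log file:
# with open("access.log") as f:
#     for url, count in crawl_frequency(f).most_common(20):
#         print(count, url)
```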

How to optimize your "crawl budget"?

Let’s take a concrete example: a web page is visited very often by robots but has not been updated in six months, while other pages on the same theme were created recently and are rarely visited. You can therefore update the older page and add a hypertext link to the new content you want to promote, helping the robots discover it and boosting the ranking of the most recent pages.

On a site with thousands or even millions of pages, reading these log files becomes a formidable weapon for optimizing your SEO and steering robots toward the pages with the highest ranking potential in search results.

DOWNLOAD THE SEO WHITE PAPER

Article written in collaboration with SiteImprove