The dangers of slow indexing: how to speed up the indexing of new pages in Google and Yandex

Everyone wants published materials to get into the index instantly and start bringing in visitors right away. This is especially true when promoting in Yandex, which sometimes needs several index updates before a new page appears. What if the article is no longer relevant by then? The resources spent creating it are wasted. If you want articles to be available in search within a couple of hours of publication, follow the tips below.

Why is the site not indexed?

There are three types of website indexing problems:

  • Pages take a long time to be indexed (often the cause is a low content-update frequency, so the robot simply visits the site less often; the problem may also lie in poor SEO, low-quality content, or the site as a whole);
  • pages are not indexed by either Yandex or Google (in this case, make sure crawling is not blocked in robots.txt, in the hosting settings, or in robots meta tags);
  • pages are not indexed by just one search engine (it is worth checking whether that engine has applied filters to the site).

The last two problems can be resolved either very quickly (if you just need to allow indexing) or only after a long effort (if you have to get the site out from under Yandex or Google sanctions). In this article we will look at ways to speed up indexing.

Ways to speed up website indexing by search engines

1. Improve your on-page SEO

Set up internal linking

In the body of articles, place links to other pages of the site. Also add related posts (at the end of articles) and popular or new posts (in the sidebar). Do not nest pages more than 3-4 levels deep: pages at levels 5 and 6 are indexed worse.

Want to learn how to place internal links so that they improve behavioral metrics and search rankings? The article "How to do cross-linking for successful SEO and increased conversions" describes the most effective approaches.

Write useful and unique texts

If you post plagiarized or superficially rewritten material, do not be surprised that the pages are not included in the index. Search engines consider such content duplicated and of no value to users. Read separately about what properly optimized texts for a website should look like.

Publish posts and update pages more often

Search engines love “live” sites. If you haven’t posted anything for several months, the robot stops visiting the site regularly.

How to keep your site up to date:

  • publish new posts at least once a week;
  • add links from old pages to new materials;
  • regularly update information on existing pages;
  • configure the Last-Modified and If-Modified-Since headers (so search engines will crawl old pages only after changes have been made to them).

Eliminate duplicates on the site

Duplicates not only create inconvenience for webmasters, but also worsen the quality of the site in the eyes of search engines. They can be:

  • explicit (URL with and without a slash at the end, with and without WWW, with HTTP and HTTPS, with GET parameters);
  • implicit (duplicate pages with comments, product pages with similar descriptions, pagination pages).

They are eliminated by setting up 301 redirects, specifying canonical pages, and using robots.txt. All methods, with examples, are described in the article "What are duplicates on a website and how to deal with them?".
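To make the explicit-duplicate problem concrete, here is a minimal sketch (a hypothetical helper, not tied to any CMS) of how the common URL variants can be collapsed into one canonical form, the same form a 301 redirect or a canonical tag would point to:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Collapse common duplicate variants into one canonical form:
    force HTTPS, strip "www.", drop GET parameters and the trailing slash."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/") or "/"
    # query and fragment are dropped entirely
    return urlunsplit(("https", host, path, "", ""))

# All four variants resolve to the same address:
canonical_url("http://www.example.com/page/?utm_source=x")  # https://example.com/page
```

Note that dropping all GET parameters is only safe when they never change the content (tracking tags, sort orders); parameters that select a real page must be kept.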

2. Use webmaster tools

After launching the site, be sure to register it in the webmaster panels: Yandex.Webmaster and Google Search Console. This way you inform the search engines about the new site, can track indexing dynamics, and receive notifications about errors.

After creating a new page or updating an old one, notify the search engines about it to speed up indexing (re-indexing). This is done through so-called "addurilka" tools (from the English "add URL").

In Yandex, this tool is located in Webmaster (Indexing → Page Recrawling). Enter the new URL and click Submit.


"Addurilka" Yandex

Before the Search Console interface changed in 2018, Google had a tool that allowed you to submit any page for re-crawling. There are still many articles online linking to this tool and describing how it works, but it is no longer supported.


Old "addurilka" Google

New pages no longer need to be submitted to Google for indexing anywhere: it is enough that they are listed in sitemap.xml. But old pages whose content you have changed can be re-indexed faster. To do this, open the URL Inspection tool in the new Search Console interface, enter the address of the page and click "Request Indexing". This signals the robot to crawl the page out of turn.

These actions do not guarantee that the page will be included in the index or speed up the process (the search engines themselves do not promise anything). But since such tools exist, it would be a sin not to use them.

When a site has 10, 100, or 1,000 pages, there is usually no need to worry about indexing completeness. But for a large project (mainly online stores) with hundreds of thousands of documents (sometimes millions), indexing problems can arise.

Below I will tell you how to improve indexing to achieve the maximum number of effective landing pages in the search engine index.

Indexing is the foundation of search traffic: an unindexed page will not bring visitors from search. Poor indexing quality is one of the main signals of ranking problems on a site.

Search engines have become more selective than before. Every year the number of sites, and accordingly pages, grows almost exponentially. Indexing everything published on them would mean extra server capacity, so search engines choose what and how much to index.

What to start from?

You can't just type site:domain.ru into Google, look at the results, and conclude that your site is poorly indexed. Why? First, the number of indexed pages should be checked in Google Search Console and/or Yandex.Webmaster: data from search results can differ significantly from the real figures.

Second, the resulting number of documents in the index must be compared with the number of useful pages on the resource. These include:

  1. categories/sections;
  2. product cards;
  3. blog articles;
  4. filters, labels, or tags (if indexed);
  5. user reviews;
  6. product overviews;
  7. service pages ("About the company", "Services", "Payment", "Delivery", etc.);
  8. brands/manufacturers;
  9. news.

In general, you need to analyze the project structure and add up the number of all useful pages, then compare that total with the number of documents in the index according to the search engines. If the index holds noticeably fewer pages than the useful total, part of the site is not being indexed; if noticeably more, junk pages are getting in.

Main tasks:

  1. prohibit indexing of everything unnecessary on the site;
  2. serve only changed and not-yet-indexed content to robots;
  3. help search engines find the important pages.

1. Set up Last-Modified and If-Modified-Since

These are HTTP headers that tell the crawler when a page was last modified. If it has changed, the robot will visit it and index it (if it did not know about it before) or re-index it. If not (the server returns "304 Not Modified"), the robot will skip it and move on to the next page.

Why do this? Each resource has a limit on the number of documents crawled at a time: the crawl budget. Its size is influenced by the quality of the project, the freshness of the content, the update frequency, and so on.

As a result, the crawl budget is not spent on unchanged pages, and the number of indexed documents grows.
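As a sketch of the mechanism (a hypothetical handler, not any specific framework's API), here is how a server decides between 200 and 304 for a conditional request from a crawler:

```python
from email.utils import parsedate_to_datetime, format_datetime
from datetime import datetime, timezone

# Hypothetical store of when each page last changed.
LAST_MODIFIED = {
    "/blog/old-post": datetime(2023, 5, 1, tzinfo=timezone.utc),
}

def respond(path, if_modified_since=None):
    """Return (status, headers) for a conditional GET: 200 with a
    Last-Modified header if the page changed after the crawler's cached
    date, 304 Not Modified otherwise."""
    modified = LAST_MODIFIED[path]
    if if_modified_since is not None:
        cached = parsedate_to_datetime(if_modified_since)
        if modified <= cached:
            return 304, {}
    return 200, {"Last-Modified": format_datetime(modified, usegmt=True)}
```

On the first visit the robot gets 200 plus a Last-Modified date; on later visits it sends that date back in If-Modified-Since and, if nothing changed, receives a cheap 304 with no body.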

2. Get rid of duplicates and junk pages

The second point, but not the least important. Crawl budget can be wasted not only on unchanged content but also on duplicates and "garbage". What does this include? Mainly pages such as:

  • internal search results;
  • pages with 404 errors;
  • calendar pages;
  • URLs with session IDs;
  • filters;
  • labels/tags;
  • low-quality content: empty or spammy pages;
  • service pages (trash, admin panel);
  • printable versions;
  • sortings.

This is not a complete list. All cases are individual. As a rule, a quality audit dots the i’s.

On a note: redirects and alternative pages (hreflang variants and the like) also consume your crawl budget.

3. Organize competent linking

This point lies on the surface, but not everyone uses it. Internal linking is a strong factor for improving indexing. What to pay attention to here:

  • a "similar products" block;
  • related products;
  • popular products;
  • products with discounts and promotions;
  • previous/next product;
  • breadcrumbs.

The shallower the page nesting, the better for indexing. Make sure any document can be reached from the main page in 3-5 clicks.
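Click depth is easy to audit with a breadth-first search over the internal link graph. A minimal sketch, where the hypothetical `links` dictionary maps each page to the pages it links to:

```python
from collections import deque

def click_depths(links, start="/"):
    """Breadth-first search over the internal link graph: the depth of a
    page is the minimum number of clicks from the main page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/catalog"],
    "/catalog": ["/catalog/shoes", "/"],
    "/catalog/shoes": ["/item-1"],
}
print(click_depths(links))  # {'/': 0, '/catalog': 1, '/catalog/shoes': 2, '/item-1': 3}
```

Pages missing from the result are orphans that the robot cannot reach through links at all; pages deeper than 5 clicks are candidates for extra internal links.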

4. Create a Sitemap: XML + HTML

Also a well-known point, but with its own subtleties. Common mistakes:

  • broken submaps with old URLs (once everything worked, but after a while it stopped - you need to check the sitemap sometimes);
  • all links on one page (limit 50,000 URLs per map; if more, create several submaps);
  • sitemap on a subdomain (you need to publish the map within the main domain name).

There is one more interesting approach: add to the sitemap not all documents but only the non-indexed ones. I have not tested this method myself, but there is logic to it. Two caveats: first, it is not entirely easy to implement (it requires extra development and integration with Yandex.Webmaster/Google Search Console); second, it only makes sense when the task is to push a really large number of URLs into the index.
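The 50,000-URL limit from the list above can be handled mechanically. A minimal sketch using only the standard library (the example URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemaps(urls, limit=50000):
    """Split a URL list into sitemap files of at most `limit` entries each
    (the sitemap protocol caps a single file at 50,000 URLs) and return
    them as XML strings."""
    sitemaps = []
    for i in range(0, len(urls), limit):
        root = ET.Element("urlset", xmlns=NS)
        for url in urls[i:i + limit]:
            ET.SubElement(ET.SubElement(root, "url"), "loc").text = url
        sitemaps.append(ET.tostring(root, encoding="unicode"))
    return sitemaps
```

When the list is split into several files, the protocol expects a sitemap index file that references each part; all parts should live on the main domain, matching the advice above.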

5. Reduce site loading time and optimize your server

Many webmasters know that crawling a slow site with, for example, Screaming Frog SEO Spider or Xenu takes noticeably longer. Indexing robots work on a similar principle. The fewer 5xx errors, and the faster the:

  • response from the server;
  • page generation;
  • content delivery;
  • and so on.

the easier it is for a search spider to interact with the site, and the more fully the site will be indexed.

This point matters not only for indexing but also for ranking, so it should not be neglected.

6. Gradually increase the number of pages

Set up a scheduler that publishes pages gradually. For example, spread the total volume over 2-6 months (depending on scale). This makes it easier for the robot to index the resource. A bonus: since the project is constantly updated, it looks actively developed (a ranking signal).

7. Direct traffic to non-indexed pages

"Traffic and indexing: what is the connection?" you may ask. There may not have been one before, but there is now. Two points:

  • Statistics services (for example, Yandex.Metrica) can submit unindexed documents for indexing. Nothing stops Google Analytics from doing the same (although I have found no official confirmation).
  • Browsers (Yandex.Browser, Google Chrome) can do the same.

The logic: if a page is visited, it is most likely useful and so belongs in the index. Install the search engines' statistics services and buy traffic. It does not have to be expensive contextual advertising; you can use other sources:

  • social media;
  • teaser networks;
  • paid-task services;
  • active advertising (paid-to-click) systems.

Accordingly, traffic flows to non-indexed documents.

8. Provide links

Another signal to search engines that the site should be indexed as completely as possible. Of course, you cannot buy permanent links to every page; what matters more here is the overall level of search engine trust in the site.

You can run a general link-building strategy for the main promoted pages (this is useful in any case). A slightly riskier option is rented links, but without fanaticism: point them at deeply nested pages and use them sparingly.

Adding several million documents to the index is not an easy task. In some cases just 1-2 of these points may help, but it is better to apply the whole set. Which methods do you consider most effective? Perhaps you know other options? I will be glad to discuss this topic with you in the comments.

Indexing is the process by which a search engine adds information about a site to the database it uses to build search results. Indexing is performed by the search engine's robot.

Why is it necessary to speed up site indexing in Yandex and Google?

Fast indexing also protects a site from content theft. A problem arises when your text is republished on a resource that is indexed faster: users will more often see the information there, and the search engine may decide that all other copies, including yours, are borrowed.

Typically, pages enter the index with a delay of 3-15 days. Clearly, this is a huge problem for news and event resources. How often are you interested in news from two weeks ago?

The site's position in the list of search results and the number of visitors to the resource indirectly depend on the speed of indexing.

Features of accelerated indexing of sites by search engines

Different search engines index sites differently. As a rule, Google indexes a new page earlier than Yandex.

Google indexing takes 2-3 days, after which the page becomes available to users.

To see in which direction to develop your site for quick indexing by Google, just look at the search results for the last 24 hours on your topic.

Yandex indexing works differently: pages enter the index when the search results are updated. There is also a "fast robot" that can bring a page into search results literally within a day. If this happens, you do not have to wait for the next Yandex results update.


There are a number of ways to speed up site indexing.

Registration in panels for webmasters of search engines.

Special services maintain the history of sites: their indexing, keyword analysis, and click-through statistics. Adding a site to Yandex.Webmaster and Google Search Console lets you perform many useful actions to speed up indexing.

The webmaster panels are located at https://webmaster.yandex.ru for Yandex and https://www.google.com/webmasters/tools for Google. To work with Google services you need a Google account (and for Yandex, a Yandex account).

It can also be useful to register with the panels of other search engines, such as Rambler, Yahoo, and Aport.

Adding a new page via search engine "add URL" forms

Submitting a page through an "add URL" form is the only way to speed up indexing that Yandex recommends.

Creating the correct robots.txt file

The rules for indexing the site and its links are written in the robots.txt file. You can create this file yourself, but there are also special services for it.

There is no ready-made recipe here: which "extra" files and folders to close depends on the site engine you use. Here is an example of robots.txt for an online store on OcStore (a Russian fork of OpenCart).
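Since the original example did not survive the page, here is a rough sketch of what such a file might contain; the exact routes are assumptions and depend on your store's configuration. It is checked below with Python's standard robots.txt parser:

```python
from urllib.robotparser import RobotFileParser

# A rough sketch for an OcStore/OpenCart-style shop; adjust to your setup.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /system/
Disallow: /index.php?route=checkout/cart
Disallow: /index.php?route=product/search
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Service pages are closed, product pages stay open.
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/iphone-case"))     # True
```

Real-world files for stores usually also close sorting and pagination parameters with wildcard rules (e.g. `Disallow: /*?sort=`), which Google and Yandex understand but Python's standard parser does not.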

The finished file is placed in the root folder of the site and then checked in the webmaster services.

If, for example, you do not close off sorting parameters, hundreds or thousands of near-identical pages may get into the index. The search engine will then consider them duplicates and throw all but one out of the index, and the chance that the remaining one best answers the user's query is minimal.

Therefore, find the recommended robots.txt for your site engine, with explanations of what is closed and why. Very important: do not add directives blindly. For example, a bare "Disallow: /" directive prohibits indexing of the entire site, so not a single page will appear in search results.

Creating a sitemap in sitemap.xml format

The sitemap.xml file is created specifically for search engines. It contains the list of pages to be indexed, and each new article must be added to it.

Before uploading it to the webmaster tools, check the XML map for validity.

Creating a sitemap in html format

An HTML sitemap is useful for any resource, ideally listing all pages of the site. When a new article appears, add it to the HTML map immediately. This file serves both indexing and user convenience.

Quality and frequency of publishing unique content

How often the robot visits the site depends on how often new articles appear. Regularly adding new information greatly increases indexing speed. It is useful to set a strict publication schedule, say once every 3 days or twice a week: the robot will learn to visit the site on that schedule and look for new articles.

That is why news resources publish dozens of pages a day: it guarantees that the search robot visits the site every day. For the same reason, posts on social networks appear in search results very quickly.

Low quality content negatively affects indexing. It is not recommended to allow:

  • copying other people's articles;
  • publishing text with a large number of errors;
  • adding articles with a high density of key phrases.

Article length also affects indexing speed. It is advisable to publish texts of at least 1,500 characters with uniqueness above 80-90%, and to add at least one image to each article.
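The "high density of key phrases" warning above can be made measurable. A minimal sketch; the thresholds are illustrative assumptions, not official search engine values:

```python
def keyword_density(text, phrase):
    """Share of words belonging to occurrences of `phrase` in the text."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return hits * n / len(words) if words else 0.0

def passes_basic_checks(text, phrase, min_chars=1500, max_density=0.05):
    """Illustrative thresholds: ~1,500+ characters and a moderate
    key-phrase density, per the recommendations above."""
    return len(text) >= min_chars and keyword_density(text, phrase) <= max_density
```

For instance, in a six-word text where "buy shoes" occurs twice, the density is 4/6: two-thirds of the words belong to the key phrase, which is clear over-optimization.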

Use of social networks

The minimum you can and should do is cross-post to social networks, i.e. publish site materials, in whole or in part, on social media.

Search engines monitor the appearance of new information in social bookmarking services such as Toodoo, BobrDobr, MoeMesto, and Delicious. To speed up indexing of a page, you can leave links to it there. Bookmarking services are no longer popular, but the method still works.

Placing external links to new articles

Besides adding links from social networks yourself, you can order them. Such links are indexed quickly and help the pages they point to get indexed too. Inexpensive links can be ordered from LiveJournal blogs, for example.

Today, indexing speed depends only minimally on new external links. Given the high cost of links on authoritative sites, it is cheaper to publish your own thematic content daily.

Adding a site to various ratings

Internal linking of website pages

Internal linking improves search results and speeds up indexing. The essence of the method is connecting all articles with links; this is especially important for young sites.

An example of linking on our website:

A popular technique is a "recent articles" widget in the sidebar. Having a link to new material on every page of the site speeds up the robot's reaction to its appearance.

Other ways to speed up indexing

  1. Creating an RSS feed on the site. An RSS feed lets you distribute a link to a new page across a network of sites; services such as Pingxpetfree and RSS aggregators are suitable for this. It is also recommended to create a Feedburner account and syndicate your posts there.
  2. Using the "three-click rule". To improve indexing and behavioral factors, arrange pages so that no page is more than three clicks from the main page.
  3. Announcing new articles. Regular announcements of new articles on blogs and social networks attract the robot to the site. Indexing speed grows thanks to links from Twitter, VKontakte, LiveJournal, and LiveInternet. The success of this method also depends on the popularity of your page on the social network or blog platform.
  4. Activity on forums. Posts and comments on forums and blogs work well for getting a new site indexed. Focus on popular, well-indexed resources; there is no need to use this method for every single post.
  5. Adding social sharing buttons to the site. Social buttons make it easier for users to share a link to a page, and clicks on them help increase traffic to the resource.

As you can see, there are many ways to speed up the indexing of a site in Yandex and Google, and some of them, in addition to indexing, can bring an interested audience to your site.

Live and learn: an excellent saying that reminds us that you cannot grasp the immensity and know everything at once. I am glad I finally learned about this method; better late than never. If you do not yet know how to quickly index a new article, read carefully and do not miss this trick: "How to speed up indexing of an article in Google".

First, here is my own result of speeding up an article's entry into Google. I will say right away that 4 minutes is not the fastest result: I only remembered by chance that I needed to tell my readers about this trick, and only after completing all the steps did I decide to check the article in Google search. I was very pleased to find the freshly written article on the third page of the results.

What should be done?

Go to Google Webmaster Tools and find the "Crawl" section on the left side of the control panel. In it, select "Fetch as Google".

Enter the address of the page without the domain name and click "Fetch", then "Add to Index". In the form that appears, click "Crawl this URL only".

Once again, note: if the page address is https://site/internet-uroki-soveti/kak-uskorit-in...stati-v-google, you only need to enter internet-uroki-soveti/kak-uskorit-in...stati-v-google. After these steps, the page appears in the panel with the "complete" status. If you see the "partial" status in the screenshot, that is exactly the consequence of entering the page address incorrectly. I got it wrong once myself; it happens.

One more trick

But that is not all: you can also speed up the indexing of the entire site at once. You have probably seen a chart in your webmaster panel showing the number of indexed pages on your site. The picture shows data from two sites:

The most alarming is the first picture, where the number of indexed pages differs sharply from the number submitted to the index. That is, I submit pages, but they categorically refuse to be indexed. What to do? Treat it: submit the sitemap.xml map for indexing, just as we did with the site page in "Fetch as Google". This way we draw the search engine's attention to the site and then watch closely how the situation changes.

If Google will not look at our sites on its own, let's send it personal invitations. After all, promptly indexed articles bring us visitors eager to find answers to their questions, which is exactly what we are after: growth in visitors from search engines.

If you are already convinced that your article or site is indexed, then you can rest. I suggest you watch a cool video about North Korea. Just beautiful!

Dear readers, today we will talk about speeding up the indexing of a website or blog.

What is site indexing? It is the addition of your newly written articles to the index (database) of search engines. In other words, the faster the search robot reaches your site and copies new articles, the faster they are indexed, and the better. Now let me explain everything in order.

Why is it important to speed up site indexing?

Create an XML sitemap

To speed up indexing, the most important thing, of course, is the XML sitemap. Do not forget to specify its address in the webmaster panel and in the robots.txt file. This method is good because you set it up once and forget it.

Set up your robots.txt file

Use search engine boosters

Addurilka (English: Add URL - add address) is a service through which you can inform the search engine about the release of a new article. This is a kind of queue for indexing, which you join when you write a new article.

Moreover, after adding an article to the Yandex addurilka you will get feedback on the indexing result, or the service will report an error or the impossibility of indexing the article. So this service is very convenient for speeding up indexing, provided you are not too lazy to add each new article to it.

Set up ping

A ping is a signal your server sends to search engines when an article is published, telling them that the new article needs to be indexed. This method of speeding up indexing is also convenient because it requires nothing beyond a one-time setup.

Setting it up is easy: in WordPress, go to "Settings" → "Writing". At the very bottom, in "Update Services", enter a list of ping servers. Copy a complete ready-made list there.
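Under the hood, each entry in that list receives an XML-RPC call named weblogUpdates.ping. A minimal sketch with Python's standard library showing the payload the blog engine builds (the Ping-O-Matic endpoint in the comment is just one well-known example):

```python
import xmlrpc.client

def build_ping(blog_name, blog_url):
    """Build the weblogUpdates.ping XML-RPC payload that a blog engine
    sends to each update service when a post is published."""
    return xmlrpc.client.dumps((blog_name, blog_url), "weblogUpdates.ping")

payload = build_ping("My Blog", "https://example.com/")

# In real use the payload is POSTed to each ping server, e.g.:
#   xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/").weblogUpdates.ping(
#       "My Blog", "https://example.com/")
```

The server answers with a small XML-RPC response saying whether the ping was accepted; WordPress handles all of this automatically once the list is filled in.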

We could talk endlessly about how to speed up site indexing in Yandex and Google; the methods covered here are, in my opinion, the most successful and useful.

I advise you to use as many of the listed methods as possible, and the results will not be long in coming. And if you know other important and effective methods, be sure to write them in the comments.

Fast indexing everyone!