Fast indexing of pages. Step-by-step instructions for setting up indexing. Bulk check of pages for indexing

How do you get fast indexing in Yandex? Read on and find out!
All webmasters and bloggers know that Google indexes any site and any update very quickly and without problems. But with Yandex, things are not so simple. At the end of the article, I share a piece of personal experience that recently helped a lot.

First, you have to wait for weeks until the Runet's search leader deigns to notice a new project. And even once the pages are indexed, there is still no stability: pages keep dropping out of the Yandex database. Additional efforts are required, which are discussed below.

What is needed for fast indexing of pages in Yandex

A list of the main activities helping to improve the indexing of a site in Yandex:

  1. Website creation in accordance with the recommendations of the search engine.
  2. Correct configuration of the robots.txt crawler management file.
  3. Sitemap creation.
  4. Registering a site in Yandex (adding a URL).
  5. Submitting the robots.txt file and the XML sitemap to Yandex.
  6. Regular posts of new quality content.
  7. Preliminary registration of authorship in Yandex.

This search engine has learned to understand the meaning of texts almost like a person. In any case, it is quite capable of distinguishing an informational exclusive from a cheap rewrite automatically... As they say, without human help.

Therefore, the main focus is on the quality of publications. Size matters too: lately the requirements for text volume have increased, and Yandex now wants to see articles of 4,500 characters or more.

Clear structuring of the site matters as well: if robots cannot reach the pages, what indexing can there be? And the fact that pages appear and then drop out of the index suggests that the algorithms doubt the quality of the content.

Technical issues come down to the control files. Check the crawl-frequency settings declared for your pages: the Crawl-delay directive in robots.txt and the per-page <changefreq> hints in sitemap.xml (see the sketch just below). It also happens that a webmaster declares a monthly update frequency and then wonders why indexing is slow.
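
For a quick sanity check, you can list the per-page <changefreq> hints before blaming the robot. A minimal Python sketch, assuming a sitemap at the hypothetical address https://example.com/sitemap.xml:

    # list each URL's <changefreq> hint from a sitemap
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://example.com/sitemap.xml"  # placeholder address
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL) as resp:
        tree = ET.parse(resp)

    for url in tree.getroot().findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        freq = url.findtext("sm:changefreq", default="(not set)", namespaces=NS)
        print(loc, "->", freq)

If every page says "monthly", do not be surprised that the robot is in no hurry.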

The more frequent the updates, the better the indexing. Robots do not visit dead resources. And the best indicator of real life is likes and shares on social networks, so don't forget to add Like & Share buttons.

Additional measures to speed up indexing in Yandex

Creating a blog on the Twitter platform works very well for indexing. Provided there are several thousand active readers, one tweet with a link to the update is enough for the page to fly into the Yandex index like a bullet.

Yandex now has its own ping service! So you no longer have to worry about running updates through a lot of third-party pingers, although doing so will not be superfluous. Some content management systems have special bulk-ping plugins with a built-in database of ping services: find one, install it and configure it. A do-it-yourself ping sketch follows below.
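
If you do want to ping by hand, most ping services still speak the old weblogUpdates XML-RPC protocol, which is exactly what those CMS plugins automate. A minimal Python sketch; the Ping-O-Matic endpoint is only an example, and the blog name and URL are placeholders:

    import xmlrpc.client

    # example third-party ping service endpoint
    server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")

    # the classic call: weblogUpdates.ping(blog_name, blog_url)
    result = server.weblogUpdates.ping("My Blog", "https://example.com/")
    print(result)  # usually a dict like {'flerror': False, 'message': '...'}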

Fast indexing of pages in Yandex: personal experience

Recently, after purchasing a new site and transferring it to Beget hosting, about 50 pages flew out of the Yandex index, and new articles were not indexed for about two weeks. I decided to drive the pages into the search through two old acquaintances, indexgator.com and fastbot.org. Both services work on the principle of posting links on Twitter and LiveJournal blogs. They are not expensive, around 50 kopecks per URL, but, it seems to me, not efficient enough.

Without waiting for the desired result, I decided to try getbot.guru... It helps with indexing only in Yandex; it is unlikely to help with Google. Getbot has 3 tariff plans:

  • Absolute.
  • Express.
  • Express light.

Prices range from 0.1 to 2 rubles per URL, depending on the chosen tariff plan. I chose "Absolute" (2 rubles per URL), where the average indexing rate is 80-90%, and finally got what I wanted. What I liked: indexing is checked automatically, and if a page does not make it into the index, the money is returned to your account and you can order the same URL again.

The option described next could also be counted among the additional measures, but since the social network Ya.ru closed, it can be considered non-working. True, the search engine made it clear that LiveJournal will now stand in for Ya.ru. Or did it just seem so to me? 🙂 In any case, it exported all contacts there. Personally, I now send every article to LiveJournal, and new pages fly into the index quickly.

Optimizers who like to tinker with analytics noticed long ago that sometimes it is enough to open a page in a browser with the RDS bar, and the article is indexed immediately. You can see this in the RDS bar itself. How it works is unknown, and there is no guarantee that it will always work.



Site indexing is the process of finding, collecting, processing and adding information about a site to the search engines' databases.


Site indexing means that a search engine robot visits the resource and its pages, examines the content and enters it into the database. This information is then served for key queries: netizens enter a query into the search bar and receive a response in the form of a list of indexed pages.

In simple terms, it comes out something like this: the whole Internet is a huge library, and any self-respecting library has a catalog that makes it easy to find the information you need. In the mid-90s of the last century, all indexing came down to such cataloging: robots found keywords on sites and formed a database from them.

Today, bots collect and analyze information against several parameters (errors, uniqueness, usefulness, availability, etc.) before entering it into the search engine's database.

Search engine algorithms are constantly updated and become more and more complex. Databases contain a huge amount of information, yet finding what you need does not take much time. That is an example of quality indexing.

If the site has not been indexed, then the information may not reach users.

How Google and Yandex index sites

Yandex and Google are perhaps the most popular search engines in Russia. In order for search engines to index a site, you need to tell them about it. This can be done in two ways:

  1. Add the site for indexing through links on other resources on the Internet - this method is considered optimal, since the robot deems pages found this way useful, and their indexing takes place faster, from 12 hours to two weeks.
  2. Submit the site for indexing manually, by filling out a special search engine form in the services Yandex.Webmaster, Google Webmaster Tools, Bing Webmaster Tools, etc.

The second method is slower, the site is queued and indexed for two weeks or more.

On average, new sites and pages are indexed in 1-2 weeks.

It is believed that Google indexes sites faster. This is because the Google search engine indexes all pages, both useful and unhelpful. However, only high-quality content gets into the ranking.

Yandex works slower, but indexes useful materials and immediately excludes all junk pages from the search.

The site is indexed like this:

  • the search robot finds the portal and examines its contents;
  • the information received is entered into the database;
  • in about two weeks, the successfully indexed material will appear in the search results for relevant queries.

There are 3 ways to check the indexing of a site and its pages in Google and Yandex:

  1. using the webmaster tools - google.com/webmasters or webmaster.yandex.ru;
  2. by entering special commands in the search bar; for Yandex the command looks like this: host: site name + first-level domain, while for Google it is: site: site name + domain;
  3. using special automated services (a bulk-check sketch follows this list).
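
For a bulk check of many pages, method 2 can be scripted. A rough sketch built on Google's site: operator; note that search engines throttle automated queries and may answer with a CAPTCHA, so treat this as an illustration only (the dedicated services from method 3 are more reliable), and the URLs are placeholders:

    import time
    import urllib.parse
    import urllib.request

    PAGES = [
        "https://example.com/article-1",  # placeholder URLs to check
        "https://example.com/article-2",
    ]

    for page in PAGES:
        query = urllib.parse.urlencode({"q": "site:" + page})
        req = urllib.request.Request(
            "https://www.google.com/search?" + query,
            headers={"User-Agent": "Mozilla/5.0"},  # bare urllib is often rejected
        )
        with urllib.request.urlopen(req) as resp:
            html = resp.read().decode("utf-8", errors="ignore")
        # crude heuristic: an indexed page shows up as a result containing its URL
        print(page, "indexed" if page in html else "NOT indexed")
        time.sleep(5)  # be polite; rapid-fire queries trigger CAPTCHAs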

Checking indexing

This can be done using:

  1. search engine operators (see the search engine's help section for the exact syntax);
  2. special services, for example the RDS bar;

How to speed up website indexing

How quickly new material appears in the search results depends on how quickly the robots index it; and the faster that happens, the sooner the target audience comes to the site.

To speed up indexing by search engines, you need to follow a few guidelines.

  1. Add the site to the search engines.
  2. Regularly fill the project with unique and useful content.
  3. Make site navigation convenient: any page should be reachable within 3 clicks of the main page (a link-depth check is sketched after this list).
  4. Place the resource on fast and reliable hosting.
  5. Correctly configure robots.txt: eliminate unnecessary bans and close service pages from indexing.
  6. Check for errors and keyword counts.
  7. Make internal links (links to other pages of the site).
  8. Post links to articles on social networks and social bookmarks.
  9. Create a sitemap; you can even create two, one for visitors and one for robots.
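
As a quick check of guideline 3, a small crawler can measure how many clicks each page is from the main page. A rough standard-library sketch; example.com is a placeholder, and a real run needs politeness delays and better error handling:

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    import urllib.request

    START = "https://example.com/"  # placeholder main page
    MAX_DEPTH = 3

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    depth_of = {START: 0}  # each page recorded at its shallowest click depth
    queue = deque([START])
    while queue:
        url = queue.popleft()
        if depth_of[url] >= MAX_DEPTH:
            continue
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="ignore")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href).split("#")[0]
            # stay on the same site, count each page only once
            if urlparse(link).netloc == urlparse(START).netloc and link not in depth_of:
                depth_of[link] = depth_of[url] + 1
                queue.append(link)

    for link, depth in sorted(depth_of.items(), key=lambda kv: kv[1]):
        print(depth, link)

Pages that never appear within 3 clicks here are exactly the ones robots are likely to miss.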

How to close a site from indexing

Closing a site from indexing means preventing search robots from accessing the site, some of its pages, or parts of text or images. This is usually done to hide classified information, technical pages, sites under development, duplicate pages, etc. from public access.

This can be done in several ways:

  • Using robots.txt, you can prevent the indexing of a site or page. To do this, a text file is created at the root of the website setting out the rules for search engine robots. These rules consist of two parts: the first part (User-agent) indicates the addressee, and the second (Disallow) prohibits the indexing of some object. (A sketch for verifying such rules follows this list.)
    For example, prohibiting indexing of the entire site for all search bots looks like this:

User-agent: *

Disallow: /

  • Using the robots meta tag, which is considered the most correct way to close a single page from indexing. With the noindex and nofollow values, you can prevent the robots of any search engine from indexing a site, a page or a part of the text.

An entry prohibiting indexing of the entire document will look like this:

<meta name="robots" content="noindex, nofollow"/>

You can also create a ban for a specific robot, for example:

<meta name="googlebot" content="noindex"/>
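
To double-check what your robots.txt rules actually allow or block (including the site-wide ban shown above), Python's standard library includes a ready-made parser. A minimal sketch, with example.com as a placeholder:

    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder site
    rp.read()

    # check individual URLs against the rules, per user agent
    print(rp.can_fetch("*", "https://example.com/"))           # False under "Disallow: /"
    print(rp.can_fetch("Yandex", "https://example.com/page"))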

How indexing affects promotion

Thanks to indexing, sites get into the search engine's results. The more often the content is updated, the faster indexing happens, since bots visit the site more often. The result is a higher ranking for queries.

Site indexing in search engines gives an influx of visitors and contributes to the development of the project.

In addition to content, robots evaluate traffic and visitor behavior. Based on these factors, they draw conclusions about the usefulness of the resource and visit the site more often, which raises it to a higher position in the search results. Consequently, traffic increases again.

Indexing is an important process for promoting projects. For indexing to be successful, search engines need to make sure the information is useful.

The algorithms by which search engines work are constantly changing and becoming more complex. The purpose of indexing is to enter information into the database of search engines.

Transitions of users to sites from search engines are one of the primary sources of visitors, that is, of potential customers for the product or service presented on the resource. Timely indexing of the site by search engines lets you avoid losing those customers. Therefore, the actions associated with representation in search should be a mandatory priority, especially for new sites.

Why it is worth focusing primarily on Google and Yandex when indexing

This is because the level of development of the main characteristics of these "search engines" surpasses that of all other systems available today:

  • Accuracy - how well the documents found by the system match the query. For example, when a user enters "buy a fur coat" into the search bar, the "search engine" returns results in which 90-100% contain that combination of words unchanged. The higher the percentage of matches, the better.
  • Completeness - the share of documents returned, relative to all those available on the network on the topic. If there are, say, 100 documents on the network on the question "Food for a 1 year old child" and the "search engine" submitted only 70 of them for consideration, completeness equals 0.7. The search engine with the higher value "wins".
  • Search speed is related to the technical characteristics and capabilities of each "search engine". The higher it is, the more satisfied users will be with the system.
  • The visibility of the search is the quality of presentation of information in response to a query and the system's hints regarding the documents found, i.e. the presence of simplifying elements on the results page.
  • Freshness - a characteristic denoting the time interval between information appearing and it entering the index database. Large search engines have a so-called "quick base", which allows them to index new information in a short time.

Step-by-step instructions for setting up indexing

Before you submit a site to search engines for indexing, you need to do some preliminary preparation. There are several reasons for this:

  • Competent preliminary work will keep the search robot from indexing unnecessary or incompletely prepared information.
  • If the robot detects flaws - missing metadata, grammatical errors, unclosed or uninformative links - the search engine will respond with a low rating for the site, incorrect presentation of the material in the search results, etc.
  • While the preparatory work is being carried out, the information must be hidden from the robots and excluded from indexing with a corresponding entry in the robots.txt file.

Proper preparation for indexing will include:

1. Development of meta tags, the description and the title of pages (an automated audit sketch follows this list):

  • Title must be no more than 60 characters. This is the main title of the page and the most important of the tags.
  • Description consists of readable phrases positioning the page: state the main theses of what exactly the material will discuss.
  • The keywords tag assumes listing all relevant words on the given topic. Recently, the value of this tag has diminished in the eyes of search engines.
  • The meta tag revisit (or revisit-after) suggests the period when site updates are planned; it is a kind of request-recommendation from the optimizer to the robot, indicating the optimal interval until the next check of the resource.
This tag should only be used with maximum confidence in the result. Otherwise, it can have the opposite effect.

2. Concealment of internal and uninformative sections of the site. This work is also done in the robots.txt file. The "search engine" considers this kind of information "weedy", so it counts as a minus when the resource is checked.

3. Be careful with highlighting keywords and key phrases in bold: the search engine regards such words as the most important, which is not always actually the case.

4. All existing images must have alt text.

5. Check the texts for keyword counts and repeated phrases, so that the robot does not ignore the information because of a high keyword-stuffing ("nausea") rate.

6. A mandatory item before submitting a resource to search engines for indexing is checking for spelling, grammatical and stylistic errors. If the description contains any, the system may present the information in a form that filters out a large share of would-be visitors already at the results stage.
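
A rough way to automate the checks from items 1 and 4 above is to fetch a page and flag a title over 60 characters, a missing description, and images without alt text. A standard-library sketch, with example.com as a placeholder:

    from html.parser import HTMLParser
    import urllib.request

    PAGE = "https://example.com/"  # placeholder page to audit

    class AuditParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.description = None
            self.images_missing_alt = 0
        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "title":
                self.in_title = True
            elif tag == "meta" and a.get("name", "").lower() == "description":
                self.description = a.get("content", "")
            elif tag == "img" and not a.get("alt"):
                self.images_missing_alt += 1
        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False
        def handle_data(self, data):
            if self.in_title:
                self.title += data

    with urllib.request.urlopen(PAGE) as resp:
        parser = AuditParser()
        parser.feed(resp.read().decode("utf-8", errors="ignore"))

    if len(parser.title) > 60:
        print("title is", len(parser.title), "chars (recommended: 60 or fewer)")
    if not parser.description:
        print("meta description is missing")
    if parser.images_missing_alt:
        print(parser.images_missing_alt, "image(s) without alt text")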

In order for a resource to appear among others in the search results for a user's query, you need to set up indexing in the main search engines:

1. Google Search Console:

  • The form for adding a resource is available at the link https://www.google.com/webmasters/tools/submit-url.
  • To use the service, you need to log in from your Google account.
  • A window will appear in which you must enter the address of the resource that requires indexing.
  • Website ownership is verified by uploading an HTML file to the root of the resource.
  • The Google system will issue a message confirming ownership of the site, which indicates the inclusion of the resource in the index by this "search engine".

2. Yandex.Webmaster:

  • Here the form for adding is located at: http://webmaster.yandex.ru/addurl.xml.
  • A form will open where you need to enter the address of the home page of the promoted resource. The system usually requires you to enter a captcha, after which you click the "Add" button.
  • The Yandex search engine checks the resource, after which it gives its decision on indexing. If Yandex writes that the site has been added, the resource has been queued for indexing. Server problems will cause the system to respond: "Your hosting is not responding."
If the "search engine" displays the message "The specified URL is prohibited from indexing", this indicates sanctions imposed on the site. In that case, you will need to urgently contact Yandex technical support specialists.

In addition to indexing in the main systems, do not forget about the slightly less well-known "search engines":

  • Rambler relies on indexing in Yandex, so to get into its base it is enough to be indexed by the main search engine.
  • Indexing in Mail.ru is performed here: http://go.mail.ru/addurl.
  • The traffic of the Russian search engine Nigma.ru is about 3,000,000 per day. You can apply for indexing in this system here: http://www.nigma.ru/index_menu.php?menu_element=add_site.

Other ways to customize indexing

Search engines decide whether to index a site regardless of the resource owner's wishes.

Therefore, the term "customization" in relation to the indexing process does not sound entirely correct.

It would be more correct to say about the formation of conditions for a search engine to make a positive decision about indexing a resource.

These conditions include:

  • Creating pages in social networks that tell about the resource, and directing a flow of visitors to it by posting interesting information with an offer to follow the link to clarify points of interest, place an order, or obtain more information on the issue.
  • To increase the likelihood of the site being approved and indexed in Google, it is useful to register an account in this system and be active with it.
It is necessary to understand that without indexing the site by search engines, all subsequent promotion actions will be useless.

That is why this should be done first of all (for new sites) and checked periodically as you add fresh information and whole new pages (for existing resources).

Many webmasters know what site indexing in search engines is. They eagerly await an update of the search database to rejoice at the indexing results, or to find and fix the optimization errors that interfere with high-quality indexing and further website promotion.

Thanks to high-quality indexing of sites on the Internet, you can find anything you want.

How does the indexing system work in major search engines?

Search engines have robot programs (search bots) that constantly "walk" the links in search of new pages. If they find a new page that meets the requirements of the given search engine's algorithm, it is included in the search results and indexed.


pic: Indexing helps find sites

The most valuable and at the same time most complex part is the search engines' algorithms for selecting pages for their search base. Each search engine has its own: some are better, some a little simpler. This must also be taken into account when indexing a site. They say you can find anything on the Internet. And thanks to what? Right: thanks to high-quality site indexing.

How do I add a site to the search engine index?

How do you quickly and easily add your site to the search engine index? It would seem there is nothing complicated about it: just put the site on the network, and the search engines will rush to it themselves. If everything were that simple, numerous SEO optimizers would be out of work.

Let's see what indexing is. Indexing is the process of adding your site's pages to the search engine's database. In simple terms, the search engine collects your pages so that it can later show them to users for specific queries. In what order to show them and for which queries is a topic for more than one article.

Indexing a site is quite simple: you need to "tell" the search engine that you have a site that may be of interest to it. Each search engine has a form for adding sites to the index. Here are the forms of some search engines:

  • Yandex: http://webmaster.yandex.ru/addurl.xml
  • Google: https://www.google.com/webmasters/tools/submit-url
  • Mail.ru: http://go.mail.ru/addurl

To speed up indexing, many recommend registering your site with social bookmarking systems. This is justified, since search robots (the programs that carry out indexing) visit such sites very often. If they see a link to your resource there, its indexing will not take long.

Registering a site in search engines and social bookmarks can be done either independently or entrusted to companies that deal with site promotion.

Why do I need indexing?

Do you need a website that increases your company's sales and promotes your products? Or maybe you need a website that is profitable by itself? Maybe you want to keep a personal diary and get paid for it? If you answered in the affirmative to any of these questions, then you should at least in general terms understand what the indexing of a site in search engines is.

Follow the main condition - create a site “for people”, convenient and with unique content.

Indeed, if your site is not in the search results of the largest search engines (Yandex, Google, Rambler...), you cannot even hope to make a profit or promote your goods and services. The site will be an extra burden, eating away at the firm's budget for its maintenance.

A completely different situation will arise if the site is indexed. Moreover, the more pages have been indexed, the better. The main thing that is necessary for successful indexing is optimization and uniqueness of the site content.

Search engines are developing rapidly, indexing algorithms are constantly being improved. Now search engines can easily identify plagiarism or unreadable text. Therefore, follow the main condition that is necessary for successful indexing - create a site “for people”, convenient and with unique content.

Site indexing not only brings a large number of targeted visitors (which ultimately affects sales of your company's products), it also contributes to the development of the project itself and can point the site owner toward a more promising way of expanding the Internet project.

How often does indexing happen on the Internet?

On many large forums devoted to website promotion, you can find topics with approximately the same titles: search base updates ("ups"). What are they, and how often are search engine databases updated? How does all this affect indexing? Let's try to figure it out.

A person with a little knowledge of Internet terminology probably knows what an "up" is. But what an up of the search base, or an indexing update, is, only those engaged in site promotion know. It is clear that search engine data cannot be updated continuously: this is fraught not only with banal server overloads, but also with equipment failure. Of course, small databases can change their state constantly, but the databases of search engines, which are responsible for site indexing, are a completely different matter.

Imagine how many requests the index database receives every second. And what would become of it if the indexing information were changing in parallel? Naturally, it might not withstand the load, which is exactly what was observed at the dawn of search engines.

Today this problem is solved in a fairly universal way: indexing data from search robots is stored in temporary databases, and the "main" database is updated with a delay of several days. As a result, site indexing in the major search engines is quite fast and glitch-free.

Preparing the site for indexing

Many novice webmasters on specialized forums ask the same question: how to properly prepare a site for indexing. Perhaps these recommendations will help you:

  1. Successful indexing requires high-quality unique content. This is perhaps the first and most important condition. If your site uses "stolen" content, then the likelihood that the indexing will be successful is low.

  2. Do not use "gray" and "black" page optimization methods: give up, once and for all, lists of keywords in the page's background color, as well as various iframe constructions. If a search engine robot suspects you of such violations, the domain name may be banned from indexing altogether.

  3. After you have uploaded the site to the server, do not rush to add it everywhere possible. Check the content again, validate the code, and review the internal linking of pages. If everything is done correctly, notify the search bots and invite them to index the site.

  4. Check the meta tags (the keywords and descriptions in them), page titles and image alt texts. If all this is in place, you can safely proceed with indexing.

  5. Add your site to search engines through special panels.

As you can see, the tips are pretty simple. But for some reason, many novice optimizers do not pay enough attention to them, and then complain that the indexing of their sites is delayed for several months.

Other materials

Create a sitemap. If you are using a CMS ("engine"), the easiest way is to install a plugin that will automatically generate the map and update it when new pages appear. If you built a static site, you will need to create the sitemap yourself or with special services; a generation sketch follows below. On the Internet you can find the full list of recommendations from the search engines. In particular, it is better to make the map in XML format and place it in the root folder.

Next, you need to add it to the webmaster panels of the various search engines. This takes a couple of clicks and is not complicated. The map lets search engines know about all the pages present on your site, determine how important they are, and see where they sit in the site structure.
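
For a static site, a minimal generator takes only a few lines. A sketch with placeholder URLs; a real sitemap may also carry <lastmod> and <priority> for each page:

    import xml.etree.ElementTree as ET

    PAGES = [
        "https://example.com/",           # placeholder page list
        "https://example.com/about",
        "https://example.com/article-1",
    ]

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=NS)
    for page in PAGES:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        ET.SubElement(url, "changefreq").text = "weekly"  # crawl-frequency hint

    # write to the web root so it is served at /sitemap.xml
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)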

Non-link methods

To index all old records, you can add them manually through a special service in the webmaster's tools. There are special programs that help automate this process, but you will have to enter the CAPTCHA codes manually. If you do not want to waste energy on that, use the services of sites like Antigate.

Regular publishing helps increase the indexing speed of new pages. Search engines see that the content on your resource is updated frequently and send a robot to check for updates much more often. However, this is not a quick process: you need to publish content every day for at least three months, preferably several times a day. And in addition to regular publishing, you need to apply methods that accelerate indexing.

Link methods

You can use runs on bookmarking sites. This lets the robot quickly find and analyze your pages, but the method has lost its relevance over time and is used very rarely today. Instead, it is better to use social signals: posts on various social networks that link to the new page.

Good links help increase the number of pages in the index. They need to be bought from resources respected by the search engines. As a rule, these are large portals with high TIC and PR values that often publish new content. It is not necessary to buy a perpetual link; a temporary link for a couple of months is enough.

Internal linking also improves page indexing. Place as many links as possible to other pages within the site, that is, link the articles to each other. This lets the robot quickly traverse all the pages of your resource and add them to the database.