"Gray" optimization methods. Gray methods of search engine optimization

Description

Gray optimization is the main tool of website promotion. Unlike black methods, gray ones rarely lead to a ban or penalties from search engines, which allows specialists to use them to bring resources to the top positions in a relatively short time.

Gray methods of search engine optimization include increasing the frequency of keywords in page texts (which often makes them unreadable), doorways without a redirect (where landing on the doorway does not automatically forward the visitor to the promoted resource), buying links, and much more.

Increasing Keyword Frequency

When determining relevance, search engines first of all look at how many times a page contains a phrase identical to the user's query. This parameter is called keyword frequency: the higher it is, the more relevant the site is considered. Until recently, optimizers deliberately increased keyword frequency to the point of making texts completely illegible. Today, search engines actively combat such methods and lower a site's ranking when they are detected.

Keyword frequency is determined with mathematical algorithms that calculate the number of query occurrences relative to the total volume of the text; a ratio of 3-5% is generally considered optimal. Since search robots cannot judge texts for readability, optimizers can raise keyword frequency up to a certain limit that, on the one hand, bends the search engines' rules of use and, on the other, does not formally exceed the thresholds they set.
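As a rough illustration of how such a ratio can be computed, here is a minimal Python sketch of a keyword density calculation. It is a generic formula, not any particular search engine's algorithm, and the sample text and phrase are invented:

```python
# Generic keyword-density sketch: the share of words belonging to
# occurrences of a phrase, relative to the total word count.
# Illustrative only; real search engines use far more elaborate models.
import re

def keyword_density(text: str, phrase: str) -> float:
    words = re.findall(r"\w+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

text = ("Buy cheap widgets online. Our cheap widgets are the best "
        "cheap widgets on the market today.")
print(f"{keyword_density(text, 'cheap widgets'):.1f}%")  # 37.5%
```

By this measure the invented example is wildly oversaturated; a text inside the 3-5% range mentioned above would use the phrase only a few times per hundred words.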

Doorways

Gray optimization methods include the creation of "gray" and "white" doorways:

  • "Grey doorways". Their main goal is to transfer the acquired indicators to the main site, that is, acting as donors. Unlike black views, gray ones have meaningful textual content and are inherently full-fledged sites.
  • "White Doorways". This type first of all, it carries an advertising load and does not violate the licenses of search engines. When creating them, original content is used and an attractive graphic design. When hitting the “white doorway”, the visitor sees advertising text with links to the promoted site and, if desired, can follow them.

Buying links

Buying links is one of the main ways to promote websites in search engines; in essence, it means placing links on third-party resources in order to raise the indicators of the promoted resource. As a rule, specialized link exchanges are used for this. Search engines fight this method, but using it does not usually result in a ban.

Search Engine Attitude

Gray optimization, unlike black, operates on the edge of the criteria established by search engines. That is why, in some cases, search engines may temporarily or permanently block such a site. In addition, the requirements they impose on resource promotion are constantly changing, and what was previously considered legitimate may become prohibited in the near future.

Any Internet resource, regardless of its specifics and when it was created, requires timely and effective promotion. Once this process has been launched, it should under no circumstances be suspended: doing so can nullify all past promotion efforts. Every day a great many websites appear on the global network, which makes for tough competition. That is why a site deprived of its owner's attention can easily get lost among tens of thousands of its counterparts.

Today the global network is full of offers from marketing agencies and individual webmasters engaged in search engine promotion of sites. However, before using the services of a particular specialist, it is worth asking what methods he intends to use to promote your web resource. There are several basic categories of methods for bringing a website to the top positions in search engines.

Firstly, this is a "white" technique, which involves the use of events officially authorized by search engines. Website promotion that uses prohibited methods is called "black" and, despite its effectiveness at the beginning, can lead to negative consequences in the future. A web resource promoted using this approach will most likely be blocked by search engines in the near future, which, in turn, will lead to a complete collapse of the company's Internet presence on the network.

"Grey" methods website promotion: on the border of permitted and prohibited

Valuing their reputation, experienced optimizers try not to use "black" promotion methods, which do not make for promising, long-term cooperation with search engines. Most often, however, the optimal solution lies right on the border between prohibited and legitimate promotion methods.

This approach is most often called "gray", because promotion is carried out without techniques that clearly violate the generally accepted rules. At the same time, when "gray" promotion methods are used, there is no guarantee that the resource will not eventually be blocked because of changes in search algorithms.

"Gray" website promotion: main positions

The "gray" methods of website promotion include:

  1. Frequent use of emphasis tags (such as bold and strong markup), which influence the ranking algorithms of search robots. This technique slightly raises the site's rating, but the effect does not last.
  2. Exchanging non-thematic links. This promotion method draws users' attention to the resource by "throwing" them links whose subject matter has nothing to do with the resource's content. A website promoted this way will most likely see only one-time demand: a user who clicks a link and does not find what he was looking for will never look at such a page again.
  3. Purchasing links from other websites. This technique, recognizable by links inserted out of place in the text, is widely used by services such as XAP and SAPE.
  4. Using automatic content exchange services. These include resources such as outlink.ru, addpage.ru, and others. It should be noted that at present search engines cannot reliably detect sites participating in such services.
  5. Using paid link brokers to buy external links. The most popular search engines can recognize low-quality links, whose numbers grow exponentially, so this technique rarely achieves any significant result.
  6. Exchanging non-thematic articles. The effect is roughly the same as with non-thematic links; the difference is that the site owner will additionally need materials for publication, which also have to be paid for.
  7. Using links from non-existent web pages. Such links are taken from resources whose search-form scripts, for one reason or another, were developed with flaws. This method of website promotion works when the user enters certain characters into the search string along with the query of interest.
  8. Buying space for your content on other sites, or posting materials taken from other resources on the pages of your own website. Websites that sell article space risk being penalized by search engines, in particular because the posted articles contribute little to ranking. At the same time, external links pointing to the sites that buy placement for their material may not be counted by search robots during indexing.

The use of "gray" methods in the process of site promotion

Despite the dubious nature of "gray" website promotion methods, they continue to be widely used by many webmasters. They serve as short-term tactical measures aimed at achieving the quick results needed for further promotion.

By skillfully combining "white" and "gray" methods, an experienced optimizer can achieve a better result in a short time than by using white methods alone. It should be noted, however, that inept use of "gray" promotion methods can lead to all kinds of trouble.

Search engine filters are constantly being improved, which sooner or later leads to a drop in a web resource's rating and, in especially severe cases, to exclusion from the search results. Such a responsible task as website promotion should therefore be entrusted only to professionals who can foresee the consequences of the steps they take.

There are also cases when the border between black and gray methods is erased and the resource is blocked permanently or for a certain time.

Gray optimization methods include:

  • too high keyword density;
  • buying external links to the site;
  • doorways without a redirect (the user is not forwarded to the site).

Too high keyword density

Search robots calculate the relevance of a page based on the frequency of a keyword identical to the query: the higher the frequency, the more relevant the resource is considered. Naturally, optimizers took this into account and began to oversaturate texts with keywords, which made them unreadable.

As search engine algorithms changed, sites with such content began to lose positions in search results and fall under filters.

The optimal ratio of keywords in a text, according to most SEOs, is 3-5%. Unfortunately, search bots cannot read texts the way users do, so SEOs stuff content with keywords "to the eyeballs". In doing so they violate the rules, and the site risks falling under a filter.

Doorways

Gray doorways are donor sites whose main function is to transfer parameters to the recipient site. They contain quality content, which is what actually distinguishes them from black doorways.

Buying links

This is one of the most common gray SEO techniques. The site owner buys links on other sites, trying to raise the PR and TIC of the main site; there are even exchanges offering such services. Although search engines disapprove of this, in most cases such measures to boost a resource's indicators do not lead to a ban.

Compared to black methods of site promotion, gray optimization sits on the edge of the criteria defined by search engines. Therefore, in some cases, search engines may temporarily or even permanently block a site that resorts to such methods. In addition, the requirements that search engines impose change over time toward stricter quality criteria: what was quite legal not long ago may very soon become a violation.

From the book Internet Marketing: The Complete Collection of Practical Tools by Fedor Yurievich Virin

"Gray" optimization methods

"Gray" optimization methods

Some optimization methods are "illegal" (from the point of view of search engines), but they allow you to achieve a quick, sometimes immediate effect. These methods are collectively referred to as "spamdexing". When using "gray" optimization methods, the site can be "banned" by the search engine overnight, and then returning it back to the index will be quite a difficult task. In the event that the site is excluded from the index for violations, it is necessary to contact the search engine support service, ask for clarification, possibly make corrections - in general, enter into negotiations.

Most often, such optimization methods are used by fly-by-night optimization companies whose goal is to make money quickly and disappear from the market. Indeed, the results of such a company's work are visible very quickly, almost immediately - never mind that in a few days or weeks these results will turn to dust, and the customer will have to negotiate with the search engines at length and then start all the optimization work over from the beginning.

The situation is no better when a company employee, having read various forums and picked up knowledge from optimizers' blogs, begins to optimize the company's website without imagining the consequences his actions may have. The problem is aggravated by the fact that the same person will later have to communicate with the search engine's moderator to resolve the dispute. The outcome of such a conversation is unpredictable.

Therefore, it is very useful to understand which optimization methods search engines classify as "gray" or "black". Remember also that the main tool for combating cheating is competitors, who will themselves tell the search engine's moderator about any violations they find.

So, the "gray" optimization methods include:

1) STUFFING (padding) - filling the page with keywords and expressions that carry no semantic load in the context of the site and are most often invisible to the user. I already mentioned this term above. Stuffing most often looks like small text, consisting of the same word or a small set of words, at the bottom of the page. Quite often the text is given the same color as the background. Search engines catch this trick automatically, but only if it is done with standard HTML tools rather than with CSS.

Stuffing is one of the oldest methods of quick optimization. Today it does not bring impressive results, since the frequency of a phrase's occurrence in the text carries far less weight than it used to. The method is still used often, however, because the search engines' automatic filters do not always catch it.

There is also a completely "allowed" variant of stuffing: placing links to the main product groups on the home page and on all other pages, as is done, for example, on the site www.pleer.ru. This variant increases the relevance of the page. An example of stuffing is shown in Fig. 5.13;

Fig. 5.13. Example of stuffing on the page www.cy-pr.ru/78-nedorogaya_raskrutka.html
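The HTML-level variant of the trick, text whose color matches the background, is exactly what automatic filters can catch. Below is a minimal detector sketch in Python; the old-style bgcolor/color attributes and the sample markup are assumptions made for the illustration, not any search engine's real code:

```python
# Hypothetical detector for the classic HTML-only hiding trick described
# above: <font> text whose color attribute matches the <body> bgcolor.
from html.parser import HTMLParser

class HiddenTextDetector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.bgcolor = None        # page background color, once seen
        self.suspicious = []       # text fragments hidden by color match
        self._hiding_depth = 0     # >0 while inside a color-matching tag

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "body":
            self.bgcolor = (attrs.get("bgcolor") or "").lower()
        elif tag == "font" and self.bgcolor:
            if (attrs.get("color") or "").lower() == self.bgcolor:
                self._hiding_depth += 1

    def handle_endtag(self, tag):
        if tag == "font" and self._hiding_depth:
            self._hiding_depth -= 1

    def handle_data(self, data):
        if self._hiding_depth and data.strip():
            self.suspicious.append(data.strip())

page = ('<body bgcolor="#ffffff">Visible text '
        '<font color="#FFFFFF">cheap phones cheap phones</font></body>')
detector = HiddenTextDetector()
detector.feed(page)
print(detector.suspicious)  # ['cheap phones cheap phones']
```

A CSS-based version, with the color set in a stylesheet, would slip past this naive check, which matches the observation above that the trick is caught automatically only when done with standard HTML tools.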

2) CLOAKING - showing the search robot that indexes the site one content and the ordinary user another. In other words, the user sees one site and the indexing robot sees another. The separation happens at the level of the user's IP address or User-Agent header: a special script on the server identifies the visitor by certain criteria and serves a specially prepared page. This method, which bears such a discordant name, is often used by sites not to deceive search engines at all, but to make life easier for users: for example, showing the user the site in the language set as primary in his browser, or, as many news sites do, showing a different set of news depending on the user's current location.
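Mechanically, cloaking requires nothing more than a server-side branch on the request headers. Here is a minimal sketch assuming User-Agent-based separation; the crawler token list and page texts are invented for the illustration:

```python
# Minimal cloaking sketch (illustration only): serve one page to crawlers
# and another to human visitors, branching on the User-Agent header.
from http.server import BaseHTTPRequestHandler, HTTPServer

CRAWLER_TOKENS = ("googlebot", "yandex", "bingbot")  # assumed token list

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        if any(token in ua for token in CRAWLER_TOKENS):
            body = "<html>Keyword-rich page shown only to robots</html>"
        else:
            body = "<html>Normal page shown to human visitors</html>"
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```

The same branch, keyed on the client's IP range or Accept-Language header instead, is what makes the legitimate uses described above (language or regional versions of a site) technically indistinguishable from deception.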

However, search engines consider this behavior incorrect and sometimes ban such sites, despite the ambiguity of the practice. At the same time, no search engine's automatic system can yet reliably detect cloaking, so all suspicious cases are identified by moderators or, again, by competitors. All such cases are handled manually, which somewhat reduces the risk of automatic shutdown, although moderators do sometimes make mistakes;

3) DOORWAY - creating a large number of separate, independent pages optimized for low-frequency queries that redirect visitors to the site being optimized. Thus it is not the site itself that is optimized, but pages that have nothing to do with it, hosted elsewhere under other domain names. This kind of optimization is very noticeable to anyone who actively uses search engines, although its forms can be quite bizarre: pages that automatically redirect to a completely different site, blank pages with a single link containing the query the user entered and leading who knows where, entire pages of link lists that sometimes have nothing to do with what you were looking for. Doorways are non-thematic pages designed solely to "forward" the visitor to the desired address.

Recently, doorways have more often been used to send visitors straight to advertising links. Collecting free clicks on low-frequency queries in search engines, they funnel users to pages full of ads, such as Begun ad pages.

Search engines are very tough on doorways, because doorways genuinely worsen search results, littering them with countless clones. Today doorways are created automatically by scripts that can produce thousands and hundreds of thousands of pages in a matter of hours, and this problem is very real for search engines. To grasp the scale of the disaster, it is enough to recall two cases from 2006. Around the middle of that year, Google's search results were literally flooded with several million doorways created by one enterprising Belarusian optimizer; some of the pages he created are still functioning, bringing traffic to someone. And in the early spring of 2006, Yandex editor-in-chief Elena Kolmanovskaya noted in an interview that the number of doorways that had appeared in the search engine's index over the previous two months exceeded their number in all the preceding years of its operation. It was in 2006 that doorway builders' activity rose sharply, and the situation remains depressing today.
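To see why script-generated doorways scale so easily, here is a deliberately tiny generator sketch; the queries, the target URL, and the page template are all invented for the illustration:

```python
# Toy doorway generator (an illustration of the scale problem, not a
# recipe): one HTML file per low-frequency query, each instantly
# redirecting the visitor to the promoted site via meta refresh.
import pathlib

PROMOTED_SITE = "http://promoted-site.example/"   # assumed target
QUERIES = ["buy cheap widgets", "widgets price moscow", "widget repair"]

TEMPLATE = """<html><head>
<title>{query}</title>
<meta http-equiv="refresh" content="0; url={target}">
</head><body><h1>{query}</h1><p>{query} {query} {query}</p></body></html>"""

out = pathlib.Path("doorways")
out.mkdir(exist_ok=True)
for query in QUERIES:
    name = query.replace(" ", "-") + ".html"
    page = TEMPLATE.format(query=query, target=PROMOTED_SITE)
    (out / name).write_text(page, encoding="utf-8")
print(f"generated {len(QUERIES)} doorway pages in {out}/")
```

Looping such a template over a harvested list of low-frequency queries is what lets doorway builders emit hundreds of thousands of pages in hours; a "doorway without a redirect", the gray variant mentioned earlier, simply omits the meta refresh line.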

For frequent search queries (those asked thousands, tens of thousands, or hundreds of thousands of times a month), doorways are not visible to the user: they do not rise into the top ten results, or even the first hundred. The reason, of course, is the activity of optimizers, who use far more serious methods than doorways do. But for low-frequency queries, where there is little or no optimizer activity, doorways thrive; there they can be found in almost every set of results.

Unfortunately, there is no unambiguous and effective way to identify doorways and remove them from the search index. Some obvious cases are identified and removed by robots, and a number of pages are removed after user complaints, but technology does not stand still for spammers any more than it does for search engines. So far, the doorways are winning this battle.

The largest doorway networks, combining several million pages that collect traffic from different search engines, are created by special services that sell traffic to interested sites. Today these are powerful agencies with ample funds for developing their technologies. An example of a classic doorway can be seen in Fig. 5.14;

4) NEPOTISM - the mutual exchange of links in order to raise the citation index (PageRank) without the links carrying any meaning, that is, without thematic necessity. Once mechanisms that determine a page's relevance (weight) from the volume of its citation came into active use, whole burial grounds of links appeared on sites, with mutual links to tens, hundreds, and sometimes thousands of sites. This really did, and still does, increase PageRank, albeit not by much; the insignificance of each increase is redeemed by the number of links, that is, by the number of those insignificant increases.
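The mechanics are easy to reproduce with the publicly known PageRank scheme. The toy simulation below runs a standard, simplified power iteration (damping factor 0.85) on an invented four-page graph and shows how three sites linking "in a circle" accumulate weight while a page nobody links to does not:

```python
# Toy PageRank power iteration on an invented link graph: three sites
# exchanging links "in a circle" versus one page that nobody links to.
# Standard simplified PageRank; real engines add many refinements.
d = 0.85  # damping factor

links = {              # page -> pages it links out to
    "ring1": ["ring2"],
    "ring2": ["ring3"],
    "ring3": ["ring1"],
    "lonely": ["ring1"],
}
pages = list(links)
pr = {p: 1.0 / len(pages) for p in pages}  # uniform starting scores

for _ in range(50):    # iterate until the scores settle
    new = {p: (1 - d) / len(pages) for p in pages}
    for src, outs in links.items():
        share = d * pr[src] / len(outs)
        for dst in outs:
            new[dst] += share
    pr = new

for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
# ring members settle around 0.31-0.33 each; "lonely" stays near 0.038
```

Each individual link contributes little, exactly as the text says, but a ring member's score here is roughly eight times that of the unlinked page: the insignificant increments add up.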

As a result, whole "bunches" of sites intertwined with commercial links appeared on the Internet, and "link exchange" at some point became a very active business; at one point my own long-forgotten site received more than a dozen "let's exchange links" messages every day. The activity has now subsided somewhat, and links are mostly bought: permanent placement of a link on a popular site is purchased, since that is simpler, more reliable, and easier to control than an exchange.

Search engines fight nepotism because, not without reason, they see in it an artificial inflation of a site's "weight" unjustified by real significance. To date there is no reliable system for recognizing mutual link exchange among more than three sites linking "in a circle", so such sites are most often identified through complaints from competitors.

Importantly, one of the methods search engines use to combat nepotism is reverse citation indexing: lowering the weight of sites that receive links from such "link dumps" (those known to the search engine, of course). This relatively recent innovation requires a more careful approach to working with links.

Spamdexing is used very often today, and up to a certain point it is, of course, quite effective; otherwise it would not be used. But if a site's operation is planned not for a month but for years ahead, spamdexing is not worth using: in the long run it does not bring a positive result, first of all (as I have already noted) because of the high activity of competitors.
