Semantic core - how to make it right? Expanding the semantic core. Apps and services for automatic grouping of search queries

You have probably come across this term more than once, on company websites and in personal blogs alike. Yet not everyone takes the trouble to explain what it actually means. So I decided to write a detailed article about what the semantic core is, why you need it, and how to build and use it for your site.

The semantic core of a site is the list of keywords and phrases for which the project as a whole, and each of its pages, is optimized. When compiling it, keep in mind that it should contain only targeted queries, that is, queries relevant to the business or the subject of the content project.

What is the semantic core for?

If we are talking about a new site, it is the basis for designing its structure and future pages and for forming a content plan. For an existing web project, the semantic core helps improve the optimization of sections and pages and add new landing pages, which is what grows traffic from search engines.

A page that isn't optimized for any keywords brings essentially no benefit to the site owner. Search engines do not know which user queries it should appear for, or what value it offers. So even if a site has hundreds of such pages, its traffic may be negligible.

Therefore, words from the semantic core are used to optimize both the article titles and the rest of the text, including images.

How to build a semantic core

Specialized services and programs are used for this. To collect the most complete list of keywords, you need to combine several sources; only then will you end up with a truly comprehensive set of phrases.

This matters because competition in search results keeps getting tougher, and in some topics hundreds of sites fight for the Top 10. The more popular the keywords a page targets, the stronger the competition, and the harder it is to win traffic for those queries. At the same time, there are thousands of low-frequency queries that bring traffic but that most competitors, for whatever reason, ignore.

That is why a good semantic core is assembled using various programs and services, including those designed for competitor analysis. In my own work, I combine the following sources depending on the task:

  • Key Collector
  • Keyword databases
  • Competitor analysis services

Key Collector Program

An indispensable tool for an SEO specialist: a program that is a true all-in-one harvester for working with keywords. It collects keywords from the Yandex and Google statistics services and from search suggestions, and parses data from a number of other sources.

Separately, it is worth noting its tools for effective filtering and clustering of queries, and for evaluating key phrases by several dozen parameters.

Keyword databases

Unlike programs that collect keywords on demand, databases already contain a huge number of queries, which lets you quickly make the selection you need and saves a lot of time. The two most popular products of this kind are Pastukhov's Database and Bukvarix.

Each of them already contains more than a billion queries. The difference is that Pastukhov's Database provides convenient tools for working with keywords (filtering, sorting, and so on), while Bukvarix essentially only lets you export selections, which then have to be processed in other programs. On the other hand, its authors let you use it for free.

Competitor analysis services

What could be more tempting than learning a competitor's semantic core? Sooner or later this idea occurs to every specialist, and there are indeed services that can partially solve the problem. Examples of such services: Prodvigator.ua, Megaindex.ru, Semrush.com, and a number of others.

With their help, you can see which keywords a given site ranks for in search, and even which queries bring visitors to specific pages.

Of course, this will only be partial information, since the complete data is visible only to the resource's owner. But even this is sometimes enough to expand your own semantic core and uncover non-obvious queries that bring traffic.

If you only need to collect a small list of queries, say, to write a single article or optimize a category in an online store, you can use services from the search engines themselves.

For such purposes, Google AdWords' Keyword Planner and Yandex's keyword selection service (Wordstat) will do.

By combining the use of all the listed tools, depending on the task, you can collect the most complete semantic core for your project.

Still, as you can see, I only list the programs and services that can be used; I am not even trying to write detailed instructions for each of them. A proper guide to Key Collector alone would be extensive enough to fill a book.

Therefore, if you have no time to study all this, the most sensible decision is to order the site's semantic core from professionals. Our project team provides this service too, and you can be sure of the quality of the work. You can contact us through the form.

What to do after compiling the semantic core?

When the parsing of keywords from all these sources is finished, you need to clean the list of dummy words, near-duplicates, and irrelevant queries, then cluster the rest and distribute them across the site's pages. Or, if we are talking about a new project, design the structure based on the collected words and phrases.

Dummy words are queries with a very large gap between their broad-match and exact-match frequencies, or whose exact-match frequency is simply zero.

Such phrases have no value; pages optimized for them will not bring any traffic.
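To make this rule concrete, here is a small Python sketch of how you might flag dummy words automatically. The frequencies and the 5% threshold are made-up illustrations, not official values:

```python
# Flagging "dummy" keywords: phrases whose broad-match frequency is high
# but whose exact-match frequency is near zero. The 5% threshold below
# is an illustrative assumption, not a universal rule.

def is_dummy(broad_freq: int, exact_freq: int, min_ratio: float = 0.05) -> bool:
    """Return True if the exact-match share of a query is suspiciously low."""
    if exact_freq == 0:
        return True  # nobody types this exact phrase at all
    return exact_freq / broad_freq < min_ratio

# (query, broad-match frequency, exact-match frequency) - invented numbers
keywords = [
    ("buy mountain bike", 1200, 390),      # healthy ratio, keep
    ("bike mountain best cheap", 800, 2),  # dummy: broad count inflated
    ("wordpress plugin seo", 500, 0),      # dummy: zero exact frequency
]

kept = [(kw, b, e) for kw, b, e in keywords if not is_dummy(b, e)]
print(kept)
```

Only the first query survives the filter; the other two would waste a page each.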

Clustering is the distribution of keywords into thematic groups by meaning or by their main word. There are also specialized services that cluster queries based on search results. Still, I hold the view that the highest-quality grouping can only be done manually or semi-automatically. An example of the result of such work:
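Since I mentioned that grouping can be done semi-automatically, here is a deliberately naive Python sketch of grouping by a query's main word. Real clustering tools compare search results instead; the stop-word list and sample queries here are purely illustrative:

```python
from collections import defaultdict

STOP = {"a", "the", "for", "in", "how", "to"}  # tiny illustrative stop list

def cluster_by_main_word(queries):
    """Naive semantic grouping: bucket queries by their first non-stop token.
    This only mimics the manual 'group by main occurrence' step; production
    clustering usually compares shared URLs in the SERPs instead."""
    clusters = defaultdict(list)
    for q in queries:
        tokens = [t for t in q.lower().split() if t not in STOP]
        key = tokens[0] if tokens else q
        clusters[key].append(q)
    return dict(clusters)

queries = [
    "wordpress plugins for seo",
    "wordpress plugin cache",
    "mountain bike price",
    "mountain bikes for beginners",
]
print(cluster_by_main_word(queries))
```

The output has two buckets, "wordpress" and "mountain", each a candidate section or page.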

Each cluster is the basis for optimizing a section or an individual page of the site. That is why it is so important that the requests are selected as correctly as possible.

As I wrote at the very beginning, the semantic core is used to design a site structure or distribution within an existing one. Mind maps are great for visualizing the structure of a project. Take a look at an example of one of the projects:

How to order a semantic core and how much does it cost

As I already wrote, collecting key queries is an important part of search engine optimization and promotion. Doing this work well requires experience and skills, and above all specialized tools and time. If you are not confident in your abilities, or simply lack the time, you can order a ready-made semantic core from us by sending a request.

Our specialists will contact you to clarify the details and determine the cost and timeline of the work. If necessary, we will also advise you on the correct distribution of queries across the site's pages, and give other useful recommendations.

Hello everyone!

What should you do with the semantic core? Probably every newcomer to SEO asks this question (I certainly did), and for good reason. At the initial stage, you simply do not understand why you spent so long collecting keywords for the site with various tools. Since I also struggled with this question for a long time, I will devote a separate lesson to the topic.

What is the purpose of assembling the semantic core?

First, let's figure out why we collected the semantic core at all. All SEO promotion is built on the keywords that users type into search boxes. They determine things like the site's structure and its content, which are in fact the main factors in search promotion.

Also, do not forget about external optimization, in which the semantic core plays an important role. But more on that in the next lessons.

To summarize: the semantic core (SC) is necessary for:

  • Creating a site structure that is understandable to both search engines and ordinary users;
  • Creating content. Content today is the main way to promote a site in the search results: the higher its quality, and the more quality content there is, the higher the site ranks. More on creating quality content later;

What to do with the semantic core after compilation?

So, after you have compiled the semantic core (collected, cleaned, and grouped the keywords), you can start forming the site's structure. In fact, when you grouped the queries as we did in lesson # 145, you already created the structure of your web resource:

You just need to implement it on the site, and that's it. This way you form the structure not from whatever happens to be in your product range, but from actual consumer demand. You will not only benefit the web resource in terms of SEO, but also do the right thing for the business as a whole. It is not for nothing that they say: where there is demand, there must be supply.

We seem to have figured out the structure; now let's move on to content. By grouping the queries in Key Collector, you have also found topics for the future content that will fill your pages. For example, let's take the "Mountain Bikes" group and break it down into smaller subgroups:


Thus, we have created two subgroups with key queries for separate pages. Your task at this stage is to form groups (clusters) so that each cluster contains semantically identical keywords, that is, identical in meaning.

Remember one rule: each cluster has a separate page.

Grouping is, of course, not very convenient for beginners, since it takes a certain skill, so I will show you another way of forming article topics. This time, let's use Excel:


Already on the basis of the resulting data, you can form individual pages.

This is how I do clustering (grouping), and it suits me perfectly. I think you now understand what to do with the semantic core after compiling it.

Perhaps the example in this lesson is too general and does not give a specific picture. I just want to convey the essence of the process; after that, you will have to use your own head. So I apologize in advance.

If this lesson became useful for you and helped in solving the problem, then please share the link on social networks. And, of course, subscribe to blog updates if you haven't already.

Good luck, friends!

See you soon!


The semantic core of a site is the set of keywords and phrases that most fully describe the site's subject and focus.

Compiling the semantic core is the second most important step in creating a website, after choosing its topic. The entire future promotion of a new site in search engines depends on how the semantic core is compiled.

At first glance, it is not difficult to find keywords for a website. But this process has a large number of nuances that must be taken into account when compiling a semantic core.

In today's article we will try to understand all the features of compiling the semantic core.

What is the semantic core for?

The semantic core is important for a site of any subject, from an ordinary blog to an online store. Blog owners need to work constantly on improving their search rankings, and semantic-core keywords play a major role in this. Online store owners need to know how buyers search for the products the store sells.

To bring your site into the TOP-10 of search results, you need to compose the semantic core correctly and optimize high-quality, unique content for it. Without unique content, there is no point in even discussing the benefits of a semantic core. Each unique article should be optimized for one query or several similar ones.

How to make the semantic core of the site?

One of the online keyword-selection services can help with compiling the site's semantic core. Almost every search engine has one: Yandex has Wordstat, Google has Google AdWords, and Rambler has Rambler Adstat. In these services, you can select the main keywords on a specific topic in various word forms and combinations.

In Wordstat's left column, you can see the number of queries per month, not only for a given word but also for its various combinations (phrases). The right column contains statistics for the queries that users searched for together with the given one. This information can be useful for creating relevant content for the site.

Also in Yandex Wordstat, you can select a specific region in order to find out about the statistics of requests only for a specific region. Such information can be useful to companies that provide services only within one region.

The Key Collector program can also help compose the semantic core. With it, you can quickly collect all the keywords and determine their effectiveness and competitiveness. The program can also analyze a site for compliance with its semantic core.

The main drawback of Key Collector is that it is paid. The program costs 1500 rubles.

To form a semantic core, you can also use the drop-down suggestions in search engines. If you type "semantic core" into Google, it will offer several more keywords related to the entered query.
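If you want to collect such suggestions programmatically, you can build a request to Google's unofficial suggest endpoint. Here is a Python sketch; note that this endpoint is undocumented and may change at any time, so treat its URL and parameters as an assumption. The actual fetching and JSON parsing are deliberately left out:

```python
from urllib.parse import urlencode

# Google's unofficial suggest endpoint (the one browser search boxes use).
# This function only builds the request URL; fetching it and parsing the
# JSON response is left as an exercise, since the API is not guaranteed.

def google_suggest_url(query: str, lang: str = "en") -> str:
    params = urlencode({"client": "firefox", "hl": lang, "q": query})
    return "https://suggestqueries.google.com/complete/search?" + params

print(google_suggest_url("semantic core"))
```

Each suggestion returned by such a request is itself a candidate key for the core.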

What to do with the semantic core of the site?

After compiling the list of keywords, divide it into conditional groups by query frequency. All search queries fall into three classes: high-frequency, mid-frequency, and low-frequency.

It is best to arrange the semantic core as a table, with the high-frequency queries at the top, the mid-frequency ones below them, and the low-frequency ones lower still. Words and phrases in each subsequent row should be similar in subject matter and morphology.
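Such a table is easy to build in code as well. Here is a Python sketch; the HF/MF/LF thresholds and the frequencies are invented for illustration, since the real boundaries depend on the niche:

```python
def band(freq: int) -> str:
    """Classify a query by monthly frequency. Thresholds are illustrative;
    real HF/MF/LF boundaries vary from niche to niche."""
    if freq >= 10000:
        return "HF"
    if freq >= 1000:
        return "MF"
    return "LF"

# (query, monthly frequency) - sample data
queries = [
    ("buy bike", 25000),
    ("buy mountain bike", 3000),
    ("buy mountain bike moscow", 150),
]

# High-frequency queries at the top, low-frequency at the bottom
table = sorted(queries, key=lambda kv: kv[1], reverse=True)
for q, f in table:
    print(f"{band(f):3} {f:>6}  {q}")
```

This prints the HF row first and the LF row last, mirroring the table layout described above.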

A properly composed semantic core of the site can greatly facilitate the further promotion of the site in search engines. Website promotion affects its traffic, and with it, income.

"Semantic core" is a scary name that SEOs have come up with for a fairly simple thing: we just need to select the key queries for which we will promote our site.

In this article I will show you how to compose the semantic core correctly so that your site moves into the TOP quickly instead of stagnating for months. There are a few "secrets" here as well.

But before we move on to compiling the SC, let's look at what it is and what we should end up with.

What is the semantic core in simple words

Oddly enough, the semantic core is just an ordinary Excel file containing a list of the key queries for which you (or your copywriter) will write articles for the site.

For example, this is what my semantic core looks like:

I have marked in green the keywords for which I have already written articles, in yellow those I am going to write about in the near future, and left colorless the queries whose turn will come a little later.

For each key query I have determined the frequency and competitiveness, and come up with a "catchy" headline. You should end up with roughly the same kind of file. My SC currently consists of 150 keywords, which means I have "material" for at least 5 months ahead (even if I write one article a day).

Below we will discuss what to expect if you decide to order the collection of the semantic core from specialists. In short: you will get the same kind of list, only for thousands of "keys". In an SC, however, it is not quantity that matters but quality, and that is what we will focus on.

Why do you need a semantic core at all?

And really, why all this torment? Can't you just write quality articles and attract an audience that way? You can write them, but you will not attract anyone.

The main mistake 90% of bloggers make is just writing high-quality articles. I'm not kidding, they have really interesting and useful materials. But search engines don't know about it. They are not psychics, they are just robots. Accordingly, they do not put your article in the TOP.

There is another subtle point here, concerning the headline. Say you have a very high-quality article on the topic "How to do business on the face-book". There you describe everything about Facebook in great detail and professionally, including how to promote communities there. Yours is the highest-quality, most useful and interesting article on this topic on the Internet; nothing else even comes close. But it still won't help you.

Why high-quality articles fly out of the TOP

Imagine that your site is visited not by a robot but by a live inspector (an assessor) from Yandex. He realizes you have the coolest article and manually puts you in first place in the results for the query "Promoting a community on Facebook".

Do you know what happens next? You will fly out of there very soon, because no one will click on your article even in first place. People type the query "Promoting a community on Facebook", and your headline reads "How to do business on the face-book". Original, fresh, funny, but... not what they asked for. People want to see exactly what they searched for, not your creative flourish.

Accordingly, your article will only idle in its place at the TOP of the SERP. And the living assessor, an ardent admirer of your work, can beg his superiors all he wants to keep you at least in the TOP-10, but it won't help. All the first places will be taken by articles as empty as sunflower-seed husks, copied from one another by yesterday's schoolchildren.

But those articles will have the correct, "relevant" headline: "Promoting a Facebook community from scratch" (step by step, in 5 steps, from A to Z, free, etc.). Offensive? You bet. Well then, let's fight the injustice: let's put together a competent semantic core so that your articles take the first places they deserve.

Another reason to start compiling the SC right now

There is one more thing that people somehow rarely think about. You need to write articles often: at least every week, and preferably 2-3 times a week, to gain more traffic, faster.

Everyone knows this, but almost no one does it. And all because of "creative stagnation", "I just can't force myself", "plain laziness". In fact, the whole problem is precisely the absence of a specific semantic core.

Step # 1 - Selecting Base Keys

I typed one of my basic keys, "smm", into the search field, and Yandex immediately gave me a dozen suggestions of what else might interest people who are interested in "smm". All I have to do is copy these keys into a notebook. Then I will check each of them the same way and collect their suggestions as well.

After the first stage of collecting the SC, you should have a text document with 10-30 broad basic keys, which we will work with further.

Step # 2 - Parsing Base Keys in SlovoEB

Of course, if you write an article for the query "webinar" or "smm", no miracle will happen. You will never reach the TOP for such a broad query. We need to split the basic key into many small queries on the topic, and we will do this with the help of a special program.

I use Key Collector, but it is paid. You can use its free analogue, the SlovoEB program, which you can download from the official website.

The hardest part of working with this program is configuring it correctly. I show how to properly configure and use SlovoEB in another article, but there I focus on selecting keys for Yandex Direct.

Here, let's walk step by step through using this program to compile a semantic core for SEO.

First, create a new project and name it after the broad key you want to parse.

I usually give the project the same name as my basic key so I don't get confused later. And let me warn you against one more mistake: don't try to parse all your basic keys at once, or it will be very hard to separate the "empty" keywords from the grains of gold. Parse one key at a time.

After creating the project, we carry out the basic operation: we parse the key through Yandex Wordstat. To do this, click the "Wordstat" button in the program interface, enter your basic key, and click "Start collection".

As an example, let's parse the basic key for my blog, "contextual advertising".

The process will start, and after a while the program will return the result: up to 2000 key queries containing "contextual advertising".

Next to each query you will also see its "dirty" frequency: how many times this key (plus its word forms and tails) was searched per month on Yandex. But I advise you not to draw any conclusions from these numbers.

Step # 3 - Collecting Accurate Key Frequencies

Dirty frequency tells us nothing. If you rely on it, don't be surprised when a key with 1000 "queries" doesn't bring you a single click per month.

We need to identify the pure (exact) frequency. To do this, first select all the found keys with checkmarks, then click the "Yandex Direct" button and start the process again. Now SlovoEB will find the exact monthly frequency for each key.

Now we have an objective picture of how many times each query was actually entered by users in the last month. I propose grouping all the key queries by frequency to make them easier to work with.

To do this, click the filter icon in the "Frequency '!'" column and set the condition: keys with a value less than or equal to 10.

The program will now show only the queries whose frequency is less than or equal to 10. You can delete them or copy them to another group of key queries for later. Less than 10 is very little; writing articles for these queries is a waste of time.
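The same "less than or equal to 10" filter is trivial to reproduce outside the program. A Python sketch with made-up frequencies:

```python
# Sample exact frequencies (invented numbers) for keys found earlier
keys = {
    "contextual advertising setup": 320,
    "contextual advertising what is it": 90,
    "contextual advertising for dentists 2015": 7,
}

# Reproduce the program's filter: keep only the queries worth an article,
# set aside those searched 10 times a month or fewer.
kept = {k: v for k, v in keys.items() if v > 10}
dropped = {k: v for k, v in keys.items() if v <= 10}
print(kept)
print(dropped)
```

The third key lands in `dropped`, exactly as the filter in the program would hide it.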

Now we need to choose the keywords that will bring us more or less decent traffic. For that, we need to find out one more parameter: the query's level of competition.

Step # 4 - Checking Query Competition

All "keys" in this world fall into 3 types: high-frequency (HF), mid-frequency (MF), and low-frequency (LF). They can also be highly competitive (HC), medium-competitive (MC), and low-competitive (LC).

As a rule, HF queries are also HC. That is, if a query is searched often, there are a lot of sites that want to rank for it. But this is not always the case; there are happy exceptions.

The art of compiling a semantic core lies precisely in finding queries with high frequency and low competition. Determining the level of competition manually is very difficult.

You can look at indicators such as the number of home pages in the TOP-10, the length and quality of their texts, and the trust level and metrics of the sites ranking for the query. All of this will give you some idea of how tough the fight for positions is for this particular query.

But I recommend using the Mutagen service. It takes into account all the parameters I mentioned above, plus a dozen more that neither you nor I have probably even heard of. After the analysis, the service gives an exact value: the level of competition for this query.

Here I checked the query "setting up contextual advertising in google adwords". Mutagen showed that this key has a competition level of "more than 25", the maximum value it reports, and only 11 views per month. So it definitely doesn't suit us.

We can copy all the keys picked up in SlovoEB and run a bulk check in Mutagen. Then we only have to scan the list and take the queries with many searches and a low level of competition.
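That final selection step can be sketched in a few lines of Python. The frequencies, competition scores, and thresholds below are invented for illustration; Mutagen itself does not expose this exact data structure:

```python
# (query, exact monthly frequency, Mutagen-style competition score 1..25+)
# All numbers are invented for the example.
candidates = [
    ("setting up contextual advertising", 900, 18),
    ("contextual advertising for a plumber", 120, 4),
    ("google adwords contextual advertising setup", 11, 25),
]

# Assumed thresholds for a young site: enough searches, weak competition
MIN_FREQ, MAX_COMPETITION = 50, 10

picked = [q for q, f, c in candidates if f >= MIN_FREQ and c <= MAX_COMPETITION]
print(picked)
```

Only the middle query passes: it is searched often enough and almost nobody competes for it, which is exactly the kind of "happy exception" we hunt for.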

Mutagen is a paid service, but you can run 10 checks a day for free, and the cost of a check is very low. In all my time working with it, I have not yet spent even 300 rubles.

By the way, regarding the level of competition: if your site is young, it is better to choose queries with a competition level of 3-5. If you have been promoting for more than a year, you can take 10-15.

And regarding query frequency: we now need to take the final step, one that will let you attract plenty of traffic even from low-frequency queries.

Step # 5 - Collecting Tails for Selected Keys

As has been proven and verified many times, your site will get the bulk of its traffic not from the main keys but from the so-called "tails": odd queries that people type into the search box with a frequency of 1-2 per month, but of which there are a great many.

To see the "tail", just go to Yandex and enter your chosen keyword in the search bar. Here is roughly what you will see.

Now you just need to write out these additional words in a separate document and use them in your article. You do not need to always place them next to the main key; otherwise the search engines will see "over-optimization" and your articles will drop in the results.

Just use them in different places in your article, and you will get additional traffic from them as well. I would also recommend using as many word forms and synonyms of your main key query as possible.

For example, take the query "setting up contextual advertising". Here is how it can be reformulated:

  • Set up = configure, make, build, launch, enable, host...
  • Contextual advertising = context, Direct, teasers, YAN, AdWords, Display Network...

You never know exactly how people will search for information. Add all these additional words to your semantic core and use them when writing texts.
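A quick way to enumerate such reformulations is to combine the synonym lists mechanically. A Python sketch using the (illustrative) synonyms from above:

```python
from itertools import product

# Synonym lists for the two halves of the query (illustrative, not exhaustive)
verbs = ["set up", "configure", "launch", "enable"]
objects = ["contextual advertising", "context", "direct", "adwords"]

# Every verb paired with every object: 4 x 4 = 16 candidate phrasings
variants = [f"{v} {o}" for v, o in product(verbs, objects)]
print(len(variants))
for v in variants[:4]:
    print(v)
```

Not every combination will be a real query people type, so such a generated list still needs a frequency check before it goes into the core.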

This is how we collect a list of 100-150 keywords. If this is your first time compiling a semantic core, it may take you several weeks.

Or maybe it isn't worth straining your eyes over at all? Maybe you can delegate compiling the SC to specialists who will do it better and faster? Yes, such specialists exist, but you do not always need their services.

Should I order an SC from specialists?

By and large, semantic core specialists will only do steps 1-3 of our scheme for you. Sometimes, for a large additional fee, they will also do steps 4-5 (collecting the tails and checking query competition).

After that, they will give you several thousand keywords that you will need to work with in the future.

The question here is whether you are going to write the articles yourself or hire copywriters. If you want to focus on quality rather than quantity, you will have to write them yourself. But then a bare list of keys will not be enough: you will need to pick topics you know well enough to write a quality article about.

And here the question arises: why do we even need semantic core specialists? Admit it, parsing a basic key and collecting the exact frequencies (steps # 1-3) is not difficult at all; it will take you literally half an hour.

The hardest part is selecting HF queries with low competition. And now, as it turns out, you need HF-LC queries that you can actually write a good article about. This is exactly what will take up 99% of your time working on the semantic core, and no specialist will do it for you. So is it worth spending money on such services?

When the services of an SC specialist are useful

It is another matter if you plan to use copywriters from the start. Then you do not need to understand the subject of the query, and neither will your copywriters: they will simply take a few articles on the topic and compile their own text from them.

Such articles will be empty, shabby, and almost useless, but there will be many of them. On your own, you can write at most 2-3 high-quality articles a week, while an army of copywriters will deliver 2-3 mediocre texts a day. They will at least be optimized for the queries, which means they will attract some kind of traffic.

In that case, yes, feel free to hire an SC specialist, and let them draw up the briefs for the copywriters while they are at it. But you understand, this too will cost money.

Summary

Let's go over the main points in the article again to consolidate the information.

  • The semantic core is simply a list of keywords for which you will write articles to promote the site.
  • It is necessary to optimize texts for exact key queries; otherwise even your highest-quality articles will never reach the TOP.
  • The SC is like a content plan for social media: it helps you avoid a "creative crisis" and always know exactly what you will write about tomorrow, the day after, and in a month.
  • To compile the semantic core, the free SlovoEB program is convenient; you only need to configure it properly.
  • The five steps of compiling an SC are: 1 - selecting the base keys; 2 - parsing the base keys; 3 - collecting the exact frequency of the queries; 4 - checking key competition; 5 - collecting the "tails".
  • If you want to write the articles yourself, it is better to build the semantic core yourself, for yourself. SC compilers will not be able to help you here.
  • If you want to work for quantity and use copywriters, then you can certainly delegate compiling the semantic core as well, as long as there is enough money for everything.

I hope this tutorial was helpful. Save it to your favorites so as not to lose it, and share it with your friends. And don't forget to download my book, where I show the fastest way from zero to your first million on the Internet (an extract from 10 years of personal experience =)

See you later!

Yours Dmitry Novosyolov

Good day. Recently, a lot of letters in the following style have been arriving in my inbox:

  • "Last time I foolishly didn't manage to register for the marathon because I was on vacation, and there were few announcements as it was..."
  • "I saw that a course is being prepared; can you tell me the exact dates and how many classes there will be?"
  • "How much will the course cost? What format will the material be in? Is it a marathon or an electronic recording?"

I will try to answer some of the questions:

  1. I cannot give the exact release date of the course. It will definitely be in October, most likely at the end.
  2. The course will be on sale for at most 5 days. I will recruit a group that will be interesting for me to work with toward specific numbers, and then close access. So do not miss the registration dates.
  3. At the last marathon, some participants achieved incredible results (I will share the graphs in the following lessons), but only those who did all their homework and attended all the classes. So registration will be limited both in time and in numbers. Most likely, the first 30 will get some significant bonus.

For now, you can ask me a question by mail ( [email protected] site), in the comments or sign up for pre-registration by completing this survey.

Now let's move on to the tasty part. 🙂

The semantic core is assembled, what's next?

All SEOs keep repeating that you need to collect a semantic core for your site. This is certainly true, but, unfortunately, many do not even know what to do with it afterwards. Well, we collected it, what's next? I wouldn't be surprised if you belong to this category too. Some clients order a semantic core and then, even when it has been collected to the highest quality, throw the work down the drain. It makes me want to cry when I see this. Today I will tell you what to actually do with a semantic core.

If you haven't compiled one properly yet, here are links to my lessons on compiling a semantic core.

I will demonstrate everything with a simple example to make this tricky subject easier to grasp. Let's say we need to build a semantic core for a site about WordPress. Quite naturally, one of the headings of this site will be "WordPress Plugins".

By the way, when parsing keywords, do not forget to search for the phrases in their Russian spelling as well. That is, for the "WordPress plugins" category, you need to parse not only the Latin phrase "wordpress plugins", but also its Russian (Cyrillic) spelling. It often happens that a brand or product name is searched for in Russian transliteration even more often than in the original English spelling. Remember this.
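To make this concrete, here is a minimal sketch of generating both spelling variants of a brand phrase before sending it to the keyword parser. The transliteration table and function name are my own illustrative assumptions, not part of any real tool.

```python
# Hypothetical sketch: expand a phrase into every brand-spelling variant
# worth parsing. The tiny transliteration table is invented for illustration.

BRAND_VARIANTS = {
    "wordpress": ["wordpress", "вордпресс"],  # users search both spellings
}

def query_variants(phrase):
    """Return every spelling variant of the phrase to feed to the parser."""
    variants = [""]
    for word in phrase.lower().split():
        options = BRAND_VARIANTS.get(word, [word])
        variants = [f"{v} {o}".strip() for v in variants for o in options]
    return variants

print(query_variants("wordpress plugins"))
```

Each variant would then be parsed separately, so neither spelling's traffic is missed.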

After collecting the SC (short for "semantic core") for this heading, we get something like this Excel file:

As you can see, there are quite a few queries and they are all piled together. Next, we simply group them in Excel by cutting and pasting keywords that are similar in meaning, separating the groups with an empty line for clarity.

It is best if the keywords within the subgroups are sorted by exact frequency (this will come in handy later). Each of these subgroups then becomes the set of keywords covered by one article. If the semantic core is compiled thoroughly, we will not miss anything and will cover ALL the queries that fall under this heading.
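The manual Excel step of grouping and sorting can be sketched in a few lines of code. The sample queries, frequencies, and topic words below are all invented for the sake of the example:

```python
# Minimal sketch of the Excel step: group queries that share a topic word
# into subgroups, then sort each subgroup by exact frequency, highest first.
# All data here is invented for illustration.

queries = {
    "wordpress seo plugin": 320,
    "best seo plugin for wordpress": 150,
    "wordpress cache plugin": 260,
    "wp super cache plugin settings": 90,
}

topics = ["seo", "cache"]

def group_by_topic(queries, topics):
    groups = {t: [] for t in topics}
    for phrase, freq in queries.items():
        for topic in topics:
            if topic in phrase:
                groups[topic].append((phrase, freq))
                break
    # Sort every subgroup by exact frequency in descending order.
    return {t: sorted(items, key=lambda x: -x[1]) for t, items in groups.items()}

for topic, items in group_by_topic(queries, topics).items():
    print(topic, items)
```

Each resulting subgroup corresponds to one future article, with its highest-frequency query naturally suggesting the article's main keyword.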

We leave common phrases such as "wordpress plugins" for the heading page itself, that is, we place the SEO-optimized text right on the heading page. Be sure to read my article on to know how to do it right.

Even if you are not writing the articles yourself, this file with the semantic core broken down into groups is an ideal brief for a copywriter. He already sees the structure of the article and understands what he should write about. Needless to say how much traffic can be collected this way.

Ideally, of course, you write the articles yourself or you have a smart SEO copywriter. In any case, be sure to read the article about that. Even if you are not writing yourself, show that article to your copywriter and the effect on your content will not be long in coming. After a while, you will be pleasantly surprised by the traffic growth.

By the way, where possible, suitable keywords should be turned into subheadings, in a more natural form of course. That is, something like this:

Remember, no spamming, my friends; over-optimization is evil. What matters here, as in the structure of the whole site, is the correct structure of the article. Remember once and for all: search engines love well-structured sites, and people love them even more. We all love it when everything on a site is neatly laid out, clear, intelligible, and beautiful.

Well guys, that's all for today, we'll meet in the next lesson, which you should like too. Do you like what I write? Am I right? 🙂 If yes, don't forget about retweets, likes and other "goodies", and I especially love comments. It's a trifle for you, but it means a lot to me. You're good sports, my friends, aren't you? 🙂

P.S. Do you need a website? Then website creation in Kiev is probably what you need. Trust the professionals.