Link exchange

   The essence of a link exchange is that you use a separate page to publish links to other sites and get similar backlinks from them. Search engines do not like link exchanges because, in many cases, they distort search results and do not offer anything useful to Internet users. However, it is still an effective way to increase link popularity if you observe a few simple rules.

   - Exchange links with sites that are related to your topic. Exchanging links with unrelated sites is ineffective and unpopular.

   - Before the exchange, make sure that your link will be published on a "good" page. This means that the page should have a reasonable PageRank (3-4 or higher is recommended), it should be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.

   - Do not create large link directories on your site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics - you would have a topic category for each site listed. However, when optimizing your website you are looking for quality links rather than quantity, and such directories have potential pitfalls. No SEO-conscious webmaster will publish a quality link to you if all they receive in return is a weak link from the link "farm" that is your directory. In general, the PageRank of pages in such directories leaves a lot to be desired. In addition, search engines do not like these directories at all, and there have even been instances of sites being banned for using them.

   - Use a separate page on the website for link exchanges. It should have a reasonable PageRank, it should be indexed by search engines, etc. Do not publish more than 50 links on a page (otherwise search engines may fail to take some of the links into account). This will also help you to find link exchange partners who are aware of SEO.

   - Search engines try to detect reciprocal links. This is why you should, if possible, publish your backlinks on a domain/site other than the one you are trying to promote. The best variant is when you promote the resource site1.com and publish the backlinks on the resource site2.com.

   - Exchange links with caution. Dishonest webmasters will often remove your links from their resources after a while, so check your backlinks from time to time.

      Press releases, RSS, thematic resources
 
This section is about Internet marketing rather than pure SEO. There are many news feeds and information resources that publish press releases and news on various topics. Such sites can supply you with direct visitors and increase your site's popularity. If you do not find it easy to create a press release or a piece of news, hire copywriters - they will help you find or create something newsworthy.

   Look for resources that deal with topics related to your site. You can find many Internet projects that are not in direct competition with you but share the same topic as your website. Try approaching the site owners; it is quite likely that they will be willing to publish information about your project.

   One last tip for getting inbound links: try to create small variations in the inbound link text. If all inbound links to your site use exactly the same link text and there are many of them, search engines may flag this as a spam attempt and penalize your site.

4 Indexing a site

   Before a site appears in search results, a search engine must index it. Indexed pages have been visited and analyzed by a search robot, with the relevant information stored in the search engine's database. If a page is present in the search engine index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display any information from that page.

   Average-sized sites (with dozens to hundreds of pages) are usually indexed correctly by search engines. However, you should remember the following points when building your website. There are two ways to allow a search engine to learn about a new site:

   - Submit the site address manually using a form on the search engine, if available. In this case, you are the one who informs the search engine about the new site, and its address is queued for indexing. Only the main page of the site needs to be added; the search robot will find the rest of the pages by following links.

   - Let the search robot find the site on its own. If there is at least one inbound link to your resource from other indexed resources, the search robot will soon visit and index your site. This method is recommended in most cases: get some inbound links to your site and just wait until the robot visits it. This can actually be faster than manual submission. Indexing a site typically takes from a few days up to two weeks, depending on the search engine; Google is the fastest of the bunch.

   Try to make your website friendly to search robots by following these rules:

   - Try to make any page of your site accessible from the home page in no more than three mouse clicks. If the structure of the site does not allow this, create a so-called site map that satisfies this rule.

   - Avoid common mistakes. Session identifiers make indexing more difficult. If you use script-based navigation, make sure you duplicate these links with regular ones, because search engines cannot read scripts (see the sketch after this list, and more details about these and other errors in section 2.3).

   - Remember that search engines index no more than the first 100-200 KB of text on a page. Hence the following rule: do not use pages with more than 100 KB of text if you want them to be fully indexed.
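
   To illustrate the point about script navigation, here is a minimal sketch of duplicating a script-only link with a regular one; the page name and the function name are hypothetical:

    <!-- A script-only link: search robots cannot follow it -->
    <a href="javascript:void(0)" onclick="openSection('catalog')">Catalog</a>

    <!-- The same link duplicated with a regular href: robots follow the href,
         while the script still runs for ordinary visitors -->
    <a href="catalog.html" onclick="openSection('catalog'); return false;">Catalog</a>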

   You can manage the behavior of search robots using a robots.txt file. This file allows you to explicitly allow or prohibit the indexing of specific pages on your site, as shown in the sketch below.
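
   Here is a minimal robots.txt sketch; the directory names and the robot name are hypothetical examples, and the file must be placed in the root directory of the site:

    # Rules for all search robots
    User-agent: *
    Disallow: /admin/
    Disallow: /search/

    # Block one particular robot from the whole site
    User-agent: BadBot
    Disallow: /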

   Search engine databases are constantly being updated; the data in them can change, disappear and reappear. This is why the number of indexed pages on your site may vary over time. One of the most common reasons for a page disappearing from the index is server unavailability: the robot cannot access the site when it tries to index it. After the server is up again, the site should eventually reappear in the index.

   You should note that the more inbound links your site has, the faster it gets re-indexed. You can track the indexing process by analyzing server log files, where all visits by search robots are logged. We will give details of SEO software that allows you to track such visits in a subsequent section.
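
   Until then, a simple script of your own is enough to get a rough picture. The sketch below assumes a standard Apache or Nginx access log named access.log and simply counts requests whose user-agent string contains a known robot name:

    # A minimal sketch: count search robot visits in a web server access log.
    # The log file name and the list of robot signatures are assumptions.
    from collections import Counter

    BOTS = ["Googlebot", "Yahoo! Slurp", "msnbot"]

    def count_bot_visits(log_path="access.log"):
        visits = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                for bot in BOTS:
                    if bot in line:
                        visits[bot] += 1
        return visits

    if __name__ == "__main__":
        for bot, hits in count_bot_visits().most_common():
            print(f"{bot}: {hits} requests")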

  Choosing keywords

    Initial choice of keywords
 
Choosing keywords should be your first step when building a website. You should have a ready keyword list to include in the text of your site before you start composing it. To determine the keywords for your website, start with the keyword services offered by search engines. Sites such as www.wordtracker.com and inventory.overture.com are good starting places for English-language sites. Note that the data these services provide can sometimes differ significantly from the keywords that actually work best for your website. You should also note that Google gives no information about the frequency of search queries.

   Once you have defined your approximate initial keyword list, you can analyze your competitors' websites and try to find out what keywords they are using. You may discover some other relevant keywords that suit your site.

    Common and rare keywords
 
There are two distinct strategies: optimizing for a small number of highly popular keywords or for a large number of less common ones. In practice, the two are often combined.

   The disadvantage of keywords that attract frequent queries is that the competition for them is high. It is often not possible for a new site to get anywhere near the top of the search result listings for these queries.

   For keywords associated with rare queries, it is often enough simply to mention the required word combination on a web page, or to perform minimal text optimization. Under certain circumstances, rare queries can collectively supply a fairly large amount of search traffic.

   The goal of most commercial sites is to sell a product or service, or to make money from their visitors in some other way. This should be kept in mind during your SEO (search engine optimization) work and keyword selection. If you are optimizing a commercial website, you should try to attract targeted visitors (those who are willing to pay for the product or service provided) rather than simply a large number of visitors.

   Example: the query "monitor" is much more popular and competitive than the query "monitor Samsung 710N" (the exact name of the model). However, the second query is much more valuable to a monitor seller. It is also easier to get traffic from it because the competition for it is low: few of the sites competing for the generic query belong to vendors of Samsung 710N monitors. This example highlights another difference between rare and frequent search queries that should be taken into account - rare search queries may bring you fewer visitors overall, but more targeted ones.

    Assessing the competition of search queries
 
When you have completed your keyword list, you must identify the main keywords for which you will optimize your pages. A suggested technique for this is described below.

   Rare queries are discarded right away (for the moment). In the previous section, we described how such rare queries can still be useful, but they do not require special optimization - they are likely to occur naturally in the text of your website.

   As a rule, the level of competition for the most popular phrases is very high. This is why you need a realistic idea of your site's competitiveness. To assess the level of competition, you need to evaluate a number of parameters for the first 10 sites that appear in the search results:
   - Average PageRank of pages in the search results.
   - The average number of links to these sites. Check this by using a variety of search engines.
   Additional parameters:
   - The number of pages on the Internet containing the search term, i.e. the total number of search results for that term.
   - The number of pages containing the exact keyword phrase. To get this number, search for the phrase enclosed in quotation marks.

   These additional parameters allow you to indirectly assess how difficult it would be to take your website to the top of the list for this particular phrase. Besides the parameters listed, you can also check how many of the sites in the search results are present in the main directories, such as DMOZ and Yahoo.

   Analyzing the above-mentioned parameters and comparing them with those of your own site will allow you to predict with reasonable certainty your chances of getting your website to the top of the list for a particular phrase.
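
   If you collect these parameters for several candidate phrases, even a crude calculation makes the comparison easier. The sketch below is one possible way to do it; the figures and the weighting are illustrative assumptions, not a standard formula:

    # A rough comparison of competition for two phrases, based on the
    # average PageRank and backlink counts of the top-10 results.

    def competition_score(pageranks, backlink_counts):
        avg_pr = sum(pageranks) / len(pageranks)
        avg_links = sum(backlink_counts) / len(backlink_counts)
        return avg_pr + avg_links / 100   # arbitrary weighting, for illustration only

    phrases = {
        "monitor": ([6, 6, 5, 7, 5, 6, 5, 6, 7, 5],
                    [900, 1200, 700, 1500, 800, 950, 600, 1100, 1300, 750]),
        "monitor Samsung 710N": ([3, 2, 4, 3, 2, 3, 2, 3, 4, 2],
                                 [40, 25, 60, 30, 20, 45, 15, 35, 50, 25]),
    }

    for phrase, (prs, links) in phrases.items():
        print(f"{phrase!r}: competition score {competition_score(prs, links):.1f}")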

   Considering the level of competition for all your keyword phrases, you can now choose a moderate number of popular key phrases with an acceptable level of competition, which you can then use to promote and optimize your site.

    Refining your keyword phrases
 
As mentioned above, keyword services often provide inaccurate information, so it is uncommon to arrive at the optimal set of keywords for your site on the first attempt. Once your site is up and running and you have done some initial promotion, you can gather additional keyword statistics that will allow some fine-tuning. For example, you will be able to see where your site ranks in the search results for certain phrases and how many visits those phrases bring to your site.

   With this information, you can clearly identify good and bad keyword phrases. Often there is no need to wait until your site gets near the top of all search engines for the phrases you are assessing - one or two search engines are enough.

   Example: suppose your website is ranked first in Yahoo for a particular phrase, but does not yet appear in the MSN or Google search results for it. If you know the share of visits to your website that comes from each search engine (e.g. Google 70%, Yahoo 20%, MSN 10%), you can predict the approximate amount of traffic this phrase would bring from the other search engines and decide whether it is worth pursuing.
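
   The arithmetic behind such an estimate is simple. The sketch below uses hypothetical visit counts and traffic shares purely for illustration, and assumes a comparable ranking could eventually be reached on the other engines:

    # Project traffic for a phrase from one engine's figures to the others.
    shares = {"Google": 0.70, "Yahoo": 0.20, "MSN": 0.10}   # share of total search traffic
    yahoo_visits_for_phrase = 50                            # observed visits from Yahoo per month

    projected_total = yahoo_visits_for_phrase / shares["Yahoo"]
    for engine, share in shares.items():
        print(f"{engine}: ~{projected_total * share:.0f} visits per month")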

   As well as detecting bad phrases, you may find some good new ones. For example, you may notice that a keyword phrase you did not optimize your site for is bringing useful traffic, even though your site is only on the second or third page of the search results for that phrase.

   Using these methods, you will arrive at a new, refined set of keyword phrases. You then need to start rebuilding your site: change the text to include more of the best phrases, create new pages for new phrases, etc.

   You can repeat this SEO exercise several times and, after a while, you will have an optimal set of key phrases for your site and a significant increase in search traffic.
   Here are some more tips. According to statistics, the main page attracts up to 30%-50% of all search traffic: it is the most visible page in the search engines and has the largest number of inbound links. This is why you should optimize the home page of your site for the most popular and competitive queries. Each remaining page should be optimized for one or two keyword combinations and, possibly, for a number of rare queries. This will increase its chances of getting to the top of the search engine lists for those particular phrases.