 What is HTML5?

HTML5 is a language for structuring and presenting content for the World Wide Web, a core technology of the Internet. It is the latest revision of the HTML standard (originally created in 1990) and currently remains under development. Its core aims have been to improve the language with support for the latest multimedia while keeping it easily readable by humans and consistently understood by computers and devices (web browsers, parsers, etc.). It adds substantial multimedia and semantic capabilities to the web.

Following its immediate predecessors HTML 4.01 and XHTML 1.1, HTML5 is a response to the observation that the HTML and XHTML in common use on the World Wide Web is a mixture of features introduced by various specifications, along with those introduced by software products such as web browsers, those established by common practice, and the many syntax errors in existing web documents. It is also an attempt to define a single markup language that can be written in either HTML or XHTML syntax. It includes detailed processing models to encourage more interoperable implementations; it extends, improves and rationalizes the markup available for documents, and introduces markup and APIs for complex web applications.

In particular, HTML5 adds many new syntactical features. These include the <video>, <audio>, and <canvas> elements, as well as the integration of SVG content. These features are designed to make it easy to include and handle multimedia and graphical content on the web without having to resort to proprietary plugins and APIs. Other new elements, such as <section>, <article>, <header>, and <nav>, are designed to enrich the semantic content of documents. New attributes have been introduced for the same purpose, while some elements and attributes have been removed. Some elements, such as <a>, <cite> and <menu> have been changed, redefined or standardized. The APIs and DOM are no longer afterthoughts, but are fundamental parts of the HTML5 specification. HTML5 also defines in some detail the required processing for invalid documents, so that syntax errors will be treated uniformly by all conforming browsers and other user agents.


Markup

HTML5 introduces a number of new elements and attributes that reflect typical usage on modern websites. Some of them are semantic replacements for common uses of generic block (<div>) and inline (<span>) elements, for example <nav> (website navigation block), <footer> (usually referring to bottom of web page or to last lines of HTML code), or <audio> and <video> instead of <object>. Some deprecated elements from HTML 4.01 have been dropped, including purely presentational elements such as <font> and <center>, whose effects are achieved using Cascading Style Sheets. There is also a renewed emphasis on the importance of DOM scripting (e.g., JavaScript) in Web behavior.
The HTML5 syntax is no longer based on SGML despite the similarity of its markup. It has, however, been designed to be backward compatible with common parsing of older versions of HTML. It comes with a new introductory line that looks like an SGML document type declaration, <!DOCTYPE html>, which triggers the standards-compliant rendering mode. As of 5 January 2009, HTML5 also includes Web Forms 2.0, a previously separate WHATWG specification.
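For illustration, here is a minimal sketch of an HTML5 document that uses the new doctype and the <video> element instead of a plugin-based <object>; the file names are placeholders invented for this example:

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>HTML5 sample page</title>
</head>
<body>
  <!-- native multimedia: no proprietary plugin required -->
  <video src="intro.webm" poster="intro.jpg" controls width="640" height="360">
    Your browser does not support the video element.
  </video>
</body>
</html>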



How the Changes in HTML 5 Will Affect SEO

Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, link sections, etc.) and these parts are treated as separate entries. At present, there is no way for a webmaster to tell search engines how to segment a page, but this is bound to change in HTML 5.
A new <article> tag. 
The new <article> tag is probably the best addition from an SEO point of view. The <article> tag allows you to mark up separate entries in an online publication, such as a blog or a magazine. It is expected that when entries are marked with the <article> tag, the HTML code will become cleaner because it will reduce the need for <div> tags. Search engines will also probably put more weight on the text inside the <article> tag as compared to the content in other parts of the page.
A new <header> tag.
 The new <header> tag (which is different from the head element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <h1> tag, but the difference is that it can contain a lot of content, such as H1, H2 and H3 elements, whole paragraphs of text, hard-coded links (which is really precious for SEO), and any other information you feel is relevant to include.
A new <footer> tag. 
The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page - i.e. you can have a separate header/footer for each section, which gives you a great deal of flexibility.
A new <section> tag. 
The new <section> tag can be used to identify separate sections of a page, chapter or book. The advantage is that each section can have its own HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are found within one section, this implies higher relevance than when those words are found scattered across the page or in separate sections.
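To make the segmentation idea concrete, here is a minimal sketch of how these tags might outline a blog page; the headings and link targets are invented for the example:

<body>
  <header>
    <h1>My SEO Blog</h1>
  </header>
  <nav>
    <a href="/archive.html">Archive</a>
    <a href="/about.html">About</a>
  </nav>
  <section>
    <article>
      <header><h2>How HTML5 affects SEO</h2></header>
      <p>The main content of this entry goes here...</p>
      <footer>Posted in 2013</footer>
    </article>
  </section>
  <footer>Copyright notice and contact links</footer>
</body>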


As you can see, the new tags follow the common structure of a standard page, and each of the parts (header, footer, main section, etc.) gets a separate tag. The tags described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in some way. For instance, the <audio>, <video> and <dialog> tags are also part of the HTML 5 standard and will allow content to be further separated into appropriate categories. There are many other tags, but they are of relatively lower importance and that is why they are not discussed here.

For now, HTML 5 is still some way in the future. When more pages become HTML 5-compliant, search engines will pay more attention to it. Only then will it be possible to know exactly how search engines will treat HTML 5 pages. Mass adoption of HTML 5 won't happen soon, and it is a safe bet to say that for now you can keep to HTML 4 and have no concerns. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.

However, once HTML 5 is accepted and put to use, it will be the dominant standard for years to come, and that is why you might want to keep an eye on what other webmasters are doing, just to make sure that you do not miss the moment when HTML 5 becomes the de facto standard.




What HTML is not
It's not like print.
In print both text and images are embedded into paper as colored dots. On the web images and text are handled differently.

Images are sent as discrete files over the net to be displayed as illuminated color pixels on your monitor. Text is displayed as text.

The image to the right is a file named "tangent-galvanometer.gif". This page, an HTML file named "what-its-not.htm", requested that tangent-galvanometer.gif be sent from my server to your computer. Next it was positioned on the screen and this text wrapped around it.

Why not simply define the color of each and every pixel and have the page uploaded as one big image? There are two reasons:

Webpages would be too big – they would take too long to download.
The spiders, such as Googlebot, would not be able to read them. The words "Tangent Galvanometer" in the picture on the right are embedded in the image and therefore invisible to the spiders.

On the other hand this line of text is visible to the spiders* and has been entered into the search engine databases.

Those of you coming to web design from having worked in print will find web design quite different – not impossibly so, but it is different.

It's not like math.

HTML 5 is no new Geometry. There is an order and logic to math that computer science aspires to but cannot match.

In the last couple of decades some smart people made – and continue to make – HTML from scratch. On the whole they have done a good job. HTML 5 is certainly better than HTML 2, 3 or 4, but it is flawed, and those flaws tend to rear their ugly heads at the most exasperating moments. Be prepared.

There will be times when you ask: "What idiot thought this up?". That idiot is probably alive and well – and making millions in Silicon Valley. However, you can take some small comfort in the fact that your irritation might well be warranted.

Do not expect the elegant perfection of Euclid's Geometry and you will not be disappointed.


History of search engines
   In the early days of the development of the Internet, its users were a privileged minority and the amount of information available was relatively small. Access was limited mainly to employees of various universities and laboratories who used it to access scientific information. In those days, the problem of finding information on the Internet was not nearly as critical as it is now.

   Site directories were one of the first methods used to facilitate access to information resources on the network. Links to these resources were grouped by topic. Yahoo, opened in April 1994, was the first project of its kind. As the number of sites in the Yahoo directory inexorably increased, its developers made the directory searchable. Of course, it was not a search engine in its true form, because searching was limited to the resources listed in the directory; it did not actively seek out resources, and the concept of SEO had yet to emerge.

   Such link directories were widely used in the past, but nowadays they have lost much of their popularity. The reason is simple - even modern directories with huge numbers of resources provide information on only a small portion of the web. The largest directory on the network is currently DMOZ (the Open Directory Project). It contains information on about five million resources, while the Google search engine database contains more than eight billion documents.

   The WebCrawler project, launched in 1994, was the first full-featured search engine. The Lycos and AltaVista search engines appeared in 1995, and for many years AltaVista was the major player in this field.

   In 1997 Sergey Brin and Larry Page created Google as a research project at Stanford University. Google is now the most popular search engine in the world.

   Currently, there are three major international search engines - Google, Yahoo and MSN Search. They each have their own databases and search algorithms. Many other search engines use results originating from these three major search engines, and the same SEO expertise can be applied to all of them. For example, the AOL search engine (search.aol.com) uses the Google database, while AltaVista, Lycos and AlltheWeb all use the Yahoo database.

Common principles of search engines
   To understand SEO you should be aware of the architecture of search engines. They all contain the following main components:

   Spider - a browser-like program that downloads web pages.

   Crawler - a program that automatically follows all links on any web page.

   Indexer - a program that analyzes web pages downloaded by the spider and crawler.

   Database - storage for downloaded and processed pages.

   Results engine - extracts search results from the database.

   Web server - a server that is responsible for interaction between the user and other search engine components.

   Implementations of specific search mechanisms may differ. For example, the Spider + Crawler + Indexer component group might be implemented as a single program that downloads web pages, analyzes them and then uses their links to find new resources. However, the components listed are inherent to all search engines, and the SEO principles are the same.

   Spider. This program downloads web pages in the same way as a web browser. The difference is that a browser displays the information presented on each page (text, graphics, etc.) while a spider has no visual components and works directly with the underlying HTML code of the page. You may know that there is a standard option in web browsers to view the HTML source code.

   Crawler. This program finds all the links on each page. Its task is to determine where the spider should go, either by evaluating the links it finds or by using a predefined list of addresses. The crawler follows these links and tries to find documents not already known to the search engine.

   Indexer. This component parses each page and analyzes the various elements, such as text, headers, structural or stylistic features, special HTML tags, etc.

   Database. This is the storage area for the pages the search engine downloads and analyzes. It is sometimes called the search engine index.

   Results engine. The results engine ranks pages. It determines which pages best match the user's query and in what order they should be listed. This is done according to the search engine's ranking algorithms. Page rank is therefore a valuable and interesting property, and it is what any SEO specialist is most interested in when trying to improve a site's search results. In this article, we will discuss the factors that affect page rank in some detail.

   Web server. The search engine's web server usually serves an HTML page with an input field where the user can enter the search query he or she is interested in. The web server is also responsible for displaying the search results to the user in the form of an HTML page.

Internal ranking factors

   Several factors affect the position of a page in search results. They can be divided into external and internal ranking factors. Internal ranking factors are those that are controlled by SEO-aware website owners (text, layout, etc.) and will be described next.

Web page layout factors relevant to SEO

 The amount of text on a page

   A page consisting of only a few sentences is less likely to get to the top of a search engine list. Search engines favor sites that have a high content of information. In general, you should try to increase the text content of your site in the interest of SEO. The optimal page size is 500-3,000 words (or 2,000 to 20,000 characters).

   Search engine visibility increases as the amount of page text increases, because a longer text is more likely to match random and rare search queries and thus be listed. This factor sometimes brings a large number of visitors.

The number of keywords on a page

   Keywords should be used at least 3-4 times in the text of a page. The upper limit depends on the overall size of the page - the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth special mention. The best SEO results are observed when a keyword phrase is used several times in the text with all the keywords in the phrase arranged in exactly the same order. In addition, all the words of the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of occurrences of each of these repeated words.

   Let's take an example. Suppose we optimize a page for the phrase "SEO software" (one of our keywords for this site). It would be good to use the phrase "SEO software" 10 times in the text, the word "SEO" 7 times elsewhere in the text, and the word "software" 5 times. The numbers here are only for illustration, but they show the general idea of SEO quite well.

Keyword density and SEO
   Keyword density is a measure of the relative frequency of a word in the text, expressed as a percentage. For example, if a specific word is used 5 times on a page containing 100 words, the keyword density is 5%. If the keyword density is too low, the search engine will not pay much attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.
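   Expressed as a formula, keyword density is simply:

   \text{keyword density} = \frac{\text{number of occurrences of the keyword}}{\text{total number of words on the page}} \times 100\%

   For the example above, 5 / 100 = 0.05, i.e. a density of 5%.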

   The optimum value for keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the keywords making up the phrase and make sure it stays within these limits. In practice, a keyword density of more than 7-8% does not seem to have negative SEO consequences, but it is not necessary and can reduce the readability of the content from a user's point of view.

Location of keywords on a page

   A very short rule for SEO experts - the closer a keyword or keyword phrase is to the beginning of a document, the more weight it is given by the search engine.

Text format and SEO

Search engines pay special attention to page text that is highlighted or given special formatting. We recommend:


  - Use keywords in headings. Headings are text marked up with the HTML <h> tags; <h1> and <h2> are the most effective. Currently, the use of CSS allows you to redefine the appearance of text highlighted with these tags. This means that <h> tags are used less often nowadays, but they are still very important in SEO work.
   - Highlight keywords in bold. Do not highlight all the text! Just highlight each keyword two or three times on the page. Use the <strong> tag for highlighting instead of the more traditional <b> bold tag.
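As an illustration, a page fragment applying both recommendations might look like this; the phrase "SEO software" is just a placeholder keyword:

<h1>SEO Software</h1>
<p>Our <strong>SEO software</strong> helps you track rankings and analyze keywords.</p>
<h2>Why choose this SEO software</h2>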

The <title> tag

   This is one of the most important tags for search engines, so make use of it in your SEO work. Keywords should be used in the title tag. The link to your site shown in the search results will usually contain text derived from the title tag, so it functions as a kind of virtual business card for your site. Often, the title tag text is the first information about your website that the user sees. This is why it should not only contain keywords, but also be informative and attractive. You want searchers to be tempted to click on your listed link and navigate to your website. As a rule, 50-80 characters of the title tag are displayed in the search results, so you should limit the title to this length.

Keywords in links

   A simple SEO rule - use keywords in the text of links that refer to other pages on your site or to external internet resources. Keywords in such link text can slightly increase the ranking of the pages they point to.

ALT attributes on images

   Any image on a page has an optional attribute known as "alternative text", specified using the HTML alt attribute. This text is displayed if the browser fails to download the image or if image display is disabled in the browser. Search engines save the value of image alt attributes when they parse (index) pages, but do not use it for ranking search results.

   Currently, Google takes into account the alt attribute text of those images that serve as links to other pages; the alt attributes of other images are ignored. There is no information about the other search engines, but we can assume that the situation is similar. We consider that keywords can and should be used in alt attributes, but this is not vital for SEO purposes.
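For example, a linked image with a keyword-bearing alt attribute might look like this; the file name and link target are hypothetical:

<a href="samsung-710n-monitor.html">
  <img src="samsung-710n.jpg" alt="Samsung 710N monitor" width="200" height="150">
</a>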

Meta Description Tag

   This tag is used to specify page descriptions. It does not affect the SEO ranking process, but it is nevertheless very important. Many search engines (including the biggest one - Google) display information from this tag in their search results if the tag is present on a page and its content matches the page content and the search query.

   Experience has shown that a high position in search results does not always guarantee a large number of visitors. For example, if your competitors' search result descriptions are more attractive than the one for your site, search engine users may choose their resource instead of yours. That is why it is important that your meta description tag text be brief, but informative and attractive. It should also contain keywords appropriate to the page.

Keywords Meta tag

   This meta tag was initially used to specify the keywords for a page, but it is hardly ever used by search engines now and is often ignored in SEO projects. However, it may still be worth specifying the tag in case its use is revived. The following rule must be observed: only keywords that are actually used in the page text should be added to this tag.
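Putting the three tags discussed above together, the head of a page might look like the following sketch; the text values are placeholders:

<head>
  <title>SEO Software - Rank Tracking and Keyword Tools</title>
  <meta name="description" content="SEO software for tracking rankings, analyzing keywords and monitoring backlinks.">
  <meta name="keywords" content="seo software, rank tracking, keyword analysis">
</head>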

Site Structure

 Number of pages

   The general SEO rule is: the more, the better. Increasing the number of pages on your website increases its visibility to the search engines. Also, if new information is constantly being added to the site, search engines consider this as development and expansion of the site, which can provide additional ranking advantages. You should periodically publish more information on your site - news, press releases, articles, helpful tips, etc.

Navigation menu
 
As a rule, every page has a navigation menu. Using keywords in the menu links will give additional SEO weight to the pages those links point to.

Keywords in page file names

   Some SEO experts consider that using keywords in the name of an HTML page file can have a positive effect on its position in search results.

 Avoid subdirectories

 If there are not too many pages on your site (up to a few dozen), it is best to place them all in the root directory of your site. Search engines consider such pages to be more important than those in subdirectories.

 One page - one keyword phrase
 
For maximum SEO effect, try to optimize each page for its own keyword phrase. Sometimes you can choose two or three related phrases, but you certainly should not try to optimize a page for 5-10 phrases at once. Such an attempt would probably produce no effect on the page.

 SEO and the main page

Optimize the main page of your site (domain name, index.html) for the word combinations that are most important. This page is the most likely to reach the top of search engine lists. My SEO observations suggest that the main page may account for up to 30-40% of total search traffic for some sites.

Common SEO Mistakes

Graphic header

Very often sites are designed with a graphic header. Often, we see an image of the company logo occupying the full page width. Do not do it! The top of a page is very valuable space where you should place your most important keywords for better SEO. In the case of a graphic image, that prime position is wasted, since search engines cannot make use of images. Sometimes you may come across completely absurd situations: the header contains text information, but to make its appearance more attractive, it is created in the form of an image. That text cannot be indexed by search engines and so will not contribute to page rank. If you need to display a logo, the best way is to use a hybrid approach - place the graphic logo at the top of each page and size it so that it does not occupy the entire width. Use a text header to make up the rest of the width.
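One possible way to implement the hybrid approach described above; the logo file and heading text are made up for the example:

<div id="header">
  <!-- small graphic logo that does not span the full width -->
  <img src="logo.png" alt="Acme Widgets logo" width="180" height="60">
  <!-- indexable text header carrying the important keywords -->
  <h1>Acme Widgets - Handmade Widgets and Widget Repair</h1>
</div>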

Graphic navigation menu
 
The situation is similar to the previous one - internal links on your site should contain keywords, which gives an extra edge in the SEO rankings. If your navigation menu consists of graphic elements to make it more attractive, search engines will not be able to index the text of its links. If it is not possible to avoid using a graphic menu, at least remember to specify correct ALT attributes for all the images.

Script navigation

   Sometimes scripts are used for site navigation. As an SEO worker, you need to understand that search engines cannot read or execute scripts. Thus, a link created with the help of a script will not be available to the search engine; the robot will not follow it, and so parts of your site will not be indexed. If you use scripts for site navigation, you must duplicate those links as regular HTML so they are visible to everyone - both your human visitors and the search robots.
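For example, if a menu is generated by a script, the same links can be duplicated as ordinary HTML so that robots (and visitors without JavaScript) can still follow them; the file names below are placeholders:

<!-- script-driven menu for visitors -->
<div id="menu"></div>
<script src="menu.js"></script>

<!-- the same links duplicated as regular HTML so search robots can follow them -->
<p>
  <a href="/products.html">Products</a> |
  <a href="/articles.html">Articles</a> |
  <a href="/contact.html">Contact</a>
</p>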

      Session Identifier
 
Some sites use session identifiers. This means that every visitor gets a unique parameter (&session_id=) when he or she arrives at the site. This ID is added to the address of every page visited on the site. Session IDs help site owners collect useful statistics, including information about visitor behavior. However, from the point of view of a search robot, a page with a new address is a brand new page. This means that each time the search robot comes to such a site, it gets a new session identifier and considers the pages to be new ones whenever it visits them.

   Search engines do have algorithms for consolidating mirrors and pages with the same content, so sites with session IDs should be recognized and indexed correctly. However, it is difficult to index such sites and sometimes they are indexed incorrectly, which has a negative effect on the pages' SEO rankings. If you are interested in SEO for your website, I recommend you avoid session identifiers if possible.

    Redirects
 
Redirects make site analysis more difficult for search robots, with resulting negative effects on SEO. Do not use redirects unless there is a clear reason to do so.

    Hidden text, a deceptive SEO method
 
The last two issues are not really mistakes but deliberate attempts to deceive search engines using illicit SEO methods. Hidden text (when the text color matches the background color, for example) allows site owners to fill a page with their desired keywords without affecting the page logic or visual layout. Such text is invisible to human visitors but will be seen by search robots. The use of such deceptive optimization methods may result in the site being banned, that is, excluded from the search engine's index (database).


One-pixel links, SEO trick

   This is another deceptive SEO technique. Search engines consider the use of small, almost invisible, graphic image links just one pixel wide and high as an attempt at deception, which may lead to a site ban.

  External ranking factors

    Why inbound links to sites are taken into account
 
As you can see from the previous section, many of the factors that affect the ranking process are under the control of webmasters. If these were the only factors, it would be impossible for search engines to distinguish between a genuine high-quality document and a page created specifically to achieve high rankings but containing no useful information. For this reason, an analysis of the inbound links to the site being evaluated is one of the main factors in page ranking. It is the one factor that is not controlled by the site owner.

   It makes sense to assume that interesting sites will have more inbound links, because the owners of other sites on the Internet will tend to publish links to a website if they feel it is a valuable resource. The search engine uses this inbound link criterion in its assessment of the significance of a document.

   Therefore, two main factors affect how pages are ranked by the search engine and sorted for display in search results:

    - Relevance, as described in the previous section on internal ranking factors.

    - The number and quality of inbound links, also known as link popularity, link citation or citation index. This will be described in the next section.

    Link importance (citation index, link popularity)
 
You can easily see that simply counting the number of inbound links does not give us enough information to assess a site. It is clear that a link from www.microsoft.com means much more than a link from some site like www.hostingcompany.com/~myhomepage.html. You have to take into consideration the importance of links as well as their number.

   Search engines use the notion of citation index to assess the number and quality of inbound links to a site. The citation index is a numerical rating of the popularity of a resource, expressed as an absolute value that represents the importance of the page. Each search engine uses its own algorithm to evaluate a page's citation index. As a rule, these values are not published.

   As well as the absolute citation index value, a scaled citation index is sometimes used. This value shows the relative popularity of a site in relation to the popularity of other sites on the Internet. You will find a detailed description of citation indexes and the algorithms used to evaluate them in the next sections.


Link text (anchor text)

   The text of the inbound links to a page is vital in the ranking of search results. The anchor (or link) text is the text between the HTML <a> and </a> tags and is displayed as the text that you click in a browser to go to a new page. If the link text contains relevant keywords, the search engine regards it as a very important additional recommendation that the site actually contains valuable information relevant to the search query.
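For example, of the two links below, the first passes useful keyword information to the search engine while the second says nothing about the target page; the URL is a placeholder:

<a href="http://www.example.com/seo-software.html">SEO software for rank tracking</a>
<a href="http://www.example.com/seo-software.html">click here</a>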

    The importance of referring sites
 
As well as the link text, search engines also take into account the general content of the referring site.

   Example: suppose we are using SEO to promote a car sales resource. In this case, a link from a site about car repair will be much more important than a similar link from a site about gardening. The first link comes from a resource with a similar theme, so it will carry more weight with the search engines.

    Google PageRank - theoretical basics
   Google was the first company to patent a system of taking inbound links into account. The algorithm was named PageRank. In this section, we will describe the algorithm and how it can affect search result ranking.

   PageRank is estimated separately for each web page and is determined by the PageRank (citations) of the other pages referring to it. It is a kind of "virtuous circle." The main task is to find a criterion that expresses the importance of a page. In the case of PageRank, that criterion is the potential frequency of visits to a page.

   I will now describe how user behavior when following links to surf the network is modeled. It is assumed that the user starts viewing pages from some random page. Then he or she follows links to other Internet resources. There is always a possibility that the user will leave a page without following any of its links and start viewing documents from another random page. The PageRank algorithm assigns this event a probability of 0.15 at each step. The probability that our user continues surfing by following one of the links available on the current page is therefore 0.85, assuming that all links are equal in this case. If he or she continues surfing indefinitely, popular pages will be visited many times more often than less popular pages.

   The PageRank of a given web page is thus defined as the probability that a user is visiting that page. It follows that the sum of the probabilities over all web pages is exactly one, because the user is assumed to be visiting some page at any given moment.

   Since it is not always convenient to work with these probabilities PageRank can be transformed mathematically into a more easily understood number for viewing. For example, we are used to seeing a PageRank number between zero and ten in Google Toolbar.

   According to the ranking model described above:
   - Each page on the Net (even if there are no inbound links to it) initially has a PageRank greater than zero, although it will be very small. There is a small chance that a user will navigate to it by accident.
   - Each page that has outbound links distributes part of its PageRank to the pages it references. The PageRank contributed to each linked page is inversely proportional to the total number of links on the linking page - the more links there are, the lower the PageRank allocated to each one.
   - A "damping factor" is applied in this process, so that the total PageRank a page distributes is reduced by 15%. This equals the probability, described above, that the user will not follow any of the links on the page but will instead navigate to some unrelated page.
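   The model just described corresponds to the well-known PageRank formula in its probability-normalized form, where d = 0.85 is the damping factor, N is the total number of pages, T_1 ... T_n are the pages linking to page A, and C(T_i) is the number of outbound links on page T_i:

   PR(A) = \frac{1 - d}{N} + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}

   With this normalization, the PageRank values of all pages sum to one, which matches the probability interpretation given above.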

   Let's see how PageRank might affect the ranking of search results. We say "might" because the pure PageRank algorithm just described has not been used in the Google algorithm for quite a while now; we will discuss the more current, sophisticated version shortly. There is nothing difficult about the PageRank influence - after the search engine finds a number of relevant documents (using internal text criteria), they can be sorted according to PageRank, since it is logical to assume that a document with a larger number of high-quality inbound links contains the most valuable information.

   Thus, the PageRank algorithm "pushes up" those documents that are most popular outside the search engine.

Google PageRank - practical use

   Currently, PageRank is not used directly in the Google algorithm. This is to be expected, since pure PageRank characterizes only the number and quality of inbound links to a site; it completely ignores the text of the links and the information content of the referring pages. These factors are important in ranking and are taken into account in later versions of the algorithm. It is thought that the current Google ranking algorithm ranks pages according to a thematic PageRank - in other words, it emphasizes the importance of links from pages with related themes or topics. The exact details of this algorithm are known only to Google developers.

   You can determine the PageRank value for any web page with the help of the Google Toolbar, which shows a PageRank value in the range 0-10. It should be noted that the Google Toolbar does not show the exact PageRank probability value, but only the PageRank range a particular page belongs to. Each range (from 0 to 10) is defined on a logarithmic scale.

   Here is an example: every page has a real PageRank value known only to Google. To derive the range displayed in the Toolbar, a logarithmic scale is used, as shown in this table:

           Real PR          ToolBar PR
           1-10             1
           10-100           2
           100-1,000        3
           1,000-10,000     4
           etc.

   This shows that the PageRank ranges displayed in the Google Toolbar are not all equal in size. It is easy, for example, to increase PageRank from one to two, while it is much more difficult to increase it from six to seven.

   In practice, PageRank is used mainly for two purposes:

   1. A quick check of a site's popularity. PageRank does not provide accurate information about referring pages, but it allows you to quickly and easily get a feel for a site's level of popularity and to follow trends that may result from your SEO work. You can use these rule-of-thumb measures for English-language sites: PR 4-5 is typical for sites of average popularity. PR 6 indicates a very popular site, while PR 7 is almost unreachable for a regular webmaster - you should congratulate yourself if you manage to achieve it. PR 8, 9 and 10 can only be reached by the pages of major companies such as Microsoft, Google, etc. PageRank is also useful when exchanging links and in similar situations. You can compare the quality of the pages offered in exchange with pages from your own site to decide whether the exchange should be accepted.

   2. Evaluating the level of competition for a search query, which is a vital part of SEO work. Although PageRank is not used directly in the ranking algorithms, it allows you to indirectly assess a page's relative competitiveness for a particular query. For example, if the search engine displays sites with PageRank 6-7 in the top search results, a site with PageRank 4 is not likely to get to the top of the results list for the same query.

   It is important to recognize that the PageRank values displayed in the Google Toolbar are recalculated only occasionally (every few months), so the Google Toolbar displays somewhat outdated information. This means that the Google search engine tracks changes in inbound links much faster than those changes are reflected in the Toolbar.

Increasing link popularity

      Submitting to general-purpose directories
   On the Internet, many directories contain links to other network resources grouped by topic. The process of adding your site's information to a directory is called submission.

   Such directories can be paid or free, and they may or may not require a backlink from your website. The number of visitors to these directories is not large, so they will not send significant traffic to your site. However, search engines count the links from these directories and this may improve your pages' placement in search results.

   Important! Only directories that publish a direct link to your website are valuable from an SEO point of view. Script-driven directories are almost useless. This point deserves a more detailed explanation. There are two methods of publishing a link. A direct link is published as a standard HTML construction (<a href="...">, etc.). Alternatively, links can be created with the help of various scripts, redirects and so on. Search engines only understand links that are specified directly in the HTML code. That is why the SEO value of a directory that does not publish a direct link to your site is close to zero.

   You do not need to submit your site to FFA (free-for-all) directories. Such directories automatically publish links on any topic and are ignored by search engines. The only thing an FFA directory entry will give you is an increase in spam sent to your published e-mail address. In fact, this is the main purpose of FFA directories.

   Be wary of promises from various programs and SEO services to submit your resource to hundreds of thousands of search engines and directories. There are no more than a hundred or so genuinely useful directories on the Net - this is the number to take seriously, and professional SEO submission services work with this number of directories. If an SEO service promises submission to huge numbers of resources, it simply means that its submission database consists mainly of FFA archives and other useless resources.

   Give priority to manual or semi-automatic submission - do not rely completely on automatic processes. Submitting sites under human control is generally much more effective than fully automatic submission. The value of submitting a site to paid directories, or of publishing a backlink in return, should be considered individually for each directory. In most cases it does not make much sense, but there may be exceptions.

   Submitting a site to directories rarely has a dramatic effect on site traffic, but it slightly increases the visibility of your site to search engines. This useful SEO option is available to everyone and does not require much time or expense, so do not overlook it when promoting your project.

      The DMOZ directory
 
The DMOZ directory (www.dmoz.org), or Open Directory Project, is the largest directory on the Internet. There are many copies of the main DMOZ site, so if you submit your site to the DMOZ directory, you will get a valuable link from the directory itself, as well as dozens of additional links from related resources. This means that the DMOZ directory is of great value to an SEO-aware webmaster.

   It's not easy to get your site into the DMOZ directory; there is an element of luck involved. Your site may appear in the directory a few minutes after it has been submitted, or it may take months.

   If you have submitted accurate details of your website to the appropriate category, it should eventually appear. If it does not appear after a reasonable time, you can try contacting the editor of your category with a question about your submission (the DMOZ site gives you that opportunity). Of course, there are no guarantees, but it can help. DMOZ directory submission is free of charge for all sites, including commercial ones.

   Here are my final recommendations on DMOZ submissions. Read all the site requirements, descriptions, etc. to avoid breaking the submission rules; such a violation would likely result in a refusal to consider your request. Please remember that presence in the DMOZ directory is desirable, but not mandatory. Do not despair if you do not get into this directory. It is possible to achieve high positions in search results without it - many sites do.

Link exchange

   The essence of link exchanges is that you use a separate page to publish links to other sites and get similar backlinks from them. Search engines do not like link exchanges because, in many cases, they distort search results and do not offer anything useful for Internet users. However, it is still an effective way to increase link popularity if you observe a few simple rules.

   - Exchange links with sites that are related to your topic. Exchanging links with unrelated sites is ineffective and unpopular.

   - Before the exchange, make sure that your link will be published on a "good" page. This means that the page should have a reasonable PageRank (3-4 or higher is recommended), it should be available for indexing by search engines, the link must be direct, the total number of links on the page must not exceed 50, and so on.

   - Do not create large link directories on your own site. The idea of such a directory seems attractive because it gives you an opportunity to exchange links with many sites on various topics - you would have a topic category for each listed site. However, when trying to optimize your website you are looking for link quality rather than quantity, and such a directory has potential pitfalls. No SEO-aware webmaster will publish a quality link to you if all they receive in return is a worthless link from your link "farm". In general, the PageRank of pages in such directories leaves a lot to be desired, and search engines do not like these directories at all. There have even been instances of sites being banned for using them.

   - Use a separate page on your website for link exchanges. It should have a reasonable PageRank, it should be indexed by search engines, etc. Do not publish more than 50 links on one page (otherwise search engines may fail to take some of them into account). This will also help you to find other SEO-aware link exchange partners.

   - Search engines try to detect mutual link exchanges. This is why you should, if possible, publish backlinks on a domain/site other than the one you are trying to promote. The best variant is when you promote the resource site1.com and publish backlinks on the resource site2.com.

    - Exchange links with caution. Webmasters who are not honest enough will often remove your links from their resources after a while. Check your backlinks from time to time.

      Press releases, RSS, thematic resources
 
This section is about internet marketing rather than pure SEO. There are many news sources and news feeds that publish press releases and news on various topics. Such sites can supply you with direct visitors and increase your site's popularity. If you do not find it easy to create a press release or a piece of news, hire copywriters - they will help you find or create something newsworthy.

   Look for resources that deal with topics related to your site. You can find many Internet projects that are not in direct competition with you but share the same topic as your website. Try to approach the site owners; it is quite likely that they will be willing to publish information about your project.

   One last tip for getting inbound links - try to create small variations in inbound link text. If all inbound links to your site are exactly the same text links and there are many of them, the search engines can flag it as a spam attempt and penalize your site.

Indexing a site

   Before a site can appear in search results, a search engine must index it. Indexed pages have been visited and analyzed by a robot, with the relevant information stored in the search engine's database. If a page is present in the search engine index, it can be displayed in search results; otherwise, the search engine knows nothing about it and cannot display information from that page.

   Average sized sites (with dozens to hundreds of pages) are usually indexed correctly by search engines. However, you should remember the following points when building your website. There are two ways to allow a search engine to learn about a new site:

   - Submit the site address manually using a form on the search engine, if available. In this case, you are the one who informs the search engine about the new site, and its address is queued for indexing. Only the main page of the site needs to be added; the search robot will find the rest of the pages by following links.

   - Let the search robot find the site on its own. If there is at least one inbound link to your resource from other indexed resources, the search robot will soon visit and index your site. In most cases, this method is recommended: get some inbound links to your site and just wait until the robot visits it. This can actually be faster than submitting the site manually. Indexing a site typically takes from a few days up to two weeks, depending on the search engine. Google is among the fastest.

   Try to make your website friendly to search robots by following these rules:

   - Try to make any page of your site accessible from the home page in no more than three mouse clicks. If the structure of the site does not allow this, create a so-called site map that allows this rule to be observed.

   - Do not make the common mistakes described earlier. Session identifiers make indexing more difficult. If you use script navigation, make sure you duplicate those links with regular ones, because search engines cannot read scripts (see the section on common SEO mistakes for more details about these and other errors).

   - Remember that search engines index no more than the first 100-200 KB of text on a page. Hence the following rule: do not use pages with more than 100 KB of text if you want them fully indexed.

   You can manage the behavior of search robots using the robots.txt file. This file allows you to explicitly allow or prohibit the indexing of specific pages on your site.
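   As a simple illustration, a robots.txt file placed in the site root might look like this; the directory names are hypothetical:

   # applies to all robots
   User-agent: *
   # do not index these directories
   Disallow: /admin/
   Disallow: /tmp/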

   Search engine databases are constantly being updated; the data in them can change, disappear and reappear. This is why the number of indexed pages on your site may vary from time to time. One of the most common reasons for a page disappearing from the index is server unavailability, meaning the robot could not access the site when trying to index it. After the server is restored, the site should eventually reappear in the index.

   You should note that the more inbound links your site has, the faster it gets re-indexed. You can track the indexing of your site by analyzing server log files, where all visits by search robots are logged. We will give details of SEO software that allows you to track such visits in a subsequent section.

  Choosing keywords

     Choosing initial keywords
 
Choosing keywords should be your first step when building a website. You should have your keyword list ready to include in the text of your site before you start composing it. To determine your website's keywords, you should first use the services offered by the search engines themselves. Sites such as www.wordtracker.com and inventory.overture.com are good starting places for English-language sites. Note that the data these services provide can sometimes differ significantly from what are actually the best keywords for your website. You should also note that Google gives no information about the frequency of search queries.

   Once you have defined your approximate initial list of keywords, you can analyze your competitors' websites and try to find out what keywords they are using. You may find some other relevant keywords that are appropriate for your site.

     Common and rare keywords
 
There are two distinct strategies - optimizing for a small number of highly popular keywords, or optimizing for a large number of less common phrases. In practice, the two are often combined.

   The disadvantage of keywords that attract frequent queries is that competition for them is high. It is often impossible for a new site to get anywhere near the top of the search result listings for these queries.

   For keywords associated with rare queries, it is often enough simply to mention the required word combination on a web page, or to perform minimal text optimization. Under certain circumstances, rare queries can supply a fairly large amount of search traffic.

   The goal of most commercial sites is to sell a product or service, or to make money in some other way from their visitors. This should be kept in mind during your SEO work and keyword selection. If you are optimizing a commercial website, you should try to attract targeted visitors (those who are willing to pay for the product or service offered) rather than simply a large number of visitors.

   An example: the query "monitor" is much more popular and competitive than the query "Samsung 710N monitor" (the exact name of the model). However, the second query is much more valuable to a monitor seller. It is also easier to get traffic from it, because its level of competition is low - far fewer sites belong to vendors selling Samsung 710N monitors. This example highlights another potential difference between rare and frequent search queries that should be taken into account: rare queries may bring fewer visitors overall, but more targeted ones.

     Assessing the level of competition for search queries
 
When you have completed your keyword list, you must identify the main keywords for which you will optimize your pages. A suggested technique for this is given below.

   Rare queries can be set aside for the moment. In the previous section, we described how such rare queries can be useful, but they do not require special optimization - they are likely to occur naturally in the text of your website.

   As a rule, the level of competition for the most popular phrases is very high. This is why you need to get a realistic idea of your site's competitiveness. To assess the level of competition, you should evaluate a number of parameters for the first 10 pages that appear in the search results:
   - Average PageRank of pages in the search results.
   - The average number of links to these sites. Check this by using a variety of search engines.
   Additional parameters:
   - The number of pages on the Internet that contain the search term, i.e. the total number of search results for that term.
   - The number of pages containing the exact keyword phrase. Search for the phrase enclosed in quotation marks to get this number.

   These additional parameters allow you to indirectly assess how difficult it would be to get your website to the top of the list for a particular phrase. As well as the parameters described, you can also check how many of the sites in the search results are present in the main directories, such as DMOZ and Yahoo.

   Analyzing the above parameters and comparing them with those of your own site will allow you to predict with reasonable certainty your chances of getting your website to the top of the list for a particular phrase.

   Having assessed the level of competition for all your keyword phrases, you can now choose a moderate number of popular key phrases with an acceptable level of competition, which you can use to promote and optimize your site.

    Refining your keyword phrases
 
As mentioned above, keyword services often provide inaccurate data, so it is rare to arrive at an optimal set of keywords for your site on the first attempt. Once your site is up and running and you have done some initial promotion, you can obtain additional keyword statistics, which will allow some fine-tuning. For example, you will be able to see your site's position in the search results for certain phrases, and you will know the number of visits to your site for those phrases.

   With this information, you can clearly identify good and bad keyword phrases. Often there is no need to wait until your site gets near the top of all search engines for the phrases you are assessing - one or two search engines are enough.

   An example: suppose your website ranks first in Yahoo for a particular phrase, while the page does not yet appear in the MSN or Google search results for that phrase. However, if you know the percentage of visits to your website from the various search engines (e.g. Google 70%, Yahoo 20%, MSN Search 10%), you can predict the approximate amount of traffic for this phrase from those other search engines and decide whether the phrase is suitable.

   As well as detecting bad phrases, you can find some good new ones. For example, you may see that a keyword phrase you did not optimize your site for is bringing useful traffic, even though your site is only on the second or third page of search results for this phrase.

   Using these methods, you will arrive at a new, refined set of keyword phrases. You then need to start reworking your site: change the text to include more of the best phrases, create new pages for new phrases, etc.

   You can repeat this SEO exercise several times and, after a while, you will have an optimal set of key phrases for your site and a significantly increased amount of search traffic.
   Here are some more tips. According to statistics, the main page accounts for up to 30-50% of all search traffic. It has the highest visibility in the search engines and the largest number of inbound links. This is why you should optimize the main page of your site for the most popular and competitive queries. Each other page should be optimized for one or two key word combinations, and possibly for a number of rare queries. This will increase the chances of getting to the top of the search engine lists for those particular phrases.

Miscellaneous information on search engines

    Google sandbox
 
In early 2004, a new and mysterious term appeared among SEO specialists - the Google sandbox. This is the name given to a new Google spam filter that excludes new sites from search results. The work of the sandbox filter results in new sites being absent from search results for almost every phrase. This happens even to sites with high-quality, unique content that are promoted using legitimate techniques.

   The sandbox is currently applied only to the English segment of the Internet; sites in other languages are not yet affected by this filter. However, the filter may expand its influence. It is assumed that the purpose of the sandbox filter is to exclude spam sites - indeed, no search spammer can afford to wait for months to get results. However, many perfectly valid new sites suffer the consequences. So far, there is no accurate information about what the sandbox filter actually is. Here are some guesses based on practical SEO experience:

   - The sandbox is a filter applied to new sites. A new site is placed in the sandbox and kept there for some time until the search engine starts treating it as a normal site.

   - The sandbox is a filter applied to new inbound links to new sites. There is a fundamental difference between this and the previous assumption: the filter is based not on the age of the site, but on the age of the inbound links to the site. In other words, Google treats the site normally, but it refuses to recognize inbound links to it unless they have existed for a few months. Since inbound links are one of the main ranking factors, ignoring them is equivalent to the site being absent from search results. It is hard to say which of these assumptions is true; it is quite possible that both are.

   - A site can be held in the sandbox for anything from 3 months to a year or more. It has also been noticed that sites are released from the sandbox in batches. This means that the time sites are kept in the sandbox is not calculated individually for each site, but for groups of sites. All sites created within a certain period of time are put into the same group, and they are eventually all released at the same time. Thus, individual sites in a group can spend different amounts of time in the sandbox, depending on where in the group's capture-release cycle they were added.

   Typical signs that your site is in the sandbox include:

   - Your site is indexed by Google and its robot visits it regularly.
   - Your site has a PageRank; the search engine knows about and correctly displays the inbound links to your site.
   - A search for the site address (www.site.com) displays correct results, with the correct title, snippet (resource description), etc.
   - Your site is found by searches for rare and unique word combinations present in the text of its pages.
   - Your site is not displayed in the first thousand results for any other queries, even for those it was originally created for. Sometimes there are exceptions and the site appears in positions 500-600 for some queries, but this does not change the situation, of course.

   There is no practical way to bypass the sandbox filter. There have been some suggestions about how it might be done, but they are no more than suggestions and are of little use to a regular webmaster. The best course of action is to continue working on the site's content, structure and SEO, and wait patiently until the sandbox period ends, after which you can expect a dramatic increase in rankings, often by 400-500 positions.

    Google LocalRank
 
On 25 February 2003, Google patented a new site ranking algorithm called LocalRank. It is based on the idea that sites should be ranked not by their global link citations, but by how they are cited among sites that deal with topics related to the specific query. The LocalRank algorithm is not used in practice (at least, not in the form described in the patent). However, the patent contains several interesting ideas that we think any SEO specialist should know about. Almost all search engines now take into account the topics of the pages that link to a given page. The algorithms actually used are likely rather different from LocalRank, but studying the patent lets us learn the general ideas of how such an approach can be implemented.

   While reading this section, please keep in mind that it contains theoretical information rather than practical guidelines.

   These three steps constitute the main idea of the LocalRank algorithm:

   1. An algorithm is used to select a certain number of documents relevant to the search query (call this number N). These documents are initially sorted by some criterion (this can be PageRank, relevance, or a combination of other criteria). Let us call the numerical value of this criterion OldScore.

   2. Each of the N selected pages goes through a new ranking procedure and receives a new score. Let us call it LocalScore.

   3. The OldScore and LocalScore values for each page are multiplied to give a new value, NewScore. Pages are finally ranked according to NewScore.

   The key element of this algorithm is the new ranking procedure, which gives every page a new LocalScore. Let us examine this procedure in more detail:

   0. An initial ranking algorithm is used to select the N pages relevant to the search query. Each of the N pages is assigned an OldScore value by this algorithm. The new ranking algorithm only needs to work on these N selected pages.

   1. When calculating LocalScore for a given page, the system selects from the N pages those that contain links to that page. Let this number be M. Pages from the same host (as determined by IP address) and pages that are mirrors of the given page are excluded from M.

   2. The group of M pages is divided into subsets Li. These subsets contain pages grouped according to the following criteria:
   - Pages belonging to the same (or similar) hosts. Pages whose IP addresses share the same first three octets fall into one group; that is, pages whose IP addresses lie in the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 are deemed to belong to the same group.
   - Pages that have the same or similar content (mirrors).
   - Pages on the same site (domain).

   3. Each page in each subset Li has its OldScore. The page with the highest OldScore is taken from each subset, and the rest of the pages are excluded from the analysis. This gives us a set of K pages linking to the page being analyzed.

   4. The K pages are sorted by OldScore, and only the top k pages (where k is a predetermined number) are kept; the rest are excluded from the analysis.

   5. LocalScore is calculated in this step by combining the OldScore values of the remaining k pages.

   The formula itself is not reproduced in this text; it involves a predetermined parameter m that can vary from one to three. Unfortunately, the patent does not describe this parameter in any more detail.
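   Based only on the description above (the OldScore values of the k remaining pages combined under the parameter m), a plausible sketch of the formula, written in the same plain style as the NewScore formula below, would be:

   LocalScore(d) = OldScore(1)^m + OldScore(2)^m + ... + OldScore(k)^m

   where d is the page being analyzed. This is an inference from the text, not the patent's exact formula.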

   After LocalScore is calculated for every page in the N set, NewScore values are calculated and the pages are re-ranked according to this new criterion. The following formula is used to calculate NewScore:

   NewScore(i) = (a + LocalScore(i) / MaxLS) * (b + OldScore(i) / MaxOS)

   i - the page for which the new score is calculated.

   a and b - numerical constants (the patent gives no detailed information about these parameters).

   MaxLS - the maximum calculated LocalScore value.

   MaxOS - the maximum OldScore value.
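   As a quick illustration, assuming for the sake of the example that a = b = 1 (the patent does not give their values): the page with both the maximum LocalScore and the maximum OldScore gets NewScore = (1 + 1) * (1 + 1) = 4, while a page whose LocalScore and OldScore are negligible gets a NewScore close to 1 * 1 = 1.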

   Now let us put the math aside and explain these steps in plain terms.

   In step 0), the pages relevant to the query are selected. Algorithms that do not take the topical context of links into account are used for this, for example relevance and overall link popularity. We now have a set of OldScore values: OldScore is an assessment of each page based on relevance, overall link popularity and other factors.

   In step 1), inbound links to the page of interest are selected from the group obtained in step 0). The group is whittled down by removing mirrors and other duplicates in steps 2), 3) and 4), so that we are left with a set of truly unique pages that all share a common theme with the page under analysis. By analyzing inbound links from pages in this group (and ignoring all other pages on the Internet), we obtain the local (thematic) link popularity.

   LocalScore values are then calculated in step 5). LocalScore is an assessment of a particular page among the pages related to the topic. Finally, the pages are scored and sorted using a combination of LocalScore and OldScore.

    SEO tips, assumptions, observations
 
This section provides information based on an analysis of various SEO articles, discussions with optimization specialists, practical experience and so on. It is a collection of interesting ideas, useful tips and assumptions. Do not treat this section as if it were written in stone, but as information and suggestions for your consideration.

   - Outbound links. Publish links to authoritative sources in your subject area using appropriate keywords. Search engines place a high value on links to other resources on the same theme.

   - Outbound links. Do not publish links to FFA (free-for-all) pages and other sites excluded from search engine indexes. Doing so may reduce your site's rating.

   - Outbound links. A page should not contain more than 50-100 outbound links. Exceeding this number will not hurt your site's rating, but links beyond it may not be recognized by search engines.

   - Site-wide inbound links. These are links placed on every page of a site. It is believed that search engines do not approve of such links and do not take them into account when ranking pages. Another view is that this is only true for large sites with thousands of pages.

   - Ideal keyword density is a common theme of SEO discussion. The real answer is that there is no single ideal keyword density: it is different for each query, and search engines effectively evaluate it anew for each query. Our advice is to analyze the top pages in the search results for a particular query; this will allow you to estimate the approximate optimal density for that query (a simple definition of keyword density is given after this list).

   - Site age. Search engines give preference to older sites, as they are considered more stable and trustworthy.

   - Site updates. Search engines prefer sites that are constantly developing. Developing sites are those where new information and new pages appear periodically.

   - Domain zone. Search engines give preference to sites located in the .edu, .mil, .gov and similar zones. Only the relevant organizations can register such domains, so they are considered more trustworthy.

   - Search engines track the percentage of visitors who immediately return to the search results after visiting a site through a search result link. A large number of immediate returns suggests that the content is probably not relevant to the topic, and the ranking of such a page is lowered.

   - Search engines track how often a link is clicked in the search results. If a link is only rarely clicked, it suggests that the site is of little interest, and the rating of such a page is lowered.

   - Use synonyms and derived forms of your keywords; search engines appreciate this (word stemming).

   - Search engines may treat a very rapid increase in inbound links as artificial promotion, which results in a lowered rating. This is a controversial topic, because such a method could also be used to lower a competitor's rating.

   - Google may ignore inbound links if they come from the same (or similar) hosts. Hosts are identified by IP address: pages whose IP addresses fall within the range xxx.xxx.xxx.0 to xxx.xxx.xxx.255 are considered to be on the same host. This view most likely stems from the fact that Google has expressed this idea in its patents. However, Google employees claim that no IP address restrictions are placed on inbound links, and there is no reason not to believe them.

   - Search engines check domain ownership information. Inbound links originating from several sites that all belong to the same owner are considered less valuable than normal links. This idea is also described in a patent.

   - Search engines prefer sites with long-term domain registrations.
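   For reference, keyword density, which is referred to throughout this document and in the reports described later, is conventionally calculated as:

   keyword density = (number of occurrences of the key phrase / total number of words on the page) * 100%

   For example, a 500-word page that uses a phrase 10 times has a density of 2% for that phrase.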

Creating correct content

   The content of a site plays an important role in its promotion, for many reasons. We will describe some of them in this section and give you some tips on how to populate your site with good content.

   - Unique content. Search engines value new information that has not been published before. This is why you should write your own site text rather than plagiarize. A site built from material taken from other sites is much less likely to reach the top of search results. As a rule, the original source always ranks higher.

   - While creating a site, remember that it is created primarily for human visitors, not search engines. Getting visitors to your website is only the first step, and it is the easy one. The really difficult task is to make them stay on the site and convert them into buyers. You can only do this with good content that is interesting to real people.

   - Try to update the information on the site and add new pages on a regular basis. Search engines value sites that are constantly developing. Also, the more useful text your site contains, the more visitors it attracts. Write articles on the topic of your site, publish visitor feedback, create a forum to discuss your project. A forum is only useful, however, if there are enough visitors to keep it active. Interesting and attractive content ensures that the site will attract interested visitors.

   - A site created for people rather than search engines has a better chance of being accepted into relevant directories such as DMOZ and others.

   - An interesting site on a particular topic has a much better chance of getting links, comments, reviews, etc. from other sites on that topic. Such reviews can bring you a good flow of visitors, while inbound links from such sources will be highly valued by search engines.

   - As a final tip ... there is an old German proverb: "A shoemaker should stick to his last," which means, "Do what you do best." If you can write breathtaking, creative text for your website, that is great. Most of us, however, have no special talent for writing attractive text and have to rely on professionals such as journalists and technical writers. Obviously this is an extra expense, but it is justified in the long run.

    Selecting a domain and hosting
 
Currently, anyone can create a website without incurring any expense. There are many companies that offer free hosting services and will publish your website in exchange for the right to display ads on it, and many Internet service providers will let you publish your website on their servers if you are their customer. However, all these options have serious drawbacks that you should consider carefully if you are creating a commercial project.

   Firstly, and most importantly, you need your own domain, for the following reasons:

   - A project that does not have its own domain is seen as a temporary project. Indeed, why should anyone trust a resource whose owners are not prepared to invest even the small amount required to create a minimal corporate image? It is possible to publish material on free hosting or on ISP-based hosting, but any attempt to create a commercial project without your own domain is doomed to fail.

   - Your own domain allows you to choose your hosting provider. If necessary, you can move your site to another hosting provider at any time.

    Here are some helpful tips for choosing a domain name.

   - Try to make it easy to remember and make sure there is only one way to pronounce and spell it.

   - .com domains are the best choice for promoting international English-language projects. Domains in the .net, .org, .biz and similar zones are available, but less preferable.

   - If you want to promote a site with a national focus, use a domain from the corresponding national zone: .de for German sites, .it for Italian sites, and so on.

   - For sites that use two or more languages, you should assign a separate domain to each language. National search engines appreciate this approach more than subsections for different languages hosted on a single site.

   A domain costs $10-20 per year, depending on the registrar and zone.

   You need to take these factors into consideration when choosing a hosting provider:

   - Access bandwidth.
   - Server uptime.
   - Cost per gigabyte of traffic and the amount of prepaid traffic.
   - Ideally, the server should be located in the same geographic region as the majority of your expected visitors.

   The cost of hosting services for small projects is about $5-10 per month.

   Avoid "free" offers when choosing a domain and a hosting provider. Hosting providers sometimes offer free domains to their customers. Such domains are often registered not to you but to the hosting company, which then owns the domain. This means that you will not be able to change hosting services for your project, or you may even be forced to buy your own domain back at a premium price. Also, it is better not to register your domains through your hosting company: this can make moving your site to another hosting company more difficult, even if you are the owner of the domain.

Changing the site address
   You may need to change the address of your project. Perhaps it started on a free hosting service and has developed into a serious commercial project that must have its own domain. Or maybe the owner has simply found a better name for the project. In any case, moving a project to a new address is a difficult and unpleasant task: for a start, you will have to promote the new address practically from scratch. However, if the move is inevitable, you might as well make it as beneficial as possible.

   Our advice is to create the new website at the new location with new, unique content. Place prominent links to the new resource on the old site so that visitors can easily find your new pages, but do not completely remove the old site and its content.

   This approach will allow you to get search engine visitors to both the old and the new website. At the same time, you get an opportunity to cover additional topics and keywords, which could be harder to do within a single resource.

SEO software review

   In previous chapters, we explained how to create your website and what methods are available to promote it. This last section is dedicated to SEO software tools that can automate much of the SEO work for your website and help you achieve even better results. We will discuss the SEO Administrator software suite, which you can download from our site (www.seoadministrator.com).

    Ranking Monitor
 
Every SEO specialist faces the task of regularly checking his site's positions in search engines. You can check these positions manually, but if you have dozens of keywords and 5-7 search engines to monitor, the process becomes a real chore.

   The Ranking Monitor module does all of this automatically. You can view information on your website's rankings for each keyword in a variety of search engines. You will see the dynamics and history of your website's positions, as well as upward and downward trends for your specified keywords. The same information is also displayed in visual form.

    Link popularity Checker
 
This module automatically queries all available search engines and compiles a complete list of inbound links to your resource, with duplicates removed. For each link, you will see important parameters such as the link text and the PageRank of the referring page. If you have studied this document, you will know how important these parameters are. As well as seeing the overall list of inbound links, you can track how your inbound links change over time.

    Indexing Site Tool
 
This handy tool shows you all the pages of your site indexed by a particular search engine. It is a must-have for anyone creating a new website. The PageRank value of each indexed page is also displayed.

    Log Analyzer
 
All information about your visitors is stored in the log files on your server. The Log Analyzer module presents this information in convenient, visual reports. The information displayed includes:
   - Referring sites,
   - Keywords used,
   - The countries visitors come from,
   - And much more...

    Page Rank Analyzer
 
This utility collects a large amount of competitive information about a list of sites that you specify. For each site, it automatically determines parameters such as Google PageRank, the number of inbound links, and whether the site is listed in the DMOZ and Yahoo directories. It is an ideal tool for analyzing the level of competition for a particular query.

    Keyword Suggestion Tool
 
This tool collects keywords relevant to your website and shows their popularity (the number of queries per month). It also assesses the degree of competition for a particular keyword phrase.

    HTML analyzer
 
This application analyzes the HTML code of a page. It calculates keyword weight and density and creates a report on how well the site text is optimized. It is useful when creating your own website and is also a great tool for analyzing your competitors' websites. It can analyze both local HTML pages and online projects.

Instead of a conclusion: Promote your website step by step

   In this section, I will explain how I use SEO to promote my own sites. This is a kind of systematic review in which I briefly recap the previous sections. Of course, I use the SEO Administrator software extensively in my work, so I will also describe how I use it at each step.

   To start working on a site, you must have some basic knowledge of SEO. This can be acquired quite quickly; the information presented in this document is entirely sufficient, and you do not have to be an optimization guru to achieve results. Once you have this basic knowledge, you can start the work, experiment, and climb toward the top of the search listings. This is where SEO software tools are useful.

   1. First, we create a rough list of keywords and check their level of competition. We then evaluate our chances against the competition and choose phrases that are popular enough but have only average competition. Keywords are selected using the Keyword Suggestion tool, which is also used to perform an approximate check of their competition level. We use the PageRank Analyzer module to perform a detailed analysis of the search results for the most interesting queries and then make our final decision about which keywords to use.

   2. Next, we start composing the text of the website. I write part of it myself, but I entrust the most important parts to technical writing specialists. In fact, I think the quality and attractiveness of the text is the most important attribute of a page. If the textual content is good, it will be easier to get inbound links and visitors.

   3. In this step, we use the HTML Analyzer module to achieve the appropriate keyword density. Each page is optimized for its own keyword phrase.

   4. We submit the site to various web directories. There are many services that take care of that chore for us, and SEO Administrator will soon have a feature to automate the task.

   5. After these initial steps are completed, we wait and check the search engine indexes to make sure the various search engines are processing the site.

   6. In this step, we begin monitoring the site's positions for our keywords. These positions are not likely to be good at this early stage, but they give us useful information for fine-tuning the SEO work.

   7. We use the Link Popularity Checker module to track link popularity and work on increasing it.

   8. We use the Log Analyzer module to analyze visitor numbers and work on increasing them. We also periodically repeat steps 6) - 8).





SEO Services - SEO start-up package

Our Start-up Package is a set of SEO services for those who want to optimize and promote their sites themselves. This comprehensive package provides you with all the information you need to get the best possible search engine placement for your web pages.

Our Start-up Package consists of three SEO services:

1. Website SEO audit and practical recommendations.
We will thoroughly check and analyze your website. We will then give you clear, practical SEO recommendations on what changes you can make to improve your site's ranking. This service assesses:
- Keyword density in your site text.
- Correct use of certain HTML tags.
- Site link structure.
- Various other important SEO details.
The package also includes the following additional SEO reports:
- The current positions of your site in various search engines.
- How well your site is indexed by the major search engines.
- External links to your website, including the location of the link and the link text.
- Suggested keywords, with competition levels, for promoting your website.
- Current word and key phrase density within your website.
These reports are accompanied by comments to ensure that you get the maximum SEO benefit from them.

2. SEO promotional resources.
For this SEO service, we use special analysis software to provide you with a comprehensive list of promotional resources where you can publish links to your site. You will receive the following reports:
- Pages with link exchange submission forms. These sites allow you to quickly set up link exchanges and increase your link popularity, which is one of the main factors contributing to high search engine rankings.

- Sites with links/partners/resources pages related to the main themes of your website. This part of the SEO service provides another valuable list of potential link exchange partners.

- A report on sites that link to your competitors. If they are interested in your competitors' websites, they are likely to be interested in your website too and may be willing to publish a link.

- An overall report on sites that deal with your topic. These sites deal with themes similar to that of your website and may be willing to publish information about your project. This report includes web directories, news and information sources, blogs, forums, etc.
These SEO reports can list thousands of sites that could, in theory, publish links to your site. Checking them and actually getting the links is a big job, but the result will be well worth your effort!

Seo for Wordpress


1. Install WordPress
Choose an intelligent domain name which contains your keyword(s), if possible.
Choose the right TLD for you.

2. robots.txt
Create a file and name it "robots.txt".
Put it in the root directory of your domain.
To allow the bots of all search engines to crawl all of your content, insert #1 in your robots.txt.
Be careful with the use of robots.txt: you can harm your blog by using it the wrong way, so read a good tutorial first.
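The #1 snippet is not reproduced in this document. Purely as an illustration, a conventional allow-all robots.txt (with a hypothetical sitemap URL) looks like this:

   User-agent: *
   Disallow:

   Sitemap: http://www.example.com/sitemap.xml

An empty Disallow line permits all compliant crawlers to fetch everything; the optional Sitemap line simply tells them where your XML sitemap lives.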

3. .htaccess
Create a file and name it ".htaccess".
Put it in the root directory of your domain.
To set your default URL with www, edit your .htaccess and insert #2.
WordPress will also use .htaccess to create speaking (human-readable) URLs.
To use .htaccess, your Apache web server has to support mod_rewrite.
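The #2 snippet is likewise not reproduced here. As a sketch only, assuming Apache with mod_rewrite enabled and the hypothetical domain example.com, a typical www canonicalization rule looks like this:

   RewriteEngine On
   RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
   RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The R=301 flag issues a permanent redirect, so search engines consolidate the non-www and www versions of every URL.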

4. URL structure for posts
Optimize your URL structure by using the permalink settings of WordPress.
Don't include the category in your post URLs; that way, if you change the category of a post later, WordPress will not create a new URL.
By including a four-digit number, your post URLs are optimized for Google News (in case you want to get in there).
Don't include too many folders.
For example, you can set your post URL by using #3.
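The #3 permalink structure is not reproduced here. As an illustration only, a custom permalink structure consistent with the advice above (no category folder, few folders, and a four-digit number supplied by the year) would be:

   /%year%/%postname%/

%year% and %postname% are standard WordPress structure tags that can be entered in Settings > Permalinks.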

5. Choose your theme
You can find many themes; here are some great resources:
Theme Viewer
Theme Designer
Candy College
Blonggounsut
Wordpress Themes
Noupe
Magazine Theme
Hackwordpress
Dr. Web
Download some themes you like, but also inspect the source code of the theme: make sure that the source code is well structured.


6. Optimize theme: Header
Optimize your title; choose an ideal title for all kinds of pages of your blog: #4
If necessary, do the same with your descriptions.
Don't let search engines index all of your pages, to avoid duplicate content: #3
Add some language information: #6
Specify the location of your robots.txt: #7
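Snippets #4-#7 are not reproduced in this document. As a purely illustrative sketch of the kind of header markup this step refers to (all tag contents are hypothetical), a blog header could include:

   <html lang="en">                                   <!-- language information -->
   <head>
   <title>Post title | Blog name</title>
   <meta name="description" content="A short, unique summary of this page." />
   <meta name="robots" content="noindex,follow" />    <!-- e.g. on archive pages, to avoid duplicate content -->
   </head>

The exact tags and values depend on your theme and on which pages you want indexed.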


7. Optimize theme: Body
Remove the link from your post headlines.
Use headlines to mark text passages as important, not to style your design.
Also remove the <h2> headlines from your sidebar.
Use PageRank sculpting carefully.
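As an illustrative sketch of the first point (the markup is hypothetical): on a single-post page the headline does not need to link to itself, so

   <h2><a href="http://www.example.com/2013/post-slug/">Post title</a></h2>

can simply become

   <h2>Post title</h2>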

8. Must-have plugins
Add Link Attribute - for doing PageRank sculpting the easy way.
Custom Query String Reloaded - to change the number of posts displayed in your archives, categories...
Dagon Design Sitemap Generator - to create an HTML sitemap for better internal link building.
Google XML Sitemaps - to create XML sitemaps.
Pagebar 2 - to optimize the navigation in your blog.
Similar Posts - to optimize your internal link structure.
Sociable - includes buttons for social bookmarking of the pages in your posts.
wpSEO - to optimize title, description & keywords easily.

9. User tracking
Install a user tracking system to analyse your traffic.

10. Adding a sitemap
Add your sitemap in Google Webmaster Tools.

11. Archives
Create a monthly & weekly archive.