Six essential components of using Google Webmaster Tools Service for your website promotion


Hello, dear reader!

In the previous article on promoting websites in search engines — «Seo — getting acquainted with our site closer» — we discussed why it is important and necessary to add your web resource to Google's Webmaster Tools service. In today's article we will focus on what useful features can be found in this service and what to do after adding your site.

So, you have successfully passed verification and added your site to Webmaster Tools. We will briefly touch on six key points that you should understand.
When you open your site's profile, you see a summary of its status — Current Status — which includes «Crawl Errors», «Search Queries» and «Sitemaps». Additionally, the Dashboard toolbar on the left gives access to all the necessary information through its menu sections and subsections.

1. Crawl Errors

The first tab of the Current Status panel — Crawl Errors — is very important, so let's go straight to it and examine it.

Crawl errors screenshot

A chart like this displays the URLs that return errors — pages that give a 404 server response. The fewer of them your site has, the better. A 404 error means that the requested resource cannot currently be found. This can happen when a page's URL has changed and the old address is no longer available, when a link to the page is broken, and so on. More details about 404 errors and ways to eliminate them can be found on the Internet, for example here —
http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools

Of course, if the number of 404 errors on the site is not large (not measured in many hundreds or thousands) and the 404 page shown to the user is informative and correctly set up, such errors will hardly have a significant impact on search engine rankings. However, it is important to understand what causes these errors, so you can prevent them from appearing on your site without reason and fix them correctly and in good time.
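
If an old address has simply moved, one common fix is a permanent 301 redirect from the broken URL to the new location. Below is a minimal sketch, assuming your server runs Apache and supports .htaccess; the file names are only examples, not real pages:

    # .htaccess: permanently redirect the old, broken URL to the page's new address
    Redirect 301 /old-page.html http://www.yoursite.com/new-page.html

After the redirect is in place, both visitors and search robots requesting the old address are sent to the new page instead of a 404 error.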

2. Sitemaps

Another important point is the sitemap file. Go to the «Optimization» tab of the left navigation menu and click on «Sitemaps». You will see a special page with information about the sitemap loaded from your site. For our website the page looks like this.

Sitemaps page screenshot
You can see the number of submitted and indexed pages and the date the document was last crawled. The most important thing is to make sure that Google sees your sitemap and reads it properly!

If you have not yet created a sitemap of your website in XML format, you can easily do so with the service http://www.xml-sitemaps.com/ by following its simple instructions. The only minor disadvantage of this service is that the generated sitemap is limited to five hundred pages. If your website has more than 500 pages, you can use the unlimited version for just $19.99.

To upload the file to your site (the root directory), you need any FTP client and valid credentials for accessing your resource over FTP. If the file has been placed on the site correctly, it should open at the following URL — http://www.yoursite.com/sitemap.xml.
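
For reference, a sitemap in XML format is simply a list of your pages' URLs in a small XML wrapper. The addresses and dates below are only an example of what such a file might contain:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.yoursite.com/</loc>
        <lastmod>2013-01-15</lastmod>
      </url>
      <url>
        <loc>http://www.yoursite.com/about.html</loc>
        <lastmod>2013-01-10</lastmod>
      </url>
    </urlset>

Each <url> entry holds the address of one page; the optional <lastmod> date tells the crawler when that page was last changed.
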
Here we should briefly touch on what a sitemap is needed for and why it is so important to search engines when it comes to a good assessment of your resource.

Put simply, search engines want to find and view the pages of a site quickly, without wasting time on hard-to-reach, deeply nested pages or on various technical errors. For example, a page reached only through a link in a drop-down menu may not be visited by the robot for a long time, because search engines do not handle such techniques well, even though they are quite convenient for site owners and visitors. On the other hand, if this page contains important information and you want to see it ranked well in the search results, it must definitely be available for indexing!
It is exactly for such cases that we need the XML sitemap, which the robot crawls first. Moreover, Google does not treat websites that lack such a file very well; it is a failure to follow one of its most basic recommendations.

3. HTML Improvements

The next important item in the left navigation menu is the HTML Improvements section.

HTML Improvements page screenshot

This section of the service is intended to draw the attention of webmasters and site owners to quality remarks; resolving them can improve your web resource in the eyes of the search engines, especially Google.

Strictly speaking, these remarks concern the <title> tag and the description meta tag.

Remarks on the content of the description meta tag are divided into duplicate, long and short. Remarks on the title are divided into duplicate, long, short, missing and non-informative title tags. By clicking the appropriate section, you can see exactly which pages require adjustment. As the names already suggest, the title and description of your pages should not be too short, too long, or non-informative. In addition, duplicated tags are a major disadvantage for your site; they often occur when duplicated pages are present. Since we are not going to discuss the rules for filling in page meta tags in detail in this article, here are links to a competent source where you can learn more — http://www.seomoz.org/learn-seo/title-tag , http://www.seomoz.org/learn-seo/meta-description
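
For illustration only (the page and wording below are made up), a unique, reasonably sized title and description might look like this in a page's <head>:

    <head>
      <!-- A concise, descriptive title, unique for every page -->
      <title>Handmade Leather Bags - Catalog | YourSite</title>
      <!-- A short, informative summary of the page, also unique -->
      <meta name="description" content="Browse our catalog of handmade leather bags: totes, satchels and backpacks with worldwide shipping.">
    </head>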

Once again we want to emphasize how important it is to analyze and correct such errors on the pages of your web resource. For example, the HTML Improvements section can reveal the presence of duplicated pages, which Google does not welcome!

4. Blocked URLs

The next important point of the service is Blocked URLs.

This section displays information about your website's robots.txt file. This file is used to tell search engines which pages or directories of the site should be closed to search robots. Such directories may include service folders and files that add nothing to the assessment of your resource's content quality but take up the robots' time when crawled. Examples include /cgi-bin/, /assets/, /client/, /manager/ and so on, depending on how your website is built.

In addition, robots.txt can be used to block pages that are duplicates. For this purpose it is also possible to use the noindex meta tag, a 301 redirect or the canonical tag. It should be noted that when it comes to eliminating duplicates, the latter methods may be more effective than robots.txt, although they are more difficult to implement. More information can be found here — http://www.seomoz.org/learn-seo/robotstxt
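
As an illustration, with made-up page addresses, the noindex and canonical directives are placed in the <head> of the duplicate page:

    <head>
      <!-- Tell search robots not to index this duplicate page -->
      <meta name="robots" content="noindex">
      <!-- Or point search engines to the preferred (canonical) version of the page -->
      <link rel="canonical" href="http://www.yoursite.com/original-page.html">
    </head>

In practice you would normally pick one of these methods for a given page rather than combining them.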

It is also useful to specify the path to your XML sitemap in the robots.txt file.

There are many publications about how to prepare a robots.txt file correctly.
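
As a quick reference, a minimal robots.txt might look like the following; the blocked directories are examples only and depend entirely on how your own site is organized:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /manager/
    Sitemap: http://www.yoursite.com/sitemap.xml

The User-agent line says the rules apply to all robots, each Disallow line closes one directory to crawling, and the Sitemap line points the robots to your XML sitemap.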

5. Links to your site

The next tab, important for any webmaster and owner of a resource, is Traffic / Links to Your Site.

Links to Your Site page screenshot

Everyone, even a beginner just learning the principles and rules of search engine promotion, knows how important the quality of external links leading to your website is for a positive outcome. This section contains information about the links to your site that Google has seen.

Using the service, you can analyze and evaluate the quality of your link profile, the total number of links from each referring domain, and the number and URLs of the pages on your resource that those links point to.

For convenience, all links to your website can also be downloaded as a single document or spreadsheet.

6. Messages

And the last section of the service, quite important in our opinion, is Messages. It displays the messages sent to the owner of the resource. If Google detects problems with indexing, a change of the site's primary mirror and so on, you will be notified with a message.

In some cases, the reports from Google can be critical. For example, before the launch of the new algorithms for evaluating site quality in April 2012, the owners of web resources that violated the quality standards were sent warning letters about the content in question. After a while, many of these sites were penalized.

A line with information about new messages — «New and important» — is displayed on the main page of the service. If it shows the record «No new messages or recent critical issues», everything is OK with the site and there is nothing critical.

In the next articles of the «SEO & Promotion» rubric we will continue talking about website promotion in search engines, touching on both new topics and ones already covered; there are a huge number of them in this area.

See you on the pages of http://tonytemplates.com/blog/

 
