
The 'Fetch as Google' feature in Google Webmaster Tools allows you to see your pages as Google sees them.

Though Google has gotten good at tracking down new web pages, it still offers multiple ways for site owners to inform it of URLs they want the search engine to crawl and index. Recently, Google added one more: Fetch as Google.
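
Sitemaps are one of those long-standing methods. As a rough illustration, here is a short Python sketch that "pings" Google when a sitemap changes, using the sitemap-ping endpoint Google documented at the time this article was written; the sitemap URL is a placeholder for your own.

    # Ping Google to announce an updated sitemap. The endpoint below is the
    # one Google documented when this article was written; the sitemap URL
    # is a placeholder.
    import urllib.parse
    import urllib.request

    sitemap_url = "http://www.example.com/sitemap.xml"  # placeholder
    ping_url = ("http://www.google.com/webmasters/tools/ping?sitemap="
                + urllib.parse.quote(sitemap_url, safe=""))
    with urllib.request.urlopen(ping_url) as response:
        # A 200 response means the ping was received, not that every URL
        # in the sitemap will be crawled or indexed.
        print(response.status, response.reason)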

If you’ve been following Google’s tools, you know that Fetch as Google has been around for nearly two years. Webmasters can use it through Google Webmaster Tools: tell Fetch as Google to crawl a specific URL on a site you’ve verified, and you’ll see your page the way Google sees it. This is great for diagnosing and debugging website issues that don’t show up when you simply look at the site through a web browser.
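
You can get a rough stand-in for this from your own machine by requesting a page while identifying yourself as Googlebot, as the Python sketch below does with Googlebot's published user-agent string. The caveat is important: the request comes from your computer, not from Google's crawlers, so this only approximates what the real tool shows. The URL is a placeholder.

    # Fetch a page while identifying as Googlebot -- a rough, local
    # approximation of Fetch as Google, useful for spotting cloaking or
    # user-agent-specific bugs. It does NOT replicate the real tool, which
    # fetches from Google's own infrastructure.
    import urllib.request

    GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                    "+http://www.google.com/bot.html)")

    def fetch_as_googlebot(url):
        request = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
        with urllib.request.urlopen(request) as response:
            # Redirect chains, wrong Content-Type headers and surprise 404s
            # often show up here before anything in the body does.
            print(response.status, response.reason)
            return response.read()

    body = fetch_as_googlebot("http://www.example.com/")  # placeholder URL
    print(body[:500])  # first 500 bytes of HTML as "Googlebot" received them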

Now, though, Fetch as Google lets you take things one step further. If it successfully fetches your URL, you can request that Google index it by clicking a “Submit to index” link. You won’t want to do this with every page on your website, as there are important limits and points to consider, but it’s great to have this additional option.

One important point to keep in mind is that you can only submit 50 pages a week in this fashion. You can also submit all of the pages linked from the URL you’re submitting, but you can only do that 10 times a month. Additionally, Google notes that images and video are more appropriately submitted through Sitemaps rather than through Fetch as Google.
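
For images, that means Google's image sitemap extension. The Python sketch below generates a minimal one-entry image sitemap under that extension's namespace; the page and image URLs are placeholders you would replace with your own.

    # Generate a minimal image sitemap using Google's image sitemap
    # extension namespace. The URLs below are placeholders.
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("image", IMAGE_NS)

    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
    ET.SubElement(url, "{%s}loc" % SITEMAP_NS).text = "http://www.example.com/gallery.html"
    image = ET.SubElement(url, "{%s}image" % IMAGE_NS)
    ET.SubElement(image, "{%s}loc" % IMAGE_NS).text = "http://www.example.com/photos/photo1.jpg"

    # Writes sitemap-images.xml, ready to reference from robots.txt or
    # submit through Webmaster Tools.
    ET.ElementTree(urlset).write("sitemap-images.xml",
                                 xml_declaration=True, encoding="utf-8")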

You should also keep in mind that submitting a URL in this way does not mean that Google will definitely put the page in its index. In its blog post on the topic, Google states that “we don’t guarantee that every URL submitted in this way will be indexed; we’ll still use our regular processes – the same ones we use on URLs discovered in any other way – to evaluate whether a URL belongs in our index.”

On the other hand, using Fetch as Google does speed up the crawling of your URLs: Google will crawl a submitted URL within a day. It’s worth remembering that not everything Google crawls goes into its index, and not every URL Google learns about gets crawled right away in any case. As Vanessa Fox explained on Search Engine Land, Google arranges the list of URLs it discovers in priority order before crawling them. It uses a number of factors to determine a page’s priority, including the page’s overall value, its PageRank, how frequently it changes and how important Google thinks the page is (this is why news home pages tend to get crawled and indexed very quickly).
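
To make the idea of priority-ordered crawling concrete, here is a deliberately simplified Python sketch. It is not Google's actual algorithm; the scoring weights are invented and merely mirror the kinds of signals mentioned above (PageRank, change frequency, perceived importance).

    # A toy crawl scheduler: URLs come off a priority queue in order of an
    # invented score. Illustrative only -- Google's real prioritization is
    # far more sophisticated and not public.
    import heapq

    def priority(page):
        # Higher score should be crawled sooner; heapq is a min-heap,
        # so negate the score.
        return -(page["pagerank"] * 10
                 + page["changes_per_day"] * 2
                 + page["importance"])

    discovered = [
        {"url": "http://news.example.com/", "pagerank": 8, "changes_per_day": 24, "importance": 9},
        {"url": "http://example.com/about", "pagerank": 3, "changes_per_day": 0.01, "importance": 2},
        {"url": "http://example.com/blog/", "pagerank": 5, "changes_per_day": 1, "importance": 5},
    ]

    queue = [(priority(page), page["url"]) for page in discovered]
    heapq.heapify(queue)

    while queue:
        _, url = heapq.heappop(queue)
        print("crawl:", url)

Run it and the frequently updated, high-value news home page comes off the queue first, matching the behavior described above.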

If you want to take advantage of this new way to submit URLs to Google for crawling and possible inclusion in its index, it’s best to use it for new areas and categories on your site. If you’re making a major update, Fetch as Google will also get Google’s attention quickly, and it can help speed up URL removals or cache updates as well.

source: http://www.seochat.com/c/a/website-promotion-help/submit-urls-with-fetch-as-googlebot/