Webmaster feed

+1 reporting in Google Webmaster Tools and Google Analytics

Webmaster Central Blog - Thu, 06/30/2011 - 07:34
Webmaster level: All

It’s been a busy week for us here at the Googleplex. First we released +1 buttons to Google search sites globally, then we announced the beginning of the Google+ project.

The +1 button and the Google+ project are both about making it easier to connect with the people you trust online. For the +1 button, that means bringing advice from trusted friends and contacts right into Google search, letting the users who love your web content recommend it at the moment of decision.

But when you’re managing a website, it's usually not real until you can measure it. So we’re happy to say we’ve got one more announcement to make -- today we’re releasing reports that show you the value +1 buttons bring to your site.

First, +1 metrics in Google Webmaster Tools can show you how the +1 button affects the traffic coming to your pages:


  • The Search Impact report gives you an idea of how +1's affect your organic search traffic. You can find out if your clickthrough rate changes when personalized recommendations help your content stand out. Do this by comparing clicks and impressions on search results with and without +1 annotations. We’ll only show statistics on clickthrough rate changes when you have enough impressions for a meaningful comparison.
  • The Activity report shows you how many times your pages have been +1’d, from buttons both on your site and on other pages (such as Google search).
  • Finally, the Audience report shows you aggregate geographic and demographic information about the Google users who’ve +1’d your pages. To protect privacy, we’ll only show audience information when a significant number of users have +1’d pages from your site.
Use the +1 Metrics menu on the side of the page to view your reports. If you haven’t yet verified your site on Google Webmaster Tools, you can follow these instructions to get access.

Finally, you can also see how users share your content using other buttons besides +1 by using Social Plugin Tracking in Google Analytics. Once you configure the JavaScript for Analytics, the Social Engagement reports help you compare the various types of sharing actions that occur on your pages.


  • The Social Engagement report lets you see how site behavior changes for visits that include clicks on +1 buttons or other social actions. This allows you to determine, for example, whether people who +1 your pages during a visit are likely to spend more time on your site than people who don’t.
  • The Social Actions report lets you track the number of social actions (+1 clicks, Tweets, etc) taken on your site, all in one place.
  • The Social Pages report allows you to compare the pages on your site to see which are driving the highest number of social actions.
If you’re using the default version of the latest Google Analytics tracking code, when you add +1 buttons to your site, we automatically enable Social Plugin Tracking for +1 in your account. You can enable tracking for other social plugins in just a few simple steps.
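
As a rough sketch of what that configuration looks like with the classic asynchronous (ga.js) snippet of the time: +1 clicks are captured automatically, while other buttons report through the _trackSocial call. The account ID and the Tweet example below are placeholders, not prescriptions.

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXX-Y']);  // your Analytics property ID
      _gaq.push(['_trackPageview']);
      // (standard ga.js loader omitted for brevity)

      // Report a social action yourself for a non-+1 button:
      // arguments are network, social action, optional target URL
      function reportTweet() {
        _gaq.push(['_trackSocial', 'twitter', 'tweet', document.location.href]);
      }
    </script>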

Social reporting is just getting started. As people continue to find new ways to interact across the web, we look forward to new reports that help business owners understand the value that social actions are providing to their business. So +1 to data!

Written by Dan Rodney, Software Engineer

+1 around the world

Webmaster Central Blog - Tue, 06/28/2011 - 07:31
Webmaster Level: all

A few months ago we released the +1 button on English search results on google.com. More recently, we’ve made the +1 button available to sites across the web, making it easy for the people who love your content to recommend it on Google search.

Today, +1’s will start appearing on Google search pages globally. We'll be starting off with sites like google.co.uk, google.de, google.jp and google.fr, then expanding quickly to most other Google search sites soon after.

We’ve partnered with a few more sites where you’ll see +1 buttons over the coming days.


If you’re a publisher based outside of the US, and you’ve been waiting to put +1 buttons on your site, now’s a good time to get started. Visit the +1 button tool on Google Webmaster Central where the +1 button is already available in 44 languages.
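
As a sketch, the snippet the tool generates looks roughly like this; the lang value sets the button's display language (German in this hypothetical example), and the href attribute controls which URL gets +1'd:

    <script type="text/javascript" src="https://apis.google.com/js/plusone.js">
      {lang: 'de'}
    </script>
    <g:plusone size="medium" href="http://www.example.com/page.html"></g:plusone>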

Adding the +1 button could help your site to stand out by putting personal recommendations right at the moment of decision, on Google search. So if you have users who are fans of your content, encourage them to add their voice with +1!

Posted by , Product Marketing Manager

Your fast pass through security

Webmaster Central Blog - Thu, 06/23/2011 - 21:58
Webmaster level: All

Security checks are nobody's cup of tea. We've never seen people go through airport baggage checks for fun. But while security measures are often necessary, that doesn't mean they have to be painful. In that spirit, we’ve implemented several major improvements to make the Google Site Verification process faster, more straightforward, and perhaps even a pleasure to use—so you can get on with the tasks that matter to you.

New verification method recommendations


You’ll quickly notice the changes we’ve made to the verification page, namely the new tabbed interface. These tabs allow us to give greater visibility to the verification method that we think will be most useful to you, which is listed in the Recommended Method tab.


Our recommendation is just an educated guess, but sometimes guesses can be wrong. It’s possible that the method we recommend might not work for you. If this is the case, simply click the "Alternate methods" tab to see the other verification methods that are available. Verifying with an alternate method is just as powerful as verifying with a recommended method.

Our recommendations are computed from statistical data taken from users with a similar configuration to yours. For example, we can guess which verification methods might be successful by looking at the public hosting information for your website. In the future we plan to add more signals so that we can provide additional customized instructions along with more relevant recommendations.

New Google Sites are automatically verified
For some of you, we’ve made the process even more effortless—Google Sites administrators are now automatically verified for all new sites that they create. When you create a new Google Site, it’ll appear verified in the details page. The same goes for adding or removing owners: when you edit the owners list in your Google Site's settings, the changes will automatically appear in Webmaster Tools.

One thing to note is that we’re unable to automatically verify preexisting Google Sites at this time. If you’d like to verify your older Google Sites, please continue to use the meta tag method already available.
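
For reference, the meta tag method boils down to one tag in the <head> of your site's home page; the content value is a site-specific token that Webmaster Tools generates (placeholder below):

    <meta name="google-site-verification" content="your-token-here" />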

We hope these enhancements help get you through security even faster. Should you get pulled over and have any questions, feel free to check out our Webmaster Help Forums.


Written by , , (Software Engineers)

WordPress Plugin for Webmaster Tools verification

Webmaster Central Blog - Thu, 06/23/2011 - 21:58
Webmaster Level: All

For webmasters with self-hosted WordPress blogs, there’s now a Webmaster Tools site verification plugin for WordPress that completely automates our verification process! You can install it directly from the “Install Plugins” control panel built into your WordPress blog, or you can download the ZIP file from the WordPress plugin site. This plugin can only be used by self-hosted WordPress blogs; it can’t be installed on blogs hosted on wordpress.com.

With verified ownership of your site in Webmaster Tools, you can receive specific statistics and information (e.g. relevant search queries, malware notices) about your site directly from Google. For recent news about verification for other types of sites, please see our recent blog post, “Your fast pass through security.”


Written by , Software Engineer

Introducing schema.org: Search engines come together for a richer web

Webmaster Central Blog - Thu, 06/23/2011 - 21:55
Webmaster Level: All

Today we’re announcing schema.org, a new initiative from Google, Bing and Yahoo! to create and support a common set of schemas for structured data markup on web pages. Schema.org aims to be a one-stop resource for webmasters looking to add markup to their pages to help search engines better understand their websites.

At Google, we’ve supported structured markup for a couple years now. We introduced rich snippets in 2009 to better represent search results describing people or containing reviews. We’ve since expanded to new kinds of rich snippets, including products, events, recipes, and more.


Example of a rich snippet: a search result enhanced by structured markup. In this case, the rich snippet contains a picture, reviews, and cook time for the recipe.
Adoption by the webmaster community has grown rapidly, and today we’re able to show rich snippets in search results more than ten times as often as when we started two years ago.

We want to continue making the open web richer and more useful. We know that it takes time and effort to add this markup to your pages, and adding markup is much harder if every search engine asks for data in a different way. That’s why we’ve come together with other search engines to support a common set of schemas, just as we came together to support a common standard for Sitemaps in 2006. With schema.org, site owners can improve how their sites appear in search results not only on Google, but on Bing, Yahoo! and potentially other search engines as well in the future.

Now let’s discuss some of the details of schema.org relevant to you as a webmaster:

1) Schema.org contains a lot of new markup types.
We’ve added more than 100 new types as well as ported over all of the existing rich snippets types. If you’ve looked at adding rich snippets markup before but none of the existing types were relevant for your site, it’s worth taking another look at the full list of all schema.org types. The new markup types may be used for future rich snippets formats as well as other types of improvements to help people find your content more easily when searching.

2) Schema.org uses microdata.
Historically, we’ve supported three different standards for structured data markup: microdata, microformats, and RDFa. We’ve decided to focus on just one format for schema.org to create a simpler story for webmasters and to improve consistency across search engines relying on the data. There are arguments to be made for preferring any of the existing standards, but we’ve found that microdata strikes a balance between the extensibility of RDFa and the simplicity of microformats, so this is the format that we’ve gone with.

To get an overview of microdata as well as the conventions followed by schema.org, take a look at the schema.org Getting Started guide.
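
As a minimal sketch of what microdata markup looks like in practice, here is a hypothetical page fragment using the schema.org Recipe type (the values are invented; the itemprop names are real schema.org properties):

    <div itemscope itemtype="http://schema.org/Recipe">
      <h1 itemprop="name">Grandma's Apple Pie</h1>
      <img itemprop="image" src="apple-pie.jpg" alt="Apple pie" />
      By <span itemprop="author">Jane Example</span>
      Cook time: <time itemprop="cookTime" datetime="PT1H">1 hour</time>
      <div itemprop="aggregateRating" itemscope
           itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.5</span>/5 by
        <span itemprop="ratingCount">120</span> users
      </div>
    </div>

The itemscope and itemtype attributes declare what kind of thing the block describes, and each itemprop labels one fact about it; that is the whole of the microdata model.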

3) We’ll continue to support our existing rich snippets markup formats.
If you’ve already done markup on your pages using microformats or RDFa, we’ll continue to support it. One caveat to watch out for: while it’s OK to use the new schema.org markup or continue to use existing microformats or RDFa markup, you should avoid mixing the formats together on the same web page, as this can confuse our parsers.

4) Test your markup using the rich snippets testing tool.
It’s very useful to test your web pages with markup to make sure we’re able to parse the data correctly. As with previous rich snippets markup formats, you should use the rich snippets testing tool for this purpose. Note that while the testing tool will show the marked up information that was parsed from the page, rich snippets previews are not yet shown for schema.org markup. We’ll be adding this functionality soon.

The schema.org website and the rich snippets testing tool are in English. However, Google shows rich snippets in search results globally, so there’s no need to wait to start marking up your pages.

To learn more about rich snippets and how they relate to schema.org, check out the Rich snippets schema.org FAQ.

By and , Search Quality team

Easier URL removals for site owners

Webmaster Central Blog - Thu, 06/23/2011 - 05:23
Webmaster Level: All

We recently made a change to the Remove URL tool in Webmaster Tools to eliminate the requirement that the webpage's URL must first be blocked by a site owner before the page can be removed from Google's search results. Because you've already verified ownership of the site, we can eliminate this requirement to make it easier for you, as the site owner, to remove unwanted pages (e.g. pages accidentally made public) from Google's search results.

Removals persist for at least 90 days
When a page’s URL is requested for removal, the request is temporary and persists for at least 90 days. We may continue to crawl the page during the 90-day period but we will not display it in the search results. You can still revoke the removal request at any time during those 90 days. After the 90-day period, the page can reappear in our search results, assuming you haven’t made any other changes that could impact the page’s availability.

Permanent removal
In order to permanently remove a URL, you must ensure that a page blocking method is implemented for the URL of the page that you want removed: have the page return a 404 (Not found) or 410 (Gone) response code, block it with a noindex robots meta tag (see the sketch below), or disallow it in your robots.txt file.
This will ensure that the page is permanently removed from Google's search results for as long as the page is blocked. If at any time in the future you remove the previously implemented page blocking method, we may potentially re-crawl and index the page. For immediate and permanent removal, you can request that a page be removed using the Remove URL tool and then permanently block the page’s URL before the 90-day expiration of the removal request.
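
As a sketch, the noindex option is a single tag in the page's <head>; it keeps the page out of Google's index for as long as the tag is present:

    <!-- robots meta tag: ask search engines not to index this page -->
    <meta name="robots" content="noindex" />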



For more information about URL removals, see our “URL removal explained” blog series covering this topic. If you still have questions about this change or about URL removal requests in general, please post in our Webmaster Help Forum.

Written by , Webmaster Trends Analyst

Our SEO Guide — now available in ten more languages

Webmaster Central Blog - Thu, 06/23/2011 - 03:18
Webmaster Level: Beginner

We’re very glad to announce that our recently updated SEO guide is now available in ten more languages: Spanish, French, German, Russian, Turkish, Finnish, Swedish, Hungarian, Traditional Chinese and Simplified Chinese.

For this new version, we’ve thoroughly reviewed and updated the content; we’ve also added a brand new section on best practices for mobile websites.



You can download each PDF file in its full 32-page glory from goo.gl/seoguide and print it to have it as a useful resource.

Posted by and , Search Quality Team

Introducing Page Speed Online, with mobile support

Webmaster Central Blog - Wed, 06/22/2011 - 22:12
Webmaster level: intermediate

At Google, we’re striving to make the whole web fast. As part of that effort, we’re launching a new web-based tool in Google Labs, Page Speed Online, which analyzes the performance of web pages and gives specific suggestions for making them faster. Page Speed Online is available from any browser, at any time. This allows website owners to get immediate access to Page Speed performance suggestions so they can make their pages faster.

In addition, we’ve added a new feature: the ability to get Page Speed suggestions customized for the mobile version of a page, specifically smartphones. Due to the relatively limited CPU capabilities of mobile devices, the high round-trip times of mobile networks, and rapid growth of mobile usage, understanding and optimizing for mobile performance is even more critical than for the desktop, so Page Speed Online now allows you to easily analyze and optimize your site for mobile performance. The mobile recommendations are tuned for the unique characteristics of mobile devices, and contain several best practices that go beyond the recommendations for desktop browsers, in order to create a faster mobile experience. New mobile-targeted best practices include eliminating uncacheable landing page redirects and reducing the amount of JavaScript parsed during the page load, two common issues that slow down mobile pages today.


Page Speed Online is powered by the same Page Speed SDK that powers the Chrome and Firefox extensions and webpagetest.org.

Please give Page Speed Online a try. We’re eager to hear your feedback on our mailing list and how you’re using it to optimize your site.

Posted by and , Page Speed team

High-quality sites algorithm goes global, incorporates user feedback

Webmaster Central Blog - Wed, 06/22/2011 - 21:54
Over a month ago we introduced an algorithmic improvement designed to help people find more high-quality sites in search. Since then we’ve gotten a lot of positive responses about the change: searchers are finding better results, and many great publishers are getting more traffic.

Today we’ve rolled out this improvement globally to all English-language Google users, and we’ve also incorporated new user feedback signals to help people find better search results. In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms. In addition, this change also goes deeper into the “long tail” of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before. The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.

Based on our testing, we’ve found the algorithm is very accurate at detecting site quality. If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively. Google's quality guidelines provide helpful information about how to improve your site. As sites change, our algorithmic rankings will update to reflect that. In addition, you’re welcome to post in our Webmaster Help Forums. While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms.

We will continue testing and refining the change before expanding to additional languages, and we’ll be sure to post an update when we have more to share.

Posted by , Google Fellow

Sharing advice from our London site clinic

Webmaster Central Blog - Wed, 06/22/2011 - 21:52

Webmaster level: Beginner - Intermediate

We recently hosted our second site clinic, this time at TechHub in London, UK. Like last year, here’s our summary of the topics that came up.

  • Title tags and meta description tags are easy ways to improve your site's visibility in Google search results, yet we still see webmasters not fully utilizing their potential. We have a bit of help available about writing good page titles and descriptions which you can read to brush up on the subject. That said, you can ignore the meta keywords, at least as far as Google is concerned.
  • One way Google's algorithms determine the context of content on a page is by looking at the page’s headings. The way semantic markup is used throughout a site, including h1, h2, and h3 tags, helps us to understand the priorities of a site’s content. One should not fret, though, about every single H tag. Using common sense is the way to go.
  • Just as we recommend structuring pages logically, it is similarly important to structure the whole website, particularly by linking to related documents within your site as necessary. This helps both users and search engine bots explore all the content you provide. To augment this, be sure to provide a regularly updated Sitemap, which can be conveniently linked to from your site’s robots.txt file for automatic discovery by Google and other search engines.
  • Duplicate content and canonicalization issues were discussed for many websites reviewed at the site clinic. Duplicate content within a website is generally not a problem, but it can make it more difficult for search engines to properly index your content and serve the right version to users. There are two common ways to signal which versions of your content you prefer: by using 301 redirects to point to your preferred versions, or by using the rel="canonical" link element (see the sketch after this list). If you’re unsure whether to use the www or non-www version of your domain, we recommend checking out the preferred domain setting in Webmaster Tools.
  • Another commonly seen issue is that some sites have error pages which do not return an error HTTP result code, but instead return the HTTP success code 200. Only documents that are actually available should reply with the HTTP success result code 200. When a page no longer exists, it should return a 404 (Not found) response. The HTTP response headers of any URL can be checked using Fetch as Googlebot in Webmaster Tools or using third-party tools such as the Live HTTP Headers Firefox addon or web-sniffer.net.
  • Ranking for misspelled queries, e.g. local business names including typos, seems to be an area of concern. In some cases, Google’s automatic spelling correction gets the job done for users by suggesting the correct spelling. It isn’t a wise idea to stuff a site's content with every typo imaginable. It’s also not advisable to hide this or any other type of content using JavaScript, CSS or similar techniques. These methods are in violation of Google’s Webmaster Guidelines and we may take appropriate action against a site that employs them. If you’re not sure how Googlebot “sees” your pages, e.g. when using lots of JavaScript, you can get a better idea by looking at the text-only version of the cached copy in Google web search results.
  • Users love fast websites. That’s why webpage loading speed is an important consideration for your users. We offer a wide range of tools and recommendations to help webmasters understand the performance of their websites and how to improve them. The easiest way to get started is to use Page Speed Online, which is the web-based version of our popular Page Speed Chrome extension. Our Let's make the web faster page has a great list of resources from Google and elsewhere for improving website speed, which we recommend you read.
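
As a sketch of the canonicalization and Sitemap tips above (all URLs are placeholders): a duplicate page can name its preferred version with the rel="canonical" link element in its <head>,

    <!-- on the duplicate page, pointing at the preferred version -->
    <link rel="canonical" href="http://www.example.com/product.html" />

and a single line in your robots.txt file lets Google and other search engines discover your Sitemap automatically:

    Sitemap: http://www.example.com/sitemap.xml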

We’d like to thank the TechHub team, who helped us facilitate the event, and give a big thank you to all participants. We hope you found the presentation and Q&A session interesting. We've embedded the presentation below.

And as we mentioned in the site clinic, sign up at the Google Webmaster Help Forum to discuss any further questions you might have and keep an eye on our Webmaster Central Blog.



Written by , ,

An update on Google Video: Finding an easier way to migrate Google Video content to YouTube

Webmaster Central Blog - Wed, 06/22/2011 - 21:49
[Cross-posted on the YouTube Blog]

Last week we sent an email to Google Video users letting them know we would be ending playbacks of Google Videos on April 29 and providing instructions on how to download videos currently hosted on the platform. Since then we’ve received feedback from you about making the migration off of Google Video easier. We work every day to make sure you have a great user experience and should have done better. Based on your feedback, here’s what we’re doing to fix things.

Google Video users can rest assured that they won't be losing any of their content and we are eliminating the April 29 deadline. We will be working to automatically migrate your Google Videos to YouTube. In the meantime, your videos hosted on Google Video will remain accessible on the web and existing links to Google Videos will remain accessible. If you want to migrate to YouTube now, here’s how you do it:
  • We’ve created an “Upload Videos to YouTube” option on the Google Video status page. To do this, you’ll need to have a YouTube account associated with your Google Video account (you can create one here). Before doing this you should read YouTube’s Terms of Use and Copyright Policies. If you choose this option, we’ll do our best to ensure your existing Google Video links continue to function.


If you’d prefer to download your videos from Google Video, that option is still available.

As we said nearly two years ago, the team is now focused on tackling the tough challenge of video search. We want to thank the millions of people around the world who have taken the time to create and share videos on Google Video. We hope today's improvements will help ease your transition to another video hosting service.

Posted by , Engineering Manager

Do 404s hurt my site?

Webmaster Central Blog - Wed, 06/22/2011 - 21:47
Webmaster level: Beginner/Intermediate

So there you are, minding your own business, using Webmaster Tools to check out how awesome your site is... but, wait! The Crawl errors page is full of 404 (Not found) errors! Is disaster imminent??


Fear not, my young padawan. Let’s take a look at 404s and how they do (or do not) affect your site:

Q: Do the 404 errors reported in Webmaster Tools affect my site’s ranking?
A: 404s are a perfectly normal part of the web; the Internet is always changing, new content is born, old content dies, and when it dies it (ideally) returns a 404 HTTP response code. Search engines are aware of this; we have 404 errors on our own sites, as you can see above, and we find them all over the web. In fact, we actually prefer that, when you get rid of a page on your site, you make sure that it returns a proper 404 or 410 response code (rather than a “soft 404”). Keep in mind that in order for our crawler to see the HTTP response code of a URL, it has to be able to crawl that URL—if the URL is blocked by your robots.txt file we won’t be able to crawl it and see its response code. The fact that some URLs on your site no longer exist / return 404s does not affect how your site’s other URLs (the ones that return 200 (Successful)) perform in our search results.

Q: So 404s don’t hurt my website at all?
A: If some URLs on your site 404, this fact alone does not hurt you or count against you in Google’s search results. However, there may be other reasons that you’d want to address certain types of 404s. For example, if some of the pages that 404 are pages you actually care about, you should look into why we’re seeing 404s when we crawl them! If you see a misspelling of a legitimate URL (www.example.com/awsome instead of www.example.com/awesome), it’s likely that someone intended to link to you and simply made a typo. Instead of returning a 404, you could 301 redirect the misspelled URL to the correct URL and capture the intended traffic from that link. You can also make sure that, when users do land on a 404 page on your site, you help them find what they were looking for rather than just saying “404 Not found.”
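
As a sketch of that redirect tip: if your server happens to run Apache with mod_alias enabled, the misspelled URL from the example could be permanently redirected with a one-line .htaccess rule (paths here are placeholders):

    # 301 (permanent) redirect from the typo to the real page
    Redirect 301 /awsome http://www.example.com/awesome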

Q: Tell me more about “soft 404s.”
A: A soft 404 is when a web server returns a response code other than 404 (or 410) for a URL that doesn’t exist. A common example is when a site owner wants to return a pretty 404 page with helpful information for his users, and thinks that in order to serve content to users he has to return a 200 response code. Not so! You can return a 404 response code while serving whatever content you want. Another example is when a site redirects any unknown URLs to their homepage instead of returning 404s. Both of these cases can have negative effects on our understanding and indexing of your site, so we recommend making sure your server returns the proper response codes for nonexistent content. Keep in mind that just because a page says “404 Not Found,” doesn’t mean it’s actually returning a 404 HTTP response code—use the Fetch as Googlebot feature in Webmaster Tools to double-check. If you don’t know how to configure your server to return the right response codes, check out your web host’s help documentation.
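
To illustrate: on Apache, for example, you can serve a helpful custom error page while still returning the proper 404 status code with a single configuration directive (the file path is a placeholder):

    # Serve /not-found.html as the body, but keep the 404 response code
    ErrorDocument 404 /not-found.html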

Q: How do I know whether a URL should 404, or 301, or 410?
A: When you remove a page from your site, think about whether that content is moving somewhere else, or whether you no longer plan to have that type of content on your site. If you’re moving that content to a new URL, you should 301 redirect the old URL to the new URL—that way when users come to the old URL looking for that content, they’ll be automatically redirected to something relevant to what they were looking for. If you’re getting rid of that content entirely and don’t have anything on your site that would fill the same user need, then the old URL should return a 404 or 410. Currently Google treats 410s (Gone) the same as 404s (Not found), so it’s immaterial to us whether you return one or the other.

Q: Most of my 404s are for bizarro URLs that never existed on my site. What’s up with that? Where did they come from?
A: If Google finds a link somewhere on the web that points to a URL on your domain, it may try to crawl that link, whether any content actually exists there or not; and when it does, your server should return a 404 if there’s nothing there to find. These links could be caused by someone making a typo when linking to you, some type of misconfiguration (if the links are automatically generated, e.g. by a CMS), or by Google’s increased efforts to recognize and crawl links embedded in JavaScript or other embedded content; or they may be part of a quick check from our side to see how your server handles unknown URLs, to name just a few. If you see 404s reported in Webmaster Tools for URLs that don’t exist on your site, you can safely ignore them. We don’t know which URLs are important to you vs. which are supposed to 404, so we show you all the 404s we found on your site and let you decide which, if any, require your attention.

Q: Someone has scraped my site and caused a bunch of 404s in the process. They’re all “real” URLs with other code tacked on, like http://www.example.com/images/kittens.jpg" width="100" height="300" alt="kittens"/></a... Will this hurt my site?
A: Generally you don’t need to worry about “broken links” like this hurting your site. We understand that site owners have little to no control over people who scrape their site, or who link to them in strange ways. If you’re a whiz with the regex, you could consider redirecting these URLs as described here, but generally it’s not worth worrying about. Remember that you can also file a takedown request when you believe someone is stealing original content from your website.

Q: Last week I fixed all the 404s that Webmaster Tools reported, but they’re still listed in my account. Does this mean I didn’t fix them correctly? How long will it take for them to disappear?
A:
Take a look at the ‘Detected’ column on the Crawl errors page—this is the most recent date on which we detected each error. If the date(s) in that column are from before the time you fixed the errors, that means we haven’t encountered these errors since that date. If the dates are more recent, it means we’re continuing to see these 404s when we crawl.

After implementing a fix, you can check whether our crawler is seeing the new response code by using Fetch as Googlebot. Test a few URLs and, if they look good, these errors should soon start to disappear from your list of Crawl errors.

Q: Can I use Google’s URL removal tool to make 404 errors disappear from my account faster?
A:
No; the URL removal tool removes URLs from Google’s search results, not from your Webmaster Tools account. It’s designed for urgent removal requests only, and using it isn’t necessary when a URL already returns a 404, as such a URL will drop out of our search results naturally over time. See the bottom half of this blog post for more details on what the URL removal tool can and can’t do for you.

Still want to know more about 404s? Check out 404 week from our blog, or drop by our Webmaster Help Forum.

Posted by , Webmaster Trends Analyst

Flash support in Instant Previews

Webmaster Central Blog - Wed, 06/22/2011 - 21:45
Webmaster level: All

With Instant Previews, users can see a snapshot of a search result before clicking on it. We’ve made a number of improvements to the feature since its introduction last November, and if you own a site, one of the most relevant changes for you is that Instant Previews now supports Flash.



An Instant Preview with rich content rendered

In most cases, when the preview for a page is generated through our regular crawl, we will now render a snapshot of any Flash components on the page. This will replace the "puzzle piece" icon that previously appeared to indicate Flash components, and should improve the accuracy of the previews.

However, for pages that are fetched on demand by the "Google Web Preview" user-agent, we will generate a preview without Flash in order to minimize latency. In these cases the preview will appear as if the page were visited by someone using a browser without Flash enabled, and "Install Flash" messages may appear in the preview, depending on how your website handles users without Flash.

To improve your previews for these on-demand renders, here are some guidelines for using Flash on your site:
  • Make sure that your site has a reasonable, seamless experience for visitors without Flash. This may involve creating HTML-only equivalents for your Flash-based content that will automatically be shown to visitors who can't view Flash (see the sketch after this list). Providing a good experience for this case will improve your preview and make your visitors happier.

  • If Flash components are rendering but appear as loading screens instead of actual content, try reducing the loading time for the component. This makes it more likely we'll render it properly.

  • If you have Flash videos on your site, consider submitting a Video Sitemap which helps us to generate thumbnails for your videos in Instant Previews.

  • If most of the page is rendering properly but you still see puzzle pieces appearing for some smaller components, these may be fixed in future crawls of your page.
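
As a sketch of the HTML-only fallback mentioned in the first point, content placed inside the <object> element is rendered whenever the Flash movie can't be (file names below are placeholders):

    <object type="application/x-shockwave-flash" data="player.swf"
            width="640" height="360">
      <param name="movie" value="player.swf" />
      <!-- HTML-only equivalent, shown when Flash is unavailable -->
      <img src="video-poster.jpg" alt="Product demo video" />
      <p><a href="transcript.html">Read the video transcript</a></p>
    </object>
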
If you have additional questions, please feel free to post them in our Webmaster Help Forum.

As always, we'll keep you updated as we continue to make improvements to Instant Previews.

Posted by , Product Manager

More guidance on building high-quality sites

Webmaster Central Blog - Wed, 06/22/2011 - 21:42

Webmaster level: All

In recent months we’ve been especially focused on helping people find high-quality sites in Google’s search results. The “Panda” algorithm change has improved rankings for a large number of high-quality websites, so most of you reading have nothing to be concerned about. However, for the sites that may have been affected by Panda we wanted to provide additional guidance on how Google searches for high-quality sites.

Our advice for publishers continues to be to focus on delivering the best possible user experience on your websites and not to focus too much on what you think are Google’s current ranking algorithms or signals. Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we've rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.

What counts as a high-quality site?

Our site quality algorithms are aimed at helping people find "high-quality" sites by reducing the rankings of low-quality content. The recent "Panda" change tackles the difficult task of algorithmically assessing website quality. Taking a step back, we wanted to explain some of the ideas and research that drive the development of our algorithms.

Below are some questions that one could use to assess the "quality" of a page or an article. These are the kinds of questions we ask ourselves as we write algorithms that attempt to assess site quality. Think of it as our attempt at encoding what we think our users want.

Of course, we aren't disclosing the actual ranking signals used in our algorithms because we don't want folks to game our search results; but if you want to step into Google's mindset, the questions below provide some guidance on how we've been looking at the issue:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

Writing an algorithm to assess page or site quality is a much harder task, but we hope the questions above give some insight into how we try to write algorithms that distinguish higher-quality sites from lower-quality sites.

What you can do

We've been hearing from many of you that you want more guidance on what you can do to improve your rankings on Google, particularly if you think you've been impacted by the Panda update. We encourage you to keep questions like the ones above in mind as you focus on developing high-quality content rather than trying to optimize for any particular Google algorithm.

One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.

We're continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search. As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture. This way your site will be more likely to rank well for the long-term. In the meantime, if you have feedback, please tell us through our Webmaster Forum. We continue to monitor threads on the forum and pass site info on to the search quality team as we work on future iterations of our ranking algorithms.

Written by , Google Fellow

Page Speed Online has a shiny new API

Webmaster Central Blog - Wed, 06/22/2011 - 21:41
Webmaster level: intermediate

A few weeks ago, we introduced Page Speed Online, a web-based performance analysis tool that gives developers optimization suggestions. Almost immediately, developers asked us to make an API available to integrate into other tools and their regression testing suites. We were happy to oblige.

Today, as part of Google I/O, we are excited to introduce the Page Speed Online API as part of the Google APIs. With this API, developers now have the ability to integrate performance analysis very simply in their command-line tools and web performance dashboards.

We have provided a getting started guide that helps you to get up and running quickly, understand the API, and start monitoring the performance improvements that you make to your web pages. Not only that, in the request, you’ll be able to specify whether you’d like to see mobile or desktop analysis, and also get Page Speed suggestions in one of the 40 languages that we support, giving API access to the vast majority of developers in their native or preferred language.
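
As a rough sketch of what a request looks like (the endpoint and parameter names below reflect our reading of the v1 getting started guide; the URL and API key are placeholders):

    GET https://www.googleapis.com/pagespeedonline/v1/runPagespeed
        ?url=http://www.example.com/&strategy=mobile&locale=fr&key=YOUR_API_KEY

The response is a JSON document with the overall Page Speed score and per-rule results, which makes it straightforward to plug into dashboards or regression tests.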

We’re also pleased to share that the WordPress plugin W3 Total Cache now uses the Page Speed Online API to provide Page Speed suggestions to WordPress users, right in the WordPress dashboard. “The Page Speed tool itself provides extremely pointed and valuable insight into performance pitfalls. Providing that tool via an API has allowed me to directly correlate that feedback with actionable solutions that W3 Total Cache provides,” said Frederick Townes, CTO of Mashable and author of W3 Total Cache.

Take the Page Speed Online API for a spin and send us feedback on our mailing list. We’d love to hear your experience integrating the new Page Speed Online API.

Andrew Oates is a Software Engineer on the Page Speed Team in Google's Cambridge, Massachusetts office. You can find him in the credits for the Pixar film Up.

Richard Rabbat is the Product Management Lead on the "Make the Web Faster" initiative. He has launched Page Speed, mod_pagespeed and WebP. At Google since 2006, Richard works with engineering teams across the world.

Posted by Scott Knaster, Page Speed Team

Introducing the Google Webmaster Team

Webmaster Central Blog - Wed, 06/22/2011 - 21:39

We’re pleased to introduce the Google Webmaster Team as contributors to the Webmaster Central Blog. As the team responsible for tens of thousands of Google’s informational web pages, they’re here to offer tips and advice based on their experiences as hands-on webmasters.

Back in the 1990s, anyone who maintained a website called themselves a “webmaster” regardless of whether they were a designer, developer, author, system administrator, or someone who had just stumbled across GeoCities and created their first web page. As the technologies changed over the years, so did the roles and skills of those managing websites.

Around 20 years after the word was first used, we still refer to ourselves as the Google Webmaster Team because it’s the only term that really covers the wide variety of roles that we have on our team. Although most of us have solid knowledge of HTML, CSS, JavaScript and other web technologies, we also have specialists in design, development, user experience, information architecture, system administration, and project management.


Part of the Google Webmaster Team, Mountain View

In contrast to the Google Webmaster Central Team—which mainly focuses on helping webmasters outside of Google understand web search and how things like crawling and indexing affect their sites—our team is responsible for designing, implementing, optimizing and maintaining Google’s corporate pages, informational product pages, landing pages for marketing campaigns, and our error page. Our team also develops internal tools to increase our productivity and help to maintain the thousands of HTML pages that we own.

We’re working hard to follow, challenge and evolve best practices and web standards to ensure that all our new pages are produced to the highest quality and provide the best user experience, and we’re constantly evaluating and updating our legacy pages to ensure their deprecated HTML isn’t just left to rot.

We want to share our work and experiences with other webmasters, so we recently launched our @GoogleWebTeam account on Twitter to keep our followers updated on the latest news about our projects, web standards, and anything else which may be of interest to other webmasters, web designers and web developers. We’ll be posting here on the Webmaster Central Blog when we want to share anything longer than 140 characters.

Before we share more details about our processes and experiences, please let us know if there’s anything you’d like us to specifically cover by leaving a comment here or by tweeting @GoogleWebTeam.

Posted by , Google Webmaster Team

Troubleshooting Instant Previews in Webmaster Tools

Webmaster Central Blog - Wed, 06/22/2011 - 21:36
Webmaster level: All

In November, we launched Instant Previews to help users better understand if a particular result was relevant for their search query. Since launch, our Instant Previews team has been keeping an eye on common complaints and problems related to how pages are rendered for Instant Previews.

When we see issues with preview images, they are frequently due to:
  • Blocked resources due to a robots.txt entry
  • Cloaking: Erroneous content being served to the Googlebot user-agent
  • Poor alternative content when Flash is unavailable
To help webmasters diagnose these problems, we have a new Instant Preview tool in the Labs section of Webmaster Tools (in English only for now).



Here, you can input the URL of any page on your site. We will then fetch the page from your site and try to render it both as it would display in Chrome and through our Instant Preview renderer. Please keep in mind that both of these renders are done using a recent build of WebKit which does not include plugins such as Flash or Silverlight, so it's important to consider the value of providing alternative content for these situations. Alternative content can be helpful to search engines, and visitors to your site without the plugin would benefit as well.

Below the renders, you’ll also see automated feedback on problems our system can detect such as missing or roboted resources. And, in the future, we plan to add more informative and timely feedback to help improve your Instant Previews!

Please direct your questions and feedback to the Webmaster Forum.

Posted by , Software Engineer

Authorship markup and web search

Webmaster Central Blog - Wed, 06/22/2011 - 21:31
Webmaster level: Intermediate

Today we're beginning to support authorship markup—a way to connect authors with their content on the web. We're experimenting with using this data to help people find content from great authors in our search results.

We now support markup that enables websites to publicly link within their site from content to author pages. For example, if an author at The New York Times has written dozens of articles, using this markup, the webmaster can connect these articles with a New York Times author page. An author page describes and identifies the author, and can include things like the author’s bio, photo, articles and other links.

If you run a website with authored content, you’ll want to learn about authorship markup in our Help Center. The markup uses existing standards such as HTML5 (rel="author") and XFN (rel="me") to enable search engines and other web services to identify works by the same author across the web. If you're already doing structured data markup using microdata from schema.org, we'll interpret that authorship information as well.
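
As a sketch of the two links involved (URLs and names are placeholders): an article links to its author page with rel="author", and the author page links out to the author's other profiles with rel="me":

    <!-- on an article page: connect the piece to the site's author page -->
    Written by <a rel="author" href="http://www.example.com/authors/jane">Jane Example</a>

    <!-- on the author page: tie together the author's identities on the web -->
    <a rel="me" href="http://twitter.com/janeexample">Jane on Twitter</a>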

We wanted to make sure the markup was as easy to implement as possible. To that end, we’ve already worked with several sites to mark up their pages, including The New York Times, The Washington Post, CNET, Entertainment Weekly, The New Yorker and others. In addition, we’ve taken the extra step to add this markup to everything hosted by YouTube and Blogger. In the future, both platforms will automatically include this markup when you publish content.

We know that great content comes from great authors, and we’re looking closely at ways this markup could help us highlight authors and rank search results.

Posted by , Software Engineer

Announcing Instant Pages

Webmaster Central Blog - Wed, 06/22/2011 - 21:31
Webmaster level: All

Earlier today we announced Instant Pages, a new feature to help users get to their desired search results even faster--in some cases even instantly! Instant Pages is enabled by prerendering technology that we are building into Chrome, and it is intelligently triggered by web search when we're very confident about which result is the best answer for the user's search.

This prerendering technology is currently in the Chrome Dev Channel and will be going to Beta later this week.

You can see Instant Pages in action in this video:


At Google we're obsessed with speed. We've seen time and time again how an increase in apparent speed leads to a direct increase in user happiness and engagement. Instant Pages helps visitors arrive at your site faster. When we trigger Instant Pages for your site, users can start interacting with your site almost immediately, without having to wait for text and images to load. We'll only trigger Instant Pages when we have very high confidence that your site is the exact result users are looking for. Search traffic will be measured in Webmaster Tools just as before: only the results a user actually visits are counted. We'll take the time this summer before the feature launches in stable versions of Chrome to collect your feedback.

The vast majority of sites will automatically work correctly when prerendered in Chrome. Check out the prerendering post on the Chromium blog if you want to learn more about how prerendering works in Chrome or how you can detect that your site is being prerendered.
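
For pages that want to adapt their behavior, here is a minimal detection sketch using the WebKit-prefixed page visibility properties Chrome exposed at the time (assuming that prefix; this is illustrative, not a supported API contract):

    <script type="text/javascript">
      // While a page is being prerendered, its visibility state is 'prerender'.
      if (document.webkitVisibilityState === 'prerender') {
        document.addEventListener('webkitvisibilitychange', function() {
          if (document.webkitVisibilityState === 'visible') {
            // The user actually arrived: safe to log the visit, start media, etc.
          }
        }, false);
      }
    </script>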

Written by , Software Engineer

Webinar: Implementing the +1 Button

Webmaster Central Blog - Wed, 06/22/2011 - 21:30
Webmaster Level: All

A few weeks ago, we launched the +1 button for your site, allowing visitors to recommend your content on Google search directly from your site. As people see recommendations from their friends and contacts beneath your search results, you could see more, better qualified traffic from Google.

But how do you make sure this experience is user friendly? Where should you position the +1 button? How do you make sure the correct URL is getting +1’d?

On Tuesday, June 21 at 3pm ET, please join Timothy Jordan, Google Developer Advocate, to learn about how to best implement the +1 button on your site. He’ll be talking through the technical implementation details as well as best practices to ensure the button has maximum impact. During the webinar, we’ll review the topics below:
  • Getting started
  • Best practices
  • Advanced options
  • Measurement
  • And, we’ll save time for Q&A
If you would like to attend, please register here. To download the code for your site, visit our +1 button tool on Google Webmaster Central.

Posted by , Product Marketing Manager