You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user's request, resulting in slow page load times. Should we block Googlebot from indexing these pages to improve our overall site speed?
“But in general, most of the time, as long as your browser isn’t timing out, as long as it’s not starting to be flaky, you should be in relatively good shape,” he continues. “You might, however, think about the user experience. If users have to wait 8, 9, 10, 20 seconds in order to get a page back, a lot of people don’t stick around that long. So there’s a lot of people that will do things like cache results and then compute them on the fly later. And you can fold in the new results.”
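Caching in that spirit can be as simple as memoizing expensive query results with a time-to-live, so repeat visitors get a stored answer instantly and the slow work only runs when the entry goes stale. The sketch below is a minimal illustration of that idea, not anything shown in the video; run_complex_query, the 300-second TTL, and the in-memory dictionary are hypothetical stand-ins for whatever the site's slow queries and storage actually look like.

```python
import time

# Hypothetical stand-in for the site's expensive database work.
def run_complex_query(query):
    time.sleep(5)  # simulate a slow, complex query
    return f"results for {query}"

_cache = {}       # query -> (timestamp, results)
CACHE_TTL = 300   # seconds before a cached entry is considered stale

def get_results(query):
    """Serve cached results when fresh; recompute only when missing or stale."""
    now = time.time()
    cached = _cache.get(query)
    if cached is not None and (now - cached[0]) < CACHE_TTL:
        return cached[1]  # fresh enough: return instantly
    # Cache miss or stale entry: compute now and fold the new results in.
    results = run_complex_query(query)
    _cache[query] = (now, results)
    return results
```

In a real deployment this would more likely live in a page cache or a store like Memcached or Redis, but the principle is the same one Cutts describes: serve the stored answer and fold in freshly computed results later.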
“But if it’s at all possible to pre-compute the results, or cache them, or do some sort of way to speed things up, that’s great for users,” Cutts says. “Typically, as long as there are just a few pages that are very slow, or if the site overall is fast, it’s not the kind of thing that you need to worry about. So you might want to pay attention to making it faster just for the user experience. But it sounds like I wouldn’t necessarily block those slower pages out from Googlebot unless you’re worried that you’re in one of those 1 out of a 1,000, where you’re really, really the outlier in terms of not being the fastest possible site.”
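For a site that really is in that outlier case, keeping Googlebot away from the slow pages is a robots.txt change rather than anything on the pages themselves. A hypothetical example, assuming the slow pages live under a /slow-reports/ path:

```
# Hypothetical robots.txt rule keeping Googlebot off a slow section of the site.
User-agent: Googlebot
Disallow: /slow-reports/
```

Note that Disallow stops crawling, not necessarily indexing; a blocked URL can still show up in results if other pages link to it, which is one more reason the “don’t bother unless you’re the outlier” advice is usually the easier path.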
In November, we referenced another video Cutts did on page speed, where he also dropped the “1 out of 100 searches” stat. He said, basically, not to stress too much about speed as a ranking factor. Both the new video and that one were actually uploaded to YouTube in August, so the advice is older than it appears. Today’s video, however, was just made public by Google, so it stands to reason that the company’s advice remains the same.
Source: http://www.webpronews.com/should-you-block-google-from-crawling-your-slower-pages-2012-03