Today’s question comes from Remiz Rahnas in Kerala, India. Remiz asks:
“Google announced that page load speed matters for ranking. Should we be serving content-only pages to Googlebot by removing images and loads of CSS and JavaScript?”
The answer to your question is Nooooooo, and if I could reach through the screen and hit the Esc key, Ctrl-C, and Break, I would, because that’s cloaking. You never want to do something completely different for Googlebot than you’d do for regular users. That’s the definition of cloaking.
If you have something in your code saying “if the user agent equals Googlebot, or if the IP address is Google’s, do something different,” that’s the very definition of cloaking.
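The check described above can be sketched in code. This is a hypothetical handler, not a real framework; the Googlebot user-agent substring and the IP prefix are illustrative assumptions, and the first function shows the anti-pattern you should avoid:

```python
# Hypothetical request handler illustrating the cloaking anti-pattern.
# The user-agent string and IP prefix checks are illustrative assumptions.

def choose_page(user_agent: str, remote_ip: str) -> str:
    """Return which page to serve. DON'T do this: it's cloaking."""
    # Branching on the crawler's identity to serve different content
    # is the very definition of cloaking.
    if "Googlebot" in user_agent or remote_ip.startswith("66.249."):
        return "stripped-text-only.html"  # special page just for the crawler
    return "full-page.html"               # real page for everyone else

def choose_page_safely(user_agent: str, remote_ip: str) -> str:
    """The safe rule: same page for Googlebot and regular users alike."""
    return "full-page.html"
```

The safe version ignores who is asking entirely, which is the point: any speed-up you make should apply to Googlebot and human visitors equally.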
So you might think, “page load speed is a factor I should care about, so let’s make things fast just for Googlebot.” But that is not the right approach.
#1, we’re not relying only on Googlebot to determine how fast a particular page or site loads, so it wouldn’t even work.
#2, if you’re changing the content that shows up, people will be able to look at the cached page and see, “Oh, it’s nothing but a text page. This is very strange,” and then complain about your site cloaking.
Think about it: whenever you include CSS, JavaScript, or images, most of the time those are external files, and we’re not even going off to load those at that particular time.
So knowing that external stuff exists doesn’t necessarily mean we’re going to go off and fetch it and incorporate it all into the page. But you do want to show the same page to users that you show to Googlebot.
Don’t do anything to try to speed things up only for Googlebot, because that’s cloaking, and the risk is much higher than any benefit. Instead, speed things up for everyone.
Quick Answer: No, this is cloaking.