Three days after the soft launch the control site showed up in searches for the brand name, indicating that Google had found the site and indexed the content. However, whilst the client side rendered site had been crawled, after five days it still hadn't been indexed for the brand name.
As all three sites are registered with Google Webmaster Tools (GWT), the crawl stats show that Googlebot has been visiting each of them:
(The important detail to note on these graphs is that the y-axis for the client side rendered site has a maximum of 8 rather than 12.)
From this it's clear that the control and optimised sites have been crawled at the same rate, so the difference in indexing speed can only be down to the index request that was submitted for the optimised site.
The client side rendered site has been crawled, albeit to a slightly lesser extent, but still hasn't been indexed for the content. Within GWT it's possible to view a page as Googlebot sees it (the first step in submitting an index request), which confirms that there are no issues with rendering the page:
For both the control site and the client side rendered site there is a warning in GWT that Googlebot was unable to access the robots.txt file (neither site has one at this point). The "more information" provided for this message indicates that when a robots.txt is not found (a 404 status code) the site will still be crawled. However, as the client side rendered site has no server side URL routing, it sends a success response (a 200 status code) and the page shell to every request, and this confuses matters because the response is not valid robots.txt syntax.
The control site and the client side rendered site also both have a warning that Googlebot hasn't been able to find the URL mobile/. This error is not present on the optimised site, which has submitted a sitemap, so it appears that, in the absence of a sitemap, Googlebot tries to determine the site structure by trial and error.
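For reference, a sitemap is just an XML list of a site's URLs. A minimal sketch (using a hypothetical example.com rather than the experiment's actual URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

With this in place Googlebot doesn't need to guess at URLs like mobile/ because the site structure is declared up front.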
The first result from this stage of the experiment confirms the previous finding: using Google's Webmaster Tools to submit an index request gets content indexed faster than waiting for Google to discover it.
Secondly, whilst Googlebot can render client side content, it isn't indexed as efficiently as server side content.
Thirdly, having a sitemap and a robots.txt will help with indexing a site.
This is part of a series of posts that cover experiments in SEO.
- Previous post: Soft Launch Redux
- Next post: Indexing Client Side Content