With both the optimised and control sites indexed, the client-side rendered site was lagging behind. As it was already several days behind the control site, rather than waiting to see how long natural indexing would take, and in order to move on with further experiments, it was time to help the process along.
When using the site: search operator, the homepage of the site was showing up but with no snippet. This indicated that the page itself had been crawled and indexed but the client-side content hadn't been rendered.
After using Google Webmaster Tools (GWT) to confirm that Google could render the page, an indexing request was submitted. Previously it had taken less than two hours to index a page, but this time it took at least 12 hours for the index to be updated with the rendered content referring to the brand name, even though the server logs showed that Googlebot had requested all the assets needed to render the page.
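Checking the server logs for Googlebot's asset requests can be done with standard command-line tools. A minimal sketch, assuming a common combined-style access log (the sample log lines, IPs, and file name below are hypothetical; real log formats and paths vary by server):

```shell
# Create a small hypothetical sample access log for illustration.
cat > sample_access.log <<'EOF'
66.249.66.1 - - [10/May/2016:10:00:01 +0000] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2016:10:00:02 +0000] "GET /app.js HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [10/May/2016:10:00:03 +0000] "GET / HTTP/1.1" 200 "Mozilla/5.0"
EOF

# List every path Googlebot requested: filter by user agent,
# split on the double quotes around the request line, print the path.
grep "Googlebot" sample_access.log | awk -F'"' '{print $2}' | awk '{print $2}'
```

If the JavaScript bundles and API responses the page depends on all appear in this list, Googlebot has at least fetched everything it needs to render the client-side content. Note that anyone can spoof the Googlebot user agent, so for anything beyond a quick check the requesting IPs should be verified as well.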
With it now established that client-side rendered content can be indexed, the next experiments will focus on the control and optimised sites. The client-side rendered site will continue to track the control site but, until there is an experiment specifically around client-side rendered content, there will be no routine indexing requests.
Googlebot will index client-side rendered content, but it takes significantly longer than server-side rendered content and may require manual intervention to request indexing.
This is part of a series of posts that cover experiments in SEO.
Previous post: Initial Indexing