Evergreen browsers

Published by Orde Saunders

Firefox and Chrome both use a rolling, date-based release cycle. These browsers feature a built-in updater that ensures they receive updates as soon as possible after release. This continuous updating is why they are referred to as evergreen browsers.

As a result of their evergreen status, support for these browsers is often phrased as "latest stable and previous stable", but how useful is this statement? To assess it reliably we will need to consider how effective the automatic updates are and how long older versions remain in significant use.

Update efficiency

To judge update efficiency we will compare the amount of traffic for the latest stable and previous stable versions on individual days for a period of three weeks from the official release date. In this analysis we are ignoring all versions of the browser other than these two; see the section on the long tail for information on other browser versions.

The data for this was taken from a site with sufficiently high traffic to be representative (over ten thousand visits per day for each browser).

The data shown below is the percentage split of visits between the two versions rounded to the nearest whole number.
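As a sketch of the arithmetic (the visit counts here are invented for illustration; the real figures come from the analytics data described above), the split can be computed like this:

```python
# Percentage split of visits between two browser versions, rounded to
# the nearest whole number. The counts below are illustrative only.
def version_split(visits_old, visits_new):
    total = visits_old + visits_new
    pct_new = round(100 * visits_new / total)
    return 100 - pct_new, pct_new  # forces the pair to sum to 100

# e.g. 11,760 visits on the old version and 2,075 on the new:
print(version_split(11760, 2075))  # (85, 15)
```

Deriving the old version's share as 100 minus the new version's share keeps each row summing to exactly 100, which matches how the tables below are presented.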

Firefox

Firefox currently updates on a planned six-week cycle.

Firefox 23-24

Firefox 24 adoption rate

Date        v23 (%)  v24 (%)
2013-09-17  98       2
2013-09-18  93       7
2013-09-19  85       15
2013-09-20  80       20
2013-09-21  81       19
2013-09-22  80       20
2013-09-23  80       20
2013-09-24  80       20
2013-09-25  79       21
2013-09-26  78       22
2013-09-27  78       22
2013-09-28  77       23
2013-09-29  78       22
2013-09-30  79       21
2013-10-01  76       24
2013-10-02  57       43
2013-10-03  30       70
2013-10-04  20       80
2013-10-05  16       84
2013-10-06  16       84
2013-10-07  13       87
2013-10-08  10       90

Firefox 24-25

Firefox 25 adoption rate

Date        v24 (%)  v25 (%)
2013-10-29  98       2
2013-10-30  93       7
2013-10-31  85       15
2013-11-01  82       18
2013-11-02  82       18
2013-11-03  82       18
2013-11-04  81       19
2013-11-05  80       20
2013-11-06  57       43
2013-11-07  30       70
2013-11-08  22       78
2013-11-09  18       82
2013-11-10  17       83
2013-11-11  13       87
2013-11-12  10       90
2013-11-13  9        91
2013-11-14  7        93
2013-11-15  6        94
2013-11-16  6        94
2013-11-17  6        94
2013-11-18  6        94
2013-11-19  5        95

Update pattern

There seems to be a pattern in these two releases where the update reaches 15-20% adoption, plateaus for several days, and then moves to over 80% adoption within the next couple of days. Whilst I've not done the same detailed daily breakdown, the adoption curves in the analytics show the same pattern going back to at least the version 16 to 17 upgrade, so I'm guessing it might be a feature of the updater performing a staged roll-out. (Alternatively, it could be a consistent anomaly in the source of the statistics I'm using.)
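This staged pattern shows up clearly in the day-on-day change of the v24 figures from the Firefox 23-24 table above; a minimal sketch:

```python
# v24 adoption percentages, copied from the Firefox 23-24 table above.
v24 = [2, 7, 15, 20, 19, 20, 20, 20, 21, 22, 22, 23, 22, 21,
       24, 43, 70, 80, 84, 84, 87, 90]

# Day-on-day change in adoption.
deltas = [b - a for a, b in zip(v24, v24[1:])]
print(deltas)
# The long run of near-zero changes in the middle is the plateau; the
# +19, +27 and +10 jumps mark the second stage of the roll-out.
```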

Chrome

Chrome currently updates on a planned six-week cycle.

Chrome 29-30

Chrome 30 adoption rate

Date        v29 (%)  v30 (%)
2013-10-01  100      0
2013-10-02  100      0
2013-10-03  100      0
2013-10-04  91       9
2013-10-05  64       36
2013-10-06  46       54
2013-10-07  36       64
2013-10-08  20       80
2013-10-09  11       89
2013-10-10  7        93
2013-10-11  5        95
2013-10-12  5        95
2013-10-13  4        96
2013-10-14  3        97
2013-10-15  2        98
2013-10-16  2        98
2013-10-17  1        99
2013-10-18  2        98
2013-10-19  2        98
2013-10-20  2        98
2013-10-21  1        99
2013-10-22  1        99

Chrome 30-31

Chrome 31 adoption rate

Date        v30 (%)  v31 (%)
2013-11-12  99       1
2013-11-13  92       8
2013-11-14  78       22
2013-11-15  61       39
2013-11-16  52       48
2013-11-17  39       61
2013-11-18  23       77
2013-11-19  13       87
2013-11-20  9        91
2013-11-21  7        93
2013-11-22  6        94
2013-11-23  5        95
2013-11-24  5        95
2013-11-25  4        96
2013-11-26  3        97
2013-11-27  3        97
2013-11-28  2        98
2013-11-29  2        98
2013-11-30  2        98
2013-12-01  2        98
2013-12-02  2        98
2013-12-03  1        99

Long tail

Whilst the data for update efficiency only looked at the latest two versions of a given browser, we also need to look at the situation across all versions of that browser. We have seen that during the transition period we need to consider both the latest and previous stable versions; we also need to ensure that no other versions make up a significant portion of our visits.

For this we will look at Firefox as it has a longer transition period for the update and a more frequent release cycle than Chrome, both of which will magnify the effect we are looking at. This data is taken from 2013-10-29, the day before version 25 was officially released. Looking at the data for the switch between versions 23 and 24, we can see that it was slower than the subsequent 24 to 25 update, so we might reasonably expect this, if anything, to further magnify the effect.

Version  %
Total    100.00
24       82.95
3.6      3.09
23       2.77
17       1.90
25       1.43
21       1.06
22       0.89
16       0.77
12       0.77
20       0.63
18       0.54
19       0.42
15       0.35
10       0.30
14       0.30
8        0.27
4        0.26
11       0.21
13       0.18
6        0.15
9        0.14
7        0.13
5        0.12
26       0.06
27       0.03
3.5      0.03
3        0.03

As can be seen, there is a long tail of previous versions but, even in total, they are not particularly significant. In fact, if we take the latest and previous stable as an aggregate at this point they make up 86% of visits.
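That aggregate is easy to verify directly (the shares below reproduce the top of the long-tail table above; the rest of the tail is omitted):

```python
# Top of the long-tail table above, as percent of visits.
shares = {"24": 82.95, "3.6": 3.09, "23": 2.77, "17": 1.90, "25": 1.43}

# Latest (24) plus previous (23) stable on 2013-10-29.
latest_and_previous = shares["24"] + shares["23"]
print(round(latest_and_previous))  # 86
```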

Of particular note in this data is the presence of version 3.6 in second place - this is the last version of Firefox before they introduced the automatic update mechanism. However, even though it is more common than the previous stable version, it is still not significant in terms of use (especially if you also factor in Firefox's share of traffic from all browsers).

Conclusion

During the transition phase, as browsers are updated, there will be a period when both the latest stable version and the previous stable version are in use. Immediately after the release of a new stable version, and during the subsequent transition phase, some visitors will be using the previous version but, just before the next version is released, use of the previous version will be insignificant.

Whilst you should always examine your own individual situation, in general for evergreen browsers it is safe to assume that by testing in the latest and previous stable versions you are covering the majority of your visitors using those browsers at any given moment.