This is an update to the blog we published in October 2013, in which we compared the performance of the Salesforce Console across a number of browsers: Internet Explorer 7/8/9, Chrome, and Firefox. In the releases since that blog, we have invested significant effort in improving Salesforce Console performance. In particular, starting with the Summer 2015 release, we have enhanced our performance testing capabilities with the following improvements:
- Improved test coverage: we now regularly test console performance across a number of variations of typical use cases, for example with and without sidebar components, with and without MRU, primary tabs as well as subtabs, different user actions (opening one tab at a time versus opening several tabs at once), and different entities including Account, Case, Opportunity, Lead, and Contact.
- Test automation: starting with the Summer 2015 release, we have put in place a console performance test automation framework that lets us run nightly tests covering an increased number of test cases on virtual machines against six browsers: IE8, IE9, IE10, IE11, Chrome, and Firefox. Before Summer 2015, console performance tests were conducted manually or semi-manually: the test steps were automated with a script, but the memory data was collected by hand. (A sketch of what such a nightly run might look like appears after this list.)
- The improved test coverage and automated tests have significantly reduced the time it takes to discover and resolve performance issues.
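To make the setup concrete, here is a minimal sketch of what one nightly run could look like, assuming Selenium WebDriver and psutil are available on the test VM. The console URL, the scenario hook, and the memory helper are placeholders for illustration, not the actual framework used internally.

```python
import csv

import psutil
from selenium import webdriver

CONSOLE_URL = "https://example.my.salesforce.com/console"  # placeholder URL


def browser_memory_mb(driver):
    """Approximate the browser's memory footprint: RSS of the driver process
    plus all of its children (the browser and its helper processes)."""
    root = psutil.Process(driver.service.process.pid)
    procs = [root] + root.children(recursive=True)
    return sum(p.memory_info().rss for p in procs) / (1024 * 1024)


def run_scenario(driver, samples):
    """Placeholder for one scenario, e.g. open/close 100 primary tabs while
    recording a memory sample after each iteration."""
    for i in range(100):
        # ... open a tab, dwell, close it (scenario-specific steps go here) ...
        samples.append((i + 1, browser_memory_mb(driver)))


def nightly_run():
    rows = []
    for name, factory in [("chrome", webdriver.Chrome), ("firefox", webdriver.Firefox)]:
        driver = factory()
        try:
            driver.get(CONSOLE_URL)
            samples = []
            run_scenario(driver, samples)
            rows.extend((name, i, round(mb, 1)) for i, mb in samples)
        finally:
            driver.quit()
    with open("console_memory.csv", "w", newline="") as f:
        csv.writer(f).writerows([("browser", "iteration", "memory_mb"), *rows])


if __name__ == "__main__":
    nightly_run()
```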
Below is a sample of scenarios selected from the nightly console performance automation runs.
Memory Characteristics
To test the memory characteristics of each browser, we simulated different workloads. We started with a standard workload that consisted of opening and closing primary tabs, then ran a similar workload using sub-tabs, and finally stress tested a workload based on bursts of activity.
Scenario 1: Simulate a Tab-level Workload
In this scenario we use a basic page layout with no console sidebar components. In the test, one Account tab is opened and then closed after 20 seconds. The operation is repeated for 100 loops, for a total of 100 tabs opened and closed. The objective is to simulate interaction behavior for a typical workload at the primary tab level. Below is a plot of measured browser memory footprint versus the number of tabs opened:
As can be seen from the plot, Chrome performed best in this scenario, with the lowest browser memory footprint of the browsers tested. Firefox has a slightly higher memory footprint but held steady during the test run, as did IE11. IE10 shows a moderate increase in memory footprint, and IE9 is the worst of the five browsers, with a much steeper increase during the run.
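For reference, a minimal sketch of how the Scenario 1 loop might be scripted on top of the harness above. The CSS selectors are hypothetical stand-ins for whatever elements actually open and close a primary tab on the real console page.

```python
import time

from selenium.webdriver.common.by import By


def scenario_tab_workload(driver, samples, loops=100, dwell_seconds=20):
    """Open one Account primary tab, dwell, close it, and sample memory."""
    for i in range(loops):
        # Open one Account as a primary tab (selector is an assumption).
        driver.find_element(By.CSS_SELECTOR, "a.account-list-item").click()
        time.sleep(dwell_seconds)  # let the tab settle before closing it
        # Close the primary tab (selector is an assumption).
        driver.find_element(By.CSS_SELECTOR, "button.close-primary-tab").click()
        samples.append((i + 1, browser_memory_mb(driver)))  # tabs so far, MB
```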
Scenario 2: Simulate a Sub-tab Workload
This scenario is similar to Scenario 1 above, except that we open the Account entity in sub-tabs instead of primary tabs. The objective is to simulate interaction behavior for a typical workload at the sub-tab level.
Here we can also see that Chrome and Firefox have the best performance, followed by IE10 and IE11, while IE9 shows the largest increase in memory footprint.
Scenario 3: Stress Testing Bursts
In this scenario, we open 10 Account tabs at a time and close them all after 20 seconds. The action is repeated for 30 loops, for a total of 300 tabs opened and closed by the end of the test. The objective is to stress test interaction behavior for a bursty workload at the primary tab level.
We can see Chrome is again the winner: its memory footprint held steady throughout the test, while IE9, IE10, and IE11 all see a significant increase in browser memory footprint. The bursty nature of tab opening (opening multiple tabs within a very short period of time) appears to cause Firefox's memory footprint to fluctuate widely, which is likely related to the browser's garbage collection behavior.
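Under the same assumptions as the Scenario 1 sketch (hypothetical selectors, and `browser_memory_mb()` from the harness above), the burst variant might look like this:

```python
import time

from selenium.webdriver.common.by import By


def scenario_burst_workload(driver, samples, loops=30, burst_size=10, dwell_seconds=20):
    """Open tabs in bursts of 10, dwell, close them all, and sample memory."""
    for i in range(loops):
        # Open a burst of tabs back to back, with no pause between clicks.
        for _ in range(burst_size):
            driver.find_element(By.CSS_SELECTOR, "a.account-list-item").click()
        time.sleep(dwell_seconds)
        # Close every open primary tab (selector is an assumption).
        for close_button in driver.find_elements(By.CSS_SELECTOR, "button.close-primary-tab"):
            close_button.click()
        samples.append(((i + 1) * burst_size, browser_memory_mb(driver)))
```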
Console First View
In addition to browser memory footprint, the 2013 blog also measured "console first view" time as well as the time it takes to load a tab after a number of tabs (for example, 300) have been opened and closed. The corresponding measurements for the Winter 2016 release are as follows:
| Browser | First View Load Time (sec) | | |
| --- | --- | --- | --- |
| IE9 | 15 | 15 | 16 |
| IE10 | 6.7 | 7.7 | 7.7 |
| IE11 | 6.2 | 5.5 | 6.7 |
| Firefox (40) | 8.5 | 7.9 | 8.7 |
| Chrome (45) | 8.4 | 9.1 | 8.7 |
The 301st tab open time (after opening and closing 300 tabs):

| Browser | Tab Open Time (sec) |
| --- | --- |
| IE9 | 2.8 |
| IE10 | 2.3 |
| IE11 | 2.5 |
| Firefox (40) | 2.2 |
| Chrome (45) | 1.8 |
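A minimal sketch of how these two timings might be collected in a Selenium-driven browser, using the legacy Navigation Timing API for first view and a simple wall-clock timer for the tab open. The console URL and the `open_tab` callable are placeholders:

```python
import time


def first_view_load_seconds(driver, console_url):
    """Load the console and report the full page load time in seconds."""
    driver.get(console_url)  # get() returns after the load event has fired
    millis = driver.execute_script(
        "return performance.timing.loadEventEnd - performance.timing.navigationStart;")
    return millis / 1000.0


def timed_tab_open_seconds(driver, open_tab):
    """Time a single tab open, e.g. the 301st open after 300 opens/closes.
    open_tab is a scenario-specific callable; a real test would also wait
    for the tab's content to finish rendering before stopping the clock."""
    start = time.perf_counter()
    open_tab(driver)
    return time.perf_counter() - start
```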
Note that browser memory footprint growth has overwhelmingly been the primary factor affecting console performance. If the browser memory footprint grows very large, the time the browser spends on garbage collection can grow substantially, which in turn hurts the responsiveness experienced by the user. As long as the browser memory footprint remains small, console performance can be expected to stay relatively fast.
Conclusion
From the automated tests run nightly against a number of scenarios representing typical console usage, we can see that Chrome remains the browser of choice. It manages memory better than the other browsers tested, and its overall memory footprint stays in a relatively low range.
Note that a large number of factors affect the actual memory footprint readings of any given test, so the readings presented above are not directly comparable with the numbers from the 2013 blog. That said, when tested in the same environment, console performance numbers show a significant improvement in more recent releases (Summer 2015 and later) compared to previous releases.