Psssst, do you want to know a secret? How about five? Below I will share with you five dirty little secrets of web performance; some may surprise you.
I would venture to say there is no such thing as an average when it comes to web performance, given the variability of the user experience. As a result, the truth is often obscured. Averages can be misleading, skewed by outliers or by the number of test runs performed. If only three test runs were performed, is that really indicative of average performance?
For example, say you conducted an A/B test for a configuration change to determine its impact on page timings. Both averages across ten runs were 4.1 seconds. Looking just at the average, one might say there is no difference in the user experience, but the individual test runs tell a different story.
[Chart: individual response times (0–10 seconds) for runs 1–10 under configurations A and B]
Looking at the individual data points it is harder to say that the performance is the same for configuration A & B. In configuration B response
times go up for 8 out of 10 users, while with configuration A response times are significantly higher for 2 out of 10 users. Which is better?
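The effect above can be sketched in a few lines. The run times below are invented to match the article's scenario (both configurations averaging 4.1 seconds over ten runs); they are not the actual test data.

```python
from statistics import mean, median

# Hypothetical run times in seconds, invented so that both
# configurations average 4.1s across ten runs, as in the article.
config_a = [3.0] * 8 + [8.5, 8.5]  # fast for most users, two bad outliers
config_b = [4.5] * 8 + [2.5, 2.5]  # slower for 8 out of 10 users

print(mean(config_a), mean(config_b))      # identical averages: 4.1 4.1
print(median(config_a), median(config_b))  # very different medians: 3.0 4.5
```

The averages are indistinguishable, yet the medians immediately reveal that most users are slower under configuration B.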
Would you be surprised to hear that they both load in the exact same amount of time? Yet it feels like the J.Crew page loads faster (at least I think it does).
Humans have memory bias and perceive time differently. We have a tendency to perceive things as taking longer than they actually do; then, when recalling an event, we think it took even longer than we originally perceived it to. The bottom line is, if a user perceives a page as loading slowly, that's all that matters.
Measurements and metrics can be manipulated to present anything in a good or bad light. If the average isn't showing improvement, try reporting on the median or a percentile. If start render times aren't showing a good enough gain, look at TTFB or page load times. Not all pages are showing an improvement? Present results for only the home page. In web performance there are many metrics that matter; reporting on a single metric doesn't show the whole picture. But when some metrics show improvement and others don't, how do you make an informed decision?
When presented with performance results, ask questions and ask for additional metrics, especially if you are only presented with a limited set of data. (Tip: always ask for full histograms to make judgments.) While it is frustrating to have to explain to customers why 1 page out of 10 tested didn't show the expected results, I always prefer to share more data than less when conducting A/B or performance tests.
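The "ask for full histograms" tip is cheap to act on. The bucketing helper below is a hypothetical sketch, not part of any particular testing tool: it groups load times into one-second buckets so the whole distribution is visible instead of a single summary number.

```python
from collections import Counter

def histogram(times, bucket=1.0):
    """Group load times (in seconds) into buckets of the given width,
    returning counts per bucket -- the full distribution, not just a mean."""
    counts = Counter(int(t // bucket) for t in times)
    return {f"{b * bucket:.0f}-{(b + 1) * bucket:.0f}s": n
            for b, n in sorted(counts.items())}

print(histogram([0.8, 1.2, 1.4, 2.3, 9.9]))
# {'0-1s': 1, '1-2s': 2, '2-3s': 1, '9-10s': 1}
```

Even this crude text histogram exposes the long tail (the 9.9-second load) that a mean of the same data would hide.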
In the coming weeks we will share more details on how to find the truth in the maze of web performance.