Wednesday, August 12, 2009

Effective Web Performance

Slap up some measurements. Look at some graphs. Make a few calls. Your site is faster. You're a hero.


Effective Web performance requires planning, preparation, execution, and the willingness to try more than once to get things right. I have discussed this problem before, but wanted to expand my thoughts into some steps that I have seen work in organizations that have established effective Web performance improvement strategies.

This process, in its simplest form, consists of five steps. Each step seems simple, but skipping any one of them will likely leave your Web performance process only half-baked, unable to help your team effectively improve the site.

1. Identification - What do we want/need to measure?

We want to measure everything. From everywhere.

This is an ineffective approach to Web performance measurement. This approach leads to a mass of data flowing towards you, causing your team to turn and flee, finding any way possible to hide from the coming onslaught.

Work with your team to carefully choose your Web performance targets. Identify two or three things about your site's performance that you want to explore. Make these items discrete and clearly understood by everyone on your team. Clearly state their importance to improving Web performance. Get everyone to sign off on this.

Now, what was just said above will not be easy. There will be disagreements among people, among different parts of the organization, about which items are the most crucial to measure. This is a good thing.

Perhaps the greatest single hindrance to Web performance improvement is a lack of communication. An active debate is better than quiet acceptance and a grudging belief that you are going the wrong way. Corporate silos and a culture of assurance will not allow your company to make the decisions it needs for an effective Web performance strategy.

2. Selection - What data will we need to collect?

In order to identify a Web performance issue (which is far more important than trying to solve it), you will need to decide which data to examine. This sounds easy - response time and success rate. We're done.


Now, if your team wants to be effective, they have to understand the complexity of what they are measuring. Only then can they assess what useful data can be extracted to isolate the specific performance issue under study.

Choose your metrics carefully, as the wrong data is worse than no data.
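As a concrete illustration of the two metrics named above, here is a minimal sketch of turning raw measurement samples into a success rate and percentile response times. The sample data, field names, and the choice of p50/p95 are all invented for illustration, not taken from any particular tool.

```python
# Sketch: raw measurement samples -> the two metrics under discussion.
# Sample data and field names are hypothetical.
import math

samples = [
    {"response_ms": 420, "status": 200},
    {"response_ms": 380, "status": 200},
    {"response_ms": 2150, "status": 500},
    {"response_ms": 510, "status": 200},
]

def percentile(sorted_vals, p):
    """Nearest-rank percentile: smallest value covering p% of samples."""
    k = math.ceil(p / 100 * len(sorted_vals))
    return sorted_vals[max(k - 1, 0)]

times = sorted(s["response_ms"] for s in samples)
success_rate = sum(s["status"] < 400 for s in samples) / len(samples)

# Percentiles resist skew from a few very slow samples better than a mean.
print(f"success: {success_rate:.0%}  p50: {percentile(times, 50)} ms  "
      f"p95: {percentile(times, 95)} ms")
```

The point of reporting p95 alongside p50 is that an average hides the slow tail - which is often exactly the experience your unhappiest customers are having.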

3. Execution - How will we collect the data?

Once you have decided what to measure, you can work out the mechanics of collecting the data. In today's Web performance measurement environment, there are solutions to meet every preferred approach.

  • Active Synthetic Monitoring. This is the old man of the methods, having been around the longest. A URL or business process is selected, scripted, and then pushed out to an existing measurement network that is managed/controlled. These measurements have the advantage of providing static, consistent metrics that can be used as baselines for long-term trending. However, they are locked to a single process, and cannot show where your customers are going right now.

  • Passive User Monitoring - Browser-Side. A relative newcomer to the measurement field, this process allows companies to tag pages and follow the customer performance experience as they move through a site. This methodology can also be used to discretely measure the browser-side performance of page components that may be invisible to other collection methods. Its weakness is that it is sometimes a hard sell within an organization, because of its perceived similarity to Web analytics approaches and the need to develop an effective tagging strategy.

  • Passive User Monitoring - Server-Side. This method follows customers as they move through a site, but collects data from a user's interaction with the site, rather than with the browser. It is great for providing details of how customers moved through a site and how long it took to move from page to page. It is weak in providing data on how long it took for content to be delivered to the customer, and how long it took their browser to process and render the requested data.
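The server-side passive idea above can be sketched as a small log walk: group access-log records by session and compute the gap between consecutive page views. The log records, session IDs, and paths here are made up for illustration; a real system would read them from your access logs.

```python
# Sketch: derive page-to-page timing from access-log style records.
# All records below are invented sample data.
from collections import defaultdict

log = [  # (session_id, timestamp_seconds, path)
    ("s1", 0.0,  "/home"),
    ("s1", 4.2,  "/search"),
    ("s1", 9.8,  "/product/42"),
    ("s2", 1.0,  "/home"),
    ("s2", 12.5, "/checkout"),
]

transitions = defaultdict(list)  # (from_page, to_page) -> list of gaps
last_seen = {}
for session, ts, path in sorted(log, key=lambda r: (r[0], r[1])):
    if session in last_seen:
        prev_ts, prev_path = last_seen[session]
        transitions[(prev_path, path)].append(ts - prev_ts)
    last_seen[session] = (ts, path)

for (src, dst), gaps in sorted(transitions.items()):
    print(f"{src} -> {dst}: avg {sum(gaps) / len(gaps):.1f}s "
          f"over {len(gaps)} visit(s)")
```

Note what this can and cannot tell you, exactly as described above: it shows how customers moved through the site, but says nothing about delivery or browser render time.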

Organizations often choose one of the methods, and stay with it. This has the effect of seeing the world through hammer goggles: If all you have is a hammer, then every problem you need to solve has to be turned into a nail.

Successful organizations take a complex, correlative approach to Web performance analysis: one that draws performance data from multiple inputs and finds the relationships between the different data sets.

If your team isn't ready for the correlative approach, then at least keep an open mind. Not every Web performance problem is a nail.
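One minimal form of the correlative approach is simply lining up two measurement sources on a shared time axis. The sketch below joins synthetic and real-user response times by time bucket; all of the numbers, the bucket keys, and the 900 ms threshold are invented for illustration.

```python
# Sketch: correlate two data sets by time bucket. A slowdown that
# appears in BOTH synthetic and real-user data points at the server
# or network; one that appears only in real-user data points at the
# browser side. All values are hypothetical.
synthetic = {1000: 430, 1300: 445, 1600: 990}   # bucket -> response ms
real_user = {1000: 510, 1300: 520, 1600: 1700}

slow_in_both = []
for bucket in sorted(synthetic.keys() & real_user.keys()):
    syn, rum = synthetic[bucket], real_user[bucket]
    if syn > 900 and rum > 900:
        slow_in_both.append(bucket)

print("buckets slow in both data sets:", slow_in_both)
```

Even this toy join answers a question neither data set can answer alone - which is the whole argument for not living inside the hammer goggles.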

4. Information - How do we make the data useful?

Your team now has a great lump of data, collected in a way that is understood, and providing details about things they care about.

Now what?

Web performance data is simply the raw facts that come out of the measurement systems. It is critical that, while determining why, what, and how to measure, you also decide how you will process the data to produce metrics that make sense to your team.

Strategies include:

  • Feeding the data into a business analytics tool

  • Producing daily/weekly/monthly reports on the Key Performance Indicators (KPIs) that your team uses to measure Web performance

  • Annotating changes, for better or worse

  • Correlate. Correlate. Correlate. Nature abhors a vacuum.

Providing a lot of raw data is the same as a vacuum - a whole bunch of nothing.
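Two of the strategies above - periodic KPI reports and annotating change - can be combined in a small sketch: roll measurements up into a daily KPI and flag days where it shifted, tying the shift to a known change. The daily values, the deploy name, and the 25% threshold are all hypothetical.

```python
# Sketch: daily KPI report with change annotations. All data and the
# 25% shift threshold are invented for illustration.
daily_p95_ms = {"2009-08-10": 870, "2009-08-11": 880, "2009-08-12": 1400}
deploys = {"2009-08-12": "release 2.3"}  # known changes worth annotating

report = []
prev = None
for day in sorted(daily_p95_ms):
    value = daily_p95_ms[day]
    note = deploys.get(day, "")
    if prev is not None and abs(value - prev) / prev > 0.25:
        note = (note + " " if note else "") + "(>25% shift)"
    report.append((day, value, note))
    prev = value

for day, value, note in report:
    print(f"{day}  p95={value}ms  {note}")
```

The annotation is what turns a lump of numbers into information: a 59% jump in p95 on the same day as a release is a conversation starter, not just a data point.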

5. Action - How do we make meaningful Web performance changes?

Data has been collected and processed into meaningful information. People throughout the organization are having a-ha moments, coming up with ideas and realizations about the overall performance of the site. There are cries to just do something.

Stick to the plan. And assume that the plan will evolve in the presence of new information.

Prioritizing Web performance improvements falls into the age-old battle between the behemoths of the online business: business and IT.

Business will want to focus on issues that have the greatest effect on the bottom-line. IT will want to focus on the issues that have the greatest effect on technology.

They're both wrong. And they're both right.

Your online business is just that: a business that, regardless of its mission, is based on technology. Effective Web performance relies on these two forces being in balance. The business cannot be successful without a sound and tuned online platform, and the technology needed to deliver that platform cannot exist without the revenue that comes from the business done on it.

Effective Web performance relies on prioritizing issues so that they can be addressed within both the business and technology plans. And an effective organization is one that has communicated (there's that word again) what those plans are. Everyone needs to understand that the business makes decisions that affect technology, and vice versa. And that if these decisions are made in isolation, the whole organization will either implode or explode.


Effective Web performance is hard work. It takes a committed organization that understands that running an online business requires that everyone have access to the information they need, collected in a meaningful way, to meet the goals that everyone has agreed to.


  1. Good overview, lots of good nuggets here. What would you see the breakdown being between synthetic, webbug passive, and hardware passive out in the field?