By Jerry Harbour
April 7, 2011
I left my house early one morning to drive to the airport. After starting my car and turning on the lights, I noticed that my dashboard was not illuminated. In a panic, I wondered if I should even attempt to drive to the airport. Although I could certainly still navigate my car without the aid of a dashboard, it just seemed that it wouldn’t be as safe or effective as with one.
Admittedly, managers can also “drive” their organizations without any type of performance dashboard. Like driving a car, however, they simply can’t do it as effectively, safely, and successfully as they can with one. By developing and using a well-constructed performance dashboard and its underlying performance measurement system, managers and staff members can better:
- Monitor performance at multiple levels.
- Proactively analyze and resolve problems.
- Improve decision making.
- Identify areas for improvement and evaluate results.
- Assess and manage risk.
In short, a performance measurement system helps managers focus on what truly counts. But what constitutes an effective performance measurement system, and, just as importantly, how does an organization go about developing one?
At its most basic level, and as illustrated in Figure 1, a performance measurement system contains a set of performance metrics that are “warehoused” in a repository database and displayed via some graphical display mechanism. Performance metrics represent the initial data measurement inputs. They are the means by which performance is measured, monitored, improved, and managed. Metrics are the heart and soul of any performance measurement system, and their development, therefore, must always come first.
Metrics in turn are stored or warehoused in a repository database. Increasingly, the term “data warehouse” is used for such performance metric storage capabilities. A data warehouse commonly represents a relational database designed to store and handle large numbers of metrics that can be queried on command.
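The article names no particular database product, but the idea of a relational metric repository that can be queried on command can be sketched in a few lines. The schema below (table and column names are illustrative assumptions, not anything from the article) uses Python's built-in sqlite3 module as a stand-in for a real data warehouse:

```python
import sqlite3

# A minimal, hypothetical metric repository: one row per metric observation.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE metrics (
        metric_name TEXT NOT NULL,   -- e.g. 'avg_call_wait_seconds'
        recorded_at TEXT NOT NULL,   -- ISO-8601 timestamp
        value       REAL NOT NULL
    )
""")

# Collection protocols load metric observations into the repository.
rows = [
    ("avg_call_wait_seconds", "2011-04-07T09:00:00", 42.0),
    ("avg_call_wait_seconds", "2011-04-07T09:05:00", 38.5),
]
conn.executemany("INSERT INTO metrics VALUES (?, ?, ?)", rows)

# A dashboard chart then queries the warehouse on command.
(avg_wait,) = conn.execute(
    "SELECT AVG(value) FROM metrics WHERE metric_name = ?",
    ("avg_call_wait_seconds",),
).fetchone()
print(avg_wait)  # 40.25
```

A production warehouse would of course be a dedicated relational or analytical database, but the pattern is the same: metrics go in as rows, and display-layer queries pull them back out on demand.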
The final component of a performance measurement system is the actual display of the metric data itself. Just as a picture is worth a thousand words, a well-crafted performance metric display is worth a thousand numbers. Increasingly, the term “performance dashboard” is used for such graphical displays. Wayne Eckerson, in his book “Performance Dashboards,” defines a performance dashboard as a “… layered information delivery system that parcels out information, insights and alerts to users on demand so they can measure, monitor, and manage business performance more effectively.”
Whereas many people only view a performance dashboard as something displayed on a single computer screen, Eckerson correctly describes it as an integrated delivery system. Such a delivery system often consists of differing dashboard categories (e.g., human capital or supply chain management) that contain individual dashboards (as displayed on a single computer screen). Individual dashboards in turn contain various graphical charts (normally two to six) that draw data from a defined range within a linked database.
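The hierarchy just described (categories containing dashboards, dashboards containing a handful of charts, each chart bound to a range in a linked database) can be modeled as a simple data structure. The class and field names below are assumptions made for illustration; the article does not prescribe any particular representation:

```python
from dataclasses import dataclass, field

@dataclass
class Chart:
    """One graphical chart, bound to a defined range in a linked database."""
    title: str
    data_range: str  # e.g. a named query or table region in the warehouse

@dataclass
class Dashboard:
    """What appears on a single computer screen: normally two to six charts."""
    name: str
    charts: list = field(default_factory=list)

@dataclass
class DashboardCategory:
    """A grouping such as human capital or supply chain management."""
    name: str
    dashboards: list = field(default_factory=list)

# A hypothetical supply chain category with one dashboard of two charts.
supply_chain = DashboardCategory("Supply Chain Management", [
    Dashboard("Inventory", [
        Chart("Stock on Hand", "inventory.daily_levels"),
        Chart("Order Fill Rate", "orders.fill_rate"),
    ]),
])
print(len(supply_chain.dashboards[0].charts))  # 2
```

The point of the model is Eckerson's: a dashboard is not one screen but an integrated delivery system, and the single screen is just one node in that hierarchy.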
Note the two arrows in Figure 1. The first arrow represents the linkages and collection protocols for inputting defined metrics into a repository database. The second arrow allows a performance dashboard system to extract metric data from a specified region or range within a database.
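The two arrows amount to two operations: one that loads metric observations into the repository, and one that pulls a specified range back out for display. A minimal sketch, using an in-memory list as a stand-in for the repository database (all names here are illustrative assumptions):

```python
from datetime import datetime

repository = []  # stand-in for the repository database in Figure 1

def ingest(metric_name, timestamp, value):
    """Arrow 1: collection protocol inputs a defined metric into the repository."""
    repository.append({"name": metric_name, "at": timestamp, "value": value})

def extract(metric_name, start, end):
    """Arrow 2: the dashboard extracts metric data from a specified range."""
    return [r["value"] for r in repository
            if r["name"] == metric_name and start <= r["at"] <= end]

ingest("calls_waiting", datetime(2011, 4, 7, 9, 0), 12)
ingest("calls_waiting", datetime(2011, 4, 7, 9, 5), 9)
ingest("calls_waiting", datetime(2011, 4, 8, 9, 0), 15)

# Extract only April 7 for a daily chart; the April 8 row falls outside the range.
print(extract("calls_waiting",
              datetime(2011, 4, 7), datetime(2011, 4, 7, 23, 59)))  # [12, 9]
```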
Performance dashboards are mostly used for strategic, analytical, and/or operational purposes.
The major difference between strategic and operational dashboards is the timeframe. Performance data is usually aggregated by week, month, or quarter in strategic dashboards, whereas a much shorter timeframe is used in an operational dashboard system, one that normally covers minutes to hours to a few days at most. Additionally, strategic dashboards allow higher-level goal assessment, whereas operational dashboards are primarily used for monitoring purposes.
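That timeframe difference is, in practice, just a choice of aggregation bucket over the same raw observations. The sketch below (sample data and function names are hypothetical) averages identical raw values by month for a strategic view and by five-minute interval for an operational view:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw metric observations: (timestamp, value).
observations = [
    (datetime(2011, 3, 31, 9, 2), 10),
    (datetime(2011, 4, 7, 9, 3), 12),
    (datetime(2011, 4, 7, 9, 4), 14),
]

def aggregate(observations, bucket_key):
    """Average values per bucket; the bucket function sets the timeframe."""
    buckets = defaultdict(list)
    for ts, value in observations:
        buckets[bucket_key(ts)].append(value)
    return {k: sum(v) / len(v) for k, v in sorted(buckets.items())}

# Strategic view: aggregate by month.
monthly = aggregate(observations, lambda ts: (ts.year, ts.month))
print(monthly)  # {(2011, 3): 10.0, (2011, 4): 13.0}

# Operational view: aggregate by five-minute interval, as a call center might.
by_5min = aggregate(
    observations,
    lambda ts: ts.replace(minute=ts.minute // 5 * 5, second=0),
)
```

Same data, two dashboards: only the bucket function changes.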
Figure 2 displays a strategic-type dashboard from a health care organization. Here the basic timeframe is per month. Figure 3 displays an operational dashboard from a call center. Note that in Figure 3, metric data is updated every five minutes, giving frontline supervisors more immediate feedback on ongoing operational performance.
Analytical dashboards provide much greater underlying detail and drill-down capacity than do strategic dashboards. In truth, however, there is often considerable overlap between strategic and analytical dashboards.
Although it is tempting to define a performance measurement system only in terms of a specific dashboard display software product, it is important to remember that any successful development effort must address the whole system, including the development of useful metrics (the critical few) that are then displayed in a usable, understandable, and decision-actionable manner.
Jerry Harbour, Ph.D., is a recognized thought leader in the field of performance management and measurement, and the author of numerous published articles and four books on the subject. He holds a Ph.D. in applied behavioral studies from Oklahoma State University. He can be reached at [email protected]. More on dashboards at www.idashboards.com.