The purpose of this thesis project was to examine how best to use principles of modern Business Intelligence (BI) and dashboards to support real-time decision making and performance improvement in a modern fire service agency. The problem that this project attempted to solve is the disconnect between the personnel and organizational elements of a fire department on the one hand and overall organizational goals and program measurement on the other. Simply put, the average fire department employee does not see the connection between their daily work and the quarterly or yearly statistics that most fire service agencies use for performance measurement.
In terms of scoping, this thesis confined itself to the general knowledge necessary for a fire service agency to construct a first-generation real-time dashboard for use in an iterative development process. The intent is for this dashboard to be used to support real-time feedback and decision making. The thesis did not debate whether dashboards are in fact needed, nor did it debate the merits of the various styles of dashboards already extant in the business world. Nor did this thesis attempt to serve as a “consumer report” on the various software packages currently available for constructing dashboards.
The methodology used in this thesis was a form of action research. The guiding principle was the two-interlinked-cycle theory of action research proposed by McKay and Marshall. As they explain, this method of research is ideal for solving new information systems problems and for solving practical problems while creating new knowledge. The first of the two interlinked cycles is that of research (in this case, a literature search and review), and the second is that of a Lean Startup Build-Measure-Learn cycle. The thesis examined only the research and build phases of these two cycles in order to construct a prototype dashboard.
To lay the groundwork it was first necessary to understand performance measurement and improvement from a local government and fire service perspective. From the viewpoint of local governments, it was helpful to understand that performance measurement provides transparency and reassurance to citizens who want to slash unneeded spending but who also “care about the scope and quality of services being provided.” However, the most relevant purpose for performance measurement is to improve performance with all other purposes being secondary.
With these factors in mind, performance measurement terms were defined. Effectiveness was defined as “how well a service does what it is supposed to do,” while efficiency “is concerned with how well resources are used in providing the service.” Effectiveness is usually measured via outputs (tangible, discrete products such as speeding tickets) and outcomes (more general “consequences of supplying public services to targeted recipients”).
From the business world, the concept of lead and lag program measures was found to be useful. Lag program measures are defined as “tracking measures…[of] performance that is already in the past,” while lead measures are defined as the “measures of new behaviors that will drive success on the lag measures.”
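To make the lead/lag distinction concrete, the following is a minimal sketch in Python. The incident records, field layout, and the 60-second turnout target are hypothetical illustrations, not data or standards drawn from the thesis:

```python
from datetime import date
from statistics import mean

# Hypothetical incident records:
# (incident_date, turnout_seconds, total_response_seconds)
incidents = [
    (date(2017, 4, 3), 55, 310),
    (date(2017, 4, 18), 80, 402),
    (date(2017, 5, 3), 58, 295),
    (date(2017, 5, 3), 95, 480),
]

# Lag measure: average total response time for a past month --
# performance that is already in the past and cannot be changed.
april_responses = [r for d, _, r in incidents if d.month == 4]
lag_avg_response = mean(april_responses)  # seconds

# Lead measure: share of today's turnouts meeting a 60-second target --
# a behavior crews can still influence, call by call, in real time.
TODAY = date(2017, 5, 3)
todays_turnouts = [t for d, t, _ in incidents if d == TODAY]
lead_turnout_compliance = sum(t <= 60 for t in todays_turnouts) / len(todays_turnouts)
```

The lag measure would appear in a quarterly report; the lead measure is the kind of value a real-time dashboard could refresh after every call.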
Fire department measuring systems were then discussed. One traditional measure of fire service performance is the Insurance Services Office (ISO) Public Protection Classification (PPC) system, which measures fire department readiness for fire suppression against a static checklist. Another measuring system that was examined was the Commission on Fire Accreditation International (CFAI) process.
Robert Behn suggests that a combination of output and outcome measures is necessary to capture the whole state of organizational effectiveness. He suggests that by focusing internally on output measures, and improving them, organizations can realize gains in outcomes. Both the National Fire Protection Association and CFAI recommend goals that seem to adhere to this principle; for example, three core fire suppression outcome measures proposed by both are the following: the percentage of times that a fire department confines a fire to the room (or structure) of origin, fire deaths and injuries per capita, and the number of firefighter injuries. All of these externally reported outcome measures encompass a variety of subordinate output measurements.
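Two of these outcome measures reduce to simple computations over incident records. The sketch below illustrates this in Python; the record structure and the jurisdiction population are hypothetical assumptions for the example, not figures from the thesis:

```python
# Hypothetical structure-fire records: each flags whether the fire was
# confined to its room of origin, plus resulting civilian deaths.
fires = [
    {"confined_to_room": True,  "civilian_deaths": 0},
    {"confined_to_room": True,  "civilian_deaths": 0},
    {"confined_to_room": False, "civilian_deaths": 1},
    {"confined_to_room": True,  "civilian_deaths": 0},
]
POPULATION = 100_000  # assumed jurisdiction population

# Outcome measure 1: percentage of fires confined to the room of origin.
confinement_rate = sum(f["confined_to_room"] for f in fires) / len(fires)

# Outcome measure 2: fire deaths per capita, normalized per 100,000 residents.
deaths_per_100k = sum(f["civilian_deaths"] for f in fires) / POPULATION * 100_000
```

Each of these lag-style outcomes is fed by many subordinate output measures (response times, inspection counts, training hours), which is where the lead measures are found.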
Putting these concepts together led to the realization that it would be best to first decide upon the lag measurements that best define fire service performance, and then work backwards through the relevant business processes to identify which lead measures can be most readily affected in a real-time manner. These lead measures can then be displayed on a real-time dashboard, as “it is well known that providing goals and feedback are two of the most effective interventions” for improving performance. Other purposes of the dashboard would include monitoring completion of employees’ rote tasks and providing situational awareness information such as traffic, weather, expected call loads, and hospital statuses.
The desirability of having different dashboards for different levels of supervision was also examined. This appeared functionally desirable, but all employees in an organization should still have access to a “single source of truth.”
Dashboard graphic design principles were also discussed. There was near unanimity among the authors surveyed that dashboards should be simple to understand, should fit on one screen, that indicators should support drill-down (clicking on an indicator to display a second screen with more detailed information), and that dashboards need to be designed in an iterative process with end-user feedback. In terms of other visual characteristics, the most important information should be located on the upper left and center, and it is also important to consider color-coding and co-locating related information.
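These design principles can be captured in a minimal data model for a dashboard tile. The sketch below is a hypothetical illustration in Python (the class, field names, and color rule are assumptions, not a design taken from the thesis); the `drill_down` field holds the detail view behind a top-level indicator, and `status` applies a simple color code:

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One top-level dashboard tile, with drill-down detail behind it."""
    name: str
    value: float
    target: float
    drill_down: dict = field(default_factory=dict)  # detail shown on click

    def status(self) -> str:
        # Simple color coding: green when the target is met, red otherwise.
        return "green" if self.value >= self.target else "red"

# A red top-level tile; drilling down reveals which station is lagging.
turnout = Indicator(
    name="Turnout compliance",
    value=0.82,
    target=0.90,
    drill_down={"Station 1": 0.95, "Station 2": 0.70},
)
```

Keeping only the name, color, and value on the top screen, with station-level detail one click away, reflects the one-screen and drill-down principles described above.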
The research suggests that by following the principles learned from the business world, fire service agencies can construct dashboards to improve performance. Prototype first-generation dashboards were successfully constructed. However, further research will be needed to examine whether dashboards are effective at improving performance, and which aspects of the dashboards are most effective at doing so.
 Judy McKay and Peter Marshall, “The Dual Imperatives of Action Research,” Information Technology and People 14, no. 1 (2001): 47, 50.
 Eric Ries, The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses (New York: Crown Business, 2011), 200.
 David Edwards and John Clayton Thomas, “Developing a Municipal Performance-Measurement System: Reflections on the Atlanta Dashboard,” Public Administration Review 65, no. 3 (May 2005): 369–76, doi:10.1111/j.1540-6210.2005.00461.x.
 Robert D. Behn, “Why Measure Performance? Different Purposes Require Different Measures,” Public Administration Review 63, no. 5 (September 2003): 586–606, doi:10.1111/1540-6210.00322.
 Jennifer Flynn, “Fire Service Performance Measures” (report, National Fire Protection Association, November 2009), 5.
 XiaoHu Wang, “Perception and Reality in Developing an Outcome Performance Measurement System,” International Journal of Public Administration 25, no. 6 (January 2002): 808, doi:10.1081/PAD-120003819.
 Chris McChesney, Sean Covey, and Jim Huling, The 4 Disciplines of Execution (New York: Free Press, 2012), 11.
 “How the PPC Program Works,” Insurance Services Office, accessed May 3, 2017, www.isomitigation.com/
 Behn, “Why Measure Performance,” 595.
 Flynn, “Fire Service Performance Measures,” 20.
 Bruce F. Chorpita, Adam Bernstein, and Eric L. Daleiden, “Driving with Roadmaps and Dashboards: Using Information Resources to Structure the Decision Models in Service Organizations,” Administration and Policy in Mental Health and Mental Health Services Research 35, no. 1–2 (March 2008): 114–23, doi:10.1007/s10488-007-0151-x.
 Richard P. DeShon et al., “A Multiple-Goal, Multilevel Model of Feedback Effects on the Regulation of Individual and Team Performance,” Journal of Applied Psychology 89, no. 6 (2004): 1035–56, doi:10.1037/0021-9010.89.6.1035.
 Amelia Cahyadi and Adi Prananto, “Reflecting Design Thinking: A Case Study of the Process of Designing Dashboards,” Journal of Systems and Information Technology 17, no. 3 (August 2015): 286, doi:10.1108/JSIT-03-2015-0018.
 Ibid., 297.
 Michael K. Allio, “Strategic Dashboards: Designing and Deploying Them to Improve Implementation,” Strategy and Leadership 40, no. 5 (2012): 24, doi:10.1108/10878571211257159.
 Wayne G. Bremser and William P. Wagner, “Developing Dashboards for Performance Management,” The CPA Journal; New York 83, no. 7 (July 2013): 67.