
A Comparison of Systemwide and Hospital-Specific Performance Measurement Tools

Clarence Yap, M.D., associate, McKinsey & Company, New York, New York; Emily Siu, BSc, analyst, Health Results Team, Ontario Ministry of Health and Long-Term Care, Toronto, Canada; G. Ross Baker, Ph.D., professor, Department of Health Policy, Management, and Evaluation, University of Toronto, Ontario, Canada; and Adalsteinn D. Brown, D.Phil., assistant professor, Department of Health Policy, Management, and Evaluation, University of Toronto, and Lead, Health Results Team, Ontario Ministry of Health and Long-Term Care

EXECUTIVE SUMMARY

Balanced scorecards are being implemented at the system and organizational levels to help managers link their organizational strategies with performance data to better manage their healthcare systems. Prior to this study, hospitals in Ontario, Canada, received two editions of the system-level scorecard (SLS), a framework, based on the original balanced scorecard, that includes four quadrants: system integration and management innovation (learning and growth), clinical utilization and outcomes (internal processes), patient satisfaction (customer), and financial performance and condition (financial). This study examines the uptake of the SLS framework and indicators into institution-specific scorecards for 22 acute care institutions and 2 non-acute-care institutions.

This study found that larger (teaching and community) hospitals were significantly more likely than small hospitals to use the SLS framework to report performance data (p < 0.0049 and p < 0.0507, respectively) and that teaching hospitals used the framework significantly more than community hospitals did (p < 0.0529). The majority of hospitals in this study used at least one indicator from the SLS in their own scorecards. However, all hospitals in the study incorporated indicators that required data collection and analysis beyond the SLS framework.

The study findings suggest that the SLS may assist hospitals in developing institution-specific scorecards for hospital management and that the balanced scorecard model can be modified to meet the needs of a variety of hospitals. Based on the insight from this study and other activities that explore top priorities for hospital management, the issues related to efficiency and human resources should be further examined using SLSs.

    For more information on the concepts in this article, please contact Dr. Brown at [email protected]. To purchase an electronic reprint of this article, go to www.ache.org/pubs/jhmsub.cfm, scroll down to the bottom of the page, and click on the purchase link.



Since an initial call for greater research into the use of the balanced scorecard framework in hospital care (Forgione 1997), a number of hospitals have reported implementing the framework and achieving success with its use (Kaplan and Norton 2001; Curtright, Stolp-Smith, and Edell 2000). Many examples of balanced-scorecard-like measurement systems for individual hospitals and hospital departments are presented in the literature (e.g., Curtright, Stolp-Smith, and Edell 2000; Harber 1998; Jones and Filip 2000; MacDonald 1998; Meliones 2000; Rimar and Garstka 1999; Gordon et al. 1998). A number of papers also document the development or diffusion of the balanced scorecard in hospital and healthcare systems (e.g., Baker and Pink 1995; Castaneda-Mendez, Mangan, and Lavery 1998; Zelman, Pink, and Matthias 2003; Chan and Ho 2000; Magistretti, Stewart, and Brown 2002; Pink and Freedman 1998; Sahney 1998). The balanced scorecard is used by different types of healthcare organizations, and it is modified to reflect each user group's realities. Documented users include hospital systems, hospitals, long-term-care facilities, national healthcare organizations such as the Joint Commission on Accreditation of Healthcare Organizations, and health departments of local and federal governments (Zelman, Pink, and Matthias 2003).

System-level scorecards (SLSs) typically indicate the performance of multiple hospital or healthcare organizations in the dimensions of learning and growth, internal business processes, customer, and financial performance. These performance dimensions are often adapted in ways that are specific to the hospital. However, the extent to which SLSs reflect the strategies of individual hospitals is unclear. The strategy of the funders of an SLS (typically hospital associations or payers such as government) may be different, and a common set of measures for all hospitals may not include the diversity of an individual hospital's own strategies. However, the link between strategy and performance measurement is critical to the success of the balanced scorecard (Kaplan and Norton 2001). A review of the literature indicates that the most common reason for the failure of balanced scorecards is that they are developed by individuals external to the organization or by those who are unfamiliar with the organization's strategy (McCunn 1998).

With an SLS, the critical question is: To what extent do the strategies of the payer or hospital system resemble the strategies of the system's individual hospitals, such that a standardized scorecard can support each hospital's specific management needs? Such a model may be possible in hospital systems such as the ones in Ontario, Canada, where the provincial government provides about 85 percent of hospitals' funding (Canadian Institute for Health Information 2003) and where issues such as capital expansion and the introduction of high-cost, high-technology care modalities are largely centrally coordinated. The SLS may have an added attraction in systems like those in Ontario, where many small hospitals (more than 25 hospitals have fewer than 50 beds) have limited resources to support the careful development of successful scorecards (e.g., Kaplan and Norton 2001).

At the time of this study, acute care hospitals in Ontario had received one edition of the SLS, called the Hospital Report (Baker et al. 1999). These scorecards, based on the balanced scorecard framework established by Kaplan and Norton (1992), included 38 indicators reported for virtually every hospital and an additional 21 reported at the regional level. These indicators covered system integration and change management (learning and growth), clinical utilization and outcomes (internal processes), patient satisfaction (customer), and financial performance and condition (financial). Sponsored by the provincial government and the provincial hospital association, Hospital Report 1999 enjoyed the voluntary participation of 89 hospitals out of 129 eligible hospitals, including 12 teaching hospitals (100 percent of teaching hospitals), 63 community hospitals (82 percent of community hospitals), and 14 small hospitals (35 percent of small hospitals) (Baker et al. 1999).

The SLS uptake into and response from Ontario hospitals over time have led to an expansion of the Hospital Report series beyond the acute care hospital sector and into four other sectors: complex continuing care, rehabilitation, emergency department care, and mental health. An ongoing excerpt from the SLS focuses on women's healthcare in hospitals.

Although the SLS provides a common measurement tool for hospitals and is an important component of both the provincial government's and hospitals' own accountability policies, hospitals need to develop their own, institution-specific scorecards for a number of reasons. First, a hospital's scorecard has to reflect only the institution's own strategies. Second, a hospital's performance data on the scorecard should be updated more often than is possible with the yearly SLS. Third, a hospital needs to disaggregate scorecard data to the business-unit level and to align incentives and practices within the hospital. Fourth, institution-specific scorecards encourage managers and staff to change their behavior and activities to achieve corporate strategic objectives, because the performance indicators on these institutional scorecards, such as patient satisfaction and clinical efficiency, show how individuals can directly influence the results (Oliveira 2001). However, at the time of the release of the first Hospital Report, only two Ontario hospitals had published examples of their own scorecards (Harber 1998; MacDonald 1998), and a review of hospital performance measurement activities across Canada did not identify many additional scorecard examples (Baker et al. 1998).

This article reports on the uptake of the SLS (Hospital Report) into Ontario hospitals' own performance measurement systems within one year of the SLS's release. It also explores some of the hospital characteristics associated with the uptake of the standardized model. Figure 1 summarizes the Hospital Report framework used in 1999.


FIGURE 1
Summary of the Hospital Report 1999 Framework

Clinical Utilization and Outcomes: access, outcome, clinical efficiency
Financial Performance and Condition: financial viability, efficiency, liquidity, capital, human resources
Patient Satisfaction: patient satisfaction
System Integration and Management Innovation: information use, internal coordination of care, hospital-community integration

METHODS

In 2000, through a mailing list maintained by the Ontario Hospital Association, we contacted all acute care hospitals in Ontario (n=129), requesting them to send in a copy of their balanced scorecards, executive dashboards, or other corporatewide performance measurement tools developed by the hospitals. Hospitals could return copies of this material by fax, e-mail, or mail to us (the researchers) or to the Ontario Hospital Association. After one month, we sent a reminder by mail to all hospitals. Through its member relations department, the Ontario Hospital Association also nominated hospitals that were most likely to have a hospital scorecard; from the nominations, we contacted members of senior management at ten leading hospitals by phone or in person to request copies of their performance measurement framework. We also searched conference proceedings and Medline for examples of hospital scorecards. Based on these follow-ups, we believe that no other scorecards were being implemented at hospitals in Ontario at that time.

We entered information from all returned material into a database and created a final list of all indicators contained in any of the submitted performance measurement tools. To classify the indicators, one of us conducted an initial cataloging and another conducted a second. Another researcher reconciled any differences found between the first and the second indicator classifications. We coded indicators on each hospital's scorecard as "same," "similar," or "different" from the indicators on the SLS. We tallied the number of indicators on each hospital's scorecard that were the same as or similar to indicators on the SLS, and then we separately tallied the number using only indicators that were the same. We made several assumptions when classifying indicators as the same as or similar to indicators on the SLS. In several instances, we contacted hospitals to clarify our interpretations. Rules used to code indicators are available from us and are included in Appendix 1 (see Note 1). We coded the use of the quadrants included in the SLS in a similar fashion.
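As a rough illustration of this tallying step, the sketch below (Python, not the authors' actual database or code; the hospital names and indicator codes are hypothetical) counts, for each hospital, how many indicators were coded the same as an SLS indicator and how many were the same or similar.

```python
# Minimal sketch (not the authors' code): tallying how many indicators on each
# hospital's scorecard were coded "same", "similar", or "different" relative to
# the SLS. Hospital names and code lists below are hypothetical examples.
from collections import Counter

# hospital -> list of codes assigned after the reconciled classification
coded_indicators = {
    "Hospital A": ["same", "similar", "different", "same"],
    "Hospital B": ["similar", "different", "different"],
}

for hospital, codes in coded_indicators.items():
    counts = Counter(codes)
    same = counts["same"]
    same_or_similar = counts["same"] + counts["similar"]
    print(f"{hospital}: same={same}, same or similar={same_or_similar}, "
          f"different={counts['different']}, total={len(codes)}")
```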

We stratified acute care hospitals into the three peer groups used in Ontario: community, small, and teaching. We did not include the two non-acute hospitals in our tests. We used Fisher's exact test to compare proportions of hospitals across peer groups because this test applies a more stringent significance criterion that accounts for small cell sizes.
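To make the peer-group comparison concrete, here is a minimal sketch of how such a test could be run in Python with scipy. The counts are illustrative, inferred from the percentages reported in the Results section (5 of roughly 12 teaching, 15 of roughly 77 community, and 2 of roughly 40 small hospitals returning scorecards); the exact denominators and the choice of a two-sided test are assumptions, so the p-values printed here need not match those reported in the article.

```python
# Minimal sketch (not the authors' code): Fisher's exact test comparing the
# proportion of hospitals with a balanced scorecard across peer groups.
# Counts are illustrative, inferred from percentages reported in the article.
from scipy.stats import fisher_exact

# peer group -> (hospitals that returned a scorecard, hospitals that did not)
groups = {
    "teaching": (5, 12 - 5),     # assumed ~12 eligible teaching hospitals
    "community": (15, 77 - 15),  # assumed ~77 eligible community hospitals
    "small": (2, 40 - 2),        # assumed ~40 eligible small hospitals
}

def compare(a: str, b: str) -> float:
    """Two-sided Fisher's exact test on the 2x2 table for peer groups a and b."""
    table = [list(groups[a]), list(groups[b])]
    _, p_value = fisher_exact(table, alternative="two-sided")
    return p_value

for a, b in [("teaching", "small"), ("community", "small"), ("teaching", "community")]:
    print(f"{a} vs. {b}: p = {compare(a, b):.4f}")
```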

RESULTS

Thirty acute care hospitals (response rate of 23 percent) and two non-acute care hospitals responded. Some hospitals returned a range of performance measurement tools, including corporate and unit-level balanced scorecards, draft versions of a scorecard, and executive dashboards with a different framework from the balanced scorecard. A total of 24 hospitals provided balanced scorecards, which included 22 acute care hospitals (17 percent of all acute care hospitals) and 2 non-acute care hospitals (1 mental health care hospital and 1 chronic care hospital). Two more acute care hospitals (2 percent of all acute care hospitals) were developing balanced scorecards at the time of the survey, one acute care hospital (0.8 percent) stated that it used a balanced scorecard but did not provide evidence and thus was coded as not using a balanced scorecard, and five acute care hospitals (4 percent) did not use the balanced scorecard framework to communicate performance data. Follow-up by mail and by phone did not identify any additional scorecards.

Since completion of the study, we have not found evidence of additional scorecards at any Ontario hospitals. Although the response rate was relatively low (23 percent), we believe that this percentage represents almost the entire sample of balanced scorecards at Ontario hospitals during the study period. Although we may have worked from a nearly complete sample of Ontario hospitals that used balanced scorecards, the relatively low number of responses means that differences across hospitals should be interpreted with caution because they will rarely be statistically significant.

Based on 1999 data, an average of 656 beds were available in teaching hospitals, 265 beds in community hospitals, and 39 beds in small hospitals. Acute care institutions that used balanced scorecards tended to be larger: 5 were teaching hospitals (42 percent of all teaching hospitals), 15 were community hospitals (20 percent of all community hospitals), and 2 were small hospitals (5 percent of all small hospitals). These differences across peer groups were significant using Fisher's exact test and suggest that larger hospitals were more likely to have scorecards (teaching versus small, p < 0.0049; community versus small, p < 0.0507). Similarly, the two non-acute hospitals (the mental health hospital and the chronic care hospital) had strong research and teaching missions.

Likewise, use of a framework similar to the SLS framework was significantly more common among teaching hospitals than among community hospitals (p < 0.0529) and small hospitals (p < 0.0079). Twenty acute care hospitals employed a framework with at least 1 SLS quadrant, and 13 used at least all 4 of the SLS quadrants. Table 1 summarizes the formats used by the 22 acute care hospitals that provided balanced scorecards and shows the frequency with which hospitals used quadrants in their own scorecards that were the same as those in the SLS framework. The number in parentheses indicates the number of hospitals that used the same or similar quadrants.

TABLE 1
Use of the SLS Framework Quadrants Among Acute Care Hospitals with Balanced Scorecards (n=22)

                            None   At least 1    At least 2    At least 3    At least 4    Only used the
                                   HR quadrant   HR quadrants  HR quadrants  HR quadrants  4 HR quadrants
Community (n=15)              0        15            13          9 (10)           8              7
Small (n=2)                   1         1             1          1                1              0
Teaching (n=5)                1         4             4          4                4              4
Total number of hospitals     2        20            18         14 (15)          13             11

Thirteen hospitals (59 percent of acute care hospitals with a scorecard) employed a framework similar to that used by the SLS, with at least four quadrants to represent system integration and managerial innovation, clinical utilization and outcomes, patient satisfaction, and financial performance and condition. Two of these hospitals also added extra domains of indicators that emphasized or added certain aspects of performance.

One goal of the balanced scorecard is to identify a core set of indicators that can be used to summarize an organization's performance (Chan and Ho 2000). On average, hospitals reported 27.3 indicators per scorecard (range: 13-53). Another goal of the balanced scorecard is to provide a clear set of metrics with which to evaluate an organization's strategy (Kaplan and Norton 2001). Thus, the extent to which hospitals use similar indicators may reflect the extent to which hospitals are pursuing similar strategies in Ontario's largely single-payer hospital system. Table 2 shows the number of SLS indicators used in the 22 acute care hospitals' scorecards, regardless of whether the hospitals followed the SLS framework. The number in parentheses shows the count when indicators that were the same or similar were both included.

TABLE 2
Number of SLS Indicators Used Among Acute Care Hospitals with Balanced Scorecards (n=22)

                            1-10      11-20     21-30      31-40     41+
Community                  13 (1)     1 (4)     0 (6)      1 (2)     0 (2)
Small                       2 (0)     0 (0)     0 (2)      0 (0)     0 (0)
Teaching                    5 (0)     0 (3)     0 (2)      0 (0)     0 (0)
Total number of hospitals  20 (1)     1 (7)     0 (10)     1 (2)     0 (2)

Although there were differences in the number of SLS indicators used, the majority of acute care hospitals with a scorecard, regardless of peer group, used at least one indicator from the SLS. All hospitals used at least one similar indicator, and 82 percent of hospitals used at least one indicator that was the same as an indicator on the SLS. Table 3 describes the number of all indicators, regardless of whether they are from the SLS or not, and the number of same SLS indicators included in the corresponding SLS quadrants by the 13 acute care hospitals that used an SLS-based framework. The number in parentheses indicates the number of indicators that are the same or similar.

TABLE 3
Mean Number and Range of Indicators Included in Balanced Scorecards That Follow the SLS Model (n=13)

                                    Clinical            Patient          Financial           System
                                    Utilization         Satisfaction     Performance         Integration
                                    and Outcomes                         and Condition       and Change
Mean number of indicators in        7.6; 3-13           5.6; 2-10        5.5; 1-12           5.2; 2-10
  SLS quadrant                      (19.7; 3-34)        (8.2; 2-16)      (6.1; 1-12)         (5.5; 2-10)
Mean number of SLS indicators       0.54 (15.3)         0.62 (4.1)       1.5 (2.7)           0.38 (1.3)
  in quadrant

Table 4 shows the total number of indicators different from the SLS used by the 24 hospitals that provided balanced scorecards, regardless of whether the hospitals followed the SLS model. The number in parentheses reflects the number of indicators when both same and similar indicators were counted as being the same as the ones on the SLS.

TABLE 4
Total Number of Indicators Different from the SLS Indicators (n=24)

Number of different indicators     0      1-10     11-20     21-30     31+
Number of hospitals                0      3 (2)    9 (7)     9 (11)    3 (4)

All of the acute and non-acute care hospitals with balanced scorecards had at least one indicator that was different from those contained in the SLS, and all had a fairly high average number of different indicators. This suggests that hospitals developed their own scorecards using data beyond those easily available in the SLS framework to reflect the differences in their corporate strategies.

Interestingly, the choice of the SLS indicators was not consistently related to the availability of data. Of the 41 unique clinical utilization and outcomes indicators, 25 (61 percent) were based on routinely collected discharge abstracts; the remainder were based on other data sources, including chart abstracts and admission, discharge, and transfer systems. In contrast, of the 24 unique financial performance and condition indicators, 22 were based on routinely collected financial income and balance sheet data.

DISCUSSION AND CONCLUSION

Like hospitals in other jurisdictions, hospitals in Ontario face tremendous challenges as they experience mergers, closures, and funding restraints (Chan and Lynn 1998). Managers of hospital systems are increasingly concerned about measuring and managing organizational performance in an attempt to remain focused on delivering high-quality patient care while maintaining expenditures within global budgets that are centrally established. Not surprisingly, a number of hospitals have adopted balanced scorecards, an approach that is likely to help improve organizational performance. This article is consistent with earlier work by Chan and Ho (2000), who studied the uptake of the balanced scorecard model by hospital executives across Canada. Their study reveals that most executives supported the strategic use of the balanced scorecard in management.

It is unclear whether the adoption of the SLS framework or indicators resulted from its relevance to individual hospitals or from the ease of using already-available data to measure performance. Hospitals with their own balanced scorecard were more likely to be teaching or community hospitals than small hospitals. A lack of resources may have prevented small hospitals from developing any type of scorecard. This suggests that a greater proportion of small hospitals with scorecards follow the SLS framework (containing at least four of the SLS quadrants) because a lack of resources prevents them from developing their scorecards further. However, the proportion of small hospitals that used the framework was not significantly greater than that of larger hospitals, which may be a result of the small sample size. It may also result from the unsuitability of the SLS framework for small hospital management, but it is not possible to test this possibility using the available data.

Increased attention on balanced scorecards may also have influenced the development of an institutional balanced scorecard and uptake of the SLS framework by hospitals. According to the Hospital Report 2004 Strategic Priorities Survey, 68 of 123 acute care hospitals (55 percent) reported that they had a documented and clearly articulated accountability framework (e.g., local balanced scorecard) used in the planning and evaluation of internal programs. This suggests that the balanced scorecard may be increasingly recognized as a valuable management tool for hospitals since this study. Further assessments on the use of the Hospital Report (SLS) framework will examine the use of the SLS among Ontario acute care hospitals over time. Increased or sustained use of the framework may signify its relevance, but research should further evaluate its relevance to hospital managers.

The relatively strong, early use of a common set of indicators by nearly a quarter of hospitals in Ontario suggests that SLSs may have some relevance in quasi-market systems, where strong government regulation and a largely single-payer system can encourage the development of similar strategies across separate organizations. Compared to other systems, healthcare organizations in Ontario are not unique in terms of government intervention in operational or capital financing and other forms of regulation, although Ontario may have greater degrees of such intervention. These findings suggest that system-level balanced scorecards may have some relevance to a number of quasi-market hospital systems. At a minimum, broad benchmarking activities that provide a common set of indicators to a large number of hospitals may be valuable, as these organizations design and implement strategies to respond to common environmental characteristics.

The early uptake of the SLS framework by acute care hospitals with a balanced scorecard suggests that the SLS may be a useful starting point as hospitals develop their own performance measurement system. A larger proportion of teaching hospitals used the framework, suggesting that the framework may be more relevant to this group of hospitals than to other peer groups. The greater use of the Hospital Report among teaching hospitals may be an artifact, but this is unlikely because of its statistical significance, one of the few strongly statistically significant findings in our work.

The use of available data may have been another factor in the use of the framework; however, an average of 19 indicators different from the SLS per scorecard were found even when indicators similar to the SLS were included. This suggests that hospitals were able to collect their own data. Furthermore, not all the SLS indicators were used. Regardless, the uptake of SLS indicators at the local level is interesting and suggests that there may be a role for these sorts of cross-organizational performance measurement tools in supporting hospital management. This may result from the fact that the framework reduces data-collection burden, that hospitals have similar strategies, or that hospitals may find difficulty in obtaining performance measurement experts to develop more appropriate indicators.

A number of hospitals in our study have made great efforts to ensure a broad spectrum of indicators to help align their strategy. Although some of these hospitals built from indicators included in the SLS, their indicators differed from the SLS framework in three interesting and common ways, which reflect the strategic and accountability needs of individual hospitals:

1. Some institutions, particularly teaching hospitals, focused on research and teaching indicators such as new research contracts and medical student education. These indicators reflect the relative importance of research and teaching at these institutions.

2. Some institutions looked at measures of community benefit and related their own performance to the community they serve.

3. Some institutions reorganized indicators to highlight other perspectives such as the patient perspective or the institutional perspective. Although this makes it difficult to work from the cause-and-effect relationships embedded in the balanced scorecard framework, it may reflect the importance of internal and external communication of needs. It may also help highlight trade-offs between different groups within the hospital, such as between staff and patients.

Likewise, a number of common differences highlight some useful directions for future scorecards that describe an entire system. First, as indicated by 17 of 24 (71 percent) institution-specific scorecards and in the Hospital Report 2004 Strategic Priorities Survey as an important area of focus, entire-system scorecards should examine human resources issues, which include recruitment, retention, staff satisfaction, turnover, and quality of work life. Second, as the province of Ontario provides the majority of funds to hospitals, greater focus should be placed on efficiency as a key source of competitive advantage. An example of such an indicator used by certain hospitals is turnaround time for clinical information such as health records, diagnostic imaging, and labs.

The SLS appears to be an important tool used by healthcare institutions to measure their individual performance, but other measures of performance that these specific institutions used may not have been accounted for in this study. These include consulting reports, reports that were under development or were not shared, and individual unit-level scorecards not known by contacts at the hospital. Exclusion of these documents is unlikely to represent a serious bias in this study, but it must be acknowledged, as the use of a hospital-specific balanced scorecard may be underestimated in our results.

This study is the first to examine the use of the balanced scorecard among hospitals. Future assessments of scorecard use are necessary to examine the continuing impact on hospitals. The study also presents a somewhat contrasting perspective on earlier work that focuses on individual organizations' own scorecards, suggesting that features of scorecards and other performance measurement tools may have some relevance across different organizations and that joint performance measurement activities should not be neglected because of differences in strategy across organizations. Further work may usefully focus on interventions that may help hospitals integrate common indicators into institution-specific scorecards, on factors in healthcare systems that lead to or are associated with common sets of indicators, and on the growth of the scorecards as a measurement framework over time. Within Ontario, ongoing research is identifying the growth in the scorecard as a measurement tool across the hospital system and describing similarities in hospital strategies.

NOTE

1. To access the appendix, go to www.ache.org/pubs/jhmsub.cfm and scroll down.

ACKNOWLEDGMENTS

Paula McColgan helped collect information on hospital scorecards. Special thanks to Carey Levinton for his guidance and assistance in statistical analysis and interpretation. The Hospital Report Project is supported by the Ontario Hospital Association and the Ontario Ministry of Health and Long-Term Care.

REFERENCES

Baker, G. R., and G. H. Pink. 1995. "A Balanced Scorecard for Canadian Hospitals." Healthcare Management Forum 8 (4): 7-21.
Baker, G. R., G. M. Anderson, A. D. Brown, I. McKillop, M. Murray, and G. H. Pink. 1999. Hospital Report '99: A Balanced Scorecard for Ontario Hospitals. Toronto, ON: Ontario Hospital Association.
Baker, G. R., N. Brooks, G. Anderson, A. Brown, I. McKillop, M. Murray, and G. H. Pink. 1998. "Healthcare Performance Measurement in Canada: Who's Doing What?" Hospital Quarterly 2 (2): 22-26.
Canadian Institute for Health Information. 2003. Hospital Report 2002: Acute Care. Toronto, ON: Ontario Hospital Association and Government of Ontario.
Castaneda-Mendez, K., K. Mangan, and A. M. Lavery. 1998. "The Role and Application of the Balanced Scorecard in Healthcare Quality Management." Journal for Healthcare Quality 20 (1): 10-13.
Chan, Y. C-L., and S-J. K. Ho. 2000. "The Use of Balanced Scorecards in Canadian Hospitals." Unpublished paper, Michael G. DeGroote School of Business, McMaster University, Hamilton, Ontario.
Chan, Y. C-L., and B. E. Lynn. 1998. "Operating in Turbulent Times: How Ontario's Hospitals Are Meeting the Current Funding Crisis." Health Care Management Review 23 (3): 7-18.
Curtright, J. W., S. C. Stolp-Smith, and E. S. Edell. 2000. "Strategic Performance Management: Development of a Performance Measurement System at the Mayo Clinic." Journal of Healthcare Management 45 (1): 58-68.
Forgione, D. A. 1997. "Health Care Financial and Quality Measures: International Call for a 'Balanced Scorecard' Approach." Journal of Health Care Finance 24 (1): 55-58.
Gordon, D., M. Carter, H. Kunov, A. Dolan, and E. Chapman. 1998. "A Strategic Information System to Facilitate the Use of Performance Indicators in Hospitals." Health Services Management Research 11 (2): 80-91.
Harber, B. 1998. "The Balanced Scorecard Solution at Peel Memorial Hospital." Hospital Quarterly 1 (4): 59-61, 63.
Jones, M. L. H., and S. J. Filip. 2000. "Implementation and Outcomes of a Balanced Scorecard Model in Women's Services in an Academic Health Care Institution." Quality Management in Health Care 8 (4): 40-51.
Kaplan, R. S., and D. P. Norton. 1992. "The Balanced Scorecard: Measures That Drive Performance." Harvard Business Review 70 (1): 71-79.
Kaplan, R. S., and D. P. Norton. 2001. The Strategy-Focused Organization. Boston: Harvard Business School Publishing Corporation.
Macdonald, M. 1998. "Using the Balanced Scorecard to Align Strategy and Performance in Long Term Care." Healthcare Management Forum 11 (3): 33-38.
Magistretti, A. I., D. E. Stewart, and A. D. Brown. 2002. "Performance Measurement in Women's Health: The Women's Health Report, Hospital Report 2001 Series, A Canadian Experience." Women's Health Issues 12 (6): 327-37.
McCunn, P. 1998. "The Balanced Scorecard . . . The Eleventh Commandment." Management Accounting 76 (11): 34-36.
Meliones, J. 2000. "Saving Money, Saving Lives." Harvard Business Review 78 (6): 57-62, 64, 66-67.
Oliveira, J. 2001. "The Balanced Scorecard: An Integrative Approach to Performance Evaluation." Healthcare Financial Management 55 (5): 42-46.
Pink, G. H., and T. J. Freedman. 1998. "The Toronto Academic Health Science Council Management Practice Atlas." Hospital Quarterly 1 (3): 26-34.
Rimar, S., and S. J. Garstka. 1999. "The 'Balanced Scorecard': Development and Implementation in an Academic Clinical Department." Academic Medicine 74 (2): 114-22.
Sahney, V. 1998. "Balanced Scorecard as a Framework for Driving Performance in Managed Care Organizations." Managed Care Quarterly 6 (2): 1-8.
Zelman, W. N., G. H. Pink, and C. B. Matthias. 2003. "Use of the Balanced Scorecard in Health Care." Journal of Health Care Finance 29 (4): 1-16.

PRACTITIONER APPLICATION

    Mimi P. Lowi-Young, FACHE, FCCHSE, vice president, Central Canada, andexecutive director, Ontario Division, Canadian National Institute for the Blind

Since the inception of balanced scorecards, there has been limited research to understand the link between strategy and performance measurements as well as the usefulness and applicability of the indicators to individual institutions. The popularity of balanced scorecards has increased in the hospital sector over the past few years. The focus on accountability to funders (especially when the hospitals are funded at 85 percent from a single payer) and to the patients and the community served by the hospital necessitated the development of a framework to measure performance. The system-level scorecard provides that overall framework, including the relevant quadrants regardless of the size or type of hospital. The data can also serve to provide benchmarking opportunities.
