Is the laboratory productive? Are we truly holding down costs? Do we make the best possible use of our personnel? Are our methods for test and instrument evaluation valid? Laboratory managers ask themselves these difficult questions all the time. Often, they’re satisfied that the lab is moving in the right direction, but now administration is asking, “Can you prove it?”

Last year, on the eve of national introduction of prospective payment, we decided it was time for serious introspection. A full-scale departmental audit would tell us what kind of shape the laboratory was in. Such an analysis is routine in big business, but it is fairly uncharted territory for hospital departments.

It also can be time-consuming and nerve-racking, yet it’s worth the struggle. You end up with a very good handle on laboratory operations, including how much of your resources are devoted to different activities, and guidelines for improvements.
The exercise makes it easier to prepare annual budgets, possibly including changes in staffing and instrumentation as well as new charges for tests. Similarly, it provides a solid basis for long-range planning. Your proposals can be amply justified to administrators with data developed in the audit.

We examined the laboratory from every angle: staffing, productivity, and workload; test order patterns, department work flow, and patient demographics; cost per test, single-test versus batch cost, and instrumentation; and budget figures.
The audit had two facets. The first involved an analysis of the lab as a unit to look at overall productivity, efficiency, costs, and revenue. We planned to use external monitors, namely Ohio Hospital Management Services (OHMS), Monitrend, and CAP workload recording, to rate our performance internally, regionally, and nationally.

Ohio Hospital Management Services is an independent monitoring group that works with the Ohio Hospital Association. As part of a voluntary hospitalwide program, OHMS compares the laboratory’s monthly workload with that of other hospital departments and with the workload in labs at hospitals of similar bed size. Our 406-bed hospital also participates in Monitrend, a computerized reporting system developed by the American Hospital Association to monitor the activities of different departments.
In this program, we’re compared with laboratories in institutions of similar size and case mix nationwide, with hospital laboratories of similar size in the state, and with hospital labs in our area.

The second facet of the audit was a concurrent analysis of the individual lab sections. By working with the section supervisors, I hoped to glean better information about workload and staffing and ultimately develop new standards for measuring efficiency.

Figure I outlines our basic audit goals. Generally, these were to assess lab management, determine the exact workload, rate our cost-effectiveness, and evaluate our sections’ ability to meet service demands.

The project seemed fairly straightforward. As audit coordinator, I naively thought we could wrap things up in four weeks. Despite countless hours volunteered by a dedicated staff, the time frame eventually stretched to four months.
Indeed, I spent an entire month just researching the feasibility of various kinds of studies, finding out what information was available within the hospital, and, along with other personnel, writing the programs for our CompuPro 816A minicomputer. Once the groundwork was laid, I charted what we hoped to accomplish and listed tentative completion dates.

Everyone in the laboratory helped gather data. Phlebotomists, for example, timed how long routine and Stat collections took. Technologists carried out time studies on the tests they performed. Supervisors determined direct costs for each test and reviewed many other aspects of section operations. In addition to the clerical staff, technologists and supervisors input data as time permitted and kept me informed so that I could keep track of who was doing what and what remained to be done.

The minicomputer performed more than 400,000 statistical combinations, and the final report totaled 300 pages, including graphs and tables.
How many Stats chemistry ran by hour and shift would be a single combination.

In examining laboratory activity by type of patient, to take one example, we pulled requisition slips for a high-volume month, a low-volume month, and an “average” month during 1983. Data were entered on patient type (inpatient, outpatient, clinic, etc.); routine, ASAP, or Stat testing; number of tests requested; section performing the test; day of the week; time of day; turnaround time; and other factors. This single exercise generated 50,246 computer records.

We initially had hoped to look at each laboratory section as a separate functional cost center.
The plan was to use standard cost accounting methods to allocate indirect costs to each section based on activity, space utilization, use of support services, and employee hours. But the information needed for this type of audit was not readily available; most hospitals don’t use such parameters to allocate indirect costs. The method we settled on was to take the lab’s percentage of hospital revenues and apply that to hospital expenses to determine our share.

We analyzed all test procedures in terms of direct and indirect costs, gross margin, and net margin. Our general goal was simply to improve administration’s awareness of the lab’s financial status.
To accomplish this, we felt we needed to determine the contribution of the total laboratory operation (gross revenue minus total expenses) and the full cost per work unit generated within each lab section (total section expenses divided by total section work units). Direct costs were defined as salary expenses, supplies, reagents, consumables, equipment, and other costs generated solely from operating the lab.

All tests were evaluated singly and, where appropriate, as batch procedures. If more than one instrument was used for a specific test, both were evaluated. Backup methods and instruments used more than 10 per cent of the time were also evaluated.
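The financial yardsticks described above reduce to simple arithmetic. A minimal sketch of the three calculations follows; every dollar amount and work-unit count in it is a hypothetical example, not our hospital’s actual data:

```python
# Sketch of the audit's three financial calculations (all figures hypothetical).

def indirect_share(hospital_expenses, lab_revenue, hospital_revenue):
    """Lab's indirect costs: its percentage of hospital revenues
    applied to hospital expenses."""
    return hospital_expenses * (lab_revenue / hospital_revenue)

def contribution(lab_gross_revenue, lab_total_expenses):
    """Contribution of the total laboratory operation."""
    return lab_gross_revenue - lab_total_expenses

def cost_per_work_unit(section_expenses, section_work_units):
    """Full cost per work unit generated within a lab section."""
    return section_expenses / section_work_units

# Hypothetical example: a lab billing 6 per cent of hospital revenue
share = indirect_share(hospital_expenses=40_000_000,
                       lab_revenue=3_000_000,
                       hospital_revenue=50_000_000)
print(share)                                   # 2400000.0
print(contribution(3_000_000, 2_400_000))      # 600000
print(cost_per_work_unit(450_000, 1_200_000))  # 0.375
```

Figures like these, recalculated at each audit, let a section’s full cost per work unit be tracked from one review to the next.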
In this manner, we learned what the cost differences were in performing the same test on different analyzers. If technologists are aware of these differences, they can employ instruments more economically.

Over the three-week period, each of the more than 100 time studies was performed by at least five technologists to cover the various shifts and days of the week. On multichannel instruments, a single study would cover several tests. The resulting times, checked against CAP standards, generally were slightly better than the norm. To adjust for statistical variance on single tests, we added 4 per cent to the performance times. That figure was recommended to us by OHMS time management engineers.
It might not apply to other labs.

After evaluating the test times, we used established laboratory test cost analysis methods to summarize the per-test cost of all supplies, reagents, and quality control. Collection time and collection supplies were handled separately to evaluate their costs in relation to batch and Stat procedures. The hospital’s computer-generated department reports provided test volume and revenue figures. Technologist and phlebotomist costs per minute were based on the laboratory’s average wage rates for each position.

Following several weeks of data gathering, we began feeding the numbers into the laboratory’s computer, which was programmed to summarize direct costs, calculate indirect costs, gross margin, and net margin, and analyze patient types and work flow. The entire lab staff spent spare time over a two-month period entering all the data.
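A rough illustration of this kind of per-test cost summary: labor at the average wage rate per minute, with the 4 per cent single-test adjustment applied, plus supplies, reagents, and quality control. The test, rates, and line-item amounts below are invented for illustration, not our actual figures:

```python
# Illustrative per-test direct-cost summary (all rates hypothetical).

SINGLE_TEST_ADJUSTMENT = 1.04  # 4 per cent added to single-test times

def per_test_direct_cost(tech_minutes, tech_cost_per_minute,
                         supplies, reagents, quality_control,
                         single_test=True):
    """Labor plus consumables for one reportable test."""
    minutes = tech_minutes * SINGLE_TEST_ADJUSTMENT if single_test else tech_minutes
    return minutes * tech_cost_per_minute + supplies + reagents + quality_control

# Hypothetical assay run Stat as a single test versus in a 20-specimen batch,
# where hands-on minutes per specimen drop sharply.
stat_cost = per_test_direct_cost(5.0, 0.20, 0.35, 0.50, 0.15, single_test=True)
batch_cost = per_test_direct_cost(1.5, 0.20, 0.35, 0.50, 0.15, single_test=False)
print(round(stat_cost, 2), round(batch_cost, 2))   # 2.04 1.3
```

A spread of this sort is the single-test versus batch difference the audit was designed to surface.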
It took 18 computer hours to run the statistical analysis that generated the 400,000 different data combinations mentioned earlier.

Meanwhile, supervisors reviewed their sections for ways to cut costs, speed up turnaround time, and improve efficiency. Using one of our programs, they calculated direct and total costs for all available tests by every method employed.
This covered about 250 procedures in differing forms: single tests, batches, profiles, and panels.

Our new centrifugal analyzer proved to be a major cost-cutter. We had estimated it would yield $40,000 in annual savings. According to the audit, this goal was reached in eight months; for the full year, savings amounted to $60,000. The audit also confirmed that a new automated blood culture instrument had increased productivity to previously projected levels.

High-volume tests were further examined for direct cost/revenue and total cost/revenue ratios. We compared the cost of doing a single Stat test versus batching and then analyzed batching patterns and send-outs to see if specimens were handled as expediently as possible. All high-volume procedures were flow-charted from the time the physician wrote the order until the test result reached the nursing station.
Turnaround times were, in fact, good. As for current send-outs, we determined that the volume and the instrumentation needed did not justify performing any of the tests in-house.

Workload was the next item on the audit agenda. We reviewed five years’ worth of CAP workload statistics for each section and projected future growth rates. By tracing each section’s changing annual workload as a percentage of the laboratory’s total output, we were able to pinpoint shifts and trends.
With the help of the computer, I could break down any section’s patient population: percentage of inpatients, routine outpatients, and clinic, preadmission testing, and emergency room patients. Figure II shows simplified inpatient/outpatient workload ratios for all of the lab sections.

Even more interesting were the percentages of routine, Stat, timed, pre-op, and ASAP requests received by each section (Figure III). We also plotted the volume of such requests in two-hour increments during the day (Figure IV). Another graph depicted total laboratory workload by day of the week. These kinds of analyses helped us compare normal staffing patterns with actual needs. Again, we found that for the most part our staffing closely matched work volume on all shifts in all areas.
By now, it may sound as if the audit didn’t lead to any changes, but read on.

The supervisors’ final task was to review instrumentation. This went beyond the kind of review conducted for annual budget preparation. We wanted a list of all instruments, including purchase price and date of acquisition, maintenance and replacement costs, service contracts, depreciation, and the volume and revenue accounted for by the instruments. This status report would help us plan for future capital outlays. The volume and revenue figures might tell us whether the purchase was justified and how well we were evaluating new products.

Assembling all this information gave us a clearer picture of each section.
Many of our management procedures and standards were upheld. For example, the audit proved that our method for establishing the cost of new tests was valid and that criteria for batching generally hadn’t changed.

The statistics further demonstrated that our utilization of personnel was good and that our productivity was exceptional. Flow-charting high-volume procedures showed that our test processing system works well, while the evaluation of instrumentation indicated that it is well maintained and often exceeds the estimated useful life expectancy.
Much of the data would please administration. Calculations based on Monitrend formulas revealed that our direct expenses per adjusted patient day were 22 per cent below the national average and 20 per cent less than those of state and regional comparison groups. The lab’s direct expenses and salary expenses relative to workload were about 16 per cent under the national average and 11 per cent below those of area comparison groups.
Although the audit results were largely favorable, we did pinpoint several areas that merited further study. Here are some of the changes that resulted during the past year:

* Test charges. Often in reviewing rates, the immediate inclination is to raise charges for high-volume tests as a means of maximizing revenue. But the audit disclosed that we weren’t covering indirect costs on other tests, particularly longstanding assays in chemistry as well as some of the more esoteric procedures.
So we first made sure all tests were at breakeven or better before raising any of them further.

* Staffing adjustments. The audit established the laboratory’s productivity at 57.7 units per hour worked.
That was a 96 per cent efficiency rate, compared with the 80s range that the CAP recommends. Indeed, we were too high. The only way we could attain such efficiency was by having supervisors working at the bench and consistently putting in more than 50 hours per week. Our data persuaded administration to approve the addition of three technical FTEs to the laboratory staff.

* Scheduling adjustments. Assumptions about the lab’s busiest and slackest periods were corrected somewhat by the audit results.
Like many other labs, ours earmarked the Thursday before or the Friday after weekend duty as a compensatory day off. These weekdays, however, turned out to be peak workload periods. Tuesday is now the compensatory day.

The data also indicated that when we scheduled the same employee into microbiology at 6:30 a.m. for an entire week, the early start proved to be wearing, and reports got out more slowly after several days.
It was better to split the duty between two technologists or use part-timers.

Scheduling will have to change with the shift in ordering patterns under prospective payment. Traditionally, the day shift has done the majority of the work while other shifts covered the lab for Stat work. With DRGs, physicians are having their Medicare patients enter the hospital later in the day to trim part of the length of stay.
They want the lab work started in the afternoon or evening and posted on the chart. This puts more pressure on the later shifts.

* Clerical services. Although the hospital has expanded considerably, we had done little to upgrade our clerical capabilities.
A mushrooming outpatient load further strained the staff and the system. The audit identified the lab office as one bottleneck for test reporting. We plan to streamline the filing system to speed up storage and retrieval, and we are reevaluating job descriptions, priorities, and staffing to bring the clerical service in line with current needs. We also have remodeled the front office.

Inadequate transportation of test requests and distribution and charting of laboratory reports require further investigation. The nursing staff’s confidence in the pneumatic tube system must be bolstered, or it should be replaced with another transport system.
A computerized order entry system would eliminate several problems in this area.

* Patient processing. The audit made it clear that the laboratory’s percentage of outpatient work was much higher than the norm and that we weren’t processing these patients as efficiently as we could.
Physicians’ offices had to make separate outpatient appointments for laboratory work, x-rays, and ECGs. Now, with a centralized scheduling system, one phone call books a patient for all required hospital services. We also have expanded seating in the waiting room and eliminated overflow of outpatients.

On admissions for diagnostic workups, a new approach channels patients through the laboratory and other ancillary services before sending them to the floor. This minimizes the late-afternoon rush that always seemed to hit just as the day staff was leaving. Test requests are no longer held overnight, duplicate orders are down, and the laboratory is spared numerous follow-up telephone calls and phlebotomy trips to the floor.

* Outpatient marketing. We could do even more outpatient work.
Our study found that the laboratory received outpatients from only 40 per cent of physicians with active staff privileges at the hospital. To gain a larger share of outpatient testing, the laboratory would need more competitive test charges, billing systems, and reporting practices; a professional courier service; and a marketing-oriented account representative.

We also learned lessons about auditing. Trial and error taught us that some of the statistics so painstakingly collected were irrelevant and superfluous.
For example, in determining our indirect costs, we spent a lot of time working out depreciation rates for our instruments, but we made little use of these figures. It is important to keep your goals in mind.

Also recognize that there are usually several sources for the same kind of information and that their perspectives and answers may vary markedly. We spent a good deal of time trying to reconcile data and make certain that we were indeed comparing apples with apples and not with oranges. It is equally important to keep the audit within the hospital’s financial system and to work closely with its financial consultants. If the hospital uses zero-base budgeting, that’s the only methodology you can use. If you try to introduce statistics from another system, they won’t mesh with hospital figures.
Set a realistic time frame. Future reviews, annual updates of key data and biennial full-scale audits, won’t take four months now that we have baseline figures and a good idea of how to proceed. But the initial audit is time-consuming.

An independent test of our efforts came from an outside consulting group studying ways to cut hospital and medical expenses in our community. As part of their comprehensive study, the consultants presented the laboratory with a massive questionnaire, which we were easily able to complete, thanks to the availability of our audit results. After a review of our responses and the audit itself, the consultants concluded that they couldn’t make a single recommendation for laboratory improvements. We weren’t surprised.