The Top 10 Mistakes in Managing Safety Performance

Paul Balmert

Paul Balmert is a graduate of Cornell University’s School of Industrial and Labor Relations, and his career in chemical manufacturing spans 30 years. In 2000, Mr. Balmert formed Balmert Consulting, a consulting practice principally focused on improving operations execution, including management effectiveness in leading and managing safety performance. Mr. Balmert is the author of the best-selling book Alive and Well at the End of the Day: The Supervisor’s Guide to Managing Safety in Operations.

V. Scott Pignolet

January 1, 2004

Mistake Number 6: Measuring safety performance differently than the rest of the business. "If you can’t measure it, you can’t manage it."

The people running operations–making the product, delivering the service, handling the materials–really are world class when it comes to measuring how well their business is performing. They’re on top of all the important details of how much, how well, how often. If the operation is performing well, they can tell you why; if it isn’t, they know all about the problems. It’s all part of running the business.

It reminds us of world-class athletes like Tiger Woods, and how well they understand exactly what they’re doing.

It’s not exactly a coincidence that the sophistication and level of intensity of performance measurement we see in operations match the measurement regimen of world-class athletes. It wasn’t always that way. In the last 30 years–the working career of our generation of managers–business operations and competitive athletics witnessed a revolution in the practice of performance measurement.

For most of the 20th century, competitive athletes learned how to play the game by copying what others did. They would improve on that by the combination of their own natural talent, conversations with other players, and trial and error during practice.

By the 1970s, technology began to enter the equation. Many believe that the most revolutionary technology was the equipment itself. Sure, equipment plays a role–in sports such as golf and the pole vault–but not in baseball, swimming, or track. The more interesting–and, we would argue, more powerful–effect of technology on athletic performance has been in measuring, evaluating, and training.

High-resolution, slow-motion video has given coaches the ability to discern the fine movements and body positions that account for a significant part of sports performance. On the practice field and in competition, thorough and exhaustive measurement of every aspect of performance has become commonplace. It’s no longer just about the scoreboard: in football, the performance numbers that coaches are paying attention to are metrics such as average gain on first down, average gain per pass attempted, and the ratio of runs to passes.

For the individual athletes, the gym has been renamed the fitness center, where you’ll find practically every competitive athlete in every sport in the world. (OK, we’ll leave bowling off that list. Some things will never change.) Measurement of individual performance by sport and position is now the standard. Upper-body strength is measured by bench press for offensive linemen; speed in the 40-yard dash for linebackers and wide receivers; vertical leap for basketball players.

While athletes were using measurement to dramatically improve, those of us in operations were doing exactly the same thing, following the same approach. Our version of high-resolution, slow-motion video was computer technology. We made great use of the microchip to improve the performance of our equipment and our people. Our coaches and trainers were some of the best brains to be found in the world of quality improvement, work process re-engineering, and business management: names like Deming, Drucker, and Champy.

It’s a great story, and one of which we can be justifiably proud.

Since we all knew the most important part of our job as managers was sending people home safe, you’d think the next place we’d apply what we learned about performance measurement was to managing safety. While that makes perfect sense, it’s not what most of us did.

Measuring: Business and Safety

Sure, we kept lots of numbers and statistics about how our safety performance was going. We made many decisions based on what we thought the numbers were telling us. The differences between how we used performance measures for the business and how we used numbers to manage safety were startling.

Business measures are easy to understand; safety measures are not.

We could have easily explained any of our business performance measures to our fifth-grade sons and daughters. Production gets measured in barrels, truckloads, boxes, and feet. Cost gets measured in dollars and compared to budgets; quality by the number of conforming products and customer complaints; schedule in hours, milestones, and percentage complete.

Every one of our kids could understand these measures. More importantly, so could our employees.

As for safety, we lived and died by the total recordable injury frequency rate.

Frequency rates may be a great idea for the safety staff or the president of the company, but they were pretty much useless to many of us out on the job. First, there’s the issue of what counts as an injury; volumes have been written on that one, much of it in government regulations that read like the tax code.

Have an injury in our department, and somebody would have to calculate a frequency rate for us. Our number went from zero to 60 faster than a stolen sports car. That’s because the rates are calculated per man-hours worked, which roughly equates to injuries per 100 workers per year. Rarely did we have 100 people working on the job, or the injury right at the end of the year.
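The arithmetic behind that zero-to-60 jump is easy to see. Here is a minimal sketch using the standard OSHA recordable-rate formula–injuries per 200,000 hours worked, which is 100 full-time workers at roughly 2,000 hours a year. The 20-person crew is a hypothetical example, not from the original article:

```python
def recordable_rate(injuries, hours_worked):
    """Total recordable injury frequency rate:
    injuries per 200,000 hours worked
    (100 full-time workers x 2,000 hours/year)."""
    return injuries * 200_000 / hours_worked

# A hypothetical 20-person crew works about 40,000 hours in a year.
# Zero injuries all year, then one recordable case:
print(recordable_rate(0, 40_000))  # 0.0
print(recordable_rate(1, 40_000))  # 5.0 -- the rate jumps instantly
```

With a crew that small, a single injury moves the rate from zero to five in one day–which is exactly why the number told a small department so little about whether it was actually getting safer.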

Of course, we’d post the rate on the sign at the gate so everyone could see it, and even had pay bonuses based on the rate. But only the guys over in the safety office could tell us what the rate actually meant.

What kind of a performance measure is that?

Everybody in operations kept score for the business; the safety office told us how well we were doing at safety.

Every shift, our staff added up their business performance numbers. Because they helped collect the data, they knew all about the numbers and the reasons why they were what they were. If you had a question about yesterday’s production or shipments, you could pick up the phone and ask the guy on the production line or in the warehouse what the story was. He’d tell you all about the reasons why production was up or shipments were down.

Our safety department counted the safety performance numbers. They’d get the medical reports, accident and near-miss reports, and training records, along with medical costs from the insurance carrier. Then they’d report the results to us (the managers).

That process usually left the rest of the organization out of the loop. We’d be the first to hear about problems and trends, with nobody to ask what was really going on. What kind of system is that?

For the business, we had lots of things to count; for safety, we often counted zeros.

We counted production in units–pounds, barrels, feet, dollars, and miles. There were plenty of those to count: everybody worked hard and produced a lot. Counting items was a huge part of our lives, as well it should be.

Fortunately, we seldom had anything to count for safety performance. People came in, worked and went home safe at the end of the day. That’s good news in every respect, but it did leave us counting a lot of zeros.

Zeros look good on the scoreboard. They weren’t of much use in telling whether our performance was getting better or worse. We’d go for a long stretch with no injuries. Then, bam, in a matter of a few weeks, we’d see a couple of injuries and that would send the injury rate off the charts. We were either doing great or doing awful, and we never could predict from the injury numbers what would happen in the future.

Everybody could tell good from bad performance for the business; for safety, sometimes we weren’t sure which direction was up.

Run a few weeks in a row at less than capacity, and everybody in the company knew there was a production problem. If we managed to come in below budget, we were heroes. When the number of customer complaints decreased, we all saw that as a good development, which would ultimately show up in sales and profits.

For some of our safety measures, good and bad weren’t all that clear. Take, for example, the times when the number of near-miss incidents was on the rise: did that mean we were headed for a big problem? We managers never could agree on the answer to that one. Half of us said, "watch out," and half said, "good news."

If we saw that safety meeting attendance was falling, should we have worried that we were about to have an accident? Everybody knew the relationship between customer complaints and sales, but we were never sure about the relationship between safety meetings and injuries.

In operations, if we didn’t have enough data to know what to do, we collected more data. For safety, we’d usually act on the data we had.

When we had production or product quality problems, we were always quick to call in the experts. They knew how to dig through the data and find the cause of the problem. If the cause couldn’t be found, they’d go out and collect more data until they had the information they needed.

When it came to safety problems, it seemed like we never needed to call in the experts. Or collect more data. Or admit that the answer wasn’t obvious. We managers were always sure we knew what the problem was, and how to correct it.

Or so we thought.

In retrospect, we should have followed our approach of measuring product quality, customer satisfaction, and reliability. That would have made our lives far simpler, and we probably would have gotten better results with less effort.

It’s one of the biggest mistakes we managers made.

Mistake Number 5: Trying to buy a game.

"This club is guaranteed to improve your score by 20 percent."

 -From a golf equipment infomercial

Sooner or later, anyone who’s ever golfed has fallen for the temptation: buying the latest club to hit the market–the one guaranteed to knock strokes off next Saturday’s round.

Every once in a while, the latest technology works like magic. At least for a few rounds, and then we revert to form.

Most of the time, nothing really changes. Eventually the new club winds up in the back corner of the workshop, where it has plenty of good company with all the other clubs we bought to help us play better.

After all, lowering the score is the goal of every golfer, just as lowering the injury rate is the goal of every manager.

On a gorgeous autumn day a few years back, a famous golf teacher named Bob Toski put on a clinic for 60 of us in the maintenance and construction business. Along the way, he asked for a show of hands: "How many of you bought expensive new drivers or putters this year?" Every hand went up.

Then he asked, "How many of you invested in golf lessons?" One poor guy timidly raised his hand, perhaps embarrassed to admit he was actually taking lessons.

Toski glared at us: "There’s your problem: you think you can get better buying a game. It doesn’t work that way."

Toski was right about playing better golf–and right about improving safety performance.

As managers, we were always on the lookout for a quick and easy way to improve safety performance. We’d buy the carrot-and-stick approach: put in a safety incentive system, and simultaneously make an example out of the poor fellow who got hurt yesterday. We tried hiring safety inspectors and safety police. We re-wrote safety procedures; put in observation programs and employee safety committees.

Sometimes the methods worked. But more often, they didn’t work any better than that new golf club. Why was that?

Buying a safety game meant we managers didn’t have to change how we managed. We could just keep on swinging the way we always did, but with different results. Our new equipment would do the heavy lifting for us. Or so we thought.

It doesn’t work that way; not for golf and not for managing safety performance.

If we want better results, we have to change, and that requires us to invest in improvement. For golf, that means lessons from the pro, and hard time on the practice tee. It’s just that simple. You can’t send somebody out there to practice for you, and you can’t buy a lower score with your MasterCard.

When it comes to improving safety performance, it works exactly the same way. Getting people working safely is all about execution. Improving the way people in the organization perform their work every day–execution–requires leadership, and better leadership than what’s been employed in the past. We can’t expect better results with the same swing in golf or management.

The route to better leadership is the same as playing golf: "taking lessons from the pro and spending time on the practice tee." It’s just that simple.

Instead of buying a game, we’re investing in improvement.

If we had realized that years ago, we’d have likely seen far greater improvement in safety performance over the years. Sure, it would have taken a greater initial investment of our time and effort as managers. But, over the long haul it would have been a great investment.

Instead, we fell victim to trying to buy a game. It’s one of the biggest mistakes we made managing safety performance.