What’s the Point of Planning?

NOTE: this article first appeared in January 2006.  It is freely available through the ASQ website.

Bruce A. Waltuck

November 2005

Appeared in the Journal of Quality and Participation January 2006

What is the real purpose of strategic planning in an organization?  How can plans be crafted that are effective tools for managers and front-line workers alike?  What can senior leaders do to best integrate the objectives expressed in their strategic plan, with the processes of data collection, analysis, and ongoing improvement?  To inform our understanding of these questions, we’ll take a look at the planning, analysis, and feedback processes utilized over the past five years at a Federal regulatory compliance agency.

The agency in question enforces a broad range of Federal laws.  Some of these laws affect employers.  To influence compliance with these laws, and to correct violations, the agency has a field staff of investigators.  The investigators engage in a variety of compliance activities, from education and outreach seminars, all the way through criminal litigation.  The agency uses a nationwide database to record and track all compliance and investigative activities.  Each year, this agency stops its normal activities, and involves everyone in its annual strategic planning process.

Before we take a closer look at the way this agency’s planning process has evolved over time, let’s consider the reasons an organization makes a strategic plan in the first place.  Traditional wisdom suggests that a government agency, in particular, derives its revenue from the public, through Congress and the budget process.  Each fiscal year begins with a known quantity of dollars, and a known quantity of human and capital assets.  Like most of us in our own lives, such an organization periodically asks the question made famous by Ed Koch, the former mayor of New York.  “How’m I doin’?” he would ask the people of New York.  Businesses and, yes, even government agencies want to know how they are doing.

Strategic planning derives fundamentally from a sense of where we are, and a sense of where we want to be.  In typical business terms, we talk about our mission and our vision.  The resources we get are the enablers of achieving that vision.  So at the end of a fiscal year, if someone asks us how we did, we can look at some measures of our work and give them an answer.  Did we reach our goals?  If not, why not?  Our sense of what we should be doing derives from our mission – why we exist at all as an organization – and our values – the ideas we think are important in filtering our decisions about what to do.

So our government agency, through its leaders and managers, has a sense of its mission and purpose.  But what IS the mission and purpose of a regulatory compliance agency?  That turns out to be a question with more than one possible answer.  We might think of a speed limit law, and the officers who enforce it.  Is the purpose of planning their enforcement activity to achieve compliance with the law, or to issue tickets that generate revenue through fines?  We might also think about discrimination and harassment laws, and employer policies.  Yes, there are penalties and consequences for the offenders.  But the Federal Equal Employment Opportunities Commission has written that the purpose of such laws is to prevent and avoid violative behaviors in the workplace.  Clearly these are two related, but different, ways to look at a compliance-oriented mission.

For more than a decade, our government agency has used a comprehensive top-down and bottom-up planning process.  Senior leadership, involving both career and politically-appointed managers, crafts a broad set of objectives.  For each major compliance program, the national office typically defines various goals and targets.  For one law, seek to achieve 90% compliance within the next five years.  Achieve a minimum 5% improvement over the coming year.  Conduct x-number of investigations in a certain program area this year, and to assure that your time is well spent, see that perhaps 95% of those investigations yield a finding of some violation.  To assure that the staff works efficiently, complete at least 90% of your investigations within 90 days.  To further assure efficiency, let no more than 10% of your investigations consume more than 50 hours of investigative time.  These are real objectives set by the senior staff of this particular agency over the past five to ten years.  We’ll come back to what these objectives mean, and to the kinds of (often unintended) consequences that such objectives create.
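
To make the arithmetic of such targets concrete, here is a minimal sketch in Python of how a year’s investigation records might be checked against them.  The record fields and figures are invented for illustration; they are not drawn from the agency’s actual database.

```python
# Hypothetical illustration: checking one year's completed investigations
# against the kinds of numeric targets described above.  All field names
# and figures are invented.
investigations = [
    {"hours": 42, "days_open": 60, "violation_found": True},
    {"hours": 75, "days_open": 120, "violation_found": True},
    {"hours": 30, "days_open": 45, "violation_found": False},
    # ... one record per completed investigation
]

n = len(investigations)
pct_violations = 100 * sum(i["violation_found"] for i in investigations) / n
pct_within_90_days = 100 * sum(i["days_open"] <= 90 for i in investigations) / n
pct_over_50_hours = 100 * sum(i["hours"] > 50 for i in investigations) / n

print(f"Yielded a violation: {pct_violations:.0f}% (target: at least 95%)")
print(f"Closed within 90 days: {pct_within_90_days:.0f}% (target: at least 90%)")
print(f"Over 50 staff hours: {pct_over_50_hours:.0f}% (target: at most 10%)")
```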

Our government compliance agency would send out its national plan to all of its regional and local office managers across the country.  At the local office level, both professional and support staff typically worked to create the detailed local plan.  Teams formed for each major compliance initiative area.  The office would meet as a whole to go over the national plan, and then the initiative teams split off to meet separately and make their plans.  Under this system, which was in place for many years, each team plotted its own utilization of staff hours for its activities.  There was no way for a given team, or a local office manager, to know in advance if the collective planning of their teams would account for 85% or 112% of the total available staff hours that year.  In some cases, nationally-mandated activities required very specific levels of support and activity.  But in most cases, the local managers reviewed the team submissions and made some adjustments in planned time (up or down).
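
A simple roll-up of the team submissions would expose that gap before the year begins.  Here is a sketch, again in Python, with hypothetical team names and hour figures:

```python
# Hypothetical sketch: rolling up each initiative team's planned hours
# against the office's total available staff hours for the year.
# Team names and figures are invented.
planned_hours_by_team = {
    "agriculture": 4200,
    "garment": 3800,
    "health_care": 2900,
    "outreach": 1600,
}
available_staff_hours = 11000  # total field hours the office actually has

total_planned = sum(planned_hours_by_team.values())
utilization = 100 * total_planned / available_staff_hours
print(f"Planned {total_planned} of {available_staff_hours} available hours "
      f"({utilization:.0f}% of capacity)")
# Run before the year starts, this shows whether the teams have collectively
# planned 85% or 112% of the hours the office actually has.
```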

Although the local office teams were working to devise plans that would achieve the nationally-set goals, they also had a considerable amount of freedom to design unique ways to do this.  Innovation could and did occur.  In 1996, one local office created a totally new approach to compliance in a particular industry.  A year later, that approach was hailed as “the model for the rest of the country.”  In the 1997 national plan, this agency adopted the “model” in this initiative as the standard nationwide.  The innovative approach of one office inspired positive change organization-wide.

As the saying goes, “with freedom comes responsibility.”  We noted earlier that government agencies, like other businesses, periodically examine how they are doing in achieving their goals.  Stakeholders, including Congress and the public, want to know how their funds are being used.  Agency managers want to know if the goals they set are realistic, and if progress is being made.  Our government agency typically set both output goals (how many/how much) and outcome goals (how well, or impact).

Data on the activities and performance of this agency derive from local investigator and manager input into a nationwide database system.  But there has not been any policy or guidance on precisely how various key activities are to be defined for data collection purposes.  One significant example of this came to light several years ago.  In one industry, our government agency frequently audits businesses with multiple subcontractors on-site.  Certain regulations apply equally and separately both to the primary employer and to these subcontractors.  In one region of the country, our agency’s managers and field staff might check an establishment with ten independent contractors on-site.  Separate investigations of the main employer and each of the ten contractors would follow.  Findings of compliance or violations of the law would be reported, and in some cases monetary liability and civil money penalties would be assessed.

But a serious disconnect was occurring in the collection and analysis of agency performance data.  In some parts of the country, local agency officials would report this example as a single investigative action.  All of the time expended in separately checking all 11 enterprises (the principal employer and ten contractors) would be entered in one (often large) sum.  In the rest of the country, however, the long-standing practice was to enter a separate investigative action for each of the 11 individual contractor/employers.  Hours expended, and monetary liability/penalties, were separately stated under each of the 11 actions.  In this (actual) case, one office reported a single action with 110 staff hours.  The other office reported 11 actions with around 12 hours each.
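
The effect of these two recording conventions on the headline metrics is easy to demonstrate.  Here is a sketch of the same hypothetical site recorded both ways, with round numbers invented for illustration:

```python
# Hypothetical sketch: the same audit of one employer plus ten on-site
# contractors, recorded under the two conventions described above.
# Round numbers are used for illustration.

# Convention A: one combined action carrying all of the time expended.
office_a = [{"hours": 110}]

# Convention B: eleven separate actions, one per employer/contractor.
office_b = [{"hours": 10} for _ in range(11)]

def summarize(name, records):
    actions = len(records)
    hours = sum(r["hours"] for r in records)
    print(f"{name}: {actions} actions, {hours} total hours, "
          f"{hours / actions:.0f} hours per action")

summarize("Office A", office_a)  # 1 action, 110 hours, 110 hours per action
summarize("Office B", office_b)  # 11 actions, 110 hours, 10 hours per action
# Same underlying work and the same total hours, yet the headline metrics
# ("more actions, fewer hours per action") differ by an order of magnitude.
# Aggregated nationally, the two offices' numbers are not comparable.
```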

In the eyes of the database reporting system, there was no distinction made between these two sets of performance data.  No “red flag” went up to alert anyone that there was a serious problem with data entry and performance analysis.  In a chance meeting between the two managers in Washington one day, the disparity came to light.  Someone reported the issue to the head of this particular initiative program in the agency’s national office.  While some discussion took place, no policy was issued about defining agency performance data or reporting.  It came as no surprise to those aware of the problem that the outgoing Assistant Secretary of this agency mentioned in their farewell letter their “disappointment” that the agency had not done better in achieving its objectives in this industry.

It should come as no surprise to any of us, reading this story, that the report of agency performance was negative.  Looking back, we can note again what the strategic plan set forth as agency goals and metrics.  Managers and individual auditors are gauged on the number of actions they do each year.  More is better.  Another measure of presumed efficiency is the average number of hours per action.  Less is better.  It is easy enough to understand why some regions would decide to split a place with 11 contractors into 11 separate actions (this is, in fact, the preferred method, because of separate legal coverage and liability issues that apply to each contractor individually).  But why did other regions refuse to do this, and skew the agency’s apparent performance by reporting just a single investigative action?  According to one district manager, it was because their regional leaders did not want the appearance of “pumping up the numbers” which they felt the other regions were doing.

This may not make much sense to us.  But the regional and local offices were periodically given accountability audits, in which an internal team reviewed local practices and procedures.  There was a fear on the part of some regional heads that they would be in trouble if they were viewed as artificially inflating performance figures.  According to one agency specialist in this initiative, this particular problem in data collection and analysis remains to this day.

If data is not collected in a consistent manner, it is essentially useless for any regional or national aggregated analysis or comparison.  But the process of making annual strategic plans goes on.  How can an organization know what strategies have been effective, if the performance data and analysis are useless?  How can the organization know what strategies to consider in future years?

Over the past few years, strategic planning in our government agency has changed in several ways.  The national office now sets out more specific objectives than ever before.  Local offices are mandated to plan, to the hour, more than 90% of all intended activities, over a year in advance.  Virtually every action has to be associated in the database with one or more of the nationally-devised initiatives.  Under the banner of “improved metrics,” the agency’s managers and professionals have new and more stringent goals for productivity and efficiency.  Few actions may require more than 50 staff hours to complete.  Very few may take more than 90 days.  Some local offices now exclude their full staff from the annual planning process, because there is so little discretion for innovation anymore.

With this sort of plan and performance measures in place, what are the consequences for actual work?  What becomes of real indicators of goal achievement – especially the “how’m I doin’?” outcome goals?  Longtime agency managers and professionals know that some percentage of audits involve more complex legal and compliance issues.  These cannot be completed within 90 days, or in under 50 hours of work time.  The clear message that employees read between the lines of their strategic plan is that what matters most is numbers – more actions done in less time.  Quantity, not quality.

There are many lessons to be learned from this government agency’s experience with strategic planning and performance measurement.  Why do we do strategic planning anyway?  Professor Ralph Stacey of the University of Hertfordshire is widely acknowledged as a leading thinker in applying the concepts of complexity theory to management and organizational behavior.  Stacey has said in workshops that, in fact, we really cannot control the outcome of complex systems (like government agencies).  He dryly states that people engage in planning as a response to anxiety.  “We want to feel like we did something,” he says.

How many times have we felt that we have been unfairly held accountable for events that were simply out of our control?  Deming famously posited that at least 80% of defects in the workplace were not the fault of the workers, and said that “our prevailing system of management has destroyed our people.”  How would we feel if we investigated 11 businesses but were told to count just one for our productivity report?  How would we feel if we knew that others in our business were counting their work very differently?

A few years ago I was teaching a data collection and analysis workshop in Ottawa, Canada.  The room was filled mainly with government managers.  They didn’t want to talk about data, as such.  Instead, they told about the annual “performance contracts” they were required to sign each year.  These contracts set out what resources each manager had for the year – people, equipment, money – and included the annual plans for what was to be achieved.  At the end of each year, these Canadian managers had to submit an “attribution report.”  They were to explain what they did with all those people and all that money, and whether or not they had achieved their objectives.  As one, the 50 people in the room said these should be called retribution reports, not attribution reports.

One manager said his job was to improve sanitation in rural Indian villages.  They sent people and money to dig wells, and to teach about hand-washing.  But, he noted, there were 27 other non-government organizations in the same villages, doing the same kinds of work.  How, he said, could he attribute anything to his staff’s efforts?  Of what value, he asked, were plans and measures that bore no relationship to the real world of work that he and his people were doing?

I suggested he take his boss to a place where two rivers came together.  “Tell your boss,” I suggested, “that the smaller stream represents your efforts to achieve the goals of the strategic plan.”  Then tell him, I said, “that the larger river represents the combined efforts of everyone there – the agency and the 27 NGOs.”  Finally, “ask your boss if he or she can ‘attribute’ which drops of water in the river came from your wells.”  The manager smiled and said, “Yes, and next year, we’ll strategically plan for rain.”

Bibliography

For more about the ideas in this article, check out the following:

Cloke, Ken, and Joan Goldsmith.  The End of Management.  Jossey-Bass, 2002.

Cohen-Rosenthal, et al.  Unions, Management and Quality: Opportunities for Innovation and Excellence.  Irwin Professional, Chicago, 1994.

Scholtes, Peter.  The Team Handbook.  Joiner Associates, Madison, Wisconsin, 1988.

Waltuck, Bruce.  “Learning with Ralph Stacey.”  Emerging, the newsletter of the Plexus Institute, www.plexusinstitute.org.
