By Jitka West and Michael West


Abstract.

Most certainly, the scenario plays out a hundred times a day in organizations worldwide: People want to get work done more efficiently and effectively and then, almost inevitably, the tools camp and the process camp suit up, arm themselves as best they can with their respective “facts,” and go do battle against one another, fighting for precious little budget and resources.

When left to fight the tools-versus-process war in perpetuity, and usually on a field lacking any real leadership, not only does neither side win; both sides lose. The two camps have mutually exhausted the money which – at some level in the enterprise – comes from the same pot. The tools or technology and the processes are implemented independently of each other and, thus, do not align with or support each other. The parts of the process which rely on using the technology, and the parts of the technology which rely on following the process, are all “black box” to the user. The lack of process and technology integration leaves the users inefficient and ineffective, and maybe even confused and angry, perhaps even more so than before. The battlefield is scorched … trust is broken, careers are damaged, morale is diminished, and the seeds of retribution for future battles are deeply sown.

In this article, we – business performance consultants Jitka West and Michael West, two people who have often had to wear the blue peace-keeper helmets in such conflicts – describe a better way. Technology and process are not opponents; they are allies which, when joined under a shared vision and mission, can significantly help workers do their jobs better and faster. Having described the disastrous results of the tools-versus-process wars – which many of you can relate to – we’ll explore the root causes of this common situation and describe changes you can make to bring greater peace and performance to your organization.

How Did We End Up Here?

Think back on all the times in your career when you’ve been either conscripted as a soldier in the process-versus-tools battles … or ended up as collateral damage. What did those battles have in common? How did I – a process person – end up disliking and distrusting my techno-geek office mate? Or why did I – the cool tools guy – stop going to lunch with that process woman down the hall?

In a very real sense, the on-going animus between process and technology is symptomatic of the way our modern organizations are structured. We are organized in departments and teams within which we exercise knowledge, skills, and experience that are homogeneous with those of the others in our department or team. We love what we know, and we don’t love what we don’t know, so we tend to get along well with those in our department. For the sake of efficiency, organizations divide their people into sub-cultures such as software developers, system engineers, executives, human resources, information technologists, and project managers. The cultural boundaries between our team or department and other teams are as real as the cubicle walls and doors between our team and the others. Our organizations are not organized by mission or vision. Even the integrated team (e.g., the integrated product team, or IPT) is an artificially imposed construct, and its members are rarely truly integrated with everyone working toward a common purpose.

Thus those of us who live in technology (e.g., the IT department) love tools, and we think that technology is always the solution to every problem. The process people, who usually live in operations or quality management, love process and think that process is always the solution. We love what we know.

As CMMI® Institute-Certified Lead Appraisers, we see a lot of interesting things when we lead appraisals, including process-oriented people holding onto a certain horribly inefficient way of doing things beyond all reason. We’ve seen a quality assurance auditor use a physical (piece of paper) checklist to conduct an audit (with a pen or pencil), after which he often inaccurately transcribed the audit results into a Word document. The Word document looked like a form, but didn’t have any of the functionality of a Word form. After all of this, the QA person then manually transferred defect data (e.g., counts, types, etc.) into a spreadsheet for analysis and charting. When we suggested that all of that work could be made not only more efficient but also more accurate if it was done in a single Excel file, the response we received was, “well, this is how I do it.” The person had started his career using a form from a typewriter and a pencil. The technology (MS Office products) to make the individual both more efficient and more effective had been pervasive in the workplace for years, but he only loved what he knew.
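To make the point concrete, here is a minimal sketch of what “doing it all in one place” could look like. It is ours, not the auditor’s, and not prescribed above: the workbook name, sheet names, and column layout are assumptions, and while our suggestion to the auditor was a single Excel file, the sketch uses Python with the openpyxl library simply to show the tool doing the tallying that was being done by hand.

# Illustrative sketch only: the workbook name, the "Checklist" sheet, and its
# assumed three columns (item, result, defect type) are our inventions.
from collections import Counter
from openpyxl import load_workbook

def summarize_audit(path: str) -> None:
    """Tally defect types from a checklist sheet and write a summary sheet."""
    wb = load_workbook(path)
    checklist = wb["Checklist"]  # assumed sheet name

    # Count one defect per failed checklist item, keyed by defect type.
    defect_counts = Counter()
    for _item, result, defect_type in checklist.iter_rows(min_row=2, values_only=True):
        if result == "Fail":
            defect_counts[defect_type or "Unclassified"] += 1

    # Write (or refresh) a summary sheet that charts can be built on.
    if "Summary" in wb.sheetnames:
        del wb["Summary"]
    summary = wb.create_sheet("Summary")
    summary.append(["Defect Type", "Count"])
    for defect_type, count in defect_counts.most_common():
        summary.append([defect_type, count])

    wb.save(path)

summarize_audit("audit_checklist.xlsx")  # hypothetical file name

Whether the tallying lives in a script or in Excel formulas matters less than the principle: the checklist (the process) still defines what gets checked, while the tool takes over the error-prone clerical work.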

Another dynamic that contributes to the tools-versus-process paradigm is – hmmm … how do we say this? – technology is sexy and process is boring! It is as true as it is hard to admit. Let’s face it – our whole modern existence is full of glossy, shiny, sexy technology, from our i-things to our home entertainment systems to our automobiles – which are, of course, thinly disguised rolling entertainment systems. But when was the last time you saw a Super Bowl commercial selling you a process?

Also, tool development and implementation look cheaper than process development, even though they rarely are. An executive can wrap her head around the finite price of a SharePoint server, Team Foundation Server (TFS), an enterprise project portfolio management tool, or a corporate metrics tool. The real total cost of ownership (TCO) is rarely investigated because that requires hard analytical work. The cost of the tool out of the box is, if nothing else, a number that we and accounting can deal with. On the other hand, the cost of process development, process improvement, or process management is ambiguous, and most managers and leaders will retreat from ambiguity to seek safety and comfort in numbers … even inaccurate numbers.

Both tools and processes are forms of codified human knowledge, but because codifying human knowledge in software and systems gets all the press, and because it’s easier to touch and feel technology, managers and leaders are just more comfortable accepting technology as the “solution.”

A Better Way

So are we just to accept our lot and resign ourselves to the endless waste resulting from the tools-versus-process wars? Can we improve the business of performance improvement in a way that everybody wins, including our shareholders and the taxpayer? Yes we can. This section provides you with an approach that we have partially implemented in both Government and commercial sector organizations. Although we cannot claim dramatic success just yet, we have at least gotten the two camps to call a truce.

Get Organizational Performance Improvement Out of the Silos

In my (M. West) 2013 book, Return On Process (ROP): Getting Real Performance Results from Process Improvement [1], I observe that the modern organization is really a system of systems, made up of three sub-systems: 1) people, the social system, 2) technology systems, and 3) process systems. I also make the experience-based argument that improvement or change to any one of the three sub-systems will effect collateral changes to the other two sub-systems, whether or not we plan those changes, and whether or not we observe or measure those collateral changes.

When we change technology, whether the initiative is insertion of wholly new technology, a platform migration, a systems integration, or simply an upgrade to existing tools, the change will affect process. Process often defines how people use technology to get work done, so if changes to technology affect those interactions and work practices, then the processes will need to be changed to accommodate the technology changes.

Changes to technology also effect collateral change in the people sub-system. Ironically, technology changes sometimes do not yield the intended performance improvement because workers are not trained to use the technology effectively and efficiently. Improvement in technology almost always requires a corresponding change in worker knowledge and skills.

Changes to processes – presumably to improve process performance – also do not occur in a vacuum, no matter how much the initiative is pursued within an organizational silo. The performance of defined processes more often than not incorporates the use of technology. (Think configuration management processes, requirements management processes, testing processes, to name just a few.) Thus, improvements to processes often require concomitant changes to the tools and systems – primarily the human interface aspects of the tools – to support performance of the changed processes.

Yet even when the collateral effects of changes and improvements to one sub-system within the organization are understood, improvement initiatives are rarely planned and executed in an integrated way. The CIO or CTO executes a technology change and then, sometime later, the process people realize that the defined processes no longer work with the tools and have to execute a catch-up “improvement.” Or the COO leads a process improvement initiative only to have people later complain that the way they use technology doesn’t fit the defined processes. The naïve process developer (as I once was) will proudly proclaim that he wrote his processes to be “tool agnostic.” The learned and experienced process developer knows that is wrong, and that just the opposite is desirable: Develop the processes such that they inextricably integrate and work with the use of the tools and systems.

Improving technology and improving process in separate silos will have less positive effect on organizational performance than planning and executing the two improvements as one integrated initiative. This won’t happen accidentally. Executives and leaders in the organization need to step outside of their domains and work with the leaders of the other silos. The unifying mission cannot be process improvement alone or technology improvement alone; rather, it must be the higher calling of organizational performance improvement.

Perhaps you – the reader – are not operating at the executive level or in a leadership role. Yet you were not hired to just blindly follow orders. (Remember! If all you do is say “yes” to your boss, one of you is redundant!) No matter what level of the hierarchy you’re operating in, when you find yourself getting involved in some kind of change initiative such as a process improvement, it is in your long-term self-interest to reach out to your colleagues in the other silo and say, “hey, this initiative I’m involved in is going to affect your work too … will you help me elevate the planning of this change to our bosses?”

Understand How the Tool and the Process Contribute to Performance

This topic is too easily and too often unjustly treated – and dismissed – with tired clichés: “There’s no silver bullet;” “there is no magic wand.” However, quotidian quips only mask and suppress the questions and the conversations that should be embraced. Maybe that tool isn’t a silver bullet, but how much fire-power does it bring to performance improvement? The reengineered process certainly possesses no magic, but what will it contribute to improved performance, and will it even be predictably repeatable? More importantly, how can the use of the tool and the process together help people perform more effectively and efficiently in their jobs?

The answers to those two questions are intuitively obvious in certain situations. For example, most modern day software workers would find it unthinkable to control the configuration of software source code manually, thus the pervasive use of software configuration management tools. Sure, some process discipline may still be required of the people using such tools, but the tools do the “heavy lifting” in managing and controlling the integrity of the software. In other cases, a tool – even if it existed – would be either useless or a hindrance. Imagine a tool that tries to facilitate and control the interactions of people in a meeting … how well would that work?

However, in most engineering and management activities, the question of which – the tool or the process – brings the most value to the activity requires more in-depth analysis to answer. Such analysis should never start with a foregone conclusion, lest the analysis be biased.

To demonstrate how such an analysis might be conducted, we’ll use the example of planning and conducting peer reviews and collecting and analyzing data from them. These activities are commonly performed in many organizations in both the defense and commercial sectors. Table 1 illustrates how to evaluate the contributions, strengths, and weaknesses of both tools and processes to peer reviews.

As you can see from the simple analysis in Table 1, there are some peer review activities, such as the human interaction required to disposition peer review input, for which having defined standards and processes contributes more to the performance of the process than tools do. In other activities, such as conducting a peer review, tools are the relatively stronger contributors in terms of performance efficiency and efficacy. In still other activities, such as analyzing the results of peer reviews, defined processes and standards on the one hand and technology on the other are equal partners in terms of the value each brings to the performance of the process.
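To make the “equal partners” idea concrete for the analysis activity, consider that a defined standard decides how peer review findings are classified and what counts as a major finding, while a simple tool does the mechanical tallying. The sketch below is ours, not the content of Table 1; the field names, categories, and sample findings are illustrative assumptions.

# Hypothetical peer review findings; the categories and severities are
# illustrative assumptions, not data from Table 1.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Finding:
    artifact: str   # the work product reviewed
    category: str   # classification scheme comes from the defined standard
    severity: str   # "major" or "minor," also defined by the standard

def summarize(findings: list[Finding]) -> None:
    """The tool side of the partnership: mechanical counting and reporting."""
    by_category = Counter(f.category for f in findings)
    majors = sum(1 for f in findings if f.severity == "major")
    print(f"Total findings: {len(findings)} ({majors} major)")
    for category, count in by_category.most_common():
        print(f"  {category}: {count}")

summarize([
    Finding("design.doc", "ambiguous requirement", "major"),
    Finding("design.doc", "missing rationale", "minor"),
    Finding("module_a.py", "coding standard violation", "minor"),
])

Neither half stands alone: without the standard, the counts are not comparable across reviews; without the tool, the counting is slow and error-prone.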

In some Agile development environments the line between tools and processes is so seamless as to be almost subconscious to process performers. If you ask an Agile software developer what she does in a daily stand-up meeting, she’ll describe the process: “I share what I completed yesterday, what I will complete today, and any barriers to completing my tasks.” If you then ask her to describe an image of the daily stand-up meeting, she’ll probably describe the stand-up board, a tool that depicts the task burn-down chart, risks, and so on. The act of writing an epic or a user story is a human mental process, but most Agile teams wouldn’t think of trying to plan and track the development tasks that implement the user stories without a tool, such as TFS, to automate the process. In these environments, it is difficult to articulate the relative contributions of the tool versus the process, and it is not useful to do so when the developers perceive that they cannot do their work effectively and efficiently without either.

Performance Improvement is for the Performer

In terms of balancing the investment in tools or processes, another important – yet often overlooked – consideration is who will be performing the process and using the tool? You can spend many hours developing a well-articulated process description, but if the performer is a tool lover, he will find fault with the defined process, always having a bias for using a tool. Tool-oriented people will always posit the challenge, “Why do we need a defined process … our tool does that?”

Conversely, the intended user or performer may have an aversion to technology, preferring instead to perform work by following written instructions or a process. In these cases, it will be a constant challenge to convert that person simply by assuring them that the tool enforces performance of the process.

Given that different individuals could be performing the same or similar work, those of us involved in process development and management can ill afford to alienate one group of people to make another group happy. When developing or redeveloping a process, our challenge is to abandon our preconceived notions and predilections, and to use the powers of inquiry and active listening to elicit from our end users – the performers – not only what they need, but also what they want.

On a current Government contract, our company is supporting a software development team in a process definition project. For years this team has ostensibly been performing the organization’s defined standard processes, which take the traditional form of a lengthy, text-based narrative document and are based on a RUP-based waterfall life cycle model.

However, in practice the team has not been performing the organization’s defined standard processes. This software team is comprised of very tech-savvy people, and they have adopted Agile/Scrum methodologies for software development. Prior to our working with this team, there was a history of dissonance at best and conflict at worst between the software team and the team responsible for process and process improvement in the IPT. Our contract included supporting both the process team and the software team, and serving both sets of customers with candor and integrity.

We approached this somewhat delicate situation along two vectors. First, we helped the process team come to the realization that they simply did not have – nor would they ever likely have – sufficient resources to continue developing or updating all the standard processes that had been deployed over the years, and that the various groups within the IPT needed to take ownership of their own processes and of the care and feeding of those processes. This part of the solution was relatively easy since the process group was already realizing that it played a more strategic role in the development of the IPT.

In working with the software team, as consultants we knew from the start that we had to play on their home turf. Culture trumps strategy … every time. We knew that they would reject us outright if we proposed that the process development project be executed in a waterfall approach, and we knew that we had to apply Agile methods to process development just as they do to software development. This has not been easy because there is not a plethora of published information about applying Agile to process development. As of this writing, it is still premature to claim success, but we are already seeing results in terms of getting tool-oriented people to accept standard processes on their own terms.

One of the ways we accomplished this was through our approach to Sprint 0. In Sprint 0, we identified functional requirements (“Rnnn”) for the process system that they – the software developers – both need and want. Each functional requirement was then broken down into epics (“Ennnn”), which would later be further broken down into user stories and finally into development tasks. Figure 1 shows a partial example of the Sprint 0 functional requirements and epics for the group’s defined processes.
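For readers without access to Figure 1, the sketch below shows the shape of that Sprint 0 backlog. The “Rnnn” and “Ennnn” identifier prefixes follow the approach described above; the example requirement wording and the user-story and task breakdown are hypothetical, since Figure 1 is not reproduced here.

# Minimal sketch of the requirement -> epic -> user story -> task hierarchy.
# Identifier prefixes ("Rnnn", "Ennnn") follow the article; the example content
# is invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Task:
    description: str

@dataclass
class UserStory:
    story_id: str
    narrative: str  # "As a <role>, I want <goal> so that <benefit>"
    tasks: list[Task] = field(default_factory=list)

@dataclass
class Epic:
    epic_id: str    # e.g., "E0001"
    title: str
    stories: list[UserStory] = field(default_factory=list)

@dataclass
class FunctionalRequirement:
    req_id: str     # e.g., "R001"
    statement: str
    epics: list[Epic] = field(default_factory=list)

# Hypothetical example of one Sprint 0 requirement on the process system itself.
r001 = FunctionalRequirement(
    req_id="R001",
    statement="A developer can find the defined process for a task in two clicks or fewer.",
    epics=[Epic(epic_id="E0001", title="Searchable, role-based process navigation")],
)
print(r001.req_id, "->", r001.epics[0].epic_id)

Capturing the process system’s requirements in the same backlog structure the developers already use for software was the point: the process work shows up in their tool, on their terms.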

In our consulting practice, we have also worked with numerous clients to develop their process systems using a traditional waterfall approach. The process is the product, and we employ most if not all of the proven project management and engineering practices that you would use to develop a product or a system. We facilitate meetings to elicit and define requirements for the process, develop a process architecture and design, develop the process system and verify that it meets the requirements, and then test or validate the process.

A Checklist for a Tools-Process Accord

One of our favorite tools is the process asset called a checklist. Checklists – no matter what their form or format – are a powerful tool for codifying the things you should do or should have done, without your trying to retain that to-do list in your head. So we think it’s appropriate to provide you – the reader – with a checklist (Table 2) for becoming an effective peace-keeper in bringing the tools and process camps closer together for the benefit of everyone in the organization.

Disclaimer:

CMMI® is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.


References and Notes

1. West, Michael. Return On Process (ROP): Getting Real Performance Results from Process Improvement. New York: CRC Press, 2014. Chapter 1, “Real Performance Improvement.”

Jitka West


Jitka West is a CMMI Institute-Certified Lead Appraiser and a Lean Six Sigma Black Belt. After working many years in information technology, data warehousing, and business intelligence, Jitka joined Natural SPI and has served as an industry thought-leader in process system design and development. When not developing advanced processes that help people reach higher levels of performance, she enjoys knitting with kitties in her lap and traveling with her husband, Michael.

Phone: 435-729-9101

E-mail: Jitka@naturalspi.com

Michael West


Michael West is a CMMI Institute-Certified Lead Appraiser and is co-founder of Natural SPI, Inc., a consultancy with 12 years of success in helping its clients achieve their performance improvement goals. Michael is the author of two books: 1) Real Process Improvement Using the CMMI (2004, CRC Press); and 2) Return On Process (ROP): Getting Real Performance Results from Process Improvement (2013, CRC Press). Mr. West is a novice beekeeper and enjoys travel with his wife, Jitka.

Phone: 435-901-4295

E-mail: michael@naturalspi.com
