By Tom Lienhard


Abstract

Where does our knowledge of CMMI® High Maturity come from? Usually from CMMI training classes, CMMI conferences, CMMI lead appraisers, consultants, and other CMMI “experts.” And how can we forget the “upfront material” of the CMMI and the infamous page 80 of CMMI Ver 1.1? What if all of these sources were wrong or, at best, painted only half the picture? After all, they all stem from the same origin.

My personal evolution in understanding High Maturity is depicted in Figure 1 below. It started with the Software CMM® (SW CMM), where high maturity was usually tied to statistical control of defects found in peer reviews. I later became a Six Sigma BlackBelt and learned all about variation and the analysis of variation. From there I was exposed to the CMMI and attended the Understanding High Maturity Practices class at the SEI, where I learned that High Maturity was about control charts. It was not until I truly understood that taking advantage of High Maturity Practices means identifying business objectives, and what influences achieving those objectives, that I was able to pull my knowledge together and fully comprehend the potential of an organization that implements High Maturity Practices.

Raytheon Missile Systems (RMS) achieved SW CMM Level 5 in 2001 by statistically controlling defects detected in the software development process. Improvement was realized and a return on investment was made; however, programs were still struggling to produce product at a price the customer was willing to pay. So what went wrong? RMS was Level 5.

Was the objective of High Maturity to identify an iterative process so statistical process control (SPC) could be applied? Was it to hang a Maturity Level 5 sticker on the wall? Or was it to identify true business objectives and the key processes that impact those objectives, then statistically control those key processes to maximize the probability of meeting the objectives? We needed to step back, understand the key business objectives, and concentrate on achieving them, not simply follow what had, up until then, passed for “success” in achieving High Maturity.

RMS could be described as a “high-volume prototype” factory. Although RMS builds families of weapons (missiles, projectiles, etc.), each has its own unique objectives. Some are surface launched, some are launched from aircraft. Some have rocket motors, some glide. Some are small, some are large. Some are guided by GPS, some by laser, and some by an Inertial Measurement Unit (IMU).

RMS needed to change its approach. Instead of designing a product, implementing that design, and then redesigning the product because the design was too expensive or could not be built in the volume needed, RMS needed to understand the intended use of the product, make capability trades around affordability and producibility, and then design the product to maximize affordability and producibility, as shown in Figure 2.

The paradigm shift was necessary because a business cannot survive if it designs technically excellent products that cannot be produced at a price the customer is willing to pay. Analysis showed that 70% of product lifecycle costs were determined prior to the start of development, yet over 75% of the cost is spent after development. In other words, once the design is on paper, over 70% of the cost is locked in.

The ah-ha moment came when RMS realized that the product lifecycle expanded when the product was more than software; see Figure 3. The SW CMM caused the lifecycle to be seen as development, since there was no manufacturing associated with software. When the SW CMM was sunset and the CMMI took over, it was easy to duplicate the software solution for high maturity (statistically control the peer review process), replicate it for systems and hardware, and claim victory. But keeping the status quo, focusing on just the development lifecycle, would completely miss RMS’ business objectives: reduce the Average Unit Production Cost (AUPC), reduce scrap, and increase yield.

Focus needs to be on the entire product lifecycle, from pre-concept through production, not just development. Optimizing only the development lifecycle might actually increase overall lifecycle costs or, worse, negatively impact the business objectives (e.g., increase AUPC, increase scrap, and decrease yield). Production is ultimately where RMS will make a profit or lose its shorts. A small savings per unit in production can add up to far more than the entire development cost; refer to Figure 4. RMS was caught in the paradigm that the SW CMM, CMMI-DEV, and industry had created: focusing High Maturity Practices on development.
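
To make the arithmetic concrete, here is a hypothetical illustration (the numbers are invented for the example, not RMS or program data): a modest per-unit savings, multiplied across a high-volume production run, quickly exceeds a development budget measured in the millions.

    # Hypothetical illustration only -- figures are invented, not RMS data.
    per_unit_savings = 200.0        # dollars saved per unit in production
    production_volume = 50_000      # units produced over the program's life
    development_cost = 8_000_000.0  # total development cost, dollars

    production_savings = per_unit_savings * production_volume
    print(f"Production savings: ${production_savings:,.0f}")      # $10,000,000
    print(f"Exceeds development cost: {production_savings > development_cost}")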

This was an epiphany. No longer think of the CMMI in terms of software, hardware, and systems, but in terms of System Development. Remember what is critical to RMS’ business: production needs to be the emphasis over development. Production is where cost and time are either minimized or super-inflated. RMS is willing to invest more resources in development in order to streamline production. For software, production is virtually “CTRL-C” and rarely impacts design decisions. For hardware, production is extremely complex and very much impacted by design decisions. The scope of the lifecycle does not stop at the end of development; it should include manufacturing (production) and fielding. There was a profound shift in focus from the typical Software Development 1st, Systems and Hardware Development 2nd to Production 1st and Development 2nd.

In the 1950s, SPC was applied to product. Starting in the 1980s, thanks in part to the CMM and CMMI, SPC began to be applied to processes during development. What RMS is doing in the 2010s is looking at the mission objectives of the fielded product in the pre-concept phase and developing process and product performance models to predict the capability of the process to produce products that meet the customers’ needs at a price they can afford. This enables RMS to compose a defined process that maximizes the probability of meeting the key objectives and a design that will not need to be redesigned once it transitions into production; see Figure 5.
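
A minimal sketch of the prediction step, assuming a toy transfer function and invented parameter distributions (none of this reflects RMS models, tools, or data): Monte Carlo samples of the key process and design parameters are propagated through the transfer function to estimate, before any hardware exists, the probability that unit cost stays under the customer’s affordability target.

    import random

    # Assumed transfer function: unit cost as a function of two key
    # characteristics (purely illustrative, not an RMS model).
    def unit_cost(machining_tolerance_mm, yield_fraction):
        base = 1_000.0
        # Tighter tolerances cost more; lower yield spreads scrap over good units.
        return (base + 50.0 / machining_tolerance_mm) / yield_fraction

    TARGET = 1_800.0   # assumed affordability target, dollars per unit
    N = 100_000
    hits = 0
    for _ in range(N):
        tol = random.gauss(0.10, 0.02)            # assumed process distributions
        yld = min(random.gauss(0.92, 0.03), 1.0)
        if tol > 0 and yld > 0 and unit_cost(tol, yld) <= TARGET:
            hits += 1
    print(f"Predicted probability of meeting the cost target: {hits / N:.2%}")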

The process and product performance models allow predictions to be made throughout the entire lifecycle by different groups for different purposes, as shown in Figure 6. Beginning in the pre-concept phase, models and historical baselines from previously built systems are used to determine whether the concept is even feasible. These models gain fidelity as they progress through the lifecycle; when actual data becomes available, the models are recalibrated. In development, the models are used to predict performance, producibility, and affordability and to optimize the design prior to “bending metal.” During production, RMS transitions from models and simulations to SPC. The collection of models and resulting baselines is captured across the business for future programs to leverage, and the cycle begins again.
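
Once actual production measurements exist, the same key characteristics can be monitored with SPC. Below is a minimal sketch of an individuals control chart check, with invented measurement data (not RMS data):

    # Hypothetical SPC check on a key characteristic using actual
    # production measurements (values are invented for illustration).
    measurements = [10.02, 9.98, 10.05, 9.97, 10.01, 10.08, 9.95, 10.03]

    mean = sum(measurements) / len(measurements)
    moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_est = mr_bar / 1.128            # d2 constant for subgroups of size 2
    ucl, lcl = mean + 3 * sigma_est, mean - 3 * sigma_est

    for i, x in enumerate(measurements, 1):
        if x > ucl or x < lcl:
            print(f"Unit {i}: {x} is outside control limits ({lcl:.3f}, {ucl:.3f})")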

For RMS, the goal is to balance performance, producibility, and affordability to design a product that meets the customers’ needs at a price the customer can afford. This is embedded in RMS’ common process and is institutionalized via a plethora of statistical tools and techniques contained in the Raytheon Six Sigma Toolbox, including Quality Function Deployment, Sensitivity Analysis, Design of Experiments, Reliability Predictions, Design for Manufacturing Analysis, Process Modeling, Producibility Assessments, Cost as an Independent Variable, and Process Capability Analysis.

Once the true business objectives are understood, the first step is to identify and understand what the customer needs. After that is established, transfer functions (or models) can be developed using tools such as the Process Capability Analysis Toolset (PCAT), Design Capability Analysis Tool (DCAT), Design and Analysis of Simulation Experiments (DASE), and the Raytheon Analysis of Variation Engine (RAVE). These models help identify the influential factors, or key characteristics, that have the greatest impact on the customer needs. (In CMMI terms, these are the subprocesses that will be statistically controlled.) The models can then be used to perform “what-ifs”: the knobs can be turned to determine where to set these key characteristics to maximize the probability of meeting the customer needs. These settings are captured in a control plan that is used during production to ensure the key characteristics remain within range. This is iterated at each subassembly and component level as appropriate.
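
PCAT, DCAT, DASE, and RAVE are Raytheon-internal tools, so the sketch below is only a generic stand-in for the “what-if” step: sweep the candidate settings of a key characteristic through an assumed transfer function and keep the setting that maximizes the predicted probability of meeting the customer need. The transfer function, requirement, and candidate settings are all invented for illustration.

    import random

    # Assumed transfer function: pointing error as a function of a key
    # characteristic (bearing tolerance) -- purely illustrative.
    def pointing_error(bearing_tolerance_um):
        return 0.05 + 0.002 * bearing_tolerance_um + random.gauss(0.0, 0.01)

    REQUIREMENT = 0.12   # maximum acceptable error (assumed customer need)

    best_setting, best_prob = None, -1.0
    for setting in (10, 20, 30, 40, 50):   # candidate "knob" positions, micrometers
        trials = [pointing_error(setting) for _ in range(20_000)]
        prob = sum(e <= REQUIREMENT for e in trials) / len(trials)
        print(f"tolerance {setting} um -> P(meet requirement) = {prob:.2%}")
        if prob > best_prob:
            best_setting, best_prob = setting, prob
    print(f"Candidate control plan setting: {best_setting} um")

In practice the producibility and cost models would constrain how tight such a setting can be, which is exactly the over-design versus under-design trade discussed next.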

This process helps eliminate over-design (high cost) and under-design (high scrap, high rework, and low quality) to find the sweet spot, allowing RMS to design a product that meets the customers’ needs and can be affordably produced. This is done from pre-concept through production. In CMMI terms, programs are predicting performance and optimizing the design using models and simulations prior to design and development, without collecting actual measures, a.k.a. doing Level 5 before Level 4 without data!

Disclaimer:

CMMI® and CMM® are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.


Tom Lienhard


Tom Lienhard is a Sr. Principal Engineer at Raytheon Missile Systems’ Tucson facility and a Six Sigma BlackBelt. Tom has participated in more than 50 CMM® and CMMI® appraisals in both DoD and commercial environments across North America and Europe and was a member of Raytheon’s CMMI Expert Team. He has taught Six Sigma across the globe and helped various organizations climb the CMM and CMMI maturity levels, including Raytheon Missile Systems’ achievement of CMMI Level 5. He has received the AlliedSignal Quest for Excellence Award, the Raytheon Technology Award, and the Raytheon Excellence in Operations and Quality Award. Tom has a BS in computer science and has worked for Hughes, Raytheon, AlliedSignal, Honeywell, and as a consultant for Managed Process Gains.

Phone: 520-663-6580

E-mail: thomas_g_lienhard@raytheon.com

