Quick Note: Theory of Constraints and Six Sigma. November 24, 2013. Posted by Tim Rodgers in Process engineering, Quality.
Tags: factory quality, performance measures, process, six-sigma
Last week I attended the monthly meeting of the Northern Colorado chapter of the American Society for Quality. The featured speaker was Dr. Russ Johnson, President of Improvement Quest, a local management consulting firm. Dr. Johnson’s talk, “Creating a Culture of Harmony by Using the Theory of Constraints Concepts to Focus and Integrate Lean and Six Sigma,” included several interesting insights about how to integrate these strategies effectively in a production environment.
Of course the key to successful implementation of the Theory of Constraints is identifying the bottleneck, or constraint, in the production process and then optimizing the rest of the system around the constraint (“exploit, subordinate, elevate”) in order to maximize overall throughput while controlling inventory (including work-in-progress, WIP) and operating expense. At the risk of oversimplifying, Six Sigma can be described as “reduce variability,” and the lean philosophy is essentially “eliminate waste.”
These strategies are not different ways of solving the same problem. They can and should be implemented as elements in an integrated improvement effort. The trick is understanding that not all processes are equally good targets for a six sigma or lean improvement plan. It depends where the process is in relation to the constraint.
Any yield improvement or waste elimination upstream of the bottleneck doesn’t improve throughput, because it only increases the input to a constraint that is already running at capacity. In fact, it can be detrimental to the operation as a whole if it increases WIP and the associated material and operating expenses. The focus should instead be on downstream processes, where yield improvement or waste elimination effectively increases the capacity of the constraint. Scrap or rework that occurs after the constraint is especially damaging because it essentially requires another pass through the constraint.
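The upstream/downstream asymmetry can be sketched numerically. Here is a minimal three-stage model (the capacities, yields, and the `good_throughput` helper are all invented for illustration, not from the talk):

```python
def good_throughput(upstream_cap, bottleneck_cap, downstream_cap, downstream_yield):
    """Good units per hour for a line: upstream -> bottleneck -> downstream.

    Scrap after the bottleneck wastes bottleneck output that cannot be
    replaced, so downstream yield scales overall throughput directly.
    """
    into_bottleneck = min(upstream_cap, bottleneck_cap)  # the bottleneck limits flow
    processed = min(into_bottleneck, downstream_cap)     # downstream has spare capacity
    return processed * downstream_yield

base = good_throughput(100, 50, 80, 0.90)             # 45 good units/hr
faster_upstream = good_throughput(150, 50, 80, 0.90)  # still 45: no gain
better_yield = good_throughput(100, 50, 80, 1.00)     # 50: a real gain
```

Raising upstream capacity only piles up WIP in front of the constraint, while recovering the downstream yield loss is worth a full ten percent of system throughput in this toy example.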
The point is that you can’t assume that all improvements at the micro level are equally beneficial at the macro level. Yes, generally there’s value in reducing variability and eliminating waste, but when your resources are limited and you have to focus, consider the constraint and whether your improvements are really improving the metrics that matter.
Does Your Company Need a Quality Department? November 13, 2013. Posted by Tim Rodgers in Management & leadership, Process engineering, Product design, Project management, Quality, Supply chain.
Tags: early stage companies, factory quality, organizational models, outsourcing, process, product development, quality engineering, six-sigma, supply chain
You already have a quality department; you just don’t realize it. Do you have suppliers or service providers? You have people managing supplier quality whenever you receive parts or services that don’t meet your specifications. Is your product manufactured? Whether you build it yourself or outsource to a contract manufacturer, you’ve got quality issues. Do your customers have problems with your product or service? Somebody on your team is managing your response. Poor quality is costing you money, whether through internal rework or post-sale costs. The question is whether you want to pull all this activity together into a separate, centralized organization.
Some organizations, particularly early stage companies, may feel they can’t afford a dedicated quality team. After all, quality is fundamentally a non-value-added function. It doesn’t contribute directly to the delivery of a product or service. However, we live in a world of variability, where every step in the delivery process can cause defects. You may be passionate about eliminating defects and saving money, but do you really know how? Quality professionals understand how to determine root cause, and they can investigate from an impartial perspective. They have expertise in sampling and statistics, and that enables them to distinguish between a one-time occurrence and a downward trend that requires focused resources.
Do you care about ISO 9001 certification? If you do, you need someone to develop and maintain a quality management system, monitor process conformance, and host the auditors. If you’re in a regulated industry, you need someone to understand and communicate process and documentation requirements throughout your organization. Other responsibilities that could be assigned to the quality team include environmental, health and safety (EHS), new employee training, equipment calibration, and new supplier qualification.
All of these tasks can theoretically be handled by people in other functional groups, but you have to ask yourself whether you’re getting the results your business requires. Organizational design derives from a logical division of labor. The sales team is separate from product (or service) fulfillment so that one group can focus on the customer and another can focus on meeting customer needs. Fulfillment may require separate teams for development (design) and delivery. As the business grows, other functions are typically created to handle tasks that require specialized skills, such as accounting and human resources.
Quality is another example of a specialized function, one that can help identify and eliminate waste and other costs that reduce profit and productivity. Maybe those costs are tolerable during periods of rapid growth, but at some point your market will mature, growth will slow, and you won’t be able to afford to waste money anywhere in your value stream. That’s when you need quality professionals, and a function that can coordinate all the little quality management activities that are already underway in your organization.
Why We Need Quality Police. November 10, 2013. Posted by Tim Rodgers in Management & leadership, Organizational dynamics, Process engineering, Quality.
Tags: early stage companies, management, power, process, quality engineering
I’ve said it myself many times: the quality department shouldn’t be the quality police. We tell ourselves that everyone is responsible for quality, and we therefore ask people to police their own behavior and make the right choices. This sounds good and noble, and it’s certainly more cost-effective than relying on a separate functional group to keep an eye on things.
And yet: it seems to be the only way. We need quality police.
When we’re left on our own, we tend to look for the fastest and easiest way to complete our assignments. We don’t spend much time thinking about the priorities or needs of other groups, or how decisions have future consequences. To eliminate chaos, businesses establish work standards and processes to enable coordinated activities and a smooth flow of information. Certainly we want our work processes to be effective, but what matters most are the consistent results that are achieved when everyone follows the process.
Somebody has to keep an eye on all this, to check for process conformance and process improvement opportunities. Managers can monitor the performance of their assigned teams, but a manager will tend to optimize within their team according to their own objectives. Second-level or higher managers have a broader (and possibly cross-functional) perspective, but they probably lack a deeper understanding of the work processes.
If you have a quality team, this is their job. They’re the ones who pull together all the processes into a corporate quality management system (QMS). They’re the ones who train and audit the QMS, not just to make sure it’s being followed, but also to make sure it’s meeting the needs of the business. They’re the ones who monitor the performance of the processes to identify opportunities for improvement. And, if you care about ISO 9001 certification, they’re the ones who make sure you “document what you do, and do what you’ve documented.”
This isn’t the quality police looking for “process offenders” and punishing them. This is standardizing processes, reducing variability, and eliminating waste. Doesn’t every business want that?
Are Your Suppliers Really Committed to Quality? November 6, 2013. Posted by Tim Rodgers in Management & leadership, Process engineering, Quality, Supply chain.
Tags: factory quality, leadership, management, outsourcing, performance measures, process, quality engineering, six-sigma, supply chain, test & inspection, training
Suppliers always declare their commitment to the highest standards of quality as a core value, but many have trouble living up to that promise. I can’t tell you how many times I’ve visited suppliers who proudly display their framed ISO certificates in the lobby yet suffer from persistent quality problems that lead to higher cost and schedule delays. Here’s how you can tell if they’re really serious:
1. Do they have an on-going program of quality improvement, or do they wait until you complain? Do they have an understanding of the sources of variability in their value stream, and can they explain what they’re doing to reduce variability without being asked to do so? Look for any testing and measurements that occur before outgoing inspection. Award extra credit if the supplier can show process capability studies and control charts. Ask what they’re doing to analyze and reduce the internal cost of quality (scrap and rework).
2. Do they accept responsibility for misunderstandings regarding specifications and requirements? Or, do they make a guess at what you want, and later insist they just did what they were told? Quality means meeting or exceeding customer expectations, and a supplier who is truly committed to quality will ensure those expectations are clear before they start production.
3. Do you find defects when you inspect their first articles, or samples from their first shipment? If the supplier can’t get these right when there’s no schedule pressure, you should have serious concerns about their ability to ramp up to your production levels. By the way, if you’re not inspecting a small sample of first articles, you’ll have to accept at least half of the blame for any subsequent quality problems.
4. Has the supplier ever warned you of a potential quality problem discovered on their side, or do they just hope that you won’t notice? I realize this is a sign of a more mature relationship between supplier and customer, but a true commitment to quality means that the supplier understands their role in your value stream, and upholds your quality standards without being asked.
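The process capability studies mentioned in point 1 reduce to comparing the process spread against the specification limits. A minimal Cp/Cpk sketch, with invented measurements and spec limits:

```python
import statistics

def capability(samples, lsl, usl):
    """Return (Cp, Cpk) for a sample of measurements against spec limits.

    Cp compares the spec width to the process spread (6 sigma); Cpk also
    penalizes an off-center process. Values >= 1.33 are a common minimum
    target for a capable process.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# A roughly centered process measured against 10.0 +/- 4.0 spec limits
measurements = [9.2, 10.1, 9.8, 10.4, 9.9, 10.2, 9.7, 10.0]
cp, cpk = capability(measurements, lsl=6.0, usl=14.0)
```

A supplier who can produce numbers like these for their critical parameters, without being asked, is doing the ongoing work that point 1 describes.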
Ultimately, you will get the level of quality you deserve, depending on which suppliers you select and the messages you send them. You may be willing to trade quality for lower unit cost, shorter lead time, or assurance of supply. The real question is: What level of quality do you need? What level of poor quality can you tolerate?
Why We Need Processes (and Recipes). July 7, 2013. Posted by Tim Rodgers in Management & leadership, Process engineering.
Tags: change management, innovation, management, process
I enjoy cooking; it’s one of my few creative outlets. I used to tell people that there’s some deep connection between cooking and my early interest in laboratory chemistry, and maybe there’s something to that. At least with cooking you can eat your mistakes, most of the time. I’ve learned a few kitchen techniques, and I enjoy trying new recipes, particularly if the ingredients are accessible and the dish doesn’t take too much time to prepare.
I typically follow the recipe exactly the first time I try a new dish or dessert. That’s because I assume the creator of the recipe has done some trials and determined that this is the right sequence of steps and the right balance and ratio of ingredients that will yield the best result. As I’ve gained more experience I’ve become more confident in my ability to adjust the recipe to match my taste. However I make a point of writing down my changes and the results of those experiments so I can reproduce the outcome instead of relying on my memory of something I prepared weeks or months ago.
Last week I was trying to explain to someone why we need documented processes at work, and why it’s equally important to be able to edit them. If it’s important to get a consistent, predictable result, you should find a process that delivers that result and write it down so you don’t have to rely on institutional memory or the work habits of an individual employee.
If it turns out that you’re not getting the results you desire, or it costs too much, or there’s collateral damage, then you should definitely stop using that process. That doesn’t mean ignoring the process and giving up on the benefits of consistency and predictability. It means editing the process, or possibly creating a completely new one that meets your needs. Either way, those edits should be based on an understanding of what isn’t working. It may require several iterations to find a better process, but as one of my favorite TV chefs likes to say: “Your patience will be rewarded.” The alternative is chaos.
Innovative Design vs. Lean Product Development. April 17, 2013. Posted by Tim Rodgers in Management & leadership, Product design, Project management, Quality.
Tags: innovation, management, process, product development, six-sigma, strategy
I’ve been very busy focusing on my job search and some self-improvement projects, and unfortunately it’s been harder to find some time to address my accumulated backlog of topics. I regularly follow several group discussions on LinkedIn related to product development and quality, and lately a popular discussion topic is how to inspire innovation in product design.
See for example Wayne Simmons and Keary Crawford “Innovation versus Product Development” (http://www.innovationexcellence.com/blog/2013/04/12/innovation-versus-product-development/), and Rachel Corn’s blog “Is Process Killing Your Innovation?” (http://blog.cmbinfo.com/bid/87795/South-Street-Strategy-Guest-Blog-Is-Process-Killing-Your-Innovation?goback=%2Egde_2098273_member_229196205). The latter post quotes a former 3M vice president who says that Six Sigma killed innovation at 3M, apparently because 3M’s implementation of Six Sigma required “a full blown business case and even a 5-year business plan to get a new idea off the ground and into production.” The VP wonders: how do you institutionalize innovation without stifling it?
The conventional wisdom seems to be that product design is inherently a creative, right-brain activity that will fail or at least fall short if constrained by process. You can’t make art on a schedule.
I think this is a false conflict. I don’t see any reason why teams shouldn’t be able to conceive new designs within a structured and disciplined product development environment. Obviously the ultimate objective is to get a product to market, so at some point the experimentation must end, mustn’t it?
Six Sigma is about reducing variation. The lean movement is about eliminating waste. I understand that the early stages of product development may be wildly unpredictable and seemingly inefficient. Shouldn’t the latter stages focus on predictable outcomes, standardized processes, fast time-to-market, defect prevention, and efficient production?
Tags: leadership, management, power, process
I’ve been puzzling over this one for some time: Why is it so hard for companies to leverage best practices developed internally? At HP we used to think the problem was poor knowledge-sharing mechanisms within the corporation, especially across geographically-dispersed and independent business units, but I think it goes deeper than that. You can tell people to document and archive their processes on SharePoint, and you can host internal conferences to provide a forum for learning, but unless people are open to the possibility that there’s a better way, you’re going to waste money reinventing the wheel.
The “not invented here” syndrome leads to a bias against ideas that come from the outside. “They don’t understand our unique environment,” and, “Just because it works there doesn’t mean it will work here.” Even when compelled to use the new process, there’s often passive-aggressive undermining or outright sabotage. Unfortunately, these internal antibodies are often even more antagonistic towards ideas from elsewhere within the same company than towards those from outside. If we use someone else’s ideas, doesn’t that imply that they’re smarter than we are? We don’t want them to get the credit, do we?
Sorry, but the smarter one (and the more valuable one to the organization) is the person who focuses their attention on the unsolved problems instead of those that were already solved. We all build on the foundations of engineering and process development that came before. Of course the local environment may indeed be different, and that may require a tweaking of the imported process. However, senior leadership should encourage leveraging of internal processes as another example of maximizing return-on-assets, and both the exporter and importer should be recognized as efficient collaborators. Also, when teams insist on using their own process they should bear the burden of proof to explain why the company should incur the additional expense to maintain more than one means to accomplish the same goal.
Changing the Tires While Driving the Car. December 13, 2012. Posted by Tim Rodgers in Management & leadership, Process engineering, strategy.
Tags: 30-60-90 day plans, change management, leadership, management, process, strategy
That’s a phrase we often use to describe a chaotic work environment, but what if anything can be done when you’re faced with this situation? How should we manage when the current processes are incomplete, insufficient, ineffective, or even missing? How do you evaluate and implement process improvements without jeopardizing commitments to deliverables and performance metrics? Is there a logical way of managing these changes, or do we muddle through it, and later smile sympathetically when we hear about another manager’s struggles?
Obviously the whole point of introducing a new process or making a process change is to gain some improvement in performance, output, and/or cost. However there’s no getting around the fact that any process change will be accompanied by at least a short-term loss of productivity until you’re past the learning curve.
Will the current activities or projects continue long enough to benefit from an immediate change? If the benefit doesn’t outweigh the “distraction cost,” then it’s probably better to wait for scheduled downtime or a natural break between projects (in other words, wait until the car is stopped before changing the tires). If there is no natural break, then at least one project will have to pay the price so that future projects can realize the advantage. Which project can best tolerate the cost, or the risk of failure to meet scope or schedule requirements?
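One way to make this trade-off concrete is a simple break-even count: how many future projects must benefit before the change pays back its learning-curve cost. A sketch with hypothetical figures (the function name and numbers are mine, for illustration):

```python
import math

def breakeven_projects(learning_curve_cost, savings_per_project):
    """Projects needed before a process change pays back its one-time
    learning-curve cost (both measured in the same units, e.g. hours)."""
    return math.ceil(learning_curve_cost / savings_per_project)

# e.g. the switch costs ~120 hours of lost productivity up front
# and saves ~25 hours on each subsequent project
projects_to_payback = breakeven_projects(120, 25)  # 5
```

If fewer than that many projects remain before a natural break, deferring the change to the break is the cheaper option.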
One practical question is whether it’s even possible to switch processes in mid-stream. If you’ve already started with the old process, can you finish the job with a new one? Starting over again from the beginning should be considered a last resort, only practical if the existing process is so unsatisfactory that you’re willing to sacrifice time for better results.
Another key concern is whether or not the organization as a whole is aligned with the need for a process change. It may be politically useful to roll out the new process on a small scale in order to build support for broader implementation. On the other hand, if there’s enough critical mass, it can be highly effective to “burn the boats,” essentially making it impossible to return to the old process.
If it’s the right thing to do, it’s just a question of when. If the benefits can’t be clearly articulated, it will never be the right time.
Not So Fast: Baseline That Process Before Changing It. November 29, 2012. Posted by Tim Rodgers in Process engineering, Quality.
Tags: change management, process, quality engineering, six-sigma
Teams are usually in a big hurry to improve an under-performing process because there’s some degree of unhappiness or pain (usually financial) associated with it. The sooner the process improves, the sooner the pain goes away. However, in the rush to move the needle in the right direction, many teams fail to establish a performance baseline for the current process.

This matters because without an understanding of the current state you won’t know whether you’ve actually made any improvement. If your process is unstable and subject to special causes of variation, it’s impossible to tell whether the process improved because of your deliberate action or because of the influence of those special causes. In fact, those special causes may overwhelm and mask any positive effect of the intended process improvement.
I realize it may not always be practical to fully characterize a process and establish stability before trying to improve it, but if you can’t isolate and eliminate special causes then you can’t draw any conclusions about the success of your efforts.
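A minimal baselining sketch using 3-sigma limits on a daily defect count (the data are invented; a textbook individuals chart would estimate sigma from the average moving range, but the sample standard deviation keeps the sketch short):

```python
import statistics

def baseline(samples):
    """Center line and 3-sigma control limits for a process metric.

    Points outside the limits suggest special causes to investigate
    before crediting (or blaming) any deliberate process change.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu, mu - 3 * sigma, mu + 3 * sigma

daily_defects = [12, 9, 11, 10, 13, 8, 11, 10, 12, 9]
center, lcl, ucl = baseline(daily_defects)
special_causes = [x for x in daily_defects if not lcl <= x <= ucl]  # empty: stable
```

Only once the baseline shows a stable process can a post-change shift in the center line be attributed to the improvement itself.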
Business Processes: Configuration, Customization and Convergence. November 27, 2012. Posted by Tim Rodgers in Management & leadership, Organizational dynamics, Process engineering.
Tags: change management, leadership, management, process, product development
In the mid-00s, after Mark Hurd’s accession to the CEO office, HP’s senior leadership aggressively pursued overhead cost reduction to improve the firm’s profitability. HP’s CIO used this opportunity to consolidate IT functions, simplify processes, and eliminate redundancies due in part to “shadow IT” groups that had grown up during the previous era of independent business units and local autonomy.
One of the early targets was defect tracking: the systems used by product development teams for reporting, management, and disposition of defects found during pre-release testing. I don’t remember the exact count, but apparently the number of active defect tracking systems was in double figures. The IT team determined that convergence on a single, common, corporate-wide DTS hosted on corporate servers would lead to significant savings from reduced headcount and other support resources, plus harder-to-quantify gains in efficiency, productivity, and communication.
The DTS convergence project did not go smoothly. Many product development teams objected strongly to the plan. They claimed that they would not be able to meet schedule and quality commitments for projects already in-progress if they were forced to switch to a different DTS in mid-stream. Unfortunately it didn’t help that the corporate IT team seriously underestimated the effort required to manage the transitions. Some product development teams appealed to their business unit executive, looking for an exemption or at least a delay. Others adopted a passive-aggressive position that only hardened the resolve of the IT team.
The corporate IT team objected strongly to the idea of local customization of the DTS and supporting processes, arguing that the overall cost savings and other benefits would be significantly reduced if every product development team were allowed to run their own system. I think the IT folks would have had a better chance of success if they had acknowledged the value of local solutions and introduced a system that enabled local configuration instead of customization.
The core functionality of the DTS could be retained, and the support resources minimized, while allowing the local product development team to define or select options to match their familiar and preferred style. Certainly this would not have eliminated all resistance to the change, but it would have enabled a more balanced discussion of the benefits of a common DTS that recognizes and values the needs of product development.
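The configuration-versus-customization distinction can be sketched as a shared core with a small set of locally selectable options (the class and field names here are hypothetical, not HP’s actual system):

```python
from dataclasses import dataclass

@dataclass
class TrackerConfig:
    """Options a local team may set without forking the core system."""
    workflow_states: tuple = ("new", "open", "fixed", "verified", "closed")
    severity_levels: tuple = ("critical", "major", "minor")
    require_root_cause: bool = True

class DefectTracker:
    """Core behavior is shared; only the TrackerConfig varies per site."""
    def __init__(self, config=None):
        self.config = config or TrackerConfig()
        self.defects = []

    def open_defect(self, title, severity):
        if severity not in self.config.severity_levels:
            raise ValueError(f"unknown severity: {severity}")
        defect = {"title": title, "severity": severity,
                  "state": self.config.workflow_states[0]}  # initial workflow state
        self.defects.append(defect)
        return defect

# A site that wants a lighter workflow configures rather than forks:
site_b = DefectTracker(TrackerConfig(workflow_states=("open", "closed"),
                                     require_root_cause=False))
```

Customization, by contrast, would mean each site editing the `DefectTracker` code itself, which is exactly the support burden convergence is meant to eliminate.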
This happens frequently in business processes of all kinds, especially in multi-site organizations or in the aftermath of a merger/acquisition. One side pushes for convergence and the other side insists that they need a custom solution because of their unique requirements. The trick is to find a way to design a common system that can be locally configured without giving up the expected benefits in support cost and communication.