
Design for Manufacturability, Revisited April 29, 2012

Posted by Tim Rodgers in Product design, Quality, Supply chain.

I was going through some old papers the other day and discovered an article I wrote in September 1990 about design for manufacturability (DFM). I had just completed my first assignment at Hewlett-Packard’s now-defunct Printed Circuit Division, co-authoring a DFM manual for circuit designers to guide their decision-making when evaluating options.

It’s a curious thing to read something you’ve written years ago, particularly from a contemporary perspective derived from the accumulated experiences in-between. DFM became a popular acronym in the late 1980s, particularly following the pioneering work of Geoffrey Boothroyd and Peter Dewhurst to develop quantitative models for evaluating ease of assembly, but for a lot of people it was an academic concept with few examples of practical application.

I’m not sure things have changed all that much. I think there are still a lot of people who consider DFM to be an item on a checklist, or maybe some kind of antagonistic negotiation between designers and suppliers to find an acceptable middle ground that meets most of the performance requirements with a minimum of complaint from the supply chain. DFM is more than that: it enables a design that matches the current process capabilities of the supply chain at the lowest cost, and that cost, when understood by all parties, should be the basis for pricing negotiations.

Back in 1990 I wrote about the effect of manufacturing process improvements vs. product improvements as a means of getting higher performance at optimum cost. The expectation is that higher design complexity is purchased at higher cost. In order to reduce the cost, suppliers may be expected to implement improvements (automation, new processes, yield increases) so the same design can be produced at lower cost, or higher complexity can be produced at the same cost.

DFM has a different effect. If the complexity of the design can be reduced using DFM without sacrificing performance, then in theory cost can be reduced without introducing additional changes in manufacturing.

In 1990 HP was building their own printed circuits using internal factories, so we had a pretty good understanding of manufacturing cost drivers. We were able to build a quantitative DFM model that helped our designers assess the relative cost of different options and find the optimum combination of performance and cost.
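The original model isn’t reproduced in the article, but the idea behind such a model can be sketched in a few lines. Everything below is hypothetical: the cost drivers (layer count, minimum trace width, drill count), the weights, and the penalty values are illustrative placeholders, not HP’s actual figures. The point is only that known process cost drivers can be turned into a relative cost index so designers can compare options quantitatively.

```python
# Toy sketch of a quantitative DFM cost model, loosely in the spirit of the
# manual described above. All cost drivers, weights, and penalty values are
# hypothetical, chosen only to illustrate scoring design options against
# known process cost drivers.

def dfm_relative_cost(layers, min_trace_mils, drill_count):
    """Return a relative cost index for a printed-circuit design option.

    More layers, finer traces, and more drilled holes all push
    manufacturing cost up; the weights are illustrative only.
    """
    base = 1.0
    layer_penalty = 0.15 * max(0, layers - 2)          # each layer beyond 2
    trace_penalty = 0.5 if min_trace_mils < 6 else 0.0  # fine-line surcharge
    drill_penalty = 0.0002 * drill_count                # per-hole handling cost
    return base + layer_penalty + trace_penalty + drill_penalty

# Comparing two design options intended to meet the same requirements:
option_a = dfm_relative_cost(layers=6, min_trace_mils=5, drill_count=1200)
option_b = dfm_relative_cost(layers=4, min_trace_mils=8, drill_count=900)
```

If option B still meets the performance requirements, the model makes the cost of the extra complexity in option A explicit, which is exactly the kind of trade-off a designer can’t see when pricing is disconnected from cost drivers.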

That’s obviously a lot tougher when you’re outsourcing to external suppliers who have no desire to share their internal cost drivers with their customers, and in many cases may have little understanding of those drivers in the first place. This means that rigorous application of DFM by conscientious designers to reduce cost may not correlate to any significant improvement in pricing, because the price has no relationship to the cost. This is particularly problematic when the folks on the design side also have no understanding of cost drivers and can’t negotiate from knowledge.

What’s to be done? Our goal in 1990 was to assess the cost of every design decision, something that’s probably impossible today without full transparency. Nevertheless, it should be made clear to the supply chain that any design improvement intended to reduce part cost, simplify assembly, or increase factory yield is expected to lead to a reduction in the agreed-upon price. I think it’s worth offering to share the cost savings with the supplier or provide some other incentive to identify and qualify DFM opportunities. Otherwise DFM will just be an item on a checklist, another acronym with no real value.
