A hackneyed maxim says that PCBA testing is a non-value-added activity. Really? Did you know PCBA testing has the unique ability to make revenue out of thin air? If that isn’t creating value, then what is?
Herewith is an episodic do-it-yourself guide to making something – basically, money – from nothing – basically, data. Keep in mind this disclaimer: What follows really happened, often multiple times. However, this is the sanitized version. You see and hear many things in the testing business.
Episode one. Ceramic capacitors have a well-documented propensity to lose their ability to hold a charge over time. In engineering parlance this is called aging. Aging happens because the barium titanate-based crystalline structure within a ceramic capacitor realigns itself when the device is heated (as in a reflow oven) above roughly 125°C, a temperature known as the Curie point. As the part cools after assembly, the structure gradually relaxes again, and the dielectric constant – and with it the capacitance – decays logarithmically with time. This phenomenon is especially pronounced in X7R, Y5V and Z5U (so-called Class II) dielectrics.
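The arithmetic is worth seeing. As a back-of-the-envelope sketch (the function name and rates are ours for illustration; real aging rates come from the capacitor datasheet), Class II parts lose roughly a fixed percentage of capacitance per decade of hours after the last excursion above the Curie point:

```python
import math

def aged_capacitance(c0_nf, hours_since_reflow, aging_rate_pct_per_decade):
    """Estimate capacitance after logarithmic aging.

    Class II ceramics lose roughly a fixed percentage of capacitance per
    decade of hours after the last trip above the Curie point. Typical
    rates (illustrative; vendor datasheets vary): X7R ~1-2.5%/decade,
    Z5U/Y5V ~3-7%/decade.
    """
    decades = math.log10(max(hours_since_reflow, 1.0))
    return c0_nf * (1.0 - aging_rate_pct_per_decade / 100.0 * decades)

# A nominal 100 nF Z5U part, aging at 5%/decade, 1,000 hours after reflow:
c_now = aged_capacitance(100.0, 1000.0, 5.0)  # three decades -> ~15% loss
```

Three decades (1,000 hours) at 5% per decade is a 15% loss: the nominal 100 nF part now reads about 85 nF, below a –10% lower limit, with nothing actually defective.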
The immediate practical effect of this geriatric onset is that the capacitor fails in-circuit test, typically registering a capacitance value on the tester lower than nominal, and often lower than permitted by the prescribed tolerance. When this happens, the board fails.
That, obviously, is a problem. Bummer.
There is a solution. Actually, three solutions:
Can you repeat that third solution?
Shocking as this may seem, the solution most often requested by customers in our experience is No. 3.
If that request happens to come to us from a contract manufacturer (CM), we routinely refuse it unless written permission is first obtained from the OEM. For some reason, that permission is seldom forthcoming when requested.
Once again, testing guys have plenty of ways of adding value. Were we so disposed.
Episode two. A contract manufacturer recently brought a handful of failed boards to us for evaluation. The boards had failed the OEM’s functional test, and the OEM, never having experienced such a high fallout before, was incredulous. Knowing we had accumulated tribal knowledge of this board, the OEM directed the CM to us for assistance and consultation. We examined the boards and then had a few questions of our own for our customer, to wit:
Q: “How were these failures discovered?”
A: “By our customer, the OEM, during functional testing.”
Q: “But you have stamps on each of these boards for AOI, AXI, FP and, in some cases, ET. What’s up with that?”
A: “We did no testing of these boards.”
Q: “Say again? We see commonly used, conventionally-accepted test-related stamps all over these boards.” And we have pictures. In a very secure, undisclosed location. Serialized.
A: “We did no testing of these boards.”
We’d bet the low bidder got the job on this one. We’d also bet that, up to this point, you considered testing an impediment to higher margins. Think again. More added value, in a valueless way (if you know what I mean). Remember, these stories are real.
Episode three. Another customer reported a problem with an ICT fixture and program we designed. It was failing certain capacitors high. Naturally, in the mind of our customer this was our fault and our problem, since we created the fixture and program. Perish the thought that there might be another root cause. But there was.
A little basic research and digital gumshoe work revealed they had installed the wrong-value capacitor – for the second time. This same customer had contacted us one month prior about this same problem, on a previous lot. And for the second time, our customer was trying to convince their OEM customer that the fault, being so obviously ours, was therefore our obligation to remove. Their proposed remedy: simply change the measurement frequency, thereby lowering the measured value to within the acceptable tolerance, and the problem goes away. Implied was that competitors had done this in the past with no muss, no fuss. Nothing to see here.
However, this view failed to account for the fact that the series impedance of the tester becomes a significant component of the measured value as frequency increases. Raising the frequency would indeed appear to bring the tested component within the specified tolerance, but only by way of a systematic error our customer neglected (?) to correct for. They simply wanted us to accept their results as good. Which we didn’t.
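To see why, consider a simplified model (ours, with illustrative values, not the tester’s actual measurement algorithm): if an instrument infers capacitance from the magnitude of the measured impedance while treating the path as purely capacitive, any series resistance in the fixture and tester shrinks the reading as frequency rises:

```python
import math

def apparent_capacitance(c_true_f, r_series_ohm, freq_hz):
    """Capacitance inferred from |Z| while ignoring series resistance.

    Actual impedance of the path: Z = R_s - j/(omega*C).
    A naive reading assumes Z is purely capacitive, C_app = 1/(omega*|Z|),
    which works out to C / sqrt(1 + (omega*R_s*C)^2) -- so the apparent
    value drops as the measurement frequency increases.
    """
    omega = 2.0 * math.pi * freq_hz
    z_mag = math.hypot(r_series_ohm, 1.0 / (omega * c_true_f))
    return 1.0 / (omega * z_mag)

# Illustrative: a wrong-value 1.2 uF part in a 1.0 uF +/-10% location,
# measured through an assumed 1 ohm of series path resistance.
low_f = apparent_capacitance(1.2e-6, 1.0, 1e3)     # ~1.20 uF: fails high
high_f = apparent_capacitance(1.2e-6, 1.0, 100e3)  # ~0.96 uF: "in tolerance"
```

Crank the frequency and the wrong part magically “passes” – not because it is correct, but because the unmodeled series impedance has been folded into the answer.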
We get paid to report what is, not what is wished. To the specified tolerances of the BoM and the schematic. Math doesn’t lie. Nor should we.
After we demonstrated, by means of a little math and physics and to the OEM’s satisfaction, that the measurement frequencies assigned to this portion of the test were correct, the OEM summoned some backbone and, to our unexpected delight, admonished our customer for installing the wrong part and trying to pass it off as good. He also ordered our customer to stop wasting everybody’s time and install the correct-value part.
Once again, we could have added value in this scenario by making the CM’s problem vanish. All it takes is a few small, seemingly inconsequential adjustments – and a big lack of ethics.
Episode four. Our company does flying probe test programming. In quoting situations we are occasionally challenged to explain why our pricing is significantly higher than our competition’s. There are numerous reasons for quote disparities, but one that arises with distressing regularity is the comparison of our fully debugged, parametric test program to a “learned” test program, i.e., one derived by software from a loaded board of suspect quality (neither good nor bad, but most assuredly not golden). We are made to look bad, and we lose the job, because the nontechnical buyer has no idea of the qualitative difference between the programs and often doesn’t care. Our thoroughness renders us noncompetitive, reinforcing the belief that No Good Deed Goes Unpunished.
Moral: Shorter, learned programs cost less and, therefore, save the purchaser money. More value added. However, caveat emptor.
Episode five. We’re always on the lookout for qualified test engineers to hire. A daunting task. Good test engineers don’t grow on trees, and there is no set academic curriculum anywhere that produces test engineers of any caliber. So we search long and hard. Foremost among our requirements is Agilent 3070 experience. We seek individuals who, once in possession of the customer’s design data and pertinent documentation, can design an ICT fixture, create and debug its accompanying program, and deliver both – production ready – to the customer or the customer’s CM. The successful candidate also should possess good verbal and written communication skills, to explain himself to lay and technical personnel alike, and to train CM personnel in the proper use of his creation. Easy.
Wrong. Typically we get resumes of technicians. Barely. Often proficient in “fine-tuning” Agilent 3070 programs. This is code. “Fine-tuning” is a euphemism for tweaking test parameters and tolerances to make nagging little problems, like failures, go away. For examples see Episodes 1 and 3 above. Meanwhile, gotta ship.
We know how to do this too. Adding value once again.
If we were so inclined.
Value? Are you kidding me? I got your value right here!
Robert Boguski is president of Datest Corp., (datest.com); firstname.lastname@example.org. His column runs bimonthly.