One-Dimensional Assembly Tolerance Stack-Up


  • CATS 1-D XL: One-Dimensional Tolerance Stackup Spreadsheet, by Ken Chase and Jonathan Wittwer, Brigham Young University

  • Complex Assemblies

  • Process Variation: A frequency plot shows how a process varies from the mean. [Figure: Normal distribution with the mean, ±1σ and ±3σ marks, LL and UL limits, and rejects in the tails]

  • Models for Predicting Assembly Tolerance Stackup:

    Worst Case (WC)
    Statistical (RSS)
    Six Sigma (6σ)
    Measured Data (Meas)

  • Models for Predicting Assembly Tolerance Stackup

    Worst Case (WC)
      Stack formula: Tasm = Σ |Ti|
      Predicts: Extreme limits of variation. Not statistical.
      Application: Critical systems. No rejects permitted. Most costly.

    Statistical (RSS)
      Stack formula: Tasm = sqrt(Σ Ti²)
      Predicts: Probable variation. Percent rejects.
      Application: Reasonable estimate. Some rejects allowed. Less costly.

    Six Sigma (6σ)
      Stack formula: Tasm = 3 sqrt(Σ (Ti / (3 Cpki))²)
      Predicts: Long-term variation. Percent rejects.
      Application: Drift in mean over time is expected. High quality levels desired.

    Measured Data (Meas)
      Stack formula: Tasm = 3 sqrt(Σ σi²)
      Predicts: Variation using existing parts. Percent rejects.
      Application: After parts are made. What-if? studies.
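    To make the table concrete, here is a minimal Python sketch of the four stack formulas applied to a hypothetical five-dimension chain. All nominal dimensions, tolerances, capability values, and measured sigmas below are invented for illustration; they are not taken from the CATS 1-D spreadsheet.

```python
# Hedged sketch: the four stackup models on an invented dimension chain.
from math import sqrt

nominals   = [1.000, 2.500, 0.750, 1.250, 0.500]   # hypothetical mean dimensions
tolerances = [0.010, 0.015, 0.005, 0.010, 0.008]   # drawing tolerances, assumed = 3 sigma

mean_asm = sum(nominals)                            # assembly mean: sum of the chain

# Worst Case: absolute sum of tolerances, extreme limits, no statistics.
t_wc = sum(abs(t) for t in tolerances)

# Statistical (RSS): root sum of squares of the 3-sigma tolerances.
t_rss = sqrt(sum(t ** 2 for t in tolerances))

# Six Sigma: each sigma_i = Ti / (3 Cpk_i), modeling long-term mean drift.
cp, k = 2.0, 0.25                                   # assumed capability and shift fraction
cpk = cp * (1 - k)
t_6s = 3 * sqrt(sum((t / (3 * cpk)) ** 2 for t in tolerances))

# Measured Data: RSS of actual measured standard deviations (hypothetical values).
sigmas_meas = [0.003, 0.004, 0.002, 0.003, 0.002]
t_meas = 3 * sqrt(sum(s ** 2 for s in sigmas_meas))

print(f"mean={mean_asm:.3f}  WC={t_wc:.4f}  RSS={t_rss:.4f}  "
      f"6s={t_6s:.4f}  Meas={t_meas:.4f}")
```

    Note that with Cpk > 1 the 6σ estimate comes out tighter than RSS, because the parts are assumed to be made by processes more capable than 3σ; the k factor then loosens it relative to using Cp alone.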

  • Centering the Mean: Centering the process in the LL/UL window minimizes rejects. [Figure: off-center distribution with rejects beyond the nearer limit]

  • Decrease Variation: Decreasing the standard deviation reduces rejects. [Figure: narrowed distribution within the ±3σ LL/UL window]

  • Higher σ Values Mean Higher Quality

  • Controlling Variation Leads to Higher Yields and Fewer Defects

    Limits    Yield           Defects
    ±3σ       99.73%          2700 / million
    ±4.5σ     99.99966%       3.4 / million
    ±6σ       99.9999998%     2 / billion

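    The yields and defect rates in the table can be checked with the standard normal CDF; here is a short sketch using only the Python standard library. One caveat: the 2700 ppm and 2 per billion figures are two-tailed values for a centered process, while the familiar 3.4 ppm figure at 4.5σ is the one-tailed (Motorola) value; the two-tailed value at 4.5σ is 6.8 ppm.

```python
# Checking the limits/yield/defects table with the standard normal CDF.
from statistics import NormalDist

phi = NormalDist().cdf                    # standard normal cumulative distribution

for z in (3.0, 4.5, 6.0):
    one_tail = 1 - phi(z)                 # rejects beyond one limit
    two_tail = 2 * one_tail               # rejects beyond both limits, centered process
    print(f"{z} sigma: yield {100 * (1 - two_tail):.7f}%, "
          f"two-tail {two_tail * 1e6:.5g} ppm, one-tail {one_tail * 1e6:.5g} ppm")
```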

  • Mean Shifts Happen: Mean variation is time-dependent due to tool wear, temperature, etc. [Figure: process mean drifting over time]

  • Accounting for Mean Shifts: The Six Sigma model allows the engineer to model assembly variation due to mean shifts. [Figure: distribution shifted 1.5σ off center, leaving 4.5σ to the nearer of LL and UL]

  • Measures of Process Capability. Process Capability Index: Cp = (UL - LL)/(6σ). Cp adjusted for the mean shift fraction k: Cpk = Cp (1 - k). [Figure: distribution with its 3σ and 6σ spreads between LL and UL]
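    A minimal sketch of the two indices, using assumed limits and process values chosen so that Cp = 2.0 (a 6σ process) and a mean shift of k = 0.25 gives Cpk = 1.5.

```python
# Hedged sketch of the process capability indices; limits and sigma are assumed.
def cp(ul, ll, sigma):
    """Cp: ratio of the tolerance band to the 6-sigma process spread."""
    return (ul - ll) / (6 * sigma)

def cpk(ul, ll, mean, sigma):
    """Cpk: distance from the mean to the nearest limit, in 3-sigma units."""
    return min(ul - mean, mean - ll) / (3 * sigma)

UL, LL, SIGMA = 10.30, 9.70, 0.05
print(cp(UL, LL, SIGMA))                      # 2.0 -> 6-sigma capability
print(cpk(UL, LL, mean=10.075, sigma=SIGMA))  # 1.5 -> Cp * (1 - 0.25)
```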

  • 6σ Variation Defined. RSS: σasm = sqrt(Σ σi²), with σi = Ti/3. 6σ: σasm = sqrt(Σ (Ti/(3 Cpki))²), where Cpki = Cpi (1 - ki).

  • Six Sigma Dynamic Mean Shift: As the tool wears, the mean of the most recent parts shifts from the left (near the minus tolerance, tool new) to the right (near the plus tolerance, tool old). [Figure 7, Scenario 3: distributions of parts when the tool is new and when the tool is old]

  • Comparing Short and Long Term Variation: Within ±3σ limits (LSL/USL), a centered process yields 99.73% in the short term but only 95.45% in the long term as the mean drifts. [Figure: narrow short-term distribution versus the wider long-term spread]

  • Requirements for High Quality: A goal of 4.5σ long term requires a 6σ process in the short term. 6σ capability: 99.9999998% yield short term, 99.99932% long term. [Figure: ±6σ limits at LL and UL]

  • Excel Statistical Functions

    NORMDIST(x, mean, stand_dev, cumulative_flag) = area under the Normal distribution up to point x (TRUE), or the density at x (FALSE)
    NORMSDIST(z) = area under the Standard Normal distribution up to z
    STANDARDIZE(x, mean, stand_dev) = (x - mean)/stand_dev
    NORMSINV(probability) = z corresponding to a given probability
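    For readers without Excel at hand, these functions map onto Python's standard library statistics.NormalDist; the numeric arguments below are arbitrary examples.

```python
# Rough Python equivalents of the Excel statistical functions listed above.
from statistics import NormalDist

nd = NormalDist(mu=10.0, sigma=0.05)      # assumed process mean and sigma

print(nd.cdf(10.1))                       # ~ NORMDIST(10.1, 10, 0.05, TRUE)
print(NormalDist().cdf(2.0))              # ~ NORMSDIST(2)
print((10.1 - 10.0) / 0.05)               # ~ STANDARDIZE(10.1, 10, 0.05)
print(NormalDist().inv_cdf(0.99865))      # ~ NORMSINV(0.99865), about 3.0
```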
  • Statistical Function Accuracy (*the 6σ value is from an alternate source, accurate to 12 places)

  • CATS 1-D Spreadsheet

  • CATS 1-D Inputs

  • Mean and Variance Comparison

  • Yield and Rejects: Calculated in σ units, %, or parts-per-million.

  • Standard Normal Distribution: The transformation z = (x - mean)/σ maps any Normal distribution onto one with mean 0 and standard deviation 1.0, with the limits transformed to ZL and ZU. Used to determine % rejects from standard tables. Does not show mean shift, and transformed data all looks alike. [Figure: standard Normal curve with ZL and ZU marked]
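    The transformation and table lookup this slide describes can be sketched in a few lines; the mean, sigma, and limits below are assumed example values.

```python
# Transform assumed design limits to z-scores, then read yield off the CDF.
from statistics import NormalDist

mean, sigma = 10.00, 0.05          # assumed assembly mean and standard deviation
LL, UL = 9.85, 10.15               # assumed design limits (here +/- 3 sigma)

z_l = (LL - mean) / sigma          # lower limit in sigma units (Z_L)
z_u = (UL - mean) / sigma          # upper limit in sigma units (Z_U)

phi = NormalDist().cdf
yield_frac = phi(z_u) - phi(z_l)   # area between the transformed limits
print(f"yield = {yield_frac:.4%}, rejects = {(1 - yield_frac) * 1e6:.0f} ppm")
```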

  • Modified Normal Distribution: Shows the mean shifts and the quality levels, and allows all three curves to be plotted between LL and UL for comparison.

  • Plot Controls

  • What can CATS 1-D users expect?

    Probably won't see:
      An end to poverty and misery
      All men treating each other as brothers
      World peace

    But you might notice:
      An increased understanding of the role of statistics in design
      Fewer problems on the factory floor
      Engineering and production talking to each other without shouting

    CATS 1-D XL Software

    CATS 1-D XL is an entry-level spreadsheet for 1-D tolerance stackup analysis of assemblies. It has been designed with tutorial capabilities, but it is powerful enough for engineers to use as a design tool or for manufacturing personnel to check the acceptability of production parts. Some of the features include:

    A. Four methods of tolerance stackup analysis:
       1) Worst Case (WC)
       2) Statistical (RSS), using 3σ tolerances; Normal or Uniform distributions.
       3) Motorola Six Sigma (6σ), using Cp and Cpk.
       4) Tolerance stackup using production part measurement data.
    B. It plots the resulting Normal distribution with the applied upper and lower limits. This is a live plot which changes when you alter the input data.
    C. It automatically calculates the rejects at the upper and lower limits and the predicted yield of acceptable assemblies, as a percent or in parts-per-million (ppm).
    D. No Monte Carlo simulation is required. All you need to input is the mean and 3-sigma variation for each dimension.
    E. Simple Excel 97 spreadsheet. No .dll files required. No macros to get corrupted.

    Manufactured parts are seldom used as single parts. They are used in assemblies of parts. But the dimensional variations which occur in each component part of an assembly accumulate statistically and propagate kinematically, causing the overall assembly dimensions to vary according to the number of contributing sources of variation. The resultant critical clearances and fits which affect performance are thus subject to variation due to the stackup of the component variations.

    Tolerances must be added to engineering drawings to limit variation. Dimensional tolerances limit component size variations. Geometric tolerances, defined by ANSI Y14.5, are added to further limit the form, location, or orientation of individual part features. Assembly tolerance specifications are added to limit the accumulation of variation in assemblies of parts to a level dictated by performance requirements.

    Tolerance analysis is a quantitative tool for predicting variation accumulation in assemblies. It brings the production capabilities and performance requirements together in a common engineering model, and it provides a common meeting ground where design and manufacturing can interact and quantitatively evaluate the effects of their requirements. Thus, it promotes concurrent engineering and provides a tool for improving performance and reducing cost.

    All manufacturing processes produce random variations in each dimension. If you measured each part and kept track of how many of the same size were produced, you could make a frequency plot, or histogram, showing how the size variation is distributed. Generally, most of the parts will be clustered about the mean, or average, value, causing the plot to bulge in the middle. The further you get from the mean, the fewer parts will be produced, causing the frequency plot to decrease to zero.

    A common statistical model used to describe random variations is shown above. It is called a Normal, or Gaussian, distribution. The mean of the distribution marks the highest point on the curve. The spread of the distribution is expressed by its standard deviation. The mean tells how close the process is to the target dimension; the standard deviation indicates the precision, or process capability. UL and LL mark the upper and lower limits of size, as set by the design requirements. If UL and LL correspond to the 3σ process capability, as shown, very few parts will be rejected (about 3 per 1000).

    In a linear or 1-D assembly stack, variation accumulates linearly. If you have measured data, you can calculate the mean and standard deviation directly, or obtain values from quality assurance data. However, it is critical to make estimates of assembly variation early in the design, so problems can be identified and corrected before any parts have been produced, or even before the tooling has been designed. A tolerance stackup analysis can identify which dimensions are contributing the most to variation in critical assembly features. To do this without parts to measure, it is commonly assumed that the tolerances on the drawing correspond to the 3σ process capability, that is, Ti = 3σi. This is a reasonable assumption, since it assumes the parts will be made to meet the specified tolerances.

    For each critical feature, a dimension chain must be identified. The chain represents those dimensions which stack up to control the assembly feature. Summing the average dimensions in the chain predicts the average feature dimension, and summing the dimension variations predicts the feature variation. Four models for tolerance stackup are commonly in use, as shown above. Each has its own advantages and disadvantages, and each has its own preferred applications.

    The four models and their applications are:

    Worst Case (WC): Computes the extreme worst-case limits by summing absolute values, the worst combination of over- and undersize parts. If all the parts are within part tolerance limits, there will be no rejected assemblies. For given assembly limits, WC requires the tightest part tolerances; thus, it is the most costly. There is no standard deviation, and no statistical distributions are considered.

    Statistical (RSS): Adds standard deviations by Root Sum Squares (RSS). Predicted limits are more reasonable. Predicts the statistical distribution of the assembly feature, from which percent rejects can be predicted. A Uniform distribution can be substituted for the Normal for a more conservative estimate. Can account for static mean shifts.

    Six Sigma (6σ): Basically an RSS approach, but assumes part tolerances are higher than 3σ to achieve higher quality levels. Accounts for dynamic mean shifts due to long-term drift in the process mean. The resultant target quality level is 4.5σ.

    Measured Data (Meas): An RSS approach using actual means and standard deviations from part measurements. Can account for static and dynamic mean shifts. No tricky factors are required, since any drifts or shifts are included in the measurements.

    Based on the results of the stackup analysis, predicted assembly variations which exceed design limits can be examined. The principal contributing dimensions can be identified, processes may be selected or changed, tooling can be designed, or the design may be modified to decrease variation.

    There are basically two methods for reducing the percent rejects in an assembly: mean shift analysis and variation analysis.

    The mean of the assembly distribution is seldom aligned with the center of the design limits, (UL + LL)/2. All it takes is one part in the stack to get off-center and the assembly will be thrown off. Typically, all the parts will have some degree of mean shift due to setup errors, tool wear, thermal expansion, etc. Some of the shifts may cancel each other, but generally there will be a net shift, requiring constant monitoring of the part means throughout a production run. Mean shift analysis is the only way to track it before assembling the parts. As seen above, if the distribution mean shifts off-center, more of the tail sticks out past the design limit, resulting in an increase in rejected assemblies.

    The solution sounds simple: find the offending part and fix it. This usually requires that you change one or more nominal dimensions on the parts to bring about the necessary change in the mean. But that is not always easy. It may be an expensive fixture or stamping die that is worn or out of spec. In that case, you may be able to change the mean dimension on some other part to compensate. This, of course, could cause problems with selling replacement parts some time in the future. Another trick you can do with mean shift analysis is simulate thermal expansion conditions to examine their effects on critical clearances. Thermal expansion is not a random variation; it must be accounted for in the calculation of the mean assembly dimensions.

    If the results of variation analysis predict too high a number of rejected assemblies, even after centering the distribution, the next step is to reduce the spread of the distribution. This will pull more of the tails inside the limits, reducing the number of rejects. It requires a reduction in the standard deviation. Since the assembly standard deviation is calculated by summing the standard deviations of the parts (by one of the four stackup models), the way to reduce the assembly spread is to tighten the tolerance on one or more part dimensions. Deciding which dimension in the assembly chain to tighten requires a knowledge of the manufacturing processes involved. It may be as simple as increasing the number of finish passes on a lathe or mill, or it may involve changing to a more rigid machine or switching to an entirely different process. Such decisions must be made by carefully evaluating the alternatives.

    The advantage of performing variation analysis early in the design is obvious. It is much better to identify and resolve production problems before you go into production. Don't wait until you are under the stress of meeting production quotas and shipment orders before you try to fix the production line. As you continue to refine your process controls, reducing the spread of each process distribution and keeping the processes centered, fewer and fewer rejects will occur. Fewer part rejects translates into fewer assembly rejects and fewer finished products winding up in the rework department. That means more product to sell at less cost, which leads to more competitive prices, increased sales, and higher profits. Companies that catch this vision of variation management are in a much stronger position to survive when times get tough.

    Examine the distribution plot in the above figure. Notice that the limits UL and LL are drawn at 6σ, instead of the traditional 3σ. Virtually none of the distribution tails extend beyond these limits. This is called a 6σ process, and it is the ultimate in product quality. Certainly, we would all like to be operating at this level.

    If the UL and LL are set at the ±3σ limits of the assembly distribution, the table above shows that 99.73% of the assemblies will be contained within the limits and will meet the design requirements. That leaves 0.27% rejected assemblies, or 2700 rejects per million (ppm). For UL and LL at ±4.5σ, the yield of good assemblies will be 99.99966%, with 3.4 ppm rejects. At ±6σ, the yield is nearly 100%, with only 2 per billion rejects! This may sound astonishing, but this Six Sigma quality level is actually the target quality level of many of today's major manufacturing corporations.

    However, there are a couple of misconceptions that can get you in trouble. Story: A certain company announced a company-wide goal of striving for 6σ for all of their processes. An obedient and ambitious engineer at the company proceeded to implement the plan. He gathered data on the standard deviation of a number of key characteristics, or critical features, on one of the products over which he had responsibility. Then he pulled the drawings and proceeded to increase the tolerance limits on those features to correspond to the 6σ limits. This done, he proudly announced that he had achieved 6σ.

    Story conclusion: Well, the boss was less than pleased. What went wrong? Had he not in fact achieved 6σ processes? Had he missed something? Indeed he had. The UL and LL limits on key characteristics are not arbitrarily chosen. They are design limits, set by the designer, based on design requirements. They are set at a level to assure proper performance of the final product, based on analysis and testing. UL and LL cannot be changed without an engineering evaluation. The only way to achieve a 6σ process is for the process itself to tighten up, reducing σ until (UL - LL) corresponds to a full 6σ capability. If you started with a 3σ process, you would have to reduce the process variation by a factor of two. This is very difficult to achieve corporate-wide. It normally requires high-level support within the company leadership, accompanied by the full support of all the employees; in short, a commitment of the entire manufacturing enterprise.

    Actually, this is oversimplified. Design requirements are applied to critical assembly features. The tolerances allowed, or specified, for the assembly feature are then flowed down to the part level, to those dimensions which chain together to control the assembly feature. Tolerances on these dimensions are then set by performing a tolerance analysis, to assure that the proposed part tolerances, when added, will not exceed the specified assembly tolerance. Part tolerances which do not contribute to critical assembly characteristics are chosen to match the process or are set to the default title block values.

    There is another common pitfall associated with high quality levels. If you do succeed in your efforts to achieve a 6σ process for a single part, it gives you a lot of wiggle room. The process mean has room to shift quite a bit without having a large impact on quality. It could, for example, shift a full 3σ, until the old 3σ limit of the process lies on top of UL, at which point you would have half of 2700 ppm rejects, or 1350 ppm. But this is deceptive! Let's examine the results. Since the process σ did not change, the tolerance stackup would show no change in the assembly σ. But a mean shift in one part translates directly into a mean shift in the assembly. Of course, the assembly σ is larger, since it represents a stackup of several parts. But the assembly mean would shift closer to its UL, resulting in an increase in rejects; not as big an increase, but an increase all the same.

    Now, suppose two parts experience mean shifts, or three or four. Your high quality level would become history. All your efforts to reduce process variation would come to naught. Of course, mean shifts are not always in the same direction. They could be self-canceling. But you get the picture. The point is, the high quality levels in the preceding table only apply to a centered process! To achieve high quality levels, you must monitor the mean shift as well as the variation. Mean shift accumulation can be just as damaging as variation accumulation.

    In the figure above, the UL and LL are set at ±6σ. The mean of the Normal distribution, originally centered, has been shifted 1.5σ to the right, leaving 4.5σ remaining. The spread of the distribution has been exaggerated to make the protruding tail more visible. This is a one-tailed limit, since the other tail contributes essentially zero rejects. Since the UL is 4.5σ from the mean, it produces half the rejects of a symmetric 4.5σ distribution (6.8 ppm), i.e., 3.4 ppm. Comparing this value with the centered distribution, the rejects increased from 2 per billion to 3.4 per million, nearly a factor of 2000, due to the mean shift.

    So, if we wanted to maintain a quality level of 4.5σ for a process that sometimes experiences as much as a 1.5σ mean shift, we would have to maintain a process quality of 6σ in order to account for possible mean shifts. This is why many major corporations are talking about 6σ processes. They are actually targeting quality levels of 4 to 4.5σ, but they must strive for processes near 6σ in order to account for mean shifts.
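    The centered-versus-shifted comparison above is easy to reproduce numerically; here is a minimal sketch, assuming ±6σ limits and a 1.5σ shift toward the upper limit.

```python
# Rejects of a 6-sigma process: centered versus shifted 1.5 sigma toward UL.
from statistics import NormalDist

phi = NormalDist().cdf

centered = 2 * (1 - phi(6.0))             # both tails, mean on center
shifted = (1 - phi(4.5)) + phi(-7.5)      # near tail at 4.5 sigma; far tail negligible

print(f"centered: {centered * 1e9:.2f} per billion")   # about 2 per billion
print(f"shifted:  {shifted * 1e6:.2f} per million")    # about 3.4 ppm
```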

    Two measures of process capability have come into common use in industry:

      Cp: Process Capability Index
      Cpk: Cp adjusted for mean shift

    Cp compares the 3σ process capability to the specified tolerance limits by taking the ratio of the tolerance limits to the process capability. If the 3σ limits of the process line up exactly with the UL and LL, Cp is 1.0; each tolerance then corresponds exactly to 3σ, the common assumption used in tolerance analysis. If the UL and LL align with 6σ, Cp is 2.0, which corresponds to a 6σ quality level. Cp is useful for indicating the quality level, but it does not consider the effects of mean shift.

    Cpk modifies the value of Cp to account for mean shift. As shown above, Cpk is just Cp multiplied by the factor (1 - k), where k is a fraction between 0 and 1. If the mean has shifted 25% of the tolerance, that is, 0.25 of the distance from the mean to UL or LL, then Cpk will be reduced to 75% of Cp. Just as Cp tells you how close the UL and LL limits are to the 3σ process capability for a symmetric distribution, Cpk tells you how close the nearest of UL or LL is for a non-symmetric (shifted) distribution.

    The Motorola Corporation developed the Six Sigma Program a couple of decades ago as a means of accounting for mean shift in large-volume, high-quality production. It involves the two measures of process capability above. Instead of estimating the standard deviation σi of the dimensional tolerances from Ti = 3σi, as in conventional RSS tolerance analysis, a modified form is used to account for higher quality level processes (Cp > 1.0): Ti = 3 Cpi σi. Thus, a 6σ process, which has a Cp = 2.0, would have tolerances of 6σ. A further modification is made to account for mean shifts, by substituting Cpk for Cp: Ti = 3 Cpki σi. Since Cpk is less than Cp, the estimated standard deviation σi = Ti / (3 Cpki) will be larger. This mean shift modification, applied to the variation rather than to the mean of the distribution, is called the Dynamic Mean Shift.

    For large-volume production, the mean of the process may drift due to tool wear, thermal expansion, multiple-cavity molds, etc. A machinist may deliberately set up a machine to produce a dimension on the low side, biased toward the Least Material Condition (LMC). Then, as the tool wears, less material is removed, causing the mean to drift to the high side, toward the Maximum Material Condition (MMC). He may have saved having to sharpen the tool so often, but the result is an increase in the spread of the distribution. Thermal expansion can cause the mean to drift in the opposite direction, but the result is the same: the peak of the distribution degrades, causing an increase in the spread.

    In a multi-cavity mold, each cavity cools at a different rate, causing each cavity to have its own mean and standard deviation. If all the output parts are dumped in the same bin and mixed, the distributions merge, averaging out the mean and standard deviation. Again, the spread is increased. (In extreme cases, it could develop a multi-modal distribution with multiple peaks.) If you took a snapshot of a high-volume distribution over a short interval of time, it would have a higher peak and a narrower spread. But over a longer time it would gradually spread out, as in the figure above. Thus, a 3σ process in the short term might degrade to a 2σ process, in which case the reject rate would increase from 2700 to 45,500 ppm. And if the mean is shifted, the reject rate would be even higher.

    Because a dynamic mean shift causes the quality level to degrade over time, high quality levels are difficult to maintain over the long term. If you desire a resultant quality level higher than 3σ, the short-term quality level must be significantly higher in order to compensate for the drift in the mean. The Motorola Six Sigma target is a 4.5σ quality level in the long term. To achieve this, they must maintain a short-term quality level of 6σ (Cp = 2.0). This assumes the mean is expected to drift up to one fourth of the tolerance (k = 0.25).

    Short term: Ti = 3 Cpi σi = 3 × 2.0 × σi = 6 σi

    Long term: Ti = 3 Cpki σi = 3 × 2.0 × (1 - k) σi = 4.5 σi

    Of course, if the mean drifts more than this (k > 0.25), the 4.5σ quality cannot be maintained. Note that the dynamic mean shift affects the spread, or standard deviation, of the process. If there is a static mean shift, or constant bias in the mean, the mean will be shifted off-center, causing an increase in rejects in addition to those caused by the increased spread. A static mean shift is caused by a setup error, tooling error, or other fixed bias in the process mean, while a dynamic mean shift is a gradual drift in the mean over time, which is modeled as an increase in the standard deviation.
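    A worked check of the short-term and long-term tolerance relations above, using the Motorola assumptions Cp = 2.0 and k = 0.25.

```python
# Short-term vs. long-term tolerance in sigma units, per the relations above.
cp, k = 2.0, 0.25
cpk = cp * (1 - k)            # 1.5

sigma_i = 1.0                 # take sigma_i = 1 for illustration
t_short = 3 * cp * sigma_i    # Ti = 6 sigma_i (short term)
t_long = 3 * cpk * sigma_i    # Ti = 4.5 sigma_i (long term)

print(t_short, t_long)        # 6.0 4.5
```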

    Microsoft Excel is a popular spreadsheet application for engineering designers. It is structured for multiple worksheets, with built-in graphing options for presenting results. Excel's built-in statistical functions cover many of those needed in tolerance analysis. Several of the most common are shown above.

    The Excel function NORMSDIST(z) returns the cumulative distribution function of the Standard Normal distribution, that is, the area under the curve from minus infinity to the point z. The total area under the curve is 1.0, representing 100% of the population. The area up to the value z represents the probability of a randomly selected value being less than or equal to z. It is used to estimate the fraction α of parts which will be produced with a size less than z (the yield, or good parts) and the fraction (1 - α) greater than z (the rejects). A Standard Normal distribution has a mean of zero and a standard deviation of 1.0. Any Normal distribution on x may be converted to a Standard Normal distribution on z by the transformation:

    z = (x - mean)/σ

    If the UL and LL of x are transformed, their resulting z scores may be used to estimate the fraction of good and bad parts. The accuracy of the NORMSDIST(z) function was determined by comparing α and 1 - α with an often-used 6th-order polynomial approximation and a 15-place Standard Normal table. The shaded values represent the number of digits of agreement between the three. The Excel function is more accurate than the polynomial, and it is sufficiently accurate for 6σ quality levels.

    The CATS 1-D spreadsheet is color-coded:
      Blue fields are for user input.
      White fields are intermediate calculations.
      Violet highlights report sections.
      Yellow highlights column labels.

    The first set of blue columns: input the basic dimensions and tolerances of the stack for WC and RSS tolerance stacks. The second set of blue columns: input the Cp and k data for 6σ analysis. The third set of blue columns: input the mean and σ of the measured dimensions and tolerances for tolerance stacks created from real data.

    The three methods of predicting tolerance stackup in CATS 1-D are presented side-by-side in a table, along with the results obtained from actual measured data. WC has no statistical basis, so it predicts only the mean and the worst-case limits about the mean. RSS and 6σ include estimates for additional statistical parameters.

    Xmean: All three predicted values are identical, because each was calculated from the same nominal part dimensions. A static mean shift can be included if off-nominal values are input for a what-if analysis. The measured data will include any static mean shift occurring in the real part data.

    Tasm: Each method has a different basis for estimating the assembly tolerance (variation about the mean). RSS assumes the part tolerances are 3σ, while 6σ assumes Ti = 3 Cpki σi, to account for the long-term effects of dynamic mean shift. Measured data uses the actual σi of the parts, summed by RSS.

    Cp and Cpk are calculated conventionally in all cases, but the σasm values are different for each case.