Free Online Tolerance Stack-Up Analysis Calculator
Perform tolerance stack-up analysis using arithmetic, probabilistic, and Monte Carlo methods.
What is Tolerance Stack-Up Analysis?
Tolerance stack-up analysis is a fundamental process in mechanical design used to study the accumulation of variation in an assembly. Each component of a product has dimensional tolerances (a permitted variation in its size). The analysis calculates the combined effect of these tolerances to predict the final dimensional variation of the assembly, ensuring that parts fit together correctly and the product functions as intended. A tolerance stack-up analysis lets you assess the effect of dimensional and geometric errors before manufacturing, so potential defects can be fixed at low cost and in little time.
Brief History
The concept of interchangeable parts dates back to the Industrial Revolution, but the systematic analysis of their accumulation gained momentum in the 20th century, especially with the rise of mass production. The need to predict and control variation became critical. Walter A. Shewhart's work in the 1920s on Statistical Process Control (SPC) laid the groundwork for probabilistic methods. Later, the development and standardization of Geometric Dimensioning and Tolerancing (GD&T) provided a precise language for defining tolerances, making analyses more robust and reliable.
How This Tool Works (Methods)
This tool performs tolerance stack-up analysis using two primary methodologies: Arithmetic and Statistical. The Statistical method is chosen dynamically between the classic Root Sum Square (RSS) method and a Monte Carlo simulation, based on the distribution types you select for the components.
1. Arithmetic Method (Worst-Case)
This is the simplest and most conservative method, assuming each component can lie anywhere within its tolerance limits. It calculates the absolute maximum and minimum possible assembly dimensions by summing all component tolerances at their most unfavorable condition. The worst-case method requires inspection of each part. It is suitable for low-volume, very-high-value products, such as jet engines.
- Advantage: Guarantees that 100% of assemblies will meet specifications.
- Disadvantage: Leads to the tightest, most expensive component tolerances, as the "worst-case" is statistically unlikely.
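The worst-case calculation above can be sketched in a few lines. This is a minimal illustration, not this tool's implementation; the three dimensions and their signs form a hypothetical gap stack chosen for the example.

```python
# Worst-case (arithmetic) stack-up: nominal values sum with their
# direction signs, but tolerances always add in magnitude - they
# never cancel, which is what makes the method conservative.
# Hypothetical gap stack: (nominal, +/- tolerance, direction).
dims = [
    (50.0, 0.10, +1),   # housing length (positive direction)
    (20.0, 0.05, -1),   # part A (subtracts from the gap)
    (29.8, 0.05, -1),   # part B (subtracts from the gap)
]

nominal = sum(n * d for n, t, d in dims)
tolerance = sum(t for _, t, _ in dims)  # straight sum, worst case

print(f"gap = {nominal:.2f} +/- {tolerance:.2f}")
print(f"max = {nominal + tolerance:.2f}, min = {nominal - tolerance:.2f}")
```

With these numbers the nominal gap is 0.20 with a worst-case spread of ±0.20, so the minimum possible gap is 0.00: the parts are guaranteed to fit, but only just.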
2. Statistical Method (RSS or Monte Carlo Simulation)
This method uses component distribution types to predict a more realistic variation range for the assembly. It assumes that each machine or process operates within its specification limits (i.e., the machine performs according to its manufacturer's specifications, with no mean shift or abnormal conditions while making parts). The calculation method is chosen as follows:
- If all components are set to a Normal distribution, the method defaults to the highly efficient Root Sum of Squares (RSS). RSS statistically sums tolerances, assuming deviations in one direction are offset by deviations in the other.
- If any component is set to a non-Normal distribution (Weibull, Triangular, Homogeneous, Lognormal, Beta, or Exponential), the tool automatically switches to a Monte Carlo Simulation. This uses computational power to simulate the assembly thousands of times, generating a complete statistical distribution of the possible assembly outcomes.
- This tool calculates the standard deviation (σ) based on your input (Tolerance, σ value, or Cpk) to define the process spread for each component.
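The RSS branch above can be illustrated with a short sketch. It assumes, as is a common convention, that each ± tolerance spans three standard deviations; the dimensions are the same hypothetical gap stack used for illustration, not data from this tool.

```python
import math

# RSS stack-up sketch. Assumed convention: +/- tolerance = 3 sigma,
# so sigma_i = tol_i / 3 (equivalently sigma = tol / (3 * Cpk) for a
# centered process with Cpk = 1). Hypothetical gap stack:
# (nominal, +/- tolerance, direction).
dims = [
    (50.0, 0.10, +1),
    (20.0, 0.05, -1),
    (29.8, 0.05, -1),
]

nominal = sum(n * d for n, t, d in dims)
# Standard deviations add in quadrature (root sum of squares).
sigma = math.sqrt(sum((t / 3) ** 2 for _, t, _ in dims))
rss_tol = 3 * sigma  # assembly tolerance quoted at the same 3-sigma level

print(f"gap = {nominal:.3f} +/- {rss_tol:.3f} (sigma = {sigma:.4f})")
```

Note that the same stack gives an RSS tolerance of about ±0.122 versus ±0.20 worst-case: statistically summing the tolerances yields a tighter, more realistic spread because all three parts are unlikely to be at their extremes simultaneously.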
3. Component Distributions & Manufacturing Relevance
Selecting the correct distribution is crucial for accurate analysis, as it models how a process actually performs:
Normal (Gaussian)
The standard model for most well-controlled processes (e.g., CNC machining, turning). It assumes variation is the sum of many small, independent errors, resulting in a symmetrical, bell-shaped curve centered on the nominal mean. Relevance: Used when the manufacturing process is stable and capable (in statistical control).
Homogeneous (Uniform)
Assumes that every value within the tolerance range has an equal probability of occurring. Relevance: Highly conservative. Often used to model purchased components that are checked only at the boundary limits (Go/No-Go gauging), or where process data is unavailable, leading to a worst-case risk assessment.
Triangular
Assumes that the nominal target (mode) is the most probable value, with probability decreasing linearly towards the minimum (A) and maximum (B) limits. Relevance: Common when operators try to manually center the process, or when stacking two independent uniform distributions (e.g., a short assembly chain).
Weibull
A flexible distribution defined by shape (β), scale (η), and often a location (γ) parameter. While commonly used for reliability/life analysis, it models dimensional distributions that are skewed or non-symmetrical, often near a process boundary or for materials with inherent non-linear variation. Relevance: Useful for modeling wear, grinding processes, or thin film deposition where variation might be heavily skewed.
Lognormal
A distribution where the logarithm of the variable follows a normal distribution, resulting in a right-skewed shape. Values cannot be negative, making it ideal for strictly positive dimensions when shifted with a location parameter. Relevance: Common in processes with multiplicative errors such as coating thickness, particle size in powder metallurgy, electroplating depth, or chemical vapor deposition. Also models fatigue life and material strength properties.
Beta
A highly flexible distribution bounded between a minimum and maximum, defined by shape parameters (α, β). Can model symmetric, left-skewed, right-skewed, or U-shaped distributions. Relevance: Useful for bounded processes with known physical limits, such as percentage-based measurements (e.g., material composition, hardness indices), surface finish within specification limits, or when expert judgment suggests variation is not normally distributed but constrained.
Exponential
Models the time or distance between independent events occurring at a constant average rate, characterized by a memoryless property and strong right skew. Needs a location parameter for dimensional analysis. Relevance: Primarily used in reliability engineering for modeling failure rates or time-to-failure for components with constant hazard rates. In dimensional analysis, it can represent the distribution of defect spacing, inter-arrival times in production lines, or processes where most values cluster near a minimum value with a long tail.
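When components follow different distributions, the Monte Carlo branch samples each one and accumulates the results. The sketch below mixes normal, uniform, and triangular components from the catalogue above using only the standard library; the specific dimensions and seed are illustrative assumptions, not this tool's internals.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

N = 100_000  # number of simulated assemblies

def sample_gap():
    """Draw one assembly: each component sampled from its own distribution."""
    housing = random.gauss(50.0, 0.10 / 3)                      # Normal, tol ~ 3 sigma
    part_a = random.uniform(20.0 - 0.05, 20.0 + 0.05)           # Homogeneous (uniform)
    part_b = random.triangular(29.8 - 0.05, 29.8 + 0.05, 29.8)  # Triangular, mode at nominal
    return housing - part_a - part_b

gaps = [sample_gap() for _ in range(N)]
mean = sum(gaps) / N
sigma = (sum((g - mean) ** 2 for g in gaps) / (N - 1)) ** 0.5

print(f"mean = {mean:.4f}, sigma = {sigma:.4f}")
print(f"observed range: {min(gaps):.4f} to {max(gaps):.4f}")
```

From the simulated sample you can then report the mean, standard deviation, and extreme values, or estimate percentiles and the fraction of assemblies outside specification, which is exactly the kind of output an analytical RSS formula cannot provide for non-normal inputs.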
Dimensions
No dimensions added yet
Click "Add Dimension" below to start building your tolerance stack-up analysis
Stack-up Visualization
Analysis Results
Arithmetic Method (Worst-Case)
Nominal value: -
Tolerance: -
Maximum: -
Minimum: -
Probabilistic Method (RSS)
Nominal value: -
Standard deviation: -
Tolerance: -
Maximum: -
Minimum: -
Monte Carlo Results
Nominal value: -
Standard deviation: -
Tolerance: -
Maximum: -
Minimum: -