The heart of the matter is that variation exists in everything; a perfect process with no variation is a figment of the imagination. This is where statistical process control (SPC) comes into play. The first step in implementing SPC is to reduce the variation in the process; having brought the situation under control, the next step is to monitor the process to ensure the identified variation does not increase.

To visualize the variation found in a process, a common tool is the histogram.

The histogram is also known as a frequency distribution chart. Its bell-shaped curve is also referred to as a normal distribution, which will be discussed in further detail later in the article. So, what does this curve tell us about the process in question? The closer the points fall to the center, the less variation is occurring in the process. Conversely, the further out the points lie, the more variation is occurring. Points falling outside the distribution are considered outliers, which mandate immediate action to address and eliminate them. So, what are abnormal distributions?

A two-humped "camel" is called a bi-modal distribution. This indicates that something has changed in the process, causing the shift; the change could have been induced by a change in the raw material, the process flow, or even a shift change of operators.

A skewed distribution chart should also trigger alarm bells, as it indicates either a deviation in the process or that the upper/lower limits were set up incorrectly, leading to the curve leaning to one side.
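The distribution shapes discussed above can be explored with a quick text histogram. The sketch below is illustrative only: the data are simulated, and the bin count and symbol scaling are arbitrary choices.

```python
import random

# Hypothetical process data: 500 simulated measurements of a part
# dimension (mm). In practice these come from the shop floor.
random.seed(42)
data = [random.gauss(10.0, 0.05) for _ in range(500)]

# Bin the data and print a simple text histogram to visualize variation.
low, high, bins = min(data), max(data), 10
width = (high - low) / bins
counts = [0] * bins
for x in data:
    i = min(int((x - low) / width), bins - 1)  # clamp the max value into the last bin
    counts[i] += 1

for i, c in enumerate(counts):
    print(f"{low + i * width:7.3f} | {'#' * (c // 5)}")
```

A bell shape centered on 10.0 suggests a normal process; two clusters or a long tail in such a plot would point to the bi-modal or skewed cases described above.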

**Normal Curves**

The normal curve is one of the most common patterns of variation observed. This is the primary reason that statistical process control is pegged to this form of distribution and uses its associated statistical probabilities in computations.

The “68-95-99.7” rule applies to all normal curves, and states the following:

• 68% of process data fall within ±1 standard deviation of the mean

• 95% of process data fall within ±2 standard deviations of the mean

• 99.7% of process data fall within ±3 standard deviations of the mean
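The rule is easy to verify empirically. This sketch simulates normally distributed data (the mean and standard deviation chosen here are arbitrary) and counts the fraction of points inside each band:

```python
import random

# Empirical check of the 68-95-99.7 rule on simulated normal data.
random.seed(1)
mu, sigma, n = 0.0, 1.0, 100_000
data = [random.gauss(mu, sigma) for _ in range(n)]

def within(k):
    """Fraction of points within +/- k standard deviations of the mean."""
    return sum(abs(x - mu) <= k * sigma for x in data) / n

for k, expected in [(1, 0.68), (2, 0.95), (3, 0.997)]:
    print(f"±{k} sd: {within(k):.3f} (rule says ≈{expected})")
```

With 100,000 samples the observed fractions land very close to 0.68, 0.95 and 0.997.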

What is standard deviation?

It is an effective statistical measure of how the data cluster around the mean, and thus of the variation: the smaller the s, the less variation is present in the process. So, the question now is what constitutes a stable process. The general criteria are that, whilst variation is permissible, it must fall into a constrained pattern, with minimal changes, and outliers/unusual points must not be present in the data set. Once a stable process is attained, the result is a predictable output.
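The point that a smaller s means less variation can be shown with two invented data sets sharing the same mean but different spread:

```python
import statistics

# Two hypothetical samples with the same mean but different spread:
# the smaller the standard deviation s, the less process variation.
tight = [9.98, 10.01, 10.00, 9.99, 10.02]
loose = [9.80, 10.25, 9.95, 10.30, 9.70]

for name, sample in [("tight", tight), ("loose", loose)]:
    print(name,
          "mean =", round(statistics.mean(sample), 3),
          "s =", round(statistics.stdev(sample), 3))
```

Both processes are centered on 10.00, but only the first would look stable on a control chart.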

What are the main root causes of an unstable process? Surprisingly, the findings from many industrial use cases show that this comes from over-adjustment. What is the yardstick or basis of measurement used to judge the instability? Another key component in the overall SPC scheme of things is measurement system analysis (MSA). The process may actually be stable, but due to a flawed measurement system, it may look unstable.

**Control Charts**

Control charts serve to flag when the process is out of control so that engineers can intervene to correct the situation; the chart itself does not control the process.

MSA requires a clear understanding of, and the ability to quantify, the uncertainties involved in collecting the measurements. Uncertainties are classified into Type A and Type B. Type A uncertainties can be quantified by statistics such as linearity or repeatability. Type B uncertainties fall in the area where engineering judgment must be applied. An example of a Type B uncertainty is a jig used to hold the sample for the required measurements: the uncertainty comes in the form of how repeatably the part can be placed onto the jig.

Looking at the control chart, the key components are the center line and the upper / lower control limits.

- The center line of the chart indicates the process mean.
- The upper and lower control limits indicate the boundaries of variation of the process, usually ±3s.

**Types of Control Charts**

Control charts can be further subdivided into two groups: variable and attribute control charts. Variables can be measured (temperature, length), whilst attributes are counted (a part is good or bad).

In the case of the variable control chart, two charts are used in tandem: the average (X-bar) and range (R) charts. The X-bar chart monitors the process center, and the R chart the overall variation in the process.
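The X-bar/R limits are conventionally computed from subgroup statistics using tabulated constants. The sketch below assumes subgroups of size 5 and uses the standard constants for that size (A2 = 0.577, D3 = 0, D4 = 2.114); the subgroup data are invented for illustration.

```python
import statistics

# X-bar / R chart limits for subgroups of size 5, using the standard
# tabulated constants for n = 5.
subgroups = [
    [5.02, 4.98, 5.01, 5.00, 4.99],
    [4.97, 5.03, 5.00, 5.02, 4.98],
    [5.01, 4.99, 5.00, 4.97, 5.03],
    [5.00, 5.02, 4.96, 5.01, 4.99],
]
A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [statistics.mean(g) for g in subgroups]
ranges = [max(g) - min(g) for g in subgroups]
xbarbar = statistics.mean(xbars)   # grand mean -> X-bar chart center line
rbar = statistics.mean(ranges)     # average range -> R chart center line

print(f"X-bar chart: CL={xbarbar:.4f}  UCL={xbarbar + A2 * rbar:.4f}  "
      f"LCL={xbarbar - A2 * rbar:.4f}")
print(f"R chart:     CL={rbar:.4f}  UCL={D4 * rbar:.4f}  LCL={D3 * rbar:.4f}")
```

The two charts are read together: a drift on the X-bar chart signals a shift in the process center even while the R chart shows the spread unchanged.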

For the attribute charts, there are 4 types: the p chart, np chart, c chart and u chart.
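As a concrete instance of an attribute chart, here is a minimal p-chart sketch for the fraction of defective units, assuming a constant sample size; the inspection counts are invented for illustration.

```python
import math

# p-chart: fraction-defective control limits for a constant sample size n.
n = 200                                   # units inspected per sample
defectives = [8, 11, 6, 9, 14, 7, 10, 12, 5, 9]
p = [d / n for d in defectives]           # fraction defective per sample
pbar = sum(defectives) / (n * len(defectives))   # center line

sigma_p = math.sqrt(pbar * (1 - pbar) / n)
ucl = pbar + 3 * sigma_p
lcl = max(0.0, pbar - 3 * sigma_p)        # a fraction cannot go below zero

print(f"CL={pbar:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}")
print("out-of-control samples:", [x for x in p if x > ucl or x < lcl])
```

The np, c and u charts follow the same pattern with different count statistics and limit formulas.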

**Process Capability**

In simple terms, process capability compares the process output against the customer’s specifications. All the control charting discussed earlier merely monitors variability in the process; the customer is not in the equation yet.

• The Process Capability Index (Cp) must be ≥ 1.33 for the process to be considered capable

• Cpk looks at both the capability and how centered the process in question is, and is based on whichever of the upper and lower capability values is smaller
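These two indices can be sketched as follows, using the standard definitions Cp = (USL − LSL)/6σ and Cpk = min of the upper and lower one-sided values; the specification limits and data here are hypothetical.

```python
import statistics

# Cp and Cpk against hypothetical customer specification limits.
LSL, USL = 9.7, 10.3    # customer's lower / upper specification limits
data = [10.02, 9.98, 10.05, 9.95, 10.01, 10.04, 9.97, 10.00, 10.03, 9.99]

mu = statistics.mean(data)
sigma = statistics.stdev(data)

cp = (USL - LSL) / (6 * sigma)    # spec width vs. process spread
cpu = (USL - mu) / (3 * sigma)    # room above the mean
cpl = (mu - LSL) / (3 * sigma)    # room below the mean
cpk = min(cpu, cpl)               # penalizes an off-center process

print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  capable: {cp >= 1.33 and cpk >= 1.33}")
```

Cpk can never exceed Cp; the gap between the two shows how much capability is lost to an off-center process.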

**ABOUT THE AUTHOR**

**John Yik**

(John has over 25 years’ experience in the semiconductor manufacturing industry. His experience covers IC and MEMS manufacturing, spanning a broad spectrum from photolithography to eutectic bonding and metrology. He is ACTA certified and presently a freelance technical trainer, focusing on semiconductor manufacturing, FMEA, SPC and Industry 4.0. Singapore)