The Lecture Notes Blog

Tag Archives: Variability

The two dimensions of quality


There are two basic dimensions of quality: Performance quality measures to what extent a product or service meets the expectations of the customer. Conformance quality measures whether processes are carried out the way they were intended to be carried out.

The root cause of quality problems is process variability. Were it not for process variability, every run through a process would result in the optimal output or in the very same error, which would then be easy to detect. However, due to process variability, some runs through a process result in optimal outcomes while others result in different kinds of errors. With some basic probability tools, we can assess the chances of such errors and defects occurring during a process. To calculate total error probabilities for an assembly line, one has to look at the error rate of each work step and calculate its yield (the percentage of error-free flow units the work step produces).

The yield of the process is defined as the percentage of error-free parts produced by the process – which of course depends on the yields of the individual work steps. The total process yield is simply the product of the individual yields:

process yield = yield_1 * yield_2 * … * yield_n

It is noteworthy that even small defect probabilities can accumulate to a significant error rate if there are many steps in a process. For example, if a process workflow consists of 10 steps, with every step having a low defect probability of only 1%, the chance of a completely error-free product leaving this workflow is only 0.99^10 ≈ 90.4%.
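
The calculation from this example can be sketched in a few lines of Python (the step count and the 1% defect rate are simply the illustrative numbers from above, not course data):

```python
# Total process yield as the product of the individual step yields.
n_steps = 10
defect_probability = 0.01              # assumed: 1% defect rate per step

step_yield = 1.0 - defect_probability
process_yield = step_yield ** n_steps  # yield_1 * ... * yield_n with identical steps

print(f"process yield: {process_yield:.1%}")  # ~90.4%
```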

The Swiss Cheese model explains why defects or procedural errors sometimes go unnoticed even if highly effective quality checks are in place: Since every slice of Swiss cheese has some holes (defects) in it, there is a small probability that the holes will line up in a way that creates a hole through an entire stack of cheese slices. This is akin to multiple quality checks failing during the production of the same flow unit – though the chances of this happening might be low, it is bound to happen from time to time. This insight is also the main reason behind redundant checks, i.e. checking a quality attribute more than once to catch all errors that might occur. With redundancy, a process has to fail at multiple stations in order for the process yield to be affected.
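
A small numerical illustration of the redundancy argument (the miss probabilities below are made up, not taken from the lecture): a defect only slips through if every single check misses it, so the slip-through probability is the product of the individual miss probabilities.

```python
# Probability that a defect slips past several independent quality checks
# ("Swiss Cheese" holes lining up) -- all checks have to miss it at once.
miss_probabilities = [0.05, 0.05, 0.05]  # assumed: each check misses 5% of defects

slip_through = 1.0
for p_miss in miss_probabilities:
    slip_through *= p_miss

print(f"P(defect passes all checks): {slip_through:.4%}")  # 0.0125%
```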

These lecture notes were taken during the 2013 installment of the MOOC “An Introduction to Operations Management”, taught by Prof. Dr. Christian Terwiesch of the Wharton Business School of the University of Pennsylvania, on Coursera.org.

The concept of responsiveness


Responsiveness is the ability of a system or process to complete tasks within a given time frame: for example, how quickly can a business respond to customer demand? If customers are made to wait, they are turned into inventory, potentially resulting in an unpleasant customer experience. Any customer waiting time is also an indicator of a mismatch between supply and demand.

Concepts for solving waiting time problems include increasing the capacity of the bottleneck resource as well as increasing process flexibility in order to ensure that capacity is available at the right time. It has to be kept in mind, however, that waiting times are most often driven not by the capacity or the flexibility of a process but by its variability. Variability in the process flow (e.g. customers arriving at random) can lead to unwanted waiting times even when the implied utilization is clearly below 100%. If an analysis relies solely on averages and fails to consider process variability, it can thus wrongly conclude that there is no waiting time when, in fact, there is.

To solve this problem, new analysis methods are needed when dealing with process variability. It is noteworthy that those methods are only required when a process has more capacity than demand – if demand exceeds capacity, it can be safely concluded that there will be waiting time even without looking at the process variability.
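
To see how variability alone creates waiting time, here is a minimal single-server queue simulation in Python (the arrival and service rates are assumptions chosen for illustration, not figures from the course); it uses the Lindley recursion to track each customer's wait:

```python
import random

random.seed(42)

ARRIVAL_RATE = 0.8   # customers per minute (assumed)
SERVICE_RATE = 1.0   # customers per minute -> utilization = 80%
N_CUSTOMERS = 100_000

wait = 0.0           # waiting time of the current customer
total_wait = 0.0
for _ in range(N_CUSTOMERS):
    total_wait += wait
    service = random.expovariate(SERVICE_RATE)       # this customer's service time
    interarrival = random.expovariate(ARRIVAL_RATE)  # time until the next arrival
    wait = max(0.0, wait + service - interarrival)   # Lindley recursion

print(f"average wait in queue: {total_wait / N_CUSTOMERS:.2f} minutes")
# Positive waiting time despite utilization well below 100%
# (the theoretical M/M/1 value here is 0.8 / (1.0 - 0.8) = 4 minutes).
```

Even at 80% utilization the simulated customers wait roughly four minutes on average – exactly the effect that an averages-only analysis would miss.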

These lecture notes were taken during the 2013 installment of the MOOC “An Introduction to Operations Management”, taught by Prof. Dr. Christian Terwiesch of the Wharton Business School of the University of Pennsylvania, on Coursera.org.

How does product variety affect distribution systems?


The more demand is divided into smaller and smaller segments, the harder it is to predict. With more product variety, demand for each individual product becomes more variable as well. Thus, statistical indicators such as the mean and the standard deviation are needed to cope with this so-called variability of demand. If independent demand streams are combined, the standard deviation of the combined demand grows more slowly than the sum of the standard deviations of the individual demands. Such an aggregation of demand is called demand pooling and is an important method for reducing statistical uncertainty. Reductions in uncertainty can also be achieved through variance reduction or stock building.
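
A quick numerical sketch of demand pooling (the demand figures below are invented for illustration): for independent streams, the pooled standard deviation is the square root of the sum of the variances, which is smaller than the sum of the individual standard deviations.

```python
import math
import random

random.seed(1)
N = 100_000

# Two independent demand streams (means and standard deviations are assumed).
sigma_a, sigma_b = 20.0, 30.0
demand_a = [random.gauss(100, sigma_a) for _ in range(N)]
demand_b = [random.gauss(150, sigma_b) for _ in range(N)]
pooled = [a + b for a, b in zip(demand_a, demand_b)]

mean_pooled = sum(pooled) / N
sd_pooled = math.sqrt(sum((x - mean_pooled) ** 2 for x in pooled) / N)

print(f"sum of individual std devs: {sigma_a + sigma_b:.1f}")  # 50.0
print(f"std dev of pooled demand  : {sd_pooled:.1f}")          # ~36.1 = sqrt(20^2 + 30^2)
```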

These lecture notes were taken during the 2013 installment of the MOOC “An Introduction to Operations Management”, taught by Prof. Dr. Christian Terwiesch of the Wharton Business School of the University of Pennsylvania, on Coursera.org.