What is Design of Experiments (DOE)?

Design of Experiments (DOE) is a structured approach for varying process and/or product factors (x’s) and quantifying their effects on process outputs (y’s), so that those outputs can be controlled to optimal levels.
DOE deals with identifying the critical factors, their response variables, and the magnitude of the response at each level of the critical factors. DOE is also used to understand the interactions between the critical factors, so that the right combination of factor settings can be chosen to obtain the best response.
DOE is used to build the transfer function, the mathematical model relating the factors to the response, which can then be used to optimize the response variable.
A DC motor manufacturer might wish to understand the effects of two process variables, wire tension and trickle resin volume, on motor life. In this case, a simple two-factor (wire tension and trickle resin volume), two-level (low and high values established for each factor) experiment would be a good starting point. Randomizing the order of trials in an experiment can help prevent false conclusions when other significant variables, not known to the experimenter, affect the results. There are a number of statistical tools available for planning and analyzing designed experiments.
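As a rough illustration, the Python sketch below sets up such a two-factor, two-level experiment: it lists the four factor combinations, randomizes the run order, and fits a simple transfer function with main effects and an interaction term. The motor-life values shown are assumed purely for the example.

```python
import itertools
import random

import numpy as np

# Hypothetical 2-factor, 2-level full factorial: wire tension and trickle
# resin volume, each coded as -1 (low) and +1 (high).
levels = [-1, 1]
design = list(itertools.product(levels, levels))  # 4 factor combinations

# Randomize the run order to guard against unknown lurking variables.
run_order = design[:]
random.shuffle(run_order)
print("Randomized run order:", run_order)

# Illustrative motor-life responses (hours), listed in the standard order of
# `design`; real values would come from the actual experimental runs.
life = np.array([1520.0, 1610.0, 1580.0, 1790.0])

# Fit the transfer function y = b0 + b1*x1 + b2*x2 + b12*x1*x2.
x1 = np.array([d[0] for d in design], dtype=float)
x2 = np.array([d[1] for d in design], dtype=float)
X = np.column_stack([np.ones(4), x1, x2, x1 * x2])
b0, b1, b2, b12 = np.linalg.lstsq(X, life, rcond=None)[0]
print(f"Fitted coefficients: b0={b0:.1f}, b1={b1:.1f}, b2={b2:.1f}, b12={b12:.1f}")
```

With only four runs the model fits the data exactly; a real study would add replicates or center points so that experimental error can be estimated.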


Originally posted 2013-09-29 16:46:00.

Sixth Principle of SPC – Causes of Variation

According to the sixth principle of SPC, a frequency distribution will deviate from the normal distribution only in the presence of an assignable cause.
A frequency distribution is a tally of measurements that shows how many times each measured value occurs. From this frequency distribution we can see whether only chance causes are present in the process, or whether assignable causes are also acting.
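As a small illustration, the sketch below builds such a tally from a handful of assumed shaft-diameter readings; all values are hypothetical.

```python
from collections import Counter

# Hypothetical shaft-diameter measurements (mm), rounded to 0.01 mm.
measurements = [10.02, 10.01, 10.03, 10.02, 10.02, 10.04, 10.01,
                10.02, 10.03, 10.02, 10.05, 10.00, 10.02, 10.03]

# The frequency distribution: how many times each value appears in the tally.
tally = Counter(measurements)
for value in sorted(tally):
    print(f"{value:.2f} mm  {'#' * tally[value]}  ({tally[value]})")
```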
If the distribution is distorted from the normal curve, we can conclude that assignable causes are present. This finding can actually help us find those causes and address them.
The presence of assignable causes tends to distort the shape, the center, or the spread of the distribution, as seen earlier. This indication forms the basis of the techniques used in Statistical Process Control.
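One simple way to flag such a distortion is a normality test. The sketch below applies SciPy's Shapiro-Wilk test to simulated data: a stable process, and the same process with an assumed shift from an assignable cause. The data, the shift size, and the 0.05 threshold are all illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Output of a stable process: only chance causes, roughly normal around 50.0.
stable = rng.normal(loc=50.0, scale=0.5, size=100)

# The same process after an assumed assignable cause (e.g. tool wear)
# shifts the second half of the parts upward.
shifted = np.concatenate([stable[:50], stable[50:] + 1.5])

for name, sample in [("stable", stable), ("shifted", shifted)]:
    stat, p = stats.shapiro(sample)
    verdict = "possible assignable cause" if p < 0.05 else "only chance causes likely"
    print(f"{name}: Shapiro-Wilk p = {p:.3f} -> {verdict}")
```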

Originally posted 2012-05-02 01:31:00.

Basic Principles of Statistical Process Control

The six principles below are the basic principles of Statistical Process Control (SPC). They can be clearly understood using frequency distributions.
The principles are listed below; each one is linked to its detailed explanation.

  1. No two things are exactly alike
  2. Variation in a product or process can be measured
  3. Things vary according to a definite pattern
  4. Whenever things of the same kind are measured, a large group of the measurements will tend to cluster around the middle.
  5. It is possible to determine the shape of the distribution curve for the measured output (parts produced, transactions) of any process.
  6. Variation due to assignable causes tends to distort the normal distribution curve.
To understand these principles better, we can study data from the output of a process. These foundational principles are useful for all types of processes.
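For example, the short sketch below simulates the measured output of a stable process and prints a text histogram; the data are assumed, but they make the first five principles visible: no two parts are exactly alike, the variation can be measured, and the measurements cluster around the middle in a definite pattern.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated output of a stable process: 100 measured parts (principle 2:
# variation can be measured).
parts = rng.normal(loc=25.0, scale=0.1, size=100)

# Principle 1: no two things are exactly alike.
print("Distinct values among 100 parts:", len(np.unique(parts)))

# Principles 3-5: the measurements vary in a definite pattern and cluster
# around the middle, which a simple text histogram makes visible.
counts, edges = np.histogram(parts, bins=9)
for count, left, right in zip(counts, edges[:-1], edges[1:]):
    print(f"{left:6.2f}-{right:6.2f}  {'#' * count}")
```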

Originally posted 2012-02-27 01:53:00.