What are the 7 Basic Quality Tools

These are the most commonly used basic tools for solving quality-related problems. They are suitable for people with little or no formal training in statistics. These seven basic graphical techniques help in solving the vast majority of quality problems.
The history of these tools is interesting.
In the 1950s, just after the Second World War, Japan was concentrating on rebuilding. One of the initiatives was an invitation by JUSE (the Japanese Union of Scientists and Engineers) to the legendary American quality guru W. Edwards Deming to train hundreds of Japanese engineers, managers and scholars in statistical process control. Throughout the hundreds of lectures Deming delivered, the emphasis was on the basic tools available for process control.
Taking a cue from these lectures, Kaoru Ishikawa, at the time an associate professor at the University of Tokyo and a member of JUSE, developed these tools. His chief desire was to democratize quality: he wanted to make quality control comprehensible to all workers, and, inspired by Deming’s lectures, he formalized the Seven Basic Tools of Quality Control. He believed that 90% of a company’s problems could be improved using these seven tools, and that they could easily be taught to any member of the organization. This ease of use, combined with their graphical nature, makes statistical analysis interesting and accessible to everyone.
These are listed below.

  1. Check Sheet – A generic tool that can be used for the collection and analysis of data. It is a structured, prepared form that can be adapted to a wide variety of issues.
  2. Control Chart – A graphical technique used to study how a process changes over time.
  3. Pareto Chart – Another graphical technique, used to identify the relative significance of individual factors (see the sketch after this list).
  4. Scatter Diagram – Used to identify the relationship between variables by plotting pairs of numerical data, with one variable on each axis. If the variables are related, the points will fall along a line or curve.
  5. Cause and Effect Diagram (also called an Ishikawa or Fishbone diagram) – Used to structure brainstorming sessions and to sort ideas into useful categories. Many possible causes are identified for a stated problem, and their effect on the problem is examined.
  6. Flow Chart (Stratification Chart) – Used to identify patterns within data collected from multiple sources and clubbed together. It helps make sense of a large body of data by revealing the patterns within it.
  7. Histogram – Looks very much like a bar chart; it is used to show the frequency of occurrence of each value of a variable in a set of data.
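As a quick illustration of one of these tools, here is a minimal Pareto chart sketch in Python. It assumes matplotlib is installed, and the defect categories and counts are made-up values for illustration only.

```python
# Minimal Pareto chart sketch using matplotlib (illustrative defect counts only).
import matplotlib.pyplot as plt

# Hypothetical defect categories and counts, sorted in descending order.
defects = {"Scratches": 52, "Dents": 31, "Misalignment": 12, "Cracks": 7, "Other": 4}
categories = list(defects.keys())
counts = list(defects.values())

# Cumulative percentage line shows which few categories dominate the total.
total = sum(counts)
cumulative = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(categories, counts)
ax1.set_ylabel("Defect count")

ax2 = ax1.twinx()
ax2.plot(categories, cumulative, marker="o", color="red")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

plt.title("Pareto chart of defect categories")
plt.tight_layout()
plt.show()
```

The same few lines, with different data, also cover the histogram: dropping the cumulative line and plotting raw measurement frequencies gives tool number 7.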

Basic Statistics – III – Introduction to Probability

Probability is a simple concept, but a powerful one when applied correctly. A very simple example: when a coin is tossed, there are two possible outcomes, head and tail. If the coin has no bias, both outcomes are equally likely, so we say each has a probability of 50% (0.5). Let us extend this to another commonly used game. If we throw a die, one of six outcomes is possible: the die has six sides, numbered 1 to 6. Here each side has a probability of 1/6, i.e. a 16.67% (0.1667) chance. In both cases, each outcome is an event, and its chance of occurrence is called its probability.
There are many definitions of probability.
The simplest is “the measure of the likelihood of an occurrence”. The classical definition of probability is stated below.
If there are “n” exhaustive, mutually exclusive and equally likely events, and “m” of them are favorable to an event “E”, then the probability of occurrence of event “E”, denoted by Pr[E], is

Pr[E] = m/n
Here the n events need to fulfill three conditions: 1 – mutually exclusive (the events cannot occur together, e.g. head and tail of the same coin toss); 2 – collectively exhaustive (all possible events are taken into account, e.g. for the coin there are exactly 2); and 3 – equally likely (there is no bias towards any event). For example, the probability of rolling an even number with a fair die is Pr[E] = 3/6 = 0.5, since three of the six equally likely outcomes are favorable.
Statistical (empirical) definition of probability:
If an experiment is repeated many times under identical conditions, then the limit of the ratio of the number of times an event happens (m) to the total number of trials (n), as the number of trials increases indefinitely, is called the probability of the event.
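A minimal simulation sketch of this empirical definition is shown below, using Python’s standard random module. The trial counts are arbitrary choices for illustration.

```python
# Illustrative sketch of the empirical definition: the relative frequency of an
# event approaches its classical probability as the number of trials grows.
import random

def empirical_probability(trials: int) -> float:
    """Estimate P(rolling a six) by repeating the experiment `trials` times."""
    hits = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
    return hits / trials

for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: estimated P(six) = {empirical_probability(n):.4f}")
# The estimates drift toward the classical value 1/6 ≈ 0.1667 as n increases.
```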

Basic Statistics – II – What is a Variable and What are the Variable Types

A variable is a characteristic, number, or quantity that increases or decreases over time, or takes different values in different situations. Variables are the basic units used in statistics for measuring, collecting and analyzing data. Variables can be classified into different categories depending on their usage at the point of analysis. The different variable types are described below.

Dependent and Independent Variable Types

An independent variable can take any value and can be controlled and measured. These are the inputs used for the study, and are also called factors.
A dependent variable cannot be controlled; it can only be measured. These are generally the output of changes made to the independent variables: the value of a dependent variable depends on its relationship with the independent variables. These are also called responses.
It is notable that dependent and independent variables are not fixed: a dependent variable in one experiment or study may become a factor in a different experiment or study.
For example, the heat generated depends on the amount of fuel burnt (here, heat is the dependent variable and the amount of fuel is the independent variable).
In a different experiment, the time taken to completely evaporate a substance depends on the amount of heat supplied; here, the time taken is the dependent variable and the amount of heat is the independent variable. Note that the amount of heat is a dependent variable in one experiment and an independent variable in the other.

Qualitative and Quantitative Variable Types

Variables are also classified according to the type of data they represent, i.e. the type of value associated with the variable.
A qualitative variable describes a characteristic in non-numerical form. These are also called categorical variables. Examples of values a categorical variable can take are good, bad, red, blue, light, heavy, etc.; the corresponding variables are result, color, weight, etc. Qualitative variables are also sometimes called nominal variables.
A quantitative variable has a numerical value associated with it, either counted or measured. These are also called numerical variables. Examples of values are numbers such as 0, -1, 1, 2; the variables are height, weight, etc.
Note that the same variable can be qualitative or quantitative depending on the value it takes. For example, if height is given a measured value such as 1.72 meters, height is a quantitative variable; if the same height is expressed as a comparative value such as tall or short, height is a qualitative variable.

Discrete and Continuous Variables

A discrete variable is the output of counting. It can take only specific, separate values, which may include negative and fractional values. Examples of discrete variables are the number of people, the charge on an electron, etc. As a rule of thumb, if the variable has the prefix “number of”, it can be treated as discrete.
A continuous variable can take any value within a specified range. This is generally a measured value. Examples of continuous variables are speed, height, distance, etc.
Discrete and continuous variables are subsets of the quantitative (numerical) variable type.

Binomial, Nominal and Ordinal Variables

A binomial variable can take only two possible values; there is no third option. Examples are the result of a test (pass or fail) and the result of tossing a coin (head or tail).
A nominal variable can take several unordered values, for example color (red, blue, green) or type of bank account (savings, checking, etc.).
An ordinal variable can take any of several ordered values; there is a clear ordering among the values assigned. Examples are height (tall, short) or the response in a satisfaction survey (excellent, good, poor, etc.).
Binomial, nominal and ordinal variables are subsets of the qualitative variable type.
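A minimal sketch of how these variable types might be represented in code is shown below, using pandas (an illustrative choice, not a requirement); the column names and sample values are made up.

```python
# Sketch of the variable types above using pandas dtypes (hypothetical data).
import pandas as pd

df = pd.DataFrame({
    "height_m": [1.72, 1.58, 1.80],                 # quantitative, continuous
    "num_people": [3, 5, 2],                        # quantitative, discrete
    "test_result": ["pass", "fail", "pass"],        # qualitative, binomial
    "color": ["red", "blue", "green"],              # qualitative, nominal
    "satisfaction": ["good", "poor", "excellent"],  # qualitative, ordinal
})

# Nominal: unordered categories.
df["color"] = df["color"].astype("category")

# Ordinal: categories with an explicit order.
df["satisfaction"] = pd.Categorical(
    df["satisfaction"], categories=["poor", "good", "excellent"], ordered=True
)

print(df.dtypes)
print(df["satisfaction"].min())  # ordering lets us compare values: "poor"
```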

What is KPI (Key Performance Indicator)

Key Performance Indicators (KPIs) are the critical success factors that define an organization’s progress. KPIs must be measurable, and they are trended over time to show progress and to trigger action where needed. They can be defined at various levels of an organization, from the CEO to the sales team to the manufacturing floor, and are the statistics generated from the organizational metrics.
KPIs should be simple, measurable, repeatable, and easy to analyse. They are dynamic, depending on the organization, and must be in line with the organizational goals. Even though they differ at different levels, each KPI must link back to the organizational goals, both short-term and long-term.
KPIs must include the fundamentals that are basic to, and needed for, the sustenance of the organization. In addition, KPIs must look at the future growth of the organization.

Here are some examples of KPIs:

  • At the CEO level: earnings per share, market share, etc.
  • For a BPO: average on-hold time for customers calling in
  • Number of defects for an engineering process
  • Process yield for a manufacturing team (see the sketch after this list)
  • Personal productivity targets at the employee level
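As a rough illustration of trending a KPI over time, here is a minimal sketch of a process-yield calculation; the function name and the weekly figures are hypothetical.

```python
# Hypothetical sketch: computing and trending a process-yield KPI.
# The weekly production figures below are made-up illustration data.

def process_yield(good_units: int, total_units: int) -> float:
    """Process yield KPI: percentage of units produced without defects."""
    return good_units / total_units * 100

weekly_data = [
    ("Week 1", 940, 1000),
    ("Week 2", 955, 1000),
    ("Week 3", 962, 1000),
]

for week, good, total in weekly_data:
    print(f"{week}: yield = {process_yield(good, total):.1f}%")
# Trending the KPI week over week shows whether the process is improving
# and whether action is needed.
```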

What is Design of Experiments (DOE)

Design of Experiments (DOE) is a structured approach for varying process and/or product factors (x’s) and quantifying their effects on process outputs (y’s), so that those outputs can be controlled to optimal levels.
DOE deals with identifying the critical factors and their response variables, and the magnitude of the response at each level of the critical factors. DOE is also used to understand the interactions between the critical factors, so that the right mix of factor levels can be chosen to obtain the best response.
DOE is used to derive the transfer function and mathematical model used to optimize the response variable.
A DC motor manufacturer might wish to understand the effects of two process variables, wire tension and trickle resin volume, on motor life. In this case, a simple two factor (wire tension and trickle resin volume), two level (low and high values established for each of the two factors) experiment would be a good starting point. Randomizing the order of trials in an experiment can help prevent false conclusions when other significant variables, not known to the experimenter, affect the results. There are a number of statistical tools available for planning and analyzing designed experiments.
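A minimal sketch of generating a randomized run order for such a two-factor, two-level experiment is shown below. The factor names mirror the example above, and the use of Python’s itertools and random modules is an illustrative choice, not a prescribed method.

```python
# Sketch of the two-factor, two-level motor-life experiment described above.
# Factor names and levels are illustrative assumptions, not real process data.
import itertools
import random

factors = {
    "wire_tension": ["low", "high"],
    "trickle_resin_volume": ["low", "high"],
}

# Full factorial design: every combination of factor levels (2 x 2 = 4 runs).
runs = [dict(zip(factors, levels)) for levels in itertools.product(*factors.values())]

# Randomize the run order to guard against unknown time-related variables.
random.shuffle(runs)

for i, run in enumerate(runs, start=1):
    print(f"Run {i}: {run}")
```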


Sixth Principle of SPC – Causes of Variation

According to the sixth principle of SPC, a frequency distribution deviates from the normal distribution only in the presence of an assignable cause.
A frequency distribution is a tally of measurements that shows the number of times each measurement occurs. From this frequency distribution we can see whether only chance causes are present in the process, or whether assignable causes are also acting.
If the distribution is distorted from the normal curve, we can say that assignable causes are present. This finding helps us find those causes and address them.
As seen earlier, the presence of assignable causes tends to distort the shape, the center, or the spread of the distribution. This indication forms the basis of the various techniques used in Statistical Process Control.
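A minimal sketch of building such a tally is given below; the measurement values are made up for illustration.

```python
# Minimal sketch of a frequency distribution (tally) of measurements.
from collections import Counter

measurements = [9.8, 10.0, 10.1, 10.0, 9.9, 10.0, 10.2, 9.9, 10.0, 10.1]
tally = Counter(measurements)

# Print a simple text tally, one row per distinct measurement value.
for value in sorted(tally):
    print(f"{value:5.1f} | {'#' * tally[value]}")
# A roughly bell-shaped tally suggests only chance causes; a distorted shape
# points to an assignable cause worth investigating.
```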

Fifth Principle of SPC – Shape of the Distribution

The fifth principle of SPC states that it is possible to determine the shape of the distribution from the measurements of any process. We can learn what the process is doing, compared with what we want it to do, by comparing the measured output of the process with the design specifications. The process can be altered if we do not like the comparison, especially if we see unwanted variation.
We need to address the variation so that the output falls within the required pattern. Variation is mainly of two types: common cause variation and special cause variation.
If the variation in output is caused only by common causes, the output will vary in a normal and predictable manner. In such cases, the process is said to be “stable” or “in a state of statistical control”. While the individual measurements may differ from each other, they tend to follow a normal distribution.
The normal distribution is characterized by the following:

  • Location – the typical (central) value
  • Spread – the amount by which individual values differ from the center (see the sketch below)
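As a rough numeric illustration of location and spread, here is a minimal sketch using Python’s statistics module; the sample measurements are made up.

```python
# Rough illustration of "location" and "spread" for a set of process
# measurements (the sample values below are made up).
import statistics

measurements = [10.02, 9.98, 10.01, 10.00, 9.97, 10.03, 9.99, 10.00]

location = statistics.mean(measurements)   # typical value (center)
spread = statistics.stdev(measurements)    # how far values differ from the center

print(f"Location (mean): {location:.3f}")
print(f"Spread (standard deviation): {spread:.3f}")
```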

The shape of the distribution will deviate from the normal curve when unusual occurrences are present. The causes of such changes are called assignable causes.

The presence of assignable causes will result in a departure from the usual normal curve, in shape, in spread, or in a combination of both.
Some example shapes are shown below.
[Figures: example frequency distribution shapes – one normal curve and two non-normal, distorted curves]

These findings lead us to the sixth principle of SPC: variation due to assignable causes tends to distort the normal distribution curve.
