Six Sigma Project Roles

Black Belt 
Leaders of teams responsible for measuring, analyzing, improving, and controlling key processes that influence customer satisfaction and/or productivity growth. Black Belt is a full-time position.
Green Belt 
Similar to Black Belt but not a full-time position.
Master Black Belt  
First and foremost teachers. They also review and mentor Black Belts. Selection criteria for Master Black Belts are quantitative skills and the ability to teach and mentor. Master Black Belts are full-time positions.
Customer 
Any internal or external person/organization who receives the output (product or service) of the process; understanding the impact of the process on both internal and external customers is key to process management.
Process Owner

Process owners are exactly what the name suggests – the individuals responsible for a specific process. For instance, in the legal department there is usually one person in charge – perhaps the VP of Legal – and that person is the process owner. A Director of Marketing at your property is the process owner for marketing, and for the check-in process the process owner is typically the Front Office Manager.

Team leader

For DMAIC projects, the team leader is usually the Black Belt. For Quick Hit and iDMAIC projects, it is typically the Sponsor or Process Owner. For large DMAIC projects with more than one Black Belt or Master Black Belt, the team leader is the main point of contact for the project.
Team member
An active member of a Six Sigma project team (DMAIC or iDMAIC), heavily involved in the measurement, analysis, and improvement of a process. To be effective, team membership requires a minimum 10% time commitment to a phase of the project. He/she also helps foster the Six Sigma culture within the organization by informing and educating fellow Associates about Six Sigma tools and processes.
Transfer Team Leader (Process Owner/Department Head)
A person selected by the GM and property Six Sigma Council to lead an iDMAIC project, based primarily on proximity and decision-making authority relative to the process involved. This person has primary responsibility for implementing the project, leading the team, and interacting with others to gather the information and understanding necessary to succeed. Often, the transfer team leader will be the department head or process owner of the process being improved with the best practice. The ability to lead the team and to anticipate and clear barriers are important characteristics for a person in this role.
Transfer Team Member 
Associates selected by the Transfer Team Leader and Six Sigma Council to serve on the iDMAIC project based on their knowledge of key aspects of the process, experience with the current process, enthusiasm for improvement, and ability to champion change. Other key factors in selecting transfer team members include time availability and representation from relevant functions. All members will be provided training on the skills and tools used in the transfer process.
Project Sponsor 
This member of the executive committee is a strong advocate of the project and can assist with barriers that may come up. He or she is accountable for the project’s success and can therefore explain to Six Sigma Council members and everyone in the property the business rationale for the transfer project and assist with cross-functional collaboration efforts. He or she will remain up to date on key aspects of the project by regularly meeting with the team leader and members.
The project sponsor:
  • Is a member of the Executive committee
  • Is accountable for project success
  • Addresses cross-functional or other barriers
  • Reviews and tracks progress with team leader
  • Advocates for necessary resources
Please share this on Facebook. If you like this blog, like us at https://www.facebook.com/excellencementor

Originally posted 2012-02-26 03:50:00.

Tabulation of Data – Presentation Technique

Tabulation is the most commonly used method for the presentation and organization of data. The table has distinct features, which are explained here. Data collected through surveys, interviews, field studies, and similar methods is raw data, and you may not be able to draw any conclusions from it directly. The data must first be organized; only then can it talk to you (yes, data talks if it is collected properly). It will also talk for you only if you present it in a form the audience can absorb.
There are various methods by which you can organize and present data. The choice of method depends upon the purpose and the target audience.
The most commonly used method is tabulation. We see many tables daily; however, for your presentation or analysis to be relevant, the table must contain the relevant details.
A good table condenses the data and presents it in a useful form. It is the most common and most easily understood method of presenting data.

A good table will have the following details.

  1. The table must have a heading.
  2. The table should present the data clearly, highlighting important details.
  3. The table should save space but be attractively designed.
  4. The table number and the title of the table should be given.
  5. Row and column headings must explain the figures they contain.
  6. Averages or percentages used in the table should be placed close to the data.
  7. Units of measurement should be clearly stated alongside the titles or headings.
  8. Abbreviations and symbols should be avoided as far as possible.
  9. Sources of the data should be given at the bottom of the table.
  10. If irregularities creep into the table, or any feature is not sufficiently explained, references and footnotes must be given. The rounding of figures should be unbiased.
  11. Wherever notes are required, they should be given below the table with relevant references.

If a table contains all of the above, it can be called a good table. Even though most ordinary tables have these features, the checklist is a useful way to verify whether a table is good or not.
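As a rough sketch of the checklist above (the production figures, column names, and layout are all invented for illustration), a few lines of Python can emit a table with a numbered title, headed columns carrying units, and a source note:

```python
def format_table(rows):
    """Format rows as a small 'good table': a numbered title, column
    headings that state the units, and a source note at the bottom."""
    lines = [
        "Table 1: Output and defect rate by production line",
        f"{'Line':<8}{'Output (units)':>16}{'Defects (%)':>14}",
    ]
    for name, output, defect_pct in rows:
        lines.append(f"{name:<8}{output:>16}{defect_pct:>14.1f}")
    lines.append("Source: hypothetical plant records")
    return "\n".join(lines)

# Invented figures, for illustration only.
rows = [("Line A", 120, 4.2), ("Line B", 95, 6.8), ("Line C", 143, 3.1)]
print(format_table(rows))
```

The same checklist applies whether the table is typed, printed, or generated by software: the reader should be able to understand every figure from the headings and units alone.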


Originally posted 2012-01-23 02:14:00.

Control Charts

A control chart is a running record of process performance. It is a record of the results of periodic sampling inspections.

A chart becomes a control chart when it has control limits based on inherent process variation. Process control limits are boundaries on a control chart within which the process can operate to a standard. These limits are based on the natural variation of the process without the influence of assignable causes. Each time the job is checked, the results are compared with the control limits. If the results are within the control limits, the process is to be left alone. But if a point on the control chart falls outside the limits, or there is any other indication of an out-of-control process, it signals that some change has happened and the process is no longer operating normally.

In other words, control limits are warning signals that tell us:

1. When to take action

2. When to leave the process alone.

Taking action on a process operating within control limits is not only uneconomical but may also increase the variation.
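As a minimal sketch of these two rules (the measurements below are invented, and the limits are simple bounds at three standard deviations either side of the mean of individual values, rather than any particular chart's formula), the decision can be expressed in code:

```python
import statistics

def control_limits(samples):
    """Estimate (LCL, centre line, UCL) from in-control historical
    samples, placing the limits at +/- 3 standard deviations."""
    centre = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return centre - 3 * sigma, centre, centre + 3 * sigma

def out_of_control(value, lcl, ucl):
    """A point outside the limits signals an assignable cause: act.
    A point inside the limits: leave the process alone."""
    return value < lcl or value > ucl

# Hypothetical in-control measurements used to set the limits
history = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
lcl, centre, ucl = control_limits(history)
print(f"LCL={lcl:.2f}  centre={centre:.2f}  UCL={ucl:.2f}")
print(out_of_control(11.5, lcl, ucl))   # far outside the limits: act
print(out_of_control(10.05, lcl, ucl))  # within the limits: leave alone
```

Real charts estimate sigma from subgroup ranges or standard deviations using tabulated constants, but the decision rule is the same.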

There are two general types of control charts:

1. Variables charts – Used where a dimension or characteristic is measured and the result is a numerical value.

The popularly used variables charts are:

X-bar–R charts

X-bar–s charts

X–R charts

2. Attributes charts – Used where product quality is assessed by sensory means, or the data is a count of defectives or a count of defects.

The popularly used attributes charts are:

p-charts

np-charts

c-charts

In addition to the above, there are some adaptations that combine these charts. These are called special control charts and will be discussed later.


Originally posted 2011-12-04 05:40:00.

Basic Statistics I – Definitions of Commonly Used Statistical Terms

When talking about statistics, we come across many simple terms, often called basic statistics. We come across them day in and day out whenever we work on any statistical problem, and without these basic terms we cannot understand anything with respect to it. Most of them are also used in daily life, though perhaps under a different name in statistics.

  • Average – Also called the mean, it is the arithmetic average of all of the sample values. It is calculated by adding all of the sample values together and dividing by the number of elements (n) in the sample.
  • Central Tendency – A measure of the point about which a group of values is clustered; two measures of central tendency are the mean, and the median.
  • Characteristic – A process input or output which can be measured and monitored.
  • Cycle Time – The total amount of elapsed time expended from the time a task, product or service is started until it is completed.
  • Long-term Variation – The observed variation of an input or output characteristic which has had the opportunity to experience the majority of the variation effects that influence it.
  • Median – The middle value of a data set when the values are arranged in either ascending or descending order.
  • Mode – The value which occurs with the greatest frequency in a data set.
  • Lower Control Limit (LCL) – For control charts: the limit above which the subgroup statistic must remain for the process to be in control. Typically, 3 standard deviations below the central tendency.
  • Lower Specification Limit (LSL) – The lowest value of a characteristic which is acceptable.
  • Range – A measure of the variability in a data set. It is the difference between the largest and smallest values in a data set.
  • Specification Limits – The bounds of acceptable performance for a characteristic.
  • Standard Deviation – One of the most common measures of variability in a data set or in a population. It is the square root of the variance.
  • Trend – A gradual, systematic change over time or some other variable.
  • Upper Control Limit (UCL) for Control Charts – The upper limit below which a process statistic must remain to be in control. Typically this value is 3 standard deviations above the central tendency.
  • Upper Specification Limit (USL) – The highest value of a characteristic which is acceptable.
  • Variability – A generic term that refers to the property of a characteristic, process or system to take on different values when it is repeated.
  • Variables – Quantities which are subject to change or variability.
  • Variable Data – Data which is continuous, which can be meaningfully subdivided, i.e. can have decimal subdivisions.
  • Variance – A specifically defined mathematical measure of variability in a data set or population. It is the square of the standard deviation.
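Several of the terms above map directly onto functions in Python's standard statistics module. A short sketch with made-up sample values:

```python
import statistics

sample = [4, 7, 7, 9, 12]  # made-up sample values, for illustration

mean = statistics.mean(sample)          # average / central tendency
median = statistics.median(sample)      # middle value of the sorted data
mode = statistics.mode(sample)          # value occurring most frequently
variance = statistics.variance(sample)  # sample variance
stdev = statistics.stdev(sample)        # square root of the variance
data_range = max(sample) - min(sample)  # largest minus smallest value

print(mean, median, mode, data_range)   # 7.8 7 7 8
assert abs(stdev ** 2 - variance) < 1e-9  # stdev is sqrt(variance)
```

Note that `variance` and `stdev` here are the sample versions (dividing by n-1); the module also provides `pvariance` and `pstdev` for a whole population.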

Originally posted 2011-10-20 12:53:00.

Types of Data

The first step of any statistical enquiry is the collection of relevant numerical data. The data used for statistical purposes are mainly classified as primary data and secondary data.

Variable

A characteristic of a population that can take different values (e.g., defects, processing time).

Data

Data are measurements collected on a variable.

Primary Data

Data collected for the purpose of the given inquiry are called primary data. They are collected by the enquirer, either on his own or through some agency set up for the purpose, directly from the field of enquiry. This type of data can be used with greater confidence because the enquirer himself decides on the coverage of the data and the definitions to be used, and thus has greater control over its reliability.

Secondary Data

Data already collected by some other agency, or for some other purpose, and available in published or unpublished form are known as secondary data. The user has to be particularly careful about using such data: he must clearly understand their nature, their coverage, the definitions used, and their reliability.
The use of secondary data is generally preferred when the conditions mentioned above are clear and satisfied, since it reduces both the time taken for the analysis and its cost.

Types of data

Discrete Data

Count or frequency of occurrence

Attribute Data

Data which take on one of a set of discrete values, such as pass or fail, yes or no.

Continuous Data

Measurements that can be meaningfully divided into finer and finer increments of precision

Usage of Sampling

The big question is whether data should be collected from the complete population or from a sample. If a sample is used, care should be taken that it is representative of the complete population. A sample designed with care can produce results that are sufficiently accurate for the purpose of the enquiry, and can save a lot of time and money.
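As a small illustration of this trade-off (the population of processing times below is entirely invented), Python's standard random module can draw a simple random sample without replacement; the sample mean typically lands close to the population mean at a fraction of the collection effort:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Invented population: 1,000 processing times in minutes,
# roughly normal around 30 minutes
population = [random.gauss(mu=30, sigma=5) for _ in range(1000)]

# A simple random sample of 50 items, drawn without replacement
sample = random.sample(population, k=50)

pop_mean = sum(population) / len(population)
sample_mean = sum(sample) / len(sample)
print(f"population mean = {pop_mean:.1f}, sample mean = {sample_mean:.1f}")
```

A sample of 50 out of 1,000 here costs 5% of the measurement effort, yet its mean is usually within a fraction of a minute of the population mean; this is the sense in which a carefully designed sample can be "sufficiently accurate".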

Methods of Data Collection

The methods used to collect data are Questionnaire Method, Interview Method and Direct Observation Method. Any one or a combination of these are used to collect data.

Usage of Data

The data collected should be subjected to a thorough scrutiny to see whether they may be considered correct. The success of the analysis depends on the reliability of the data. However excellent the statistical methods of analysis may be, they cannot bring out useful and reliable information from faulty, unreliable, or mistaken data. This applies especially to the use of secondary data.

Like this? Go on to visit the next post – Statistics – 3: Presentation and Organization of Data


Originally posted 2011-10-01 11:14:00.

Statistics – 1 – What is Statistics ?

Statistics can be described as a quantitative method of scientific investigation.
Used as a plural noun, 'statistics' means the numerical data arising out of any sphere of human experience.
Used in the singular, 'statistics' is the name for the body of scientific methods used for the collection, analysis, organization, and interpretation of numerical data.
According to the American Statistical Association, statistics is 'the scientific application of mathematical principles to the collection, analysis, and presentation of numerical data'.
There is also a different meaning for the word 'statistic' within the subject of statistics: a 'statistic' is a numerical item produced by some calculation on the data. The standard deviation, the mean, and so on are 'statistics' in this sense.
This is one arm of mathematics which is used extensively in almost every field. It has become an important tool in the work of many academic disciplines such as medicine, psychology, education, sociology, engineering, and physics, to name just a few. It is also important in many aspects of society such as business, industry, and government. Because of the increasing use of statistics in so many areas of our lives, it has become very desirable to understand and practice statistical thinking. This is important even if you do not use statistical methods directly.
Even with so many uses, there is some public mistrust of the subject, because figures are sometimes misused by people for their own convenience. During the introduction to a course I joined, this old saying was quoted: there are three kinds of lies – lies, damned lies, and statistics. "We will teach you the third kind here."
Used properly, statistics is anything but a lie: it can be a tremendous tool for the growth of any organization.
Visit the next post Data Collection – Types of Data

Originally posted 2011-09-30 20:00:00.

What are the 7 Basic Quality Tools

These are the most used basic quality tools for solving quality-related problems. They are suitable for people with little or no formal training in statistics. These seven basic graphical techniques help in solving the vast majority of problems.
The History of these tools is interesting.
In the 1950s, just after the Second World War, Japan was concentrating on rebuilding. One of the initiatives was the invitation of the legendary American quality guru W. Edwards Deming to Japan by JUSE (the Japanese Union of Scientists and Engineers) to train hundreds of Japanese engineers, managers, and scholars in statistical process control. Across the hundreds of lectures Deming delivered, the emphasis was on the basic tools available for process control.
Taking his cue from these lectures, Kaoru Ishikawa, at the time an associate professor at the University of Tokyo and a member of JUSE, developed these tools. His chief desire was to democratize quality: he wanted to make quality control comprehensible to all workers, and, inspired by Deming's lectures, he formalized the Seven Basic Tools of Quality Control. He believed that 90% of a company's problems could be improved using these seven tools, and that they could easily be taught to any member of the organization. This ease of use, combined with their graphical nature, makes statistical analysis interesting to everyone.
These are listed below.

  1. Check sheet – A generic tool which can be used for the collection and analysis of data; a structured, prepared form that can be adapted for a wide variety of issues.
  2. Control chart – A graphical technique used to study how a process changes over time.
  3. Pareto chart – Another graphical technique, used to identify the significance of individual factors.
  4. Scatter diagram – Used to identify the relationship between variables by plotting pairs of numerical data, with one variable on each axis; if the variables are related, the points will fall along a line or curve.
  5. Cause-and-effect diagram (also called an Ishikawa or fishbone diagram) – Used to structure brainstorming sessions and to sort ideas into useful categories; many possible causes are identified for a stated problem, together with their effects on it.
  6. Flow chart (stratification chart) – Used to identify patterns within data collected from multiple sources and clubbed together; it reveals the meaning of a large body of data by exposing those patterns.
  7. Histogram – Looks very much like a bar chart; used to identify the frequency of occurrence of a variable in a set of data.
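As a rough sketch of the Pareto idea (the defect counts by category are invented for illustration), the causes can be sorted by frequency and a cumulative percentage attached, so the "vital few" causes stand out at the top:

```python
# Invented defect counts by category, for illustration only
defects = {"scratch": 120, "dent": 45, "misalignment": 20,
           "discoloration": 10, "other": 5}

def pareto(counts):
    """Sort causes by descending frequency and attach the cumulative
    percentage of total defects that each running prefix accounts for."""
    total = sum(counts.values())
    cumulative = 0
    rows = []
    for cause, count in sorted(counts.items(),
                               key=lambda kv: kv[1], reverse=True):
        cumulative += count
        rows.append((cause, count, 100 * cumulative / total))
    return rows

for cause, count, cum_pct in pareto(defects):
    print(f"{cause:<15}{count:>5}{cum_pct:>8.1f}%")
# With these invented figures, the top two causes alone account
# for over 80% of all defects: the "vital few".
```

A Pareto chart is simply this table drawn as descending bars with the cumulative-percentage line overlaid.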

Originally posted 2011-03-19 16:18:00.