Seven quality tools you shouldn't live without


Contributor: Sudeshna Banerjee
Posted: 10/24/2012
Even highly qualified personnel balk at the idea of using sophisticated quality tools such as design of experiments, hypothesis testing, or multivariate analysis, says Sudeshna Banerjee. The good news is that most quality-related issues can be resolved with the Seven Basic Tools of Quality.
The Seven Basic Tools of Quality is a name given to a set of very simple graphical techniques that have been identified as most helpful in troubleshooting simple, day-to-day quality-related issues. They are called basic because even people with little or no statistical training can grasp these concepts and apply them to their everyday work.
I have often seen that even highly qualified personnel balk at the idea of using sophisticated quality tools such as design of experiments, hypothesis testing, or multivariate analysis. Hence it should come as a welcome respite for most professionals to know that most quality-related issues can be resolved with these Seven Basic Tools of Quality.
The purpose of this article is to give a basic overview of these tools and how they can be used effectively. It goes without saying that to derive optimum results from any of these tools, the quality practitioner needs to ensure clean, unbiased, and sufficient data.
Such a simple concept, so why make it overly complex?
Tool # 1: Ishikawa diagrams
Ishikawa diagrams (also called fishbone diagrams or cause-and-effect diagrams) are causal diagrams that show the root cause(s) of a specific event. A common way to arrive at a really informative fishbone is to use the 5 Whys method in conjunction with it while building the diagram.
Basic cause categories could include the following (a minimal code sketch for capturing them appears after the list):
  1. People – personnel involved with the process, stakeholders, etc.
  2. Methods – the process for doing the task and the specific requirements for doing it, such as policies, procedures, rules, regulations, and laws
  3. Machines – any equipment, computers, tools, etc. required to accomplish the job
  4. Materials – raw materials, parts, pens, paper, etc. used to produce the final product
  5. Measurements – data generated from the process that are used to evaluate its quality
  6. Environment – the conditions, such as location, time, temperature, and culture, in which the process operates
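As a purely illustrative sketch in Python (the effect and the example causes below are hypothetical placeholders, not from any real project), the output of a fishbone brainstorming session can be captured as a simple mapping from category to causes:

```python
# A minimal sketch of recording fishbone brainstorming output in code.
# The effect and the example causes below are hypothetical placeholders.
fishbone = {
    "effect": "High defect rate in final inspection",
    "causes": {
        "People":       ["untrained operator", "high staff turnover"],
        "Methods":      ["outdated work instruction"],
        "Machines":     ["worn cutting tool"],
        "Materials":    ["supplier lot variation"],
        "Measurements": ["uncalibrated gauge"],
        "Environment":  ["high humidity on shop floor"],
    },
}

# Print the diagram as an indented outline (a text stand-in for the drawing).
print(f"Effect: {fishbone['effect']}")
for category, causes in fishbone["causes"].items():
    print(f"  {category}:")
    for cause in causes:
        print(f"    - {cause}")
```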

Tool # 2: Check sheet
The check sheet is a structured, prepared form for collecting and analyzing data. This is a generic tool that can be adapted for a wide variety of purposes. The data it captures can be quantitative or qualitative. When the information is quantitative, the check sheet is called a tally sheet.
The defining characteristic of a check sheet is that data is recorded by making marks ("checks") on it. A typical check sheet is divided into regions, and marks made in different regions have different significance. Data is read by observing the location and number of marks on the sheet. Check sheets typically employ a heading that answers the Five Ws listed below (a small tally-sheet sketch in code follows the list). Remember to develop operational definitions for each of the Ws.
  1. Who filled out the check sheet
  2. What was collected (what each check represents, an identifying batch or lot number)
  3. Where the collection took place (facility, room, apparatus)
  4. When the collection took place (hour, shift, day of the week)
  5. Why the data was collected
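A minimal tally-sheet sketch in Python; the defect types and shift labels are hypothetical placeholders:

```python
from collections import Counter

# Each observation is a (defect type, shift) pair marked on the sheet.
observations = [
    ("scratch", "morning"), ("dent", "morning"), ("scratch", "evening"),
    ("misalignment", "evening"), ("scratch", "morning"), ("dent", "evening"),
]

tally = Counter(observations)

# Print a simple region-by-region view: tally marks grouped by defect and shift.
for (defect, shift), count in sorted(tally.items()):
    print(f"{defect:<14} {shift:<8} {'|' * count}  ({count})")
```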

Tool # 3: Histogram
A histogram is a display of statistical information that uses rectangles to show the frequency of data items in successive numerical intervals of equal size. In the most common form of histogram, the independent variable is plotted along the horizontal axis and the dependent variable is plotted along the vertical axis.
The main purpose of a histogram is to clarify the presentation of data. It is a useful tool for breaking out process data into regions or bins and determining the frequencies of certain events or categories of data, helping to show which occur most often. Typical applications of histograms in root cause analysis include presenting data to determine which causes dominate, and understanding the distribution of occurrences of different problems, causes, consequences, etc. A Pareto diagram (explained later in this article) is a special type of histogram.
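A minimal sketch in Python using matplotlib, with simulated (hypothetical) cycle-time data standing in for real process measurements:

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulated cycle-time data (hypothetical); replace with real process data.
rng = np.random.default_rng(seed=0)
cycle_times = rng.normal(loc=12.0, scale=1.5, size=200)  # minutes

plt.hist(cycle_times, bins=15, edgecolor="black")
plt.xlabel("Cycle time (minutes)")   # independent variable on the horizontal axis
plt.ylabel("Frequency")              # dependent variable on the vertical axis
plt.title("Distribution of cycle times")
plt.show()
```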
Tool # 4: Pareto chart
The Pareto chart is an important tool and concept. Since organizational resources are limited, it is important for process owners and stakeholders to understand the prime causes of errors, defects, etc. The Pareto chart represents this principle beautifully by clearly prioritizing the major defect causes. The underlying idea is also known as the 80:20 principle.
The graph - named after economist and political scientist Vilfredo Pareto - is a type of chart that contains both bars and a line graph, where individual values are represented in descending order by bars and the cumulative total is represented by the line. The left vertical axis typically represents the frequency of occurrence, while the right vertical axis shows the cumulative percentage of the total number of occurrences. Because the reasons are in decreasing order, the cumulative function is concave. For example, if the first three bars on a chart of late-arrival causes account for 78% of occurrences, it is sufficient to solve those three issues to eliminate most of the delays.
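A minimal Pareto chart sketch in Python with matplotlib; the defect categories and counts are hypothetical:

```python
import matplotlib.pyplot as plt

# Hypothetical defect counts by cause.
defects = {"scratch": 52, "dent": 31, "misalignment": 17, "discoloration": 8, "other": 4}

labels = sorted(defects, key=defects.get, reverse=True)      # descending order
counts = [defects[k] for k in labels]
total = sum(counts)
cumulative_pct = [sum(counts[: i + 1]) / total * 100 for i in range(len(counts))]

x = list(range(len(labels)))
fig, ax1 = plt.subplots()
ax1.bar(x, counts)
ax1.set_xticks(x)
ax1.set_xticklabels(labels)
ax1.set_ylabel("Frequency of occurrence")       # left vertical axis

ax2 = ax1.twinx()                               # right vertical axis
ax2.plot(x, cumulative_pct, color="red", marker="o")
ax2.set_ylabel("Cumulative percentage")
ax2.set_ylim(0, 110)

plt.title("Pareto chart of defect causes")
plt.show()
```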
Tool # 5: Scatter plot or scattergraph
A scattergraph is often employed to identify potential associations between two variables, where one may be considered to be an explanatory variable and another may be considered a response variable. It gives a good visual picture of the relationship between the two variables, and aids the interpretation of the correlation coefficient or regression model. The data is displayed as a collection of points, each having the value of one variable determining the position on the horizontal axis and the value of the other variable determining the position on the vertical axis.
A scatter plot is used when there is a variable that is under the control of the experimenter. If a parameter exists that is systematically incremented and/or decremented by the experimenter, it is called the control parameter or independent variable and is customarily plotted along the horizontal axis. The measured or dependent variable is customarily plotted along the vertical axis. If no dependent variable exists, either type of variable can be plotted on either axis, and the scatter plot will illustrate only the degree of correlation (not causation) between the two variables.
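A minimal sketch in Python; the variables (oven temperature as the controlled parameter, hardness as the measured response) and their relationship are hypothetical:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data: controlled temperature vs. measured hardness.
rng = np.random.default_rng(seed=1)
temperature = np.linspace(150, 250, 40)                       # independent variable
hardness = 0.3 * temperature + rng.normal(0, 5, size=40)      # dependent variable

r = np.corrcoef(temperature, hardness)[0, 1]                  # correlation coefficient

plt.scatter(temperature, hardness)
plt.xlabel("Oven temperature (°C)")
plt.ylabel("Hardness (HRC)")
plt.title(f"Temperature vs. hardness (r = {r:.2f})")
plt.show()
```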
Tool # 6: Stratified sampling
Stratified sampling is a method of sampling from a population. In statistical surveys, when subpopulations within an overall population vary, it is advantageous to sample each subpopulation (stratum) independently. Stratification is the process of dividing members of the population into homogeneous subgroups before sampling.
The strata should be mutually exclusive: every element in the population must be assigned to only one stratum. The strata should also be collectively exhaustive: no population element can be excluded. Then simple random sampling or systematic sampling is applied within each stratum.
This often improves the representativeness of the sample by reducing sampling error. It can produce a weighted mean that has less variability than the arithmetic mean of a simple random sample of the population. I often tell the groups that I mentor that correct sampling procedures are more important than simply having an adequate sample size!
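A minimal sketch in Python using pandas; the strata (production shifts) and measurements are hypothetical, and the same sampling fraction is drawn independently from each stratum:

```python
import pandas as pd

# Hypothetical population: 100 units produced across three shifts (the strata).
data = pd.DataFrame({
    "shift":   ["morning"] * 60 + ["evening"] * 30 + ["night"] * 10,
    "defects": range(100),          # placeholder measurement per unit
})

# Simple random sampling within each mutually exclusive stratum (20% of each).
sample = data.groupby("shift").sample(frac=0.2, random_state=0)

# The sampled proportions mirror the population strata.
print(sample["shift"].value_counts())
```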
Tool # 7: Control charts, also known as Shewhart charts or process-behavior charts
A control chart is a specific kind of run chart that allows significant change to be differentiated from the natural variability of the process.
If analysis of the control chart indicates that the process is currently under control (i.e. is stable, with variation only coming from sources common to the process) then no corrections or changes to process control parameters are needed or desirable. In addition, data from the process can be used to predict the future performance of the process.
If the chart indicates that the process being monitored is not in control, analysis of the chart can help determine the sources of variation, which can then be eliminated to bring the process back into control.
The control chart can be seen as part of an objective and disciplined approach that enables correct decisions regarding control of the process, including whether to change process control parameters. Process parameters should never be adjusted for a process that is in control, as this will result in degraded process performance. A process that is stable but operating outside of desired limits (e.g. scrap rates may be in statistical control but above desired limits) needs to be improved through a deliberate effort to understand the causes of current performance and fundamentally improve the process.
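As a minimal sketch in Python (with simulated, hypothetical measurements), an individuals chart can be drawn with a center line and limits at plus or minus three standard deviations; note that production-grade individuals charts usually estimate sigma from the moving range rather than the overall standard deviation used here:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical individual measurements from a process.
rng = np.random.default_rng(seed=2)
measurements = rng.normal(loc=50.0, scale=2.0, size=30)

center = measurements.mean()
sigma = measurements.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # upper/lower control limits

plt.plot(measurements, marker="o")
plt.axhline(center, color="green", label="center line")
plt.axhline(ucl, color="red", linestyle="--", label="UCL")
plt.axhline(lcl, color="red", linestyle="--", label="LCL")
plt.xlabel("Sample number")
plt.ylabel("Measured value")
plt.legend()
plt.title("Individuals control chart")
plt.show()

# Points outside the limits (if any) signal special-cause variation to investigate.
out_of_control = np.where((measurements > ucl) | (measurements < lcl))[0]
print("Out-of-control points:", out_of_control)
```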
When I mentor simple Six Sigma projects (typically called Yellow Belt projects), where the issues are straightforward and the project team consists of people with three to five years of experience on the processing floor, I strongly advocate using these simple tools to resolve process-related issues.
As a rule of thumb, any process displaying a process capability of 1–2 sigma can be improved by simple analysis using these tools. It is only when the process capability is above 2.5–3 sigma that one needs to use medium- to high-complexity tools to identify and resolve process-related issues. I also recommend that any initial Six Sigma curriculum and training include the Seven QC tools, which creates fertile ground for developing Green Belts and Black Belts within the organization.
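For context on those sigma figures, a common convention (not specific to this article) converts defects per million opportunities (DPMO) into a short-term sigma level using the normal quantile plus the customary 1.5-sigma shift; a minimal sketch:

```python
from scipy.stats import norm

def sigma_level(dpmo: float) -> float:
    """Convert defects per million opportunities to a (short-term) sigma level,
    using the conventional 1.5-sigma shift."""
    return norm.ppf(1 - dpmo / 1_000_000) + 1.5

# Example: roughly 308,537 DPMO corresponds to about 2 sigma under this convention.
print(round(sigma_level(308_537), 2))
```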


