Measure Phase Overview Flashcards

1
Q

Measure Phase

A

Second phase of DMAIC

  • Main activity is to define the baseline
  • We get the ‘real story’ behind the current state by gathering data and interpreting what the current process is really capable of
  • The team checks how the process is performing against customer expectations and the CTQs identified in the Define phase
  • Helps the team understand the extent of the problem with the help of data
2
Q

Goals of Measure Phase

A
  • Establish baseline performance of the process
  • Identify process performance indicators
  • Develop a data collection plan and then collect data
  • Validate the measurement system
  • Determine the process capability
  • The Measure phase typically takes 2 to 3 weeks, depending on the project inputs.
  • Involvement of all relevant stakeholders is key to getting quality data
3
Q

Measure Phase Basic Tools

A
  • Process Map
  • Value Stream Mapping
  • Spaghetti Diagram
  • Cause and Effect Matrix
4
Q

Spaghetti Diagram

A
  • A visual representation using a continuous flow line tracing the path of an item or activity through a process. The continuous flow line enables process teams to identify redundancies in workflow and opportunities to expedite process flow.
5
Q

Data Collection

A
  • Measure Phase is all about collecting as much data as possible to get the actual picture of the problem. Data must be accurate and precise
6
Q

Data Types

A

Data: a set of values of qualitative or quantitative variables. It may be numbers, measurements, observations, or even just descriptions of things. Below are the types of quantitative data:

Discrete Data: aka attribute data. The data is discrete if the measurements are integers or counts. E.g. number of customer complaints, weekly defect data etc.

Continuous Data: The data is continuous if the measurement takes on any value, usually within some range. E.g. stack height, distance, cycle time, etc.

7
Q

Discrete Data aka Attribute Data

A
  • Count - Ex. counts of errors
  • Binary data - data that can have only one of two values.
    • E.g. On-time delivery (yes/no); Acceptable product (pass/fail)
  • Attribute-Nominal - The “data” are names or labels. There is no intrinsic reason to arrange them in any particular order or to make a statement about any quantitative difference between them
    • E.g. In a company: Dept A, Dept B, Dept C
    • E.g. In a shop: Machine 1, Machine 2, Machine 3
    • E.g. Types of transportation: boat, train, plane
  • Attribute-Ordinal - The names or labels represent some value inherent in the object or item (so there is an obvious order to the labels)
    • E.g. On product performance: excellent, very good, good, fair, poor
    • E.g. Customer survey: strongly agree, agree, disagree, strongly disagree
    • *Though ordinal scales have a defined sequence, they do not imply anything about the degree of difference between the labels (that is, we can’t assume that “excellent” is twice as good as “very good”)
8
Q

Coding Data

A
  • Sometimes it is more efficient to code data by adding, subtracting, multiplying, or dividing by a factor

Types of Data Coding

Substitution: ex. replace 1/8ths of an inch with +/- 1 deviations from center, in integers

Truncation: ex. in a data set of 0.554, 0.5542, 0.5547, you might just remove the 0.554 portions
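A minimal sketch of truncation-style coding in Python (the measurement values and the 0.0001-unit scale are illustrative assumptions, not from the source):

```python
# Coding data to simplify arithmetic: strip a common leading portion so
# only the varying digits remain (values here are made up for illustration).

def code_by_truncation(values, base):
    """Code each value as an integer count of 0.0001 units above `base`."""
    return [round((v - base) * 10_000) for v in values]

def decode(coded, base):
    """Invert the coding back to the original scale."""
    return [base + c / 10_000 for c in coded]

measurements = [0.5542, 0.5547, 0.5545]
coded = code_by_truncation(measurements, base=0.554)
print(coded)   # [2, 7, 5] - small integers are easier to chart and average
print([round(v, 4) for v in decode(coded, base=0.554)])   # [0.5542, 0.5547, 0.5545]
```

Statistics such as the range or standard deviation are unchanged by this kind of linear coding (up to the scale factor), which is why it is safe for analysis.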

9
Q

Data Collection Plan

A
  • A useful tool to focus your data collection efforts; a directed approach helps avoid locating and measuring data just for the sake of doing so.
    • Identify data collection goals
    • Develop operational definitions
    • Create a sampling plan
    • Select and validate data collection methods
10
Q

Plan for and begin collecting data

A

Data collection form: a structured way of recording the data needed to perform the analysis. Data should be recorded by trained operators, with a calibrated instrument, and in a standard data collection form

  • Checklist
    • Used to perform repetitive activities, check a list of requirements, or collect data in an orderly and systematic manner; making systematic checks of activities or products ensures the operator does not forget anything important
  • Check Sheet
    • A structured, well-prepared form for collecting and analyzing data, consisting of a list of items and some indication of how often each item occurs; collecting data in a standard form helps Six Sigma teams solve problems and make better decisions
11
Q

Measurement System Analysis (MSA)

A
  • Experimental and mathematical method of determining how much the variation within the measurement process contributes to overall process variability

Some Basic Factors:

  • Accuracy
  • Precision
  • Gage R&R
    • Repeatability
    • Reproducibility
12
Q

Six Sigma Statistics

A

Statistics: the science of gathering, classifying, arranging, analyzing, interpreting, and presenting numerical data in order to make inferences about the population from the sample drawn.

Two basic categories: Analytical (aka inferential statistics) and Descriptive (aka enumerative statistics)

  • Basic Six Sigma Statistics - foundation for six sigma projects. Allows us to numerically describe the data that characterizes the process Xs and Ys
  • Inferential statistics - aka analytical statistics; used to determine whether a particular sample or test outcome is representative of the population from which the sample was originally drawn.
  • Descriptive statistics - aka enumerative statistics; organizing and summarizing the data using numbers and graphs. Describes the characteristics of the sample or population
    • Measure of frequency (count, percentage, frequency)
    • Measure of Central Tendency (mean, median, mode)
    • Measure of dispersion or variation (range, variance, standard deviation)
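The three groups of descriptive measures can be computed directly with Python's standard `statistics` module (the cycle-time sample below is made up for illustration):

```python
# Descriptive statistics for a small sample of cycle times (illustrative numbers).
import statistics

cycle_times = [12, 15, 11, 15, 14, 13, 15, 12]

# Central tendency
mean = statistics.mean(cycle_times)
median = statistics.median(cycle_times)
mode = statistics.mode(cycle_times)

# Dispersion
data_range = max(cycle_times) - min(cycle_times)
variance = statistics.variance(cycle_times)   # sample variance (n - 1 denominator)
std_dev = statistics.stdev(cycle_times)       # sample standard deviation

print(mean, median, mode, data_range)   # 13.375 13.5 15 4
```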
13
Q

Shape of Data Distribution

A
  • Described by its number of peaks and whether it possesses symmetry, skewness, or uniformity.
  • Skewness - a measure of the lack of symmetry. In other words, skewness measures how much the probability distribution of a random variable deviates from the normal distribution
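A rough sketch of measuring skewness in Python via the third standardized moment (one common definition of skewness; the data sets are invented for illustration):

```python
import statistics

def skewness(data):
    """Skewness as the third standardized moment (population formula)."""
    n = len(data)
    mean = sum(data) / n
    std = statistics.pstdev(data)   # population standard deviation
    return sum((x - mean) ** 3 for x in data) / (n * std ** 3)

symmetric = [1, 2, 3, 4, 5]
right_skewed = [1, 1, 2, 2, 10]   # long tail to the right

print(skewness(symmetric))     # 0.0: symmetric, no skew
print(skewness(right_skewed))  # positive: right (positive) skew
```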
14
Q

Data Organization/Data Display/Data Patterns

A

Graphical analysis creates pictures of the data, which help us understand the patterns and the correlations between process parameters. Graphical analysis is the starting point for any problem-solving method.

  • Control Chart: a graphical display of a quality characteristic that has been measured or computed from a sample, plotted versus the sample number or time
  • Frequency Plots: Frequency plots allow you to summarize lots of data in a graphical manner making it easy to see the distribution of that data and process capability, especially when compared to specifications.
  • Box Plot: Box plot is a pictorial representation of continuous data. In other words, Box plot shows the Max, Min, Median, Interquartile range Q1, Q3, and outlier.
  • Main Effects plot: The main effects plot is the simplest graphical tool to determine the relative impact of a variety of inputs on the output of interest.
  • Histogram: Histogram is the graphical representation of a frequency distribution. It is the form of a rectangle with class intervals as bases and the corresponding frequencies as heights
  • Scatter plot: A scatter analysis is used when you need to compare two data sets against each other to see if there is a relationship
  • Pareto Chart: based on the 80:20 rule; a graphical tool to map and grade business process problems from the most recurrent to the least frequent
15
Q

Basic Probability

A

Basic Six Sigma Probability terms like independence, mutually exclusive, compound events, and more are necessary foundations for statistical analysis.

Probability is the ratio of number of favorable outcomes to the total number of possible outcomes. Probabilities are usually shown in fractions or decimals. The probability ALWAYS lies between 0 and 1. An event is one or more outcomes in an experiment. The probability of an event E indicates how likely that event is to occur.

Probability of an event (E) = number of favorable outcomes/number of possible outcomes

  • Additive law
  • Multiplication law
  • Compound Event
  • Independent Event
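The ratio definition of probability can be sketched in Python with exact fractions (the single-die example is an assumed illustration):

```python
from fractions import Fraction

def probability(favorable, possible):
    """P(E) = number of favorable outcomes / number of possible outcomes."""
    return Fraction(favorable, possible)

# Rolling one fair die: 3 even faces out of 6 possible outcomes
p_even = probability(3, 6)
assert 0 <= p_even <= 1   # a probability ALWAYS lies between 0 and 1
print(p_even)             # 1/2
```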
16
Q

Additive Law

A
  • Aka addition rule
  • The probability of the union of two events. There are two scenarios in the additive law:
    • When two events are not mutually exclusive:
      • When two events A and B are not mutually exclusive, the probability that A or B will occur is the sum of the two events’ probabilities minus the probability that both A and B occur (the intersection). The formula summarizes this:
        • P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
    • When two events are mutually exclusive:
      • When two events A and B are mutually exclusive, the probability that A or B will occur is simply the sum of the two events’ probabilities:
        • P(A ∪ B) = P(A) + P(B)
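Both scenarios can be checked numerically; the deck-of-cards events below are illustrative assumptions:

```python
from fractions import Fraction

# One draw from a standard 52-card deck (illustrative events).

# Not mutually exclusive: "king" and "heart" share the king of hearts.
p_king, p_heart, p_both = Fraction(4, 52), Fraction(13, 52), Fraction(1, 52)
p_king_or_heart = p_king + p_heart - p_both   # P(A U B) = P(A) + P(B) - P(A n B)
print(p_king_or_heart)   # 4/13

# Mutually exclusive: a single card cannot be both a king and a queen.
p_queen = Fraction(4, 52)
p_king_or_queen = p_king + p_queen            # P(A U B) = P(A) + P(B)
print(p_king_or_queen)   # 2/13
```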
17
Q

Mutually Exclusive

A

Statistical term describing two or more events that cannot happen simultaneously.

If they are **mutually exclusive**, they **cannot** happen simultaneously

If they are not mutually exclusive, they can happen simultaneously

18
Q

When Would you Use the Additive Law in a Six Sigma Project?

A

The additive law is essential in probability. It gives us a way to calculate the probability that event A or event B occurs. Which form of the additive law to use depends on whether events A and B are mutually exclusive.

Example Attached

19
Q

Multiplication Law

A
  • A method to find the probability of events occurring at the same time. There are two scenarios in the multiplication law:
    • When events are independent
    • When events are dependent

If events A and B are dependent, the probability of A influences the probability of B. This is known as conditional probability and the sample space is reduced.
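A small sketch of both scenarios in Python (the coin-flip and defective-lot numbers are invented for illustration):

```python
from fractions import Fraction

# Independent events: two coin flips. P(A and B) = P(A) * P(B)
p_two_heads = Fraction(1, 2) * Fraction(1, 2)
print(p_two_heads)   # 1/4

# Dependent events: drawing two defective units without replacement from a
# lot of 10 units containing 3 defects. The sample space shrinks after the
# first draw, so P(A and B) = P(A) * P(B|A).
p_both_defective = Fraction(3, 10) * Fraction(2, 9)
print(p_both_defective)  # 1/15
```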

20
Q

Compound Event

A
  • An event that has more than one possible outcome of an experiment. In other words, compound events are formed by a composition of two or more events.
  • When would you use compound events?
    • Used when the outcomes may have different probabilities but are all possible; events that are chained together in a row.

Example Attached

21
Q

Independent Event

A
  • Events are independent when the outcome of one event does not influence another event’s outcome.
  • When would you use an independent event in a Six Sigma project?
    • Independent events apply in Six Sigma projects where one event does not affect another event’s chance of occurring, e.g. car mileage does not depend on the color of the car.

Example Attached

22
Q

Determine the Process Capability

A
  • Process Capability Analysis
    • Tells us how well a process meets a set of specification limits based on a sample of data taken from a process.
    • The process capability study helps to establish the process baseline and measure the future state performance.
      • Revisit the operational definitions and specify what are defects and which are opportunities.
  • Calculate the baseline process sigma
    • The value in making a sigma calculation is that it abstracts your level of quality enough that you can compare levels of quality across different fields (and different distributions). In other words, the sigma value (or even DPMO) is a universal metric that helps you benchmark against the industry and competitors.
      • Baseline Sigma for Discrete Data
      • Baseline Sigma for Continuous Data
23
Q

Hypothesis Testing

A
  • Key procedure in inferential statistics used to make statistical decisions using experimental data.
  • It is basically an assumption that we make about the population parameter.
  • When using hypothesis testing we create:
    • null hypothesis (H0): the assumption that the experimental results are due to chance alone; nothing (from the 6Ms) influenced our results.
    • alternative hypothesis (Ha): we expect to find a particular outcome.
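A hedged sketch of a two-sided one-sample z-test using only the standard library (the cycle-time numbers and the choice of a z-test are assumptions for illustration; projects often use a t-test instead when the population standard deviation is unknown):

```python
import math

def one_sample_z_test(sample_mean, pop_mean, pop_std, n):
    """Two-sided z-test: H0 says the process mean equals pop_mean;
    Ha says it does not."""
    z = (sample_mean - pop_mean) / (pop_std / math.sqrt(n))
    # Two-sided p-value from the standard normal CDF (expressed via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: historical cycle time 20 min (sigma 4); a sample of 36 runs averages 22.
z, p = one_sample_z_test(sample_mean=22, pop_mean=20, pop_std=4, n=36)
print(round(z, 2), round(p, 4))   # 3.0 0.0027
if p < 0.05:
    print("Reject H0: the shift is unlikely to be due to chance alone")
```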
24
Q

Baseline Sigma for Discrete Data aka Attribute Data

A
  • One way to calculate process capability is through the number of defects per opportunity.
  • The acceptable number to achieve Six Sigma is 3.4 Defects Per Million Opportunities (DPMO)
    • DPO = Defects / (Units * Opportunities)
    • DPMO = DPO * 1,000,000
    • Yield = 1 - DPO (the ability of the process to produce defect-free units)
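The three formulas can be wrapped in a short Python helper (the defect counts below are invented for illustration):

```python
def baseline_discrete(defects, units, opportunities):
    """Return DPO, DPMO, and yield for discrete (attribute) data."""
    dpo = defects / (units * opportunities)   # defects per opportunity
    dpmo = dpo * 1_000_000                    # defects per million opportunities
    process_yield = 1 - dpo                   # ability to produce defect-free units
    return dpo, dpmo, process_yield

# Hypothetical: 34 defects found in 1,000 units with 10 opportunities each.
dpo, dpmo, process_yield = baseline_discrete(defects=34, units=1000, opportunities=10)
print(dpo, dpmo, process_yield)   # DPO 0.0034, DPMO 3,400, yield 99.66%
```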
25
Q

Baseline Sigma for Continuous Data

A
  • Process Capability is the determination of the adequacy of the process with respect to the customer's needs.
    • Process capability compares the output of an in-control process to the specification limits.
    • Cp and Cpk are considered short-term potential capability measures for a process.
26
Q

Cpk

A
  • Cpk is a measure to show how many standard deviations the specification limits are from the center of the process.
    • Cpl = (Process Mean - LSL)/(3*Standard Deviation)
    • Cpu = (USL - Process Mean)/(3*Standard Deviation)
    • Cpk is the smaller value of Cpl and Cpu
      • Cpk = Min (Cpl, Cpu)
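A minimal sketch of the Cpl/Cpu/Cpk calculation in Python (the shaft-diameter data and spec limits are invented for illustration):

```python
import statistics

def cpk(data, lsl, usl):
    """Cpk = min(Cpl, Cpu): distance from the process mean to the nearer
    specification limit, in units of three standard deviations."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)        # sample standard deviation
    cpl = (mean - lsl) / (3 * sd)
    cpu = (usl - mean) / (3 * sd)
    return min(cpl, cpu)

# Hypothetical shaft diameters (mm) with spec limits 9.0-11.0 mm
diameters = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(round(cpk(diameters, lsl=9.0, usl=11.0), 2))   # 2.55
```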
27
Q

Six Sigma Derives…

A
  • Six Sigma derives from the normal or bell curve in statistics, where each interval indicates one sigma or one standard deviation
  • Sigma is a statistical term that refers to the standard deviation of a process about its mean
    • In a normally distributed process, 99.73% of measurements will fall within ±3σ and 99.99932% will fall within ±4.5σ.
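The two coverage figures can be verified with the standard normal CDF, expressed here via `math.erf` (a standard identity, not specific to the source):

```python
import math

def within_sigma(k):
    """Fraction of a normal distribution within ±k standard deviations."""
    return math.erf(k / math.sqrt(2))

print(f"{within_sigma(3):.4%}")    # 99.7300%
print(f"{within_sigma(4.5):.5%}")  # 99.99932%
```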
28
Q

Measure Phase DMAIC Deliverables

A
  • Detailed process map
  • Data collection plan and collected data
  • Results of Measurement system analysis
  • Graphical analysis of data
  • Process capability and sigma baseline