Professional Capability - Evaluating Impact Flashcards

1
Q

What is Evaluation?

A

A multilevel, systematic method for collecting, analyzing and interpreting data to confirm the effectiveness of talent development initiatives. Its efficacy depends on demonstrating the value of the investment.

Evaluation is one way to document whether the investment achieved the desired outcomes. Programs are based on objectives that specify what they must accomplish and in what time frame.

2
Q

Purposes of Evaluation

A

1) determine the business impact, cost-benefit ratio and ROI of the solution
2) objectives: determine whether they were met and how well
3) instructional strategies: assess the effectiveness and appropriateness of the content
4) performance assessment: reinforce learning by using a test or a similar instrument
5) facilitator feedback
6) participant feedback
7) learning retention: assess retention in the on-the-job environment

3
Q

Ralph Tyler Goal Attainment Method

A

The earliest design process incorporating evaluation of learning experiences based on objectives.

4
Q

Ralph Tyler Goal Attainment Method

A

The model poses four questions:

  • What objectives should learners achieve?
  • What learning activities will assist learners to achieve these objectives?
  • How should the curriculum be organized?
  • How should learners' achievement be measured?

It helps to evaluate the success of a curriculum.

5
Q

Measurement Process

A

Measure: a standard used to evaluate the degree of quality of the results of a solution

Measurement: part of the research process; the act of quantifying assessment data and providing the information required to make sound decisions about an issue or situation. Measurements define or quantify specific attributes of an observation.

6
Q

Measurement Process

A

TD professionals should identify the desired outcomes before designing an evaluation plan. To do this, they must gather, summarize and interpret the data generated by the assessment process to determine the root cause.

The root cause could be the process, a lack of resources, a lack of information, a lack of motivation, health-related issues or a need for KSAs (knowledge, skills and abilities). Based on these results, they will determine the best solutions and write objectives.

7
Q

Measurement Process

A

Considerations for selecting the measurement process:
Nature of the solution
Characteristics of the learners
Focus on the outcome

8
Q

Measurement Process

A

The evaluation process:
Use the assessment data to identify evaluation outcomes and goals
Develop an evaluation design and strategy
Select and construct measurement tools
Analyze and report data

9
Q

Output Models

A

A program is a set of resources and activities directed toward one or more common goals, typically under the direction of a single manager or management team.

Program evaluation is the systematic assessment of program results and, if possible, the assessment of how the program caused them.

Results may occur at several levels:
Reaction to the program
What was learned
What was transferred to the job
The impact on the organization
10
Q

Output Models

A

Evaluation includes ongoing monitoring of programs as well as one-time studies of program processes or effects.

Program evaluation assesses the effect of a learning program.
Learning transfer evaluation measures how successfully learners apply what they learned once they are on the job.

11
Q

Formative vs Summative Evaluation

A

Formative Evaluation
It occurs throughout the design of any TD solution.
Purpose of the evaluation is to improve the draft learning program and increase the likelihood that it will achieve the objectives.
TD professionals should use a formative evaluation while the learning program is being developed and use this info to revise the learning program immediately to make it more effective.
During the evaluation, they should ensure the learning program is understandable, accurate, current and functional.
Formative evaluation could include, but is not limited to, pilot tests, beta tests, technical reviews with SMEs, production reviews and stakeholder reviews.

12
Q

Formative vs Summative Evaluation

A

Summative Evaluation:
Occurs after a TD solution has been delivered.
This one focuses on the results or impact of the TD solutions to provide evidence about the value of a program.
Measures participants' reactions, the effect on business goals, the initiative's costs and the stakeholders' expectations.
This evaluation measures the outcome and could include, but is not limited to, standardized tests, reaction forms, stakeholder satisfaction surveys and the final ROI.

13
Q

Kirkpatrick Evaluation- 4 Levels

A

1) Reaction: reaction sheets (smile sheets), word-of-mouth feedback to the instructor, managers or other employees

Measures the degree to which participants find the program favorable, engaging and relevant to their jobs

14
Q

Kirkpatrick Evaluation- 4 Levels

A

2) Learning: evaluates a learner's mastery of the program content; knowledge or performance tests determine participants' ability, along with observation of skills or behavior demonstrated by the learner

Measures the degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment

15
Q

Kirkpatrick Evaluation- 4 Levels

A

3) Behavior: evaluates a learner's transfer of knowledge and skills to the job and the extent to which they have applied what they learned. Comprehensive, continuous performance monitoring based on behaviors; manager assessment, self-assessment, observation

Measures the degree to which participants apply what they learned during the program when they are back on the job

16
Q

Kirkpatrick Evaluation- 4 Levels

A

4) Results: evaluates whether targeted outcomes occur; often viewed as a program's organizational effect
This can be accomplished only if well-defined targeted outcomes are identified prior to designing the program
Productivity measures, cost or expense, employee turnover, engagement

Measures the degree to which targeted outcomes occur as a result

17
Q

Phillips ROI Methodology

A
Six types of collected data
The first four levels are similar to Kirkpatrick's four levels
1) Reaction and planned action
2) Learning
3) Application and implementation 
4) Business impact
5) ROI
6) Intangible measures
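
Level 5 (ROI) and the related benefit-cost ratio are usually reported with two standard formulas: BCR = program benefits ÷ program costs, and ROI (%) = (program benefits − program costs) ÷ program costs × 100. The sketch below shows both calculations in Python; the dollar figures are hypothetical.

```python
# Minimal sketch of the benefit-cost ratio and ROI percentage calculations;
# the benefit and cost figures below are hypothetical.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = program benefits / program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%) = (program benefits - program costs) / program costs * 100."""
    return (benefits - costs) / costs * 100

# Example: a program costing $80,000 that produced $240,000 in monetary benefits
print(benefit_cost_ratio(240_000, 80_000))  # 3.0 -> $3 returned per $1 invested
print(roi_percent(240_000, 80_000))         # 200.0 -> a 200% return on investment
```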
18
Q

Phillips ROI Methodology

A

The methodology includes the use of:

1) standard values: those that are already accepted in the organization
2) historical costs (e.g., the cost of an unexpected absence): what the measures being converted have cost the organization in the past
3) input from internal or external experts on a particular measure
4) participant estimates
5) supervisor and manager estimates
6) links to other measures that have already been converted to monetary value, a technique used when placing a value on customer or employee satisfaction
7) TD staff estimates

19
Q

Phillips ROI Methodology

A

Isolating effects and accountability:
Includes a step that answers the question: How do we know our training is what caused the results?
Using a control group is one way to answer this question, but it is not always feasible.

Some other approaches: trend line analysis, forecasting methods and the use of experts

20
Q

The Brinkerhoff Case Method

A

The SCM (success case method) was developed to assess the impact of organizational solutions. It involves identifying and examining the most and least successful cases in a program.

21
Q

The Brinkerhoff Case Method

A

Five key steps of SCM:

1) focus and plan a success case study
2) create an impact model that defines what success should look like
3) design and implement a survey to search for the best and worst cases
4) interview and document success cases
5) communicate findings, conclusions and recommendations

SCM is a useful approach for documenting stories of impact that can be shared with stakeholders and used to develop an understanding of the factors that enhanced or impeded program success

22
Q

Balanced Scorecard

A

Measures the effectiveness of a solution, initiative or practice from four perspectives

23
Q

Balanced Scorecard

A

Four perspectives:

1) customer perspective: did the solution, initiative or practice meet the customer's needs or expectations?
2) innovation and learning perspective: did users gain the needed skills or knowledge?
3) internal business perspective: did the solution, initiative or practice have an effect back on the job?
4) financial perspective: did the solution have a financial payoff?

24
Q

Evaluation Approaches

A

Cost benefit analysis: measures monetary gains and losses

Culturally responsive evaluation: holistic framework for centering evaluation in culture

Developmental evaluation approaches: useful in complex or uncertain environments such as innovation, radical program redesign or crisis

25
Q

Evaluation Approaches

A

HPT (human performance technology) evaluation: stresses analysis of present and desired levels of performance, identifies the causes of the gap and offers a range of interventions

Lean Six Sigma: combines eliminating waste (lean practices) with quality improvement from Six Sigma

Predictive learning analytics: a systematic methodology for predicting learner outcomes and actions

ROE (return on expectations): a Kirkpatrick approach in which stakeholders identify TD value in terms of how it contributes to their goals
26
Q

Evaluation Approaches

A

Robinsons' Training for Impact: helps to achieve organizational goals, gives people the skills and knowledge required and produces measurable results that can be traced on the job

Six Sigma: a disciplined, data-driven approach to eliminating defects in a process

TQM (total quality management): focuses on improving quality and productivity
27
Q

Qualitative & Quantitative Data

A

Overview of statistics:

Statistics comprises the collection, analysis, display, interpretation and presentation of data.

Sometimes called data analysis, statistics allows data to be organized and summarized in a way that makes it possible to reach a conclusion.

TD professionals can use statistics to document current levels of performance (individual, group or organizational), measure the effect of TD initiatives, and provide data-based feedback for change.
28
Q

Qualitative & Quantitative Data

A

Uses of statistics in TD:

1) summarize large amounts of data
2) determine the relationship between two or more items
3) compare differences in performance
29
Q

Qualitative & Quantitative Data

A

Descriptive statistics: summarize the data numerically or graphically in four ways

1) measures of frequency: show how often something occurs (count, percent, frequency)
2) measures of central tendency: averages; locate the distribution at specific points and are used to show the most common responses (mean, median, mode)
3) measures of dispersion: variation; show the spread of numbers by sorting them into intervals and are used to show how the data are spread (range, standard deviation)
4) measures of position: describe how numbers relate to one another and are used to compare a number to a predetermined norm (percentile rank)
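
The card above names the four kinds of descriptive statistics; the following is a minimal Python sketch of each, computed with the standard library on a hypothetical set of test scores.

```python
# One example of each kind of descriptive statistic, using hypothetical scores.
from collections import Counter
from statistics import mean, median, mode, stdev

scores = [70, 75, 80, 80, 85, 90, 95]

# Measures of frequency: how often each value occurs
print(Counter(scores))

# Measures of central tendency: the most typical values
print(mean(scores), median(scores), mode(scores))

# Measures of dispersion: how spread out the values are
print(max(scores) - min(scores), stdev(scores))  # range, standard deviation

# Measure of position: percentile rank of one score against the group
score = 85
print(sum(s <= score for s in scores) / len(scores) * 100)
```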
30
Q

Qualitative & Quantitative Data

A

Inferential statistics: use analysis to draw inferences about a larger population than was actually sampled and then model the relationships within the data

1) estimation: uses numbers to approximate the data and relate it to the larger population
2) modeling: uses mathematical equations to describe the relationship between two or more variables
3) hypothesis testing: used to determine whether the data support the hypothesis
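
As one illustration of hypothesis testing, the sketch below compares a trained group against a control group with an independent-samples t-test. It assumes SciPy is available, and the scores are hypothetical.

```python
# Did the trained group score higher than the control group, or is the
# difference plausibly due to chance? (Hypothetical data; requires SciPy.)
from scipy import stats

trained = [82, 88, 75, 91, 85, 79, 90]
control = [70, 74, 68, 77, 72, 69, 75]

result = stats.ttest_ind(trained, control)
print(result.statistic, result.pvalue)

# A small p-value (commonly < 0.05) suggests the difference is unlikely
# to be due to chance alone.
if result.pvalue < 0.05:
    print("The groups differ more than chance would explain.")
```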
31
Q

Qualitative & Quantitative Data

A

Additional terms:

Control group: in any study or initiative, the control group is a group that doesn't receive the treatment, benefit or training and serves as a reference point for comparison.

Correlation: the association or relationship between two or more variables.

Data isolation: a data control that determines when and how a change to data made by one action becomes visible to another. The goal is to allow numerous transactions at the same time without their influencing one another.
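
To make correlation concrete, here is a minimal sketch that computes Pearson's r between assessment scores and on-the-job performance ratings. The data are hypothetical, and statistics.correlation requires Python 3.10 or later.

```python
# Correlation between assessment scores and performance ratings (hypothetical data).
from statistics import correlation  # Python 3.10+

assessment_scores = [60, 70, 75, 80, 90]
performance_ratings = [2.5, 3.0, 3.2, 3.8, 4.5]

r = correlation(assessment_scores, performance_ratings)
print(r)  # Pearson's r close to +1 indicates a strong positive relationship
```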
32
Q

Qualitative & Quantitative Data

A

Additional terms (2):

Frequency distribution: a list, table or graph that shows the frequency of numbers or items in a sample. The numbers may be summarized using graphs and summary numerals, and the distribution can show the actual number of observations falling in each range. When it shows the percentage of observations instead, it is called a relative frequency distribution.

Normal distribution: a particular way in which observations tend to cluster around a certain value instead of being spread evenly across a range of values. It generally applies to continuous data and is best described by a bell-shaped curve.
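
A frequency distribution and its relative form are easy to build by hand; the sketch below does both from a hypothetical sample of survey ratings.

```python
# Frequency and relative frequency distributions from hypothetical survey ratings.
from collections import Counter

ratings = [3, 4, 4, 5, 2, 4, 5, 3, 4, 5, 1, 4]

freq = Counter(ratings)                                    # frequency distribution
rel_freq = {k: v / len(ratings) for k, v in freq.items()}  # relative frequency

for value in sorted(freq):
    print(value, freq[value], f"{rel_freq[value]:.0%}")
```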
33
Q

Qualitative & Quantitative Data

A

Additional terms (3):

Outlier: a data point that is far from the others in a data set, meaning it's an unusually large or small value compared with the rest. It might be the result of an error in measurement, in which case it distorts interpretation of the data and has an undue influence on many summary statistics.

Skewness: asymmetry in the distribution of sample data, in which values on one side of the distribution tend to be farther from the middle than values on the other side.
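
One common way to flag potential outliers is to look for values far from the mean in standard-deviation terms; the sketch below uses a simple two-standard-deviation rule on hypothetical data (the cutoff itself is an assumption, not a universal standard).

```python
# Flag values more than 2 standard deviations from the mean (hypothetical data).
from statistics import mean, stdev

values = [48, 50, 51, 49, 52, 50, 47, 95]  # 95 looks unusually large

mu, sigma = mean(values), stdev(values)
outliers = [v for v in values if abs(v - mu) / sigma > 2]
print(outliers)  # [95]
```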
34
Q

Quantitative Methods

A

Yield hard data, which are objective and measurable

Can be stated in terms such as frequency, percentage, proportion and time

Can be used to measure a problem or opportunity numerically and to apply statistical analysis to validate a hypothesis

Provide hard facts for deciding whether the problem is real
35
Q

Qualitative Methods

A

Yield soft data, which are more intangible, anecdotal, personal and subjective

Collect data through focus groups, interviews or sources such as observer notes and survey comments

Data may be difficult to express in numbers, so the analysis is often descriptive

The analysis of qualitative data identifies common themes and atypical data, which are then categorized by specific topics

Knowing how employees feel about a skill (qualitative) is just as important to a program's final design as knowing how well they perform the skill (quantitative)
36
Q

Data Collection

A

Quantitative data sources:

Surveys and questionnaires
Analytics from technology platforms
Exams and assessments
Self-evaluations
Simulations and observations
Archival or extant data (existing records, reports and data)
37
Q

Data Collection

A

Qualitative data sources:

Focus groups
Interviews
Comments from surveys and questionnaires
Notes from observations
Benchmarking
Impact analysis

TD professionals should combine both types of measures in a data collection process to ensure the results are both useful and accurate.
38
Q

Data Planning - Storage

A

Automatic calculation table: the cells in an automatic calculation table contain formulas that extract and calculate information from the data entry table

Automatic chart: analysis of data includes presenting it in various charts, such as bar charts and line charts, to find trends

Data management, security and protection: retain collected data, name spreadsheet files or databases for later retrieval and ensure the system sharing the information is secure
39
Q

What is the mean?

A

The average. It takes into account the quantitative value of each number and equals the sum of all the numbers divided by the number of values that make up the sum.
40
Q

What is the median?

A

The middle of a distribution arranged by magnitude (half the values fall below it, half above). The median is less sensitive to extreme scores than the mean. To determine the median, order the numbers from smallest to largest: in a distribution with an odd number of values the median is the middle number; with an even number of values it is the average of the two middle numbers.
41
Q

What is the mode?

A

The most frequently occurring score in a distribution, which is why it is also used as a measure of central tendency. Because it is subject to sample fluctuations, the mode is not recommended as the only measure of central tendency. Some distributions have more than one mode and are called multimodal.
42
Q

What is the range?

A

The difference between the highest and lowest numbers in the data set.
43
Q

What is the standard deviation?

A

A measure of how far the values in a data set typically fall from the mean, calculated as the square root of the average squared difference between each value and the mean.
44
Q

What is a standard score?

A

A score expressed as the number of standard deviations a value lies from the mean, which allows it to be compared with what is normal for a defined population.
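
The last few cards can be tied together with a short sketch that computes the mean, median, mode, range, standard deviation and a standard (z) score using Python's standard library; the scores are hypothetical.

```python
# Central tendency, spread and a standard score for a hypothetical set of scores.
from statistics import mean, median, mode, stdev

scores = [55, 60, 65, 65, 70, 80, 90]

print(mean(scores))               # average of all values
print(median(scores))             # middle value of the ordered list
print(mode(scores))               # most frequently occurring value
print(max(scores) - min(scores))  # range: highest minus lowest
print(stdev(scores))              # typical distance of values from the mean

# Standard (z) score: how many standard deviations a value sits from the mean
value = 80
print((value - mean(scores)) / stdev(scores))
```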