
Title: T&D Week 9 Evaluating Training
Author: Emma Johnston
Course: Human Resources Management
Institution: George Brown College
Pages: 3
File Size: 146.9 KB
File Type: PDF


TRAINING AND DEVELOPMENT - WEEK 9: EVALUATING TRAINING
What, why, who, when, where, how

Objectives
• Apply techniques to evaluate training
• Produce evaluation materials
• Evaluate alternative approaches to staff development using cost-benefit (analysis of how effective the cost is) and cost-effectiveness (for every $ spent, how much benefit comes back in return) analysis techniques

Outline
• What is evaluation: definitions, types, techniques & tools
• Why evaluate training
• When to conduct the various tasks required for effective evaluation
• Barriers to evaluation

1. What is Training Evaluation?
Definition: The systematic collection & analysis of the descriptive and judgmental information necessary to make effective decisions related to the selection, adoption, value and modification of various instructional activities.

Why Evaluate Training? Three reasons:
1) To determine if training achieved the expected results
2) To improve training
3) To establish the cost-benefits of training

2. Types of Evaluation
1) Summative/Outcomes – when you need to determine if training achieved the desired results
2) Formative/Process – when you want to improve training that will be delivered again
3) Financial – when you need to determine if the benefits of the training are worth the costs


Aspects of Training to Assess
• Trainee reactions & perceptions
• Trainee learning – during & immediately after training
• Trainee behaviour back on the job
• Organizational results
• The process of designing & delivering training


Training Evaluation Models
There are many different approaches to choose from when evaluating training:
• COMA (cognitive, organizational, motivation, attitudinal)
• Decision-based evaluation
• Internal referencing strategy
• CIPP (context, input, process, product)
• Experimental research models (causal evaluation designs)
• Explication model
• Connoisseurship and criticism models
• Kirkpatrick model
• Accountability studies
• …and others

2.1. Summative Evaluation: Kirkpatrick’s Evaluation Model
1) Trainee reactions
2) Trainee learning
3) Trainee job behaviour change
4) Organizational results
…plus, some advocate…
5) Financial benefit OR impact on society
(Kirkpatrick 1967, 1987, 1994)

Level 1 – Reactions
• Affective – participants’ likes and dislikes regarding the training
• Utility – participants’ perceptions of the training’s usefulness
Techniques for gathering reaction data: questionnaires, focus groups, interviews, e-mail, comment cards
Techniques for analyzing reaction data:
• Tabulations, cross-tabulations, basic statistics
• Interpretation, grouping, sorting

Level 2 – Learning
Learning objectives – knowledge, skill, attitude change
Techniques for gathering learning data:
• Tests (written – MC, TF, etc.)
• Demonstrations & explanations
• Discussion, interview
• Self-report instruments (e.g., attitude scales)
Techniques for analyzing learning data:
• Raw scores, percentage correct, basic statistics
• Interpreting, grouping, sorting, coding

Level 3 – Job Behaviour
The degree to which the participant applied new KSAs on the job (training transfer)
Techniques for gathering job behaviour change data: self-report, observation, production indicators, 360° reports, questionnaires, & diary/log of behaviour

Level 4 – Organizational Results
The degree to which the training affected organizational performance
Organizational results data should be:
• The same indicators that were used to establish the need for training in the TNA
• Examples include measures of waste, productivity, customer complaints, voluntary absenteeism, grievances, sales, student performance, customer satisfaction, product returns, etc.

2.2. Formative Evaluation
The text refers to formative evaluation as process evaluation. Conduct formative evaluation when the training will be delivered more than once & you want to improve it.
Gather process data:
• Trainer’s notes & recollection of the actual session vs the plan
• Trainees’ reactions at various points in the session
• Training materials
• Observer’s notes
Analyze the information to determine what works and what doesn’t, and to make decisions about changes to improve any aspect of the training.
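The “tabulations and basic statistics” suggested above for analyzing Level 1 reaction data can be sketched in a few lines of Python. The ratings below are invented for illustration; they stand in for 1–5 responses to a utility-reaction questionnaire item such as “This training will be useful in my job.”

```python
from collections import Counter

# Hypothetical Level 1 (reaction) data: ten trainees' 1-5 ratings
# of a utility item on a post-session questionnaire.
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

# Tabulation: how many trainees gave each rating.
tabulation = Counter(ratings)

# Basic statistics: mean rating and percentage of favourable (4-5) responses.
mean_rating = sum(ratings) / len(ratings)
favourable_pct = 100 * sum(1 for r in ratings if r >= 4) / len(ratings)

print(dict(sorted(tabulation.items())))  # {2: 1, 3: 2, 4: 4, 5: 3}
print(mean_rating)                       # 3.9
print(favourable_pct)                    # 70.0
```

The same tabulate-then-summarize pattern applies to Level 2 learning data (raw scores, percentage correct).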
2.3. Financial Evaluation
• Cost-Benefit: Compares the cost of training with its non-monetary benefits
• Cost-Effectiveness: Compares the cost of training to its financial benefits
  – Cost Savings/Net Benefit: calculation of the $ saved after the cost of training is subtracted
  – Return on Investment: ratio of a training program’s financial benefits to its cost

Training Costs
• Direct costs – course materials, instructional aids, equipment rental, travel, refreshments, trainer’s salary & benefits
• Indirect costs – incurred in support of training activities but cannot be linked to a specific program; these costs would not be recovered if the program were cancelled at the last minute; includes overhead
• Development costs – design of program materials, piloting the program, re-design, front-end assessment, evaluation & tracking
• Trainee costs – salary & benefits paid to participants during the training session
• Which costs to include depends on the organization
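Totalling the four cost categories above is simple arithmetic; a minimal sketch, with every dollar figure invented for illustration:

```python
# Hypothetical cost figures for one program (all amounts illustrative).
direct_costs = {"course materials": 2000, "equipment rental": 1500,
                "travel & refreshments": 3000, "trainer salary & benefits": 6000}
indirect_costs = {"overhead": 2500, "administrative support": 1000}
development_costs = {"program design & materials": 4000, "piloting": 1500}
trainee_costs = {"participant salaries & benefits during training": 8000}

# Total training cost: sum over whichever categories the organization includes.
total_cost = sum(sum(category.values()) for category in
                 (direct_costs, indirect_costs, development_costs, trainee_costs))
print(total_cost)  # 29500
```

In practice an organization might drop a category (e.g., indirect costs) from the tuple, per the note that which costs to include depends on the organization.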

Cost-Benefit
• Use when it is difficult to calculate the financial benefits of training
• Examples include improvements in attitudes, safety, employee well-being, customer satisfaction

Cost Savings/Net Benefit
• AKA net benefit (positive net benefits)
• The $ value of the performance improvement minus the cost of training
• Monetary benefits – organizational outcomes that are easily measured in $ figures

Return on Investment
• ROI = $ benefits / total cost (the amount of benefit or effectiveness gained for the money put in)
• A return of $3 (after the cost of training is subtracted) for every $1 invested in training:
  – As a ratio, ROI is 3:1
  – As a percentage, ROI is 300%
• The greater the ratio of benefits to costs, the greater the benefit the organization receives by delivering the training
• If the ROI ratio is less than 1 (as a percentage, less than 100%), the training costs more than it returns to the organization
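The net-benefit and ROI arithmetic above can be sketched as two small functions. The $40,000 benefit and $10,000 cost are invented figures chosen to reproduce the 3:1 example from the notes, which computes ROI on the benefit left after the training cost is subtracted:

```python
def net_benefit(dollar_benefit: float, training_cost: float) -> float:
    """Cost savings / net benefit: the $ value of the performance
    improvement minus the cost of training."""
    return dollar_benefit - training_cost

def roi_ratio(dollar_benefit: float, training_cost: float) -> float:
    """ROI as the ratio of $ benefits to total training cost."""
    return dollar_benefit / training_cost

# Hypothetical figures: $40,000 of measured benefit, $10,000 of training cost.
benefit, cost = 40_000.0, 10_000.0

nb = net_benefit(benefit, cost)
print(nb)                   # 30000.0 -> $3 returned per $1 invested
print(roi_ratio(nb, cost))  # 3.0     -> ratio 3:1, i.e. 300%
```

A ratio below 1 (below 100%) would signal that, on these figures, the training costs more than it returns.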


Evaluation in the ISD Process
• Decide the purpose (& therefore the type) of evaluation early in the design stage
• Identify the expected user of the evaluation
• Prepare tools to gather evaluation information; decide when to use each
• Gather information throughout training
• If formative, compile & analyze as training continues
• If summative, compile & analyze when training is completed
• If ROI or utility analysis, compile & analyze when training is complete
• Report as appropriate


Conclusion: Barriers to Evaluation
1) Political
• The possibility of poor results threatens players
2) Pragmatic
• Lack of knowledge & skill
• Lack of data
• Evaluating uses scarce resources
• No one in authority asks for...

