Evaluation

Chapter 13

HESC 400 Ransons
General Categories of Evaluation

  • Informal evaluation

characterized by absence of breadth and depth because of the lack of systematic procedures and formally collected evidence (Fitzpatrick et al., 2004)

  • Formal evaluation

characterized by “systematic, well-planned procedures” (Williams & Suen, 1998, p. 308); control of extraneous variables


Definitions

  • Evaluation

“a process of reflection whereby the value of certain actions in relation to projects, programs, or policies are assessed” (Springett, 2003, p. 264)

“the comparison of an object of interest against a standard of acceptability” (Green & Lewis, 1986, p. 362)

  • Standard of acceptability

minimum levels of performance, effectiveness, or benefits used to judge the value (Green & Lewis, 1986)


Types of Standards of Acceptability

  • Mandates of regulating agencies (ex. % of people wearing safety belts or children immunized)
  • Priority population health status (ex. morbidity & mortality rates)
  • Values of a community (ex. curriculum)
  • Standards of professional groups (ex. CHES)
  • Norms from research (ex. body fat)
  • Norms from previous programs (ex. smoking cessation or weight loss expectations)
  • Comparison & control groups (ex. research studies)


Evaluation Terms

  • Process Evaluation
  • Impact Evaluation
  • Outcome Evaluation
  • Formative Evaluation
  • Summative Evaluation


Process evaluation

  • Evaluation done during the implementation of the program.
  • Improving the quality of the program as it is being delivered.
  • Examples:

Finding out if the time the program is held is acceptable.

Are the speakers/presenters effective?

  • Can be done through:

Questionnaire

Focus Group

Observation


Impact evaluation

  • Evaluating short-term, immediate effects of the program.
  • Completed at the end of the program.
  • Measures:

Awareness

Behavior/Attitude changes

Knowledge gain

  • There’s no definitive line between Impact and Outcome Evaluation.


Outcome evaluation

  • Evaluating the long-term effects of the program over time.
  • Requires more resources and time than Impact Evaluation.
  • Measures the ultimate goal of the program
  • Examples:

Reduction in health care costs

Decline in morbidity or mortality

Met the goals and objectives of the program


Formative & Summative Evaluation

  • Formative evaluation:

Any measurements made before or during the implementation of the program.

Improving the quality of the program as it is being delivered.

  • Summative evaluation:

Measurements made at completion of the program.

Effects of the program, benefits, etc.


Comparison of Evaluation

When each type of evaluation occurs, from planning through the end of implementation:

  • Process – during implementation
  • Impact – at the end of the program (short-term effects)
  • Outcome – after the end of the program, over time (long-term effects)
  • Formative – before or during implementation
  • Summative – at the completion of the program


Purpose of Evaluation
(Capwell, Butterfoss, & Francisco, 2000)

  • To determine achievement of objectives
  • To improve program implementation
  • To provide accountability to stakeholders
  • To increase support for initiatives
  • To contribute to the scientific base
  • To inform policy decisions


Evaluation Framework

  • A starting point for tailoring your evaluation


Evaluation Framework

Step 1

Engaging Stakeholders

  • Who are the stakeholders?

Those involved in program operations

Those served or affected (directly or indirectly) by the program

Primary users of the evaluation

  • Why should stakeholders be involved?

Better chance for useful evaluation

Improve credibility

Ethical concerns (e.g., conflict of interest)

Understand those involved

  • How much should stakeholders be involved?


Evaluation Framework

Step 2

Describe the Program

  • Sets frame of reference for evaluation process.
  • Should include:

Missions, Goals, Objectives

Capacity to effect change

How it can fit into an organization or community

  • Usually, logic models are used in this step


Evaluation Framework

Step 3

Focusing the Evaluation Design

  • Interests of the stakeholders are addressed

Using resources and time efficiently

  • Should include:

Purpose of evaluation

Gain insight, assess effect, etc.

Who will be using the evaluation results?

Formulating the evaluation questions

Determining design type


Evaluation Framework

Step 4

Gathering Credible Evidence

  • Should decide and consider:

Measurement indicators

Sources of evidence

Quality and quantity of evidence

Logistics for collecting evidence

  • Discussed in Chapter 5: Measurements, Measures, Data Collecting, & Sampling


Evaluation Framework

Step 5

Justifying Conclusions

  • Comparison of evidence to the standard of acceptability

Interpreting those comparisons

Judging the worth and significance of the program

Actions recommended


Evaluation Framework

Step 6

Ensuring Use and Sharing Lessons Learned

  • Using and disseminating the evaluation results
  • Keep in mind your Stakeholders


Evaluation Framework

Standards

  • Utility – needs are satisfied
  • Feasibility – realistic/affordable
  • Propriety – ethical
  • Accuracy – correct


Practical Problems or Barriers
in Evaluation

  • Fail to plan for evaluation
  • Inadequate resources
  • Organizational restrictions
  • Effects hard to detect; small, slow to appear, don’t last
  • Time allocated to evaluation
  • Restrictions in data collection
  • Difficult to distinguish between cause & effect


Practical Problems or Barriers
in Evaluation

  • Difficult to evaluate multistrategy interventions
  • Conflict between professional standards & do-it-yourselfers over appropriate design
  • Sometimes people’s motives get in the way
  • Stakeholders’ perceptions of the evaluation’s value
  • Intervention not delivered as intended


Evaluation

  • Evaluation must reflect the goals and objectives of the program
  • The evaluation must be planned in the early stages of planning
  • Ethical Considerations

Evaluation should never cause mental, emotional, or physical harm to those in the priority population

Participants should always be informed of the purpose and potential risks


Who will conduct the evaluation?

  • Internal evaluation – Advantages (Fitzpatrick et al., 2004)

More familiar with organization & program

Knows decision making style of organization

Present to remind people of results

Able to communicate results more frequently & clearly

  • External evaluation – Advantages (Fitzpatrick et al., 2004)

More objective; fresh outlook

Can ensure unbiased evaluation outcome

Brings global knowledge

Typically brings more breadth & depth of technical expertise

  • Combination of internal & external


Evaluation Results

  • Who will receive them?
  • In what form will they be delivered?
  • Different stakeholders may want different questions answered
  • The planning for the evaluation should include a determination of how the results will be used.


EVALUATION APPROACHES AND DESIGNS

Chapter 14


Selecting an Evaluation Design

What can be expected from the program?

Determining what is to be evaluated.


Basic Design Decision:
Types of Data

  • Quantitative (deductive; applying principle to case)

Deals with numbers

Data can be transformed into numbers

Analysis largely statistical

Designs with control

  • Qualitative (inductive; examining case to form principle)

Deals with words

Uses interviews & observational techniques

Analysis & reporting mostly narrative


Qualitative Methods Used in Evaluation
(McDermott & Sarvela, 1999)
See Box 14.2, Page 400

  • Case studies
  • Content analysis
  • Delphi technique
  • Elite interviewing
  • Ethnographic studies
  • Film ethnography
  • Focus groups
  • Historical analysis
  • In-depth interviewing
  • Kinesics
  • Nominal group
  • Participant-observer studies
  • Quality circle
  • Unobtrusive techniques


Ways to Integrate Qualitative & Quantitative Methods
(Steckler, McLeroy, Goodman, Bird, & McCormick, 1992)


Participants (Groups) of Evaluation

  • Experimental group – those who receive the intervention
  • Groups used for comparison

Control group – those who do not receive the intervention; have been randomly assigned to the group

Comparison group – those who do not receive the intervention; have not been randomly assigned to the group
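The distinction between a control group (random assignment) and a comparison group (intact, pre-existing groups) can be sketched in code. This is an illustrative sketch, not part of the chapter; the participant pool and the two "sites" are made up:

```python
import random

# Hypothetical pool of 20 program participants.
participants = [f"person_{i}" for i in range(20)]

# Control group design: random assignment decides who receives the
# intervention (experimental group) and who does not (control group).
random.seed(0)  # fixed seed so the split is reproducible
shuffled = random.sample(participants, k=len(participants))
experimental = shuffled[:10]   # receive the intervention
control = shuffled[10:]        # do not receive the intervention

# Comparison group design: intact groups are used instead of random
# assignment (e.g., two existing worksites enrolled as whole units).
site_a = participants[:10]     # receives the intervention
site_b = participants[10:]     # comparison group, not randomized
```

Random assignment is what separates an experimental design (control group) from a quasi-experimental one (comparison group); the intact-group split above leaves any pre-existing differences between the two sites uncontrolled.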


Hierarchy of Possible Designs
(from least control to most control)

  • Pre-experimental

no comparison

  • Quasi-experimental

defined by comparison of intact groups

comparison groups

  • Experimental

defined by comparison after randomization

control groups


There are many different possible designs…


A basic evaluation design:

O1 X O2

  • Where:

O1 = average number of cigarettes smoked in 24 hours; measured via self-report one month prior

X = four-week, eight-session smoking cessation program

O2 = average number of cigarettes smoked in 24 hours; measured via self-report one month after the last session
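The O1 X O2 comparison can be illustrated with a small calculation. The numbers below are hypothetical self-report values, not data from any real study:

```python
from statistics import mean

# Hypothetical cigarettes smoked per 24 hours for the same five
# participants, one month before (O1) and one month after (O2)
# a smoking cessation program (X).
pre = [20, 15, 30, 25, 10]   # O1: baseline observations
post = [12, 10, 22, 15, 8]   # O2: follow-up observations

# The simplest comparison: the mean change from O1 to O2.
changes = [b - a for a, b in zip(pre, post)]
print(f"Mean at O1: {mean(pre):.1f}")       # 20.0
print(f"Mean at O2: {mean(post):.1f}")      # 13.4
print(f"Mean change: {mean(changes):.1f}")  # -6.6
```

Note that this one-group pre-test/post-test design has no control or comparison group, so it cannot rule out threats to internal validity such as history or maturation; the observed change cannot be attributed to X alone.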


Validity (External & Internal)

  • Internal Validity

the degree to which the program (intervention, treatment, independent variable) & not extraneous factors (confounding variables) cause the change that was measured.

  • External Validity

the extent to which the program (intervention, treatment, independent variable) can be expected to produce similar effects in other populations (generalizability).


Threats to Internal Validity

  • History
  • Maturation
  • Testing (e.g., pre-testing)
  • Instrumentation
  • Statistical regression
  • Selection
  • Mortality
  • Diffusion or imitation of interventions
  • Compensatory equalization or rivalry
  • Resentful demoralization
  • Interaction of several threats


Threats (reactive effects) to External Validity

  • Social desirability
  • Expectancy effect
  • Hawthorne effect
  • Placebo effect
  • Multiple X interference

(Figure: generalizing program effects from Priority Population #1 to Priority Population #2)
