Sunday, 18 June 2017

CIPP model
I.                   Use of the Context, Input, Process, and Product Evaluation Model (CIPP Model)

A.    Evaluation of Planning (Connected to Strategic Planning)

1.      Context Evaluation (C): provides information for the development and evaluation of mission, vision, values, goals and objectives, and priorities
a.       purposes

(1)   define the characteristics of the environment
(2)   determine general goals and specific objectives
(3)   identify and diagnose the problems or barriers which might inhibit achieving the goals and objectives

b.      tasks

(1)   define the environment, both actual and desired
(2)   define unmet needs and unused opportunities
(3)   diagnose problems or barriers

c.       methods

(1)   conceptual analysis to define limits of population to be served
(2)   empirical studies to define unmet needs and unused opportunities (see the gap-analysis sketch after this list)
(3)   judgment of experts and clients on barriers and problems
(4)   judgment of experts and clients on desired goals and objectives
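
To make the actual-versus-desired comparison concrete, here is a minimal gap-analysis sketch in Python. The indicator names and values are hypothetical stand-ins for what an empirical study would supply.

# Minimal gap-analysis sketch: compare the actual environment against the
# desired environment to surface unmet needs. All values are hypothetical.

actual = {"retention_rate": 0.78, "advising_contacts_per_term": 1.2}
desired = {"retention_rate": 0.90, "advising_contacts_per_term": 3.0}

# An unmet need is any indicator where the actual environment falls short
# of the desired one; gap sizes help prioritize goals and objectives.
gaps = {name: desired[name] - actual[name] for name in desired}
for name, gap in sorted(gaps.items(), key=lambda item: -item[1]):
    if gap > 0:
        print(f"unmet need: {name} (gap = {gap:.2f})")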

2.      Input Evaluation (I): provides information for the development of program designs through evaluation of databases, internal and external stakeholders' interests, and a WOTS-UP analysis (Weaknesses, Opportunities, Threats, and Strengths).

a.       purposes

(1)   design a program (intervention) to meet the objectives
(2)   determine the resources needed to deliver the program
(3)   determine whether staff and available resources are adequate to implement the program

b.      tasks

(1)   develop a plan for a program through examination of various intervention strategies
(a)    examine strategies for achieving the plan
-          time requirements
-          funding and physical requirements
-          acceptability to client groups
-          potential to meet objectives
-          potential barriers

(b)   examine capabilities and resources of staff

-          expertise to carry out the various strategies
-          funding and physical resources
-          potential barriers

(2)   develop a program implementation plan which considers time, resources, and barriers to overcome (a strategy-scoring sketch follows)
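
One simple way to weigh intervention strategies against the criteria above is a weighted scoring matrix. The sketch below is a minimal Python version; the strategies, weights, and 1-5 ratings are invented for illustration and would in practice come from expert and client judgment.

# Minimal weighted-scoring sketch for comparing intervention strategies.
# Criteria mirror the list above; all names and numbers are hypothetical.

weights = {
    "time": 0.15, "funding": 0.20, "acceptability": 0.20,
    "potential": 0.30, "barriers": 0.15,  # higher rating = fewer barriers
}

ratings = {
    "peer_mentoring":   {"time": 4, "funding": 3, "acceptability": 5, "potential": 4, "barriers": 3},
    "summer_bridge":    {"time": 2, "funding": 2, "acceptability": 4, "potential": 5, "barriers": 2},
    "online_tutorials": {"time": 5, "funding": 4, "acceptability": 3, "potential": 3, "barriers": 4},
}

def weighted_score(strategy_ratings):
    # Sum of (criterion weight x 1-5 rating) across all criteria.
    return sum(weights[c] * r for c, r in strategy_ratings.items())

for name, rs in sorted(ratings.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(rs):.2f}")
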
B.     Evaluation of Implementation (Connected to Tactical Planning)

1.      Process Evaluation (P): provides ongoing evaluation of the implementation of major strategies through various tactical programs, so that the program design can be accepted, refined, or corrected (e.g., evaluation of the recruitment, orientation, transition, and retention of first-year students).

a.       purpose

(1)   provide decision makers with the information necessary to determine whether the program should be accepted, amended, or terminated.

b.      tasks

(1)   identify discrepancies between actual implementation and intended design
(2)   identify defects in the design or implementation plan

c.       methods

(1)   a staff member serves as the evaluator
(2)   this person monitors and keeps data on setting conditions and program elements as they actually occur
(3)   this person gives feedback on discrepancies and defects to the decision makers (a monitoring sketch follows)
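
The monitoring role described above amounts to comparing the intended design against implementation records. Here is a minimal Python sketch, with hypothetical program elements and counts standing in for the evaluator's data.

# Minimal process-monitoring sketch: flag discrepancies between the
# intended design and actual implementation. All values are hypothetical.

planned = {"orientation_sessions": 10, "mentor_meetings": 8, "workshops": 6}
actual  = {"orientation_sessions": 10, "mentor_meetings": 5, "workshops": 7}

# Any difference is reported so decision makers can accept, refine,
# or correct the program design.
for element, intended in planned.items():
    delivered = actual.get(element, 0)
    if delivered != intended:
        print(f"discrepancy: {element} planned {intended}, delivered {delivered}")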

2.      Product Evaluation (P): evaluation of the outcomes of the program, using criteria directly related to the goals and objectives, to decide whether to accept, amend, or terminate the program (e.g., put desired student outcomes into question form and survey students pre- and post-program). Loop back to the original objectives in the Context Evaluation (C) to see if and how they should be changed or modified based on the data.

a.       purpose

(1)   decide to accept, amend, or terminate the program

b.      task

(1)   develop the assessment of the program

c.       methods

(1)   traditional research methods, multiple measures of objectives, and other methods (a pre/post comparison sketch follows)
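
As one example of such measures, the pre/post survey idea above reduces to a paired comparison. The sketch below uses fabricated 1-5 scores and computes the mean gain and a paired t statistic; a real product evaluation would draw on the full range of methods listed.

import statistics

# Minimal pre/post comparison sketch. Each pair is one student's rating
# of a desired outcome before and after the program (fabricated data).

pre  = [2, 3, 2, 4, 3, 2, 3, 2]
post = [3, 4, 3, 4, 4, 3, 4, 3]

diffs = [b - a for a, b in zip(pre, post)]
mean_gain = statistics.mean(diffs)
sd = statistics.stdev(diffs)
t = mean_gain / (sd / len(diffs) ** 0.5)  # paired t statistic

print(f"mean gain = {mean_gain:.2f}, paired t = {t:.2f} (df = {len(diffs) - 1})")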

II.                Developing the Structure for Evaluation Designs: the structure is the same for a Context, Input, Process, or Product Evaluation (a checklist-style sketch follows the outline)

A.    Focusing the Evaluation

1.      Identify the major level(s) of decision-making to be served.
2.      For each level of decision-making, describe the decisions to be served in terms of their main focus, timing, and composition.
3.      Define outcome criteria for each decision situation by identifying the things to be measured and the standards for use in judging the outcomes.
4.      Define policies within which the evaluation needs to operate.

B.     Collection of Information

1.      Specify the source of the information to be collected.
2.      Specify the instruments and methods for collecting the needed information.
3.      Specify the sampling procedure to be employed.
4.      Specify the conditions and schedule for collecting the information.

C.    Organization of Information

1.      Provide the format for the information which is to be collected.
2.      Designate a means for coding, organizing, storing, and retrieving information.

D.    Analysis of Information

1.      Select the analytical procedures to be employed.
2.      Designate a means for performing the analysis.

E.     Reporting the Information

1.      Define the audiences for the evaluation reports.
2.      Specify means for providing the information to the audiences.
3.      Specify the format for evaluation reports and/or reporting sessions.
4.      Schedule the reporting of information.

F.     Administration of the Evaluation

1.      Summarize the evaluation schedule.
2.      Define staff and resource requirements and plans for meeting these requirements.
3.      Specify means for meeting policy requirements for conduct of the evaluation.
4.      Evaluate the potential of the evaluation design for providing information which is valid, reliable, credible, timely, and pervasive.
5.      Specify and schedule means for periodic updating of the evaluation design.
6.      Provide a budget for the total evaluation program.
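
Since the same six-part structure serves every evaluation type, it can be captured as a simple reusable record. Here is a checklist-style sketch in Python; the field contents are placeholders.

from dataclasses import dataclass

# Minimal sketch of the evaluation-design structure as a reusable record.
# Field names mirror parts A-F above; the example contents are placeholders.

@dataclass
class EvaluationDesign:
    focus: list            # A. decision levels, criteria, policies
    collection: list       # B. sources, instruments, sampling, schedule
    organization: list     # C. format, coding, storage, retrieval
    analysis: list         # D. procedures and means of analysis
    reporting: list        # E. audiences, means, format, schedule
    administration: list   # F. schedule, staffing, policy, budget

# The same record works for a Context, Input, Process, or Product design.
design = EvaluationDesign(
    focus=["first-year retention decisions"],
    collection=["pre/post student survey"],
    organization=["coded spreadsheet"],
    analysis=["paired comparison of pre/post means"],
    reporting=["annual report to the planning committee"],
    administration=["one staff evaluator; budget to be determined"],
)
print(design.focus)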

III.             Relationships Among Program Planning, Evaluation, and Decision-Making

Program Planning Stage                      Evaluation Type     Decision Type

1. Assessment Planning                      Context Eval.       Planning Decisions
2. Develop Goals and Objectives             Context Eval.       Planning Decisions
3. Design a Program                         Input Eval.         Procedural Decisions
4. Design and Implement                     Input Eval.         Procedural Decisions
5. Pilot Run the Program                    Process Eval.       Implementation Decisions
6. Evaluate the Pilot Run                   Process Eval.       Recycling Decisions
7. Adopt, Amend, or Drop the Program        Product Eval.       Recycling Decisions
8. Institutionalize the Program             Process Eval.       Implementation Decisions
9. Evaluate the Institutionalized Program   Product Eval.       Recycling Decisions
