management oriented evaluation approaches


sinem yılmaz

on 30 May 2014


Transcript of management oriented evaluation approaches

This approach to evaluation in education is meant to serve decision makers.
Developers of the management oriented evaluation approach and their contributions:
The most important contributions have been made by Stufflebeam and Alkin.
This approach clarifies who will use the evaluation results, how they will use them, and what aspects of the system they are making decisions about.
Decision makers are the audience; their concerns, informational needs, and criteria for effectiveness guide the study.
Both Stufflebeam and Alkin make the decisions of the program managers the pivotal organizer for the evaluation.
In the model proposed by both theorists, the evaluator, working closely with the administrator, identifies the decisions the administrator must make and then collects sufficient information about the relative advantages and disadvantages of each decision alternative to allow a fair judgment based on specified criteria.
Decisions are made about inputs, processes, and outputs.
•Context Evaluation - planning decisions
Asks "What should we do?"
Collects data to determine needs and define the objectives of the program
•Input Evaluation - structuring decisions
Asks "How should we do it?"
Involves determining the steps and resources needed to accomplish the goal or objective
•Process Evaluation - implementing decisions
Asks "Are we doing it as planned?"
Evaluates how well the plan has been implemented
•Product Evaluation - recycling decisions
Allows the decision maker to compare actual outcomes to anticipated outcomes and determine whether the program should be continued, modified, or dropped
As a logical structure for designing each type of evaluation, Stufflebeam proposed that evaluators follow these steps:
–Focusing the Evaluation
1) identify the major level of decision making to be served
2) for each level of decision making, project the decision situations to be served and describe each one in terms of its locus, focus, criticality, timing and composition of alternatives
3) define criteria for each decision situation by specifying variables for measurement and standards for use in the judgment of alternatives
–Collection of Information
1) specify the source of the information to be collected
2) specify the instruments and methods for collecting the needed information
3) specify the sampling procedure to be employed
4) specify the conditions and schedule for information collection
–Organizing Information
1) provide a format for the information to be collected
2) designate a means for coding, organizing, storing, and retrieving the information
–Analysis of Information
1) select the analytical procedures to be employed
2) designate a means for performing the analysis
–Reporting of Information
1) define the audiences for the evaluation reports

2) specify the means for providing information to the audiences

3) specify the format for evaluation reports and/or reporting sessions
4) schedule the reporting of information
–Administration of the Evaluation
1) summarize the evaluation schedule
2) define staff and resource requirements and plans for meeting these requirements
3) specify means for meeting policy requirements for conduct of the evaluation
4) evaluate the potential of the evaluation design for providing information that is valid, reliable, credible, timely, and pervasive
5) specify and schedule means for periodic updating of the evaluation design
6) provide a budget for the total evaluation program

The UCLA evaluation model:
Alkin's model includes five types of evaluation:
1–Systems assessment, to provide information about the state of the system.

2–Program planning, to assist in the selection of particular programs likely to be effective in meeting specific educational needs

3–Program implementation, to provide information about whether a program was introduced to the appropriate group in the manner intended

4–Program improvement, to provide information about how a program is functioning, whether interim objectives are being achieved, and whether unanticipated outcomes are appearing
5–Program certification, to provide information about the value of the program and its potential for use elsewhere
Even though both the CIPP and UCLA frameworks may seem linear and sequential, this is not the case. For example, the evaluator does not have to complete a context or input evaluation in order to undertake one of the other types of evaluation listed in the framework.
Development of the CIPP model
Context - Stufflebeam advanced the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an educational program or activity.
Input- Reinhard developed a guide for use in input evaluation called the “advocate team technique”. It is used when acceptable alternatives for designing a new program are not available or obvious.
Process - Cronbach provided useful suggestions for the conduct of process evaluation.

Other management oriented evaluation approaches:

•Provus’s Discrepancy Evaluation Model

–Described as an objectives-oriented evaluation model but has parallels to management-oriented evaluation models
– It is systems-oriented
- Focuses on input, process, and output at each of the 5 stages of evaluation: program definition, program installation, program process, program products and cost benefit analysis

•The Utilization-Focused Evaluation Approach of Patton
- The process of identifying and organizing relevant decision-makers and information-users is the first step in evaluation
- Asserts that the use of evaluation findings requires that decision makers determine what information is needed by various people and arrange for that information to be collected and provided to those persons
How the Management Oriented Evaluation Approach Has Been Used
The management oriented approach to evaluation has guided educators through program planning, operation, and review.
This evaluation approach has also been used for accountability purposes. It provides a record keeping framework that facilitates public review of educational needs, objectives, plans, activities, and outcomes.
Two uses of the CIPP model
Strengths of Management Oriented Evaluation Approach
It gives focus to the evaluation
It stresses the importance of the utility of information

It is instrumental in showing evaluators and educators that they need not wait until an activity or program has run its course before evaluating

Preferred choice in the eyes of most school administrators and boards
Helps the evaluator generate potentially important questions
Easy to explain to lay audiences

Supports evaluation of every component of a program as it operates, grows, or changes

Stresses the timely use of feedback by decision makers

Weaknesses of Management Oriented Evaluation Approach

•Evaluator’s occasional inability to respond to questions or issues that may be significant
•Programs that lack decisive leadership are not likely to benefit from this approach to evaluation
•Preference is given to top management
•Can make the evaluator the “hired gun” of the program establishment
•Evaluation can become unfair and possibly even undemocratic

If followed in its entirety, the management oriented approach can result in costly and complex evaluations
Management oriented evaluators need to be realistic about what work is possible and should not promise more than can be delivered.
This evaluation approach assumes that the important decisions can be clearly identified in advance, that clear decision alternatives can be specified, and that the decisions to be served remain reasonably stable while the evaluation is being done. All of these assumptions about the orderliness and predictability of the decision making process are suspect and frequently unwarranted.
The CIPP Evaluation Model
Stufflebeam developed an evaluation framework to serve managers and administrators facing four different kinds of educational decisions (context, input, process, and product).
The model includes continuously monitoring the program.
It provides an idealized view of the program being evaluated by describing how the program or process should work rather than giving a realistic portrayal.