Beth Peddle, 25 February 2014

Transcript of CAROL H. WEISS

Many Influences
Work Experience
Scholarly Influences
Worked on
War on Poverty
Published key Evaluation books
Ph.D. (Soc) Columbia
Hired, Harvard Grad School of Education
Beatrice B. Whiting Professor of Educ.
WEISS, Carol (Hirschon) Died on Tuesday, January 8, 2013 at the age of 86 of complications of heart disease and cancer. She was the Beatrice Whiting Professor Emerita at the Harvard University Graduate School of Education, where she had worked since 1978. She was born and raised in New York, graduated from Cornell University, and earned an M.A. and Ph.D. at Columbia University.

Her professional career was devoted to the study of evaluation research, where she was one of the founders of the scientific approach to evaluation of social programs. At the Bureau of Applied Social Research at Columbia University, and later at Harvard, she also conducted groundbreaking studies of the use of social research in making public policy decisions. With many colleagues and students, she created a field of study of research utilization.

For her professional accomplishments, she won many honors and awards, including fellowships at the Center for Advanced Study in the Behavioral Sciences, the Brookings Institution, and the Myrdal Award from the Evaluation Research Society. Through her many books, articles, and training, she helped to create a professional cadre of evaluators who knew how to use scientific methods. She traveled and consulted widely, and was influential in encouraging the use of evaluation by dozens of U.S. government agencies, international agencies, and governments in Africa, Asia, Australia, and Europe, in fields as diverse as international development, public health, crime and delinquency, agriculture, mental health, and education.

After her retirement in 2006, she became a docent at the Boston Museum of Science. Carol is survived by her husband of 64 years, Malcolm A. Weiss, by her children Daniel, Judith, and Janet, her son-in-law Donald Kinder, and by four grandchildren, Benjamin, Samuel, Jacob, and Delilah. She also leaves scores of devoted friends, former students, and colleagues...
Many publications!
- Her parents: "their social and political conscience was transmitted to me from my earliest days" (Alkin, 2004, p. 163)
- Public school education in New York during the 1930s and 1940s
- A liberal arts bachelor's degree and a master's degree in political science
- Mid-1960s she evaluated a Harlem-based training project as part of Lyndon Johnson's "War on Poverty"
- The resultant 3-volume report did not even receive a response
- To evaluate the programs, Weiss had to teach herself; she began writing guides for future evaluators and working on her Ph.D.
(Graff & Cristou, 2001)
Influenced by:
- Social scientists of 20th century:
Jim March, Ed Lindblom, Lee Cronbach
- Campbell's work on experiments
- House, Fetterman, Patton, and especially Rossi as a comprehensive resource
... as applied research
- most important part of evaluation is the choice of, and adherence to, research methods
- not necessarily quantitative; qual is good too
- many types of evaluation; all are research
- research methods are the evaluator's expertise: what they bring to the evaluation
Evaluation Research: Methods for Assessing Program Effectiveness (1972); Evaluation: Methods for Studying Programs and Policies (1998)
... as seeking objectivity
- complete neutrality/objectivity is impossible
- evaluators' beliefs and biases affect everything
- "there is such a thing as more or less objectivity"
- it's important to suspend judgement, keep an open mind, and collect and analyze data systematically; aim for objectivity throughout!
Truth tests and utility tests: Decision-makers' frames of reference for social science research (1980); Nothing as practical as good theory: Exploring theory-based evaluation for comprehensive community initiatives for children and families (1995)
... as professional practice
- professionalism involves hearing all relevant viewpoints; don't focus only on the "sponsor"
- ensure marginalized groups have a voice
- evaluation takes time; rushing is a disservice
- "take the time and use professional skills to research conclusions that are likely to stand up to critical scrutiny" (p. 156)
The Stakeholder Approach to Evaluation: Origins and Promise (1983); Interview with Karen Horsch
... as acknowledging its limits
- "evaluators have a responsibility to point out the limitations in their work" (p. 156)
- "Overpromising" to get the contract is damaging for the evaluator, the client and the field of evaluation
See examples in methods.
... as enmeshed in politics
- evaluation is a political activity in 3 ways:
1. programs are born through political process
2. evaluation results are used in decision-making and policy-making (often political)
3. evaluation itself expresses a political stance on any given program: it implies (a) that the program is worth evaluating, and (b) that it may be flawed enough to need one
Where politics and evaluation research meet (1993); The fairy godmother and her warts: Making the dream of evidence-based policy come true (2008)
... and program theory
- evaluation often identifies what works and what doesn't, but not WHY this is the case
- program theory helps the evaluator infer/deduce the WHY when RCTs are impossible
- to generalize learning from any evaluation, lessons must connect to the program's interventions and assumptions
Theory-based evaluation: Past, present and future (1997); Which links in theories shall we evaluate? (2000)
... and "cumulation of evaluation evidence"
- we should build the field of research on evaluation
- use meta-analysis to build on evaluation conclusions and find patterns in successes
- by identifying underlying assumptions, evaluators can review a wide range of seemingly diverse programs (that share those assumptions)
How Can Theory Based Evaluation Make Greater Headway? (1997); What kind of evidence in evidence-based policy? (2001)
... use!
- evaluation's main purpose = USE
- rare that use is tangible or immediate; special cases:
(1) results provide support for managers' beliefs
(2) organization in crisis turns to eval for new direction
(3) a new administration is very receptive to critique/innovation
- more often a long-term percolation of ideas from evaluation into discourse: "enlightenment"
decision accretion
Weiss coined the term, meaning "the build up of small choices, the closing of small options and the gradual narrowing of available alternatives"
Have we learned anything new about the use of evaluation? (1998); Using social research in public policy making (1977); The many meanings of research utilization (1979)
Also, she's a role model
- As a woman, she was a minority among Ph.D. graduates, academics, and evaluation theorists
- Even now, there are fewer female theorists, though that seems to be gradually changing
(e.g. Preskill, Greene, Stame)
- She incidentally ensured this was not a "men only" field (as many fields still were in the 60s, 70s, and 80s)
Google Scholar lists these six as her most cited publications
1. Methods for Assessing Program Effectiveness (1972)
- Book, cited 1586 times
2. The Many Meanings of Research Utilization (1979)
- Article, cited 1170 times
3. Using Social Research in Public Policy Making (1977)
- Book, cited 670 times
4. Social Science Research and Decision-Making (1980)
- Book, cited 623 times
5. Nothing as practical as good theory: Exploring theory-based evaluation for comprehensive community initiatives for children and families (1995)
- Article, cited 599 times
6. Knowledge Creep and Decision Accretion (1980)
- Article, cited 566 times
General Critique:
- Concentration on quantitative methods; she studied and published very little about qualitative methods
- She concentrated on public policy contexts; some of her conclusions applied less to local contexts
- Her discussions of use understate the importance of process use
- These critiques are largely due to the time period in which she worked; qualitative methods and process use have been greatly developed in the last two decades
(Smith & Chircop, 1989)
Promoting the use of rigorous research methodology in evaluation
Promoting and furthering evaluation however possible (methods, legitimacy, credibility, and use)

Research on the relationship between evaluation and decision-making.
- Guides for using evaluations in decision-making
- Research on how decisions are made (i.e. decision accretion), and how evaluation conclusions feed into them
Those referenced in the contribution slides, and:

Alkin, M. C. (2004). Evaluation Roots: Tracing Theorists' Views and Influences. Thousand Oaks, CA: SAGE Publications.
Alkin, M. C. (2012). Evaluation Roots: A Wider Perspective of Theorists' Views and Influences. Thousand Oaks, CA: SAGE.
Carden, F., & Alkin, M. C. (2012). Evaluation roots: An international perspective. Journal of MultiDisciplinary Evaluation, 8.
Dale, S. (n.d.). In Conversation: Carol Weiss and Evert Lindquist on policymaking and research. Retrieved from http://www.idrc.ca/
Horsch, K. (1998). Questions & Answers: Interview with Carol H. Weiss. The Evaluation Exchange, 4(2), 5-6.
Smith, N. L., & Chircop, S. (1989). The Weiss-Patton debate: The illumination of the fundamental concerns. American Journal of Evaluation, 10(1), 5-13.