
Planning and Evaluation
Spring Semester 2015
27:202:529:01
Class time: Monday, 6:00pm – 8:40pm
Location: Center for Law and Justice, Room CLJ 572; SCJ Computer Lab
Instructor: Professor Joel Miller
Instructor’s Office: Room 549, Center for Law and Justice, 123 Washington St.
Office Hours: Monday 4pm–6pm, by appointment
Email: joelmi@rutgers.edu
Course description
Planning and Evaluation is the second part of a two-course sequence, along with Problem Analysis,
focused on the development of knowledge and skills relating to research methods and statistics.
Together, the courses give students the skills to conduct research on crime and criminal justice
problems, develop appropriate policy responses, and evaluate the success of those responses.
These skills are relevant to the work of analysts, practitioners and managers within criminal justice
and social service agencies, as well as researchers and scholars.
The first course examined ways to obtain, analyze, and use data to identify and describe
“problems” that could be the target of interventions. This follow-up course focuses on how to
design interventions to address problems identified and described through problem analysis. The
course tries to avoid “cookie-cutter” or “one-size-fits-all” programs, instead focusing on tailored
targeted problem responses. The course also teaches students how to monitor and evaluate
interventions to assess whether they are implemented as planned, and whether they produce
desired (or undesired) outcomes. In doing so, it emphasizes practical, real-world, budget-conscious
approaches to evaluation, alongside the kinds of high-budget experimental evaluations that are
prized in conventional academic research literature.
Course outcomes
At the end of this course, students will be able to:
• Design interventions to address specific problems
• Craft these interventions according to the results of empirical problem analyses and prior evaluation literature
• Target interventions at problem concentrations
• Create detailed and realistic plans for the implementation of interventions
• Create systems of performance monitoring
• Plan outcome evaluations
• Design process evaluations
• Conduct and interpret multivariate analyses, using OLS and logistic modeling strategies
Texts
The course will rely on one textbook that was previously used in Problem Analysis:
Maxfield, M.G., and Babbie, E. (2012). Basics of Research Methods for Criminal Justice and
Criminology, 3rd Edition. Wadsworth Publishing.
Other readings will be distributed via Blackboard on a week-to-week basis.
Student assessment
A) Attendance and general participation (10%)
Students are required to attend class, complete readings, participate in and lead discussion.
Authorized absences will only be granted when requested in advance, and when supported by
medical or other official documentation. Unauthorized absences will negatively impact the course
grade.
B) Group-based project (10%)
Throughout the course, students will work in groups within the classroom to develop a strategy to
help address the problem identified by the class last semester. The development of strategies will
include plans for an outcome evaluation. Students will submit an individually-written final paper
articulating (in their own words) the conclusions of this planning exercise. (Final individual
write-up due 19 April.)
C) Planning and evaluation project (30% overall)
Throughout the course, students will work to design their own intervention to address a particular
problem (ideally the one analyzed last semester), and develop plans for evaluating the
implementation of the intervention and its impacts. This work will be assessed according to the
following deliverables:
1. Problem statement. A one-page (re-)statement of a specific problem, why it is of interest,
and some of its key characteristics. Include at least two scholarly references. Ideally, this
will build off your prior problem analysis project. Remember to be very specific about the
type of problem and its geographic scope. (pass/fail – no percentage; due 15 Feb)
2. Intervention plan. A minimum eight-page description of the student’s own designed
intervention, including details of mechanisms and targeting, along with a logic model. The
intervention description should ideally be grounded in a prior problem analysis, or at least
in some basic background reading on the issue if a problem analysis is not available.
Additionally, it should be grounded in existing literature: students will need to research
and review at least five evaluation studies (or other policy studies if formal evaluations
are not available) on interventions relevant to their problem, drawing practical lessons
from the literature rather than just summarizing it. (15%, due 26 Apr; more guidance to
follow)
3. Evaluation plan. A minimum six-page plan for the evaluation or assessment of each student’s
intervention. This should include both outcome and process elements, and should clearly
explain their design, including the measures used and sampling considerations. (15%, due
10 May)
D) Statistical assignments (20% overall).
1. Advanced OLS regression assignment. This will require a multivariate analysis using OLS
regression, conducted in SPSS. (10%, due 29 Mar)
2. Logistic regression assignment. This will require a multivariate analysis using logistic
regression for a binary dependent variable, again using SPSS. (10%, due 12 April)
E) Final exam (30%, provisional date 4 May)
The final exam will require written answers on topics and methods taught across the entire
semester.
Final grades will be awarded according to the following rubric:
A  = 90.0% – 100.0%
B+ = 85.0% – 89.9%
B  = 80.0% – 84.9%
C+ = 75.0% – 79.9%
C  = 70.0% – 74.9%
F  = 60.0% – 69.9%
Class schedule
The following schedule provides a guide to the topics covered, week-by-week. The precise
scheduling, topic coverage or reading may be adjusted as the course proceeds.
****PLEASE CHECK BLACKBOARD FOR THE DEFINITIVE GUIDANCE FOR WEEK-TO-WEEK
READING AND PREP****
PART A: INTRODUCTION
Week 1 (26 Jan) – Course overview
The first class will provide an overview of the course. It will introduce key concepts, and provide a
brief review of key lessons from last semester’s Problem Analysis that will form a foundation for
the current course.
Week 2 (2 Feb) – Learning from examples of intervention planning and evaluation
This class will review examples of interventions, examining the role that planning played in each
example. It will also review examples of the process by which problem analysis leads to
interventions and, subsequently, interventions are evaluated.
Orange County (1994)
Sallybanks (2001) – Executive summary and Chapter 3
Miller, Bland and Quinton (2000) and Miller (2010) [these studies are related]
Poyner (1994)
PART B: OUTCOME EVALUATIONS
Week 3 (9 Feb) – The nature of outcome evaluations
In this class, we consider the core principles of outcome evaluation, and review common types of
evaluation design.
Fratello et al. (2013)
Eck (2004) – Pages 1-11 and 18-34, Appendices A and B
Eck (2002)
Sherman et al. (1998) – pay attention to pages 4-6 (Maryland Scientific Methods Scale)
Farrington and Ttofi (2009) – especially Executive summary, Chapters 1 and 7
Sunday 15 Feb – individual project problem statement due
Week 4 (16 Feb) – Randomized experiments
This class examines the logic and challenges of using randomized experimental methods for
assessing the outcome of an intervention, often considered the “gold standard” for evaluation.
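To make that logic concrete, here is a minimal sketch in Python (not part of the assigned work; the design, data and effect size below are entirely hypothetical, and the course exercises themselves use SPSS). Random assignment makes the two groups comparable in expectation, so a simple difference in means estimates the intervention’s effect:

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical example: 200 crime hot spots, half randomly assigned
    # to receive an intervention.
    n = 200
    treated = rng.permutation(n) < n // 2  # random assignment

    # Simulated monthly incident counts: a noisy baseline around 20,
    # plus an assumed treatment effect of -4 incidents.
    outcome = rng.normal(20, 5, size=n) - 4 * treated + rng.normal(0, 3, size=n)

    # Because assignment was random, the difference in group means is
    # an unbiased estimate of the average treatment effect.
    effect = outcome[treated].mean() - outcome[~treated].mean()
    print(f"Estimated treatment effect: {effect:.2f} incidents")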
Maxfield and Babbie (2012) – pp. 105-116
Eck and Wartell (1999)
Sherman and Berk (1984)
Farrington and Welsh (2005)
Pawson and Tilley (1998)
Week 5 (23 Feb) – Quasi-experiments
This class examines non-randomized experiments and other higher and lower-quality methods for
assessing the outcome of an intervention.
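By way of illustration (with made-up numbers, not drawn from the readings), the simplest non-equivalent comparison group design boils down to a “difference-in-differences” calculation:

    # Hypothetical burglary counts before and after an intervention.
    treated_before, treated_after = 120, 80         # intervention area
    comparison_before, comparison_after = 110, 100  # comparison area

    # The comparison area's change proxies for what would have happened
    # in the intervention area anyway (the counterfactual).
    change_treated = treated_after - treated_before           # -40
    change_comparison = comparison_after - comparison_before  # -10

    # The difference between the two changes is the estimated effect.
    effect = change_treated - change_comparison
    print(f"Estimated effect: {effect} burglaries")  # prints -30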
Clarke and Eck (2005) – Step 47
Maxfield and Babbie (2012) – pp. 116-128
Braga et al. (2001) (Part 2)
Duguid and Pawson (1998)
Tuffin et al. (2006) – Exec summary and Chapter 1
PART C: ADVANCED TOPICS IN STATISTICS
Week 6 (2 Mar) – Advanced OLS regression 1
In this class, we will learn how to incorporate and interpret multiple independent variables within
OLS regression. This introduces the idea of working to “control for” third variables. The class will
also introduce the use of dummy variables within OLS.
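The class exercises are in SPSS, but the same idea can be seen in a minimal parallel sketch in Python’s statsmodels (all data and variable names below are invented, for illustration only):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical neighborhood-level data.
    df = pd.DataFrame({
        "crime_rate":   [42, 35, 58, 29, 61, 44, 38, 52, 47, 33],
        "poverty_rate": [18, 12, 25, 10, 28, 20, 15, 24, 21, 11],
        "urban":        [1, 0, 1, 0, 1, 1, 0, 1, 1, 0],  # dummy: 1 = urban
    })

    # Each coefficient is the association between that predictor and
    # crime, holding the other predictor ("third variable") constant.
    model = smf.ols("crime_rate ~ poverty_rate + urban", data=df).fit()
    print(model.summary())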
Stats
Weisburd and Britt (2014) – Chapter 16
SPSS
Pallant (2011) – Chapter 13
Video
https://www.youtube.com/watch?v=f8n3Kt9cvSI
https://www.youtube.com/watch?v=Nc-0QdQk01s
Week 7 (9 Mar) – Advanced OLS regression 2
In this class, we will continue to examine how to incorporate and interpret multiple independent
variables into OLS regression. The class will also introduce the use of transformed variables and
interaction terms for use within OLS.
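Again purely for illustration (SPSS is the tool used in class; the data and variables below are invented), transformations and interaction terms can be written directly into a regression formula:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "income": rng.lognormal(10, 0.5, 200),  # skewed, so logged below
        "age":    rng.integers(18, 65, 200),
        "male":   rng.integers(0, 2, 200),      # dummy variable
    })
    df["offenses"] = (5 - 0.3 * np.log(df["income"]) + 0.02 * df["age"]
                      + rng.normal(0, 1, 200)).clip(lower=0)

    # np.log() transforms the skewed predictor; "age * male" expands to
    # age + male + age:male, letting the age slope differ by sex.
    model = smf.ols("offenses ~ np.log(income) + age * male", data=df).fit()
    print(model.params)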
Stats
Weisburd and Britt (2014) – Chapter 17
SPSS
Pallant (2011) – Chapter 13
Video
https://www.youtube.com/watch?v=l3Aoikhaxtg
Week 8 (16 Mar) – SPRING BREAK
Week 9 (23 Mar) – Logistic regression
This class will examine the use of logistic regression for binary categorical dependent variables. It
will examine the mathematical logic that underpins these models, and practical strategies for
interpreting model results.
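As a pointer to that logic: the model is linear in the log-odds, log(p / (1 - p)) = b0 + b1x1 + ..., so exponentiating a coefficient gives an odds ratio. Below is a minimal, entirely hypothetical sketch in Python’s statsmodels (class exercises use SPSS):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    df = pd.DataFrame({
        "prior_arrests": rng.poisson(2, 300),
        "employed":      rng.integers(0, 2, 300),
    })
    # Simulate a binary outcome from assumed (made-up) coefficients.
    log_odds = -0.5 + 0.6 * df["prior_arrests"] - 0.8 * df["employed"]
    df["rearrested"] = (rng.random(300) < 1 / (1 + np.exp(-log_odds))).astype(int)

    model = smf.logit("rearrested ~ prior_arrests + employed", data=df).fit()
    # exp(b) is an odds ratio: the multiplicative change in the odds of
    # rearrest for a one-unit increase in the predictor.
    print(np.exp(model.params))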
Stats
Weisburd and Britt (2014) – Chapter 18
SPSS
Pallant (2011) – Chapter 14
Video
https://www.youtube.com/watch?v=Ak_t86zm_sQ
Sunday 29 Mar – OLS regression assignment due
PART D: DESIGNING AN INTERVENTION
Week 10 (30 Mar) – Articulating intervention mechanisms
In developing a policy or program, it is important to have an idea of how your intervention is going
to produce the outcomes that you want. An important part of this is having a clear understanding of
the mechanism that the program will use to achieve its outcomes.
Tilley (1993) – especially chapter 2
Veldhuis (2012)
Morgan et al. (2012) p12-19, Appendix A1
Skogan et al. (2008) - especially 1-1 to 1-12
Miller et al. (2000) – Pages 18-21
Clarke and Eck (2005) - 39-43
Week 11 (6 Apr) – Targeting an intervention
In Problem Analysis, we learned about the 80:20 rule. This insight helps us target interventions: if
we know where, when and among whom the problem is concentrated, we can often get better
results by targeting these concentrations, instead of spreading resources evenly across space,
time and people. This class also examines ways to identify where problems will tend to be
concentrated in the future, using “risk assessment” approaches.
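To illustrate what such a concentration looks like in data, here is a minimal sketch (hypothetical counts; class analysis work uses SPSS) that asks what share of incidents falls in the top 20% of places:

    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical, skewed incident counts across 500 addresses.
    incidents = rng.poisson(1.5, 500) ** 2

    counts = np.sort(incidents)[::-1]    # busiest addresses first
    top20 = counts[: len(counts) // 5]   # the top 20% of places
    share = top20.sum() / counts.sum()
    print(f"Top 20% of addresses account for {share:.0%} of incidents")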
Weisburd et al. (1996)
Walker et al. (2001)
Forrester et al. (1988) – Phase 1 only
Caplan et al. (2009)
Fratello et al. (2011)
Sunday 12 Apr – Logistic regression assignment due
Week 12 (13 Apr) – Using “logic models”
This class looks at the process of putting intervention plans on paper, and articulating clearly the
inputs, tasks, activities, mechanisms and expected outcomes. A key tool for doing this, and the
focus of this class, is the logic model, which also provides a framework for monitoring and
evaluation.
McCawley (1997)
Kellogg Foundation (2004), especially chapters 1 and 2
Kaplan and Garrett (2005)
Savaya and Waysman (2005)
Chen et al. (1999)
Sunday 19 Apr – Group project individual write-up due
PART E: OTHER MONITORING AND EVALUATION STRATEGIES
Week 13 (20 Apr) – Process evaluations
In this week’s class, we examine the steps that are taken to assess how programs are actually
implemented, whether they follow their plans, and whether they encounter unexpected problems. We
consider the relevance of both quantitative and qualitative approaches to this aspect of evaluation.
Clarke and Eck (2005) – Step 46
Bowie and Bronte-Tinkew (2008)
Harachi et al. (1990)
Bouffard et al. (2004)
Esbensen (2011)
Edmonson and Hoover (2008)
Franzen et al. (2009)
Sunday 26 April – individual project intervention plan due
Week 14 (27 Apr) – Indicators and monitoring
This class examines how routine monitoring can be built into program design to help assess
implementation and outcomes. In doing so, it will look at approaches to measurement and data
collection that are relevant to evaluation more generally.
Vera Institute of Justice (2003) – Part 1
Maxfield (2001) – Chapter 3
Mears and Butts (2008)
Lampert (2013)
Henry (2006)
Week 15 (4 May) – Final exam
Sunday 10 May – individual project evaluation plan due
References
Bowie, L. and Bronte-Tinkew, J. 2008. Process Evaluations: A Guide for Out-Of-School Time
Practitioners. Washington, DC: Child Trends.
Braga, A., Kennedy, D., Piehl, A. and Waring, E. 2001. “Measuring the Impact of Operation Ceasefire.”
In Reducing Gun Violence: The Boston Gun Project’s Operation Ceasefire. Washington, D.C.:
U.S. Department of Justice, National Institute of Justice.
Brown, Rick, and Michael S. Scott. 2007. Implementing Responses to Problems. Problem-solving tools
series, no. 7. Washington, DC: U.S. Department of Justice, Office of Community Oriented
Policing Services. (http://popcenter.org/Tools/toolimplementingresponses.htm.)
California Department of Alcohol and Drug Programs. 2002. Substance Abuse and Crime Prevention
Act of 2000 (SACPA – Proposition 36): First Annual Report to the Legislature.
Caplan, J. M., Kennedy, L.W. and Miller, J. 2009. Case Study: Applying Risk Terrain Modeling to
Shootings in Irvington, NJ. Newark, NJ: Rutgers Center for Public Security.
Clarke, R. V. and Eck, J. 2005. Crime Analysis for Problem Solvers in 60 Small Steps. Washington, DC:
U.S. Department of Justice, Office of Community Oriented Policing Services. (http://www.popcenter.org).
Daly, R., Kapur, T. and Elliot, M. 2011. Capital Change: A Process Evaluation of Washington, DC’s
Secure Juvenile Placement Reform. Washington, DC: Vera Institute of Justice.
Eck, J. E. 2004. Assessing Responses to Problems: An introductory guide for police problem-solvers.
Washington, D.C.: Office for Problem Oriented Policing.
Eck, J. E. 2002. Learning from Experience in Problem-Oriented Policing and Situational Prevention:
The Positive Functions of Weak Evaluations and the Negative functions of strong ones.
Crime Prevention Studies, 14, 93-118.
Eck, J. E. and Wartell, J. 1999. Reducing Crime and Drug Dealing by Improving Place Management: A
Randomized Experiment. Washington DC: National Institute of Justice.
Farrington, D.P. and Ttofi, M.M. 2009. School-Based Programs to Reduce Bullying and Victimization.
Campbell Systematic Reviews, 2009:6.
Forrester, D., Chatterton, M., and Pease, K. 1988. The Kirkholt Burglary Prevention Project, Rochdale.
Crime Prevention Unit Paper 13. London: Home Office.
Fratello, J., Kapur, T. D. and Chasan, A. 2013. Measuring success: A guide to becoming an evidence-based practice.
Fratello, J., Salsich, A. and Mogulescu, S. 2011. Juvenile Detention Reform in New York City:
Measuring Risk Through Research. New York, NY: Vera Institute of Justice.
Kelling, G.L., Pate, T., Dieckman, D. and Brown, C. E. 1974. The Kansas City Preventive Patrol
Experiment: A Summary Report (Vol. 1015). Washington, DC: Police Foundation.
Kellogg Foundation. 2004. Logic Model Development Guide. Battle Creek, MI: Author.
McCawley, P. F. 1997. The Logic Model for Program Planning and Evaluation. Idaho: University of
Idaho Extension.
Maxfield, Michael G. 2001. Guide to Frugal Evaluation for Criminal Justice. Final Report to the
National Institute of Justice. Washington, DC: U.S. Department of Justice, Office of Justice
Programs, National Institute of Justice. www.ncjrs.org/pdffiles1/nij/187350.pdf.
Mears, D. P., and Butts, J. A. 2008. “Using performance monitoring to improve the accountability,
operations, and effectiveness of juvenile justice.” Criminal Justice Policy Review 19(3), 264.
Miller, J. 2010. “Stop and Search in England: A Reformed Tactic or Business as Usual?” British Journal of
Criminology.
Miller, J., Bland, N. and Quinton, P. 2000. The Impact of Stops and Searches on Crime and the Community.
London: Home Office.
Orange County Probation Department. 1994. The 8% Solution. Retrieved February 24, 2009, from
http://www.oc.ca.gov/Probation/solution/index.asp?c
Poyner, B. 1994. “Lessons from Lisson Green: An Evaluation of Walkway Demolition on a British
Housing Estate.” In Clarke, R. (Ed.), Crime Prevention Studies, Vol. 3. Monsey, NY: Criminal
Justice Press.
Sallybanks, J. 2001. Assessing the Police Use of Decoy Vehicles. London: Home Office, Policing and
Reducing Crime Unit, Research, Development and Statistics Directorate.
Sherman, L. W. and Berk, R.A. 1984. The Minneapolis Domestic Violence Experiment. Washington, DC:
Police Foundation.
Sherman, L. W., Gottfredson, D. C., MacKenzie, D. L., Eck, J., Reuter, P. and Bushway, S. D. 1998.
Preventing Crime: What Works, What Doesn’t, What’s Promising: A Report to the U.S.
Congress. Washington, DC: U.S. Department of Justice.
Skogan, W. G., Hartnett, S. M., Bump, N. and Dubois, J. 2008. Evaluation of CeaseFire-Chicago.
Assisted by Ryan Hollon and Danielle Morris. Evanston, IL: Center for Policy Research,
Northwestern University.
Social Research Methods Knowledge Base (n.d.).
http://www.socialresearchmethods.net/kb/contents.php.
Tilley, N. 1993. Understanding Car Parks, Crime and CCTV: Evaluation Lessons from Safer Cities
(Crime Prevention Unit Series Paper 42). London: HMSO.
Tuffin, R., Morris, J., and Poole, A. 2006. An evaluation of the impact of the National Reassurance
Policing Programme. London: Home Office.
Vera Institute of Justice. 2003. Measuring Progress toward Safety and Justice: A Global Guide to the
Design of Performance Indicators across the Justice Sector. New York: Vera Institute of
Justice (http://www.vera.org/download?file=9/207_404.pdf)
Walker, S., Alpert, G.P. and Kenney, D. 2001. Early Warning Systems: Responding to the Problem
Police Officer. Washington, DC: National Institute of Justice.
Weisburd, D., Green, L., Gajewski, F. and Belluci, C. 1996. Research Preview: Policing Hot Spots.
Washington, DC: National Institute of Justice.
Weisel, D. L. 2003. “The Sequence of Analysis in Solving Problems.” In J. Knutsson (Ed.), Problem-Oriented
Policing: From Innovation to Mainstream (pp. 115–146). Crime Prevention Studies,
Vol. 15. Monsey, NY: Criminal Justice Press.
GENERAL INFORMATION
Academic Integrity
As a member of the Rutgers University community you are not to engage in any academic
dishonesty. You are responsible for adhering to basic academic standards of honesty and
integrity as outlined in the Rutgers University Policy on Academic Integrity for Undergraduate
and Graduate Students (http://academicintegrity.rutgers.edu/policy-on-academic-integrity).
Your academic work should be the result of your own individual effort; you should not allow
other students to use your work; and you are required to recognize and reference any material
that is not your own. Violations of the university’s policy will result in appropriate action.
Office of Disability Services
As stated in the Manual for Students and Coordinators of Services for Students with Disabilities
(https://ods.rutgers.edu/), Rutgers University “is committed to providing equal educational
opportunity for persons with disabilities in accordance with the Nondiscrimination Policy of
the University and in compliance with § 504 of the Rehabilitation Act of 1973 and with Title II
of the Americans with Disabilities Act of 1990.” For students with disabilities, review the
manual and visit http://robeson.rutgers.edu/studentlife/disability.html for information about
requesting accommodations.
Rutgers Learning Center (for tutoring assistance)
The Rutgers Learning Center (http://www.ncas.rutgers.edu/rlc) exists to support your studies
by offering tutoring assistance and resource materials in a variety of subjects, including
statistics. You may schedule an appointment by visiting the TutorTrac website
(https://cas.rutgers.edu/login?service=http://gacrux.nwkcampus.rutgers.edu/TracWeb40/default.html) or the Learning Center’s office, which is open
Monday through Friday from 9:00 a.m. to 5:00 p.m. and located in Bradley Hall, room 140.
Additionally, you may access a range of online tutorials and links from the Center at
http://techcenterblog.com/worksite/_links.html.
Rutgers-Newark Counseling Center
If you experience psychological or other difficulties as a result of this course, or because of other
issues that may interfere with your performance in the course, please contact the university’s
Counseling Center (http://counseling.newark.rutgers.edu; 973-353-5805), which is located in
Blumenthal Hall, room 101. The center offers a variety of free, confidential services to part-time
and full-time students who are enrolled at Rutgers.