How to Use the Information You Find on This Site

We created this site to help criminal justice, juvenile justice, and crime victim service professionals better understand crime and identify program and practice solutions that address the unique needs of their communities. Here you will find tips for practitioners, policy makers, and researchers on using this information to advance justice. The tips are organized under five headings:


Improving Program Effectiveness

This site helps justice professionals who are not social scientists improve the effectiveness of programs. The systematic, independent review process and evidence ratings are intended to give practitioners access to social science evidence that is otherwise difficult to obtain, and to serve as a basis for gauging the quality of that evidence. In short, this site strives to help practitioners answer the questions: Has it worked, and in what context?

Practitioner Tip 1: Familiarize yourself with evidence-based programs in your field.
Justice practitioners at all levels may benefit from reviewing the program profiles related to their work, even if they are not in a position to make decisions about replicating programs. Our profiles show which programs have produced positive results and which have not. This information can be shared within agencies and organizations, and it can facilitate conversations with colleagues and superiors about modifying existing practices to better align them with evidence-based programs.
Practitioner Tip 2: Replicate an evidence-based program.
Programs rated as "Effective" and "Promising" on this site have produced positive results in the past. Replicating programs that have been shown to work, and that fit a community's needs, can save valuable time and resources compared to implementing untested programs that may not address the same problems as effectively. The best way to get similar positive results from evidence-based programs is to replicate them with fidelity to the original design.

Each program profile on this site includes summary information and additional resources to help you replicate the program. You should view these profiles as a starting point for finding resources such as dedicated websites, publications, implementation manuals, training materials, live training, and certifications.

Developing New Strategies

Full replication of evidence-based programs is ideal, but it may not always be possible because well-tested programs do not exist for all circumstances and populations. Even some programs identified as "Promising" or "Effective" do not have detailed training materials that specify all the ways to implement them. That does not mean you have to start from scratch when it comes to research evidence. The practice information on this site can help you understand the types of outcomes that have been achieved (or not achieved) by a general strategy or program type. The practice profiles point toward the specific programs that have been evaluated, and the broad practice information can guide you toward which types of interventions to consider.

Practitioner Tip 3: Understand the outcome evidence for practices.
If you are developing or enhancing a strategy for which there is no evidence-based program, familiarize yourself with the practices reviewed on this site and the related programs listed on the practice profiles. In the absence of evidence-based programs, this broader information can help identify the types of programs that are more and less likely to work on the problems you are facing.
Practitioner Tip 4: Adapt an evidence-based program and evaluate it.
Rather than build a new program from scratch, practitioners may choose to use tested programs as a foundation and make as few adjustments as possible to increase the chances that the modified program will succeed. Such modifications can be very helpful to others in the field when they are carefully documented and paired with rigorous evaluation. In these cases, adaptation and innovation help generate new evidence about programs' and practices' effectiveness.

Funding Decisions

If you are in a position to influence funding, this site can help you make informed decisions.

Policy Maker Tip 1: Create incentives to use evidence-based programs and practices.
Investing in programs and practices with demonstrated track records makes sense regardless of whether funding comes from public or private sources. Similarly, it makes sense to carefully review, and possibly discontinue, programs and practices when evidence shows they have failed to produce their intended results. "Effective" programs and practices are particularly suitable for replication, especially when they come with strong training materials. However, policy makers with funding responsibilities should take care not to oversimplify their task: this site's ratings can inform decisions about which investments are worthy, but only as one factor among many.
Policy Maker Tip 2: Create incentives for ongoing innovation and the generation of evidence-based programs and practices.
We encourage policy makers who find resources like this site useful to recognize the innovative practitioners who are striving to solve problems every day and the social scientists who help produce the evidence. These innovators require ongoing support so they can develop new evidence-based programs and practices. Funders can contribute to the growing body of evidence-based programs by pairing funding for innovative and untested approaches with rigorous evaluation.

Informing Training

This site aims to create demand for programs and practices with proven results. The profiles on this site provide basic information and resources, but these may not be enough for practitioners to fully learn and replicate a program.

Trainer Tip 1: Develop training materials for evidence-based programs and practices that have been rated as "Effective."
Training and technical assistance providers can benefit the field by developing materials such as logic models, implementation guides, and manuals for evidence-based programs. Online, open-access materials and training are particularly beneficial to practitioners in public agencies and organizations, who often work in tight budget environments.

Informing Research

Social scientists play an essential role in identifying evidence to inform practitioners and policy makers about what is, and is not, effective.

Researcher Tip 1: Consult evidence standards to strengthen program evaluation designs.
This site's evidence standards are described in the Program Scoring Instrument and may be useful points of reference during the program evaluation design phase. We developed the evidence standards in consultation with a wide range of social scientists in the justice field. We encourage program evaluators to implement the strongest designs possible for producing causal evidence.
Researcher Tip 2: Focus on evaluating "Promising" programs using rigorous evaluation designs to build the body of evidence and increase confidence in program effectiveness.
Many programs we have deemed "Promising" are widely used in the field, and improving the body of evidence on these programs is particularly helpful. Social scientists highly committed to helping practitioners identify and use evidence-based programs can make a valuable scientific contribution by replicating a prior evaluation using stronger methods.
Researcher Tip 3: Review this list of program evaluations that did not meet our criteria for being rated on this site to see gaps in the body of evaluation research and to identify potential areas for future research.

General Tips for Users

General Tip 1: Understand that the body of evidence provided varies across topics.
The extent and quality of effectiveness evidence vary considerably across topic areas within criminal justice, juvenile justice, and crime victim services. The availability of evidence in a topic area reflects not the importance of, or level of activity in, that area, but rather factors related to the capacity to conduct social science evaluations there. The presence or absence of evidence in a particular topic area does not indicate whether activities in that area are more or less effective than those in another.
General Tip 2: This site is not yet comprehensive.
Although we continue to review evidence and add profiles for programs and practices, there remains social science evidence within criminal justice, juvenile justice, and crime victim services that we have not yet reviewed. Users should not treat this resource as a complete body of social science evidence within justice systems. A list of programs that have been reviewed but not rated can be found here. We also allow appeals of a given rating.
General Tip 3: Understand and share what the term "evidence-based" means.
A program or practice is considered "evidence-based" if it has been found to produce its intended results based on rigorous social science evaluation. Many other notable lines of activity associated with terms like "data-driven" or "research-based" may sound the same, but programs and practices that have not been subjected to rigorous evaluation cannot be called evidence-based. Become an educated consumer and user of these terms to help advance the cause of truly evidence-based programs and practices.