We can deliver better outcomes for North Carolinians by investing in proven effective programs and committing to continually improving government operations. OSBM helps policymakers and program managers use data and theory to build the evidence needed to inform decisions about how to get the results we want, and how to do so most efficiently.
We support agencies through evidence-building and ongoing performance management to address priority goals and challenges, helping them:
- Understand problems and their underlying causes
- Inform the design and targeting of programs, policies, and processes
- Assess implementation, track outputs, and measure performance
- Determine what works to achieve the desired outcomes and test new approaches
- Improve operations and program delivery to get the biggest impact in the most cost-effective way
What is Evidence?
Evidence is viewed broadly as the available body of facts or information indicating how likely it is that a belief is true. Evidence can be qualitative or quantitative, and it may come from a variety of sources, with varying degrees of credibility. State government leaders can employ evaluation and data analytics to guide management, policy, and funding decisions.
When we want to understand the effectiveness of a program or policy, we rely on evidence from research that can isolate and measure the outcomes caused by the intervention. These types of studies are called impact evaluations. High-quality impact evaluations use well-matched participant and comparison groups to identify the effects of a program or policy, separate from other factors.
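As a simplified sketch of that logic, the core of an impact estimate is a comparison of average outcomes between program participants and a well-matched comparison group. The example below uses made-up outcome data for illustration only; real impact evaluations also account for sampling error, baseline differences, and other confounding factors.

```python
# Hypothetical illustration: estimating a program's impact as the difference
# in average outcomes between participants and a matched comparison group.
# The numbers are invented for demonstration purposes.

participant_outcomes = [72, 68, 75, 80, 71]   # e.g., scores for program participants
comparison_outcomes = [65, 70, 63, 68, 66]    # matched individuals who did not participate

def mean(values):
    return sum(values) / len(values)

estimated_impact = mean(participant_outcomes) - mean(comparison_outcomes)
print(f"Estimated program impact: {estimated_impact:.1f} points")
```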
How Strong is the Evidence of Effectiveness?
Research that uses rigorous methods increases confidence in the findings and subsequent decisions.
OSBM has created a rating scale that provides a shared framework to assess the effectiveness of a program or policy (positive or negative) and the level of confidence we can have in the findings based on the evaluation methods. The rating scale ranges from "proven harmful" to "proven effective." We have provided a detailed, print-friendly version of the rating scale.
For example, "proven effective" means we can be confident that the program or policy will generate meaningful outcomes, based on the findings of multiple rigorous impact evaluations that employ a randomized controlled trial (RCT) or a quasi-experimental research design (QED). Both RCTs and QEDs use treatment and comparison groups to determine the outcomes caused by program participation.
Many state programs do not currently have any evidence of effectiveness, but this does not mean they do not work. These programs may be backed by strong theoretical models or output metrics, but impact evaluations are needed to measure their effects. See below for more on how learning agendas and research partnerships can help fill evidence gaps.
Finding Evidence
Looking for data resources to help with research?
See our list of state government open data resources.
Download our guide to where and how to find evidence.
How OSBM Can Help
Budget proposals
State agencies should submit budget expansion requests backed by evidence or, where evidence is limited, accompanied by plans for generating the needed evidence. Strong budget request justifications incorporate one or more of the following:
- Administrative data or performance metrics demonstrating trends, needs, and goals
- Data analytics or research that deliver insights into the underlying causes of an issue and the theoretical basis for proposed solutions
- The evidence of effectiveness for new or expanding programs or policies
- A monitoring and evaluation plan, if no evidence of effectiveness currently exists
- Return on investment estimates
Please contact your Budget Development Analyst for guidance on incorporating data and evidence into budget requests.
Strategic planning and learning agendas
Strategic plans set the course for what an agency will do over the next two to five years to address top priority goals and challenges, and how progress will be measured. Often there are open questions about how to best achieve those goals.
Learning agendas articulate knowledge gaps and research questions that, if answered, will inform the agency’s strategies for achieving the desired results. Learning agendas form the basis for evaluation planning and research partnership opportunities.
Strategic Planning training is offered every other year, and technical assistance is available on request.
Research partnerships
The NC Office of Strategic Partnerships (OSP) develops, launches, and enhances partnerships between state government and North Carolina’s research and philanthropic sectors. Join monthly panel discussions on cross-sector partnership topics with interested parties across the state. Explore open partnership opportunities by visiting the North Carolina Project Portal. Agencies can contact OSP for assistance in posting a research opportunity on the portal.
See also the State of North Carolina Research Registry.
Return on investment modeling
Benefit-cost analysis is the process of estimating the return on investment (ROI) from programs or regulations. Monetizing the outcomes and weighing them against the costs informs implementation decisions about how to generate the greatest impact with limited resources. Resulting ROI estimates can be used to compare alternative approaches.
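As a rough illustration of the underlying arithmetic (using hypothetical figures, not OSBM's Results First model), a benefit-cost ratio divides the monetized benefits of a program by its costs, and the net benefit is the difference between the two:

```python
# Hypothetical illustration of a simple benefit-cost (ROI) calculation.
# Figures are invented for demonstration; real analyses rely on
# evidence-based effect sizes and detailed cost data.

program_cost = 2_000_000          # total cost of the program, in dollars
monetized_benefits = 5_000_000    # estimated dollar value of outcomes

benefit_cost_ratio = monetized_benefits / program_cost   # 2.5 dollars of benefit per dollar spent
net_benefit = monetized_benefits - program_cost          # 3,000,000 dollars

print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
print(f"Net benefit: ${net_benefit:,.0f}")
```

Comparing these ratios across alternative programs is one way ROI estimates can inform decisions about where limited resources will generate the greatest impact.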
OSBM partners with agencies to use the Results First model to monetize the benefits and costs of social programs that have sufficient evidence of effectiveness.
OSBM reviews and advises on regulatory impact analyses (fiscal notes), which are benefit-cost analyses of proposed permanent rule changes.
Performance management training and consultation
OSBM offers the Performance Management Academy to state employees annually. Other training opportunities may occasionally be shared through the Performance Management Community of Practice, which is moderated by OSBM.
In addition, OSBM offers agencies facilitation and consultation services on request for strategic planning and process improvement efforts.