Your data is disaggregated to understand and improve performance and to better understand who is being served and how their needs and experiences might differ.
Your data disaggregation process addresses equity concerns and provides a nuanced understanding of operations, uncovering areas of strength that can be leveraged or critical issues that need to be addressed, and strengthening your understanding of how often, and when, strategies help achieve desired outcomes.
When possible, your data is broken out by demographics to learn about the dynamics of a priority or challenge area. This knowledge can be used to ask more insightful questions about the data, to identify previously undetected issues within communities, or to target interventions. (What Works Cities DM8)
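In practice, this kind of disaggregation can start as a simple grouped summary. The sketch below is a minimal illustration in Python, assuming a hypothetical service-request extract with neighborhood, race_ethnicity, and days_to_resolution columns; the actual fields and systems in your government will differ.

```python
# A minimal sketch of disaggregating an outcome measure by demographic
# subgroup, assuming a hypothetical service-request dataset with
# `neighborhood`, `race_ethnicity`, and `days_to_resolution` columns.
import pandas as pd

requests = pd.read_csv("service_requests.csv")  # hypothetical extract

# Break the citywide number out by subgroup to surface gaps that the
# aggregate figure hides.
by_group = (
    requests
    .groupby(["neighborhood", "race_ethnicity"], dropna=False)
    .agg(
        n_requests=("days_to_resolution", "size"),
        median_days=("days_to_resolution", "median"),
    )
    .reset_index()
)

citywide_median = requests["days_to_resolution"].median()
by_group["gap_vs_citywide"] = by_group["median_days"] - citywide_median

# Flag subgroups whose outcomes lag the citywide baseline for follow-up.
print(by_group.sort_values("gap_vs_citywide", ascending=False).head(10))
```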
Intake, Survey, and Program Specific Collection
Your government incorporates a range of data sources, including administrative and descriptive data as well as relevant census data, health data, and other external sources.
You collect, analyze, share, and use high-quality administrative and survey data to best understand service-users and build a baseline data set to help assess program impact. (Fed Standard of Excellence)
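One common way to build such a baseline is to link administrative records to survey responses on a shared identifier. The sketch below is a minimal illustration only; the file names, the participant_id key, and the response_date field are hypothetical, and real intake and survey systems will vary.

```python
# A minimal sketch of combining administrative and survey data into a
# baseline file for later impact assessment. File names, the
# `participant_id` key, and the survey fields are hypothetical.
import pandas as pd

admin = pd.read_csv("program_intake.csv")    # administrative records
survey = pd.read_csv("baseline_survey.csv")  # self-reported survey data

# Keep one row per participant, preferring the most recent survey response.
survey = (
    survey.sort_values("response_date")
          .drop_duplicates("participant_id", keep="last")
)

baseline = admin.merge(
    survey,
    on="participant_id",
    how="left",
    validate="one_to_one",  # fail loudly if the join is not 1:1
)

# Basic quality check before the baseline is shared or analyzed.
match_rate = baseline["response_date"].notna().mean()
print(f"Survey match rate: {match_rate:.1%}")
baseline.to_csv("baseline_dataset.csv", index=False)
```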
Aiming for Equitable Outcomes
You develop research questions and designs that aim to advance racial and ethnic equity. To this end, you can involve racially and ethnically diverse research teams, construct a research design that is accepted by the community, develop research questions that target root causes, and address equity when identifying data collection methods and instruments. (Annie E Casey Fdn)
You consider multiple types of evidence to inform your decision-making, such as lived experience, stakeholder engagement on learning and knowledge building, and disaggregation of evaluation results, in order to best understand how to drive toward equitable outcomes. (State Standard of Excellence)
Supportive Executive Leadership
Your government has a leadership and governance structure with the authority to use evaluations to improve results. (State Standard of Excellence Criteria)
Your Building Blocks
Investing in Evidence Based Initiatives
You can determine whether a program is evidence-based and implement it in your local context: adapting and implementing existing high-quality evidence-based practices is important to ensure you are building on existing research to achieve maximum impact. (What Works Cities EVAL4)
You can implement evidence-based programs: evidence-based programs (EBPs) are programs that have been rigorously tested and have proven effective at improving outcomes. (What Works Cities EVAL4)
Embedding an Evidence and Evaluation Policy
Your government’s documented commitment to rigorous evaluation requires departments to provide justification for which types of programs / policies / practices should or should not be rigorously evaluated. (What Works Cities EVAL1)
Your government’s documented commitment to rigorous evaluations encourages departments to use rigorous evaluation methods or leverage evidence from existing evaluations where feasible. (What Works Cities EVAL1) Additionally, you embed evaluation planning into the design of pilot programs. (ARPA Guidance for non-evidence based initiatives)
Building a Culture of Evidence and Evaluation
Your government trains, upskills, and empowers staff in the management and the use of city data to inform decision-making.
Your government’s onboarding materials, skills development plans, or data workforce strategy include training and upskilling staff in the management and use of data and evidence for decision-making. (What Works Cities LC3)
Your government has a designated leader and/or team responsible for ensuring departments are conducting rigorous evaluations (e.g., process, experimental, or quasi-experimental). (What Works Cities LC6)
Getting Internal Buy-In
Buy-In & Feedback: You have a culture that supports the sustainable use of data and evidence to deliver results in a transparent, equitable, and ethical manner. (State Standard of Excellence Criteria)
Budgeting for Evaluations
You have dedicated resources for using evaluations to improve results. (State Standard of Excellence Criteria)
Your government uses logic models as a part of program and evaluation planning. A logic model is a graphic depiction (road map) that presents the shared relationships among the resources, activities, outputs, and outcomes/impacts for your program. It depicts the relationship between your program’s activities and its intended effects, in an implicit "if-then" relationship among the program elements: if I do this activity, then I expect this outcome. Among other things, a logic model helps clarify the boundary between "what" the program is doing and "so what," the changes that are intended to result from strong implementation of the "what." (CDC Evaluation Framework)
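Although logic models are usually drawn as diagrams, some teams also capture them as structured data so they can be versioned and reviewed alongside evaluation plans. The sketch below is one hypothetical way to do that; the program, its elements, and the one-to-one if-then pairing are illustrative simplifications.

```python
# A minimal sketch of capturing a logic model as structured data rather
# than only as a diagram. The example program and its elements are
# hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    program: str
    inputs: list[str] = field(default_factory=list)      # resources
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct products
    outcomes: list[str] = field(default_factory=list)    # intended changes

    def if_then_statements(self) -> list[str]:
        """Spell out the implicit if-then chain for review with stakeholders.

        Pairs activities with outcomes positionally; a real logic model may
        have many-to-many relationships.
        """
        return [
            f"If we do '{a}', then we expect '{o}'."
            for a, o in zip(self.activities, self.outcomes)
        ]

summer_jobs = LogicModel(
    program="Youth Summer Jobs (hypothetical)",
    inputs=["program staff", "employer partnerships", "stipend funding"],
    activities=["place 500 youth in six-week paid positions"],
    outputs=["number of youth placed", "hours worked"],
    outcomes=["increased school-year engagement among participants"],
)

for statement in summer_jobs.if_then_statements():
    print(statement)
```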
Designing Evaluations
Your government’s evaluation design assesses the issues of greatest concern to stakeholders while using time and resources as efficiently as possible.
You use the four evaluation standards, especially utility and feasibility, to decide on the most appropriate evaluation design. (CDC Evaluation Framework)
Committing to Equitable Practices
Your government’s equity approach is tailored to the specifics of your evaluation and to the needs of your community; tactics can include:
Examine your own backgrounds and biases.
Make a commitment to dig deeper into the data.
Recognize that the research process itself affects communities and that your local government has a role in ensuring research benefits communities.
Engage communities as partners in research and give them credit.
Guard against the implied or explicit assumption that white is the normative, standard, or default position.
You know how much to budget for an evaluation, having an understanding of the evaluation process and of the various factors that might influence cost. In simple terms, the amount of money that you will need depends on the scope and complexity of both the program to be evaluated and the evaluation itself. (OSEP Budgeting for Evaluation Guide)
Selecting the Right Evaluator
Your government carefully selects and supports an outside evaluator for program assessment. You will experience the maximum benefits from an evaluation if you hire an evaluator who is willing to work with you and your staff to help you better understand your program, learn what works, and discover what program components may need refining. (Administration for Children and Families)
Your government has consistent access to an external partner to support high-quality evaluations and/or access to specialized in-house skills. (What Works Cities LC6)
You manage a successful research partnership, including the commitment of a senior-level decision-maker within the government (who ensures that the project aligns with the government’s overall priorities, helps navigate relationships with key stakeholders, and provides momentum when needed) and a project manager (who allocates a significant percentage of his or her time to the project, serves as the point person for moving the project forward, and meets regularly with the researcher and other partners). (J-PAL North America)
Sharing What You Learn
Your government develops a communications plan that includes the community as one of multiple primary audiences for program and evaluation decisions. You use various formats for reporting results and prioritize findings that the community can act on and use. (Annie E Casey Fdn)
Your Growing Capacities
Prioritizing Evidence
Your government has used the results from rigorous evaluations to inform decisions related to improving services for residents or related to key community-wide priorities. (What Works Cities EVAL3)
Your government makes new or newly justified budget decisions informed by quantitative and qualitative data analysis. Examples should be demonstrated across three or more departments and/or agencies.
Your government has a documented process, informed by quantitative and qualitative data analysis, to make different, or newly justified, budget and financial decisions. (What Works Cities BF2)
Your leadership requires executive team members, staff and other internal and external stakeholders to supply quantitative and qualitative data analysis, disaggregated by geographic and demographic subgroups when possible, when necessary for strategic decision-making. (What Works Cities LC1)
Your government has the evaluation infrastructure that allows evidence to inform an administration’s budget, management, and policy decisions. (State Standard of Excellence)
Using a Data Equity Framework
Your data governance practice uses an equity framework. This ensures that equity is incorporated in every step of the data governance process as your government develops its data governance committee, establishes a process for data collection including use cases, assigns datasets, selects data stewards, and ensures the data is being used in the most equitable way. (What Works Cities DM1)
Using Data Systems and Data Sharing
Your government has invested in technology infrastructure (tools, software, databases) that allows it to efficiently collect, inventory, and share data. (State Standard of Excellence Criteria)
Your government has a user-friendly method of collecting internal and external requests to share data and requires data-holders within the local government to respond in a timely way.
Your government has a designated process for conducting and reviewing analysis (i.e., descriptive analysis, diagnostic analysis, geospatial analysis, or prescriptive analysis necessary for data-focused performance management meetings), preparation, and follow-up for each performance management meeting. (What Works Cities PA2)
Your government follows a documented methodology for routinely collecting and updating the data inventory at minimum every 24 months and is in compliance with the methodology. Data Inventory: A comprehensive listing of all available data sources and in what format this data exists. (What Works Cities DM2)
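A data inventory can be kept as structured records so that review dates are easy to check against the 24-month window. The sketch below is a minimal illustration; the datasets, stewards, formats, and dates are hypothetical.

```python
# A minimal sketch of a data inventory as structured records, with a check
# that each entry has been reviewed within the documented 24-month window.
# Dataset names, stewards, and dates are hypothetical.
from datetime import date

data_inventory = [
    {
        "dataset": "building_permits",
        "department": "Planning",
        "steward": "permits-data@example.gov",
        "format": "PostgreSQL table",
        "last_reviewed": date(2023, 5, 1),
    },
    {
        "dataset": "311_service_requests",
        "department": "Public Works",
        "steward": "311-team@example.gov",
        "format": "CSV export from CRM",
        "last_reviewed": date(2021, 2, 15),
    },
]

REVIEW_WINDOW_DAYS = 24 * 30  # roughly 24 months

overdue = [
    entry for entry in data_inventory
    if (date.today() - entry["last_reviewed"]).days > REVIEW_WINDOW_DAYS
]
for entry in overdue:
    print(f"Overdue for review: {entry['dataset']} ({entry['department']})")
```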
Your government has a documented process for determining when and how to make data open that includes the assessment of potential opportunities and harms and consideration of previous requests for data in prioritizing datasets for release. (What Works Cities DM3)
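One way to make that prioritization transparent is a simple scoring pass that combines demand (previous requests) with an opportunity-and-harm screen. The sketch below is purely illustrative; the datasets, scores, and weights are hypothetical, and a real process would document its rubric and include privacy and legal review.

```python
# A minimal, hypothetical scoring sketch for prioritizing datasets for
# open-data release: higher demand and benefit raise priority, potential
# harm lowers it. Not a substitute for a documented review process.
candidates = [
    {"dataset": "restaurant_inspections", "requests": 14, "benefit": 4, "harm": 1},
    {"dataset": "eviction_filings",       "requests": 9,  "benefit": 5, "harm": 4},
    {"dataset": "street_tree_inventory",  "requests": 3,  "benefit": 3, "harm": 0},
]

def priority(entry: dict) -> int:
    # Illustrative weights only; a real rubric would be set with stakeholders.
    return entry["requests"] + 2 * entry["benefit"] - 3 * entry["harm"]

for entry in sorted(candidates, key=priority, reverse=True):
    print(f"{entry['dataset']}: priority score {priority(entry)}")
```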
Publishing Evaluation Results
Results Shared with Peers/Published: Academic partners may publish in journals, online, etc. Learn more: Data and results publication (J-PAL)
Your government has an evaluation working group, or at least one senior individual with the ability to identify impactful evaluation opportunities, who is accountable for supporting departments in conducting rigorous evaluations. (What Works Cities LC6)
Supporting Evaluation Advisory Groups
Your government successfully supports community advisory boards (CABs) in your evaluation decision-making process. In community-based participatory research (CBPR) approaches, the community is typically represented by a coalition such as a CAB. Members of the CAB function as the interface between researchers and the community, as people with deep knowledge of the community, the cultural and social resources it holds, and the problems it faces. Additionally, the CAB plays an important role in the dissemination of the project findings within the community and in the development of an action plan that may result from the study conclusions. (Community-Based Participatory Research)
Setting Goals and Measuring Progress
You have strategic goals that are public, quantitative, published regularly, community-oriented, aligned, equity-oriented, and publicly informed, and you regularly measure performance toward those goals. (State Standard of Excellence)
Using Participatory Research Models
Your government successfully incorporates community-based participatory research (CBPR). CBPR refers to research activities carried out in local settings in which community members actively collaborate with professionally trained researchers. (Community-Based Participatory Research)
Accessing Data on Demand
Reporting
Your existing data and measures are available on demand and mandatory reports are automated and return trusted numbers. You have systems with clean, validated data and can pull data to support performance management, evaluation & experiments, and advanced analytics.
You also include ongoing data development and exploration to feed new questions and needs. (Cal Data Maturity Model)
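As a rough illustration of what an automated, trusted report pull can look like, the sketch below assumes a hypothetical SQLite database with a monthly_metrics table (month, department, metric, value) and refuses to release a report that fails basic validation checks; your reporting stack, schema, and checks will differ.

```python
# A minimal sketch of an automated, validated report pull, assuming a
# hypothetical SQLite performance database with a `monthly_metrics` table
# (columns: month, department, metric, value). Real systems will differ.
import sqlite3
import pandas as pd

def pull_report(db_path: str, month: str) -> pd.DataFrame:
    """Return the month's metrics only if basic validation checks pass."""
    with sqlite3.connect(db_path) as conn:
        df = pd.read_sql_query(
            "SELECT month, department, metric, value "
            "FROM monthly_metrics WHERE month = ?",
            conn,
            params=(month,),
        )

    # Trusted numbers: fail loudly instead of shipping a suspect report.
    if df.empty:
        raise ValueError(f"No metrics found for {month}")
    if df["value"].isna().any():
        raise ValueError("Missing values in metrics; report not released")
    if df.duplicated(["month", "department", "metric"]).any():
        raise ValueError("Duplicate metric rows; report not released")

    return df

if __name__ == "__main__":
    report = pull_report("performance.db", "2024-06")
    report.to_csv("monthly_report_2024-06.csv", index=False)
```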