Browse the Collection

Produced by Research Connections

Current Filters: Author: Paulsell, Diane; New in last year

18 results found.

Assessing the evidence of effectiveness of home visiting program models implemented in tribal communities
United States. Administration for Children and Families. Office of Planning, Research and Evaluation, August, 2011
Washington, DC: U.S. Administration for Children and Families, Office of Planning, Research and Evaluation.

This report describes the findings from the review of home visiting programs implemented in tribal communities or evaluated with American Indian or Alaska Native families and children. The original review was conducted in fall 2010 and the report was released in February 2011. This report was updated in August 2011 based on additional studies identified through an updated literature search conducted in spring 2011. (author abstract)

Literature Review


Assessing the need for evidence-based home visiting (EBHV): Experiences of EBHV grantees
Paulsell, Diane, July, 2010
(Brief 1). Princeton, NJ: Mathematica Policy Research.

The Maternal, Infant, and Early Childhood Home Visiting Program, authorized by Section 2951 of the Affordable Care Act of 2010 (P.L. 111-148), will provide $1.5 billion to states over five years to provide comprehensive, evidence-based home visiting services to improve a range of outcomes for families and children residing in at-risk communities (due to high rates of poverty, violence, poor health outcomes, and other factors). To receive the funds, each state must conduct a statewide needs assessment that (1) identifies at-risk communities, (2) assesses the state's capacity to provide substance abuse treatment and counseling, and (3) documents the quality and capacity of existing early childhood home visiting programs as well as gaps in these services. A number of the grantees participating in the Children's Bureau's Supporting Evidence-Based Home Visiting (EBHV) to Prevent Child Maltreatment grantee cluster prepared needs assessments to plan for implementing or expanding grant-related evidence-based home visiting services. This brief provides information about how grantees planned the assessments and collected the data, as well as facilitators and barriers to carrying out the assessments. It also describes lessons identified by grantees. (author abstract)

Reports & Papers


Building infrastructure to support home visiting to prevent child maltreatment: Two-year findings from the cross-site evaluation of the supporting evidence-based home visiting initiative
United States. Office on Child Abuse and Neglect, 12 August, 2011
Washington, DC: U.S. Office on Child Abuse and Neglect.

The Supporting Evidence-Based Home Visiting to Prevent Child Maltreatment (EBHV) initiative is designed to build knowledge about how to build the infrastructure and service delivery systems necessary to implement, scale-up, and sustain evidence-based home visiting program models as a strategy to prevent child maltreatment. The grantee cluster, funded by the Children's Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services, includes 17 diverse grantees from 15 states. Each grantee selected one or more home visiting models it planned to implement for the first time in its state or community (new implementers) or to enhance, adapt for new target populations, or expand. To support the implementation of home visiting with fidelity to their evidence-based models and help ensure their long-term sustainability, the grantees are developing infrastructure such as identifying funding streams and establishing strategies for developing and supporting the home visiting workforce. The EBHV grantees must conduct local evaluations to assess implementation, outcomes, and costs associated with their selected home visiting models. The national cross-site evaluation, conducted by Mathematica Policy Research and its partner, Chapin Hall at the University of Chicago, is designed to identify successful strategies for building infrastructure to implement or support the grantee-selected home visiting models (Koball et al. 2009). This report describes cross-site findings from the first two years of the initiative (fiscal years 2008-2010), including the planning period and early implementation of the grantee-selected home visiting models. The report primarily addresses four questions: 1. What was the state or local context with respect to home visiting as EBHV grantees planned and implemented their projects? 2. What partnerships did grantees form to support planning and early implementation of new home visiting programs? 3. What infrastructure was needed to implement home visiting program models in the early stages of the EBHV grant? 4. How did EBHV grantees and their associated home visiting implementing agencies (IAs) prepare for and implement new home visiting programs? (author abstract)

Reports & Papers


Building infrastructure to support home visiting to prevent child maltreatment: Two-year findings from the cross-site evaluation of the supporting evidence-based home visiting initiative [Executive summary]
United States. Office on Child Abuse and Neglect, 12 April, 2011
Washington, DC: U.S. Office on Child Abuse and Neglect.

The Supporting Evidence-Based Home Visiting to Prevent Child Maltreatment (EBHV) initiative is designed to build knowledge about how to build the infrastructure and service delivery systems necessary to implement, scale-up, and sustain evidence-based home visiting program models as a strategy to prevent child maltreatment. The grantee cluster, funded by the Children's Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services, includes 17 diverse grantees from 15 states. Each grantee selected one or more home visiting models it planned to implement for the first time in its state or community (new implementers) or to enhance, adapt for new target populations, or expand. To support the implementation of home visiting with fidelity to their evidence-based models and help ensure their long-term sustainability, the grantees are developing infrastructure such as identifying funding streams and establishing strategies for developing and supporting the home visiting workforce. The EBHV grantees must conduct local evaluations to assess implementation, outcomes, and costs associated with their selected home visiting models. The national cross-site evaluation, conducted by Mathematica Policy Research and its partner, Chapin Hall at the University of Chicago, is designed to identify successful strategies for building infrastructure to implement or support the grantee-selected home visiting models (Koball et al. 2009). This report describes cross-site findings from the first two years of the initiative (fiscal years 2008-2010), including the planning period and early implementation of the grantee-selected home visiting models. The report primarily addresses four questions: 1. What was the state or local context with respect to home visiting as EBHV grantees planned and implemented their projects? 2. What partnerships did grantees form to support planning and early implementation of new home visiting programs? 3. What infrastructure was needed to implement home visiting program models in the early stages of the EBHV grant? 4. How did EBHV grantees and their associated home visiting implementing agencies (IAs) prepare for and implement new home visiting programs? (author abstract)

Executive Summary


Cross-site evaluation of the supporting evidence-based home visiting grantee cluster: Evaluation design volume 1
United States. Office on Child Abuse and Neglect, 30 October, 2009
Washington, DC: U.S. Office on Child Abuse and Neglect.

In 2008, the Children's Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services funded 17 grants, through cooperative agreements, to address this knowledge gap and prevent child maltreatment. Grantees are to leverage their grant funding with other funding sources to support the EBHV grantee-selected programs and practices. Specifically, grantees are to focus on supporting implementation of, scaling up, and sustaining home visiting programs with high fidelity to their evidence-based models. In addition, grantees will contribute to the knowledge base about large-scale implementation with fidelity by conducting local implementation and outcome evaluations, along with analyses of program costs. Each cooperative agreement runs for five years. The first year (fiscal year [FY] 2008-2009) was a planning year; grantees are to implement their plans during the remaining four years (FY 2009-2010 through FY 2012-2013). CB/ACF has funded Mathematica Policy Research and Chapin Hall at the University of Chicago, along with our consultant Brenda Jones Harden from the University of Maryland, to conduct a six-year cross-site evaluation of the grantees' programs. As in the cooperative agreements, the first year of the cross-site evaluation was a planning year. Mathematica-Chapin Hall, in collaboration with the 17 EBHV grantees and their local evaluators, will conduct the cross-site evaluation during the remaining five years. The primary purpose of the cross-site evaluation is to identify successful strategies for adopting, implementing, and sustaining high-quality home visiting programs to prevent child maltreatment. The evaluation was designed to be participatory and utilization-focused, engaging the grantees and other stakeholders at key points in the process and incorporating information gathered back into the program models and evaluation framework. To achieve these goals, the Mathematica-Chapin Hall team will support rigorous local evaluations carried out within a Peer Learning Network (PLN), and use data from local evaluations and cross-site research to assess participant, program, and systems outcomes. A unique feature of this evaluation is the careful attention it will pay to the infrastructure supports for and the implementation fidelity of the home visiting programs. The cross-site evaluation will add to the current home visiting evaluation literature, which tends to focus specifically on program impacts. The cross-site evaluation will focus on domains central to the implementation and monitoring of home visiting programs: systems change, fidelity to the evidence-based model, costs of home visiting programs, and family and child outcomes. The cross-site evaluation also will analyze the process that each grantee uses to implement the grant. This report describes the cross-site evaluation design. The Mathematica-Chapin Hall team worked closely with the 17 EBHV grantees and their local evaluators, as well as CB/ACF and other federal partners, to design the cross-site evaluation. (author abstract)

Other


Data collection instruments for the evidence-based home visiting to prevent child maltreatment cross-site evaluation
United States. Office on Child Abuse and Neglect, April, 2012
Washington, DC: U.S. Office on Child Abuse and Neglect.

In 2008, the Children's Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services funded 17 cooperative agreements to support the infrastructure needed for the high-quality implementation of existing evidence-based home visiting (EBHV) programs to prevent child maltreatment. CB/ACF funded Mathematica Policy Research and Chapin Hall at the University of Chicago to conduct a participatory- and utilization-focused cross-site evaluation of the grantees' EBHV programs. The primary purpose of the cross-site evaluation is to identify successful strategies for adopting, implementing, and sustaining high-quality home visiting programs to prevent child maltreatment. The design for the EBHV cross-site evaluation is described in a design report published in 2009 (Koball et al.). This document is a companion piece to that design report. It provides data collection instruments used in the evaluation during 2010 and 2011. Protocols for site visits conducted in 2010 are in Section I. Instruments used to collect data on system change and infrastructure building appear in Section II. Section III contains instruments developed to collect data on model fidelity. The time use survey administered as part of the cost study is in Section IV. Finally, Section V contains protocols for site visits conducted in 2012. (author abstract)

Other


Evaluating implementation of quality rating and improvement systems
Paulsell, Diane, 2013
In T. Halle, A. Metz, & I. Martinez-Beck (Eds.), Applying implementation science in early childhood programs and systems (pp. 269-293). Baltimore: Paul H. Brookes.

A discussion of early care and education quality rating and improvement systems (QRIS), and an application of implementation science and systems theory to QRIS design, monitoring, and implementation

Other


Evaluating infrastructure development in complex home visiting systems
Hargreaves, Margaret B., June, 2013
American Journal of Evaluation, 34(2), 147-169.

In recent years, increased focus on the effectiveness and accountability of prevention and intervention programs has led to greater government funding for the implementation and spread of evidence-based health and human service delivery models. In particular, attention has been paid to programs that require significant infrastructure investment and systems change to support large scale replication. For conceptual and methodological reasons, such systems change initiatives can be a challenge to evaluate. To overcome these challenges, this article outlines a mixed methods approach to systems change evaluation and offers a case study of how this approach has been used to evaluate the development of system infrastructure supporting the implementation, spread, and sustainability of evidence-based home visiting projects. The approach combined systems concepts (boundaries, relationships, perspectives, ecological levels, and dynamics) and qualitative methods (project site visits, telephone interviews, reviews of project documents and logic models) with quantitative methods (a web-based partner survey) to directly measure the projects' system properties and contextual dynamics, and to assess how these factors were associated with the projects' infrastructure development. In the case study, the projects worked at four ecological levels (organization, community, state, and national) to build eight types of infrastructure (planning, collaboration, operations, workforce development, fiscal capacity, community and political support, communications, and evaluation). The evaluation found that the size of the projects' partner networks was not as important as the quality of their collaboration or their sharing of common goals in the projects' infrastructure development. (author abstract)

Reports & Papers


Evaluating systems change efforts to support evidence-based home visiting: Concepts and methods
United States. Office on Child Abuse and Neglect, 01 September, 2009
Washington, DC: U.S. Office on Child Abuse and Neglect.

In 2008, the Children's Bureau (CB) within the Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services funded 17 cooperative agreements to support the infrastructure needed for the high-quality implementation of existing evidence-based home visiting (EBHV) programs to prevent child maltreatment. Grantees are to leverage their grants with other funding sources to support the implementation of EBHV programs with fidelity, the scaling up of these high-fidelity home visiting models, and the sustainability of the models. Grantees must also conduct local implementation, outcome, and economic evaluations. CB/ACF has funded Mathematica Policy Research, Inc. (MPR) and Chapin Hall at the University of Chicago to conduct a participatory and utilization-focused cross-site evaluation of the grantees' initiatives over the next six years. The primary purpose of the cross-site evaluation is to identify successful strategies for adopting, implementing, and sustaining high-quality home visiting programs to prevent child maltreatment. The MPR-Chapin Hall (MPR-CH) cross-site evaluation will focus on four domains: fidelity, costs, systems, and family and child outcomes. The systems domain evaluation relies on system-based evaluation concepts and methods, articulating a theory of infrastructure change that incorporates key system attributes. This memo provides a literature review for the systems domain evaluation. This literature review is not an exhaustive review of complex systems theory or of the EBHV implementation, scale-up, and sustainability literature. Instead, it focuses on three aspects of the systems domain evaluation: (1) the system-based evaluation approach and theory of change, (2) core EBHV infrastructure concepts, and (3) system-based evaluation methods. (author abstract)

Other


Evidence-based home visiting systems evaluation update: Infrastructure-building plans and activities in 2011
United States. Office on Child Abuse and Neglect, 07 December, 2012
Washington, DC: U.S. Office on Child Abuse and Neglect.

This report provides a snapshot of subcontractors' plans for achieving their targeted outcomes and the EBHV goals and their actual infrastructure-building activities in year 3 of the initiative, roughly at the midpoint of implementation. EBHV subcontractors are operating in complex, dynamic, and unpredictable environments. As they adapt to these changing conditions, their plans and activities change, potentially altering the initiative's outcomes. Tracking these changing conditions and the evolution of subcontractors' plans and activities as they adapt can provide a rich picture of how complex system interventions operate over time and provide lessons and guidance for how to build infrastructure capacity that supports implementation with fidelity, scale-up, and sustainability of EBHV programs. This brief snapshot adds to earlier work by the cross-site evaluation team that documented subcontractor plans at the end of a one-year planning period and infrastructure-building activities during the planning and early implementation period. The report addresses two main research questions at approximately the midpoint of the funding period: 1. What EBHV initiative goals did subcontractors expect to achieve, and how did they plan to do so? What people and institutions did they expect to engage at each infrastructure level? What infrastructure-building strategies did they expect to implement? What infrastructure-building short-term results and long-term outcomes did they expect to achieve? 2. In what types of infrastructure-building activities did subcontractors actually engage? How were subcontractors' activities influenced by economic and other contextual factors? How did infrastructure-building activities change over time? What were subcontractors' perceived successes in progressing toward their targeted infrastructure-building outcomes and the EBHV goals? What challenges and barriers impeded subcontractors' progress toward their targeted infrastructure-building outcomes and the EBHV goals? (author abstract)

Reports & Papers


Home Visiting Evidence of Effectiveness review: Executive summary
United States. Administration for Children and Families. Office of Planning, Research and Evaluation, 15 October, 2011
Washington, DC: U.S. Administration for Children and Families, Office of Planning, Research and Evaluation.

A summary of a review of research on the effectiveness of home visiting programs for pregnant women or families with children from birth to age 5

Executive Summary


Home Visit Rating Scales-Adapted & Extended
Roggman, Lori A., 2010
Unpublished instrument.

Instruments


Measuring implementation of early childhood interventions at multiple system levels
United States. Administration for Children and Families. Office of Planning, Research and Evaluation, April, 2013
(Research Brief OPRE 2013-16). Washington, DC: U.S. Administration for Children and Families, Office of Planning, Research and Evaluation.

A discussion of measuring the implementation of early childhood interventions within multiple-level service delivery systems, with examples of tools for multiple-level assessments

Methods


Recruiting and retaining home visitors for evidence-based home visiting (EBHV): Experiences of EBHV grantees
Coffee-Borden, Brandon, October, 2010
(Brief 2). Princeton, NJ: Mathematica Policy Research.

This brief summarizes lessons about recruiting and training home visitors for evidence-based programs from grantees participating in the Children's Bureau's Supporting Evidence-Based Home Visiting (EBHV) to Prevent Child Maltreatment grantee cluster. As part of the EBHV cross-site evaluation, Mathematica Policy Research collected the data in spring 2010 during a series of telephone interviews conducted with managers of agencies from 9 of the 17 grantees that were implementing home visiting programs. These "implementing agencies" were selected to participate in the interviews because they had recruited, hired, and trained new home visitors during the preceding year (in contrast to some agencies that were already operating programs when the grant began, or had not yet reached the stage of staffing their home visiting programs). Most implementing agencies had experience with home visiting but few had previously implemented an evidence-based program. The brief provides an overview of agencies' strategies for recruiting and training home visitors, as well as the challenges they faced and lessons learned. (author abstract)

Reports & Papers


Replicating and scaling up evidence-based home visiting programs: The role of implementation research
Paulsell, Diane, September, 2012
In D. Spiker, E. Gaylor (Topic Eds.), R. E. Tremblay, M. Boivin, & R. D. Peters (Eds.), Encyclopedia on early childhood development. Montreal, Quebec, Canada: Centre of Excellence for Early Childhood Development.

This article discusses implementation research in the home visiting field, how such research can be used to strengthen programs and improve targeted outcomes, and the conditions and supports necessary for effective implementation. (author abstract)

Fact Sheets & Briefs


Supporting a culture of evidence-based practice and continuous program improvement: A staged approach to implementing and studying international early childhood development programs
Boller, Kimberley, December, 2011
(Working Paper No. 2011-040). Chicago: University of Chicago, Human Capital and Economic Opportunity Global Working Group.

This concept paper proposes a four-stage approach to in-country/region ECD program development, selection, and inquiry designed to build the evidence base required to guide program and policy decisions. (author abstract)

Fact Sheets & Briefs


Supporting evidence-based home visiting to prevent child maltreatment: Overview of the cross-site evaluation
United States. Office on Child Abuse and Neglect, 30 October, 2009
Washington, DC: U.S. Office on Child Abuse and Neglect.

An overview of the cross-site evaluation design and program models of 17 grantees participating in the implementation of evidence-based home visiting programs to prevent child maltreatment

Other


Supporting home visitors in evidence-based programs: Experiences of EBHV grantees
Coffee-Borden, Brandon, December, 2010
(Brief 4). Princeton, NJ: Mathematica Policy Research.

This brief summarizes experiences supporting and supervising home visitors working in evidence-based programs affiliated with grantees participating in the Children's Bureau's Supporting Evidence-Based Home Visiting (EBHV) to Prevent Child Maltreatment initiative. As part of the EBHV cross-site evaluation, Mathematica Policy Research collected the data in spring 2010 during a series of telephone interviews conducted with managers of agencies from 9 of the 17 grantees that were implementing home visiting. These "implementing agencies" were selected to participate in the interviews because they had recruited, hired, and trained new home visitors during the preceding year (in contrast to some agencies that were already operating programs when the grant began, or had not yet reached the stage of staffing their home visiting programs). Most implementing agencies had previous experience with home visiting but few had implemented an evidence-based program. The brief provides an overview of agencies' strategies for supervising and supporting home visitors, as well as the challenges they faced and lessons learned. (author abstract)

Reports & Papers




Research Connections is supported by grant #90YE0104 from the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. The contents are solely the responsibility of the National Center for Children in Poverty and the Inter-university Consortium for Political and Social Research and do not necessarily represent the official views of the Office of Planning, Research and Evaluation, the Administration for Children and Families, or the U.S. Department of Health and Human Services.
