ii20

The Interactive Institutes 2020 – Building and Sustaining a Culture of High-Quality Data provided opportunities for participants to take a deep dive into data quality topics and learn about data culture change. Participants were able to discover best practices to improve data collection, reporting, analysis, and use. They also engaged with peers and TA providers about trending data quality topics. Each session presented powerful ideas and actionable plans to improve work processes and data quality.

 

IDC held an in-person Interactive Institute in Fort Worth, TX, on March 3–4, 2020. This live event offered 32 sessions, including plenaries, topical bursts, and deep dives with group activities and discussions. Due to COVID-19 restrictions, IDC canceled plans for its second live ii20 in Nashville, TN, scheduled for March 31–April 1, 2020. Instead, IDC reformatted 12 of the popular sessions from the live event and presented them virtually June 23–25, 2020. You can access the presentation slides and related resources below.

Selected Topic Presentations

Select a topic to see the related sessions, presenters, and downloads.

 
  • Session: Plenary/Keynote From Compliance to Continuous Improvement: Creating a Culture of High-Quality Data

    Staff from the Data Quality Campaign facilitated a conversation among representatives from Illinois and Virginia and Interactive Institute participants about the actions state leaders can take to create a culture of high-quality data within their department and across their agency.

  • Session: Virtual From Compliance to Continuous Improvement: Creating a Culture of High-Quality Data

    Staff from the Data Quality Campaign facilitated a conversation among representatives from Illinois and Virginia and participants about the actions state leaders can take to create a culture of high-quality data within their department and across their agency.

  • Session: Plenary OSEP Straight Talk

    This OSEP Straight Talk plenary at IDC’s Texas Interactive Institute presented information on how to access coronavirus guidance from the Centers for Disease Control and Prevention and the Readiness and Emergency Management for Schools Technical Assistance Center. The presentation offered a look at what’s new in OSEP and featured updates on major OSEP initiatives and the work of OSEP’s Research to Practice and Monitoring and State Improvement Planning divisions. Participants learned about resources on restraint and seclusion; OSEP’s release of 10 new 618 data files; current investments to support data use; and how to access the OSEP TA&D network. They also learned about key OSEP efforts related to the SPP/APR, differentiated monitoring and support, state Part B and Part C grant applications, and the new significant disproportionality reporting form.

  • Session: Virtual How Can Data Managers Contribute to Building and Sustaining a Culture of High-Quality Data? (SEDMAG)

    During this SEDMAG panel discussion, data managers shared their challenges and successes in working to build and sustain a culture of high-quality data across their state education agencies.

  • Session: A1 Top 10 Reasons to Improve Your Local Data

    High-quality state data is dependent upon high-quality local data. Learn why it is important to improve local data and what you can do to support locals in improving the validity and reliability of their data. Presenters will introduce the new IDC LEA Data Processes Toolkit, which states can use to engage with their locals in documenting data processes as a way to improve local data.

  • Session: A2 Improving Local Data One District at a Time: Introduction to IDC’s LEA Data Processes Toolkit

    Do you ever wonder what states are doing to ensure high-quality data at the local level? Participants in this session will learn the value of building local capacity of data stewards and instituting a culture of high-quality data in each local education agency. States working with LEAs to improve their data can use IDC’s new LEA Data Processes Toolkit to support that work. Presenters will provide an overview of the toolkit, and participants will have an opportunity to experience "preparing for this LEA data processes work."

  • Session: Virtual Improving Local Data One District at a Time Using IDC’s New LEA Data Processes Toolkit

    High-quality state data is dependent upon high-quality local data. Do you ever wonder what states are doing to ensure high-quality data at the local level? Participants in this session learned the value of building local capacity of data staff to instill a culture of high-quality data in each LEA. Participants learned how states can use the new IDC LEA Data Processes Toolkit to help LEAs improve their data. Presenters provided an overview of the toolkit, and participants had an opportunity to experience "preparing for this work."

  • Session: B1 Creating a Culture of High-Quality Data: OSEP’s Perspective

    In this session, participants heard OSEP’s perspective on the importance of the U.S. Department of Education reporting requirements and agency-wide efforts to increase transparency with data reporting. Participants gained a better understanding of the impact data quality has on this work.

  • Session: B2 Creating a Culture of High-Quality Data: Why It Matters

    In this session, participants engaged in facilitated discussion related to the successes and challenges of creating a culture of high-quality data to drive system improvement. They heard about ways in which OSEP can support state efforts to create and grow their culture of high-quality data and how OSEP and states can partner to make data accessible, understandable, and user-friendly to all stakeholders.

  • Session: C1 Ensuring High-Quality 618 Submissions to Build a Solid Foundation for Program Improvement

    Do you want to build a strong foundation for data analysis and program improvement using your federal 618 data? States can use the submission and reporting process as a means of ensuring their data are of high quality. This session included a review of the 618 data reporting cycle, available EDFacts Business Rules documentation, pre-submission edit check tools, tips about data notes, and where to find help when states need it.

  • Session: C2 Opening the Door for Data Use: Improving Data Quality With Data Integration

    State education agencies are working across program lines to provide more holistic support to districts and students. Integrating data across programs provides opportunities for more complex data analysis and improved data quality. Governance and documentation are key for building and sustaining a robust integrated data system. In this session, participants heard how states have used IDC’s SEA Data Processes Toolkit and CIID’s Data Integration Toolkit to improve data quality in their integrated data systems and how, in doing so, they are opening the door for more integrated analysis.

  • Session: Virtual Opening the Door for Data Use: Improving Data With Data Integration

    SEAs are working across program lines to provide more holistic support to districts and students. Integrating data across programs provides opportunities for more complex data analysis and improved data quality. Governance and documentation are key for building and sustaining a robust integrated data system. In this session, participants heard how states have used IDC’s SEA Data Processes Toolkit and CIID’s Data Integration Toolkit to improve data quality in their integrated data systems and how, in doing so, they are opening the door for more integrated data analysis.

  • Session: D1 Representativeness: How Do You Measure It and How Can You Improve It?

    This session provided an overview of the data quality issue of representativeness—that is, reporting data that accurately reflect all subgroups of students with disabilities and families in the broader target population. Presenters outlined the importance of representative responses in survey data collection—e.g., related to Indicators B8 and B14—as a component of valid and reliable data and introduced strategies for improving representativeness. Presenters discussed the intersectionality of response rate and representativeness. Presenters also referred participants to tools from IDC and NTACT available to help users assess representativeness and improve response rates across groups.

  • Session: D2 What Do Your B14 Data Really Tell You if They Don’t Represent Your Students? Focusing on Representativeness in B14 Responses

    Forty-nine states report using survey methods to collect data for Indicator B14. However, states struggle to achieve samples of respondents that represent the target population. This workshop emphasized the importance of collecting representative data for reporting on performance and making program and policy decisions. Presenters highlighted processes and structures for both assessing and improving representativeness. Participants worked hands-on with helpful IDC and NTACT resources as they engaged in a facilitated discussion that included states sharing the steps they have taken to improve.

  • Session: Virtual What Do Your Data Tell You If They Don’t Represent Your Students? Representativeness in B8 and B14

    This session tackled the data quality issue of representativeness related to Indicators B8 and B14—that is, reporting data that accurately reflect all subgroups of students with disabilities and families in the broader target population for each indicator. States often struggle to achieve respondent representativeness in both Indicators. Presenters emphasized the importance of representative responses in survey data collection as a component of valid and reliable data and introduced strategies for improving representativeness. Presenters shared one state’s efforts to apply processes and structures to improve representativeness, using tools and guidance from OSEP-funded technical assistance centers. Participants engaged with the tools as they participated in a facilitated discussion that included sharing challenges and solutions.

  • Session: E1 Streamlining Your Data: Special Education Data Management Systems

    Don’t know where to begin with data system development? This session provided foundational information about data management and the use of statewide special education data systems to improve data quality. Participants learned about essential considerations for these data systems and heard real-life examples of states that have approached system development in different ways. These state examples underscored successes and challenges with statewide data system implementation and offered valuable take-aways for those states interested in pursuing a special education data management system.

  • Session: F1 What It Takes to Improve Preschool Environments Data

    Two states shared their efforts to improve preschool educational environments data and engaged with participants in a facilitated discussion about data challenges and solutions for improving Indicator B6 data quality. IDC facilitators framed this data discussion with a high-level overview of IDC’s Educational Environments Ages 3-5: B6 Reporting Tools 2017-2018 Clarifications and Interactive Application. State presenters shared how this tool helped them analyze their state and local data to improve Indicator B6 data and make program improvements.

  • Session: Virtual What It Takes to Improve Preschool Environments Data

    Two states shared their efforts to improve preschool educational environments data and engaged with participants in a facilitated discussion about data challenges and solutions for improving Indicator B6 data quality. IDC facilitators framed this data discussion with a high-level overview of IDC’s Educational Environments Ages 3-5: B6 Reporting Tools 2017-2018 Clarifications and Interactive Application. Presenters shared how using this tool helped the states analyze state and local data to improve Indicator B6 data and make program improvements.

  • Session: G1 Increasing Collaboration Across SEA Divisions to Collect and Use High-Quality Assessment Data

    This session provided an overview of federal requirements related to the collection and reporting of assessment data. The overview highlighted the importance of collaboration between EDFacts coordinators, CSPR coordinators, and IDEA Part B data managers in ensuring that data states submit under Part B Indicator 3 are of high quality. Presenters addressed various ways that decision makers within the SEA can use assessment data at the state level to support improving results for students with disabilities. Participants explored how two IDC tools (The Assessment Data Journey: Are We There Yet infographic and the Assessment protocol in the Part B IDEA Data Processes Toolkit) can support collaborative state efforts to collect and report high-quality assessment data.

  • Session: H1 Navigating the SPP/APR Slippage Slope

    Presenters discussed the definitions of slippage on SPP/APR Indicators and provided example statements for communicating to various stakeholders about slippage. Next, they modeled an inquiry process of using data to test hypotheses about why slippage is occurring and identify factors that may affect slippage. They discussed the benefits of conducting this inquiry process in a manner that is integrated into program improvement cycles well before the SPP/APR is due. Presenters highlighted challenges and opportunities for engaging LEAs (e.g., working with stakeholders, using slippage data visualization) in this conversation as a part of the inquiry process. Participants left with a deeper understanding of how to use the SPP/APR to drive program improvement when slippage has occurred.

  • Session: I1 Understanding Suspension and Expulsion Data Collection

    This topical burst focused on the IDEA Part B 618 Discipline data states submit for students ages 3-21, with an emphasis on preschool discipline data. States collect data on suspension and expulsion of children and youth with disabilities from a wide variety of sources, including private and public preschools as well as K-12 schools. This session presented the data states collect and report, sources of the data, related data quality challenges, and what the data can tell the SEA about which children and youth receive disciplinary actions.

  • Session: I2 Exploring Preschool Discipline Data and Planning for Program Improvement

    This session addressed the story discipline data can tell, as well as the data sources and diverse perspectives (such as perspectives of 619 coordinators) that are necessary to tell a complete story about preschool discipline issues. Presenters also discussed IDC resources that can help states improve the quality of their discipline data and use the data for program improvement. Preschool discipline data are required for federal reporting, and some states also legislate the collection and reporting of preschool discipline data. Presenters discussed why it is important that Part B 619 coordinators work with special education directors, data managers, EDFacts coordinators, SPP/APR coordinators, significant disproportionality staff, and indicator leads to draw the complete picture of what the state’s preschool discipline data reveal about who is being disciplined and how understanding the data can help drive program improvement. States had the opportunity to share their efforts to support high-quality data collection to tell their discipline stories.

  • Session: Virtual Exploring Preschool Discipline Data and Planning for Program Improvement

    This session addressed the story discipline data can tell, as well as the data sources and diverse perspectives (such as perspectives of 619 coordinators) that are needed to tell a complete story about preschool discipline issues. Presenters also discussed IDC resources that can help states improve the quality of their discipline data and use the data for program improvement. Preschool discipline data are required for federal reporting, and some states also legislate the collection and reporting of preschool discipline data. Presenters discussed why it is important that Part B 619 coordinators work with special education directors, data managers, EDFacts coordinators, SPP/APR coordinators, significant disproportionality staff, and indicator leads to draw the complete picture of what the state’s preschool discipline data reveal about who is being disciplined and how understanding the data can help drive program improvement. States had the opportunity to interact virtually to share ideas about high-quality preschool discipline data.

  • Session: J1 Significant Disproportionality: Supporting LEAs to Progress Beyond Identification

    States are completing their first year of implementing the revised regulations for significant disproportionality. As a result of the revised regulations, many states identified more LEAs with significant disproportionality than they ever have previously. This session provided an overview of the ways in which states have supported LEAs to navigate the complexities of the requirements, including identifying root causes or factors that contribute to the significant disproportionality and then developing a plan for addressing the significant disproportionality.

  • Session: J2 Significant Disproportionality: Supporting LEAs to Monitor Progress After Identification

    This session explored the role of evaluation as states and LEAs are beginning the second year of implementation of the revised regulations for significant disproportionality. State staff and TA providers discussed ways that states can evaluate the efforts they have made to support LEAs to implement CCEIS plans after being identified with significant disproportionality. In addition, participants discussed ways in which states are ensuring LEAs monitor their own progress and adjust their CCEIS plans as needed.

  • Session: Virtual Significant Disproportionality as a Spark for Continuous Improvement in Student Outcomes

    States are completing their first year of implementing the revised regulations for significant disproportionality. As a result of the revised regulations, many states identified more local education agencies (LEAs) with significant disproportionality than they ever did previously. In this session, presenters provided an overview of the ways in which states have supported LEAs to navigate the complexities of the requirements. Presenters and participants also explored ways in which states can evaluate their own efforts and help LEAs evaluate their efforts, with the goal of using their significant disproportionality work as a spark for continuous improvement in outcomes for students with disabilities.

  • Session: K1 Everything I Wanted to Know About the OSEP Quality Review Process but Forgot to Ask

    Session participants received a refresher on OSEP’s Data Quality Review process at both the SEA and LEA levels. Topics the presenters covered included an overview of what OSEP considers in the data review and what it looks for in a data note. LEA-level topics included an introduction to the Foundations for Evidence-Based Policymaking Act of 2018 and why it matters, along with what to expect in the near future as OSEP reviews and publishes LEA-level data.

  • Session: K2 Getting to High-Quality LEA Data

    In this workshop, participants received a brief overview of the current state of OSEP’s publishing of LEA-level data, the LEA-level data quality review process, and what to expect in the near future. Presenters invited participants to share their thoughts through facilitated group discussions.

  • Session: L1 What All SEAs Should Know About Working on Data Quality with LEAs

    Engaging LEAs in data work can be tricky if SEAs don’t consider the underlying principles of successful engagement. Participants learned about key principles of successful engagement and explored how acting on these principles can help SEAs make inroads into the complicated task of working with LEAs to improve the quality of their data.

  • Session: L2 Partnering for Progress: Engaging LEAs to Improve the Quality of Their Data

    How can SEAs partner with their LEAs to succeed in collecting and reporting high-quality data? In this session, presenters provided an overview of key principles of successful engagement, and participants brainstormed ideas for engaging with LEAs to improve the quality of their data. Participants experienced using IDC’s Data Meeting Toolkit and Part B Indicator Data Display Wizard in a hands-on activity designed to practice engaging with others in discussing data. Upon IDC recommendation, state teams attended the session together and shared techniques they have used successfully to engage in data conversations between state and local staff.

  • Session: Virtual Partnering for Progress: Engaging LEAs to Improve the Quality of Their Data

    How can SEAs partner with their LEAs to succeed in collecting and reporting high-quality data? In this session, presenters provided an overview of key principles of successful engagement, and participants brainstormed ideas for engaging with LEAs to improve the quality of their data. Presenters demonstrated IDC’s Data Meeting Toolkit and Part B Indicator Data Display Wizard, and participants experienced using the tools to practice engaging with others in discussing data. Participants also were able to share techniques they have used successfully to engage in data conversations between state and local staff.

  • Session: M1 Working With Data Using a Project Manager Approach

    Do you wonder if you are managing your work or your work is managing you? This session explored ways Part B data managers, 619 coordinators, and SPP/APR leads can take back control of their workloads by incorporating some of the tools and strategies project managers use. Participants walked away with tips they can implement to help their states collect, analyze, process, and report data efficiently and effectively with an eye toward improving and maintaining high quality.

  • Session: M2 Minimizing Chaos and Maximizing Quality: Managing the SPP/APR Process

    This session took a deeper dive into the Project Manager Approach as it relates to coordinating workloads for completing the SPP/APR. Presenters explored tools in IDC’s Part B IDEA Data Processes Toolkit that states can use to ensure the successful completion of the SPP/APR, and participants discussed their relevant practices, successes, and challenges. Presenters also facilitated a hands-on activity for participants to try out the Project Manager Approach using the SPP/APR indicators.

  • Session: Virtual Minimizing Chaos and Maximizing Quality: Managing SPP/APR Processes Using a Project Manager Approach

    Do you wonder if you are managing your work or your work is managing you? This session explored ways Part B data managers, 619 coordinators, and SPP/APR leads can take back control of their workloads by incorporating some of the tools and strategies project managers use. Participants discussed their relevant practices, successes, and challenges with completing the SPP/APR and walked away with tips they can implement to help their states minimize chaos and maximize the quality of the SPP/APR submissions.

  • Session: N1 Weeding out Errors in Your Data

    Don’t just spray weed killer to deal with your data quality issues. Get to the root of data errors. Participants learned how applying business rules offers a way to address data quality issues at the source rather than giving surface-level attention to issues such as completeness, reliability, and validity.

  • Session: N2 Leaving a Breadcrumb Trail on the Path to Data Quality

    Are you feeling lost in the woods when it comes to dealing with data quality issues? This session provided tips and resources for developing a trail of breadcrumbs to help you find your way. Presenters provided a brief overview of the new Business Rules Documentation Protocol that IDC added to its SEA Data Processes Toolkit to support the collection of high-quality data. States shared their experiences with creating business rules protocols for the IDEA data collections, building capacity of data stewards, and instilling a culture of high-quality data in their states. Presenters facilitated discussion to help participants understand the value of documenting business rules and provided them an opportunity to experience preparing for and beginning this important work.

  • Session: Virtual Building a Roadmap to Data Quality

    Are you experiencing challenges with data quality? This session presented tips and resources to help build a roadmap to data quality through the use of business rules documentation. Presenters provided a brief overview of the new IDC Business Rules Documentation Protocol to support the collection of high-quality data. A state presenter shared his state’s experience with creating business rules protocols for the IDEA data collections, building capacity of data staff, and supporting a culture of high-quality data in the state.

  • Session: O1 Pathway to the SiMR: Measuring Your SSIP Progress With Progress Monitoring Data

    Whether your state is planning to make changes with your SSIP or hold the course with your current initiative, collecting meaningful progress data is crucial to reflect the impact of your state’s work on the pathway to the SiMR. This presentation introduced the importance of selecting appropriate SSIP progress data to show the impact of your state’s work and answer the questions “Why does it matter?” and “How do you know?” Presenters focused on the alignment of data with activities, interim outcomes, and the SiMR and suggested strategies for collecting timely, relevant data that engage stakeholders, provide evidence of progress for the SSIP report, and inform states about their progress toward achieving long-term improvements.

  • Session: O2 Square Pegs & Round Holes: Selecting the Right SSIP Data Measures to Assess & Document Progress

    This presentation outlined how to make sure SSIP activities, outcomes, and progress data align to provide a meaningful assessment of the state’s SSIP work. Presenters discussed potential mismatches between activities, outcomes, and data, as well as strategies for identifying progress data that are both meaningful and feasible to collect and use. Attendees took an in-depth look at sample SSIP Theories of Action to identify appropriate sources of progress monitoring data. Attendees also had an opportunity to review or develop data measures for their own states’ current or proposed SSIP activities.

  • Session: Virtual Square Pegs and Round Holes: Selecting the Right SSIP Data Measures to Assess and Document Progress

    Whether your state is planning to make changes to your SSIP or hold the course with your current initiative, collecting meaningful progress data is crucial to reflect the effect of your state’s work on its pathway to the SiMR. This session emphasized the importance of selecting appropriate SSIP progress data to show the results of your state’s work. Presenters discussed potential mismatches between activities, outcomes, and data, as well as strategies for identifying progress data that are both meaningful and feasible to collect and use. Participants had the opportunity to review a sample SSIP Theory of Action to identify appropriate sources of progress monitoring data and to review or develop data measures for their own states’ SSIP activities.

  • Session: P1 Using Multiple Data Sources for Deeper Data Analysis

    An abundance of educational data exists, and many state staff work hard to ensure that their states accurately report quality data. The work should not stop when states successfully collect and report the data, and this session examined the next steps states need to take afterward. These steps included ensuring SEA and LEA staff are data literate and supporting analysis and use of data from multiple sources. Taking these next steps can lead to more effective decision making and improved educational outcomes for children and youth with disabilities.

  • Session: P2 Developing Effective Practices for In-Depth Analysis of Your Data to Improve Results

    Participants joined in as staff considered effective ways to analyze and use high-quality data to improve educational outcomes for children and youth with disabilities. This interactive session looked at approaches to combining and using student data, specifically exiting and discipline data, as well as other educational data. To effectively combine and use data from different sources for problem solving, data users must first understand and then analyze the data, identify root causes of problems, and develop hypotheses related to the problems and solutions in order to create a plan to improve educational outcomes for children and youth with disabilities. This session shared a process and example for in-depth data analysis and supported a discussion of how to help LEAs use this process for improved educational outcomes.