Step 2. Select Interventions that Demonstrate a Rationale

Well-designed and well-implemented data science and data literacy instructional programs have the potential to support a variety of student outcomes. Although the field is new, you should design or select programs that, whenever possible, use evidence-based practices in their design and draw upon existing evidence-based practices and resources from related domains. Strategies and resources for both are discussed below.

For more information about the Department's definition of evidence-based practices, visit What is An Evidence-Based Practice?

Guidelines for selecting interventions that Demonstrate a Rationale:

  • Check if the program used evidence-based practices or theory for design:
    • Does the program build on existing peer-reviewed education research?
    • Does the program have a clear and cohesive pedagogical approach for instruction?
  • Check if the program uses age-appropriate and subject-appropriate technology tools:
    • Does the program use software that builds real-world data acumen? Is the software authentic to real-world practice?
    • Does the tool or software appropriately challenge students? Does it present accessible learning opportunities rather than overwhelming them?
    • Does the program use technology and datasets that align with required content, or build upon existing knowledge, in the educator's subject (e.g., math, science, social studies)?

Making Intentional Choices: Data Science Software

What software do "real" data scientists use? A professional data scientist is often comfortable with many different software tools. Should students be as well, or should technology barriers be minimized? How should an educator pick a technology tool for the classroom? Early research suggests that exposure to and practice with age-appropriate data science software can help build student confidence. Educators may consider using tinker-based exploration tools (CODAP, Tuva, Cognimates), popular data science software (R/RStudio, Python, SAS, Stata, Tableau, Power BI), or even spreadsheets (Microsoft Excel, Google Sheets). Some programs have also created a progression across multiple tools, introducing students to different software along a learning trajectory that begins with education-focused tools. Regardless of approach, building a common understanding of the terms, the differences between tools, and their real-world and educational use cases should be an intentional process involving multiple stakeholders when designing your program or intervention.
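To make the comparison concrete, here is a minimal sketch of the kind of exploratory task students might complete in any of these tools, written in Python with pandas. The dataset, column names, and values are hypothetical; the same exercise could be done in a spreadsheet pivot table or in a tinker-based tool such as CODAP.

```python
# Minimal sketch of a classroom-style exploratory task in Python with pandas.
# The survey data below is hypothetical and built inline so the example runs
# as-is; in practice students might load a CSV exported from a class survey.
import pandas as pd

survey = pd.DataFrame({
    "grade": [6, 6, 7, 7, 8, 8],
    "hours_sleep": [8.5, 9.0, 8.0, 7.5, 7.0, 6.5],
    "screen_minutes": [60, 45, 90, 120, 150, 180],
})

# Average sleep and screen time by grade: the same question a student could
# answer with a spreadsheet pivot table or a CODAP graph.
print(survey.groupby("grade")[["hours_sleep", "screen_minutes"]].mean())

# A simple relationship check: does reported sleep fall as screen time rises?
print(survey["hours_sleep"].corr(survey["screen_minutes"]))
```

Comparing how the same task looks across tools can help stakeholders agree on which software fits a given grade level and subject.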

  • Check if the program offers professional development resources:
    • Does the program offer synchronous or asynchronous learning resources for educators to implement the program as designed?
    • Does the program offer continuing support after initial professional development programming?
  • If the program has been evaluated before, check the following:
    • Was the student population similar to the students in your school or district?
    • Did the pilot or evaluation use a validated assessment or measurement tool?
  • If the program has not been evaluated before, check the following:
    • Have you prepared to build evidence associated with the implementation of the program, including identifying a validated assessment or measure for an outcome relevant to your circumstances?

Explore Example Programs!

Try comparing the following example data science programs yourself against the checklist above:

Explore existing evidence-based practices in data science education and related education literature:

Explore pilot data literacy programs from peer schools or districts:

  • Consult subject specialists (math, science, social studies, etc.) at your state education agency (SEA) to learn of any nearby data science or data literacy pilots, evaluation studies, or new instructional programs.
  • Consult nearby educators or education leaders at neighboring districts.
  • Explore programs in states and districts like yours, and consider the following questions:
    • How does their program relate to yours? Is the student community similar or different?
    • What steps has their program taken to seek local stakeholder engagement?
    • What steps has their program taken to collect data, evaluations, or other practice-based learnings that may be useful for yours?