Search Results: (46-60 of 478 records)
|Characteristics and Performance of High School Equivalency Exam Takers in New Jersey
Since 2014, the New Jersey Department of Education has offered three high school equivalency (HSE) exams for nongraduates seeking credentials: the GED, the High School Equivalency Test (HiSET), and the Test Assessing Secondary Completion (TASC). This study used data on exam takers who had been grade 8 students in a New Jersey public school between 2008/09 and 2013/14 and who had attempted at least one HSE exam in New Jersey between March 2014 and December 2018. It analyzed how the characteristics of exam takers differ across exams and from the characteristics of non–exam takers, how the performance of exam takers with similar backgrounds varies, and how a recent reduction in the passing threshold for two of the exams affected passing rates. Among all students who had been grade 8 students in a New Jersey public school during the study years, HSE exam takers completed fewer years of school, were more likely to have been eligible for the national school lunch program in grade 8, and were more likely to identify as Black or Hispanic than non–exam takers. GED takers had received higher grade 8 standardized test scores, were more likely to identify as White, and were less likely to have been eligible for the national school lunch program in grade 8 than HiSET and TASC takers. Under the New Jersey Department of Education's original passing thresholds, exam takers in the study sample were more likely to pass the HiSET and TASC than the GED on the first attempt (after grade 8 standardized test scores were controlled for). However, after the reduction in passing thresholds, the first-attempt passing rate was similar across the three exams. Under the new passing thresholds, two-thirds of GED takers and more than half of HiSET and TASC takers passed on the first attempt, and, when all exam attempts are included, three-quarters of the takers of each exam eventually passed it.
|Using a survey of social and emotional learning and school climate to inform decisionmaking
The District of Columbia Public Schools (DCPS) has prioritized efforts to support students' social and emotional learning (SEL) competencies, such as perseverance and social awareness. To measure students' SEL competencies and the school experiences that promote SEL competencies (school climate), DCPS began administering annual surveys to students, teachers, and parents in 2017/18. DCPS partnered with the Mid-Atlantic Regional Educational Laboratory to study how the district could use these surveys to improve students' outcomes. The study found the following:
|Exploring the Potential Role of Staff Surveys in School Leader Evaluation
The Mid-Atlantic Regional Educational Laboratory partnered with the District of Columbia Public Schools (DCPS) to explore the potential use of teacher surveys in school leader evaluation. The DCPS evaluation system, like many others, currently consists of two components: an assessment on how well a school performs on a set of student achievement metrics (such as proficiency on standardized tests) and an assessment by a supervisor of the principal’s leadership across multiple domains. Incorporating teacher surveys could provide an additional perspective on principals’ leadership and performance. Examining data from two teacher surveys that DCPS has used (Panorama and Insight), the study found that:
Overall, the findings suggest that DCPS could benefit from using elements of teacher surveys to bring in teachers’ perspectives on principals’ leadership related to instruction, talent, and school culture. Other districts may also wish to consider employing teacher surveys to gain an additional perspective on principals from staff who interact with them every day.
|Associations between High School Students' Social-Emotional Competencies and Their High School and College Academic and Behavioral Outcomes in the Commonwealth of the Northern Mariana Islands
This study addressed the need expressed by education stakeholders in the Commonwealth of the Northern Mariana Islands to better understand their high school students' social-emotional competencies and how those competencies might be associated with students' academic and behavioral outcomes in high school and college. Social-emotional competencies refer to the knowledge, beliefs, and behaviors that help students recognize and manage their emotions, build positive relationships, and make responsible decisions. In May 2019 grade 11 and 12 students who were enrolled in high schools within the Commonwealth of the Northern Mariana Islands Public School System responded to survey questions regarding their self-management, growth mindset, self-efficacy, sense of belonging, and social awareness using a 5-point scale, with higher scores reflecting greater social-emotional competencies. The study found that both high school students overall and the subset who went on to attend Northern Marianas College scored highest in self-management and lowest in self-efficacy. High school students with higher growth mindset or self-efficacy scores had higher high school grade point averages and grade 10 ACT Aspire math and reading scale scores. Higher self-efficacy scores were also associated with fewer days absent from high school. Students with higher social awareness scores had lower high school grade point averages. Among the high school students who went on to attend college at Northern Marianas College, higher growth mindset scores were associated with higher first semester college grade point averages, after student characteristics were controlled for. None of the four other social-emotional competency domains was associated with any of the college academic or behavioral outcomes.
|Cost-Feasibility Analysis Toolkit for Supplemental Online Programs: User Guide
Regional Educational Laboratory Appalachia researchers developed the Cost-Feasibility Analysis (CFA) Toolkit to help education leaders estimate whether implementing supplemental online programs is affordable given a school or district's available resources. The CFA Toolkit guides users through a four-stage process that yields cost information that can support decisionmaking about implementing such a program. The toolkit includes guidance, helpful resources, and an Excel-based cost-estimation tool that supports users with planning (stage 1), collecting data (stage 2), estimating program costs (stage 3), and determining the feasibility of implementing the supplemental online program (stage 4).
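The core arithmetic of stages 3 and 4, totaling estimated program costs and comparing the total with available resources, can be sketched in a few lines. The following is a minimal illustration only; the cost categories and dollar figures are hypothetical, and the actual toolkit is an Excel-based tool with far more detailed ingredient categories:

```python
# Minimal sketch of stages 3-4 of a cost-feasibility analysis:
# total the estimated ingredient costs, then compare the total to available funds.
# All categories and dollar figures below are hypothetical illustrations.

def is_feasible(costs: dict[str, float], available_funds: float) -> tuple[float, bool]:
    """Return (total estimated cost, whether the program fits the budget)."""
    total = sum(costs.values())
    return total, total <= available_funds

program_costs = {
    "licenses": 12_000.0,   # per-student software licenses
    "personnel": 25_000.0,  # facilitator salaries (prorated)
    "hardware": 8_000.0,    # devices and connectivity
    "training": 3_500.0,    # staff professional development
}

total, feasible = is_feasible(program_costs, available_funds=50_000.0)
print(f"Estimated cost: ${total:,.0f}; feasible: {feasible}")
# prints "Estimated cost: $48,500; feasible: True"
```

The same comparison run against a smaller hypothetical budget (for example, $40,000) would report the program as not feasible, which is the decision signal stage 4 of the toolkit is designed to produce.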
|The Effect of School Report Card Design on Usability, Understanding, and Satisfaction
Education policymakers view transparency and accountability as critical to the success of schools. To support these goals, the District of Columbia Office of the State Superintendent of Education (OSSE) has developed an online school report card for communicating information about the characteristics and performance of schools. To support OSSE’s interest in making report cards more usable, this study assessed the effect of different designs on how easy the report cards are to use and understand, how easy it is to find information in them, and whether users would recommend the site to others.
The study found that moving the link to details of the district’s School Transparency and Reporting (STAR) framework from the top of the page to beneath the STAR score improved the site’s usability and that reporting the number of points possible for each metric led to a better understanding of how the score is calculated. The combination of design features that produced the best performance on all measures included these two design changes. Other designs had mixed effects. In particular, making year-over-year change in school performance salient made it easier to identify which schools had improved the most, but participants disliked this feature (demonstrated by lower ratings for usability and satisfaction). In general, participants who accessed the site with mobile devices had more difficulty using it. This study illustrates how policymakers and practitioners in other states can efficiently test school report card design changes at scale.
|Effectiveness of Early Literacy Instruction: Summary of 20 Years of Research
Children entering kindergarten vary greatly in their language and literacy skills. Therefore, up-to-date information about evidence-based practices is essential for early childhood educators and policymakers as they support preschool children’s language and literacy development. This study used a process modeled after the What Works Clearinghouse (WWC) methodology to systematically identify effective early childhood curricula, lesson packages, instructional practices, and technology programs in studies conducted from 1997 to 2017. More than 74,000 studies were screened to identify interventions that improved students' performance in six language and literacy domains (language, phonological awareness, print knowledge, decoding, early writing, and general literacy). The study team identified 132 interventions evaluated in 109 studies that it determined were high-quality experimental or quasi-experimental studies. The WWC's evidence standards are used to assess the quality of an evaluation study and the strength of its claims about whether an intervention caused the observed effect on student achievement. To better understand the effectiveness of the interventions, their implementation characteristics and instructional features were coded for the relevant language and literacy domains. The findings revealed that instruction that teaches a specific domain is likely to increase performance in that domain. Interventions that teach language exclusively might be more beneficial when conducted in small groups or one-on-one than in larger groups. In addition, teaching both phonological awareness and print knowledge might benefit performance in print knowledge. Finally, some evidence indicates that instruction that teaches both phonological awareness and print knowledge might also lead to improvements in decoding and early writing performance.
|The reliability of shorter assessments in New Jersey for group-level inferences
Education policymakers must balance the reliability of assessments that measure academic knowledge and skills with the burdens that assessments place on students, teachers, and schools. In 2019, New Jersey began using the New Jersey Student Learning Assessments (NJSLA), shorter assessments based on the Partnership for Assessment of Readiness for College and Careers (PARCC). Regional Educational Laboratory researchers examined the reliability of test results for the NJSLA by comparing results at the school, test, and subgroup levels from 2016 to 2019. The findings indicated a high degree of reliability across most measures the researchers examined; during the transition to the NJSLA, reliability did not decrease for any test results reported by the New Jersey Department of Education except the Algebra 2 results. The instability of the Algebra 2 results was most likely attributable not to changes in the assessment but to changes in the student population required to take the test following a change in the state’s testing requirements.
|Using High School and College Data to Predict Teacher Candidates' Performance on the Praxis at Unibetsedåt Guåhan (University of Guam)
Policymakers and educators on Guåhan (Guam) are concerned about the persistent shortage of qualified K-12 teachers. Staff at the Unibetsedåt Guåhan (University of Guam, UOG) School of Education, the only local university that offers a teacher training and certification program, believe that more students are interested in becoming teachers but that the program's admissions requirements, in particular the Praxis® Core test (which consists of reading, writing, and math subtests), might be a barrier. Little is known about the predictors of passing the Praxis Core test, which makes it difficult to develop and implement targeted interventions that help students pass the test and prepare for the program.
This study examined which student demographic and academic preparation characteristics predict passing the Praxis Core test and each of its subtests. The study examined two groups of students who attempted at least one subtest within three years of enrolling at UOG: students who graduated from a Guåhan public high school (group 1) and all students, regardless of the high school from which they graduated (group 2). Just over half the students in each group passed the Praxis Core test (passed all three subtests) within three years of enrolling at UOG. The pass rate was lower on the math subtest than on the reading and writing subtests. For group 1, students who earned credit for at least one semester of Advanced Placement or honors math courses in high school had a higher pass rate on the Praxis Core test than students who did not earn any credit for those courses, students who earned a grade of 92 percent or higher in grade 10 English had a higher pass rate on the reading subtest than students who earned a lower grade, and students who earned a grade higher than 103 percent in grade 10 English had a higher pass rate on the writing subtest than students who earned a lower grade. For group 2, students who did not receive a Pell Grant (a proxy for socioeconomic status) had a higher Praxis Core test pass rate than students who did receive a Pell Grant, students who earned a grade of B or higher in first-year college English had a higher Praxis Core test pass rate than students who earned a lower grade, and male students had a higher pass rate on the reading and math subtests than female students.
The study findings have several implications for intervention plans at both the secondary and postsecondary levels. Although students must pass all three Praxis subtests to be admitted to the teacher preparation program at the School of Education, examining student performance on each subtest can help stakeholders understand the content areas in which students might need more support. In the long term, preparing more prospective teachers for the Praxis Core test might increase program enrollment, which in turn might expand the on-island hiring pool.
|Examination of the Validity and Reliability of the Kansas Clinical Assessment Tool
Although national assessments for evaluating teacher candidates are available, some state education agencies and education preparation programs have developed their own assessments. These locally developed assessments are based on observations of teaching and other artifacts such as lesson plans and student assignments. However, local assessment developers often lack information about the validity and reliability of data collected with their assessments. The Council for the Accreditation of Educator Preparation (CAEP) has provided guidance for demonstrating the validity and reliability of locally developed teacher candidate assessments, yet few educator preparation programs have the capacity to generate this evidence.
The Regional Educational Laboratory Central partnered with educator preparation programs in Kansas to examine the validity and reliability of the Kansas Clinical Assessment Tool (K-CAT), a newly developed tool for assessing the performance of teacher candidates. The study was designed to align with CAEP guidance. The study found that cooperating teachers reported that the K-CAT accurately represented existing teaching performance standards (face validity). Two skilled raters found that the content of the K-CAT was mostly aligned to existing teaching performance standards (content validity). In addition, K-CAT scores for the same teacher candidate, provided by cooperating teachers and supervising faculty, were positively related (convergent validity). K-CAT indicator scores showed internal consistency, or correlations among related indicators, for standards and for the tool overall (reliability). K-CAT scores showed small relationships with teacher candidate scores on other measures of teaching performance (criterion-related validity).
|A Guide to Identifying Similar Schools to Support School Improvement
To support school improvement efforts, school leaders and education agencies might need to identify groups of schools that are similar so that schools can compare their performance or share practices with other schools in the same group. This could also allow education agencies to provide tailored supports to schools in a group. This guide describes how an education agency can select a distance measure (a statistical rather than a geographic measure) to identify schools that are similar to a target school, using a variety of characteristics that enable school leaders to better understand their schools’ relative performance. This guide is based on work done with the Nebraska Department of Education and is designed to help staff in other education agencies who are interested in implementing a similar approach to support school improvement.
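The guide's core idea, ranking schools by a statistical distance computed over standardized school characteristics, can be sketched briefly. The following is a minimal illustration using standardized Euclidean distance; the school names, characteristics, and values are hypothetical, and the guide itself walks agencies through choosing among several distance measures:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical school characteristics: [enrollment, % free/reduced-price lunch, % proficient].
# In practice an agency would use whatever characteristics it deems relevant.
schools = {
    "Target High": [520, 48.0, 61.0],
    "North High":  [540, 45.0, 58.0],
    "South High":  [1200, 12.0, 88.0],
    "East High":   [500, 52.0, 63.0],
}

# Standardize each characteristic (z-scores) so no single scale dominates,
# then compute Euclidean distance from the target school to every other school.
cols = list(zip(*schools.values()))
zscores = {
    name: [(v - mean(c)) / stdev(c) for v, c in zip(vals, cols)]
    for name, vals in schools.items()
}

def distance(a: str, b: str) -> float:
    """Euclidean distance between two schools in standardized space."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(zscores[a], zscores[b])))

ranked = sorted((n for n in schools if n != "Target High"),
                key=lambda n: distance("Target High", n))
print(ranked)  # most similar school first; the dissimilar large school ranks last
```

In this sketch the large, low-poverty, high-proficiency school lands at the bottom of the ranking, while the schools that resemble the target on all three characteristics land at the top, which is the kind of comparison group the guide describes.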
|Using High School Data to Explore Early College Success on Pohnpei, Federated States of Micronesia
As of 2010, about 15 percent of residents older than age 25 on Pohnpei in the Federated States of Micronesia (FSM) had completed an associate degree or higher. To increase the number of college graduates, the Pohnpei Department of Education and the College of Micronesia-FSM are working together to improve the early college outcomes of their students. They noted that in 2018, 42 percent of applicants from Pohnpei to the College of Micronesia-FSM were not admitted or were admitted to a one-year nondegree certificate program. No studies have examined possible links between high school academic preparation in the FSM and early college success outcomes, such as the college entrance test result. Examining these links could inform strategies to improve degree attainment. Using data on Pohnpei public high school graduates from 2016 to 2018 provided by the Pohnpei Department of Education and the College of Micronesia-FSM, this study examined high school academic preparation characteristics and college student characteristics to determine whether they are associated with five early college success outcomes: College of Micronesia-FSM Entrance Test result; placement in credit-bearing math, reading, and writing courses; and persistence to a second year. The study found that high school grade point average was positively associated with all five outcomes. Students who were enrolled in the high school academic coursework track were more likely than students who were enrolled in the business and vocational tracks to be admitted to a degree program and to enroll in credit-bearing reading courses. College students who first enrolled at the College of Micronesia-FSM in the summer term immediately after high school graduation were more likely to persist to a second year than those who first enrolled in the fall term.
|Pathways to Teaching: Teacher Diversity, Testing, Certification, and Employment in Washington State
The number and percentage of students of color are growing in Washington state, yet the teacher workforce remains largely White (non-Hispanic). This means that few students of color have teachers who share their race or ethnicity, which could have consequences for student achievement and wellbeing. To better understand the state’s shortage of teachers of color, this study investigated six steps in the teacher preparation and career pathway at which teacher candidates and teachers are likely to drop out or leave the profession: three teacher preparation tests, certification, employment, and retention. Among all teacher candidates who took at least one of these steps during 2010-19, Hispanic candidates and non-Hispanic candidates of color were less likely than White candidates to complete each step, took longer to complete each step, and were less likely to become a certificated educator in a Washington K-12 public school. The descriptive findings suggest that education policymakers should consider revising policies and programs to increase the number of teachers of color. The state has already made some changes, such as revising testing requirements for teacher candidates.
|Indiana and Minnesota Students Who Focused on Career and Technical Education in High School: Who Are They, and What Are Their College and Employment Outcomes?
In Indiana and Minnesota the state education agency, state higher education agency, and the state workforce agency have collaborated to develop career and technical education courses intended to improve high school students' college and career readiness. These agencies partnered with the Regional Educational Laboratory Midwest to examine whether high school graduates in each state who completed a large number of career and technical education courses in a single career-oriented program of study (concentrators) had different college and workforce outcomes from graduates who completed fewer (samplers) or no career and technical education courses (nonparticipants). The study found that in the 2012/13–2017/18 graduation cohorts, male graduates were more likely to be concentrators than female graduates, and graduates who received special education services were more likely to be concentrators than those who did not receive services. Graduates who were not proficient in reading in grade 8 also were more likely to become concentrators than those who were proficient. Graduates who attended urban and suburban schools were more likely than students who attended town and rural schools to be nonparticipants. Concentrators were less likely than samplers and nonparticipants with similar characteristics to enroll in college, but the differences reflect mainly enrollment in four-year colleges. Concentrators were more likely to enroll in two-year colleges. Concentrators also were less likely than similar samplers and nonparticipants to complete a bachelor's degree within four to six years. Finally, compared with similar samplers and nonparticipants, concentrators were employed at higher rates in the first five years after high school and had higher earnings.
|Identifying Indicators that Predict Postsecondary Readiness and Success in Arkansas
Arkansas has identified college and career readiness indicators for schools that can be used to monitor students' performance and to improve their postsecondary readiness and success. Using two cohorts of grade 6 students, this study examined the extent to which Arkansas’s middle school and high school indicators of postsecondary readiness predict a postsecondary readiness outcome (an ACT score of 19 or higher) and success outcomes (enrolled in college for at least one term within eight years of beginning grade 6, and persisted in college by enrolling for more than one term within eight years of beginning grade 6). The study estimated the accuracy and strength of the middle school and high school indicators for predicting the outcomes. Although fewer than half of students met the Arkansas postsecondary readiness standard, more than half enrolled in college, and about half persisted for more than one term within eight years of beginning grade 6. Middle school and high school indicators, when combined with student background characteristics, predicted readiness and success outcomes with greater accuracy than did student background characteristics alone. Middle school indicators that were major predictors for at least two of the three outcomes examined included proficiency in English language arts and math, regular school attendance, no suspensions, and no expulsions. High school indicators that were major predictors for at least two of the outcomes included grade point average, enrollment in an advanced course, regular school attendance, and no expulsions.
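The basic accuracy check underlying analyses like this one can be illustrated with a small computation: compare an indicator's predictions against the observed outcome and report how often they agree. The following is a minimal sketch with entirely hypothetical student records; the study's actual models combined many indicators with background characteristics rather than evaluating one indicator in isolation:

```python
# Sketch of evaluating a single readiness indicator against an observed outcome.
# Each record is (met_indicator, enrolled_in_college); all values are hypothetical.
students = [
    (True, True), (True, True), (True, False), (False, False),
    (False, False), (False, True), (True, True), (False, False),
]

def evaluate(records):
    """Accuracy and sensitivity of using the indicator to predict the outcome."""
    tp = sum(1 for ind, out in records if ind and out)        # correctly flagged
    tn = sum(1 for ind, out in records if not ind and not out)  # correctly not flagged
    fn = sum(1 for ind, out in records if not ind and out)    # successes missed
    accuracy = (tp + tn) / len(records)  # share of students classified correctly
    sensitivity = tp / (tp + fn)         # share of actual successes the indicator caught
    return accuracy, sensitivity

acc, sens = evaluate(students)
print(f"accuracy={acc:.2f}, sensitivity={sens:.2f}")
# prints "accuracy=0.75, sensitivity=0.75"
```

Comparing these figures for an indicator-based model against a model using background characteristics alone is, in essence, the accuracy comparison the study reports.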