IES Blog

Institute of Education Sciences

Beyond Wikipedia: Reading and Researching Online

By Becky McGill-Wilkinson, NCER Program Officer

Gone are the days of library card catalogs and having to consult the 26-volume hardbound encyclopedia gathering dust on your parents’ bookshelf. Students today have seemingly infinite information at their fingertips. Most households in the U.S. have a computer, and most teachers report having at least one computer in their classrooms. Research shows that the majority of high school students use the Internet to complete school assignments and that 71 percent of students use laptop computers for school. In this changing world, it is increasingly important to understand how reading and researching on the Internet differ from performing those tasks with books and other paper texts.

Don Leu and his team at the University of Connecticut have been examining this topic for several years. First on their agenda was studying whether reading online is the same as reading on paper. They discovered that students who are poor readers on paper may be good readers online, and students who are good readers on paper are not necessarily good readers online, suggesting that reading online requires some unique skills. Leu and his collaborators argue that reading online requires that students be able to: (1) use search engines; (2) choose appropriate search results; (3) judge whether a source can be trusted to be accurate and unbiased; and (4) consolidate information across multiple websites or online texts.

Of course, it’s not enough to understand the process of reading and researching online. As with any skill, some students are better at it than others, and as computers, tablets, and smartphones become more common, it becomes increasingly necessary for students to hone their online reading and research skills if they are to succeed in college and career. Teachers need to be able to teach these skills and to identify when their students need extra help or practice. In 2005, Leu received a grant from NCER to study Internet use among adolescents at risk of dropping out of school, and developed an intervention to teach seventh-grade students specific strategies to locate, evaluate, synthesize, and communicate information on the Internet.

Building on this earlier work, in a 2009 grant from NCER, Leu and his team set out to develop measures of online reading comprehension. The end result of this project is a set of Online Research and Comprehension Assessments (ORCAs) for use with seventh-grade students. The team developed both a multiple-choice version and a version that allows students to work in a simulated Internet environment. In both versions, the student is tasked with answering a research question posed by a simulated peer, and must use a search engine, choose the appropriate search result, determine whether a source is trustworthy, and then report back to the simulated peer about what they found. The ORCAs were tested with 2,700 students in two different states, and the researchers surveyed teachers and other practitioners to determine whether the ORCAs were usable.

Leu has been especially interested in thinking about how changing ideas about literacy may affect low-income students differently from middle- and high-income students. In a recently published paper, Leu shows that students who came from families earning approximately $100,000 per year were more than a year ahead of students whose families earned approximately $60,000 per year in online reading ability as measured by the ORCAs. This study highlights the importance of considering achievement gaps between high- and low-income students across a variety of domains, including those not typically measured by standardized tests, such as online reading comprehension.

The ORCAs are available online for free, as is a professional development module to help teachers learn to use them.

Questions? Comments? Please email us at IESResearch@ed.gov.

The Month in Review: August 2015

By Liz Albro, NCER Associate Commissioner of Teaching and Learning

Good Luck to Applicants!

Application deadlines for our main NCER and NCSER competitions have come and gone this month. We accepted applications for five competitions on August 6 and three competitions on August 20. Now it’s time for us to begin screening applications and moving them into the peer review process!

NCER Staff Were Out and About

NCER staff had the opportunity to learn from experts in several meetings during the month of August.

Liz Albro attended the CRESST Conference 2015, where she participated in a session titled: Is There a Role for Evidence in the Future of K-16 Technology? The short answer was yes! She was joined at the meeting by Russ Shilling, the Executive Director of STEM Education at the Department, researchers with expertise in educational data mining, cognitive science, learning analytics, and assessment, and developers of education technology from around the world.

On August 20, NCER convened a technical working group (TWG) meeting on Researching the Influence of School Leaders on Student Outcomes. Nine researchers and practitioners who study education leadership met with ED staff to discuss the lessons learned from research that explicitly connects school leadership to student outcomes and the challenges to conducting such research. Department staff, including NCER’s Katina Stapleton, also presented information about education leadership studies funded by the National Center for Education Research, the National Center for Education Evaluation and Regional Assistance, and the Office of Innovation and Improvement. A meeting summary will be available soon on our TWG page.

In the final week of August, Meredith Larson, who oversees our research program on adult education, and Daphne Greenberg, the principal investigator of our National R&D Center, the Center for the Study of Adult Literacy, attended the 2015 National Meeting for Adult Education State Directors hosted by the Department’s Office of Career, Technical, and Adult Education.

Between Parents and Kids: IES-Funded Research in the News

Two publications from IES-funded research hit the national news this month … and both highlighted the critical role that parent-child interactions play in children’s learning outcomes. In one article, featured on WebMD, Paul Morgan and his colleagues reported that 2-year-old children with larger oral vocabularies demonstrated better academic achievement and behavior at kindergarten entry. The team also discussed child and family characteristics that are related to vocabulary size at age 2, which may help identify which groups of children are at risk for needing early language intervention.

In the other, discussed in the New York Times, Sian Beilock, Susan Levine, and their colleagues reported that parents’ math anxiety is related to their young children’s math achievement – and seems to emerge when math-anxious parents try to help their kids with their math homework.

We Said Farewell to Our Interns

As August ended, our summer interns went back to school. We were sad to see them go, but excited for them as the new school year begins. Think you might be interested in interning at IES? Read an interview with one of our interns, and learn how to apply to the internship program at the Department.

Questions? Comments? Please send them to IESResearch@ed.gov.

Experts Discuss the Use of Mixed Methods in Education Research

By Corinne Alfeld and Meredith Larson, NCER Program Officers

Since IES was founded more than a dozen years ago, it has built a reputation for funding rigorous research to measure the causal effects of education policies and programs.  While this commitment remains solid, we also recognize the value of well-designed qualitative research that deepens understanding of program implementation and other educational processes and that generates new questions or hypotheses for study. In this blog post, we highlight the outcomes from a recent meeting we hosted focused on the use of mixed methods – that is, studies that combine qualitative and quantitative methods – and share some of the ways in which our grantees and other researchers incorporate mixed methods into their research.

On May 29, 2015, 10 researchers with experience designing and conducting mixed methods research met with staff from the two IES research centers in a technical working group (TWG) meeting. The TWG members shared their experiences carrying out mixed methods projects and discussed what types of technical assistance and resources we could provide to support the integration of high-quality mixed methods into education research. There was consensus among the TWG members that qualitative data is valuable, enriches quantitative data, and provides insight that cannot be gained from quantitative research alone. Participants described how mixed methods are currently used in education research, proposed potential NCER and NCSER guidance and training activities to support the use of high-quality mixed methods, and offered suggestions for researchers and the field. Below are just a few examples that were shared during the meeting:

  • Dr. Carolyn Heinrich and colleagues used a longitudinal mixed methods study design to evaluate the efficacy of supplemental education services provided to low-income students under No Child Left Behind. One of the critical findings of the study was that there was substantial variation across school districts in what activities were included in an hour of supplemental instruction, including (in some cases) many non-instructional activities. This was revealed as the team examined interview data describing what activities lay behind the shared metric of an hour of instructional time. Having that level of information provided the team with critical insights as they examined the site-by-site variation in efficacy of supplemental education services. Dr. Heinrich emphasized the need for flexibility in research design because the factors affecting the impact of an intervention are not always apparent in the design phase. In addition, she reminded the group that while statistical models provide an average impact score, there is valuable information in the range of observed impacts, and that this variability is often best understood with information collected using in-depth field research approaches.
  • Dr. Mario Small used mixed methods research to examine social networks in childcare centers in New York City. Using observational methods, he discovered that variations in the level of networking among mothers depended on the individual child care center, not the neighborhood. He hypothesized that child care centers with the strictest rules around pick-up and drop-off, as well as more opportunities for parent involvement (such as field trips), would have the strongest social networks. In such settings, parents tend to be at the child care center at the same time and, thus, have more interaction with each other. Dr. Small tested this hypothesis using analysis of survey and social network data and found that parents who developed a social network through their child care center had higher well-being than those who did not. He concluded from this experience that without the initial observations, he would not have known that something small, like pick-up and drop-off policies, could have a big effect on behavior.
  • Dr. Jill Hamm described a difficult lesson learned about mixed methods “after the fact” in her study, which was funded through our National Research Center on Rural Education Support. In planning to launch an intervention to be delivered to sixth-grade teachers to help adolescents adjust to middle school, she and her colleagues worked with their school partners to plan for possible challenges in implementation. However, because some of the qualitative data collected in these conversations were not part of the original research study – and, thus, not approved by her Institutional Review Board – the important information they gathered could not be officially reported in publications of the study’s findings. Dr. Hamm encouraged researchers to plan to use qualitative methods to complement quantitative findings at the proposal stage to maximize the information that can be collected and integrated during the course of the project.
  • In a study conducted by Dr. Tom Weisner and his colleagues, researchers interviewed families of children with disabilities to determine the level of “hassle” they faced on a daily basis and their perceptions of the sustainability of their family’s routines. Findings from these interviews were just as good at predicting family well-being as parental reports of coping or stress on questionnaires. The findings from the analysis of both the qualitative and quantitative data collected for this study enhanced researchers’ understanding of the impact of a child’s disability on family life more than either method could have alone. Dr. Weisner observed that the ultimate rationale for mixed methods research should be to gather information that could not have been revealed without such an approach. Because “the world is not linear, additive, or decontextualized,” he suggested that the default option should be to use mixed methods where feasible, and that researchers should be required to provide a rationale when they do not.

Curious to learn more about what was discussed? Additional information is available in the meeting summary.

Comments? Questions? Please email us at IESResearch@ed.gov.

An Intern's Perspective on the National Center for Education Research

By Brittney Fraumeni, NCER Intern


Each year, the Institute of Education Sciences’ two research centers offer unpaid internships for undergraduate or graduate students interested in learning about the research grant-making process and contributing to the work of the centers. Internships are coordinated through the U.S. Department of Education’s student volunteer office and are available throughout the year. For application information, please see the ED Student Volunteer Unpaid Internship Program.

This summer, Brittney Fraumeni, a doctoral student in Psychological Science at DePaul University, interned with the National Center for Education Research (NCER). At the end of her internship, Brittney reflected on her summer with NCER.

What brought you to the internship?

As I headed into the final months of my third year of graduate school, I began to really question what I wanted to do with my degree when I was finished. My PhD program emphasizes training for an academic position, but I had doubts about whether that was the best fit for me. So when the opportunity to be a summer intern at NCER presented itself, I seized it, hoping for a learning experience that would help shape my view of my future career.

How did you hear about the internship?

I briefly worked as a freelance researcher on a U.S. Department of Education grant, which was the first time it occurred to me that the government had a research arm. I easily found the IES website, and after some exploring on the site, discovered that they had internships available. I applied ASAP.

Why did you want to do the internship?

I really wanted an opportunity to see what a non-academic position could be like. As I mentioned, most of my graduate training has revolved around preparing for a career in academia, so I had no idea what else was out there.

What were your days like at the internship?

The internship schedule was really flexible, and I was allowed to choose my own days and hours. I was also in charge of my own time management throughout the day. At the beginning of my six-week stay, I was given the main projects I would be working on, so every day I came into the office after that, I mostly knew what I would be doing. I had three big projects, so I usually circulated through tasks for those, and every once in a while a small project would head my way that I would add to my schedule.

What was beneficial about the internship?

The internship really helped confirm what I was already thinking at the beginning of the summer: I’d like to get a job in a non-academic field upon completing my doctorate. But, more than that, the internship gave me the chance to work with like-minded individuals who were open to letting me pick their brains and who shared contact information for people in the education research field. Overall, it was a great learning and networking opportunity.

What did you learn from the internship?

Beyond learning more about education research, I picked up new skills. Before the internship, I only knew social media from a personal standpoint. But as more organizations branch out to different social media outlets to promote their work, it’s important to know how to maintain a professional and effective social media presence. Working on the social media team at IES really boosted my social media skills.

What did you learn about IES/ED from doing the internship?

Before applying for the internship, I thought ED was focused only on policy; I wasn’t even aware that the department was involved in research! However, through actually working here, I learned that not only is there a research arm, there are many more offices than I could have imagined. IES itself is broken down into multiple centers, each with its own research focus. By working with the people here and having the opportunity to sit in on different meetings, I was able to learn what each office does and the special role each plays in promoting education research. Furthermore, I learned that not everybody took the same path to get here; IES is made up of employees with all different backgrounds, which makes for a fun and diverse environment to work in.

How did the internship reshape your thinking about education research?

I used to think education research was a relatively small area. Now, after having hands-on experience writing award summaries, I know that there are many people interested in education research and pursuing it. It never occurred to me how many different organizations (not just schools!) had an interest in developing interventions for education purposes. It is so inspiring to now know just how many people out there are trying to promote the best outcomes for students, from pre-K to college.


Questions? Comments? Please send them to IESResearch@ed.gov. 

The ‘Not So Simple’ View of Reading

By Karen Douglas, NCER Program Officer


Improving students’ capacity to understand what they read in all subject areas is a primary focus of educators and policymakers. Educators and researchers have worked on interventions to improve reading for decades, and a great deal of attention has been given to improving word-level skills (such as phonemic awareness and decoding). In part, this focus can be traced to the ‘Simple View of Reading,’ a theoretical framework developed by Gough and Tunmer almost 30 years ago.

The Simple View states that readers need to both understand language and decode the symbols on the page in order to comprehend written text. The influential role of decoding on reading outcomes has been well studied, and many interventions have been developed that show good results in improving these skills for many students. But improvement in decoding skills, while necessary, has not generally been sufficient to improve reading comprehension.

In recent years, researchers have begun exploring the other part of the equation -- language. Most often, researchers use vocabulary knowledge as a proxy for language skills, and a great deal of research is focused on improving vocabulary. Efforts to improve vocabulary generally show that students learn the new words they are taught, but generalized effects on vocabulary knowledge and reading comprehension have been elusive. It seems likely that in addition to understanding the meanings of individual words, students also need to know how words are constructed (morphology), how they are used in text (syntax and grammar), and how to make inferences from text in order to make sense of the wide variety of materials they must read.

The Reading for Understanding Research Initiative (RfU), funded in 2010 by IES, is addressing a broader conception of language in trying to improve reading comprehension. RfU provided funding for six research teams to study the basic processes that undergird reading comprehension, develop and test new curricula and instructional programs to improve it, and develop new assessments to provide a better measure of students’ capacity to read in authentic scenarios. Collectively, RfU researchers are studying the development of reading for understanding from prekindergarten through high school with the goal of creating new knowledge about what matters at each developmental stage in order for students to finish high school with sufficient reading skills for college and career. Each of these six teams has incorporated attention to aspects of language beyond vocabulary knowledge and several teams have published results that provide evidence of the potential of improved language skills for building reading comprehension. Abstracts for studies and publications to date can be found on the IES website.

In a recent article in Educational Psychology Review, my co-author Elizabeth Albro and I describe the purpose of the RfU Research Initiative, the goals of the six teams funded under the initiative, and progress made through 2014. As the work of the RfU Research Initiative comes to completion, the RfU researchers are positioned to make important contributions to what we know about the development of reading for understanding and how we can best improve it for all students. Expanded knowledge about the language skills that support reading for understanding and how to improve them will be a key component of this contribution. Stay tuned to Inside IES Research to learn more about what the teams are finding.