Measures of Aptitude
December 2020

Question

What does the research say about how reliable "measures of aptitude" are in predicting career interests and attainment in high school and postsecondary settings, especially related to science, technology, engineering, and math (STEM) fields?

Ask A REL Response

Thank you for your request to our Regional Educational Laboratory (REL) Reference Desk. Ask A REL is a collaborative reference desk service provided by the 10 RELs that, by design, functions in much the same way as a technical reference library. Ask A REL provides references, referrals, and brief responses in the form of citations in response to questions about available education research.

Following an established REL Northwest research protocol, we conducted a search for evidence-based research. The sources included ERIC and other federally funded databases and organizations, research institutions, academic research databases, Google Scholar, and general Internet search engines. For more details, please see the methods section at the end of this document.

The research team has not evaluated the quality of the references and resources provided in this response; we offer them only for your reference. The search included the most commonly used research databases and search engines to produce the references presented here. References are listed in alphabetical order, not necessarily in order of relevance. The research references are not necessarily comprehensive and other relevant research references may exist. In addition to evidence-based, peer-reviewed research references, we have also included other resources that you may find useful. We provide only publicly available resources, unless there is a lack of such resources or an article is considered seminal in the topic area.

References

Baker, H. E., Styer, J. S., Harmon, L., & Pommerich, M. (2010). Development and validation of the FYI: A preliminary report. Paper presented at the meeting of the American Educational Research Association, Denver, CO. https://eric.ed.gov

From the Abstract:
"Developed for the Armed Services Vocational Aptitude Battery (ASVAB) Career Exploration Program, the Find Your Interests (FYI) inventory was designed to help students learn about their career-related interests. The FYI is a 90-item interest inventory based on Holland's (1973, 1985, 1997) widely accepted theory and taxonomy of career choice. The inventory determines their resemblance to each of the six interest types. Nearly one-quarter of all high school students in the nation participate in the ASVAB Program (Baker, 2000), underscoring the need for study and documentation of the creation and validation of this instrument. Based on a large national sample of high school students, analyses were conducted to assess FYI content, criterion, and construct-related evidence of validity. Results showed that the FYI: (a) is composed of six factors with each factor representing one RIASEC domain; (b) has a hexagonal shape; and (c) has substantial relationships with the Strong Interest Inventory. Throughout the analyses, consistent content, criterion, and construct-related evidence for the validity of the FYI are presented."

Camara, W. (2013). Defining and measuring college and career readiness: A validation framework. Educational Measurement: Issues and Practice, 32(4), 16–27. Retrieved from https://www.researchgate.net

From the Abstract:
"This article reviews the intended uses of these college- and career-readiness assessments with the goal of articulating an appropriate validity argument to support such uses. These assessments differ fundamentally from today's state assessments employed for state accountability. Current assessments are used to determine if students have mastered the knowledge and skills articulated in state standards; content standards, performance levels, and student impact often differ across states. College- and career-readiness assessments will be used to determine if students are prepared to succeed in postsecondary education. Do students have a high probability of academic success in college or career-training programs? As with admissions, placement, and selection tests, the primary interpretations that will be made from test scores concern future performance. Statistical evidence between test scores and performance in postsecondary education will become an important form of evidence. A validation argument should first define the construct (college and career readiness) and then define appropriate criterion measures. This article reviews alternative definitions and measures of college and career readiness and contrasts traditional standard-setting methods with empirically based approaches to support a validation argument."

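The validation argument Camara outlines rests on statistical evidence linking readiness scores to later postsecondary performance. As a minimal illustration of that kind of criterion evidence, the sketch below correlates hypothetical readiness-test scores with first-year grades; the data, variable names, and scale are invented for illustration and are not drawn from the article.

    # Illustrative sketch (Python): criterion-related validity as a correlation
    # between readiness-test scores and later first-year grades. Data are invented.
    import numpy as np
    from scipy import stats

    scores = np.array([480, 520, 560, 600, 640, 680, 720])  # hypothetical readiness scores
    gpa = np.array([2.1, 2.4, 2.6, 2.9, 3.0, 3.4, 3.6])     # hypothetical first-year GPA

    r, p = stats.pearsonr(scores, gpa)
    print(f"validity coefficient r = {r:.2f} (p = {p:.3f})")
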
Coyle, T. R. (2019). Tech tilt predicts jobs, college majors, and specific abilities: Support for investment theories. Intelligence, 75, 33–40. Retrieved from https://www.gwern.net

From the Abstract:
"Specific cognitive abilities include ability tilt, based on within-subject differences in math and verbal scores on standardized tests (e.g., SAT, ACT). Ability tilt yields math tilt (math > verbal), which predicts STEM (science, technology, engineering, math) criteria, and verbal tilt (verbal > math), which predicts humanities criteria. The current study examined a new type of tilt: tech tilt, based on within-subject differences in technical scores and academic scores (math or verbal) on the Armed Services Vocational Aptitude Battery. (Technical scores tapped vocational skills for electronics, mechanics, cars, and tools.) The difference yielded two types of tilt: tech tilt (tech > academic) and academic tilt (academic > tech). Tech tilt was correlated with math and verbal scores on college aptitude tests (SAT, ACT, PSAT), ability tilt on the college tests, and STEM and humanities criteria (college majors and jobs). Tech tilt correlated negatively with academic abilities (math or verbal) on the college tests and predicted STEM criteria. In addition, academic tilt (math or verbal) predicted the analogous type of tilt on the college tests. The effects replicated using different analytical approaches (e.g., regressions and structural equation modeling) and after controlling for g. The negative effects of tech tilt with academic abilities support investment theories, which predict that investments in one domain (non-academic and technical) come at the expense of investments in competing domains (academic). In addition, the effects demonstrate the validity of vocational aptitudes, extending prior research on ability tilt, which focuses on academic aptitudes. Future research should consider factors that moderate the effects of tech tilt (e.g., life history and ability level) as well as other types of tilt (e.g., spatial tilt)."

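Coyle's "tilt" measures are within-subject differences between standardized test scores. The sketch below is a minimal illustration of that computation; the values and variable names are assumptions for illustration only, not the study's data or procedure.

    # Illustrative sketch (Python): ability tilt as a within-subject difference
    # of standardized scores. All values are invented.
    import numpy as np

    math_scores = np.array([650.0, 540.0, 700.0, 480.0, 600.0])    # e.g., SAT math
    verbal_scores = np.array([600.0, 620.0, 640.0, 520.0, 700.0])  # e.g., SAT verbal

    def standardize(x):
        # Convert raw scores to z-scores within the sample.
        return (x - x.mean()) / x.std(ddof=1)

    tilt = standardize(math_scores) - standardize(verbal_scores)
    print(np.round(tilt, 2))  # positive values indicate math tilt (math > verbal)
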
de Beer, M. (2011). The role of the Learning Potential Computerised Adaptive Test (LPCAT) in the vocational guidance assessment of adolescents. Educational and Child Psychology, 28(2), 114–129. Retrieved from https://citeseerx.ist.psu.edu

From the Abstract:
"In the present study, the role of learning potential assessment as part of cognitive assessment for vocational guidance was investigated for a population (N = 262) of junior secondary students. Mean scores, distribution of scores, inter-correlation of scores and predictive validity were evaluated. The mean learning potential scores indicated a level of general reasoning and learning potential higher than the academic level of the students at the time of assessment and the distribution of the scores indicated tertiary level potential for some learners. Statistically significant correlations were found between the LPCAT learning potential scores and three sub-tests of the Differential Aptitude Test (Form R) (DAT-R) namely Verbal Reasoning, Comparison and Spatial Perception. Furthermore, all cognitive scores showed statistically significant correlations with the aggregate end-of-year academic performance in English, Life Orientation and Mathematics. Based on the results of this study, verbal reasoning is a better predictor of aggregate academic performance than learning potential based on non-verbal figural reasoning. A total of 35.3 per cent of the variance in academic performance was predicted by combining learning potential and aptitude scores. The unique explanation of variance in academic performance by means of the LPCAT post-test results alone was 12.9 per cent, while for Verbal Reasoning aptitude alone it was 29.2 per cent."

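The de Beer study reports the share of variance in academic performance explained by single predictors and by predictors in combination. As a minimal illustration of that comparison, the sketch below contrasts R-squared for one simulated predictor with R-squared for two predictors combined; the variables and effect sizes are invented and do not reproduce the study's analysis.

    # Illustrative sketch (Python): variance explained (R^2) by one predictor
    # alone versus two predictors combined, on simulated data.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    n = 200
    verbal = rng.normal(size=n)      # simulated verbal reasoning score
    potential = rng.normal(size=n)   # simulated learning potential score
    achievement = 0.5 * verbal + 0.2 * potential + rng.normal(scale=0.8, size=n)

    X_single = verbal.reshape(-1, 1)
    X_combined = np.column_stack([verbal, potential])
    r2_single = LinearRegression().fit(X_single, achievement).score(X_single, achievement)
    r2_combined = LinearRegression().fit(X_combined, achievement).score(X_combined, achievement)
    print(f"R^2 single predictor: {r2_single:.2f}; R^2 combined: {r2_combined:.2f}")
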
Francis-Smythe, J., Haase, S., Thomas, E., & Steele, C. (2013). Development and validation of the career competencies indicator (CCI). Journal of Career Assessment, 21(2), 227–248. Retrieved from https://citeseerx.ist.psu.edu

From the Abstract:
"This article describes the development and validation of the Career Competencies Indicator (CCI); a 43-item measure to assess career competencies (CCs). Following an extensive literature review, a comprehensive item generation process involving consultation with subject matter experts, a pilot study and a factor analytic study on a large sample yielded a seven-factor structure; goal setting and career planning, self-knowledge, job performance, career-related skills, knowledge of (office) politics, career guidance and networking, and feedback seeking and self-presentation. Coefficient alpha reliabilities of the seven dimensions ranged from 0.93 to 0.81. Convergent validity was established by showing that all 7-CCs loaded substantially onto a single second-order factor representing the general CC construct. Discriminant validity was established by showing less than chance similarity between the 7-CCI subscales and the Big Five personality scales. The results also suggested criterion-related validity of the CCI, since CCs were found to jointly predict objective and subjective career success."
Kier, M. W., Blanchard, M. R., Osborne, J. W., & Albert, J. L. (2014). The development of the STEM career interest survey (STEM-CIS). Research in Science Education, 44(3), 461–481. Retrieved from https://www.researchgate.net

From the Abstract:
"Internationally, efforts to increase student interest in science, technology, engineering, and mathematics (STEM) careers have been on the rise. It is often the goal of such efforts that increased interest in STEM careers should stimulate economic growth and enhance innovation. Scientific and educational organizations recommend that efforts to interest students in STEM majors and careers begin at the middle school level, a time when students are developing their own interests and recognizing their academic strengths. These factors have led scholars to call for instruments that effectively measure interest in STEM classes and careers, particularly for middle school students. In response, we leveraged the social cognitive career theory to develop a survey with subscales in science, technology, engineering, and mathematics. In this manuscript, we detail the six stages of development of the STEM Career Interest Survey. To investigate the instrument's "reliability and psychometric properties," we administered this 44-item survey to over 1,000 middle school students (grades 6-8) who primarily were in rural, high-poverty districts in the southeastern USA. Confirmatory factor analyses indicate that the STEM-CIS is a strong, single factor instrument and also has four strong, discipline-specific subscales, which allow for the science, technology, engineering, and mathematics subscales to be administered separately or in combination. This instrument should prove helpful in research, evaluation, and professional development to measure STEM career interest in secondary level students."

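Several of the instruments above report internal-consistency reliability as coefficient alpha (for example, the CCI subscale alphas of .81 to .93). As a minimal illustration, the sketch below computes Cronbach's alpha for a simulated respondent-by-item matrix; the items and responses are invented and are not data from any study cited here.

    # Illustrative sketch (Python): coefficient (Cronbach's) alpha for a set of
    # scale items. Respondents and items are simulated.
    import numpy as np

    def cronbach_alpha(items):
        # items: respondents-by-items matrix of item scores.
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(1)
    trait = rng.normal(size=(300, 1))                     # common latent trait
    items = trait + rng.normal(scale=0.7, size=(300, 6))  # six correlated items
    print(f"alpha = {cronbach_alpha(items):.2f}")
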
Mau, W. C., Chen, S. J., & Lin, C. C. (2019). Assessing high school student’s STEM career interests using a social cognitive framework. Education Sciences, 9(151), 1–11. https://eric.ed.gov

From the Abstract:
"This study investigated the psychometric properties of the Chinese version of the STEM Career Interest Survey (STEM-CCIS) with data from 590 high-school students in Taiwan. Measurement models based on Social-Cognitive Career Theory (SCCT) and STEM discipline-specific dimensions (Science, Technology, Engineering, Mathematics) were examined using confirmatory factor analyses. Findings from confirmatory factor analyses indicated that STEM-CCIS possesses adequate reliability and factorial validity, replicating the sound psychometric properties of the original English version of the STEM-CIS. Implications for the use of the STEM-CCIS are discussed."

Metz, A. J., & Jones, J. E. (2013). Ability and aptitude assessment in career counseling. In S. D. Brown & R. W. Lent (Eds.), Career development and counseling: Putting theory and research to work (pp. 449–476). Wiley. Retrieved from https://www.researchgate.net

From the Abstract:
"This chapter highlights the use of abilities, aptitudes, and skills in expanding career options, narrowing options, making a career decision, and managing one’s career. First, we examine some of the historical milestones associated with ability assessment. Then we define important constructs and review the structure and stability of these constructs, with special attention to gender and cultural differences. Assessment strategies, methods, and tools are also explored, with the primary goal of promoting effective, scientifically informed career practices in high school, college, and the workforce."

Nye, C. D., Su, R., Rounds, J., & Drasgow, F. (2012). Vocational interests and performance: A quantitative summary of over 60 years of research. Perspectives on Psychological Science, 7(4), 384–403. Retrieved from https://citeseerx.ist.psu.edu

From the Abstract:
"Despite early claims that vocational interests could be used to distinguish successful workers and superior students from their peers, interest measures are generally ignored in the employee selection literature. Nevertheless, theoretical descriptions of vocational interests from vocational and educational psychology have proposed that interest constructs should be related to performance and persistence in work and academic settings. Moreover, on the basis of Holland’s (1959, 1997) theoretical predictions, congruence indices, which quantify the degree of similarity or person–environment fit between individuals and their occupations, should be more strongly related to performance than interest scores alone. Using a comprehensive review of the interest literature that spans more than 60 years of research, a meta-analysis was conducted to examine the veracity of these claims. A literature search identified 60 studies and approximately 568 correlations that addressed the relationship between interests and performance. Results showed that interests are indeed related to performance and persistence in work and academic contexts. In addition, the correlations between congruence indices and performance were stronger than for interest scores alone. Thus, consistent with interest theory, the fit between individuals and their environment was more predictive of performance than interest alone."

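Nye and colleagues pool correlations from 60 studies. As a minimal illustration of the basic pooling step in such a meta-analysis, the sketch below averages study-level correlations using the Fisher r-to-z transform with sample-size weights; the correlations and sample sizes are invented, not those of the studies reviewed.

    # Illustrative sketch (Python): pooling study-level correlations with the
    # Fisher r-to-z transform. Correlations and sample sizes are invented.
    import numpy as np

    r = np.array([0.20, 0.35, 0.28, 0.15])  # interest-performance correlations
    n = np.array([120, 300, 85, 210])       # study sample sizes

    z = np.arctanh(r)                       # Fisher r-to-z transform
    weights = n - 3                         # approximate inverse-variance weights
    pooled_r = np.tanh(np.average(z, weights=weights))
    print(f"weighted mean correlation = {pooled_r:.2f}")
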
Methods

Keywords and Search Strings: The following keywords, subject headings, and search strings were used to search reference databases and other sources: ("Measures of aptitude" OR "aptitude measures" OR "aptitude measurements" OR "measurements of aptitude"), Students, Career OR careers, STEM, Reliability, "Aptitude assessments", "Aptitude tests", Technology-based, "Computer-based"

Databases and Resources: We searched ERIC for relevant resources. ERIC is a free online library of more than 1.6 million citations of education research sponsored by the Institute of Education Sciences (IES). Additionally, we searched Google Scholar and EBSCO databases (Academic Search Premier, Education Research Complete, and Professional Development Collection).

Reference Search and Selection Criteria

When searching and reviewing resources, we considered the following criteria:

Date of publication: This search and review included references and resources published in the last 10 years.

Search priorities of reference sources: Search priority was given to study reports, briefs, and other documents that are published and/or reviewed by IES and other federal or federally funded organizations, as well as academic databases, including ERIC, EBSCO databases, and Google Scholar.

Methodology: The following methodological priorities/considerations were given in the review and selection of the references:

  • Study types: randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, and policy briefs, generally in this order
  • Target population and samples: representativeness of the target population, sample size, and whether participants volunteered or were randomly selected
  • Study duration
  • Limitations and generalizability of the findings and conclusions

This memorandum is one in a series of quick-turnaround responses to specific questions posed by stakeholders in Alaska, Idaho, Montana, Oregon, and Washington, the region served by the Regional Educational Laboratory (REL) Northwest. It was prepared under Contract ED-IES-17-C-0009 by REL Northwest, administered by Education Northwest. The content does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.