
ACT Newsroom & Blog


Danger Time for College Students: Working at a Job More than 15 Hours a Week May Do More Harm Than Good, Especially for Underserved Students

ACT Center for Equity in Learning’s Report Also Highlights Need for Lifelong Learning to Advance Equity

WASHINGTON, D.C.—Aug. 28, 2017—Working more than 15 hours a week while in college may do more harm than good for college students from underserved backgrounds, according to a new report from the ACT Center for Equity in Learning. The finding that working more hours contributes to “disparities in students’ academic and career success” comes from the report “Who Does Work Work For? Understanding Equity in Working Learner College and Career Success.”

Over time, students from all backgrounds who work more than 15 hours weekly tend to fall behind in their academic progress, as well as in their earnings, debt, and early career outcomes. The stakes for underserved groups (members of racial or ethnic minorities, first-generation college students, or students from low-income families) are especially high.

The analysis by Sarah Blanchard Kyte, Ph.D., draws on National Center for Education Statistics data that followed a nationally representative cohort of first-time freshmen over six years to understand when and why working during college contributes to disparities in students’ academic and career success. The data show that most students (59 percent) work during college.

The report suggests working a more moderate number of hours may be a key strategy for students from low-income families trying to get through and get ahead in college. It also finds that employers and others can assist by trying to adapt to accommodate the real-world demands on college-aged working learners.

The findings come on the heels of the Trump administration’s proposed budget that calls for a $500 million cut to the federal work-study program. The administration also proposes new criteria for who qualifies for the work-study program. If enacted, these changes would lead to only 333,000 students receiving work-study aid in 2018, compared to the 634,000 students receiving it in 2017, according to some estimates.

The report states: “Among students working 15 or fewer hours, 76 percent work off campus and 61 percent receive no work-study. This suggests that students who work fewer hours benefit not only from fewer hours of work, but also from work arrangements that accommodate the rhythms of the semester and, disproportionately, from work experiences set within their college or university.”

Jim Larimore, chief officer, ACT Center for Equity in Learning, says, “We know that work-study and other considerations provide a smart way for learners to earn income in a way that supports them in their goal, which is an associate or undergraduate degree. Throughout the economy, we believe we need to build in more ways to help workers at every point in their careers become working learners. These ideas are central to our effort to support closing gaps in equity and achievement.”

Education as a Lever for Equity: Troubling Signs

A companion report from the ACT Center for Equity in Learning also reveals anxiety among the workforce as students go back to school and Americans pause to think about the meaning of Labor Day.

The report, “Equity in the Opportunities, Support, and Returns to Working and Learning Among U.S. Adults,” concludes that those who have successfully navigated the U.S. education system—people with a bachelor’s degree—have less confidence in it and more reservations about their ability to get ahead through hard work. On the other hand, adults who are disadvantaged in the labor market by the lack of a college degree remain more optimistic about education and equity.

Survey analysis in the report finds that a college degree is necessary but not sufficient for staying competitive in a changing economy. The American worker now must evolve into a working learner; this is a “valuable way to cultivate internal talent pipelines, employee satisfaction, and to level the playing field between workers of different educational backgrounds,” the report states.

For the vast majority of employees, becoming a working learner carries with it a substantial increase in earnings, even when a credential is not completed and independent of the working learner’s age. Greater awareness of these returns for nontraditional students may offset anxieties about the costs of further education.

Test Scores Remain Vital in College Admissions

We are in the midst of the annual college admissions cycle when many high school seniors are making decisions about where they will attend college in the fall. During this time of year I often see news stories that dismiss the role of admissions tests. Unfortunately, many of those stories are misinformed about the utility and value such assessments provide to students, schools, and colleges.

The biggest misperception I see is the argument that high school grades are the best indicator of college success and, therefore, we don’t really need standardized admissions tests. This notion is misinformed. That’s a polite way of saying it is nonsense.

Let’s start with the fallacy in the argument. High school grades are not, in fact, the best indicator of college success. Neither are test scores alone. In fact, the best predictor of success in college coursework is the combination of the two—grades and test scores together. Hundreds of independent studies have shown this to be true.

The figure below illustrates the additional value test scores contribute beyond grades. Two students with the same high school GPA of 3.0 may have widely different probabilities of attaining a similar college freshman GPA based on their ACT scores. If one student had an ACT Composite score of 20, they had a 28% probability of earning a 3.0 or higher freshman GPA; if another student had an ACT Composite score of 30, they had a 54% probability.
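As a purely illustrative sketch (not ACT's actual prediction model), the kind of relationship described above can be expressed as a simple logistic model in which the probability of earning a 3.0-or-higher freshman GPA rises with ACT Composite score. The coefficients below are hypothetical values backed out from the two quoted probabilities, chosen only to show how such a model interpolates between them:

```python
import math

def logit(p):
    """Log-odds of probability p."""
    return math.log(p / (1 - p))

def prob_gpa_3_plus(act_score, intercept, slope):
    """Probability of a 3.0-or-higher freshman GPA under a simple
    (hypothetical) logistic model of the relationship described above."""
    z = intercept + slope * act_score
    return 1 / (1 + math.exp(-z))

# Back out hypothetical coefficients from the two quoted data points:
# P = 0.28 at an ACT Composite of 20, and P = 0.54 at 30.
slope = (logit(0.54) - logit(0.28)) / (30 - 20)
intercept = logit(0.28) - slope * 20

print(round(prob_gpa_3_plus(20, intercept, slope), 2))  # 0.28
print(round(prob_gpa_3_plus(30, intercept, slope), 2))  # 0.54
```

Under this assumed model, a student with an ACT Composite of 25 would fall roughly midway between the two quoted probabilities; the point is only that test scores carry predictive information beyond an identical 3.0 GPA.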

High school grades have their limitations. Not only do they reflect the idiosyncrasies of individual teachers’ grading standards and differences in course rigor, but they are also subject to grade inflation. More than 55 percent of college-bound students report high school grades above 3.25, and 25 percent of U.S. high schools report an average GPA of 3.75 or higher for their graduating class.

We often accept the hypothesis that grades are fair and unbiased indicators of future success without much scrutiny, but grade inflation has steadily increased in the past few decades, and it has increased more rapidly for white and Asian-American students coming from more highly educated families.

Educators acknowledge that there are differences in the quality of schools and the rigor of curriculum. Test scores are one measure that helps colleges navigate and mitigate those differences, allowing them to compare the preparation of students coming from different backgrounds and different experiences. Without test scores, colleges must rely on their own subjective impressions of different groups of students and the quality of different high schools.  We know that subjective impressions and decisions have biases which are often implicit and never as accurate as empirical data.

Admissions tests provide a common metric that allows colleges to evaluate students who attend different high schools, live in different states, complete different courses, and receive different grades from different teachers. High school GPA simply cannot do that.


Good decision-making in the admissions process requires consideration of multiple sources of data; important decisions that impact students’ lives should never be based solely on one metric.  Research has shown that around 70 percent of college-bound students actually perform similarly across both high school grades and admission test scores (i.e., high, average or low on both measures).  In such situations, tests and grades provide confirmatory evidence that can increase our confidence in our decisions.

In the other 30 percent of cases, a student’s high school grades may be significantly higher or lower than his or her ACT scores. When this occurs, admissions professionals may justifiably scrutinize both the test scores and the grades more closely. Perhaps the test scores become less persuasive and relevant, or perhaps the grades and other factors receive additional weight. This is not a rationale for dismissing objective test scores but rather a justification for using multiple measures and professional judgment in evaluating college applicants and their potential fit and likely success at each institution. When multiple sources of information are available, basing decisions on less information is never the best solution.

Colleges, by and large, understand this. Most—despite what you may have heard—continue to require that applicants submit test scores. Colleges rank admissions test scores second in importance after high school grades earned in college-prep courses as an admission criterion. And they actually rank test scores above high school grades earned in all courses.

I often read articles that describe what admissions tests don’t do, but they ignore or lose sight of what admissions tests can do. ACT score results, for example, help with college and career planning. ACT score reports provide feedback on the types of careers and majors that best match the student’s interests and skills. They also provide an early indicator of the types of colleges at which students may be most competitive and allow parents, teachers, and counselors to assist students in planning for admissions. Not every student can or should go to an Ivy League college, and admissions tests help identify the schools and colleges that may best fit a student’s preparation and aspirations.

There is no single measure that can definitively predict future behavior by itself, and all measures have limitations.  The best decisions are made when multiple sources of data are considered. There is no reason to ignore test scores, just as there is no reason to ignore previous accomplishments, high school grades, or personal factors that have influenced a student’s development and aspirations.

Our ultimate goal should be to help students land where they have the best possibility of success, and there is no question that admissions test scores help accomplish this goal.

Affirming and Advancing SEL: The Evidence is Here

“Sure, Social and Emotional Learning (SEL) would be a great addition, but there’s no way to add it to the plate; our teachers already have their hands full with preparing students academically, and there’s no way our board will let us direct resources to anything that doesn’t raise test scores.”
“My teachers would love to get more support for Social and Emotional Learning, and would be happy to commit more time to it, but our district office demands that our initiatives be evidence-based, and how do we really know if teaching SEL actually works?”


How often have you heard variants of one or both of these comments in your school or district?


For educators working with students daily, juggling myriad demands to enhance student achievement while navigating the complex regulations that burden them, these are entirely understandable reactions and concerns. It’s not that teachers, counselors, principals, and superintendents don’t care about life skills for their students; of course they do. They know better than anyone what a difference it makes when students can manage their anger, persevere through difficulties, exercise self-discipline in their studies, and get along with others.


What educators need is twofold.

  1. They need authoritative national or global evidence that SEL programs work both for their own sake (improving SEL) and for improving academic achievement.
  2. They need tools to evaluate whether their own programming is working, so they can demonstrate return on investment and generate information for continuous improvement.
Educators can rest assured: both of these needs can be met. First, a recent study has resoundingly reaffirmed the evidence for SEL programming in general; second, tools for evaluating SEL student growth and assessing SEL’s contribution to student achievement are newly arriving in the marketplace.


First things first: SEL programming really works.


The new study, published July 2017 in the esteemed peer-reviewed journal Child Development, is entitled “Promoting Positive Youth Development Through School-Based Social and Emotional Learning Interventions: A Meta-Analysis of Follow-Up Effects,” by Rebecca D. Taylor (Collaborative for Academic, Social and Emotional Learning), Eva Oberle (University of British Columbia), Joseph A. Durlak (Loyola University), and Roger Weissberg (CASEL, University of Illinois at Chicago).

It is described by CASEL (the Collaborative for Academic, Social, and Emotional Learning) as a follow-up to a 2011 meta-analysis that deserves to be widely recognized throughout K-12 education. That study, sometimes referred to as Durlak 2011, reviewed 213 SEL programs involving 270,000 children and concluded that “compared to controls, SEL participants demonstrated significantly improved social and emotional skills, attitudes, behavior, and academic performance that reflected an 11-percentile-point gain in achievement.”

Learn more about the new study here.

The new study examines 82 different interventions involving more than 97,000 students from kindergarten through high school, with effects assessed from six months to 18 years after the programs ended. It adds further fuel to the argument for teaching the whole child and supporting our students in all their growth needs. The study concludes that these programs have short- and long-term positive consequences for students; in one particularly dramatic finding, “in follow-up assessments an average of 3.5 years after the last intervention, the academic performance of students exposed to SEL programs was an average 13 percentile points higher than their non-SEL peers, based on the eight studies that measured academics.”
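A percentile-point gain of this kind maps directly onto a standardized effect size, assuming normally distributed outcomes. The sketch below shows that conversion; the 0.33 and 0.27 effect-size inputs are assumed values consistent with the 13- and 11-percentile-point figures quoted from the two meta-analyses, not numbers taken from this article:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def percentile_gain(effect_size):
    """Percentile-point advantage of the average treated student over
    the control-group median, assuming normally distributed outcomes."""
    return 100 * (normal_cdf(effect_size) - 0.5)

# An effect size of ~0.33 SD corresponds to roughly 13 percentile points;
# ~0.27 SD corresponds to roughly 11 (the Durlak 2011 figure).
print(round(percentile_gain(0.33)))  # 13
print(round(percentile_gain(0.27)))  # 11
```

The conversion is useful when comparing these results with meta-analytic literature that reports effect sizes rather than percentile gains.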

The report also summarizes findings from some of the individual studies contained within the meta-analysis. Among them are that “SEL participants later demonstrated a 6% increase in high school graduation rates, and an 11% increase in college graduation rates. In other cases, SEL participants were less likely to have a clinical mental health disorder, ever be arrested or become involved with the juvenile justice system, and had lower rates of sexually transmitted infections, and pregnancies.”

Educators can take this to the bank—in fact, take it to your local banker, chamber of commerce, city councilor, and certainly your school board as part of an argument that these programs deserve their moral support and their financial support.


The Next Order of Business: Gather Your Evidence


Having established the compelling benefits of SEL generally, the next order of business is to gather evidence of what is working in your school and district. Are programs, interventions, and initiatives having an impact on student SEL skill development? Which students are benefitting the most, and where is additional support most needed? At what grade levels should skill development be targeted? And when skills do rise, what correlated effects also are identified?

This is where a high-quality, reliable, and valid SEL assessment implemented at the school level can help. ACT Tessera® is exactly that: an innovative, evidence-based system through which you can easily administer assessments, collect data on what’s working and what’s not, and use these data to evaluate the impact of improved social and emotional learning. The system also comes with leadership coaching for school administrators looking to improve SEL in their building or district, and with a comprehensive teacher playbook for raising the quality of teaching and learning of these vital skills.

There’s every reason to believe that supporting student growth in these critical areas will result in higher academic achievement and in better high school graduation rates, but wouldn’t it be excellent to be able to prove this fact so as to ensure the continuation of your SEL programs when axe-wielding cost-cutters come to town?

Learn more here: www.act.org/act-tessera

ACT Makes Strategic Investment in Leading Education Venture Fund—New Markets Venture Partners—to Grow Transformative Education Companies

New Markets and ACT will back entrepreneurs creating innovative pathways improving the education-to-employment pipeline

IOWA CITY, Iowa—ACT, a mission-driven nonprofit organization, today announced a $10.5 million strategic investment in New Markets Venture Partners, a pioneering education-focused fund with a successful 14-year track record of identifying and scaling businesses committed to improving educational outcomes for millions of students across the K-12 and postsecondary spectrum.

“Investing in innovation is central to our mission of helping people achieve education and workplace success,” said Marten Roorda, ACT chief executive officer. “We are inspired by both the impact of technology and a growing community of entrepreneurs working to address some of our most vexing challenges across the student life cycle. Our investment with New Markets is designed to support businesses that are making an impact by allowing them to invest in critical efficacy research and to attract strong teams needed to scale.”

With ACT’s support, the 10-year fund will invest in education companies with proven models of success across the early childhood, K-12 and postsecondary education landscape. The fund also intends to back emerging approaches to workforce development, hiring and training at a time when skills and equity gaps challenge employers.

“There is an urgent need for investments into new educational models with proven efficacy that address the nation’s achievement and skills gaps,” said Mark Grovic, general partner and founder of New Markets Venture Partners. “Given ACT’s long history of leadership and innovation in assessment and college and career readiness, New Markets is excited to work with them to identify and support the nation’s best education entrepreneurs working to remove the obstacles that prevent students from reaching their full potential. ACT will be a great partner both for us and for our portfolio companies.”

ACT will be the largest strategic investor in the new fund, which recently named former Bill and Melinda Gates Foundation Deputy Director Jason Palmer as general partner. Other investors include the Lumina Foundation, Strada Education Network, ECMC Group and Prudential Financial.

“ACT’s strategic investment philosophy is to leverage our strong heritage as an authority in education and career research, coupled with our industry presence and influence, to enter new markets and advance our mission and long-term strategy,” said Brad Lindaas, ACT vice president for strategy. “Our investment with New Markets aligns directly with this philosophy.”


About ACT

ACT is a mission-driven, nonprofit organization dedicated to helping people achieve education and workplace success. Headquartered in Iowa City, Iowa, ACT is trusted as a national leader in college and career readiness, providing high-quality assessments grounded in nearly 60 years of research. ACT offers a uniquely integrated set of solutions designed to provide personalized insights that help individuals succeed from elementary school through career.

New Markets Venture Partners

Founded in 2003, New Markets is one of the leading education technology-focused venture firms in the U.S. New Markets has built strong relationships with states, districts, universities and other centers of innovation that allow it to provide exceptional value to its portfolio companies. Current investments include: Graduation Alliance, a leading dropout prevention and recovery firm helping students earn diplomas, credentials and jobs; Presence Learning, a pioneer in the application of teletherapy in K-12 schools; leading digital credential platform Credly; and StraighterLine, which offers low-cost college courses accepted by over 100 accredited partner colleges. Notable exits include Moodlerooms (acquired by Blackboard), Starfish (acquired by Hobsons) and Questar Assessment (acquired by ETS).

Agreement Increases Access to ACT WorkKeys, ACT WorkKeys Career Readiness Certificate for Millions of Mexican Workers

IOWA CITY, Iowa—A new agreement between CONOCER, a Mexican training institution, and INFONACOT, a Mexican government trust fund, will remove barriers and pave the way for millions of Mexican workers to take the ACT® WorkKeys® Assessments and certify their work readiness.

INFONACOT supports workers in Mexico by offering them small loans to buy durable consumer goods, which are repaid through programmed discounts taken from the workers’ salaries. The agreement with CONOCER, a public entity of the Mexican federal government, opens a new credit line to finance loans for workers to pay for workforce credentialing and training, including WorkKeys, through INFONACOT.

“We are impressed with the ingenuity of the INFONACOT program and pleased that the government has expanded the list of eligible goods to include skills training and professional development,” said ACT Chief Commercial Officer Suzana Delanghe. “INFONACOT was originally created to allow workers to buy cars, equipment, housing and healthcare so they could have the tools necessary to succeed and thrive. Including education and skills training shows the program recognizes that these are also essential tools to drive upward mobility and fuel success.”

“We are truly gratified to see our workforce solutions become a part of this admirable program and know that it will remove barriers for workers who are motivated to gain new skills,” said Jacqueline Krain, ACT vice president for international markets. “We hope this agreement will open doors for many Mexican workers to take advantage of new opportunities, advance their careers and improve their lives.”

WorkKeys Assessments measure skills needed for success in the workplace in areas such as applied mathematics, reading for information, and locating information. They have been used for more than two decades to measure essential workplace skills and help individuals build career pathways.

As announced this past May, the WorkKeys Assessments are distributed in Mexico through an agreement with CONOCER and FUNDAMEE. Individuals who score high enough on WorkKeys will earn an ACT® WorkKeys® Career Readiness Certificate that documents essential work skills. The certificate, which is recognized in both the US and Mexico, is used by job seekers as a credential and by employers as a way to analyze the qualifications of job candidates. Nearly 3.8 million individuals have earned an ACT® WorkKeys® National Career Readiness Certificate® in the US.

About ACT

ACT is a mission-driven, nonprofit organization dedicated to helping people achieve education and workplace success. Headquartered in Iowa City, Iowa, ACT is trusted as a leader in college and career readiness, providing high-quality assessments grounded in nearly 60 years of research. ACT offers a uniquely integrated set of solutions designed to provide personalized insights that help individuals succeed from elementary school through career.

What the Research Says About the Effects of Test Prep

There have always been claims that targeted interventions can increase scores on academic achievement tests. Much of the media attention has focused on the extent to which commercial test preparation courses can raise scores on college admissions tests.

This is a highly controversial issue in education because it addresses fundamental questions about test validity and fairness.  If a modest intervention such as a test prep program can result in a large increase in test scores, then what does that say about the validity of scores earned, both by students who received the intervention and by those who did not?

A thoughtful piece by Jim Jump published last month in Inside Higher Ed (5/22) raises some of the same issues and questions about recent claims of large score increases on the SAT based on moderate amounts of “instruction.”

The interest in this topic provides an opportunity to review the research on test preparation in general and to make some connections to similar claims made about the impact of other types of interventions on achievement.

To cut to the chase: The research clearly suggests that short-term test prep activities, while they may be helpful, do not produce large increases in college admission test scores.

There are some fundamental principles about admissions test scores that have remained constant across numerous redesigns, changes in demographics, and rescaling efforts. They include the following:
  • Scores on college admissions tests (as well as most cognitive tests) generally increase with retesting, so any claim about score increases must statistically explain the proportion of change attributed to additional time, practice, and growth apart from any intervention1.
  • John Hattie’s exhaustive synthesis of over 800 meta-analyses related to achievement shows that almost any type of intervention—more homework, less homework, heterogeneous grouping, homogeneous grouping—will show some positive effect on student achievement; it is hard to stop learning2. In general, however, smaller and shorter interventions have less impact on student achievement. Web-based instruction has an effect size of .30, which is certainly good; the average effect size across all interventions, however, is less than .40.
  • Students who participate in commercial coaching programs differ in important ways from other test takers.  They are more likely than others to:  be from high income families, have private tutors helping them with their coursework, use other methods to prepare for admission tests (e.g., books, software), apply to more selective colleges, and be highly motivated to improve their scores.  Such differences need to be examined and statistically controlled in all studies on the impact of interventions. Claims about the efficacy of instructional interventions and test preparation programs on test scores have been shown to be greatly exaggerated.
  • There have been about 30 published studies of the impact of test preparation on admissions test scores.  Results across these studies are remarkably consistent. They show a typical student in a test prep program can expect a total score gain of 25 to 32 points on the SAT 1600-point scale, and similar respective results can be found for the ACT and GRE. The reality is far less than the claims.
  • In 2009, Briggs3 conducted the most recent comprehensive study of test preparation on admissions tests. He found an average coaching boost of 0.6 point on the ACT Math Test, 0.4 point on the ACT English Test, and -0.7 point on the ACT Reading Test4. Similarly, test preparation effects for the SAT were 8 and 15 points on the reading and math sections, respectively.  The effects of computer-based instruction, software, tutors and other similar interventions appear no larger than those reported for test preparation.
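To put the raw-point boosts above on the standardized scale used elsewhere in this discussion, one can divide each gain by the score scale's standard deviation. The sketch below does so with assumed SDs (roughly 6 points for an ACT subject score and roughly 100 points for an old-scale SAT section); these SD values are illustrative ballpark figures, not numbers from the article:

```python
def effect_size(raw_gain, score_sd):
    """Standardized effect size (Cohen's d) of a raw score gain."""
    return raw_gain / score_sd

# Assumed score-scale SDs (illustrative, not from the article):
ACT_SUBJECT_SD = 6.0
SAT_SECTION_SD = 100.0

# Coaching boosts reported by Briggs (2009), expressed as effect sizes:
print(round(effect_size(0.6, ACT_SUBJECT_SD), 2))  # ACT Math boost: 0.1
print(round(effect_size(15, SAT_SECTION_SD), 2))   # SAT math boost: 0.15
```

Under these assumptions, the reported coaching boosts correspond to effect sizes of roughly 0.1 to 0.15, well below the .40 average intervention effect cited from Hattie's synthesis.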
Claims about score increases and to what they may be attributed are among the most difficult to make and verify. There are factors that confound the results, such as differences in student motivation to improve, regression to the mean, and the fact that oftentimes students engage in multiple activities to increase their scores. However, research in this area consistently refutes claims of large increases in average or median scores.

There have been many more studies that attempted to examine the impact of instructional programs on achievement. Such studies are equally difficult to conduct and equally unlikely to show effect sizes larger than the typical growth students experience simply from another year of instruction, coursework, and development. Research-based interventions that are personalized to the learner can improve learning, and increased learning will impact test scores. To support such claims, however, research studies that account for the representativeness of the sample, equivalent control groups, the extent and type of participation in the intervention, and many other contextual factors need to be conducted and published. This is how we advance scientific knowledge in education, or in any field.

Jim Jump’s previously referenced column identified many questions and possible problems associated with the claims related to the efficacy of participation in Khan Academy’s programs on SAT scores.  However, few if any of these questions or concerns can be answered, simply because no research behind the claims has been made available by the College Board to review or examine—all we have is their press release—and claims can neither be supported nor refuted when there is no methodology to examine.  Further speculation about the efficacy of this intervention is not helpful.  But there are some additional facts about testing, interventions, and claims of score increases to consider when we read any claims or research on the subject.

First, while test preparation may not lead to large score increases, it can be helpful. Students who are familiar with the assessment, have taken practice tests, understand the instructions and have engaged in thoughtful review and preparation tend to be less anxious and more successful than those who haven’t. Such preparation is available for free to all students on the ACT website and other sources.

Second, the importance of scores on tests such as the ACT and SAT continues to be exaggerated. What is ultimately important is performance in college.

We know that some interventions can increase test scores by two-thirds of a standard deviation. The question should be whether there is evidence of a similar increase in college grades (which is the outcome that admissions tests predict).  Claims that test preparation could result in large score increases required serious investigation because they threatened to undermine the validity of admission testing scores. Simply put, if an intervention increases test scores without increasing college grades, then there is some bias present in some scores.

It is possible that the scores of students participating in test prep or another intervention are being over-predicted and will not translate into similar increases in college grades. Or it could be that the scores of students who have not engaged in test prep have been under-predicted.

Hardison and Sackett5 demonstrated that a 12-hour intervention could increase performance on an SAT writing prototype while also increasing performance on other writing assignments. While this was a preliminary experimental study of the coachability of a new writing assessment, it demonstrated that instruction could result in better writing both on a test and in course assignments.

This type of study highlights the types of questions that are raised whenever claims of large score increases are reported.  When results are too good to be true (and even when they are not), it is always better to verify.

Claims that test preparation, short-term interventions, or new curricular or other innovations can produce large score gains on standardized assessments are tempting to believe. These activities require far less effort than enrolling in more rigorous courses in high school or other endeavors that demand years of learning.

If we find an intervention that increases only a test score without a similar effect on actual achievement, then we need to be concerned about the test score. And when we hear claims about score increases that appear to be too good to be true, we need to conduct research based on the same professional standards to which other scientific research adheres. Because if it sounds too good to be true, it very likely is.

1 See the What Works Clearinghouse criteria: https://ies.ed.gov/ncee/wwc/
2 Hattie, J. (2009). Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
3 http://www.soe.vt.edu/highered/files/Perspectives_PolicyNews/05-09/Preparation.pdf
4 ACT scores are on a 1-36 scale so these raw numbers can’t be compared to the SAT. These effects represent a full model which controls for differences in socioeconomics, ability, and motivation between a baseline group and a test preparation group. Yes, students in the coached group saw a decrease in ACT reading scores relative to uncoached students.
5 Hardison, C. M., & Sackett, P. R. (2008). Use of writing samples on standardized tests: Susceptibility to rule-based coaching and resulting effects on score improvement. Applied Measurement in Education, 21(3), 227-252.

Adaptive Learning Expert to Join ACT

IOWA CITY, Iowa—David Kuntz, an internationally acclaimed expert in adaptive learning, will join ACT on July 31, 2017 as principal adviser to the CEO for adaptive learning.

At ACT, Kuntz will work on the design of a large-scale, cloud-based adaptive learning platform and on strategy and design for ACT’s adaptive learning initiatives.

Prior to accepting his position with ACT, Kuntz was chief research officer at Knewton. He previously served as Knewton’s vice president of research and adaptive learning.

“David Kuntz is a global leader in a field that promises to make education more effective and efficient than ever before,” said Marten Roorda, ACT chief executive officer. “Historically, teaching has been constrained by the inherent limitations of one teacher guiding many students, each of whom may be at a different place in their understanding. Adaptive learning overcomes those limits and gives teachers and students powers they never had before.”

Adaptive learning uses computers as interactive teaching devices, providing real-time, student-specific instructional information based on each student’s responses.

In a properly designed adaptive learning environment, the system uses data generated by students’ interactions to understand both the students and the content and identifies what is most likely to help each student, moment-by-moment. This can mean, for example, providing a student with instruction on a particular topic, assessing his or her understanding of a particular concept or providing practice opportunities to strengthen skills.
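The moment-by-moment loop described above can be sketched with a simplified mastery model: update an estimate of each skill after every response, then recommend the skill the student most needs. This is a minimal illustration of the general idea, using a Bayesian Knowledge Tracing-style update; the parameter values and skill names are assumptions for the example, not details of ACT’s or Knewton’s actual platforms.

```python
# Illustrative parameters: chance of learning after an attempt, of slipping
# on a mastered skill, and of guessing correctly on an unmastered one.
P_LEARN, P_SLIP, P_GUESS = 0.15, 0.10, 0.20

def update_mastery(p, correct):
    """Posterior probability the student has mastered a skill, given one response."""
    if correct:
        evidence = p * (1 - P_SLIP) + (1 - p) * P_GUESS
        posterior = p * (1 - P_SLIP) / evidence
    else:
        evidence = p * P_SLIP + (1 - p) * (1 - P_GUESS)
        posterior = p * P_SLIP / evidence
    # Account for learning that may occur from the attempt itself.
    return posterior + (1 - posterior) * P_LEARN

def next_skill(mastery):
    """Recommend practice on the skill the student is least likely to have mastered."""
    return min(mastery, key=mastery.get)

# Start each (hypothetical) skill at an uninformed 50% mastery estimate.
mastery = {"fractions": 0.5, "ratios": 0.5, "percents": 0.5}
mastery["fractions"] = update_mastery(mastery["fractions"], correct=True)
mastery["ratios"] = update_mastery(mastery["ratios"], correct=False)
print(next_skill(mastery))  # recommends the skill with the lowest estimate
```

After one correct answer on fractions and one incorrect answer on ratios, the estimate for ratios drops below the others, so the system would route the student to ratios practice next, exactly the kind of moment-by-moment decision the paragraph above describes.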

“Each student arrives in the classroom with different needs, different skills, different backgrounds,” said Kuntz. “Adaptive learning is a data-driven way to help teachers understand these differences and provide them with student-specific resources to better support each student's mastery of learning objectives. An adaptive and personalized approach can put the goal of true mastery within reach.”

Kuntz has been awarded five patents, with a sixth pending. His patents relate to using technology to support and improve data-driven test assembly, performance scoring, and reporting. His current patent application covers technology that provides near-real-time personalized educational recommendations to students.

Kuntz earned an executive master’s degree (MSE/MBA) in technology management from the Wharton School at the University of Pennsylvania, a master’s degree in philosophy from Rutgers University, and a bachelor’s degree in philosophy from Brown University.

About ACT

ACT is a mission-driven, nonprofit organization dedicated to helping people achieve education and workplace success. Headquartered in Iowa City, Iowa, ACT is trusted as a national leader in college and career readiness, providing high-quality assessments grounded in nearly 60 years of research. ACT offers a uniquely integrated set of solutions designed to provide personalized insights that help individuals succeed from elementary school through career.