Nowadays you hear it everywhere: college students are lazy, they care more about campus amenities than about studying, and in general their quality has declined over time. In this post I discuss a working paper that evaluates this trend, and try to hypothesize what drives it.
Indeed, Babcock and Marks (2010) show that there has been a clear decline in study hours per week from 1961 to 2003. Back in the day, about 24 hours per week was the standard; by 2003 this had declined to 14, a 10-hour drop. For reference, colleges generally state that a 30-hour-per-week study load is appropriate.
This trend is remarkably consistent: it is observed in numerous surveys and over various other time periods from 1928 to 2008. What does not appear to be driving the decline is employment, parental education, gender, major, or college selectivity. In other words, study times have declined both for workers and non-workers, for men and women, and across all majors.
The decline clearly happened, and it is obviously not driven by any of the easily measurable demographic variables mentioned above. So what is the reason? Let me first talk about the authors’ ideas, then about what I think.
Improvement in education technology. The internet can replace libraries, and writing term papers is certainly less time-consuming. This is true, but the authors note that most of the decline happened prior to 1981, when these technologies were not yet available. Furthermore, the decline also occurred (with roughly the same magnitude as the average) in majors that require little to no library research or term-paper writing, such as engineering or math.
College standards have fallen. This indeed seems like a very important reason for the decline. The evidence includes, for instance, the fact that teaching evaluation forms gained popularity in the 60s and 70s, and today they are probably the most important metric of the quality of professors’ teaching. Evaluation by students puts clear pressure on professors to make courses easier, sacrifice quality, and inflate grades, whereas the authors “are hard-pressed to name any reliable, non-internal reward instructors receive for maintaining high effort standards—and the penalties for doing so are clear.”
Changing structure of colleges. The role of good grades (which are usually the result of more studying) is to distinguish yourself from the rest of your class, which in turn signals to employers that you have high ability. There is, however, evidence that differences between colleges (in terms of student quality) have increased over time, while differences within colleges have decreased. In other words, almost all bright students are at elite colleges, almost all mid-ability students are at mid-tier colleges, and so on. Since students at the same college are of similar ability, grades have become a relatively uninformative indicator of ability: they have lost their role as a signal of high ability to employers. But since the differences between schools have increased, the name of your alma mater has become a much more reliable signal of your ability. Students may therefore concentrate more on getting into a good school, and less on getting good grades.
Some explanations that I think may be behind the trend:
Changing teaching styles. I am not 100% sure about this, but as far as I know the general trend is that in the past the educational system as a whole was more memorization-oriented. In the 60s, then, one had to memorize a lot of material because exams tested memory. Perhaps this started to change, and the advent of the internet reinforced (or started?) the trend: in today’s world we have Google to look up almost anything, so why waste our mental resources on memorization? Why do I need to memorize how to check whether a function is concave when the answer is always one click away? Indeed, nowadays even leading schools such as Caltech give mostly open-book exams. Of course, since most of the decline in study hours happened prior to the 1980s, this cannot be driving the trend as a whole, but it could be partially responsible for the further decline later on.
Smarter students. The authors rule this out. Again, I’m not 100% sure, but it has indeed been shown that IQ scores have been rising over time (the Flynn effect). Beyond that, all I have is anecdotal evidence and a hypothesis. While the genetics underlying IQ could not possibly have changed in such a short time span, the environment of children could and did. College education skyrocketed around the 60s and 70s, so children born in subsequent years (the 70s and afterwards) were much more likely to be raised by college-educated parents, which generally meant an environment more advantageous for IQ. Also, with the technological changes around us, children born in the 80s and 90s have probably had greater exposure to educational information via TV, the internet, educational toys, etc.
Lazier students. In the 60s, and even more so before that, people who went to college were generally highly talented individuals. College was selective: in 1960, less than 10% of those 25 or older had a Bachelor’s degree or higher. Contrast this with today, when going to college is the default option for all children. Every parent wants to send their kid to college; even if the kid is clearly not college material, the parent will try to get them into some school. This is undoubtedly the state of the world today. It means that while in the 60s only those who truly wanted to go to college went, in the 21st century colleges have become diploma mills. Naturally, there is going to be a difference in motivation between these two groups of individuals, and hence we see a decline in study hours.
What are the implications of this trend? If knowledge (which can perhaps be lost by studying less) is an important driver of growth and technological progress, could this trend be bad news? The authors test whether hours studied (a proxy for knowledge) is a significant determinant of future wages (a proxy for productivity). They regress wages at two-year intervals from 1986 to 2004 on hours studied per week in 1981, while controlling for gender, year in college (freshman, sophomore, etc.), and a mental aptitude score. They find no wage difference in the early post-college years, but a significantly positive effect in later years.
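To make this setup concrete, here is a minimal sketch of such a regression on synthetic data. All variable names, coefficients, and the sample itself are my own invention for illustration, not the paper’s actual data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Synthetic stand-ins for the survey variables described above
hours_1981 = rng.uniform(5, 40, n)      # weekly study hours in 1981
female = rng.integers(0, 2, n)          # gender dummy
class_year = rng.integers(1, 5, n)      # freshman=1 ... senior=4
aptitude = rng.normal(0, 1, n)          # mental aptitude score

# A later-year wage, generated with an assumed (made-up) effect of study hours
wage_later = (10 + 0.2 * hours_1981 + 2 * female
              + 1.5 * aptitude + rng.normal(0, 5, n))

# OLS: regress the later wage on 1981 study hours plus the controls
X = np.column_stack([np.ones(n), hours_1981, female, class_year, aptitude])
beta, *_ = np.linalg.lstsq(X, wage_later, rcond=None)
print(round(beta[1], 2))  # estimated coefficient on study hours
```

With enough data, the estimated coefficient on `hours_1981` recovers the effect built into the simulation; the point of the sketch is only to show the shape of the specification, one wage regression per survey year.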
So the authors conclude “[i]f one believes that declining study time signifies declining acquisition of human capital, as suggested by the evidence here, then the study time trend is a serious problem.” I completely disagree with this.
In my opinion, the authors’ regression is flawed; at the very least, it does not show what they want it to show. First, there is no control whatsoever for personality traits such as diligence, social skills, self-discipline, or self-motivation. All of these are clearly highly correlated with how much one studies, and they clearly influence one’s future earnings. In other words, it may not be hours studied per se that makes hard workers earn more; it may be these other traits, which are not controlled for.
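This omitted-variable concern is easy to demonstrate with a small simulation (all numbers are invented for illustration): give study hours zero true effect on wages, let an unobserved trait drive both hours and wages, and a naive regression of wages on hours still finds a positive “effect”:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

diligence = rng.normal(0, 1, n)                    # unobserved personality trait
hours = 14 + 5 * diligence + rng.normal(0, 3, n)   # the trait drives study hours...
wage = 30 + 4 * diligence + rng.normal(0, 5, n)    # ...and wages; hours has NO causal effect

# Simple OLS slope of wage on hours, with no control for the trait
slope = np.cov(hours, wage)[0, 1] / np.var(hours, ddof=1)
print(round(slope, 2))  # clearly positive, despite a true causal effect of zero
```

The spurious slope here is just the correlation induced by the common cause; controlling for `diligence` would drive it to zero, but in the real data no such control exists.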
Second, the fact that the wage difference only shows up quite late (specifically, 10+ years after graduation) points in the same direction. Who remembers what they studied in college right after graduation? Probably 20% of the graduates… Who remembers it one year after graduation? Hardly anyone. And who remembers it 10+ years after graduation, when the coefficient on hours studied finally becomes significant? Clearly, no one does. If hours studied were driving these results, the effect should appear right after college, when graduates actually still remember some of what they studied.
On the other hand, the traits I mentioned above (diligence, self-discipline, etc.) are exactly the kind of hidden skills that can take years to surface and prove themselves in big organizations. So the fact that wage differences only appear in later years seems to confirm the hypothesis that it is these skills, not hours studied, that drive the result.
So no, in my opinion one need not worry about human capital being lost due to the lack of high standards at our schools. Motivated people will learn a lot no matter what, and unmotivated people can be forced to learn, but they will surely not initiate great technological advances if the only reason they learned anything was that they were forced to. (Also note that most of those who now spend their late teens and early 20s in college studying a “measly” 14 hours a week would, in the 1960s, have spent those same years working at some factory, studying exactly 0 hours a week.)
In sum, there is a clear decline in hours studied at our colleges, and it is not driven by any single demographic group. The most likely explanations, I think, are falling standards at colleges, as students are increasingly treated as consumers, and the transformation of college education into a mass-produced good, whereby everyone, even unmotivated, less hard-working people, goes to college. For better or worse, acquiring a college education has become the new standard for the middle class, and with this come falling standards and lazier students. In my opinion, however, there is no need to worry about potential long-term consequences. The people who were in college in the 60s studying 24 hours a week are still here; they are just outnumbered by lazier students who pull down the average. In short, the role of colleges, and with it their population, has changed.