Saturday, March 8, 2014

Are American universities better than Scandinavian ones? It depends on your definition of "good."

Image: Yeah, we could probably beat them at college football.


In this blog, I try not to stray too far from topics I know enough about to comment on. Educational policy is certainly not an area where I know enough to make a meaningful contribution. But university education is an important part of social welfare, and there is one particular point I want to emphasize, as it pertains directly to my goals in this blog. Americans are quick to assume that American universities are better than Scandinavian ones. But there's really no evidence that that's the case, in large part because collecting comparable data, especially across different countries, is extremely difficult.

But there is another issue. What do we mean by good? What do we want our universities to accomplish? This tension is readily apparent in the debate over the value of American higher education. As with most issues, the opposing camps are having two entirely separate debates and just talking past each other. To see what I mean, look at what happens when researchers try to measure student learning at universities:
There are three basic ways of trying to measure how well colleges educate students.  The most obvious is to use some form of a standardized test.  That's how K-12 schools are evaluated.  Given the difficulty and controversy K-12 testing has entailed, using standardized tests for college students might seem impossible at first.  Elementary and secondary students are at least expected to complete similar courses, to learn the same rules of punctuation and applications of the Pythagorean theorem.  Undergraduate studies are far more diverse: Some students choose to spend four years immersed in Ovid, others in organic chemistry.

But there turns out to be an answer: Instead of testing discrete pieces of knowledge, test the higher-order critical thinking, analysis, and communication skills that all college students should learn (and which employers value most).  The Collegiate Learning Assessment, recently developed by a subsidiary of the RAND Corporation, does exactly that.  Instead of filling in bubbles with a No. 2 pencil, CLA test-takers write lengthy essays, analyzing documents and critiquing arguments.
Unfortunately for us, the results of these tests are kept secret; we only know them in aggregate. Nationally, using the CLA, 36% of college students show "exceedingly small or empirically nonexistent" learning after four years of college. Yes, over a third of college students learn nothing or almost nothing in four years of college. It's little wonder universities fight to keep these scores secret.

But if some students are learning nothing (or next to nothing), who are they? Are they concentrated at rural Moo-U colleges? At grimy, urban, blue-collar commuter colleges? Flagship research institutions? Or are the students who learn nothing concentrated among bottom-shelf-cocktail-swilling liberal arts majors at all four-year colleges and universities?

Fortunately, we can get some idea from the University of Texas system, whose CLA results are all public. The "best" UT university--that is, the one with the highest US News and World Report ranking and the best reputation--is the flagship university at Austin. UT-Austin performed abysmally.  The most student learning actually occurred at the "worst" UT schools: UT-San Antonio, UT-El Paso, and UT-Permian Basin, all of which are near the bottom of the US News and World Report college rankings.

It's not just the CLA. Another tool used to assess universities, the National Survey of Student Engagement, shows basically the same results--student gains are shockingly small, and student learning is utterly uncorrelated with US News and World Report college rankings.

This raises the question: if flagship universities don't exist to teach students, what are they there for?

The CLA and NSSE results are consistent with my experience at the very highly ranked flagship public research university I attended for my undergraduate degree. My undergrad alma mater is one of the best-regarded research institutions in the entire world, and student learning suffered as a direct result.  There were many noble exceptions, of course, but professors were not there to teach--they were there to conduct research.  Their research--not their effectiveness in educating students--was what earned tenure, respect, grant money--basically, everything. Unsurprisingly, as a whole, they neglected their role as teachers. Teaching is difficult; adequately preparing for a lecture would have taken too much time away from their research. Even the TAs took very little interest in teaching: graduating requires a thesis, which in turn requires research, not teaching.

I obtained my graduate degree from a much smaller, much lower-ranked university in the same state university system. This school is poorly regarded for research--for good reason--but its teaching was excellent (with a few notable exceptions), and that's why I chose it.  Small class sizes allowed for debate and discussion--with the professor--which was entirely impossible in my enormous undergraduate classes. Professors were rewarded for their ability to teach, not for their research. In short, the professors took their teaching responsibilities very seriously--but this meant that they spent very little time doing research.  One of my professors was among the few faculty members at the entire university who were respected for their research; he was certainly the only one in the social work department. But he spent the least amount of time preparing for classes, and it showed. His PowerPoint slides made little sense, he was disorganized and unprepared, and he struggled to maintain our attention. Very little learning occurred in his classroom. Obviously, had he done a better job preparing for his teaching responsibilities, his research would have suffered.

Well-regarded institutions would seem to be scams.  You go, ostensibly to learn, but your tuition pays the salaries and overhead of faculty who have no interest in teaching you.  Along with whatever grants they pull in, your tuition goes to support their research. The official goal of American universities is teaching; the unspoken, actual goal is research. Universities pay lip service to their commitment to education by forcing those researchers to spend a few hours each week, maybe one semester every two years, mumbling over muddled PowerPoint slides in front of 300 or so disengaged students.  Because there are only so many hours in the day, you can be a good researcher or a good teacher, not both.

My point here is that the scholarship and basic research that take place at America's flagship universities are public goods, well worth our society's investment. But university education is also a public good, also well worth our society's investment. Perhaps we need to invest in these things separately. Perhaps it would be better for our education and research institutions to be separate entities. Large research universities are awful at teaching; small teaching colleges are awful at research. We should probably stop pretending that these two goals aren't in direct contention with each other.

Scholarship and basic research--as well as university education--are worthwhile public goods deserving of our investment. We disregard either at our own peril.
