Saturday, March 22, 2014

Another blow against the idea that poverty itself cannot explain the diminished academic performance of poor students

I'm not an expert in educational policy, so I don't have much to say about it. I wrote previously about how it's long been known that poverty is the greatest predictor of student performance, and how parents or teachers need to be scapegoated in order to justify the poverty and inequality that are the real source of the problems in the American education system.

Basically, if poverty and inequality are the real causes of poor student performance, then there is an obligation to alleviate poverty and inequality. But that's a very expensive proposition. What if there were a cheaper, more convenient explanation for poor student achievement? Who can we blame instead?

Poor parents are a popular scapegoat: If poor parents were more involved in their children's lives, impoverished students would be doing just fine in school. If only poor parents had the initiative and desire to exercise the same level of responsibility as rich parents.

Dana Goldstein explains how a major study has just fatally wounded the scapegoating of poor parents:
In the largest-ever study of how parental involvement affects academic achievement, Keith Robinson, a sociology professor at the University of Texas at Austin, and Angel L. Harris, a sociology professor at Duke, mostly found that it doesn’t. The researchers combed through nearly three decades’ worth of longitudinal surveys of American parents and tracked 63 different measures of parental participation in kids’ academic lives, from helping them with homework, to talking with them about college plans, to volunteering at their schools. In an attempt to show whether the kids of more-involved parents improved over time, the researchers indexed these measures to children’s academic performance, including test scores in reading and math.

What they found surprised them. Most measurable forms of parental involvement seem to yield few academic dividends for kids, or even to backfire—regardless of a parent’s race, class, or level of education.

Saturday, March 8, 2014

Even the top tier of America's two-tiered welfare system is a really bad deal

Image: Excuse me, taxpayers--do you get a better deal living in the United States or in a social democracy? (source)

One of the most common arguments against social democratic programs is that the taxes needed to finance them are simply too high. But citizens in social democracies get loads of benefits in return for those high taxes. Is it worth it to pay those extra taxes? Let's find out.

This isn't a straightforward question. Many of the services all citizens of social democracies get from the government are available in the American private market--like health insurance, day care, retirement plans, etc. In other words, the United States government doesn't provide (most) people with day care services, but that doesn't mean you can't have them--you just have to pay for them in the private market. Essentially, we're asking: Who gets a better deal? A typical Dane, who gets X, Y, and Z services from the Danish government? Or a typical American, who gets X from the government but purchases Y and Z from the private market?

To start, we'll compare the costs of both systems: how much a typical Dane pays in federal and social security taxes--versus how much a typical American pays in federal and social security taxes, plus the cost of social welfare services purchased through the private market (i.e., employer-sponsored health insurance, day care, 401(k) contributions, etc). We'll then compare what services are obtained through those investments (whether taxes or private spending), and hopefully be able to see who gets a better deal.
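The comparison described above boils down to simple addition on each side of the ledger. Here is a minimal sketch of that arithmetic; all of the dollar figures are hypothetical placeholders chosen purely for illustration, not real tax or cost data.

```python
# Sketch of the cost comparison: total social-welfare cost is
# taxes paid plus whatever is bought on the private market.
# NOTE: every number below is a made-up placeholder, not real data.

def total_cost(taxes, private_purchases):
    """Taxes plus out-of-pocket private spending on welfare services."""
    return taxes + sum(private_purchases.values())

# Hypothetical Dane: higher taxes, but health care, day care, and
# retirement are bundled into what the government provides.
dane = total_cost(taxes=30_000, private_purchases={})

# Hypothetical American: lower taxes, but buys health insurance,
# day care, and retirement savings on the private market.
american = total_cost(
    taxes=18_000,
    private_purchases={
        "health insurance": 8_000,
        "day care": 9_000,
        "401(k) contributions": 5_000,
    },
)

print(f"Dane: ${dane:,} vs. American: ${american:,}")
```

The point of the sketch is only that the fair comparison is between the two totals, not between the two tax bills alone.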

Excited? I hope so. Let's start with what the typical American has to pay for social welfare services.

Are American universities better than Scandinavian ones? It depends on your definition of "good."

Image: Yeah, we could probably beat them at college football. (source)

In this blog, I try not to stray too far from topics I know enough about to comment on. Educational policy is certainly an area I do not have enough knowledge to make a meaningful contribution to. But university education is an important part of social welfare, and there is one particular point I want to emphasize, as it pertains directly to my goals in this blog. Americans are quick to assume that American universities are better than Scandinavian ones. But there's really no evidence that that's the case; a major reason is that data collection, especially across different countries, is extremely difficult.

But there is another issue. What do we mean by good? What do we want our universities to accomplish? This tension is readily apparent in the debate over the value of American higher education. As with most issues, opposing camps are having two entirely separate debates, and are just talking past each other. To see what I mean, look at what happens when researchers try to measure student learning at universities:
There are three basic ways of trying to measure how well colleges educate students.  The most obvious is to use some form of a standardized test.  That's how K-12 schools are evaluated.  Given the difficulty and controversy K-12 testing has entailed, using standardized tests for college students might seem impossible at first.  Elementary and secondary students are at least expected to complete similar courses, to learn the same rules of punctuation and applications of the Pythagorean theorem.  Undergraduate studies are far more diverse: Some students choose to spend four years immersed in Ovid, others in organic chemistry.

But there turns out to be an answer: Instead of testing discrete pieces of knowledge, test the higher-order critical thinking, analysis, and communication skills that all college students should learn (and which employers value most).  The Collegiate Learning Assessment, recently developed by a subsidiary of the RAND Corporation, does exactly that.  Instead of filling in bubbles with a No. 2 pencil, CLA test-takers write lengthy essays, analyzing documents and critiquing arguments.
Unfortunately for us, individual schools' results are kept secret; we only know the results in aggregate. Nationally, using the CLA, 36% of college students show "exceedingly small or empirically nonexistent" learning after four years of college. Yes, over a third of college students learn nothing or almost nothing in four years of college. It's little wonder universities fight to keep these scores secret.

But if some students are learning nothing (or next to nothing), who are they? Are they concentrated at rural Moo-U colleges? At grimy, urban, blue-collar commuter colleges? Flagship research institutions? Or are the students learning nothing concentrated among bottom-shelf-cocktail-swilling liberal arts majors at all four-year colleges and universities?

Fortunately, we can get some idea from the University of Texas system, whose CLA results are all public. The "best" UT university--that is, the one with the highest rank in the US News and World Report, and the one with the best reputation--is the flagship university at Austin. UT-Austin performed abysmally.  The most student learning actually occurred at the "worst" UT schools: UT-San Antonio, UT-El Paso, and UT-Permian Basin, all of which sit near the bottom of the US News and World Report college rankings.

It's not just the CLA. Another tool used to assess universities, the National Survey of Student Engagement, shows basically the same results--student gains are shockingly small, and student learning is utterly uncorrelated with US News and World Report college rankings.

This raises the question: if flagship universities don't exist to teach students, what are they there for?