Archive for: July, 2012

When Andrew Hacker asks "Is Algebra Necessary?", why doesn't he just ask "Is High School Necessary?"

Yes, I admit, the editorial at the New York Times entitled "Is Algebra Necessary?" pushes my buttons. Hacker makes some valid and relevant points, and I'll get back to that. However, the core of his argument is the ultimate in anti-intellectualism. What's worse, it's the kind of anti-intellectualism that you get from intellectuals, the sort of thing that sprouts from those on the math-ignorant side of the "two cultures" identified by C. P. Snow.

Andrew Hacker's argument against making algebra necessary for high school and college students is essentially: Math Is Hard. Having to do it gets in the way of people who could be amazing at other things, because they will drop out of high school because Math Is Hard. So, rather than stop them from achieving all that they might achieve, we should just remove algebra from the high school curriculum. He points out that failing math is one of the main reasons students leave school. Now, I might think that this is a reason to look at our educational culture, at how math is taught, at the fact that it is somehow deemed acceptable and indeed normal to find basic math impenetrable. But, if you're on the other side of the two cultures, evidently this means that we as a society should just give up on the general teaching of basic algebra. Evidently, it's OK that the elites who understand the simplest things about science become that much more separated from the general educated public, and that the generally educated public know that much less about them.

There's one particular part of the argument I want to highlight:

Nor is it clear that the math we learn in the classroom has any relation to the quantitative reasoning we need on the job. John P. Smith III, an educational psychologist at Michigan State University who has studied math education, has found that “mathematical reasoning in workplaces differs markedly from the algorithms taught in school.” Even in jobs that rely on so-called STEM credentials — science, technology, engineering, math — considerable training occurs after hiring, including the kinds of computations that will be required.

So, because algebra isn't what's needed in jobs, we shouldn't be teaching it. This is absolutely the wrong way to think about a lot of education.

If you accept that argument, we need to reevaluate the entire high school curriculum, and the entire core curriculum of all colleges and universities. I think most people would agree that you need to be able to read and write in order to function in today's society. Do you really need to be able to interpret themes in literature, however? Honestly, is anything that you do in high school or college English classes really necessary in the workplace, any more than algebra is? The kind of reading and writing that most people need is something that students should already know by the time they're out of middle school. Likewise with history, biology, and all the rest: little of what students study in high school is going to be necessary for their jobs. And, really, if the purpose of high school and college is to train people to function like good little Betas and Gammas within our economic system, why is Andrew Hacker singling out algebra for attack? If we're going to dumb down the curriculum because we don't like that right now some people aren't mastering it, why don't we just dumb it down all the way?

The simple fact is that a college or university education is not job training. In recent decades, it's become conflated with job training, at least in North America, and this is too bad. A liberal arts education is all about expanding your mind, all about being able to think. It's not about gaining skills that you are then going to use in a job. Too many of us professors tend not to have any clue what somebody is supposed to do to earn a living after a liberal arts education other than go to graduate school (so that your liberal arts education is "training" for what you do next). That's because that was our own life trajectory, and it's what we know. A liberal arts education is meant to make people into good citizens, not good workers, and to acquaint you with the intellectual achievements of humankind. That is why we read the Iliad, why we watch a performance of Hamlet, why we learn about the history of ancient Greece, and, yes, why we study algebra. We want people to be educated so that they understand the intellectual achievements that have made our society what it is today, and that will drive our society in the future. We're training people to be members of civilization, not employees.

I will say that Hacker makes some good points. There are other kinds of quantitative reasoning, which too many students entering college, and too many in our society at large, simply don't grasp, that people should learn. A better understanding of basic statistics may at this point be more important to the citizen of a democracy than an understanding of algebra. So, yes, I would agree that we could and perhaps should de-emphasize algebra in favor of making time for statistical awareness, and perhaps for filling in the basic number sense that students failed to get out of elementary school. However, to me this is a bit of a red herring. Yes, we should always be evaluating the subject matter of high school mathematics education. But, right now, the problems are bigger than that. That so many people get through high school without basic quantitative reasoning skills is not a reason to throw out algebra. We do, however, have to figure out why it is somewhere around fifth grade that individuals and society both get the "Math Is Hard" meme so firmly embedded; why it becomes normal not to "get" math, and indeed a little weird to actually understand and like those classes; why it becomes OK not to like math, not to try at it, and to just do what's necessary to get by without actually learning anything. I strongly believe that there are serious problems with a lot of the math education that's done at the later elementary, middle school, and high school level. But that's not a reason to give up. We might as well point at various studies of how little so many people know about the state of the world to say that teaching geography and international history just isn't worth doing any more.

Perhaps the problem, or part of the problem, is that we have conflated vocational and liberal arts education. Anybody who is interested in a liberal arts education does not deserve a degree if they are completely ignorant of algebra, and any society that values liberal arts education cannot neglect algebra. However, perhaps not everybody needs such a liberal education. If we have the problem right now of too many people failing out, it may be that we're pushing them through the wrong kind of education. This does not mean that a liberal arts education needs to jettison those parts of it that are hard for people on the wrong side of C. P. Snow's divide!

Algebra is fundamental to nearly all of "higher math". Even if you want to do more than the most basic of things with statistics, you need to know some algebra. To give up on that would be right on par with giving up on the teaching of history as anything other than memorizing the occasional date, and with giving up on the teaching of English literature as anything other than being able to read a short document for simple surface content and to put together a simple declarative sentence. If you want people to be educated beyond elementary school and beyond "job training", then algebra is one of the intellectual foundations of our civilization that simply cannot be neglected.


The Higgs Boson and Statistics

GUILDENSTERN: ...Four: a spectacular vindication of the principle that each individual coin spun individually is as likely to come down heads as tails and therefore should cause no surprise each individual time it does.

   —"Rosencrantz & Guildenstern Are Dead" by Tom Stoppard

There has been a lot of brouhaha over the last few days about the much-anticipated discovery of what looks to be the Higgs Boson at CERN. Among many other things, you have probably read the statement that the confidence that the signal is real is 99.9999%. You might be wondering: why so many 9's? After all, they had a signal a while back that was already 99% or thereabouts certain. If I had 99% confidence in winning the lottery, I would go out right now and spend $1000 on lottery tickets. Why was a 99% confidence limit not good enough to indicate discovery? Indeed, 99.9999% is the statistical confidence level that is considered the minimum for a particle physicist to announce a discovery. Why do they have to be so damn confident?

Rather than talking about the energy spectra of interaction cross sections, let's talk about flipping coins. At the opening of Tom Stoppard's play Rosencrantz & Guildenstern Are Dead, the two courtiers are flipping coins (and have been doing so for some time). They are approaching a streak of 100 flips of heads in a row. Rosencrantz (who wins a coin each time it comes up heads) is not concerned about this, but Guildenstern is so disturbed by the seeming violation of the laws of probability that he philosophizes at length about what it is that's going on. (The real thing that's going on is that he's a character in a play, not a real person.) Let's keep it more modest, though.

Suppose I were to walk up to you with a quarter, and flip it six times in a row. If the quarter is normal, and if I'm not cheating, the probability that all six flips of the quarter will come down heads is about 1.5%. In other words, if I do flip six heads in a row, you can be 98.5% sure that it was not due to random chance, that I must have been cheating somehow. (Ask me to show you this sometime.) You're not 100% confident, because there is a small chance that six heads will come up in a row just randomly, but it is a very small chance... and so you would be well within your rights to think that something was probably up. It may not be good enough to convict somebody in a courtroom, but it's certainly good enough to bet on.
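The arithmetic is easy to check for yourself; here's a minimal sketch (not from the original post) multiplying out the per-flip probabilities:

```python
# Six independent fair flips: each comes up heads with probability 1/2,
# so the chance of six heads in a row is (1/2)^6 = 1/64.
p_six_heads = 0.5 ** 6
print(f"P(six heads in a row) = {p_six_heads:.4f}")  # 0.0156, about 1.5%

# Confidence that a run of six heads was *not* pure chance:
print(f"Confidence something is up = {1 - p_six_heads:.3f}")
```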

Suppose instead, however, that 30 people come up to you, and each one of them flips six coins in a row. The probability that at least one of those people will flip six heads in a row is 38%. So, while it won't happen every time this crowd of coin-flippers accosts you, you shouldn't be particularly surprised that somebody flipped six heads in a row if a whole bunch of people tried it. Even though it's extremely unlikely that any given coin flipper will flip six heads in a row, the probability that somebody somewhere will is entirely reasonable. Lightning has to strike somewhere. (See Randall Munroe's much more concise take on this, and on overreactions to it.)
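That 38% figure follows from the complement rule: the chance that nobody flips six heads is (1 - 1/64)^30. A quick sketch, with a Monte Carlo simulation as a sanity check (illustrative code, not from the original post):

```python
import random

P_STREAK = 0.5 ** 6  # one person's chance of six heads in a row (1/64)

# Exact: P(at least one of 30 people flips six heads)
#      = 1 - P(all 30 people fail to flip six heads)
p_at_least_one = 1 - (1 - P_STREAK) ** 30
print(f"exact: {p_at_least_one:.3f}")  # 0.377, i.e. about 38%

# Sanity check with a quick simulation of the 30-person crowd
random.seed(0)
trials = 100_000
hits = sum(
    any(all(random.random() < 0.5 for _flip in range(6)) for _person in range(30))
    for _trial in range(trials)
)
print(f"simulated: {hits / trials:.3f}")
```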

This same principle applies to particle physics. The particle physicists looking for the Higgs Boson were not sure at exactly what energy the particle would show up. Here's one of the plots from the CMS collaboration:

From the 2012 July 4 CMS Higgs Seminar; (c) CERN

The signature of the Higgs Boson is the extra bump of events at an energy of 125 GeV. There are lots and lots of events at all energies in the plot; there's a little something extra there, which indicates that something is going on there, and that something is probably the production of a short-lived Higgs boson. But they didn't know, before they found it, to look right at 125 GeV; it could have been at other energies, too. If all they were after was finding something that was "a little extra" at 95% confidence, they could have found it lots of places; indeed, there's a data point hanging out there at a bit over 135 GeV that is just that far away from the background. But since there are 30 data points in the plot, I'm not the least bit surprised to see that. Randomly, you'd expect to see at least one of those more than 3/4 of the time somebody showed you a plot like this with 30 data points, even if there are no new particles.
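The "more than 3/4" figure comes out of the same complement-rule arithmetic as the coin flips, under the simplifying assumption that the 30 plotted points fluctuate independently:

```python
# Chance that at least one of 30 independent data points shows a
# fluctuation that any single fixed point would show only 5% of the time
# (i.e. a "95% confidence" excess at one pre-chosen energy):
n_points = 30
p_single = 0.05
p_anywhere = 1 - (1 - p_single) ** n_points
print(f"{p_anywhere:.2f}")  # 0.79 -- more than 3/4, even with no new particle
```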

The physicists in these collaborations were doing the equivalent of looking at a whole bunch of people flipping coins, and trying to find somebody who was flipping more heads than tails. If you look at 30 people who flip 6 coins and you find one person who has flipped 6 heads in a row, you have no right to declare that you've found a person who is cheating at flipping coins; the chances of that happening randomly are too high. Similarly, if you look at a whole bunch of different energies, and you see a single place where more is going on to 99% confidence than you'd expect from random fluctuations, you don't have much confidence that you've really found anything... because if you look at enough different energies, you will eventually find the unlikely random fluctuation. This is why for a particle physicist to be confident that she really has discovered something, she needs six nines in her confidence.
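For reference, the "six nines" quoted in the press corresponds to the particle physicists' conventional five-sigma discovery threshold. A sketch of the conversion, assuming a Gaussian distribution of background fluctuations and quoting the two-sided confidence (standard-library Python only; this is illustration, not the collaborations' actual statistical machinery):

```python
import math

def confidence(k_sigma):
    """Fraction of a normal distribution lying within +/- k_sigma
    standard deviations of the mean."""
    return math.erf(k_sigma / math.sqrt(2))

# 1 sigma is about 68%, 2 sigma about 95%, and 5 sigma gives the
# roughly 99.9999% confidence level quoted for the Higgs discovery.
for k in (1, 2, 3, 5):
    print(f"{k} sigma -> {confidence(k) * 100:.5f}% confidence")
```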

As for why the Higgs field (the "same thing" as the Higgs Boson... it's complicated) gives particles mass... that I really don't understand.
