Two More Things to Unlearn from School

In Three Things to Unlearn from School, Ben Casnocha cites Bill Bullard's list of three bad habits of thought: Attaching importance to personal opinions, solving given problems, and earning the approval of others. Bullard's proposed alternatives don't look very good to me, but Bullard has surely identified some important problems.

I can think of other school-inculcated bad habits of thought, too many to list, but I'll name two of my least favorite.

I suspect the most dangerous habit of thought taught in schools is that even if you don't really understand something, you should parrot it back anyway. One of the most fundamental life skills is realizing when you are confused, and school actively destroys this ability by teaching students that they "understand" when they can successfully answer questions on an exam, which is very far from absorbing the knowledge and making it a part of you. Students learn the habit that eating consists of putting food into your mouth; the exams can't test for chewing or swallowing, and so they starve.

Much of this problem may come from needing to take three 4-credit courses per quarter, with a textbook chapter plus homework to be done every week. The courses are timed for frantic memorization; it's not possible to deeply chew over and leisurely digest knowledge in the same period. College students aren't allowed to be confused; if they started saying, "Wait, do I really understand this? Maybe I'd better spend a few days looking up related papers, or consult another textbook," they'd fail all the courses they took that quarter. A month later they would understand the material far better and remember it much longer - but one month after finals is too late; it counts for nothing in the lunatic university utility function.

Many students who have gone through this process no longer even realize when something confuses them, or notice gaps in their understanding. They have been trained out of pausing to think.

I recall reading, though I can't remember where, that physicists in some country were more likely to become extreme religious fanatics. This confused me, until the author suggested that physics students are presented with a received truth that is actually correct, from which they learn the habit of trusting authority.

It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism.

So what could you do? Teach students the history of physics, how each idea was replaced in turn by a new correct one? "Here's the old idea, here's the new idea, here's the experiment - the new idea wins!" Repeat this lesson ten times and what is the habit of thought learned? "New ideas always win; every new idea in physics turns out to be correct." You still haven't taught any critical thinking, because you only showed them history as seen with perfect hindsight. You've taught them the habit that distinguishing true ideas from false ones is perfectly clear-cut and straightforward, so if a shiny new idea has anything to recommend it, it's probably true.

Maybe it would be possible to teach the history of physics from a historically realistic point of view, without benefit of hindsight: show students the different alternatives that were considered historically plausible, re-enact the historical disagreements and debates.

Maybe you could avoid handing students knowledge on a silver platter: show students different versions of physics equations that looked plausible, and ask them to figure out which was the correct one, or invent experiments that would distinguish between alternatives. This wouldn't be as challenging as needing to notice anomalies without hints and invent alternatives from scratch, but it would be a vast improvement over memorizing a received authority.

Then, perhaps, you could teach the habit of thought: "The ideas of received authority are often imperfect but it takes a great effort to find a new idea that is better. Most possible changes are for the worse, even though every improvement is necessarily a change."

Comments


In school, there are right answers. Copying from a reference work with known solutions is forbidden. Copying from someone else is forbidden. Asking someone who knows is forbidden. Working with others is often forbidden. Testing out the answer against reality is forbidden or impractical. You are expected to find the right answer by rooting around in your own head.

It would be difficult to find more crippling and maladaptive habits to instill in a mind that wanted to deal with reality.

Testing out the answer against reality is forbidden or impractical.

Not to mention counterproductive. You don't want right answers, you want 'Right' answers. Reality is far too narrow minded to be a good authority-figure-satisfier.

There goes my foolishness again. When am I going to get with the program? I'll try that last sentence again.

You are expected to find the right answer by rooting around in your own head for what the teacher said.

Better?

It is obvious to most teachers, and to many students, that school tests and rewards are often quite at odds with the usual stated purposes of school. It often seems like there are other ways we could teach and test that would be more in line with those stated purposes. You seem to be suggesting such alternatives.

But I think we have to take very seriously the fact that schools have long had the option to switch, and have chosen not to. I conclude that the real purpose of school is somewhat different from the stated purpose, and that the things taught are in fact more useful for the real purpose.

University English Lit departments should be closed down for teaching appalling habits of thought to impressionable young people.

You read the books, and then you pick up elements from them and turn them around a bit until they line up nicely to form a pleasing argument. The more tenuous (sorry, 'sensitive') your reading, the more marks you get. The more 'powerful' the story you weave, the more marks you get. Especially if it chimes with the prevailing intellectual fashions. Extra points also for being subversive or challenging the (straw man) orthodoxy. Looking behind the superficial to decode the deeper truth is, of course, compulsory. Marks deducted for anything as neolithic as thinking literature might teach us anything about the human condition.

Never do you weigh the merits of your chosen interpretation against other available interpretations - in fact the question is nonsensical, because there are no criteria for comparison. There is no analog of testing whether your hypothesis is consistent with the facts. Never do you consider how the elements of your 'reading' hold together or relate to the real world - which is to say, you can employ any half-comprehended 'philosophy' without being taken to task if that 'philosophy' is a poor description of reality. Internal logical consistency is not required.

Once you learn the tricks, it is child's play to get a first class degree.

Then you go out into the world and start applying your mental habits to the real world. For the results, see newspaper columnists, novelists and playwrights taking on topics such as economics, politics and foreign policy.

I am aware the above might make me look a bit like a nutjob... perhaps I just had a particularly unpleasant match with my Eng Lit faculty. But I reckon there's something in it.

Re: "Wait, do I really understand this? Maybe I'd better spend a few days looking up related papers, or consult another textbook," they'd fail all the courses they took that quarter. A month later they would understand the material far better and remember it much longer - but one month after finals is too late; it counts for nothing in the lunatic university utility function.

This line of thought reminded me of Robert Frank's The Economic Naturalist: "When students are given tests designed to probe their knowledge of basic economics six months after taking the course, they do not perform significantly better than others who never took an introductory course. This is scandalous."

I gather the goal of Frank's student assignments is to have them think, even if imperfectly, rather than to parrot well.

I recall reading, though I can't remember where, that physicists in some country were more likely to become extreme religious fanatics. This confused me, until the author suggested that physics students are presented with a received truth that is actually correct, from which they learn the habit of trusting authority.

You're probably thinking of the engineering (and hard sciences in general) correlation; see http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=205920319 or http://www.theatlantic.com/magazine/archive/2008/01/primary-sources/6559/2/ and the original paper, http://www.nuff.ox.ac.uk/users/gambetta/Engineers%20of%20Jihad.pdf

Possible purposes of school include: 1) babysitting, 2) social mixing, 3) sorting by intelligence and/or conscientiousness, 4) imprinting work habits, 5) learning specific useful skills or knowledge, etc. If you know what general skills tend to be useful in typical office jobs in our economy, you will see the relevance of the work habits imprinted and the characteristics sorted for in school.

Considering the number of complaints I hear about recent graduates not being good at work, it's possible that schools aren't doing a good job of preparing people for typical office jobs - after all, there isn't reliable feedback from graduates or employers to the schools.

I suspect the short-run goals (baby-sitting, status enforcement vs. children and teenagers, acquisition of easily checked credentials) are the ones mostly being served.

Considering the number of complaints I hear about recent graduates not being good at work, it's possible that schools aren't doing a good job of preparing people for typical office jobs...

In my experience, schools aren't doing a good job of preparing them for software engineering jobs, either. Most of the candidates I've seen (and I've seen quite a few) run the gamut:

  • Has heard the word "linked list" before (just for example). Doesn't know what it means.
  • Has heard the word "linked list" before. Knows what it means. Doesn't know what it's for.
  • Can answer basic questions about data structures and algorithms. Knows what they're for, in theory. Doesn't know how he'd actually use them.
  • Knows how to use basic data structures and algorithms. Can apply them, but only if he is given a source code file with clearly labeled "YOUR CODE HERE" sections. Doesn't know how to get help, or how to ask for help. If he gets stuck, just sits there, staring forlornly at the screen.
  • Knows how to write programs. Knows how to ask for help directly. Doesn't know how to find help on his own.
  • Knows how to write programs and how to look up answers to questions. Forgets the answers as soon as he looks them up; doesn't know how to correlate them into a general picture. Does not believe that a general picture exists.
  • Knows how to write programs, how to look up answers, how to ask for help, and is able to actually learn on his own.

The last category is actually employable, and makes up maybe 5% of college graduates. The other 95% either lack the basic CS vocabulary needed for learning, or are unable to learn at all, which renders them quite ineffective as far as real-world software engineering is concerned.
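For readers who haven't met the term the commenter uses as an interview litmus test, here is a minimal singly linked list sketched in Python. The class and function names are mine, chosen purely for illustration; the point is only how little "knowing what a linked list is" amounts to.

```python
class Node:
    """One element of a singly linked list: a value plus a pointer to the next node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def to_list(head):
    """Walk the chain from head to the end, collecting values in order."""
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

# Build the chain 1 -> 2 -> 3 and walk it.
head = Node(1, Node(2, Node(3)))
print(to_list(head))  # -> [1, 2, 3]
```

Knowing this structure is the first rung of the ladder above; knowing when to prefer it over an array (cheap insertion, expensive random access) is the rung most candidates reportedly miss.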

So there's an argument for going into the classroom (as a teacher) completely unprepared: stumble through the material, reason things out in front of the students, go down hopeless calculations for a while, then say "scratch that", "let's see... hmmmm", etc. Nothing would be clear, and the students would have to make huge efforts just to find out what's on the homework. I know some people like this (not by design). I wonder if their students learn some important life-skills, though.

In my own experience, this can work well in a small group with engaged students. I had an excellent optics class where we would try to derive a known result as a group: the professor would explain the experiment, draw a picture, and then ask us to help. If we got him going, he would take a few steps, then ask again. Now, I remember next to nothing of equations for optics, but I have a very good idea of how to go about figuring out the outcomes for various experiments theoretically.

On the other hand, I've had professors stop referring to notes partway through a derivation or proof, get dreadfully confused, and simply frustrate themselves and their students. So this may be an all-or-nothing: for a given day or proof or class, either do a group derivation or present the material on a platter.

I will say that I also had a high school English teacher who would use the wrong word or give a ridiculous interpretation in the hopes that a student would correct him and learn to not always trust authority. I liked the theory, but in practice it meant that the attentive students had to do work that was frequently repetitive and irritating, such as correcting word choice or grammar (as these were students who were already thinking) while those who could learn most from such a lesson never noticed it.

I will say that I also had a high school English teacher who would use the wrong word or give a ridiculous interpretation in the hopes that a student would correct him and learn to not always trust authority.

I had a teacher somewhat similar to that my freshman year in high school, except she was a last-minute replacement and was not really an English teacher. Her grammar was atrocious, and I ended up getting detention for correcting her too often (interrupting class or lack of respect or some such was the reason given on the detention). It was probably my first real experience with an authority figure being so utterly and obviously wrong, and I wasn't sorry at all for the detention. It was well worth it.

Here's my bad teacher story:

When I was 13 or 14, my physical science teacher was talking to the class about space probes with trajectories that take them outside the solar system. He said that such probes get faster and faster as they go. Thinking he either had misspoken or was intentionally being wrong to see who would catch his error, I corrected him. To my surprise, he said he had not misspoken and that he was correct. We argued about it a bit then he told me to write down a defense of my position.

Later that day, kids came up to me and said, "Why are you arguing with Mr. S? You know he's right!".

I wrote a weak attempt at a defense of the law of inertia (using a reductio ad absurdum argument if I remember correctly). When I gave it to him the next day, he praised it and conceded the argument -- but only privately. He never admitted he was wrong in front of my classmates.

I argued publicly with my German teacher about the derivation of 'case' in class. At the beginning of the next lesson, she started with an admission that she'd been wrong and I'd been right. In conceding to a twelve year old on her home ground in front of a class of other children that her job was to control, she taught me an awesome lesson about honesty and humility. I held her in huge respect after that and was her ally ever after. Thank you Ms Eyre.

Yeesh, that's terrible. It kind of figures that he'd rather mislead a class full of students about the way physics works than own up to his mistake.

It reminds me of an error I had been taught about the way airfoils work that wasn't corrected until I read a flippin comic strip on the subject almost a decade after I graduated high school.

I was stunned, and spent the rest of the afternoon learning how airfoils really work. What makes this particular example so tragic is it leverages another principle of physics that you won't realize doesn't fit if you are taught to accept everything the teacher says as gospel. What's worse is I'm pretty sure the mistake is still there in the vast majority of textbooks.

Bad Habit #1) Don't notice when you're confused.

Bad Habit #2) All authoritative ideas / all new ideas / all ideas that have a few plausible reasons to support them, are true.

All textbooks should contain a few deliberately placed errors that students should be capable of detecting. This way if a student is confused he might suspect it is because his textbook is wrong.

Starting that in the current culture would be...interesting, to say the least.

I still recall vividly a day that I found an error in my sixth grade math textbook and pointed it out in class. The teacher, who clearly understood that day's lesson less well than I did, concocted some kind of just-so story to explain the issue, one with clear logical inconsistencies, which I also pointed out, along with a plausible just-so story of my own of how the error could have happened innocently.

I ended up being mocked by both teacher and students as someone who "thinks he knows everything". Because of course, we all know that the textbook author not only does know everything, but is incapable of making typographical errors.

Oddly, at the time I was remarking on the error to stand up for a classmate who was expressing confusion. She couldn't understand why her (correct) answer to a question was wrong.

Great post, Eliezer (you've earned my approval). I think tied for worst school-nurtured habit, along with parroting things back, is the emphasis on what we think we know, as opposed to what we don't know. I think school science and history subjects would be a lot more interesting, and more accurately presented, if at least equal time were given to all the problems and areas where we don't know what's going on, and for which there are various competing theories. Unfortunately one doesn't usually get this presentation of the state of things until one is working as a research assistant in college or grad school.

Maybe I was lucky to have "better than average" teachers, or maybe the French school system is quite different from the US one, but I remember several counter-examples to those problems from my high school and university years, in maths, physics, chemistry and biology. I'll tell one example from each.

In maths, we were often asked to figure out by ourselves (intuitively at least) whether a "theorem" would be true or not, before being shown a proof of it being true or false.

In physics, we were given experimental results and asked to draft what law the results could follow. It lacked the "devise new experiments to test your law" part, but it's still better than nothing.

In chemistry, we were once given a substance (potassium permanganate, though we weren't told what it was) and a set of solutions; we were told the substance was used to test solutions, but not how, and we had to figure out what it could test (acidity).

And in biology, in genetics, it wasn't uncommon to be given experimental results over several generations and asked to devise how a given characteristic was encoded in the genes (using one or two genes, on a sex chromosome or not, dominant or not, ...). I remember even being told "try to make a law on part of the data, and then test it on the rest of the data", which is as close as we can get to the real experimental method on paper.

Another one in biology was a very interesting "proof" of evolution: we had two boxes; in each we put cotton with water and sugar, a pill of antibiotics on one side, and some bacteria on the other side. One box was to be exposed to UV light for a little while every day, the other not. We then had two weeks to explain what would happen and how, and after we explained the predicted outcome, we would look at the boxes. (In the box that was exposed to UV light, the bacteria colonized everything, but in the one not exposed to UV light, the bacteria couldn't get near the antibiotics. After a few more weeks, the bacteria did spread everywhere in both boxes.)

Also, most of the teachers I had were very receptive and encouraging when a mistake they made was pointed out (as long, at least, as it was pointed out politely, not aggressively), mitigating somewhat the "authority effect".

But I agree that those were rare - not exceptionally rare, but still much less common than "here are the laws of Newtonian mechanics, now compute the movement of a projectile with that initial speed and direction" or "here are the laws of thermodynamics and the gas state equation, now compute the final temperature of the system in which that compression was done". Which is better than pure "guessing the password", since you have to apply the laws and do computations, but which is still "here is the truth, apply it".

And I definitely wish we had had more of those examples; they taught much more than just giving the answer.
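The "compute the movement of a projectile" exercise mentioned above really does reduce to a couple of lines once the received formulas are handed over, which is part of the commenter's point. A sketch (function and variable names are my own) of the standard closed-form Newtonian solution, ignoring air resistance:

```python
import math

def projectile(speed, angle_deg, g=9.81):
    """Range and flight time for a projectile launched from flat ground,
    ignoring air resistance (the standard textbook idealization)."""
    angle = math.radians(angle_deg)
    # Flight time: vertical velocity v*sin(a) is reversed by gravity in t = 2*v*sin(a)/g.
    t_flight = 2 * speed * math.sin(angle) / g
    # Horizontal range: constant horizontal velocity times flight time.
    x_range = speed * math.cos(angle) * t_flight
    return x_range, t_flight

# A 20 m/s launch at 45 degrees travels about 40.8 m in about 2.88 s.
r, t = projectile(20.0, 45.0)
```

Deriving these formulas from Newton's laws is the "apply the truth" step; the missing pedagogical step the comment asks for would be guessing the law from measured trajectories instead.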

As you say Joshua, ad hominem. Since you ask, it's from providing therapy to friends who were damaged by the school system. But nobody here has alleged that I'm off-base in my description (as opposed to suggestions and conclusions), and therefore it doesn't matter how I got an accurate description, only that I did. As I recently told a schooled friend who was taught silly rules, "The first rule of math is that it doesn't matter how you get the correct answer, so long as it is correct."

(I also explained that "Math is what you do when you don't know what to do next. If you already know exactly how to solve a problem, it's not math, it's computation.")

Wait, don't leave us hanging! What's the real purpose?

I think the worst thing I learned in school was how to kill time.

On the proposition that 'knowing that you are confused is essential for learning': there is a structural equation model, tested empirically on 200+ subjects, which concludes that knowing-that-you-don't-understand is an essential prerequisite for learning, in the sense that people who have that ability learn much better than those who do not. Three other individual-difference variables are also involved, but only come into play after the person realizes that they don't understand something. It's called 'Learning from instructional text: Test of an individual differences model' and is in the Journal of Educational Psychology (1998), 90, 476-491.

Another well-known study was of students learning a computer language from a computer tutoring program, in which all their keystrokes during learning were captured for analysis; the biggest correlation with successful learning was the number of times they pushed a button labeled 'I don't understand.' (From John Anderson's group at Carnegie Mellon.)

Another famous result was from the notorious California State Legislature-mandated study of self-esteem: it was found that the high school seniors with the highest self-esteem when they graduated - they thought they already knew everything - were those with the lowest self-esteem the next year: they couldn't keep a job, because they thought they already knew everything.

The problem with self-esteem is that you need a middling amount. Too little can lead to depression, too much can lead to narcissism and intractable ignorance.

Too little can lead to depression, too much can lead to narcissism and intractable ignorance.

Yes, and to make this more concrete, there are studies which show that since the self-esteem movement has grown, college students have become much more narcissistic. See this article.

There's also conflicting data about how criminality and self-esteem correlate. By some metrics self-esteem is inversely correlated with criminality, but by other metrics it is positively correlated. The positive correlation seems to win out when one looks at repeat violent offenders. See 1, 2, 3, 4. That last source is an extensive review of the literature on self-esteem and violent behavior, and suggests that a fair number of the studies which show an inverse correlation have methodological flaws or other problems. They make what seems to be a strong case that the correlation is actually positive (I haven't read most of the sources they cite).

That article reads like it has a very large political axe to grind. While empathy may have decreased due to some large-scale social changes, blaming the "self-esteem movement" is confusing correlation with causation. I'd be curious to know, for instance, if people in urban communities score lower empathy than people in rural communities.

It seems reasonable that a lack of empathy and grandiosity would be associated with violent behavior, but I don't think it's meaningful to call this "self-esteem" or blame a movement that tries to make people feel better about themselves. There's a problem with your measure of self-esteem if it correlates with not being able to admit when you're wrong: that shouldn't be called self-esteem! A secure person is more likely to admit when they're wrong.

The survey in the first article measures empathy; I don't see the self-esteem surveys anywhere, but that last link says

it may be more correct to say that a form of high self-esteem -- more precisely, a highly favorable and possibly inflated view of self that is confronted with an external threat -- leads to violence.

That final article also refers to 'egotistical' and 'arrogant' as terms of "high self-esteem". While it makes sense that egotistical and arrogant people may be more likely to be violent, it's highly misleading to call that having high self-esteem. The article seems to be talking more about lacking the ability to react well to criticism, which sounds more like low self-esteem, not high. (That final article does note that many of the scales that measure self-esteem may be biased either negatively or positively.)

(Edited to make clear which article I mean in the last paragraph.)

While it makes sense that egotistical and arrogant people may be more likely to be violent, it's highly misleading to call that having high self-esteem. The article seems to be talking more about lacking the ability to react well to criticism, which sounds more like low self-esteem, not high.

So, the once standard explanation of bullying was "they have low self-esteem, and do it to feel better about themselves." This is not an explanation that plays well with actual bullies. The better explanation is "their self-esteem is higher than it should be, and they need to use violence to make up the gap." If I think I'm level 6, but really I'm level 4, it makes sense to say my self esteem is high (too high), and whenever someone criticizes me I'll blow up because to me it looks like they're trying to reduce my status from level 6 to 4 (even if they just wanted to fix my spelling this one time and don't know who I am).

People with low self esteem (you're level 4, but think you're level 2) aren't likely to be violent because they don't have anything to protect / uphold by that violence. If you criticize them, you're reinforcing their low-status view, not contradicting it.

My understanding is that the 'self-esteem movement' tends to go for relentless, effectively information-free affirmations, based on an ideology that people need to be told they've done a good job whether or not they actually have. Handing out halos like nametags, in other words. It is not hard for me to imagine that such a thing could lead to unwillingness to accept criticism, in the same sense that obsessively sheltering children from any possible irritant leads to allergies.

That's a very good point. Part of the issue may be connected to the fact that no one seems to have an agreed definition of self-esteem. You seem to be doing the same thing here when you say "There's a problem with your measure of self-esteem if it correlates with not being able to admit when you're wrong: that shouldn't be called self-esteem!" We need to be careful to not argue over definitions.

I quote from one of my favorite authors, Jamie Whyte:

Alas, most know next to nothing about the ways reasoning can go wrong. Schools and universities pack their minds with invaluable pieces of information--about the nitrogen cycle, the causes of World War II, iambic pentameter, and trigonometry--but leave them incapable of identifying even basic errors of logic. Which makes for a nation of suckers, unable to resist the bogus reasoning of those who want something from them, such as votes or money or devotion.

Perhaps I'm naive, but I think the problem can be alleviated by making the introductory logic course a requirement for all students. Such a course could include elements such as formal logic, inductive reasoning, or more specifically, how the scientific method is practiced. Perhaps it could even include some simple psychology so students can learn about our inherent biases in cognition, and then some statistics so they can learn about how data can elucidate the truth. Does this sound too ambitious?
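As one concrete flavor of what such an introductory logic course covers: argument validity can be checked mechanically by exhausting truth assignments. The sketch below (my own function names, chosen for illustration) shows that modus ponens is valid while the classic fallacy of affirming the consequent is not:

```python
from itertools import product

def valid(premises, conclusion):
    """An argument is valid iff no truth assignment makes every
    premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return False  # found a counterexample assignment
    return True

implies = lambda a, b: (not a) or b  # material implication

# Modus ponens: from (P -> Q) and P, infer Q.  Valid.
print(valid([lambda p, q: implies(p, q), lambda p, q: p], lambda p, q: q))   # True
# Affirming the consequent: from (P -> Q) and Q, infer P.  Invalid.
print(valid([lambda p, q: implies(p, q), lambda p, q: q], lambda p, q: p))   # False
```

The counterexample the checker finds for the fallacy is P false, Q true: the premises hold but the conclusion fails, which is exactly the kind of "basic error of logic" Whyte complains most graduates cannot spot.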

I know this was posted a long time ago, but I just want to note that when I took an introductory logic course in college, nearly every student came out of it thinking that introductory logic courses should be required at the high school level, if not earlier. It didn't include basic psychology or statistics, but an introduction to formal logic and the basics of inductive reasoning was still enough to transform the thought processes of most of the students who went through it.

Is the second thing to unlearn how to count? My traditional, school learned way of counting only finds one thing you want to unlearn, not two.

"..show students different versions of physics equations that looked plausible, and ask them to figure out which was the correct one": I used to do this. Professor Rosencrantz favours this, I wrote, and Dr Guildenstern that. I stopped when we started getting students who didn't know who R and G were.