by liberal japonicus
A topic that everyone might have thoughts on
The Initial Teaching Alphabet was a radical, little-known educational experiment trialled in British schools (and in other English-speaking countries) during the 1960s and 70s. Billed as a way to help children learn to read faster by making spelling more phonetically intuitive, it radically rewrote the rules of literacy for tens of thousands of children seemingly overnight. And then it vanished without explanation. Barely documented, rarely acknowledged, and quietly abandoned – but never quite forgotten by those it touched.
Why was it only implemented in certain schools – or even, in some cases, only certain classes in those schools? How did it appear to disappear without record or reckoning? Are there others like my mum, still aggrieved by ITA? And what happens to a generation taught to read and write using a system that no longer exists?
I've often said that teaching is a conservative (small c) profession that draws in the most liberal folks to work in it. I used to joke that there was no other profession where you could bring someone from 100 years in the past to the present and they could do the same job, with the only requirement being that you show them how to take the cap off the whiteboard marker. That observation is fraying a bit at the edges, what with AI and all that, but it's that conservatism that makes the introduction of AI probably more disruptive in teaching than in any other field (because in the other fields, you just hand out pink slips and turn things over to Skynet). Though China seems to be doing it right. But as a counterpoint:
https://youtu.be/s9EvvxTYjb8?si=Vs3G_XaCOQA95XUl&t=1703
The problems that some of the people in the article have sync with some things I have seen and experienced. When I first came to Japan, students were taught romanization using the Cabinet style (Kunrei shiki) and then, when they moved to JHS, they shifted to Hepburn, when English was formally taught. I came in the first year of the JET program and there was this influx of foreigners (the first year's intake was from GB, the US, Australia and NZ; the next year, Canada and Ireland were added). There was also the chattering class of foreigners who were always horrified that Japanese students spent 6 to 10 years learning English yet were unable to converse, so the push was for communicative English.
Kunrei shiki is the most logical way to romanize Japanese. You can take the Japanese sounds, make each row a vowel and each column a consonant, and you have a 5×10 grid. Kunrei shiki (for the purposes of this, I'll pretend there is only one system, though there are some slight variants) would have the same consonant at the beginning of every sound in a column. The only problem with this is that the sounds have undergone some change, so in Kunrei shiki, Fuji, the mountain symbolic of Japan, would be spelled Huzi.
Ever since Meiji, people and groups have suggested that Japanese shift to a romanized script, and there have been massive battles about which system to use. Being Japan, they have tried to appease both sides, so traditionally, Kunrei shiki was used for information designed for Japanese, and Hepburn shiki was used for things aimed at foreigners. I don't have to note that you are going to have some problems. For example, when Japanese started getting credit cards, their passport would be in Kunrei shiki and the credit card would be in Hepburn shiki. If you were unfortunate enough to have spellings that differed, this could end with your credit card being rejected because the name wasn't the same as the one on the passport.
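To make the split concrete, here's a minimal sketch of where the two systems diverge. The syllable table and the `romanize` helper are my own illustration, not an official mapping; both systems have variants this ignores.

```python
# A few syllables where Kunrei-shiki and Hepburn spell the same sound
# differently. Illustrative subset only, not a complete table.
DIVERGENT = {
    "し": ("si", "shi"),
    "ち": ("ti", "chi"),
    "つ": ("tu", "tsu"),
    "ふ": ("hu", "fu"),
    "じ": ("zi", "ji"),
    "しゃ": ("sya", "sha"),
}

def romanize(syllables):
    """Return the (kunrei, hepburn) spellings for a list of kana syllables.

    Syllables outside the divergent subset pass through unchanged.
    """
    kunrei = "".join(DIVERGENT.get(s, (s, s))[0] for s in syllables)
    hepburn = "".join(DIVERGENT.get(s, (s, s))[1] for s in syllables)
    return kunrei, hepburn

# The mountain comes out exactly as described above:
print(romanize(["ふ", "じ"]))  # ('huzi', 'fuji')
```

Same word, two spellings, and you can see how a passport in one system and a credit card in the other would fail a name match.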
Recently, the Japanese government has signaled that it may take steps to make Hepburn shiki the standard romanization, which tracks with the fact that English has been pushed down to elementary school, making the dual systems even more confusing.
https://www.japantimes.co.jp/commentary/2024/03/06/japan/japanese-romanization-rules-change-explained/
The article supports the change, but suggests that these spellings exist only for "some obscure historical reason". While I agree that it is probably better to change, it was not an obscure historical reason; it was trying to make a statement about who the language was for and why it was used. But that is another blog post or 5.
This all applies to this story because when I arrived, I would also run into a number of Japanese who surprised me with the problems they had with Japanese. English was still only formally taught in JHS and there was not any sort of pressure for students to get better at English ahead of that (unlike in Korea), so there was a very carefully constructed curriculum. My 4th and 5th year on the program, I was in the Board of Education with all of the subject consultants. These were all sharp people who had achieved success in teaching and were promoted because of their skill in education. Yet, with the exception of the English Teaching Consultants, none of them had any fluency in English. Sitting down with them, a lot of the time, if they wrote something down in English, it would be in Kunrei shiki, which leads me to think that the problem was not that they learned English poorly, but that they learned it too well and were unable to get out of the rut.
This tracks with some personal experiences I have had. After my first 5 years in Japan, I had an OK reading knowledge and could write, albeit with a word processor rather than a pen. I went to grad school and kept up my spoken Japanese but, because I thought it wasn't a big deal, studied Thai, which was a mistake for me. After I finished my MA and was packing to take a job in Japan, I found my old word processor (this was before the internet and before PCs). I found the adapter, plugged it in, started it up, and was shocked to be greeted with a document that I could remember writing but could not read. I could have been typing REDRUM REDRUM.
When I got back to Japan, re-learning kanji was a little like learning how to walk after a stroke. With age, I know that my ability to read and write was a lot better after my first 5 years than it is now. I also worked at one university for 3 years before moving to another university with a more permanent position and a similar situation occurred. I was packing up the books I sent over and I found my Thai notebooks and had the same experience, remembering writing them but being unable to read them.
I got to the place where I still am and was able to work on a research project connected with Vietnam. I went over for two trips of a few weeks and worked with my phrasebook to get around. About 5 years later, I was able to organize a yearly conference in Hue and it was astonishing how much Vietnamese I remembered. I guess I had a sight vocabulary of about 1,000 words, and I believe the reason was that my language ability was, at its core, my ability with the English alphabet.
So, to get back to the article: had I experienced something like the Initial Teaching Alphabet, the effect would have been just as dire. I think that my acquisition of Japanese would have been nil if I had had an iPhone, Google Maps and Translate when I arrived in Japan.
Anyway, a thread for all your language and education musings.
I am involved in an international organization (ICANN, if you care). I have a nagging (unspoken) embarrassment because everybody speaks English, most of them barely distinguishable from native speakers.** Partly that turns out to be because they did college or grad school in the US, or perhaps the UK or Australia.
But there I am, speaking only English. The German I learned in high school is mostly gone. The Japanese I studied in grad school is also gone. And, of course, because I grew up in California I know a few bits and pieces of Spanish. (Actually, taking a few Spanish classes is on my Really Need To Get Around To This list.) For the moment, I try to at least learn how to say thank you in the language of wherever we are meeting.
I have the suspicion that English will eventually end up as the world language. There will no doubt be long and bitter fights to preserve the national languages. But they will, in eventual history, be seen as futile fighting retreats.
These days, the world requires a common language to function. One will come: the only question is which one it will be. The Chinese will argue, as their economic power increases, for Chinese (Mandarin), but a tonal language is simply too difficult for anyone not raised in one. Spanish might be a viable option, but it lacks a serious economic power to push it. French might have a chance, except that its spelling is nothing approaching phonetic, which makes it hard to learn.
English, thanks to British and then American economic dominance (plus the fact that India, with its huge population, already uses it because its people speak 5 mutually unintelligible native languages), is already getting there. A lot of countries, not just Japan, start English lessons in grammar school. I won’t claim that nothing could displace English. But it would take, at minimum, a couple of centuries of economic and cultural dominance.
** Working groups typically work in English (even if there are no native speakers involved), with a few having simultaneous translations for French and Spanish. For the three major meetings, we get translations in the 5 UN languages (English, French, Spanish, Russian, and Chinese) plus Arabic. And the language of wherever we are meeting, if it isn’t one of those.
I have the suspicion that English will eventually end up as the world language. There will no doubt be long and bitter fights to preserve the national languages. But they will, in eventual history, be seen as futile fighting retreats.
Sorry if I misunderstand you, but are you saying that the use of national languages in the native countries will disappear or be reduced? I don’t think that would be realistic.
Re: changing systems of roman-alphabet spellings in japanese: it would be good to get Hartmut’s input, since (IIRC) there was a systematic change in “official” German spelling ten? twenty? years ago.
Re: English taking over the world. There’s a saying that “English is the Lingua Franca of Science”.
Which I find amusing because “Lingua Franca” literally means “French”, but (IIRC) is in Latin.
So there’s three of the contenders for ‘language to take over everything’ right there in one sentence.
Why, yes, I *am* easily amused, why do you ask?
I have the suspicion that English will eventually end up as the world language.
More than 30 years ago now, I spent some time working with an engineering team at Ericsson, the Swedish telecom company. Ericsson’s internal organization at the time had hardware being done in Sweden, operating system being done in the UK, and application software being done in Spain. By decree, the official technical language inside the company was English.
The official rules for international fencing are written in French. This leads to occasional interesting difficulties. Epee rules intentionally allow some amount of incidental body-to-body contact, but not too much. There was a great deal of debate at the FIE over how to translate the French phrase for what was not allowed to English. They finally settled on “excessive jostling”.
Sorry if I misunderstand you, but are you saying that the use of national languages in the native countries will disappear or be reduced? I don’t think that would be realistic.
Yes, that is exactly what I am saying. Not that it will necessarily be quick. But it will happen.
See, for example, the various Native American / First Nations languages. They haven’t, quite, died out and there are various efforts to save one or another of them. But the reality is that native speakers are overwhelmingly old. Children may retain some fluency, in order to speak to their grandparents. But for everyday use, they speak English. And the children’s children will be straight Anglophones.
We see the same phenomenon in immigrants. My wife’s immigrant grandparents were functional in English, but spoke Japanese with family and friends. Her parents were fluent in Japanese (from talking to parents and aunts and uncles when growing up), but generally spoke English except when talking to the older generation. My wife and her siblings? Even having lost virtually all of the Japanese I studied in college, I still speak more than they do.**
Granted, there is more inertia when it comes to languages with a big population base. So it will take longer. But modern communications mean that the next generation will be exposed to English far beyond the classroom. And anyone who interacts with the outside world, from academics to taxi drivers, will need to use it routinely. Already do, actually.
** When my wife and I first got together, we made occasional trips half the length of California to see her family. About the third trip, I got taken off to see Grandma. My future mother-in-law gave Grandma an explanation of who was this blue eyed blond, then introduced us. I remembered enough to say Hajimemashte. Grandma just lit up; from that moment, as far as she was concerned, I was in.
I don’t think subjugated peoples or immigrants are relevant examples. You are talking about e.g. the French not speaking French anymore – that’s never going to happen because language is so closely tied to identity and culture, but also, practically speaking, you would have to change all the laws etc.
More generally, you would have to imagine (non-immigrant) parents speaking to their babies in a language other than their mother tongue. Unless we are a talking about an actual genocide, I don’t think that’s going to happen either.
Finally, people routinely overestimate the quantity and quality of proficiency in English in other countries. This is understandable because usually those making such assumptions tend to interact only with a highly educated subset of the population of those countries, thus skewing the picture.
Real-time translation and communicating with devices in any language the user chooses could lower the pressure to learn more than one language.
Or kill it altogether. Why learn, if there is an instant translator? If everyone has a babelfish in his or her ear, the need disappears (at least for the lazy majority).
English is a good candidate for being easy to get to at least sufficient proficiency in (ironically, one theory is that this is the result of Old Norse mugging Old English and killing declension in the process).
(Proper) Latin is far too cumbersome in some aspects.
Latin supplanted and replaced numerous languages but during the Middle Ages the unity disappeared and Latin turned (degraded?) into numerous new languages as far as common use was concerned.
I consider it likely that over time English will get rid (or be ridden) of its ‘ghoti’ problem, and some other irregularities will get filed off too by international use, and it will become at least the second language for a majority in the world (provided our world and/or civilisation does not perish before that).
In case anyone is interested in the subject (and in lieu of fraught AI summaries):
https://en.wikipedia.org/wiki/English_as_a_lingua_franca
It’s entirely possible that English will become the lingua franca for international communications, but if it does, I’d expect, like Hartmut, that it continues to shed irregular constructions and colloquialisms and that native dialects will be treated as quaint variants with charming local color. I also predict that both Americans and Brits will complain bitterly that ELF is “not proper English” when that happens, and resent any standard that treats ELF as the paradigm.
Why learn, if there is an instant translator?
I don’t see that happening in complex, real-life contexts. It’s hard enough to make sense of people in one’s native language sometimes, lol.
Instant translation is fine for functional and transactional language, but it hits its limits pretty quickly as language complexity increases and becomes problematic for understanding as soon as there is an intertextual element at work. I see this a lot with my international students when they are working their way through English texts with the help of translation software. They miss a lot of the features that the authors are using to communicate – parallelisms, homophones, puns, etc.
To be fair, a lot of my native language domestic students miss those things too, but the international students have the reading skills to catch those elements in their own languages, and would notice those things if they were actually working with the original text.
One thing I can add that speaks to lj’s first point. Language-wise I’ve studied Spanish, French, Swedish, and Ancient Greek. I can muddle through in Spanish, and would probably be able to attain fluency in any of the first three in a few months with immersion. Greek, however, never sticks particularly well, and the alphabet contributes somewhat to that difficulty. It’s one more unfamiliar element (deciphering) that takes up processing power that would otherwise be used for linguistic sense-making.
Ancient Greek is far more versatile than Latin but also quite a bit more difficult. I never really got the hang of it (in 3.5 years at school compared to 9 years of Latin).
I think there is little today that could not be directly translated into ancient Greek, while Latin would require massive reformulation to do the same, and much would get lost doing it. In Latin one has to invent new words for concepts Cicero & Co. did not yet possess and would probably not understand. Greek (unlike classical Latin) allows composite words that can transport about any meaning with little to no loss. Latin is the hammer of a blacksmith; ancient Greek is a jeweller’s toolset.
It occurs to me that a lot of education should be guiding people to what they are good at. I always thought I was good at languages, but I realize that I am really good at the languages that are in roman letters. Had I known, I might have opted for Vietnamese instead of Thai, or, even earlier, dove into Chinese first (I have several English-native friends who have fluency in Japanese and Chinese; all of them did Chinese first, and I don’t know anyone who went the opposite way).
India, with its huge population, already uses it because its people speak 5 mutually unintelligible native languages
Off the top of my head, Hindi, Urdu, Bengali, Gujarati, Punjabi, Malayalam, Tamil, Telugu. There are many others.
…the alphabet contributes somewhat to that difficulty…
I’m puzzled by this. I’m not good at languages, relative to my other skills, but switching alphabets – Cyrillic, Greek, Georgian… is trivial.
(I struggle with Hebrew, where some of the letters are just too alike.)
In Latin one has to invent new words for concepts Cicero&Co. did not yet possess and would probably not understand. Greek (unlike classical Latin) allows composite words that can transport about any meaning with little to no loss.
No offense, but I wonder if being a native speaker of German might be coloring your view here. Composite words are one of the most noticeable things (after gendered nouns) for English speakers when learning it.
I admit that I don’t have a wide enough base to know how common composite words are in languages generally.
I’m puzzled by this. I’m not good at languages, relative to my other skills, but switching alphabets – Cyrillic, Greek, Georgian… is trivial.
It’s not onerous, no, but it is a factor on at least two levels in my experience.
First off, it can create some noise when particular letters look similar to letters in the other language that are not phonetically equivalent, and that usually triggers a bit of recursion in the reading process. It’s not a lot of load on the system, but it is processing power that is not being used to make sense of the meaning. Writing English using the Greek alphabet barely affects reading comprehension when one is fluent in English; combine a lack of fluency with the need to decipher, though, and the effects compound.
Second of all, it messes with the pattern recognition that one relies upon when skimming a text. When I’m reading Swedish or Spanish, I can skim the text fairly easily and a lot of the language has enough root-equivalency to make those reading skills transfer. That sort of whole-word pattern recognition doesn’t fire the same way when I am faced with another alphabet.
All of these things mess with your language in the same way that when a student is asked to write about an unfamiliar topic with its own technical vocabulary, they often end up writing language that has a greater number of grammar and spelling errors than when they are writing about familiar topics. The familiar has a much simplified processing economy.
And again, with functional and transactional language, these difficulties are much less pronounced than when dealing with more complex and nuanced subjects.
At least that is my experience, and it seems to match with my observations of how my non-native student writers interact with texts. Actual linguists would likely have a lot to say about the places where I’m wallpapering over some complex topics, or missing the boat entirely.
wj, imo English de facto has composite words, just not writing them as such (keeping the parts separate, not even using hyphens). It’s just convention that one does not write railwaystation or particleaccelerator or internalcombustionengine but imo those are perceived as units. Latin has very few of those and they are usually fixed terms of noun and adjective (e.g. res publica = state, res novae = coup d’etat, homo novus = upstart). A very rare exception is suovetaurilia (sacrificing a pig, a sheep and a bull together) but that is in essence a proper noun, a name.
“A very rare exception is suovetaurilia”
Thanks, Hartmut, for introducing us to the ancient Roman form of Turducken.
A couple of things about reading. It’s a bit like second language acquisition, in that no one is guaranteed to acquire reading. There is a basic idea that reading is an interactive process that bounces back and forth between bottom-up and top-down, but beyond that, there is not much. There is a notion of orthographic depth, but it’s often framed by speakers of one language (usually English) projecting onto other languages.
I tend to think of it a lot like umwelt, which is the unique subjective experience that an organism has that can not be understood by any other organism. Though it doesn’t use the word, “What Is It Like to Be a Bat?” by Thomas Nagel gets at that point. I’ll leave it to Hartmut to explain umwelt, along with merkwelt and wirkwelt in Uexküll’s biosemiotic theory. I’ve never heard of any linguist taking this up, but I definitely will in another life where I am multi-lingual…
It’s just convention that one does not write railwaystation or particleaccelerator or internalcombustionengine but imo those are perceived as units.
German also has the useful convention of capitalizing nouns. Using Internalcombustionengine would at least signal it’s a noun, even though it starts off with an adjective. Camel notation* from computer programming would possibly be better: InternalCombustionEngine.
The example everyone remembers from college German is Handschuh (hand shoe) as a generic glove/mitten term, Fingerhandschuh for gloves, Fausthandschuh for mittens, Panzerhandschuh for armor, etc. The German is at least consistent. In English, it’s another of the English/Norman dualities. Glove is from Old English; mitten is from Norman for mitten; gauntlet is from the Norman for glove. I understand there are more types of Handschuh that correspond to some of the other uses English has piled on gauntlet, like “throw down the gauntlet” or “run the gauntlet”.
I’ll just go ahead and invite Hartmut to explain how wrong I am :^)
* Off and on for a half-century now I have occasionally tried to adopt camel notation when I’m writing code. It never lasts, and I always go back to underscores: source_index rather than sourceIndex in something I’ve been writing this week.
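The two conventions are mechanical enough that converting between them is a one-liner each way. A small sketch (the helper names are my own, and the camel-to-snake rule is the simple version that ignores acronym runs like “HTTPServer”):

```python
import re

def camel_to_snake(name):
    """sourceIndex -> source_index. Inserts '_' before each interior capital."""
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def snake_to_camel(name):
    """source_index -> sourceIndex. Capitalizes every part after the first."""
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

print(camel_to_snake("sourceIndex"))   # source_index
print(snake_to_camel("source_index"))  # sourceIndex
```

Which is perhaps why the choice never feels settled: since a tool can flip between them losslessly, neither side ever wins the argument.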
“it radically rewrote the rules of literacy for tens of thousands of children seemingly overnight.”
There are always stories about how miraculous various programs and phonics in particular are in teaching reading. But surely such programs would have been adopted by now and thus must have been producing undeniable results somewhere. Wouldn’t phonics have been adopted in red states (there is an obvious partisan divide on this) and shouldn’t those states now be much better in reading performance?
Spanish is almost perfectly phonetic so why do international comparisons put reading performance in Spain below that in the US? Can Japanese and Chinese (who do well on the international comparisons) only learn to read after seeing the romanized versions? Surely people in China were able to read before it was exposed to the West.
“Whole language” may have been a wrong turn if it really resulted in ignoring phonetics (I doubt that it really did in practice), but the fact is that proficient reading requires recognition of whole words. Most children can learn new words both audibly and in symbols very fast, but some may require more help from phonetics.
By the way, the Roman alphabet is very poor for most languages (including English), which typically have more sounds. Few languages have only five vowel sounds, like Spanish and Italian (and presumably Latin). Phonetics is presumably helpful in deciphering foreign words, but the simple Roman alphabet will never describe them (except Spanish, etc.) accurately, even with code books. However, some alphabets, such as Arabic, may be even simpler, omitting vowels altogether at times.
“Whole language” may have been a wrong turn if it really resulted in ignoring phonetics (I doubt that it really did in practice), but the fact is that proficient reading requires recognition of whole words.
To paraphrase a friend, “No one can get a PhD dissertation out of pushing phonics. You have to claim that something else is better.” Or at least that something else is as/more valuable than recognizing the words early on. The new things are all well and good, but memorizing a few hundred words-as-a-chunk is still necessary. You can’t sound out “bat” and “cat” forever; at some point it has to be automatic.
There are assorted postings — the internet has made them more common — that ask whether you can read “Aoccdrnig to rscheearch at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae.” Is this just an English thing? Does Romanized Japanese tolerate the same sort of misspellings for fluent readers?
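The meme’s trick is easy to reproduce. Here’s a toy sketch of the scramble (my own implementation, just for illustration): keep the first and last letters fixed and shuffle the interior.

```python
import random

def scramble(word, rng):
    """Shuffle a word's interior letters, keeping first and last in place."""
    if len(word) <= 3:
        return word  # nothing to shuffle
    inner = list(word[1:-1])
    rng.shuffle(inner)
    return word[0] + "".join(inner) + word[-1]

def scramble_text(text, seed=0):
    """Scramble each whitespace-separated word, reproducibly via a seed."""
    rng = random.Random(seed)
    return " ".join(scramble(w, rng) for w in text.split())

print(scramble_text("According to research it does not matter"))
```

As Hartmut notes downthread, the claim degrades for long words: with the interior fully randomized, only short, high-frequency words stay easy to recover.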
New Math was the same sort of thing. It pushed a much broader view of what math was than just the algorithms. Look, long division is done the way it is because hundreds of years of experience informs us that it’s the best way to get the right answers when you have to do a hundred division problems a day, day after day. New Math failed when the teachers pushed the broader view but didn’t teach the mechanics.
Is this just an English thing?
According to Grok:
Scrambled Words Across Languages
Michael, the scrambling thing is called Typoglycemia. In German it’s Buchstabensalat (letter salad) and, at a pedagogic course I had to suffer through at university, Badewanneneffekt (bathtub effect, no idea why).
https://en.wikipedia.org/wiki/Transposed_letter_effect#Internet_meme
With long words with completely random letter scrambling it’s rubbish even if first and last letter are retained.
I’m not sure if I’d be so harsh on the Roman alphabet. You don’t want a system that encodes everything.
Something that floors my students is when I teach them about an abecedary, which is a chart that represents the letters by giving them a word that has the sound (A is for apple, etc.). Because Japanese kana are the sounds they represent, there is no need to create one.
About Michael’s question, I think it would work in romanized Japanese because it is essentially creating an anglicized version of Japanese, and the consonants plus the context can give enough clues to read them. However, Japanese don’t process nearly as much text in roman letters, so that would be an issue.
Charles, I gotta ask, don’t you wonder about quoting an LLM that can call itself ‘MechaHitler’?
https://www.pbs.org/newshour/politics/why-does-the-ai-powered-chatbot-grok-post-false-offensive-things-on-x
Camel notation* from computer programming would possibly be better: InternalCombustionEngine.
It is interesting that it is widely used in domain names, e.g. KaiserPermanente.org. Clearly the sales and marketing folks think it will be easier to parse that way.
New Math was the same sort of thing. It pushed a much broader view of what math was than just the algorithms. Look, long division is done the way it is because hundreds of years of experience informs us that it’s the best way to get the right answers when you have to do a hundred division problems a day, day after day. New Math failed when the teachers pushed the broader view but didn’t teach the mechanics.
The trouble with New Math was that it was (apparently) designed by mathematicians. Mathematicians who had forgotten that a) you have to build the foundations (mechanics, as Michael says) first. And that b) normal people are not mathematicians — and that’s 99.99% (or more) of the population. They neither care nor need to know the theoretical underpinnings. They just need to know how to do basic arithmetic reliably.
I was taught using the School Mathematics Project, which seemed OK to me. But I may not be one of the “normal people”.
I suggest that being able to divide accurately with pen and paper is now almost useless, whereas being able to divide approximately in one’s head is useful for avoiding fat-finger errors. That is, the underpinnings have turned out to be more important than the algorithms.
I suppose I can see how, if everybody who knows how to read** has a phone/computer in their hip pocket, knowing basic arithmetic might be less critical than it once was. I’m not convinced, mind, but I can see that it might be.
** At the rate software is improving, I suppose computers will be able to read to us, and write down what we say as well. The reactionaries will no doubt be delighted if illiteracy once again becomes the norm. /snark
Here’s a post from Keith Devlin working through some thoughts about the tension between calculation and mathematical thinking.
https://devlinsangle.blogspot.com/2018/05/calculation-was-price-we-used-to-have.html
For any mathematician alive today, mathematics is a subject that studies formally-defined concepts, with a focus on the establishment of truth (based on accepted axioms), with various forms of calculation (numerical, algebraic, set-theoretic, logical, etc.) being tools developed and used in the pursuit of those goals. That’s the only kind of mathematics we have known.
Except, that is, when we were at school. By and large, the 19th Century revolution in mathematics did not permeate the world’s school systems, which remained firmly in the “mathematics is about calculation” mindset. The one attempt to bring the school system into the modern age (in the US, the UK, and a few other countries), was the 1960s “New Math”. Though well-intentioned, its rollout was disastrous, in large part because very few teachers understood what it was about – and hence could not teach it well. The confusion caused to parents (other than mathematician parents) was nicely encapsulated by the satirical songwriter and singer Tom Lehrer (who taught mathematics at Harvard, and did understand New Math), in his hilarious, and pointedly accurate, song New Math.
As a result of the initial chaos, the initiative was quickly dropped, and school math remained largely unchanged while real-world uses of mathematics kept steadily changing, leaving the schools increasingly separated from the way people did math in their jobs. Eventually, the separation blew up into a full-fledged divorce. That occurred in the late 1980s. The divorce was finalized on June 23, 1988. That was the date when Steve Wolfram released his mammoth software package Mathematica.[…]
Devlin is really good on matters pedagogical, and always worth the read.
I do tend to think, though, that students will have a very hard time understanding math (or written communication) if they have not had enough experience with doing the work, and have not seen enough examples to get an idea of the possible range of approaches to it. Early in my teaching I tended not to give enough examples, figuring that teaching the conceptual side would lead students to sort through their own database of examples to see the underlying principles. I’ve since learned that most students come in having seen and understood too few examples, and having no idea of more than one approach to the tasks they have been called upon to do.
I do a lot more modeling of approaches, and evaluation of those approaches, now that I’m finally starting to figure out this whole teaching thing.
I’m on my phone, so can’t give links, but I encourage a dive into how the Japanese teach math vs. US methods. A couple of points I remember:
-in the US, people who are good at math get pushed into teaching math, so they often don’t understand why students make the mistakes they do. A large component of Japanese math education is predictive: a good teacher should know where students are likely to go off the rails and adjust their teaching
-a passing grade in Japan is 60, which works for math; if you understand 60% of some concepts, that’s not too bad, and the bulk of math education happens in HS. I recall I was involved with an exchange program that sent selected prefectural students to BC. One student was from one of the lower-ranking schools and was considered the weakest candidate academically. He wasn’t a bad kid, but he was on the baseball team, so 95% of his effort went to the baseball field. He went to a BC high school where classes were in mid-term, had to take a math test because it was scheduled, and everyone was astonished when he got a perfect score.
-students in Japan still aren’t permitted to use calculators
-they also don’t give partial credit, which is how I got thru my math courses.
Will try and toss some links tomorrow
For any mathematician alive today, mathematics is a subject that studies formally-defined concepts, with a focus on the establishment of truth (based on accepted axioms)
Absolutely; however, “formally defined concepts, axioms, proofs, etc.” are taught in PLANE GEOMETRY, not so much in calculation/algebra-type classes.
Couple of links
https://www.educationworld.com/a_admin/admin/admin074.shtml
https://www.tandfonline.com/doi/full/10.1080/0020739X.2019.1614688#d1e163
There are a ton of pdfs as well.
Perhaps the greatest calculator, unmentioned in Devlin’s article, was Kepler, who worked out his laws of planetary motion from Brahe’s observations.
All this math talk has me celebrating pi, but not exactly.
First we smash all the (electronic*) calculators.
Get the logarithm table and/or the slide rule back.
That’s how one teaches the basics!
In all seriousness, I work as a tutor, by now mainly for math. Before one can teach them abstract concepts, they need to get the basics right, and that means calculations (preferably without electronic help). I see no justification for teaching set theory** when they are still puzzled by “Is 123 divisible by 3 without remainder?” or “What’s the square root of 81?”. If that’s 19th century, then Gott erhalte Franz den Kaiser! (God save Emperor Franz!)
To me that sounds like the unfortunate discussion about why and whether kids still have to learn to write by hand in our modern age. Why learn orthography when there are spellcheckers? And what by the way is the use of kids reading fictional literature of guys long dead or ancient (pre-1990) history?
Yes, understanding should have priority over rote learning, but everyday math is still mostly basics. And I am cynical enough to say that we are failing there already. Imo math at school should concentrate on practical problem solving, not Zermelo’s theorem.
*abaci will be tolerated
**exception: difference between natural, integer, rational and real numbers.
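As an aside, the “divisible by 3” question above has a classic shortcut worth teaching: a number is divisible by 3 exactly when its digit sum is. A minimal Python sketch (illustrative only, not from anyone’s curriculum):

```python
def divisible_by_3(n: int) -> bool:
    """Digit-sum test: n is divisible by 3 exactly when its digit sum is."""
    n = abs(n)
    while n >= 10:
        n = sum(int(d) for d in str(n))  # e.g. 123 -> 1 + 2 + 3 = 6
    return n in (0, 3, 6, 9)

print(divisible_by_3(123))  # True, since 1 + 2 + 3 = 6
print(divisible_by_3(124))  # False
```

The nice part pedagogically is that the rule can be justified with nothing beyond place value: 10, 100, 1000, … all leave remainder 1 when divided by 3.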
I played with an electronic calculator when I was a kid. It allowed me to start recognizing patterns in numbers, particularly when performing the same calculation recursively. I’m reasonably sure I wasn’t typical in that regard, but I wanted to mount some meager defense of electronic calculators.
All this math talk has me celebrating pi, but not exactly.
Hmmm. I’m thinking exactly, but not precisely.
Doing recursions on the calculator was also among my first experiences. I did not yet understand though why certain functions would yield the same result independent of start value after pressing the key repeatedly (converging on x=f(x)).
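That “same result independent of start value” effect is fixed-point iteration: pressing the function key over and over converges on a solution of x = f(x) whenever the function is contracting near that point. A small Python sketch, using the cos key, the classic calculator example:

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    """Press the 'f' key repeatedly until the display stops changing."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    return x

# Wildly different start values land on the same fixed point of cos,
# roughly 0.739085 (the so-called Dottie number).
print(fixed_point(math.cos, 0.5))
print(fixed_point(math.cos, 100.0))
```

Keys like sqrt behave the same way (converging on 1 from any positive start), which is why a bored kid with a calculator keeps rediscovering it.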
…why and whether kids still have to learn to write by hand in our modern age.
There seems to be a consistent body of work showing that taking notes during a lecture reinforces memory, and taking notes longhand reinforces more than typing on a keyboard. That’s the pseudo-academic in me speaking, of course.
For the last twelve years or so I’ve been using a little note-taking application that I wrote myself. There were just too many cases where pasting in an image, or having a live URL, or even just searching for a keyword seemed to justify it. Recently I’ve been considering going back to paper and pen.
I thought about using an iPad with an Apple Pencil, which has gotten very much like paper and pen (so long as you don’t use the eraser much*). Unfortunately, Apple has seen fit to put handwriting recognition into the OS, and insists on putting a little line under anything you write/draw that it thinks might be a date or time. I’ve seen many complaints about it, and people asking why Apple can’t make it optional.
* One of the reasons I always took notes in ink while I was doing research work was that sometimes I wrote down something that I thought was right, and two days later discovered I was mistaken. With ink, you have to grab a different color pen and put in a dated bit with the correction.
At the rate software is improving, I suppose computers will be able to read to us, and write down what we say as well.
During the early 1990s I had lunch regularly with a librarian. We discussed archiving on a regular basis. Ken Burns’s Civil War documentary was still pretty new. She used to say, “You want to write the source material for someone to use in 120 years to make a documentary like Burns’s? Acid-free paper and pigment-based ink, my friend. And descendants willing to keep your writings in a trunk somewhere dark.”
There seems to be a consistent body of work showing that taking notes during a lecture reinforces memory
The more ways you can engage with a body of material, the more it will stick with you.
The greater the degree of attention required of you as part of that engagement, even more so.
Humans should think, machines should work, as the saying goes. That said, thinking *is* work, and attempts to find short cuts around that just make us stupid.
Don’t know about memory but in my case writing by hand clearly improves quality (and is also quicker).
And doing math (not calculating, but solving a math problem by hand) I can’t essentially do any other way. Chemistry the same. Why spend minutes (at minimum) on the screen when a sketch on paper takes seconds? I am not a tablet guy, admittedly.
Why spend minutes (at minimum) on the screen when a sketch on paper takes seconds?
Tablets are getting better at imitating what paper and pencil do. But it’s still a pale imitation. And, from what I’ve seen, the rate of progress towards duplicating it has slowed markedly.
Pro Bono: Perhaps the greatest calculator … was Kepler
PB, if you haven’t seen this 3Blue1Brown video, you really should take a look. The mind-boggling explanation of what Kepler accomplished starts at around 18:20, but the whole thing is great.
–TP
I love maps. All kinds: atlases, road maps, city maps, world maps. Give me a map to use, read, ponder, and I can be absorbed in it for hours.
I remember, when I was in Athens (Greece, not Georgia), getting my hands on a fold-out street map, assuming I’d be able to make some sense out of it based on knowing the general names of streets, and where things like markets and bridges were. And I remember the delightful/terrifying feeling of not, in fact, being able to do that even a little bit. Thanks to the different alphabet, I was unable to tease out any meaning at all. Terrifying for obvious reasons, but delightful because it was rather fun to see one of my most-cherished objects – a map! – manifest as incomprehensible.
Michael – The issue of records being kept, preserved, and accessible for more than one generation is one I think about a LOT. Just seeing how quickly electronic media become obsolete makes me shake my head in bleak wonder.
We can read the direct writing of people from thousands of years ago – multiple thousands of years – right up to, what, a couple generations ago? When did people stop writing letters or keeping written journals?
It just seems like humanity, or at least the industrialized portions, is engaged in a headlong rush to erase itself from the record. (Which, considering where we are right now as a species, is kind of understandable, though no less alarming.)
Thanks, Tony P. The two most underrated scientists in popular science history are Kepler and James Clerk Maxwell. (Galileo is the most overrated.)
Perhaps the greatest calculator … was Kepler
Perhaps. But the ladies who did all the calculations for the Mercury and Apollo Projects were no slouches either. Men’s lives hung on their work. As the story goes, John Glenn asked explicitly for Katherine Johnson to do the calculations for his flight. He wouldn’t trust anyone else with his safety.
As we’ve gotten around to archives…
TL;DR version: I’ve started playing with a toy version of the beginnings of software that will eventually be a tool for my archival project. In a couple of years it ought to be interesting :^)
I’m starting to play with toy versions of software I’ll be using eventually in my role as extended family archivist building a digitized record from the hundreds/thousands of pictures and document pages that have accumulated. Everything I’m doing right now is grayscale, just so that’s not a surprise to anyone who goes so far as to look at the images. Most of the images are large; you’ll have to do whatever tricks your browser requires to see them at full resolution.
Text documents first. A JPEG image of a document page I snapped with my iPad is here. The original image is somewhat sharper than the one shown, since JPEG is not as good with details as Apple’s HEIC format. For the time being, I use ImageMagick to convert HEICs to uncompressed grayscale.
Right now the toy assumes the document is a rectangle lying flat, and I’m taking a picture of it that’s out of alignment. That makes it a linear-transform problem. First step is to find the corners of the document. I’m doing something not entirely simple-minded. The accuracy of the toy corner-finding code is illustrated here.
It’s been a long time since I did anything with linear transforms and the matrix calculations that go with them. After some online reading to refresh my memory, and finding simple versions of code for 3×3 matrices, the toy code can do a perspective transform and produce an approximate equivalent of a 300 dot-per-inch scan (or more, or less). The page in the picture is actually a pile of several sheets, stapled, so it doesn’t quite meet the flat-rectangle assumption. The result is shown here.
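For the curious, the 3×3 perspective-transform step can be sketched with nothing but NumPy. The corner coordinates and page size below are made-up illustrations, not the toy’s actual data, and this is just one standard way (the direct linear transform) to get the matrix:

```python
import numpy as np

def homography(src, dst):
    """3x3 perspective transform mapping each src corner onto the
    matching dst corner (direct linear transform, normalized so h33 = 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the homography to one point (homogeneous coordinates)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Corners found in the skewed photo (illustrative pixel coordinates)
corners = [(112, 80), (1890, 95), (1905, 2540), (95, 2520)]
# Target: a US-letter page at 300 dpi is 2550 x 3300 pixels
W, Ht = int(8.5 * 300), int(11 * 300)
page = [(0, 0), (W - 1, 0), (W - 1, Ht - 1), (0, Ht - 1)]

H = homography(corners, page)
# Each photographed corner now maps exactly onto a flat page corner.
```

Actual resampling would push every output pixel back through the inverse of H and interpolate the source image there; libraries like OpenCV wrap the whole thing in getPerspectiveTransform / warpPerspective.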
In some cases, I will want to do OCR on the images. I’m using the tesseract open-source OCR program for now. Tesseract is not a toy. When I converted the photo to an estimated 600 dpi scan and ran it through tesseract: (a) tesseract estimated the resolution as 607 dpi; (b) on a quick pass through the output, all the text in the actual text fields is correct; and (c) the flat text output is shown here.
The same toy software works on pictures of photographs if there’s a white border so that the toy can identify corners. An approximate equivalent to a 300 dpi scan of an old Polaroid picture of my wife-to-be from before I knew her is shown here.
Michael, this is just way cool!
When you feel like the software is mostly together, is it something you would be willing to share? Sell? (I hesitate to suggest beta test. 😉
Michael Cain, your wife-to-be from before you knew her exudes goodness in that photo. Top of the preservation list, I imagine.
When you feel like the software is mostly together, is it something you would be willing to share? Sell? (I hesitate to suggest beta test. 😉
Sure, but don’t hold your breath. I have a long list of things that “mostly together” will require.
Still be way sooner than I’d get anything similar coded myself.