Still, I’ve never heard anyone talk about language reform. Why is that?
Maybe because you haven’t been paying attention? On the Chinese side there are efforts around Simplified Chinese, with the latest official language reform implemented in 2013.
The last language reform of German, which I speak, was in 1996.
The French have their Académie Française, which constantly tries to reform the language and is mostly ignored.
As far as both English and Chinese go, they are languages in which very different speech communities use quite different phonemes to express the same word.
To do an English spelling reform, you would need to decide which dialect of English is the correct one to map and which dialects are wrong. The US and the UK had trouble switching to the metric system, which is much easier than doing a language reform.
A US presidential candidate running on a platform of enforcing a switch to phonetic English spelling would likely be taken far less seriously than a candidate like Yang, who calls for a switch to the metric system.
If you take Serbian as an example of reform going well: the Balkans are full of wars between peoples who believe they have distinct national identities because they consider their own dialect a full language and the dialect other people speak a different language.
To do an English spelling reform, you would need to decide which dialect of English is the correct one to map and which dialects are wrong.

Disagree: a new writing system can be chosen to accommodate certain dialectal variations (the pen/pin and father/bother issues, for example) and simply represent others. (The name John is pronounced Jawn in some regions of the USA. It would be very easy to spell it with that vowel if we expected spelling to match the sounds.) And it can all be done without applying right/wrong labels to anybody’s dialect. It’s just a matter of having enough letters (or variations of letters) to accurately represent the differences where they are important. Since we (broadly speaking) currently teach our children to read and write with 26 letters times 4 variations of each (upper- and lower-case in manuscript and cursive), for a total of 104 characters, I see no reason to expect they can’t learn letters that map to the 40-ish sounds that make up the English language (plus a few marks to specify which version of the vowel is being indicated).
If you’re concerned about one group “getting” the base vowels (with no diacritics), we could teach that a vowel with no markings represents a family of similar sounds and that adding a marking simply selects a particular sound from that family. (It’s not really all that different from trying to cram 8–12 vowel sounds into 5 letters, as we do now; just more precise.) You could spell any word with all unmarked vowels with no more ambiguity than homographs already create under the current system.
And we could finally represent puns in text! Maybe some of our jokes would actually make sense to archeologists in a few thousand years.
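To make the unmarked-vowel idea concrete, here’s a tiny sketch in code. The marked spellings of “John” below are purely hypothetical, but the mechanism is real: stripping the vowel markings mechanically recovers the unmarked, dialect-neutral base form, so any precisely marked spelling collapses back to the neutral one.

```python
import unicodedata

def neutralize(spelling: str) -> str:
    """Strip vowel markings so a precisely marked spelling falls back
    to its unmarked, dialect-neutral base letters."""
    # NFD splits each accented letter into base letter + combining mark.
    decomposed = unicodedata.normalize("NFD", spelling)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

# Two hypothetical marked spellings of the same name, each selecting a
# different vowel from the family, collapse to one neutral spelling:
print(neutralize("Jón"))  # Jon
print(neutralize("Jòn"))  # Jon
```

The point of the sketch is that “neutral” writing costs nothing extra: it is just the marked writing with the marks removed.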
the Balkans are full of wars between peoples who believe they have distinct national identities because they consider their own dialect a full language and the dialect other people speak a different language.

That’s region-wide arguing over definitions, combined with a strong cultural emphasis on national identity. Those problems don’t mean the local spelling system is problematic in any way. Language is a technology and spelling is a tool. That folks are using it to hurt each other doesn’t make the tool “bad” somehow, or we’d ban hammers and axes for being involved in murders. Granted, some governments do ban some objects for just that reason. But the practice is extremely uneven, and usually limited to objects that appear to have been designed to hurt people in the first place.
In principle, couldn’t you start writing in the International Phonetic Alphabet right now, and then write a script with an English dictionary to convert back and forth to standard English? Maybe with prompts, when you run the script, to decide which regional dialect/variation you’re aiming for, and which homophone?
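Something like this minimal sketch, say. The three-word lexicon here is made up for illustration; a real script would load a full pronunciation dictionary (CMUdict, for example) and would prompt with input() where this sketch just picks the first candidate spelling.

```python
# Minimal sketch: round-trip between IPA transcriptions and standard
# English spellings using a tiny hand-built lexicon.
IPA_TO_ENGLISH = {
    "ˈæpəl": ["apple"],
    "ˈθruː": ["through", "threw"],   # homophones share one transcription
    "ˈnaɪt": ["night", "knight"],
}

# Invert the lexicon for the English -> IPA direction.
ENGLISH_TO_IPA = {
    word: ipa for ipa, words in IPA_TO_ENGLISH.items() for word in words
}

def to_english(ipa: str) -> str:
    """Convert one IPA token to standard spelling."""
    candidates = IPA_TO_ENGLISH.get(ipa, [])
    if not candidates:
        return ipa  # unknown word: leave the transcription as-is
    # An interactive script would prompt here to pick among homophones;
    # this sketch just takes the first listed spelling.
    return candidates[0]

def to_ipa(word: str) -> str:
    """Convert one standard spelling to its IPA transcription."""
    return ENGLISH_TO_IPA.get(word.lower(), word)

print(to_english("ˈnaɪt"))  # night
print(to_ipa("threw"))      # ˈθruː
```

The many-to-one direction (several spellings, one transcription) is exactly where the homophone prompt would kick in.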
Sure could! That strategy works just fine for recording language exactly as it’s heard, and even for mapping phonetic representations to traditional spellings in a many-to-one relationship, but it lacks the ability to encode words with dialectally neutral vowels. That is, IPA forces you to choose an exact vowel; there is no provision I know of to indicate “some vowel in this range”, which is what you’d need to neutralize the spellings of most words. Though it certainly wouldn’t be hard to extend the IPA with that feature if that’s the character set you wanted to start with.
The inability to avoid exact phonetic representations would actually be beneficial, imo, because a fluent writer of IPA could then represent their native accent exactly, and a fluent reader could recognize that accent and imagine the author’s voice more accurately while reading. It would be useless for deaf people, though—but all written language reforms are, unfortunately.
Being able to represent accents accurately is definitely a benefit! I’d love to pick up a book and be able to gather information about the writer’s cultural background by the way they pronounce words (and without the “mangled” spellings that implies under the current system).
Likewise, I’d like to have the option to write in a neutral voice in order to avoid privileging one group of speakers in the canonical spellings (think, those used in government documents and the like). British and American English both have accents that imply socioeconomic status, and I’m sure that’s true of other languages as well. Being forced to write in a specific accent could needlessly alienate some readers who don’t identify with the group that speaks that way.
As for deaf people, there are many who learn to speak! Phonetic spellings would make that process much easier for learners who can’t get the immediate feedback of clearly hearing themselves and others pronounce words. Once they learned how to produce the sound each character makes, they could know how to pronounce words just by reading them; an even stronger version of the benefit hearing people get from phonetic systems!
I think these are all good examples of language reforms. I guess my issue is that I was over-fixating on English.
Language reforms have always standardized on whatever dialect the people making the rules took to be the most common. Reforms don’t happen often, so by the time another one comes along, most people can more or less understand the new form.
No, the dominant dialect.
“The term RP has murky origins, but it is regarded as the accent of those with power, influence, money and a fine education – and was adopted as a standard by the BBC in 1922. Today, it is used by 2% of the [UK] population”
Likewise, standard German is Prussian, standard Spanish is Castilian.
I’m sorry, that’s what I meant: the dialect used by the rulers.