Why no new notations since 1960?
Writing consists of language and also notations, systems of marks that communicate meaning in a specialized domain. Examples of fields with their own highly developed notation are music, mathematics, architecture, electronics and chemistry. There are also more minor types of notation, for example, welding, meteorology and finite state machines. Here’s the question: all the notations I’m aware of were invented before about 1960. Over the past few decades, people have invented all sorts of fancy notations, but none of them have caught on in the applicable field. Why not?
Some answers:
Of course there are new notations. I am just an old man and haven’t noticed. (I have noticed that entity-relationship diagrams are a minor notation that caught on this century, but that’s all I can think of.)
The upsurge of typed and computer-generated documents meant that anything that wasn’t easy to type on an existing keyboard wasn’t used. Maybe true for the ’60s to the ’80s, but we’ve had good systems for reproducing arbitrary marks on paper both before and after.
It’s a hard time for notations in general. For example, the elaborate system of conventions around mechanical drawing that I learned in high school drafting class has been replaced by CAD systems, which are far more readable and easier to use. The communicative purpose of notations is now served by computer user interfaces.
The first example that comes to my mind of a recent notation that has caught on in its field is siteswap in juggling, which was only invented in the 1980s. I am a juggler and can confirm that all the technical juggling nerds know what this is and that it is used in crazy tricks. For example, see 5551 below, which I’ve heard was the first trick discovered through the notation:
Juggling Lab is software for rendering these.
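As a sketch of why the notation is machine-checkable at all (my own illustration, not how Juggling Lab itself is implemented): a siteswap of period n is valid exactly when every throw lands on a distinct beat, i.e. the values (i + throw[i]) mod n form a permutation, and the average of the throws gives the number of balls.

```python
def is_valid_siteswap(pattern):
    """A siteswap is valid iff each throw lands on a distinct beat:
    the landing beats (i + throw) mod n must all be different."""
    n = len(pattern)
    landings = {(i + throw) % n for i, throw in enumerate(pattern)}
    return len(landings) == n

def ball_count(pattern):
    """For a valid siteswap, the average throw height is the ball count."""
    return sum(pattern) // len(pattern)

# 5551 is valid and is a 4-ball trick; 543 is the classic invalid example
# (the 5 and the 4 would land on the same beat).
print(is_valid_siteswap([5, 5, 5, 1]), ball_count([5, 5, 5, 1]))  # True 4
print(is_valid_siteswap([5, 4, 3]))                               # False
```

This permutation test is also why new tricks like 5551 could be *found* through the notation: you can enumerate digit strings and filter for validity before anyone ever throws a ball.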
...Programming?
Many popular languages today (notably the C family) ultimately descend from ALGOL, which is from 1958.
“Structured programming”, i.e. writing code as syntactically-delimited blocks, functions, and procedures rather than with numbered lines and GOTOs, was pioneered in ALGOL.
Popular languages today such as Python, Java, JavaScript, Go, and Rust may diverge pretty widely in features (and syntax), but all of them are ultimately ALGOL descendants, albeit with influences from other language families too.
(If your language has `for` loops, it’s an ALGOL descendant.) Lisp and Fortran are also pre-1960.
Simula (and thus object-orientation) is from ’62, but influenced by ALGOL. Smalltalk is a Simula descendant. C++ is what you get if you try to build Simula ideas on top of a C compiler (and go a bit gaga for operator overloading).
There are some languages a little later than that which look pretty different. For instance, APL is from ’68, Forth is from 1970, and ML, which gave rise to Haskell, is from ’73.
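A toy illustration of the “structured programming” contrast mentioned above (my own sketch, in Python rather than ALGOL): the numbered-line/GOTO style is shown as comments, with the block-structured equivalent below it.

```python
# GOTO-style, roughly as it looked in early unstructured BASIC:
#   10 LET I = 1
#   20 IF I > 5 THEN GOTO 60
#   30 PRINT I * I
#   40 LET I = I + 1
#   50 GOTO 20
#   60 END
#
# The ALGOL-style structured version: control flow is expressed by a
# syntactically delimited block and a for loop, not jumps to line numbers.
def squares(n):
    result = []
    for i in range(1, n + 1):  # the kind of for loop ALGOL popularized
        result.append(i * i)
    return result

print(squares(5))  # [1, 4, 9, 16, 25]
```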
I thought “new notation” included new symbols. Almost all programming languages exclusively use ASCII characters for their keywords, which are pretty old.
music:
piano roll
tracker
frequency spectrum
amplitude / time graph
scrubbing
text:
html / markup in general
wysiwyg (debatably new)
video:
digital video editing software timelines
images:
pixel data
many of these are skeuomorphic. perhaps it can be argued that they have history from the 60s. but at this point the digital interfaces have supplanted any real world metaphor. for example, the idea of showing a reticle moving along a line to represent “where you are in this song/movie” is a universal notation. i don’t believe it was common before personal computers.
Various kinds of tensor networks might be an example. Wikipedia claims that Penrose’s graphical tensor notation is from 1971. Its descendant, ZX calculus is from as late as 2008. Arguably the first tensor networks were Feynman diagrams though, and 1948 is before your cutoff of 1960. (Actually, now that I think about it, it’s kind of funny that the infinite dimensional case came before the finite dimensional one here.)
Relatedly: string diagrams (with Penrose’s tensor notation apparently being seen as a precursor)
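As an aside (my own example, not from either notation’s literature): the information a Penrose-style tensor diagram carries, which legs are joined, maps directly onto an einsum subscript string, so a small closed diagram can be written as a one-line NumPy contraction.

```python
import numpy as np

# A three-tensor network A_ij B_jk C_ki: in graphical notation, three blobs
# whose legs are joined in a loop. Contracting every shared index yields a
# scalar, which here equals the trace of the matrix product.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 5))
C = rng.standard_normal((5, 3))

scalar = np.einsum('ij,jk,ki->', A, B, C)
assert np.isclose(scalar, np.trace(A @ B @ C))
```

The diagram and the subscript string encode the same contraction pattern; the diagram just makes the loop visible at a glance.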
emojis
UML
certain ubiquitous GUI conventions (☰ for a menu, > and ˅ for collapsible sections, the “home” icon, the “save” icon, escalating vertical bars for signal strength, %-complete bars, etc.)
I do think there’s some innovation on notation, but it mostly happens with existing typographic symbols because extending typography is harder than it used to be. Previously, you could just come up with whatever you wanted because work started out hand written. Then you’d pay to get the printer to make whatever weird symbol you wanted for publication, or, if on a budget, come up with some weird approximation using simpler symbols.
It seems like it should be easier on computers, and in theory it is, but lots of things drive us towards making default choices. The worst of these is probably that Unicode is already full of so many symbols that LaTeX can render, so it’s much easier to just pick an existing symbol than to go through all the work of cooking up a new one.
I separately suspect there’s some effect from computer code, too, where people are trending towards longer symbols that resemble descriptive function and variable names. These feel less like notation but are easier to read at first glance, even if they cost some efficiency once you’re familiar with them.
Type theory?
Gentzen’s stuff is from the 1930s and looks pretty much like the type theory I know.
Possibly one factor is that the evident versatility of using ASCII in nearly all programming languages (and also for stuff like LaTeX) made people less inclined to invent new notation.
I think there’s a general bias in Western culture arising from the problems of physicalism that gets people to consider realist ontology not worth seriously pursuing. Notation is downstream from ontology. You need to commit to an ontology to develop notation to represent that ontology.
Can you elaborate?
I think “squared notation” in Scott domains (⊑, ⊒, ⊓, ⊔, ⊥, ⊤) is from the late 1960s.
Also, dataflow diagrams seem to come from the 1970s: https://en.wikipedia.org/wiki/Data-flow_diagram
Although visual dataflow programming seems to go back to the 1960s: https://en.wikipedia.org/wiki/Dataflow_programming
But yes, a bit later than 1960, but my examples are still quite old.
While blocks are older, syntax highlighting is much newer. I am not sure that counts.
Some conjectures:
General cultural ossification: caused by widespread recording and information technology devices.
Cultural splintering: things change so fast nowadays that there simply isn’t time for new symbols to catch on before they’re replaced by even newer symbols (and culture is also so fragmented that there isn’t really a single power center that can take over).
Standardization: now that symbols tend to be standard across international communities, changing things might require a LOT of coordination.
Low-hanging fruit has already been plucked: potential improvements (if they exist at all) might be minor and not worth overhauling already-standardized systems.
Symbols are still developing: how long has the like button been around? Or the karma system? Maybe we’ll keep innovating our symbology as new needs arise.