I know John disagrees with me, and I’m not an alignment researcher (and I suspect the applicability to alignment is lower than, say, to physics), but I feel like basic abstract algebra was very useful to me. The big things were:
Understanding that “symmetry” really means “invariance under certain operations”.
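That slogan can be made concrete in a few lines of code. This is a toy sketch of my own (the function and operations are made-up examples, not anything from the original discussion): a function is “symmetric” under an operation exactly when applying the operation first doesn’t change the function’s value.

```python
import math

# "Symmetry" operationalized: f is invariant under an operation g
# exactly when f(g(x)) == f(x) for every x we care about.

def f(p):
    """Distance from the origin -- intuitively 'rotationally symmetric'."""
    x, y = p
    return math.hypot(x, y)

def rotate(p, theta):
    """One candidate operation: rotation about the origin by angle theta."""
    x, y = p
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def is_invariant(f, op, points, tol=1e-9):
    """Spot-check invariance of f under op on a finite sample of points."""
    return all(abs(f(op(p)) - f(p)) < tol for p in points)

points = [(1.0, 0.0), (0.3, -2.0), (5.0, 5.0)]
print(is_invariant(f, lambda p: rotate(p, 1.234), points))   # rotation: invariant
print(is_invariant(f, lambda p: (p[0] + 1.0, p[1]), points)) # translation: not
```

Distance from the origin survives rotations but not translations, which is exactly the “invariance under certain operations” reading of symmetry.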
Understanding quotient constructions. Sure, you can do equivalence classes without groups, but seeing how the group structure determines the partition gives the classes color.
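A minimal sketch of the quotient idea, using the standard textbook example of Z/3Z (my own illustration): partition the integers by remainder mod 3, and note that addition “descends” to the classes because the answer doesn’t depend on which representatives you pick.

```python
from collections import defaultdict

# Quotient Z/3Z as a partition: a ~ b iff a - b is divisible by 3.
def quotient(elements, n):
    classes = defaultdict(list)
    for a in elements:
        classes[a % n].append(a)
    return dict(classes)

print(quotient(range(6), 3))  # prints {0: [0, 3], 1: [1, 4], 2: [2, 5]}

# The group structure descends to the classes: class(a) + class(b) = class(a + b),
# independently of the chosen representatives.
def add_classes(r, s, n=3):
    return (r + s) % n

assert add_classes(4 % 3, 5 % 3) == (4 + 5) % 3
```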
The concept of “generated by”. That is, the smallest X that contains a set S while still satisfying property P, which often is built from S via certain allowable operations. Sure, you can see this in linear algebra, but for me the general concept took seeing it in multiple places, and the algebraic one did most of the work.
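The “generated by” pattern can be sketched directly: start from S and repeatedly apply the allowable operations until nothing new appears. This is a toy example of my own, the subgroup of Z/12Z generated by {8} under addition mod 12.

```python
def generate(seed, ops):
    """Smallest set containing `seed` and closed under the binary `ops`."""
    generated = set(seed)
    frontier = set(seed)
    while frontier:
        new = set()
        for op in ops:
            for a in generated:
                for b in frontier:
                    for c in (op(a, b), op(b, a)):
                        if c not in generated:
                            new.add(c)
        generated |= new
        frontier = new
    return generated

# Subgroup of Z/12Z generated by 8 under addition mod 12.
add_mod12 = lambda a, b: (a + b) % 12
print(sorted(generate({8}, [add_mod12])))  # prints [0, 4, 8]
```

The fixed-point loop is the general shape: linear spans, generated ideals, and the closure of a set under logical inference all fit the same template with different `ops`.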
General vibes of “is this an algebraic thing?” and “working with algebraic stuff”. Hard to explain what this is, but it’s a thing. A sense of “rigidity” or “discreteness”.
But, I still don’t really care about the Sylow theorems.
There’s a really good book called “Visual Group Theory”. I haven’t read the whole thing, but just from looking at parts of it on a lark I finally gained an understanding of what the hell the semidirect product is—all thanks to someone actually drawing the picture for me.
Representation theory is really cool, I wish I knew more, especially with the physical applications.
Now for my takes on other abstract math:
The main gain from analysis is basically the definition of a limit, plus enough examples of using the concept of “arbitrarily close, as in, for any epsilon we can bla bla” that you come to think of ‘epsilon definitions’ as operationalizations.
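The operationalization can be written out literally. This is a toy sketch of my own, for the sequence 1/n: the “for any epsilon” clause becomes a function that, handed an epsilon, produces a witness N past which the sequence stays within epsilon of the limit. (A finite spot-check can’t prove the limit, but it shows the shape of the definition.)

```python
# "a_n -> L" operationalized: for any epsilon, we can produce an N such
# that |a_n - L| < epsilon for all n >= N.

def a(n):
    return 1.0 / n  # candidate sequence, with limit L = 0

def witness_N(epsilon):
    """For a_n = 1/n, N = floor(1/epsilon) + 1 works: n >= N implies 1/n < epsilon."""
    return int(1.0 / epsilon) + 1

for eps in (0.1, 0.01, 0.001):
    N = witness_N(eps)
    assert all(abs(a(n) - 0.0) < eps for n in range(N, N + 1000))
print("every epsilon we tried has a witness N")
```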
Some of the basic definitions of topology (closure, boundary, interior, exterior, and connectedness) are my go-to examples of finding “True Names” for intuitive concepts. Partly this is because I have a memory of correctly guessing how to operationalize “boundary of a set” as a kid. The nice thing is that anyone who can see (and possibly even the blind!) will have the intuitive concept, but will immediately run into difficulties trying to pin it down—and yet the answer is pretty easy to understand, and a smart kid could probably reinvent them (assuming they have hints like the idea of looking at arbitrarily small open balls).
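The open-ball operationalization can be sketched numerically for a subset of the reals (a toy approximation of my own, since code can only shrink the ball finitely far and sample it finitely): x is a boundary point of S iff every ball around x contains points both in S and outside S.

```python
# Boundary operationalized: x is a boundary point of S iff every open
# ball around x meets both S and its complement. We approximate
# "every ball" by shrinking the radius and sampling each ball finitely.

def in_S(x):
    return 0.0 <= x < 1.0  # S = [0, 1): its boundary should be {0, 1}

def ball_meets_both(x, r, samples=1000):
    pts = [x - r + 2 * r * k / samples for k in range(samples + 1)]
    return any(in_S(p) for p in pts) and any(not in_S(p) for p in pts)

def looks_like_boundary(x, radii=(1.0, 0.1, 0.01, 0.001)):
    return all(ball_meets_both(x, r) for r in radii)

print(looks_like_boundary(0.0))  # True: a boundary point
print(looks_like_boundary(1.0))  # True: boundary even though 1 is not in S
print(looks_like_boundary(0.5))  # False: an interior point
```

Note that 1.0 comes out as a boundary point despite not belonging to S, which is the part of the definition that’s easy to guess wrong.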
Compactness… not so much. “Closed and bounded” is good enough.
I used to care about the metrization theorems as a motivation for those funky separation axioms, but now, meh. Algebraic topology is significantly more interesting than the advanced point-set stuff. As for broad usefulness: I’m told there are some applications in e.g. certain funky materials, but I don’t know the details. One thing that is useful broadly throughout mathematics, and which algebraic topology exemplifies, is “find an invariant under the relevant operations” (likewise, “find operations that leave a relevant thing or property invariant”).
The usual intro measure theory doesn’t deserve to be on the list. However, the concept of “almost everywhere/nowhere/surely” is very useful. (I did find the geometric measure theory of fractals neat, but it’s probably not broadly applicable.)
It looks to me like complex analysis is mainly applied to compute gnarly integrals. The Nyquist Stability Criterion is the reverse, computing poles via (an interpretation of) an integral. The other source of applications is from the Kramers-Kronig relations, which are general enough that I expect them to come up whenever complex functions do (which often occur when complex numbers come up).
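The workhorse fact behind those integral applications is the residue theorem; here is a numerical illustration of its simplest case (my own toy example, not from the original discussion): integrating 1/z around the unit circle gives 2*pi*i, i.e. the contour integral reads off the enclosed residue.

```python
import cmath
import math

# Numerically integrate f around the unit circle: z = e^{it}, dz = i e^{it} dt.
def contour_integral(f, n=100_000):
    total = 0.0 + 0.0j
    dt = 2 * math.pi / n
    for k in range(n):
        z = cmath.exp(1j * k * dt)
        total += f(z) * 1j * z * dt
    return total

I = contour_integral(lambda z: 1 / z)
# Residue theorem: integral = 2*pi*i * Res_{z=0}(1/z) = 2*pi*i.
assert abs(I - 2j * math.pi) < 1e-6
print(I)
```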
Differential geometry’s ‘big idea’ is using local coordinate maps that you can smoothly translate between, and defining all your properties in ways that are equivalent under coordinate transform. It gets more interesting when you do Riemannian geometry, since then you get neat theorems like the Gauss-Bonnet theorem, plus the application to general relativity. I suspect this doesn’t apply broadly, though. Another broad concept is the “pushforward/derivative” or “pullback”, which applies just as well to e.g. probability theory.
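The pushforward as it shows up in probability can be sketched for a finite distribution (a toy example of my own): push a distribution on outcomes forward through a function by summing the mass of each preimage.

```python
from collections import defaultdict

# Pushforward of a probability distribution P on X through f: X -> Y:
# (f_* P)(y) = sum of P(x) over all x with f(x) == y.
def pushforward(P, f):
    Q = defaultdict(float)
    for x, p in P.items():
        Q[f(x)] += p
    return dict(Q)

die = {k: 1 / 6 for k in range(1, 7)}        # fair six-sided die
parity = pushforward(die, lambda k: k % 2)   # distribution of "odd vs even"
print(parity)  # odd (1) and even (0) each get probability ~0.5
```

Same shape as the differential-geometric pushforward: a map on the underlying space induces a map on the structures (measures, tangent vectors) living over it.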
Notice how a lot of these are handfuls of concepts that could often be taught separately from the rest of the subject, and often aren’t even emphasized by the usual educational material? I wish there were just a book with a grab bag of that sorta stuff from as many fields as possible.
I perhaps should’ve said explicitly: after calculus and linear algebra, abstract algebra would be my next choice (though I might be persuaded by Boyd to put convex analysis third and abstract algebra fourth). All the rest are more marginal.