I live in London, work in decision intelligence leading development of neurosymbolic AI for personal finance, and have a background in computer science, information science, and sociology. I also have a strong interest in epistemology.
New to alignment thinking, but ramping quickly due to observed urgency.
I am a strong proponent of careful application of AI in personal finance, and I speak about it often within financial-industry circles.
There is a difference between “acts of individual violence don’t help at all” and “acts of individual violence have lower EV compared to non-violent acts”.
I think that by leaning too heavily on the former phrasing throughout the essay, there is a risk that proponents of individual violence read it as persuasion rather than rational argument.
In my mind, individual violence is most likely caused by a form of mental illness. The next most likely cause is rational action that searched the action space too shallowly to find a better option, combined with insufficient depth to model 3rd-order effects. At the level of 2nd-order effects (inducing fear and inciting more violence), individual action looks appealing. It's only after you model the ripples through discourse and through both communities that you conclude violence has a substantially lower EV than, say, influencing through the written word.
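To make the shallow-search failure mode concrete, here is a toy sketch of how an EV estimate can flip sign as deeper-order effects are included. All numbers are hypothetical and purely illustrative, not empirical estimates:

```python
def cumulative_ev(effects_by_order):
    """Return the running EV total after including effects up to each order."""
    totals, running = [], 0.0
    for contribution in effects_by_order:
        running += contribution
        totals.append(running)
    return totals

# Hypothetical contributions by order of effect.
# Order 1: direct effect; order 2: fear/attention (which can look like a
# gain to a proponent); order 3: backlash rippling through discourse and
# both communities.
violent_act = cumulative_ev([-1.0, +3.0, -10.0])
written_word = cumulative_ev([+0.5, +1.0, +1.5])

print(violent_act)   # [-1.0, 2.0, -8.0] — appears positive at order 2, negative at order 3
print(written_word)  # [0.5, 1.5, 3.0] — stays positive at every depth
```

An agent that truncates the evaluation at order 2 picks the violent act; one that models order 3 does not. That is the structural point: the conclusion depends on search depth, not on asserting that violence "doesn't help at all."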