I completely agree on the importance of strategic thinking. Personally, I like to hear what early AI pioneers had to say about modeling AI — for example, Minsky's Society of Mind. I believe the trajectory of AI must be informed by developments in epistemology, and I've basically bet my research on the idea that epistemological progress will shape AGI.
What do you mean by "must"? The word has two different meanings in this context, and it seems like bad epistemology not to distinguish them.
My use of “must” wasn’t just about technical necessity, but rather a philosophical or strategic imperative — that we ought to inform AGI not only through recent trends in deep learning (say, post-2014), but also by drawing from longer-standing academic traditions, like epistemic logic.