Is there some kind of "envisioning fallacy" that generalizes this?
I have seen myself and others fall prey to a sense of power that is difficult to describe when discussing topics as diverse as physics simulation, Conway's Game of Life derivatives, and automatic equation derivation.
I once observed this very same sense of power in myself when I was thinking about a graph with probabilistically weighted edges, and how a walker on this graph would be able to interpret data and thereby produce AI. (It was a bit more complicated than that and smelled like an honest attempt, but there were definitely black boxes in it.)