One reason it's plausible that today's or tomorrow's LLMs could produce brief simulations of consciousness, or even qualia, is that something similar happens with dreams in humans. Dreams are likely some form of information processing, compression, or garbage collection, yet they still produce (badly) simulated experiences as a clear side effect of working with human experience data.